Thursday, 29 May 2014
Brains are More than Threshold Units
I was reading the following article yesterday:
Laser mimics biological neurons using light
The article is about a component that emits laser light depending on the intensity of the light hitting it. In technical language, this is called a threshold unit. A threshold unit is really a concept: a name for any kind of material, device or system that can receive some kind of input, be it energy, information or anything else, and emit something whenever the input amount exceeds some threshold value.
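To make the idea concrete, here is a minimal sketch of a threshold unit in Python (the function name and the threshold value are mine, chosen just for illustration):

```python
def threshold_unit(input_amount, threshold=1.0):
    """Emit an output (1) only when the input exceeds the threshold."""
    return 1 if input_amount > threshold else 0

# Below the threshold nothing comes out; above it, the unit "fires".
print(threshold_unit(0.5))  # 0
print(threshold_unit(1.5))  # 1
```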
The connection between threshold units and the brain is that neurons are themselves a kind of threshold unit. There are many different kinds of neurons: some constantly emit pulses of electricity at a certain frequency, while others remain "silent" until they become excited. An "excited" neuron changes its firing frequency whenever the electrical input it receives from the neurons connected to it rises above some value. That value is precisely the threshold of a threshold unit.
The simplest threshold unit is the famous perceptron. The perceptron is not a physical object; it is a mathematical model developed to mimic the main function of a neuron. It consists of a certain number of "boxes" that work as inputs and one output box that spits out a number whenever a weighted sum of the inputs exceeds some value. These models date back to the 1950s.
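As a rough sketch (the weights and threshold below are arbitrary), the perceptron boils down to a weighted sum followed by a comparison:

```python
def perceptron(inputs, weights, threshold):
    # Each input "box" contributes through the weight of its connection.
    total = sum(w * x for w, x in zip(weights, inputs))
    # The output box fires only when the weighted sum exceeds the threshold.
    return 1 if total > threshold else 0

print(perceptron([1, 0, 1], weights=[0.4, 0.9, 0.3], threshold=0.5))  # 1
```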
Because the perceptron is a mathematical model, any threshold unit can be described as something similar to a perceptron. For decades, perceptrons, and the other kinds of neural networks created by connecting perceptrons in many different ways, have been studied. Within certain limits, they are capable of memorization and learning.
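As an illustration of what learning "within certain limits" means, here is the classic perceptron learning rule fitting the logical AND function; the learning rate and number of epochs are arbitrary choices on my part:

```python
# Classic perceptron learning rule on the (linearly separable) AND function.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias, lr = [0.0, 0.0], 0.0, 0.1

for epoch in range(20):
    for x, target in data:
        output = 1 if sum(w * xi for w, xi in zip(weights, x)) + bias > 0 else 0
        error = target - output
        # Learning happens by adjusting the *connections* (weights),
        # not the unit itself.
        weights = [w + lr * error * xi for w, xi in zip(weights, x)]
        bias += lr * error

print(weights, bias)  # after training, the unit reproduces AND correctly
```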
Why, then, you might ask, am I being picky and saying that the brain is more than threshold units? Neurons are threshold units, and threshold units can be used to construct neural networks, but what makes a network learn is the pattern of connections between its units, not the units themselves. The point is that knowledge is not stored in the units; it is stored in the links they form with each other. If you wired trillions of threshold units into a fixed, regular square grid, nothing would ever happen in terms of learning. Without learning, thinking is not that great... So, what is missing?
The answer goes by the name of plasticity. This is the capacity neurons possess to create new connections and to cut old ones. This is what changes the connection patterns and makes memorization and generalization (the two pillars of learning) possible. Although faster threshold units like the ones in the article might improve the speed of information transmission, that is no guarantee of improving higher abilities like creativity and understanding. They might lead to faster reflexes, for instance, but not to faster learning, as transmission speed has nothing to do with creating and severing connections.
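Just to make the distinction vivid, here is a deliberately toy sketch of structural plasticity; the pruning threshold and the number of new connections are invented parameters, and real neurons are of course far more complicated:

```python
import random

# Toy structural plasticity: the network's *topology* changes over time.
# Connections are a dict mapping (pre, post) unit pairs to weights.
units = list(range(6))
connections = {(i, j): random.uniform(-1, 1)
               for i in units for j in units if i != j}

def plasticity_step(connections, prune_below=0.1, n_new=2):
    # Sever old connections whose weights are near zero...
    kept = {pair: w for pair, w in connections.items() if abs(w) >= prune_below}
    # ...and grow a few new random connections elsewhere.
    for _ in range(n_new):
        pre, post = random.sample(units, 2)
        kept.setdefault((pre, post), random.uniform(-0.1, 0.1))
    return kept

connections = plasticity_step(connections)
print(len(connections), "connections after one plasticity step")
```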
It is not that the article does not describe interesting work. It does, but one needs to be very careful about the actual implications of each line of research when we read about a new 'big breakthrough' every week...
Labels: artificial intelligence, brain, neurons, physics