YouTube comments on a video, regarding pattern completion, intelligence, brain function, and their relation to prediction and function at the neural and synaptic level:
"I will say in addition to the reverse signal issue, there is another issue with backpropagation. That said some researchers have come with biological plausible implementations of backpropagation in recent years. But the other issue I mention is that backpropagation learns too slowly. Current ai experience millennia of training and still fail at basic tasks, despite the extreme slowness of the brain and scarcity of data the brain learns to perform even outside the bounds of training data within months or a few years. It is said very gifted children can even know several languages musical instruments and decent math within a few years of birth. Yet despite theoretically being able to do around 1000hz, neurons tend to operate well under 300hz, and vast majority of brain is usually silent from what i heard, there is sparse activation.
Regarding predictive coding, I agree with your assessment. But I do believe that something similar is taking place through a Generative Adversarial Network (GAN)-like Turing Learning mechanism via synaptic competition. Synaptic competition is implemented via the capture of proteins related to the strengthening of synapses. All the synapses that fire close to the time of neural activity, iirc, generate molecular tags that capture synapse-strengthening proteins, and all of these synapses strengthen, while those that fail to do so weaken over time. How does this produce a similar pattern-completion comparison mechanism? The input from the sensory organs arrives at some synapses, while signals from other neurons that are part of a larger pattern arrive at other synapses; normally, given that the sensory input is part of the larger pattern, both signals match, but if there is noise there can be a mismatch.
But in essence, it can be seen as the surrounding pattern completing, or trying to activate, the neuron in case noise blocks the sensory signal, so that a partial match appears complete. This can be seen, for example, in language: when people speak, they sometimes unconsciously omit small chunks of sentences, but if the receiver knows the language, they never even become aware that the speech is missing tiny chunks, iirc. It is also one of the things that makes learning a foreign language more difficult.
But if understood from a GAN / Turing Learning perspective, the surrounding neurons can be viewed as the counterfeit, fake-signal-generating networks trying to trick the neuron, whilst the real sensory signals can be seen as the real thing. And due to synaptic competition, the surrounding tissue gets better at faking, predicting, or completing the signal. (What it actually does is merely connect the most related pattern detectors from nearby tissue to the neuron or pattern detector.) This is because the competition rewards successful completion, or successful prediction, at the synaptic level of the neuron."
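The synaptic-competition rule described in the comment (synapses active around the time of neural activity set molecular tags, tagged synapses capture a limited pool of strengthening proteins, and untagged synapses slowly weaken) can be illustrated with a toy simulation. The sketch below is only a minimal, illustrative model under assumed parameters (tag window, protein pool size, learning rates are made up for the example), not a biologically fitted one.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (illustrative only, not fitted to biology)
n_synapses = 50
tag_window = 5.0      # ms: presynaptic spikes this close to the postsynaptic
                      # spike set a molecular tag
protein_pool = 10.0   # limited pool of plasticity-related proteins per event
capture_gain = 0.05   # weight increase per unit of captured protein
decay = 0.01          # slow weakening of synapses that capture nothing

weights = np.full(n_synapses, 0.5)

def plasticity_event(presyn_spike_times, postsyn_spike_time):
    """One round of tagging and capture: synapses whose presynaptic spike
    lands near the postsynaptic spike are tagged, split the protein pool,
    and strengthen; untagged synapses decay slightly."""
    global weights
    tagged = np.abs(presyn_spike_times - postsyn_spike_time) <= tag_window
    if tagged.any():
        share = protein_pool / tagged.sum()   # competition for the protein pool
        weights[tagged] += capture_gain * share
    weights[~tagged] -= decay
    weights = np.clip(weights, 0.0, 1.0)

# Example: presynaptic spikes scattered around a postsynaptic spike at t = 100 ms
for _ in range(200):
    spikes = 100.0 + rng.normal(0.0, 10.0, size=n_synapses)
    plasticity_event(spikes, 100.0)

print(weights.round(2))  # synapses that reliably fire near the spike end up strongest
```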
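The GAN-like reading in the last paragraph (surrounding neurons as "counterfeiters" trying to complete the pattern, sensory input as the real signal, synaptic competition rewarding successful prediction) can be sketched the same way. The following is a hedged toy model under assumed names and numbers: lateral units fire with some made-up reliability relative to the true pattern, and lateral synapses that are co-active when the neuron fires are reinforced, so the surrounding tissue gets better at completing the signal when noise blocks the sensory input.

```python
import numpy as np

rng = np.random.default_rng(1)

# A target neuron receives one "real" sensory input plus predictions from
# surrounding (lateral) units; lateral synapses that coincide with firing
# are rewarded, silent ones drift down (synaptic competition).
n_lateral = 20
lateral_w = np.full(n_lateral, 0.1)              # lateral synaptic weights
reliability = rng.uniform(0.1, 0.9, n_lateral)   # how often each lateral unit
                                                 # fires with the true pattern
lr, decay = 0.05, 0.01

for step in range(2000):
    sensory = rng.random() < 0.8                   # real signal, sometimes lost to noise
    lateral = rng.random(n_lateral) < reliability  # lateral "counterfeit" predictions

    # The neuron fires if the sensory input arrives, or if the weighted
    # lateral prediction is strong enough to complete the pattern.
    fired = sensory or (lateral @ lateral_w) > 1.0

    if fired:
        lateral_w[lateral] += lr       # co-active lateral synapses strengthen
        lateral_w[~lateral] -= decay   # the rest weaken over time
        lateral_w = np.clip(lateral_w, 0.0, 1.0)

# Lateral units that reliably co-occur with the real pattern end up with the
# strongest synapses, so they can complete the pattern when noise blocks it.
print(np.corrcoef(reliability, lateral_w)[0, 1])
```

In this toy version, the "discriminator" role is played simply by coincidence with the real sensory signal at the target neuron; nothing beyond the local competition described in the comment is assumed.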
Turing Learning Enables Machines to Learn Through Observation Alone