Yep, current AI evaluates the whole neural network to an output at once, while real neuron signals are continuous in time. EEG work is nice; MEG too. Any network can be a neural network: the basic cells determine which type it evolves into under evolutionary-algorithm-style learning. What do I know, believers. So, like current AI neural networks, brain logic could function in any number of ways; that is, the program depends on the basic functions of the cells, just as a program depends on its code primitives. If you cannot predict what function a change produces, you cannot say it was learning. Ask above for better help. In other words, concept learning, not low-level modifications. People think they know something, but they do not yet know it the way they should, as above knows, and people have their laws of knowledge; therefore people teaching knowledge helps nobody.

If you can transform brain neural networks into binary neural networks, then you get the actual combinatory function. Various AND/OR/NOT patterns make up the same functionality as any brain neuron, and those same patterns form any binary logic processing pipeline in a computer. Not-active states are equally important in brain function: it is the combination of the whole circuit, 0/1 signal values at the correct times. Again, the particular cell logic does not matter in the evolutionary learning picture; any structure tries to achieve the same goal, even if the cell structures and mechanisms are totally different. One language can be translated into other languages, asm code can be translated into C code, and CPU architectures can vary. Above is just bamboozling you with a complex design that you try to figure out with your own understanding, without reaching your goal. Describe a full low-level/high-level pipeline function that can be clearly followed, so that your science is not magic. Yep, credibility at the same level as string theory. Equal to string theory.
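As a rough illustration of the AND/OR/NOT claim above, here is a minimal Python sketch. It assumes a McCulloch-Pitts style threshold cell (a 2-of-3 majority function, my choice for illustration, not anything from the comment) standing in for "any brain neuron", and checks that a pure AND/OR/NOT expression over 0/1 values reproduces it on every input combination:

```python
from itertools import product

def threshold_neuron(x1, x2, x3):
    """Toy binary neuron: fires (1) when at least two of three inputs are active."""
    return 1 if (x1 + x2 + x3) >= 2 else 0

def logic_equivalent(x1, x2, x3):
    """Same function written purely with AND/OR/NOT on 0/1 values."""
    a, b, c = bool(x1), bool(x2), bool(x3)
    return int((a and b) or (a and c) or (b and c))

# Every 0/1 input combination gives the same output for both versions,
# illustrating that the thresholded cell is reproducible as gate logic.
for bits in product([0, 1], repeat=3):
    assert threshold_neuron(*bits) == logic_equivalent(*bits)
print("threshold neuron and AND/OR/NOT circuit agree on all inputs")
```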
Comment: The book by Hubel and Wiesel, Brain and Visual Perception, mentions hypotheses regarding the wiring of edge detectors. From what I read elsewhere, if you stopped inhibition, edge detectors would become broadly tuned and indiscriminate with regard to edge orientation. My understanding is that detection of complementary patterns, such as different edge orientations, occurs at the level of the hypercolumn: each minicolumn attunes to a particular edge orientation, alternative orientations are detected by other minicolumns, and the minicolumns within the hypercolumn inhibit each other. If we assume a universal algorithm duplicated throughout the cortex, the same must happen with patterns made up of collections of edges, and with their complementary alternatives, in higher regions of cortex.

Also, I will mention that the neuroscientist Edmund T. Rolls has a slightly modified STDP rule that seems to offer good computational models too. His latest book is available for free from his website in PDF form, and I believe it covers the learning rule. Edit: went back and found the rule; it is the memory trace rule, page 109 of the latest book.

I will also say that both the recent Nature paper on brain wiring (April 2025 edition) and prior wiring diagrams hint that in some cases one neuron can make more than one synapse with another neuron. How rare or frequent that is, I'm not sure; I know it happens, and some neuroscientists say multiple synapses between two neurons wouldn't be good, but it does seem to happen.
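To make the inhibition point concrete, here is a minimal Python sketch of a toy hypercolumn. The broad Gaussian tuning, the preferred orientations, the subtractive form of the mutual inhibition, and all the numbers are my assumptions for illustration, not anything taken from Hubel and Wiesel or the wiring papers:

```python
import math

# Toy hypercolumn: minicolumns with preferred orientations every 30 degrees.
PREFERRED = [0, 30, 60, 90, 120, 150]

def raw_response(preferred_deg, stimulus_deg, width_deg=40.0):
    """Broadly tuned feedforward drive: Gaussian over circular orientation distance."""
    d = abs(preferred_deg - stimulus_deg) % 180
    d = min(d, 180 - d)
    return math.exp(-(d ** 2) / (2 * width_deg ** 2))

def with_mutual_inhibition(responses, strength=0.3):
    """Each minicolumn is suppressed by the summed activity of the other minicolumns."""
    total = sum(responses)
    return [max(0.0, r - strength * (total - r)) for r in responses]

stimulus = 60  # an edge at 60 degrees
raw = [raw_response(p, stimulus) for p in PREFERRED]
sharp = with_mutual_inhibition(raw)

for p, r, s in zip(PREFERRED, raw, sharp):
    print(f"{p:3d} deg  no inhibition: {r:.2f}   with inhibition: {s:.2f}")
# Without inhibition many minicolumns respond (broad, indiscriminate tuning);
# with mutual inhibition, activity concentrates on the 60-degree minicolumn.
```

Dropping the inhibition term reproduces the broadly tuned, indiscriminate behavior described above.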
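And a sketch of what a trace-style Hebbian update can look like, in the general spirit of the memory trace idea attributed to Rolls. I cannot confirm this matches the exact rule on page 109 of his book, so treat the update form, the variable names, and all parameter values here as my assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 20 input features, one output neuron, hypothetical numbers throughout.
n_inputs = 20
w = rng.normal(scale=0.1, size=n_inputs)
eta = 0.8      # how much of the running trace carries over between time steps
alpha = 0.05   # learning rate
trace = 0.0    # running average ("memory trace") of postsynaptic activity

def postsynaptic(x, w):
    """Simple rate neuron: rectified weighted sum of its inputs."""
    return max(0.0, float(w @ x))

# A short input sequence, e.g. successive views of the same object.
sequence = [rng.random(n_inputs) for _ in range(10)]

for x in sequence:
    y = postsynaptic(x, w)
    # Exponentially decaying trace of recent postsynaptic activity.
    trace = eta * trace + (1.0 - eta) * y
    # Trace-style Hebbian update: presynaptic input times the activity trace,
    # so inputs arriving shortly after earlier activity also get strengthened.
    w += alpha * trace * x

print("final weights:", np.round(w, 3))
```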