Have you ever wondered how your brain effortlessly presents visual data to you in such an organized and seamless way? After light enters the eye, visual data is first processed by the retina and then relayed to the visual cortex through the cells of the lateral geniculate nucleus (LGN), the only pathway by which retinal signals reach the visual cortex. The retina acts like a camera, recording a brief snapshot of your visual field before the brain heavily processes it. Because relatively few LGN cells connect to the visual cortex, the brain does not receive much detailed information about the incoming visual stream and must rely on other means to construct what you see.
Researchers Logan Chariker, Robert Shapley, and Lai-Sang Young sought to determine how the brain does so much image construction with so little visual information. To find out, they analyzed cortical gamma-band activity in the primary visual cortex (V1) of monkey brains. A gamma brainwave has an average frequency of about 40 Hz but can vary between 25 and 100 Hz. Area V1 is the region most closely associated with the conscious processing of visual sensory data; neural modules beneath its cortical surface help analyze different regions of the visual field.
Another thing to keep in mind about gamma-band activity is that it typically occurs in episodic bursts rather than continuously. The researchers note that gamma-band bursts in area V4 are closely associated with increased attention. Of particular interest to them was how these synchronous bursts of gamma-band activity originate in the brain. They analyzed "cortical activity with a large-scale, data-driven model of macaque V1" (abbreviated "CSY," after the authors' initials) to account for the structure and function of gamma-band activity in visual cortical regions, and they suggest that the CSY model might also explain gamma-band activity in non-visual cortical areas.
How are these critical gamma-band rhythms produced, and how do they act synchronously? The CSY model is a highly realistic model of the macaque monkey visual cortex and offers some important clues as to how these synchronized bursts of neural activity occur. The model represents cells mathematically as "leaky integrate-and-fire" (LIF) neurons: a differential equation describes each neuron's membrane potential, which "leaks" back toward a resting value between inputs, and the neuron fires once the potential crosses a defined threshold. Neurons are classified as either excitatory (E) or inhibitory (I), depending on whether their signals push the neurons they connect to toward or away from firing.
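The LIF dynamics described above can be sketched in a few lines of Python. This is a generic single-neuron illustration only; the parameter values here are invented for demonstration and are not those of the CSY model.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# dV/dt = (-(V - V_rest) + I) / tau; fire and reset at threshold.
# All parameter values are illustrative, not taken from the CSY model.

def simulate_lif(input_current, dt=0.1, tau=10.0, v_rest=0.0,
                 v_threshold=1.0, v_reset=0.0):
    """Return spike times (ms) for a list of input-current samples."""
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest while integrating input.
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_threshold:            # threshold crossed: emit a spike
            spike_times.append(step * dt)
            v = v_reset                 # reset and start integrating again
    return spike_times

# A constant suprathreshold drive produces regular, rhythmic spiking;
# a subthreshold drive never reaches threshold, so the neuron stays silent.
regular = simulate_lif([1.5] * 1000)    # 100 ms of strong input
silent = simulate_lif([0.5] * 1000)     # weak input
```

The leak term is what makes the neuron's firing rhythmic rather than arbitrary: the potential must be rebuilt from the reset value after every spike, so a steady input yields a steady rhythm.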
Individual neurons in the visual cortex receive electrochemical signals from other excitatory and inhibitory neurons, but they will only fire after exceeding their specific membrane potential threshold. Determining the exact membrane potential of an individual neuron is nearly impossible, and matters are complicated further by the myriad of other neurons connected to each one, so understanding how an electrochemical fluctuation in one neuron affects the others is harder still. Information also flows backward through the visual pathway, from the cortex toward the LGN, in frequent feedback loops that are highly sensitive to changes and perturbations. The researchers liken the effects of these feedback loops to the Butterfly Effect, whereby a small change in LGN nerve signals eventually results in large-scale changes in the visual representation. These changes happen through a process known as "recurrent excitation," in which cortical neurons repeatedly re-excite one another, layering data processing through feedback loops.
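To see how recurrent excitation can turn a tiny perturbation into widespread activity, consider this deliberately simplified sketch. The all-to-all wiring, weights, and kick size are invented for illustration; real cortical connectivity is far more structured than this.

```python
# Toy recurrent-excitation cascade: one perturbed LIF-style neuron
# recruits the whole network through excitatory feedback.
# Connectivity and parameter values are invented for illustration only.

def run_network(n=10, steps=50, dt=0.1, tau=10.0,
                threshold=1.0, w_exc=1.1, kick=1.2):
    v = [0.0] * n                  # membrane potentials, all at rest
    v[0] = kick                    # perturb a single neuron
    ever_fired = set()
    for _ in range(steps):
        spiking = [i for i, vi in enumerate(v) if vi >= threshold]
        for i in spiking:
            v[i] = 0.0             # reset each neuron that fired
            ever_fired.add(i)
        for i in spiking:          # each spike excites every other neuron
            for j in range(n):
                if j != i:
                    v[j] += w_exc
        for j in range(n):         # potentials leak back toward rest
            v[j] += dt * (-v[j]) / tau
    return ever_fired

recruited = run_network()          # one kicked neuron recruits the rest
quiet = run_network(kick=0.5)      # subthreshold kick: nothing fires
```

The butterfly-like sensitivity is visible in the two runs: a kick of 1.2 eventually drives every neuron in the network to fire, while a kick of 0.5 simply leaks away and the network stays silent.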
According to neuroscientist Alessandra Angelucci, "[They showed] that you can generate all orientations in the visual world just using a few neurons connecting to other neurons." Through their experiments, Chariker et al. successfully reverse-engineered the neural networks responsible for responses to specific types of visual stimuli, including edge detection and other elementary properties of vision. The activity of any one neuron depends heavily on the activation and propagation of activity among neurons in surrounding regions. Kevin Hartnett likens this to the way a dance party begins with just a few people and gradually picks up steam until people fill the venue.
Over time, Chariker et al. intend to conduct further research into precisely how the brain constructs the visual reality around you, including color recognition, object motion, and object contrast. Decoding the way the brain processes visual data would have far-reaching implications for artificial intelligence and medical science, and could aid our understanding of other human senses such as taste, smell, hearing, and touch. There is far more to vision than meets the eye!
Chariker, Logan, Robert Shapley, and Lai-Sang Young. “Rhythm and Synchrony in a Cortical Network Model.” The Journal of Neuroscience 38, no. 40 (2018): 8621–34. https://doi.org/10.1523/jneurosci.0675-18.2018.
Hartnett, Kevin. “A Mathematical Model Unlocks the Secrets of Vision.” Quanta Magazine. Accessed October 17, 2019. https://www.quantamagazine.org/a-mathematical-model-unlocks-the-secrets-of-vision-20190821/.
Howard Blumenfeld is a mathematics professor at Las Positas College with a keen interest in psychology, neurology, cognitive science, and philosophy. He earned his B.A. in Pure Mathematics with Minor Degrees in Psychology and Literature at the University of California, San Diego, and his M.A. in Mathematics (Emphasis on Community College Education) from San Diego State University. In his spare time, he likes to cook, lift weights, and spend time with his beautiful wife and son.