Monday, March 4, 2024

Questions About the Brain and Neural Dynamics Should be a Priority for Quantum Computing

Image credit: Getty

Quantum computing is set to revolutionize areas where classical computers show limitations, particularly in cryptography and the simulation of complex physical and chemical systems, including quantum mechanics itself. Notably, quantum algorithms like Shor’s for integer factorization could significantly disrupt traditional cryptographic techniques, presenting a mix of challenges and opportunities in the realm of data security [1]. While research in quantum computing has historically focused on these well-established topics, the exploration of new problem classes that are particularly suitable for quantum computation is an active area of research, including the application of known quantum algorithms and the development of new ones.

A nascent and relatively unexplored topic is the application of quantum computing in neuroscience, specifically to modeling, simulating, and asking targeted questions about brain function. The goal is to identify and understand the neural dynamics and brain algorithms responsible for functional cognitive properties. We argue that prioritizing research in this area is important, as it holds the potential to significantly enhance our understanding of the brain while simultaneously motivating and contributing to the theory and development of quantum algorithms and computing.

To be sure, the intersection of quantum computing, neuroscience, and related topics has generated considerable interest, leading to quite a few books, technical papers, and popular articles. Some of what has been written is speculative but scientifically thoughtful (see for example, [2], [3], [4], [5], [6], [7]), and a lot is not. Probably most well-known are the arguments by Hameroff, Penrose, and others suggesting that microtubules in neurons act as quantum computing elements in fundamental ways [8], [9]. While still controversial, recent experimental results suggest that neurons might indeed leverage properties of quantum effects in how they compute [10], a remarkable result. Our focus here, however, is different. We are particularly interested in the practical application of quantum algorithms for deciphering and understanding neural dynamics, rather than exploring the possibility of quantum computational mechanisms within the brain itself.

Despite the growing interest in this topic, rigorous mathematical models and quantitative arguments for how to develop and apply quantum algorithms for solving meaningful neuroscience questions remain in their infancy. Attempting to decipher the brain’s algorithms and the neural dynamics that underlie these processes offers a unique opportunity for the development of quantum computing. This pursuit will test existing and emerging methods and inspire new ones. Concurrently, it will provide neuroscience with fundamentally new computational and simulation tools, potentially catalyzing a paradigm shift in our understanding of the brain.

Mathematics is critical for understanding the brain’s complexity [11], [12]. The brain’s global functions, such as cognition and possibly even consciousness, emerge from intricate interactions between many biological components that operate across multiple spatial and temporal scales. To make sense of these complex interactions, mathematics serves as an essential organizing language and framework. It provides the necessary tools to manage and interpret the vast number of details and dynamics involved. While experimental data across genetic, molecular, cellular, and network scales of organization are crucial, they are not sufficient on their own to fully explain how brain computations and emergent cognitive properties arise. Mathematics also goes beyond organizing these details; it enables scientists to make predictions and guide experimental testing.

The purpose of applying mathematics to neuroscience is not necessarily to replicate every biological detail but rather to abstract the brain’s algorithms from the underlying biological ’wetware’ responsible for implementing them. This approach contributes significantly to a systems understanding of the brain, and it has also helped advance machine learning and artificial intelligence. Furthermore, mathematical modeling of the brain is important for exploring complex neurological and neuropsychiatric disorders.

The structural complexity and heterogeneity of the human brain (and those of many other species) are difficult to truly grasp. The degree of genetic, molecular, and cellular specialization and variability is vast, and despite decades of research, scientists are just beginning to map it all out [13]. This huge degree of physical complexity supports a functional state space and combinatorial computational space with a dimensionality that defies any possible intuitive comprehension. An average 3 lb adult human brain contains approximately 85 billion neurons, each forming between 10,000 and 100,000 synaptic connections, leading to an estimated 10¹⁶ (10 quadrillion) synapses [14], [15], [16]. For example, consider just 1000 neurons, each acting as a statistically independent computational element capable of firing on its own. If just 2% are firing at any given time, the number of possible distinct encodings is 1000!/(20! · (1000 − 20)!) ≈ 10⁴¹. If the entire number of neurons in the neuronal network of the brain were taken into account, the dimensionality of the computational space expressed as the number of distinct encodings would be a number with about 892 billion digits. By contrast, there are about 10²¹ stars in the observable universe.
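The 1000-neuron arithmetic above can be checked directly with a few lines of standard-library Python:

```python
import math

# Number of distinct encodings when 20 of 1000 statistically independent
# neurons are active: the binomial coefficient C(1000, 20) = 1000!/(20! * 980!).
n, k = 1000, 20
encodings = math.comb(n, k)  # exact integer, roughly 3.4 x 10^41

print(f"C({n}, {k}) ~ 10^{math.floor(math.log10(encodings))}")
print(f"number of digits: {len(str(encodings))}")
```

Running this confirms the order of magnitude quoted in the text: the count has 42 digits, i.e. it is on the order of 10⁴¹.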

There are also about another 86 billion non-neuronal cells in the brain. Of these, about 20% are astrocyte glial cells. Certain sub-types of astrocytes can both listen in on and modulate neuronal signaling and information processing [17]. These cells form an independent network unto themselves while also cross-talking with the neuronal network. From a computational perspective, very little is known about how interactions between neurons and astrocytes result in functional and cognitive outputs. It is one of the most exciting topics in systems and computational neuroscience. As if this were not enough, the brain’s complexity is due not just to the sheer number of connections and the size of its computational space, but to how these connections and the resultant encodings are dynamically reconfigured and modulated. Understanding neuromodulation is an entire research effort in itself.

Taken together, this degree of complexity seems to give rise to emergent cognitive properties in ways we simply do not understand. Moreover, because emergent cognitive functions are potentially dependent on both the scale and the dynamic reconfiguration of neural activity, certain neural properties may not even be observable at scales below some critical threshold. While controversial and an active area of research, similar emergent ’learned’ phenomena have been proposed in the context of large language models [18].

Given this degree of complexity, how might quantum computing contribute to deciphering the brain and the neural dynamics that underlie its functions? We propose two distinct, but related, lines of research that would greatly benefit from quantum computing. The first is large-scale simulations of neural dynamics across scales of spatial and temporal organization, bounded and informed by the known physiology. The second is carefully chosen and structured problems about neural dynamics and the underlying neurobiology and physiology. We will discuss each in turn.

Given the sheer size of the computational space and dimensionality of the human and other animal brains, sufficiently large-scale simulations would allow observing, experimenting, and iterating numerical experiments under a wide range of parameter and model conditions. If complex emergent cognitive properties are in part due to sufficiently large interactions among foundational physiological and biological components and processes, the need to carry out very large iterative simulations may be critical to understanding the dynamics that give rise to cognitive properties. Simulations of this kind would support experimentally observing, modeling, and ultimately understanding patterns and effects that depend on the scale of the computational space, to the extent it can be computed.

Of particular importance will be the development of algorithms and methods that support computing the time evolution of neural network dynamic models. Successfully doing so in a practical setting at scale is exceedingly challenging both from a theoretical and engineering perspective, but of immense consequence. While several algorithmic approaches for the time evolution of quantum computational systems have been discussed [19], [20], it is not at all obvious how to set up and construct the necessary mathematical conditions to compute specific (local) instances of internal neuronal dynamics and then allow them to temporally (globally) evolve on a network. This effort will likely need to use known quantum algorithms and methods creatively, and possibly develop new ones. Although beyond the immediate scope of this commentary, any resultant methods would almost certainly have important applications to artificial quantum neural networks and machine learning, and possibly to a future quantum internet.
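To make the time-evolution primitive concrete, the toy sketch below classically simulates a continuous-time quantum walk, one of the algorithmic approaches cited above [19], [20], on a small network: the state evolves as |ψ(t)⟩ = e^(−iAt)|ψ(0)⟩ with the adjacency matrix A playing the role of the Hamiltonian. The network, parameters, and node labels are illustrative assumptions for exposition, not drawn from the references, and a real quantum device would of course not build the propagator explicitly.

```python
import numpy as np

# Adjacency matrix of a small 4-node network (illustrative only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Diagonalize the (symmetric, hence Hermitian) adjacency matrix so we can
# form the unitary propagator U(t) = V exp(-i * lambda * t) V^dagger.
evals, V = np.linalg.eigh(A)

def evolve(psi0, t):
    """Continuous-time quantum walk: |psi(t)> = exp(-iAt)|psi(0)>."""
    U = V @ np.diag(np.exp(-1j * evals * t)) @ V.conj().T
    return U @ psi0

psi0 = np.zeros(4, dtype=complex)
psi0[0] = 1.0                      # walker starts localized at node 0
psi = evolve(psi0, t=1.5)
probs = np.abs(psi) ** 2           # occupation probability at each node

print(np.round(probs, 3), "total =", round(float(probs.sum()), 6))
```

Because the evolution is unitary, the occupation probabilities always sum to 1; the open question raised in the text is how to engineer such global evolution so that it encodes specific, locally defined neuronal dynamics.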

However, open-ended large-scale simulations by themselves will not be sufficient for understanding how the brain works. This, in effect, led to the significant challenges and missed targets faced by the highly publicized and hugely funded Blue Brain Project [21]. Progress will necessitate carefully chosen and defined problems about the neural dynamics being simulated. This is critical. Observing neural dynamics in action (for example, the firing patterns of large numbers of neurons) in isolation and without context cannot, by itself, reveal the underlying algorithms operating on those dynamics or why they exist. It is no different than attempting to understand the brain from a systems and engineering perspective by studying a single participating protein or ion channel in isolation. So setting up and asking the right questions about the dynamics, questions that can leverage quantum computing, will be key.

One approach is to take a page from contemporary physics research programs. The first step would be to set up and carry out large-scale simulations of neural dynamics appropriately constrained and bounded by the known physiology and biophysics. In effect, generating large amounts of data. Such an effort could leverage quantum computing but would also leverage state-of-the-art classical computing methods. (We comment further on this below.) Iterative numerical experiments subject to different perturbations and boundary conditions would likely enrich the results. The next step would be to mathematically model data from numerical experiments in an attempt to understand the results analytically and develop theories to predict new effects that can be tested in subsequent numerical experiments. This is where asking the right questions comes into play. Any such predictions would be stress-tested in simulations constrained by the physiology and biophysics of the dynamic models being simulated. Ultimately, the most important results would need to be tested and validated (or discarded) experimentally in neurobiological models in the lab.

Any questions about the relevant neuroscience may be important to themselves, or they may be a piece of a larger puzzle, contributing partial information or guiding an experimental or theoretical line of investigation. Quantum computing has a unique role to play in deciphering and understanding the contributions that network scale and dimensionality have on the neurobiological questions being addressed.

In particular, the interplay between computational dimensionality and the physical constraints that brain networks are necessarily subject to (and, in contrast to artificial neural networks, seem to take advantage of) will give us a new perspective on brain algorithms [22], [23], [24]. We may begin to observe and understand how and why processes responsible for high-level cognitive functions emerge only beyond a critical threshold scale. Other types of physical considerations may be equally relevant in different ways. This is important because it is almost certain that we have yet to fully discover the right mathematical ideas and descriptions that implement the brain’s algorithms. This could involve applying known mathematics in new ways, or it may require discovering or inventing new mathematics altogether.

For example, we have previously derived a mathematical framework and associated principle called the Refraction Ratio [22], [23]. This ratio sets constraints on, and mathematically predicts, optimal dynamics on networks with a physical structural geometry and the resulting signaling latencies. Brain networks fall into this category. Fundamentally, this principle determines how efficiently the brain can process information. Beyond the theory itself, we have shown that at least certain classes of neurons optimize the Refraction Ratio across their synapses [25], and that the connectome (i.e. the connectivity network structure) of the worm C. elegans takes advantage of the Refraction Ratio to optimize neural computations that correlate with known associated behavioral outputs [26]. Proof-of-concept work in our lab is now exploring whether we can use specific quantum algorithms to determine exponentially faster (compared to classical algorithms) whether geometric networks computing Refraction Ratio dynamics become ’epileptic’, whether the dynamics die away, or whether they can sustain inherent recurrent activity (i.e. remain ’healthy’). What we are targeting here is not the simulation of the dynamics itself, but rather an interesting and important question about the dynamics that is set up to inherently take advantage of quantum computing.
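As a toy illustration only (the precise definitions and optimality results live in [22], [23]): taking the refraction ratio of an edge to be the receiving node’s refractory period divided by the signaling latency along that edge, with values near 1 regarded as well-matched, one can score a small geometric network as follows. All numbers, node names, and the threshold are illustrative assumptions, not physiological measurements.

```python
import math

# Toy geometric network: edges carry physical path lengths (mm) traversed at
# a uniform conduction speed; each node has a refractory period (ms).
SPEED_MM_PER_MS = 1.0
refractory_ms = {"a": 2.0, "b": 1.0, "c": 4.0}
edges = [("a", "b", 1.2), ("b", "c", 0.9), ("a", "c", 3.8)]  # (src, dst, length)

def refraction_ratio(dst, length):
    """Receiving node's refractory period divided by the signaling
    latency along the edge (values near 1 treated as well-matched)."""
    latency = length / SPEED_MM_PER_MS
    return refractory_ms[dst] / latency

for src, dst, length in edges:
    r = refraction_ratio(dst, length)
    status = "near-optimal" if abs(math.log(r)) < math.log(1.5) else "mismatched"
    print(f"{src}->{dst}: ratio={r:.2f} ({status})")
```

The question posed in the text, whether networks of such ratios sustain, lose, or pathologically amplify recurrent activity at scale, is exactly the kind of global property over a vast configuration space that quantum algorithms might probe more efficiently than classical search.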

For both lines of neuroscience research — simulations and specific questions — methodologically, it is likely that, at least in the foreseeable future, classical-quantum computing hybrid models will be most successful. For example, the development of specialized (classical) neuromorphic hardware specifically for carrying out large-scale simulations of the brain has a very long and rich history [27], [28]. At least at certain scales, current efforts are now approaching simulation sizes that are relatively comparable to the human brain [29]. But because, as we argue above, simulations alone will not be enough, appropriately asking the right questions to probe this huge combinatorial and computational space is where we suggest quantum computing has a unique and complementary role to play.

Are there any shared considerations that cut across the types of relevant problems that might offer some guidance as to how to pick and define them? Perhaps there are. Probably the most significant similarity across relevant problems is a dependency of some kind on the dimensionality and size of the state and computational space.

What are the relevant mathematics and models? If an algorithm in such a computational space depends in part on a random walk or stochastic search procedure, how does it optimize or even compute anything of value given the dimensionality of the search and solution spaces? Does some new mathematics need to be discovered? Or can known mathematics be applied in new ways? Is there a novel heuristic process involved that needs to be described? It is doubtful that we will be able to arrive at such discoveries analytically, without numerical experimentation.

In the meantime, given the present noisy intermediate-scale quantum (NISQ) era the field is in [30], what kinds of toy or proof-of-concept problems can be formulated and solved? What analytic and mathematical proofs can be extended? For example, we briefly introduced the problem our group is exploring. Can scientists set the conditions for larger and more important neuroscience problems to be solved (or at least started) over the next 5–10 years that cannot be solved today?

Reciprocally, pursuing quantum computing approaches to brain simulations and questions related to neural dynamics could drive progress in quantum computing itself. Systems neuroscience as a target application for quantum computing provides a concrete set of difficult problems and questions that could, by necessity, drive innovation in quantum algorithms and theory. The complexity and sheer scale of the state space and computational space of brains may motivate further advances in the engineering. These considerations are not restricted to neuroscience applications. It is likely that other exceedingly high-dimensional complex areas of application, such as modeling climate, molecular dynamics, and artificial neural networks, will give as much back to quantum computing as quantum computing provides. This would be mutual reciprocity at its best.

The author would like to thank Eric Trout and Rita J. King for many insightful discussions and feedback. A pre-print version of this article will be published on arXiv. A peer-reviewed version is under submission.

[1] Shor, P. W. Polynomial-time algorithms for prime factorization and discrete logarithms on a quantum computer. SIAM Journal on Computing 26, 1484–1509 (1997).

[2] Viirre, E., Geraci, J. & Silva, G. A. Quantum computing for neuroscience and neuroscience for quantum computing. In: Convergence: Artificial Intelligence and Quantum Computing, 2nd edition (in press).

[3] Huang, D., Wang, M., Wang, J. & Yan, J. A survey of quantum computing hybrid applications with brain-computer interfaces. Cognitive Robotics https://doi.org/10.1016/j.cogr.2022.07.002 (2022).

[4] Zhao, R. & Wang, S. A review of quantum neural networks: Methods, models, dilemma. arXiv arXiv:2109.01840 (2021).

[5] Silva, G. A. Large-scale simulations of the brain may need to wait for quantum computers. Forbes https://www.forbes.com/sites/gabrielasilva/2021/09/02/large-scale-simulations-of-the-brain-may-need-to-wait-for-quantum-computers/?sh=3934b5b07254 (2021).

[6] Miranda, E. et al. An approach to interfacing the brain with quantum computers: Practical steps and caveats. arXiv arXiv:2201.00817 (2022).

[7] Swan, M., Santos, R. P. D. & Lebedev, M. Quantum Computing for the Brain (WSPC, 2022).

[8] Hameroff, S. R. Conscious events as orchestrated space-time selections. Journal of Consciousness Studies 3, 36–53 (1996).

[9] Hameroff, S. R. Quantum computation in brain microtubules? The Penrose-Hameroff ’Orch OR’ model of consciousness. Philosophical Transactions of the Royal Society of London A 356, 1869–1896 (1998).

[10] Kerskens, C. M. & Perez, D. L. Experimental indications of non-classical brain functions. Journal of Physics Communications 6, 105001 (2022).

[11] Silva, G. A. The need for the emergence of mathematical neuroscience: Beyond computation and simulation. Frontiers in Computational Neuroscience 5, https://doi.org/10.3389/fncom.2011.00051 (2011).

[12] Silva, G. A. Why we need mathematics to understand the brain. Medium https://medium.com/cantors-paradise/why-we-need-mathematics-to-understand-the-brain-93c9e44b0186 (2020).

[13] New cell atlases reveal untold variety in the brain and beyond. Quanta Magazine https://www.quantamagazine.org/new-cell-atlases-reveal-untold-variety-in-the-brain-and-beyond-20231213/ (2023).

[14] Herculano-Houzel, S. The human brain in numbers: A linearly scaled-up primate brain. Frontiers in Human Neuroscience doi: 10.3389/neuro.09.031.2009 (2009).

[15] Azevedo, F. A., Carvalho, L. R., Grinberg, L. T., Marcelo, J. & Ferretti, R. E. Equal numbers of neuronal and nonneuronal cells make the human brain an isometrically scaled-up primate brain. Journal of Comparative Neurology doi: 10.1002/cne.21974 (2009).

[16] DeFelipe, J., Marco, P., Busturia, I. & Merchan-Perez, A. Estimation of the number of synapses in the cerebral cortex: methodological considerations. Cerebral Cortex 9, 722–732 (1999).

[17] de Ceglia, R., Ledonne, A., Litvin, D. G. et al. Specialized astrocytes mediate glutamatergic gliotransmission in the CNS. Nature 622, 120–129 (2023).

[18] Wei, J. et al. Emergent abilities of large language models. arXiv https://arxiv.org/abs/2206.07682 (2022).

[19] Venegas-Andraca, S. E. Quantum walks: A comprehensive review. arXiv arXiv:1201.4780 (2012).

[20] Kendon, V. Quantum computing using continuous-time evolution. Interface Focus 10, 20190143 (2020).

[21] Documentary follows implosion of billion-euro brain project. Nature https://www.nature.com/articles/d41586-020-03462-3 (2020).

[22] Buibas, M. & Silva, G. A. A framework for simulating and estimating the state and functional topology of complex dynamic geometric networks. Neural Computation 23, 183–214 (2010).

[23] Silva, G. A. The effect of signaling latencies and node refractory states on the dynamics of networks. Neural Computation 31, 2492–2522 (2019).

[24] Achterberg, J., Akarca, D., Strouse, D., Duncan, J. & Astle, D. E. Spatially embedded recurrent neural networks reveal widespread links between structural and functional neuroscience findings. Nature Machine Intelligence 5, 1369–1381 (2023).

[25] Puppo, F., George, V. K. & Silva, G. A. An optimized function-structure design principle underlies efficient signaling dynamics in neurons. Nature Scientific Reports 8, 10460 (2018).

[26] George, V. K., Puppo, F. & Silva, G. A. Computing temporal sequences associated with dynamic patterns on the C. elegans connectome. Frontiers in Systems Neuroscience doi: 10.3389/fnsys.2021.564124 (2021).

[27] Roy, K., Jaiswal, A. & Panda, P. Towards spike-based machine intelligence with neuromorphic computing. Nature 575, 607–617 (2019).

[28] Schuller, I. K. & Stevens, R. (Roundtable co-chairs). Neuromorphic computing: From materials to systems architecture. Department of Energy https://www.osti.gov/servlets/purl/1283147 (2015).

[29] World-first supercomputer capable of brain-scale simulation being built at Western Sydney University. Western Sydney University https://www.westernsydney.edu.au/icns.

[30] Preskill, J. Quantum computing in the NISQ era and beyond. arXiv https://arxiv.org/abs/1801.00862 (2018).
