Research

We develop novel analytical and algorithmic tools to address questions at the interface of Systems Neuroscience and Applied Mathematics:

Stochastic neural dynamics

How can neural networks reliably process information in spite of biological noise? Can the same neural assemblies exhibit different coding strategies? How do network structure and input drive combine to determine which coding strategy is adopted?

More fundamentally, can a meaningful neural computation be distinguished from spontaneous, and perhaps irrelevant, neural activity? In other words, do neural computations have a physical, observable signature?

Information coding capabilities

How can the flow of behaviorally relevant information be quantified? What are the limits and biophysical constraints on information processing? Do neural networks transfer information close to their theoretical optimum?

More specifically, are certain features of neural networks, e.g., recurrent connections, optimally tuned for efficient information processing? Can design principles be deduced from such optimization requirements?
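One standard way to make such questions quantitative, sketched here only for illustration, is to measure the information that a neural response R conveys about a stimulus S by their Shannon mutual information; the symbols S, R, and the distributions p are generic placeholders rather than quantities defined elsewhere on this page:

\[
I(S;R) = \sum_{s,\,r} p(s,r)\,\log_2 \frac{p(s,r)}{p(s)\,p(r)} .
\]

Biophysical constraints such as noise and metabolic cost then bound how close a given network can come to the corresponding channel capacity, \(C = \max_{p(s)} I(S;R)\).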

Neural assemblies as transport networks

Information processing in the brain involves the transport, transformation, consumption, and release of resources by neural cellular networks. Can neural networks be better understood as efficient transport systems?

In particular, how is the information flow constrained by simple physical requirements such as the conservation of resource fluxes? Can network structures emerge and persist in plastic neural assemblies as a result of competition for resource fluxes? Does information processing in the brain minimize energy dissipation?
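As a minimal illustration of such a physical requirement, consider stationary resource fluxes \(J_{ij}\) carried along the links of a network; the symbols \(J_{ij}\), \(s_i\), and \(C_{ij}\) below are generic placeholders introduced only for this sketch. Conservation of flux at every node \(i\) imposes

\[
\sum_{j} J_{ij} = s_i ,
\]

where \(s_i\) is the net local consumption or release of the resource. In classical models of transport networks, the dissipated power \(\sum_{ij} J_{ij}^{2}/C_{ij}\), with \(C_{ij}\) a link conductance, is minimized subject to this constraint; the question above is whether analogous principles shape structure and information flow in plastic neural assemblies.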


We believe that some of the above questions are relevant to Systems Biology in general. In fact, we have already addressed a number of them in the context of microbial communities, another example of a complex biological system.