CEMNet

A unified framework for perceptual inference in sensory cortices


Abstract

The neural mechanisms that underlie perceptual inference in cortex are a matter of intense research. Neural implementations of Bayesian models have proved useful for uncovering neural computations in a variety of paradigms. Still, no existing neurocomputational model provides a general framework for understanding perception in naturalistic conditions. Indeed, these models suffer from three shortcomings: i) they cannot perform accurate inference in naturalistic environments where numerous objects interact in complex ways; ii) they cannot simultaneously represent likelihood estimates over the presence of several objects in the same graphical model; iii) they cannot deal with correlated sensory evidence, even though such correlations are ubiquitous in sensory systems. We propose a new unifying model of perception, the Constrained Entropy Maximization Network (CEMNet), that provides a theoretical framework for inference in complex naturalistic environments. CEMNet stores an internal model of the environment by representing regularities across stochastic variables as constraints; those constraints shape the response of the network. The aim of the project is to develop the computational model of CEMNet and test it empirically. We will first study how CEMNet can be implemented in a realistic neural network. Second, we will simulate the network and show that, unlike existing neural models, CEMNet can cope with the difficulties of inference in complex environments. Third, a behavioral experiment will monitor how human subjects integrate correlated sensory evidence. Last, we will probe the ability of human subjects to simultaneously maintain likelihood estimates over multiple interacting variables in an audiovisual motion integration paradigm. We will record neural signals with magnetoencephalography (MEG) while subjects perform the task and look for a specific neural signature of CEMNet signals.
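As background for the idea of representing regularities as constraints, the general principle of constrained entropy maximization can be illustrated with a standard, generic example (this sketch is the textbook maximum-entropy computation, not the CEMNet implementation; the function name and the die example are illustrative assumptions). Among all distributions consistent with a known regularity, here a fixed mean, the maximum-entropy distribution takes a Gibbs form, p_i ∝ exp(λ x_i), with the multiplier λ chosen so the constraint is met:

```python
import math

def maxent_dist(values, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over discrete `values` given a mean constraint.

    The solution has the Gibbs form p_i ∝ exp(lam * x_i); the multiplier
    lam is found by bisection, since the resulting mean is monotonically
    increasing in lam.
    """
    def mean_at(lam):
        w = [math.exp(lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_at(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# A "loaded die" example: six faces constrained to average 4.5.
# The constraint tilts an otherwise uniform distribution toward high faces.
p = maxent_dist([1, 2, 3, 4, 5, 6], 4.5)
```

With no constraint beyond normalization, the same principle yields the uniform distribution; adding the mean constraint tilts the probabilities exponentially toward the larger faces, which is the sense in which constraints "shape the response" in any maximum-entropy system.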

Marie Curie fellow: Alexandre Hyafil

PI and Supervisor: Gustavo Deco

Funded by: European Union’s Seventh Framework Programme for research, technological development and demonstration. EU Marie Curie Intra-European Fellowship.

Project reference: 629613