# Past Events (2010)

**30/11/10 11:00 - 12:00**

**ANC Seminar: John Skilling (Host: Iain Murray)**

**Foundations of Computational Inference**

How should we compute Bayesian inference, involving both evidence and posterior, in high dimension? On the negative side, dimension itself is "soft" and not fundamental. Inference needs probability mass, but that requires volume, which is not available from point samples. Direct Monte Carlo needs exponentially many samples, which is impossible. Bridge sampling is an impoverished method that only gives Bayes factors for overlapping models. The harmonic mean is merely a sophisticated evaluation of 0/0. High posterior density (HPD) regions are impossible to locate, and the Chib approximation is blind to local dropouts. Hamiltonian Monte Carlo shows promise, but its particles move uncomfortably fast through regions of large likelihood. Thermal methods such as simulated annealing fail in the presence of phase changes (which are increasingly important). On the positive side, high dimension is actually very simple because all specialised structure falls away. It is properly amenable only to correspondingly simple algorithms. I will present nested sampling as the appropriate control, and Galilean Monte Carlo (GMC) as the appropriate exploration.
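As a concrete illustration of the nested-sampling recursion, here is a toy sketch (not the talk's implementation): a hypothetical Gaussian likelihood under a uniform prior, with naive rejection sampling standing in for the constrained-prior exploration that GMC performs in practice.

```python
import numpy as np

def log_likelihood(theta):
    # Toy likelihood: standard normal in d dimensions (an assumption for
    # illustration only).
    return -0.5 * np.sum(theta**2) - 0.5 * len(theta) * np.log(2 * np.pi)

def nested_sampling(d=2, n_live=50, n_iter=400, half_width=5.0, seed=0):
    """Estimate log Z = log ∫ L(θ) π(θ) dθ for a uniform prior on [-w, w]^d
    via the standard nested-sampling sum  Z ≈ Σ_i L_i ΔX_i."""
    rng = np.random.default_rng(seed)
    live = rng.uniform(-half_width, half_width, size=(n_live, d))
    log_l = np.array([log_likelihood(t) for t in live])
    log_z = -np.inf
    # Prior mass shrinks geometrically: X_i ≈ e^{-i/N}, so the slab width is
    # ΔX_i = e^{-i/N}(e^{1/N} - 1).
    log_shrink = np.log(np.expm1(1.0 / n_live))
    for i in range(1, n_iter + 1):
        worst = np.argmin(log_l)
        log_weight = -i / n_live + log_shrink
        log_z = np.logaddexp(log_z, log_weight + log_l[worst])
        # Replace the worst point with a prior draw above its likelihood.
        # Plain rejection sampling is fine for a toy; real implementations
        # explore the constrained prior with MCMC (or, in the talk, GMC).
        threshold = log_l[worst]
        while True:
            cand = rng.uniform(-half_width, half_width, size=d)
            if log_likelihood(cand) > threshold:
                live[worst], log_l[worst] = cand, log_likelihood(cand)
                break
    return log_z

print(nested_sampling())  # analytic answer is -d*log(2w) ≈ -4.61 for d=2, w=5
```

The rejection step is the part that scales badly, which is exactly why the talk pairs nested sampling with a dedicated exploration method.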

**09/11/10 11:00 - 12:00**

**ANC Seminar: Yee Whye Teh (Host: Iain Murray)**

**Hierarchical Bayesian Models of Language and Text**

In this talk I will present a new approach to modelling sequence data called the sequence memoizer. Unlike most other sequence models, our model does not make any Markov assumptions. Instead, we use a hierarchical Bayesian approach that enforces sharing of statistical strength across the different parts of the model. To make computations with the model efficient, and to better model the power-law statistics often observed in sequence data, we use Bayesian nonparametric priors called Pitman-Yor processes as building blocks in the hierarchical model. We show state-of-the-art results on language modelling and text compression.

This is joint work with Frank Wood, Jan Gasthaus, Cedric Archambeau and Lancelot James.
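The Pitman-Yor building block can be illustrated through its Chinese-restaurant sampling scheme. This toy sketch (not the sequence memoizer itself; parameter values are arbitrary) shows why a positive discount yields the power-law cluster growth that matches word-frequency statistics:

```python
import numpy as np

def pitman_yor_crp(n, discount=0.5, alpha=1.0, seed=0):
    """Sample table assignments from the Pitman-Yor Chinese restaurant
    process: P(existing table k) ∝ count_k - discount, and
    P(new table) ∝ alpha + discount * (#tables)."""
    rng = np.random.default_rng(seed)
    counts = []        # customers per table
    assignments = []
    for _ in range(n):
        k = len(counts)
        probs = np.array([c - discount for c in counts]
                         + [alpha + discount * k])
        probs /= probs.sum()
        choice = rng.choice(k + 1, p=probs)
        if choice == k:
            counts.append(1)      # open a new table
        else:
            counts[choice] += 1
        assignments.append(choice)
    return assignments, counts

# With discount > 0 the number of occupied tables grows like n**discount,
# unlike the logarithmic growth of the Dirichlet process (discount = 0).
_, counts = pitman_yor_crp(10000, discount=0.5, alpha=1.0)
print(len(counts))
```

The sequence memoizer stacks such processes hierarchically, one per context, rather than using a single restaurant as here.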

**02/11/10 11:00 - 12:00**

**ANC Seminar: Rafal Bogacz (Host: Peggy Series)**

**Decision making in the cortico-basal-ganglia circuit**

During this talk I will present computational models describing the decision-making process in the cortico-basal-ganglia circuit. In the first part I will review models describing how the speed and accuracy of decisions are controlled in the cortico-basal-ganglia circuit, and present results of a recent experiment attempting to distinguish between these models. In the second part of the talk, I will present a model assuming that the cortico-basal-ganglia circuit performs a statistically optimal test that maximizes the speed of decisions for any required accuracy. In the model, this circuit computes the probabilities that the considered alternatives are correct, according to Bayes’ theorem. The talk will show that the equation of Bayes’ theorem can be mapped onto the functional anatomy of a circuit involving the cortex, basal ganglia and thalamus. This theory provides many precise and counterintuitive experimental predictions, ranging from neurophysiology to behaviour. Some of these predictions have already been validated in existing data, and others are the subject of ongoing experiments.
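The kind of Bayesian accumulation such models attribute to this circuit can be illustrated abstractly. The sketch below is a generic sequential test with hypothetical Gaussian evidence channels and arbitrary parameters, not the model's anatomical mapping; it shows how raising the posterior threshold trades speed for accuracy:

```python
import numpy as np

def decide(true_alt=0, n_alt=2, drift=0.2, sigma=1.0,
           threshold=0.95, seed=1, max_steps=100_000):
    """Update P(alternative | evidence so far) with Bayes' theorem at each
    step and stop when one posterior crosses `threshold`."""
    rng = np.random.default_rng(seed)
    log_post = np.full(n_alt, -np.log(n_alt))        # uniform prior
    for t in range(1, max_steps + 1):
        x = rng.normal(0.0, sigma, size=n_alt)        # noisy sensory input
        x[true_alt] += drift
        # Under hypothesis i, channel i has mean `drift` and the rest mean 0,
        # so the log-likelihoods differ across hypotheses only through:
        log_post += (drift * x - 0.5 * drift**2) / sigma**2
        log_post -= np.logaddexp.reduce(log_post)     # renormalise
        if np.max(log_post) > np.log(threshold):
            return int(np.argmax(log_post)), t
    return int(np.argmax(log_post)), max_steps

choice, steps = decide(threshold=0.95)
_, slow_steps = decide(threshold=0.999)   # same evidence stream, later stop
print(choice, steps, slow_steps)
```

A stricter threshold can only be crossed at or after the looser one on the same evidence stream, which is the speed-accuracy trade-off in its simplest form.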

**19/10/10 11:00 - 12:00**

**ANC Seminar: Mark Girolami (Host: Charles Sutton)**

**Manifold Markov chain Monte Carlo methods**

In this talk, Metropolis-adjusted Langevin and Hamiltonian Monte Carlo sampling methods defined on the Riemann manifold are proposed. The methods can provide automated adaptation mechanisms that circumvent the costly pilot runs that are required to tune proposal densities for Metropolis-Hastings or indeed Hamiltonian Monte Carlo and Metropolis-adjusted Langevin algorithms. This allows for efficient sampling even in very high dimensions, where different scalings may be required for the transient and stationary phases of the Markov chain. The methodology proposed exploits the Riemann geometry of the parameter space of statistical models and thus automatically adapts to the local structure when simulating paths across this manifold, providing highly efficient convergence and exploration of the target density. The performance of these manifold Monte Carlo methods is assessed on mixture models, logistic regression models, log-Gaussian Cox point processes, stochastic volatility models and Bayesian estimation of dynamic systems described by non-linear differential equations. The strengths and weaknesses of the proposed methods are discussed.
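For orientation, here is the Euclidean special case of one of the proposed samplers: a plain Metropolis-adjusted Langevin step with an identity metric on a toy Gaussian target. The talk's manifold methods replace the identity with a position-dependent Riemann metric (e.g. the Fisher information), which this sketch deliberately omits.

```python
import numpy as np

def mala(log_p, grad_log_p, x0, step=0.1, n=5000, seed=0):
    """Metropolis-adjusted Langevin algorithm with an identity metric
    (the Euclidean special case of the manifold samplers)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    samples = []
    for _ in range(n):
        # Langevin proposal: drift up the gradient, plus Gaussian noise.
        mean = x + 0.5 * step * grad_log_p(x)
        prop = mean + np.sqrt(step) * rng.standard_normal(x.shape)
        mean_rev = prop + 0.5 * step * grad_log_p(prop)
        # log q(x | prop) - log q(prop | x): corrects for proposal asymmetry.
        log_q_diff = (np.sum((prop - mean)**2)
                      - np.sum((x - mean_rev)**2)) / (2 * step)
        if np.log(rng.uniform()) < log_p(prop) - log_p(x) + log_q_diff:
            x = prop
        samples.append(x.copy())
    return np.array(samples)

# Toy target: standard 2-D Gaussian, where mean 0 and variance 1 are exact.
log_p = lambda x: -0.5 * np.sum(x**2)
grad_log_p = lambda x: -x
s = mala(log_p, grad_log_p, np.zeros(2), step=0.5)
print(s.mean(axis=0), s.var(axis=0))
```

On an isotropic target the identity metric is already well scaled; the manifold machinery earns its cost on targets whose curvature varies across the space.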

**13/10/10 16:00 - 17:00**

**ANC Seminar: Nir Friedman (Host: Guido Sanguinetti)**

**Continuous-Time Message Passing**

Continuous-time Bayesian networks are a compact modeling language for multi-component processes that evolve continuously over time. Inference is a crucial element required for learning such models from observations and using them for prediction. In this talk I will introduce a variational principle from which we derive two approximate inference algorithms, natural extensions of mean-field and belief propagation to continuous-time processes. Both algorithms involve passing messages representing inhomogeneous Markov processes over subsets of components. This leads to relatively simple procedures that allow incorporating adaptive ordinary differential equation solvers to perform individual steps. The two algorithms have complementary characteristics: the mean-field algorithm provides a lower bound on the probability of evidence, which makes it attractive for learning applications, while belief propagation provides quite accurate empirical results on queries about the posterior distribution.

**12/10/10 11:00 - 12:00**

**ANC Seminar: Tobi Delbruck (Host: Matthias Hennig)**

**Tobi Delbruck, Inst. of Neuroinformatics, University of Zurich and ETH Zurich** <http://www.ini.uzh.ch/~tobi>

**Dynamic vision events from a spiking silicon retina**

How do our eyes work? Not much like your digital camera! Rather than sending our brain stroboscopic sequences of still photographs, our eyes output highly compressed asynchronous digital impulses that already take a big step towards vision. In this seminar I will demonstrate how a silicon retina chip we developed is used to cheaply solve fast robotic vision problems, and how it achieves this by emulating notions from real retinas like local gain control and asynchronous spike-based digital output. This talk will include live demonstrations of the vision sensor.

**21/09/10 11:00 - 12:00**

**ANC Seminar: Matthias Bethge (Host: Matthias Hennig)**

**How effective are neural model representations of natural images in modeling higher-order correlations?**

The physical structure of objects manifests itself in the higher-order correlations of natural images. The content of an image is still recognized easily if one removes all structure responsible for second-order correlations but otherwise leaves the higher-order correlations untouched. Since modeling higher-order correlations is a challenging task, it can be highly informative to examine to what extent neural model representations have the capacity to account for them. In this talk I will give an overview of the research done in my lab over the last 3-4 years to quantify the potential of different mechanisms to capture these higher-order correlations. I will also present a new model which achieves significantly improved performance over previous ones.

**18/05/10 11:00 - 12:00**

**ANC Seminar: Tom Heskes (Host: Charles Sutton)**

**Approximating marginals in latent Gaussian models**

In a seminal paper (Journal of the Royal Statistical Society, Series B, 2009), Rue, Martino and Chopin propose a nested approach for approximating posterior marginals in latent Gaussian models. They show that their deterministic procedure, based on Laplace's method, reaches excellent accuracy in a fraction of the time that Monte Carlo sampling methods take to obtain similar performance. However, for a relevant class of models, Laplace's method is seriously off. We show how expectation propagation can in such cases still provide excellent approximations at comparable (and sometimes even better) computational efficiency. [Joint work with Botond Cseke.]
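The Laplace ingredient at the heart of the nested approach is easy to show on a one-parameter toy model where the exact evidence is known. This is a hypothetical beta-binomial example, not one of the latent Gaussian models of the talk:

```python
import numpy as np
from math import lgamma

def laplace_log_evidence(log_joint, d_log_joint, dd_log_joint, x0, n_newton=50):
    """Laplace approximation to log ∫ exp(log_joint(θ)) dθ for scalar θ:
    find the mode by Newton's method, then integrate the fitted Gaussian."""
    x = x0
    for _ in range(n_newton):
        x = x - d_log_joint(x) / dd_log_joint(x)
    return log_joint(x) + 0.5 * np.log(2 * np.pi / -dd_log_joint(x))

# Toy model: Beta(2, 2) prior on a coin's bias, 7 heads and 3 tails observed.
a, b, h, t = 2.0, 2.0, 7, 3
log_beta_ab = lgamma(a) + lgamma(b) - lgamma(a + b)
log_joint = lambda p: ((a + h - 1) * np.log(p)
                       + (b + t - 1) * np.log(1 - p) - log_beta_ab)
d_lj  = lambda p: (a + h - 1) / p - (b + t - 1) / (1 - p)
dd_lj = lambda p: -(a + h - 1) / p**2 - (b + t - 1) / (1 - p)**2

# Exact evidence: B(a + h, b + t) / B(a, b).
exact = (lgamma(a + h) + lgamma(b + t) - lgamma(a + b + h + t)) - log_beta_ab
print(laplace_log_evidence(log_joint, d_lj, dd_lj, 0.5), exact)
```

Here the approximation lands within about 0.06 nats of the truth; the talk's point is precisely that for some model classes this gap becomes large, and expectation propagation closes it.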

**11/05/10 11:00 - 12:00**

**ANC Seminar: Ben Vincent (Host: Mark van Rossum)**

**The visual system and energy efficiency**

Many theories of neural processing focus, quite rightly, on computational and information-processing concerns. While this is entirely sensible, the brain is not an abstract computational device: it generates heat, it is noisy, and it requires a high energy-density diet to power it. Much of my work focuses on the role of such biophysical factors. In a series of studies using simple neural networks and Bayesian models, my colleagues and I have found that the remarkably simple notion of ‘do work, whilst being energy efficient’ can explain multiple properties of the neural organisation of early sensory systems.

**27/04/10 11:00 - 12:00**

**ANC Seminar: Michael Daw (Host: Mark van Rossum)**

**Coordinated development of feedforward inhibition in neonatal cortex**

Early changes in the expression of neuronal chloride transporters result in a developmental switch at GABAergic synapses from depolarising transmission to the hyperpolarising transmission which is typical in the adult brain. Studies in a number of brain areas suggest that depolarising GABA may act as the major excitatory transmitter in neonates. In this context we studied the development of thalamocortical (TC) feedforward inhibition (FFI). In neonates few interneurons are activated by TC axons and GABAA receptors reverse close to resting potential. FFI increases rapidly in the first postnatal week as a result of both a shift in GABAA reversal potential and a simultaneous recruitment of fast-spiking interneurons. As such, in this system, the relatively depolarised reversal potential at GABAergic synapses may not be excitatory per se but rather allow the recruitment of interneurons without imposing functional inhibition.

**26/03/10 11:00 - 12:00**

**ANC Seminar: Matthias Seeger, Saarland University (Host: Chris Williams)**

**Bounding the Gaussian Process Information Gain: Applications to PAC-Bayesianism and Analyzing GP Bandit Optimization**

PAC-Bayesian theorems provide the tightest known generalization error bounds for certain GP and kernel methods, and closely related ideas lead to information consistency rates for GP prediction. Roughly, they are tight because of their fine dependence on (1) prior and model assumptions, and (2) the data itself. While (1) is generally beneficial, (2) calls for further analysis of maximum fluctuations over samples, certainly if results are to be applied to active sampling scenarios.

We show that in several rather different settings, the information gain turns out to dominate the data-dependent part. We show how this term can be bounded, both in expectation and in the worst case (under mild assumptions on the input distribution), in terms of rate bounds on the kernel operator spectrum, which are known for a number of frequently used kernels, such as the Gaussian and Matérn kernels. We give applications to information consistency rates for GP prediction and to the analysis of a Bayesian GP optimization algorithm.

Joint work with Sham Kakade, Andreas Krause, Niranjan Srinivas, Dean Foster
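The information-gain quantity itself is straightforward to compute for a given kernel matrix. This sketch (an illustrative RBF example with arbitrary parameters, not the talk's bounds) shows its slow growth with sample size, a consequence of the fast spectral decay the bounds exploit:

```python
import numpy as np

def information_gain(K, noise_var=0.1):
    """Mutual information I(y_A; f) = 0.5 * log det(I + K / noise_var) for a
    GP observed with iid Gaussian noise at the points with kernel matrix K."""
    n = K.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(n) + K / noise_var)
    return 0.5 * logdet

def rbf(xs, ell=1.0):
    # Squared-exponential (Gaussian) kernel on 1-D inputs.
    d = xs[:, None] - xs[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

# Densifying a fixed interval adds far less than proportionate information,
# because the kernel's eigenvalues decay quickly.
for n in (10, 100):
    xs = np.linspace(0.0, 5.0, n)
    print(n, information_gain(rbf(xs)))
```

For the Gaussian kernel the eigenvalues decay near-exponentially, so the gain grows only polylogarithmically in n; Matérn kernels decay polynomially and give correspondingly larger gains.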

**19/03/10 13:00 - 14:00**

**ANC Seminar: Dan Butts, University of Maryland Dept. (Host: Matthias Hennig)**

**Beyond the receptive field: the role of inhibition in formatting visual information**

Although the visual pathway is well characterized anatomically and physiologically, our understanding of how it processes information is relatively impoverished, due in part to our reliance on the “linear receptive field” as a description of neuronal function. The receptive field describes the average visual stimulus that a neuron responds to, and is known to break down in describing cortical neurons (such as “complex cells”) whose response involves nonlinear combinations of more than one visual feature. I will describe a general nonlinear (GN) modeling framework that can describe multiple receptive field elements and be estimated using standard extracellular recordings. Surprisingly, this modeling approach reveals that neurons in the retina and LGN -- thought to be relatively linear -- generate responses through the interplay of nonlinear excitatory and inhibitory receptive fields. This interplay is responsible for the precise timing of LGN spike trains, and is also likely responsible for many aspects of adaptation to stimulus contrast. This suggests novel functional roles for interneurons in the retina and LGN and, by demonstrating how visual information is formatted across the early stages of the visual pathway, provides insight into what may be important for subsequent cortical processing.

**02/03/10 14:00 - 15:00**

**ANC Seminar: Christophe Andrieu (Host: Charles Sutton)**

**Particle MCMC**

Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) methods have emerged as the two main tools to sample from high dimensional probability distributions. Although asymptotic convergence of MCMC algorithms is ensured under weak assumptions, the performance of these algorithms is unreliable when the proposal distributions that are used to explore the space are poorly chosen and/or if highly correlated variables are updated independently. We show how it is possible to build efficient high dimensional proposal distributions by using SMC methods. This allows us not only to improve over standard MCMC schemes but also to make Bayesian inference feasible for a large class of statistical models where this was not previously so. We demonstrate these algorithms on a non-linear state space model and a Lévy-driven stochastic volatility model.
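The key ingredient of particle MCMC is an unbiased sequential Monte Carlo estimate of the marginal likelihood, which can then be plugged into an ordinary Metropolis-Hastings acceptance ratio. Here is a bootstrap-filter sketch for a toy linear-Gaussian state-space model (illustrative only; the model and parameter values are hypothetical):

```python
import numpy as np

def particle_log_lik(y, phi, sigma_x, sigma_y, n_particles=500, seed=0):
    """Bootstrap particle filter estimate of log p(y | phi, sigma_x, sigma_y)
    for the toy model  x_t = phi x_{t-1} + N(0, sigma_x^2),
    y_t = x_t + N(0, sigma_y^2),  x_1 ~ N(0, sigma_x^2)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_x, n_particles)
    log_lik = 0.0
    for t, obs in enumerate(y):
        if t > 0:
            x = phi * x + rng.normal(0.0, sigma_x, n_particles)  # propagate
        logw = (-0.5 * ((obs - x) / sigma_y) ** 2
                - np.log(sigma_y * np.sqrt(2.0 * np.pi)))        # weight
        m = logw.max()
        log_lik += m + np.log(np.mean(np.exp(logw - m)))  # log p(y_t | y_<t)
        w = np.exp(logw - m)
        x = rng.choice(x, size=n_particles, p=w / w.sum())       # resample
    return log_lik

# Simulate data from the model and evaluate the estimator at the true phi.
rng = np.random.default_rng(42)
T, phi, sx, sy = 100, 0.9, 1.0, 1.0
xs = np.empty(T)
xs[0] = rng.normal(0.0, sx)
for t in range(1, T):
    xs[t] = phi * xs[t - 1] + rng.normal(0.0, sx)
ys = xs + rng.normal(0.0, sy, T)
print(particle_log_lik(ys, phi, sx, sy))
```

A particle marginal Metropolis-Hastings sampler would propose new parameter values and accept or reject them using exactly this noisy likelihood estimate; its unbiasedness is what keeps the resulting chain exactly invariant for the true posterior.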

**16/02/10 11:00 - 12:00**

**ANC Seminar: Marlene Bartos - University of Aberdeen (Host: Mark van Rossum)**

**Postnatal differentiation of basket cell networks from slow to fast signaling devices**

Gamma frequency (30-100 Hz) oscillations in the mature cortex underlie higher cognitive functions. Fast signaling in GABAergic interneuron networks plays a key role in the generation of these oscillations. During development of the rodent brain, however, gamma activity appears at the end of the first postnatal week, but frequency and synchrony reach adult levels only by the fourth week. The mechanisms underlying the maturation of gamma activity are unclear. Here we demonstrate that hippocampal basket cells (BCs), the proposed cellular substrate of gamma oscillations, undergo marked changes in their morphological, intrinsic and synaptic properties between postnatal days (P) 6-25. During maturation, action potential duration, propagation time, duration of the release period, and decay time constant of inhibitory postsynaptic currents decrease by ~30-60%. Thus, postnatal development converts BCs from slow into fast signaling devices. Computational analysis reveals that BC networks with young intrinsic and synaptic properties as well as reduced connectivity generate oscillations with moderate coherence in the lower gamma frequency range. In contrast, BC networks with mature properties and increased connectivity generate highly coherent activity in the upper gamma frequency band. Thus, late postnatal maturation of BCs enhances coherence in neuronal networks and will thereby contribute to the development of cognitive brain functions.

**02/02/10 11:00 - 12:00**

**ANC Seminar: Ian Duguid (Host: Mark van Rossum)**

**Sensory information processing in the cerebellum**

GABAA receptor-mediated synaptic inhibition is essential for regulating the excitability of neuronal networks throughout the mammalian brain. Activation of inhibitory synapses results in a transient ‘phasic’ form of inhibition that regulates the window for temporal summation of discrete excitatory inputs thus enforcing precisely timed output spikes. In addition to phasic inhibition, a variety of neurons also express extrasynaptic GABAA receptors that mediate a persistent ‘tonic’ inhibitory conductance that shapes the voltage response to synaptic input by decreasing the membrane time constant and narrowing the temporal window for synaptic integration. Despite over a decade of investigation, the physiological role of tonic inhibition, particularly in the context of sensory information processing, remains unknown. Our recent research, using whole-cell recordings from single cerebellar neurons in vivo and computational modelling, demonstrates that tonic inhibition controls the ability of neurons to discriminate between salient sensory information and ongoing network activity. Thus, tonic inhibition appears to act as a tunable filter that optimises the flow of sensory information through the cerebellar cortex.

**19/01/10 11:00 - 12:00**

**ANC Seminar: Kei Ito (Host: Douglas Armstrong)**

**Flybrain Neuron Database, a comprehensive online database of the Drosophila brain neurons**

Morphological and functional knowledge of the neurons of the Drosophila melanogaster brain has accumulated rapidly, thanks to recent improvements in visualization techniques. Identified neurons, however, have been reported in diverse publications with inconsistent formats, making it very difficult to acquire a comprehensive overview of what is known and what remains uninvestigated. To address this problem, we developed an online database, called the Flybrain Neuron Database, which aims to collect information about all the Drosophila brain neurons reported so far. Images and verbal descriptions of the neuron names, projection sites, etc., are provided. The database currently carries records of about 360 types of neurons, which may already cover most of the known neurons in the fly brain.

In addition, the database provides information about the three-dimensional labeling patterns of useful molecular markers that label specific subsets of neurons, such as antibodies as well as GAL4 and LexAV enhancer-trap strains. Unlike previous databases of other brains, we also implemented an interactive browsing system for the brain structures. The tool provides volume-rendered images of the brains, generated from raw confocal section image stacks upon users' requests. Users can rotate, magnify, and cut the brain at any angle, so that the relative positions and three-dimensional morphologies of the brain structures can easily be figured out.

To pass our knowledge on to our descendants, such digital atlases should be maintained for a very long period, well over 100 years, like printed atlases. Long-term maintenance of databases, however, is not easy. Technical and organizational measures to enable such long-term archiving should be considered seriously. Some of the topics concerning this problem will be discussed.