ANC Workshop: Matthias Hennig and Alireza Alemi, Chair: Emilia Wysocka
When 
May 03, 2016 from 11:00 AM to 12:00 PM 
Where  4.31/4.33 
Matthias Hennig
"Unsupervised spike sorting for large scale, high density multielectrode arrays"
Over the past years, we have developed a set of methods and tools to analyse large-scale, high-density multielectrode array recordings. I will give an overview of this work, and specifically explain how we solved the high-dimensional clustering problem of assigning millions of events detected in these recordings to single neurons.
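The generic shape of such a pipeline (not the speaker's specific method) is: detect threshold-crossing events, cut out waveforms, reduce their dimensionality, and cluster. A minimal sketch on synthetic data, with all signal parameters and the PCA-plus-k-means choice being illustrative assumptions:

```python
# Hedged sketch of a generic spike-sorting pipeline on synthetic data:
# threshold detection -> waveform extraction -> PCA -> k-means clustering.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic single-channel recording: two units with distinct spike shapes.
n_samples, width = 20000, 20
template_a = -np.hanning(width) * 5.0
template_b = -np.hanning(width) * 5.0 * np.linspace(0.4, 1.6, width)
signal = rng.normal(0, 0.5, n_samples)
for t in range(100, n_samples - 100, 200):
    signal[t:t + width] += template_a if rng.integers(2) == 0 else template_b

# 1) Event detection: negative threshold crossings, robust noise estimate.
thresh = -4 * np.median(np.abs(signal)) / 0.6745
crossings = np.flatnonzero((signal[1:] < thresh) & (signal[:-1] >= thresh))

# 2) Waveform extraction with a refractory gap to avoid double detections.
events, last = [], -width
for c in crossings:
    if c - last > width:
        events.append(c)
        last = c
waveforms = np.array([signal[e:e + width] for e in events
                      if e + width <= n_samples])

# 3) Dimensionality reduction: project onto the top two principal components.
X = waveforms - waveforms.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
features = X @ Vt[:2].T

# 4) Clustering: tiny k-means with k=2 (real recordings need far more care).
centers = features[rng.choice(len(features), 2, replace=False)]
for _ in range(20):
    dists = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    for k in range(2):
        if np.any(labels == k):
            centers[k] = features[labels == k].mean(axis=0)
```

At the scale described in the talk (millions of events, thousands of channels), each of these steps has to be replaced by something far more scalable, which is precisely the problem the work addresses.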
Alireza Alemi
"Optimizing information storage in recurrent neural networks"
Recurrent neural networks can store memory patterns as fixed-point attractors of their dynamics. A standard measure of performance for attractor networks is their storage capacity, which is 2N patterns for a conventional recurrent network of N binary neurons. The three-threshold learning rule (3TLR) can store up to this maximal capacity without relying on an explicit "error signal". However, this storage capacity corresponds to a maximal information capacity of only 2 bits per weight for unconstrained weights, which is far from ideal: in a noiseless theoretical scenario, a single unconstrained weight could store an unbounded number of bits, a capacity that conventional attractor networks cannot approach.
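The idea of "patterns as fixed points" can be sketched concretely. The toy below trains each neuron's incoming weights with an explicit error-driven perceptron rule (a stand-in for 3TLR, which by contrast needs no error signal; network size and load are illustrative assumptions), then checks that every stored pattern is a fixed point of the sign dynamics:

```python
# Sketch: storing random binary patterns as fixed points of a recurrent net.
# Perceptron updates per neuron stand in for 3TLR (which is error-free);
# N and P are illustrative, with load P/N = 0.5, well below the 2N bound.
import numpy as np

rng = np.random.default_rng(1)
N, P = 50, 25
patterns = rng.choice([-1, 1], size=(P, N))

W = np.zeros((N, N))
for _ in range(200):
    updated = False
    for xi in patterns:
        wrong = (W @ xi) * xi <= 0        # neurons whose local field disagrees
        if wrong.any():
            W[wrong] += np.outer(xi[wrong], xi) / N
            updated = True
    np.fill_diagonal(W, 0.0)              # no self-connections
    if not updated:
        break

# Every stored pattern should now be a fixed point of x -> sign(W x).
fixed = all(np.array_equal(np.sign(W @ xi), xi) for xi in patterns)
```

Counting bits makes the 2 bits/weight figure plausible: at capacity, 2N patterns of N binary neurons carry on the order of 2N·N bits, spread over N² weights.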
Here, I propose a hierarchical attractor network that can achieve an ultra-high information capacity. The network has two layers: a visible layer with N_{v} neurons and a hidden layer with N_{h} neurons. The visible-to-hidden connections are set at random and kept fixed during the training phase, in which the memory patterns are stored as fixed points of the network dynamics. The hidden-to-visible connections, initially normally distributed, are learned via 3TLR. My simulations suggest that the maximal information capacity grows exponentially with the expansion ratio N_{h}/N_{v}. As a first-order approximation to understand the mechanism behind this high capacity, I performed a naive mean-field approximation (nMFA) of the network. The rapid increase in capacity was captured by the nMFA, revealing that a key underlying factor is the correlation between the hidden and the visible units. The nMFA can be reformulated as a perceptron problem that is amenable to an analytical calculation (the so-called replica method), whose result is in agreement with the simulations. Additionally, at maximal capacity, the degree of symmetry of the connectivity between the hidden and the visible neurons was observed to increase with the expansion ratio. These results highlight the role of hierarchical neural architecture in information storage.
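The two-layer construction can be sketched as follows. A fixed random expansion maps each visible pattern to a hidden code, and only the hidden-to-visible weights are trained so that each pattern survives one pass around the loop v → h = sign(Av) → sign(Bh). Perceptron updates again stand in for 3TLR, and the sizes are illustrative assumptions; note that the pattern count here exceeds the 2N_v bound of a conventional single-layer network:

```python
# Sketch of the hierarchical idea: fixed random visible-to-hidden expansion,
# learned hidden-to-visible weights. Perceptron updates stand in for 3TLR;
# sizes are illustrative. P = 100 > 2*Nv = 80 exceeds a conventional net's
# pattern capacity, but each visible unit sees Nh inputs, so its own
# perceptron load P/Nh = 0.625 remains feasible.
import numpy as np

rng = np.random.default_rng(2)
Nv, Nh, P = 40, 160, 100                      # expansion ratio Nh/Nv = 4
patterns = rng.choice([-1, 1], size=(P, Nv))

A = rng.normal(size=(Nh, Nv))                 # visible-to-hidden: random, fixed
B = rng.normal(0, 1 / np.sqrt(Nh), (Nv, Nh))  # hidden-to-visible: learned
H = np.sign(patterns @ A.T)                   # hidden codes, fixed by A

for _ in range(500):
    updated = False
    for xi, h in zip(patterns, H):
        wrong = (B @ h) * xi <= 0             # visible units with bad field
        if wrong.any():
            B[wrong] += np.outer(xi[wrong], h) / Nh
            updated = True
    if not updated:
        break

# A pattern is stored if one pass around the loop reproduces it exactly.
recalled = np.sign(np.sign(patterns @ A.T) @ B.T)
stored_all = bool(np.array_equal(recalled, patterns))
```

This toy only shows that the expansion lets the network hold more patterns than 2N_v; the abstract's stronger claim, exponential growth of information capacity with N_h/N_v, comes from the speaker's simulations and the nMFA analysis.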