

Poster

Rotation-invariant clustering of neuronal responses in primary visual cortex

Emmanouil Froudarakis · Paul Fahey · Edgar Walker · Erick Cobos · Jacob Reimer · Fabian H Sinz · Andreas Tolias · Matthias Bethge · Alexander Ecker · Santiago Cadena · Ivan Ustyuzhaninov


Abstract:

Similar to a convolutional neural network (CNN), the mammalian retina encodes visual information into several dozen nonlinear feature maps, each formed by one ganglion cell type that tiles the visual space in an approximately shift-equivariant manner. Whether such an organization into distinct cell types is maintained at the level of cortical image processing is an open question. Predictive models built upon convolutional features provide state-of-the-art performance and have recently been extended to include rotation equivariance in order to account for the orientation selectivity of V1 neurons. In these models, however, generally no direct correspondence emerges between CNN feature maps and groups of individual neurons, leaving open the question of whether V1 neurons form distinct functional clusters. Here we build upon the rotation-equivariant representation of a CNN-based V1 model and propose a methodology for clustering the representations of neurons in this model to find functional cell types independent of the neurons' preferred orientations. We apply this method to a dataset of 6,000 neurons and visualize the preferred stimuli of the resulting clusters. Our results highlight the range of nonlinear computations in mouse V1.
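The abstract does not spell out the clustering procedure, but the core idea — clustering neuronal representations so that two neurons differing only in preferred orientation land in the same cluster — can be sketched as a k-means variant whose distance minimizes over cyclic shifts of the rotation channels (in a rotation-equivariant network, rotating the preferred orientation corresponds to cyclically permuting the feature copies). Everything below, including the toy data, is an illustrative sketch, not the authors' implementation:

```python
import numpy as np

def shift_distance(v, c):
    """Smallest Euclidean distance between c and any cyclic shift of v.
    Cyclic shifts stand in for rotations of the equivariant channels."""
    return min(np.linalg.norm(np.roll(v, s) - c) for s in range(len(v)))

def best_shift(v, c):
    """Cyclic shift of v that best aligns it with center c."""
    return min(range(len(v)), key=lambda s: np.linalg.norm(np.roll(v, s) - c))

def rotation_invariant_kmeans(X, k, n_iter=30):
    """k-means with a min-over-rotations distance (hypothetical sketch).
    X: (n_neurons, n_rotations) readout weights over rotated feature copies."""
    # Deterministic farthest-point initialization keeps the demo stable.
    centers = [X[0]]
    while len(centers) < k:
        d = [min(shift_distance(x, c) for c in centers) for x in X]
        centers.append(X[int(np.argmax(d))])
    centers = np.stack(centers).astype(float)
    for _ in range(n_iter):
        # Assign each neuron to the nearest center, ignoring rotation.
        labels = np.array([np.argmin([shift_distance(x, c) for c in centers])
                           for x in X])
        # Update each center as the mean of its members after aligning them.
        for j in range(k):
            members = [np.roll(x, best_shift(x, centers[j]))
                       for x in X[labels == j]]
            if members:
                centers[j] = np.mean(members, axis=0)
    labels = np.array([np.argmin([shift_distance(x, c) for c in centers])
                       for x in X])
    return labels, centers

# Toy demo: two "cell types", each observed at random rotations plus noise.
rng = np.random.default_rng(1)
base = np.array([[5.0, 0, 0, 0, 0, 0, 0, 0],
                 [1.0, 1, 1, 1, -1, -1, -1, -1]])
X = np.stack([np.roll(base[i % 2], rng.integers(8)) + 0.1 * rng.normal(size=8)
              for i in range(40)])
labels, centers = rotation_invariant_kmeans(X, k=2)
```

In this sketch the two toy types are recovered regardless of how each sample was rotated, which is the property the abstract's method is after: cluster identity reflects the nonlinear computation, not the preferred orientation.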
