The Computational Challenges of Connectomics
Date and Time
Friday, March 16, 2012 - 12:00pm to 1:00pm
Green Hall 0-S-6
Sebastian Seung, Massachusetts Institute of Technology
According to a doctrine known as connectionism, brain function and dysfunction depend primarily on patterns of connectivity between neurons. Connectionism has been explored theoretically with mathematical models of neural networks since the 1940s, but it has proved difficult to test these models through activity measurements alone. Conclusive empirical tests also require information about neural connectivity, which could be provided by new imaging methods based on serial electron microscopy. The bottleneck in using these methods is now shifting to the data analysis problem of extracting neural connectivity from the images. New multibeam scanning electron microscopes will soon generate a petabyte of image data from a cubic millimeter of brain tissue every two weeks. From such images it should be possible, in principle, to map every connection between neurons in the volume. Unfortunately, it could take a single person up to a million years to carry out this feat manually. Clearly, our capacity to acquire "big data" from the brain has far outpaced our ability to analyze it. My lab has been developing computational technologies to deal with this data deluge, including image analysis by artificial intelligence (AI) based on machine learning, and methods for "crowdsourcing" image analysis to human intelligence. These technologies have been implemented in eyewire.org, an online community that mobilizes laypersons to help map the neural connections of the retina by interacting with AI.
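The scale mismatch in the abstract can be made concrete with a back-of-envelope estimate. The petabyte-per-cubic-millimeter figure is from the abstract; the manual annotation rate and working hours below are purely illustrative assumptions chosen to show how such a calculation reaches the million-year range, not figures from the talk.

```python
# Back-of-envelope estimate of manual annotation time for one cubic
# millimeter of brain tissue. Only the dataset size comes from the
# abstract; the throughput figures are illustrative assumptions.

PETABYTE = 10**15                # bytes in one petabyte
dataset_bytes = 1 * PETABYTE     # image volume per cubic millimeter (from the abstract)

bytes_per_hour = 500_000         # assumed manual tracing rate: ~0.5 MB of imagery per hour
hours_per_year = 2_000           # assumed full-time working hours per year

hours = dataset_bytes / bytes_per_hour
years = hours / hours_per_year
print(f"{years:,.0f} person-years")  # on these assumptions, about a million
```

Under these assumed rates the estimate lands at roughly a million person-years, which is why the abstract argues that AI-assisted and crowdsourced analysis, rather than manual tracing, is the only viable path.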