Colloquium

Memorization and Association on a Realistic Neural Model

Date and Time
Friday, March 11, 2005 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Leslie G. Valiant, from Harvard University
Host
Sanjeev Arora
A central open question of computational neuroscience is to identify the data structures and algorithms that are used in mammalian cortex to support successive acts of the basic cognitive tasks of memorization and association. This talk addresses the simultaneous challenges of realizing these two distinct tasks with the same data structure, and doing so while respecting the following four basic quantitative parameters of cortex: the neuron number, the synapse number, the synapse strengths, and the switching times. Previous work had apparently not succeeded in reconciling all these opposing constraints; the low synapse strengths typically observed experimentally posed a particular obstacle. We describe a computational scheme that supports both memory formation and association, and is feasible on networks of model neurons that respect the widely observed values of the above-mentioned four quantitative parameters. Our scheme allows for both disjoint and shared representations. The algorithms are simple, and in one version both memorization and association require just one step of neighborly influence. The issues of interference among the different circuits that are established, of robustness to noise, and of the stability of the hierarchical memorization process are addressed. A calculus is thereby implied for analyzing the capabilities of particular neural systems and subsystems, in terms of their basic numerical parameters.

Interaction and Audiovisual Abstraction

Date and Time
Wednesday, March 9, 2005 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Golan Levin, from CMU
Host
Adam Finkelstein
Golan Levin is an artist, composer, performer and engineer interested in developing artifacts and events which explore supple new modes of reactive expression. His work focuses on the design of systems for the creation, manipulation and performance of simultaneous image and sound, as part of a more general inquiry into the formal language of interactivity, and of nonverbal communications protocols in cybernetic systems. Through performances, digital artifacts, and virtual environments, often created with a variety of collaborators, Levin applies creative twists to digital technologies that highlight our relationship with machines, make visible our ways of interacting with each other, and explore the intersection of abstract communication and interactivity. Identified by Technology Review as one of the world's "Top 100 Innovators Under 35" [2004], and dubbed by El Pais as "one of the most brilliant figures in contemporary audiovisual art" [2002], Levin has exhibited widely in Europe, America and Asia.

Computational Foundations for Statistical Learning: Enabling Massive Science

Date and Time
Wednesday, March 2, 2005 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Alexander Gray, from CMU
Host
Jaswinder Singh
The data sciences (statistics, and recently machine learning) have always been part of the underpinning of all of the natural sciences. "Massive datasets" represent potentially unprecedented capabilities in a growing number of fields, but most of this potential remains unlocked, due to the computational intractability of the most powerful statistical learning methods. The computational problems underlying many of these methods are related to some of the hardest problems of applied mathematics, but have unique properties which make classical solution classes inappropriate. I will describe the beginnings of a unified framework for a large class of problems, which I call generalized N-body problems. The resulting algorithms, which I call multi-tree methods, appear to be the fastest practical algorithms to date for several foundational problems. I will describe four examples -- all-nearest-neighbors, kernel density estimation, distribution-free Bayes classification, and spatial correlation functions -- and touch on two more recent projects, kernel matrix-vector multiplication and high-dimensional integration. I'll conclude by showing examples where these algorithms are enabling previously intractable data analyses at the heart of major modern scientific questions in cosmology and fundamental physics.
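To make the "generalized N-body" flavor of these problems concrete, here is a minimal sketch of naive kernel density estimation, one of the four examples above: every query point must be summed against every data point, an O(N·M) pairwise computation. This toy one-dimensional version is an illustration of the problem structure only, not the speaker's multi-tree algorithm, which approximates these sums far faster by pruning pairs of space-partitioning tree nodes.

```python
import math

def kde_naive(query_points, data_points, bandwidth):
    """Naive Gaussian kernel density estimate: an O(N*M) pairwise sum.
    Multi-tree methods aim to approximate exactly this kind of sum
    without visiting every (query, data) pair."""
    norm = 1.0 / (math.sqrt(2 * math.pi) * bandwidth * len(data_points))
    estimates = []
    for q in query_points:
        s = sum(math.exp(-((q - x) ** 2) / (2 * bandwidth ** 2))
                for x in data_points)
        estimates.append(norm * s)
    return estimates

# Density near the data cluster should dwarf density far from it.
data = [0.9, 1.0, 1.1, 1.2]
near, far = kde_naive([1.0, 5.0], data, bandwidth=0.5)
```

Even this tiny example shows why the brute-force approach breaks down at scale: doubling both the query and data sets quadruples the work.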

MyLifeBits

Date and Time
Thursday, February 17, 2005 - 4:00pm to 5:30pm
Location
Computer Science Large Auditorium (Room 104)
Type
Colloquium
Speaker
Jim Gemmell, from Microsoft
Host
Kai Li
MyLifeBits is a lifetime store of everything. It is the fulfillment of Vannevar Bush's 1945 Memex vision including full-text search, text & audio annotations, and hyperlinks. MyLifeBits is both an experiment in lifetime storage and a software research effort. In this talk, we will demonstrate the software we have developed for MyLifeBits, which leverages SQL Server to support: hyperlinks, annotations, reports, saved queries, pivoting, clustering, and fast search. MyLifeBits is designed to make annotation easy, including gang annotation on right click, voice annotation, and web browser integration. It includes tools to record web pages, IM transcripts, radio and television. The MyLifeBits screensaver supports annotation and rating. We are beginning to explore features such as document similarity ranking and faceted classification. We have collaborated with the WWMX team to get a mapped UI, and with the SenseCam team to digest and display SenseCam output. www.mylifebits.com has more information. One of the demos will be based on the summer intern project of Aleks Aris from UMd.

C++: Evolving a language in and for the real world

Date and Time
Thursday, February 10, 2005 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Bjarne Stroustrup, from Texas A&M
Host
Brian Kernighan
The C++ programming language is now 25 years old, and used by millions of programmers. Developing such a language is a challenging business. I will present a personal view of some of the factors that were and still are critical for C++'s sustained success. Many of the ideals and principles that were applied -- with varying degrees of success -- involve non-technical concerns. Real-world language evolution differs significantly from "green field" and "blue sky" design, since it requires long-term evolution in a world of shifting external pressures rather than specifying an ideal design once and for all.

Scalable and Expressive Iterative Combinatorial Exchanges

Date and Time
Wednesday, February 9, 2005 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
David Parkes, from Harvard
Host
Larry Peterson
Combinatorial auctions (CAs) allow one seller to allocate a collection of heterogeneous and discrete goods to multiple buyers, each of which may have complex valuations, e.g., "I only want A if I get B", "I only want A or B", etc. Yet, many real-world problems (such as resource allocation on federated systems like PlanetLab) are more naturally two-sided, with multiple buyers and sellers. Combinatorial Exchanges (CEs) extend CAs to this two-sided setting. In this talk, I outline three key components in our exchange design: a tree-based bidding language, adaptive preference elicitation, and a payment rule to promote simple and truthful bidding. In moving towards a deployment of such a market for computational resources, I discuss some of the wider institutional and technological issues that must be addressed. In closing, I mention some additional directions in computational mechanism design that will impact resource allocation and arbitration in real systems.
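To see what a tree-structured bid buys over flat per-bundle bids, here is a toy evaluator with simple AND/OR semantics. This is an illustrative sketch only, not the talk's actual bidding language (which is richer), and it assumes strictly positive leaf values so that "satisfied" can be read off from a nonzero value.

```python
# Toy tree-structured bids: leaves are ("item", good, value); internal
# nodes are ("and", children) or ("or", children). An AND node pays the
# sum of its children only if every child is satisfied; an OR node pays
# its best single child. Assumes strictly positive leaf values.

def tree_value(node, bundle):
    """Value of a bid tree for a given bundle of goods."""
    if node[0] == "item":
        _, good, value = node
        return value if good in bundle else 0
    op, children = node
    vals = [tree_value(child, bundle) for child in children]
    if op == "and":
        return sum(vals) if all(v > 0 for v in vals) else 0
    return max(vals)  # "or"

# "I only want A if I also get B" -- though B alone is still worth 3:
bid = ("or", [("and", [("item", "A", 5), ("item", "B", 3)]),
              ("item", "B", 3)])
```

Here `tree_value(bid, {"A", "B"})` pays for both goods, `tree_value(bid, {"A"})` pays nothing, and `tree_value(bid, {"B"})` pays for B alone, capturing the conditional preference in one compact expression.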

Understanding the brain as a memory system: biology, model, and implementation

Date and Time
Thursday, February 3, 2005 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Jeff Hawkins, from Redwood Neuroscience Institute and PalmOne
Host
Brian Kernighan
A clear understanding of how the brain works will make it possible for us to build intelligent machines, in silicon, that will exceed our human ability in surprising ways. The brain is not a computer, but a memory system that stores experiences in a way that reflects the true structure of the world, remembering sequences of events and their nested relationships and making predictions based on those memories. It is this memory-prediction system that forms the basis of intelligence, perception, creativity, and even consciousness. The neocortex is the seat of most aspects of perception and high-level thought. I propose that the neocortex can be understood as a hierarchical sequence memory. This talk will describe the anatomy and physiology of the cortex, and show that they can be accurately captured by a mathematical model based on conditional probabilities. An implementation of this system will be shown that demonstrates robust invariant image recognition.

Phase Transitions in Hard Optimization Problems

Date and Time
Thursday, December 9, 2004 - 4:30pm to 6:00pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Dimitris Achlioptas, from Microsoft
Host
Nicholas Pippenger
Many NP-complete problems are easy for random inputs. For example, in a random graph where each edge is present with probability 1/2 one can find a Hamiltonian cycle in linear time, almost surely. On the other hand, random instances of satisfiability and graph coloring appear to be hard. Moreover, each problem appears to be hardest near its "threshold", a constraint density around which the probability of solubility drops rapidly from near 1 to near 0. Locating these two thresholds and understanding the behavior of algorithms near them is an active topic of research in artificial intelligence, combinatorics, probability, theoretical computer science, and statistical physics.

We study how the structure of the set of solutions evolves in each of these two problems as constraints are added. This allows us to determine the location of the threshold for both random satisfiability and random graph colorability up to second order terms. To do this we prove that for a large class of random constraint satisfaction problems the second moment of the number of solutions is captured by an "entropy-energy" tradeoff. Critical values of this tradeoff correspond to points where the structure of the space of solutions undergoes dramatic change. By identifying these critical points, we not only get to locate thresholds but we also gain rigorous insight on why algorithms have a hard time near them.
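The threshold phenomenon is easy to observe empirically even at toy scale. The sketch below generates random 3-SAT formulas and brute-forces satisfiability for a handful of variables; it is an illustration of the sharp drop in the probability of solubility, not the talk's second-moment machinery, and the small instance size means the transition is smeared compared to the large-n threshold (around density 4.27 for 3-SAT).

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """A random 3-CNF formula: each clause picks 3 distinct variables
    and negates each one independently with probability 1/2."""
    formula = []
    for _ in range(n_clauses):
        vars_ = rng.sample(range(n_vars), 3)
        formula.append([(v, rng.random() < 0.5) for v in vars_])
    return formula

def satisfiable(formula, n_vars):
    """Brute-force check over all 2^n assignments (tiny n only)."""
    for bits in itertools.product([False, True], repeat=n_vars):
        if all(any(bits[v] != negated for v, negated in clause)
               for clause in formula):
            return True
    return False

def sat_fraction(density, n_vars=10, trials=20, seed=0):
    """Empirical probability of satisfiability at a clause density."""
    rng = random.Random(seed)
    n_clauses = round(density * n_vars)
    return sum(satisfiable(random_3sat(n_vars, n_clauses, rng), n_vars)
               for _ in range(trials)) / trials

sat_at_low = sat_fraction(2.0)   # well below the threshold: almost all SAT
sat_at_high = sat_fraction(7.0)  # well above the threshold: almost all UNSAT
```

Sweeping the density between these two extremes traces out the rapid drop in solubility that the abstract describes.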

Based on joint works with Assaf Naor (Microsoft) and Yuval Peres (Berkeley).

Event (no name)

Date and Time
Tuesday, December 7, 2004 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
TBD
Host
Virginia Hogan

Derandomizing Auctions

Date and Time
Thursday, December 2, 2004 - 4:00pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
Colloquium
Speaker
Andrew Goldberg, from Microsoft
Host
Robert Tarjan
We study the problem of designing seller-optimal auctions. Prior to this work, the only known auctions that are approximately optimal in the worst case employ randomization. Our main result is the existence of deterministic auctions that approximately match the performance guarantees of these randomized auctions. We give a fairly general derandomization technique for turning a randomized mechanism into an asymmetric deterministic one with approximately the same properties. In doing so, we bypass an impossibility result for symmetric deterministic auctions by using asymmetry, i.e., the order in which bids appear in the input.

Our derandomization technique uses an auxiliary graph of exponential size and takes exponential time even if the corresponding randomized auction is polynomial time. We leave as an open question the existence of a polynomial-time computable deterministic auction that is approximately optimal.
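For readers unfamiliar with the randomized mechanisms being derandomized, here is a minimal sketch of a classic random sampling auction in this literature: partition the bidders into two random halves, learn an optimal posted price on each half, and offer it to the other half. This is a standard textbook construction given for context, not necessarily the exact mechanism from the talk.

```python
import random

def optimal_single_price(bids):
    """Best posted price restricted to the bid values themselves:
    price p earns p times the number of bids at or above p."""
    return max(bids, key=lambda p: p * sum(b >= p for b in bids), default=0)

def random_sampling_auction(bids, rng):
    """Split bidders into two random halves, compute the optimal single
    price on each half, and offer it to the other half. Each bidder's
    offered price is independent of her own bid, so truthful bidding
    is a dominant strategy."""
    half_a, half_b = [], []
    for bid in bids:
        (half_a if rng.random() < 0.5 else half_b).append(bid)
    price_a = optimal_single_price(half_b)  # offered to half_a
    price_b = optimal_single_price(half_a)  # offered to half_b
    revenue = sum(price_a for bid in half_a if bid >= price_a)
    revenue += sum(price_b for bid in half_b if bid >= price_b)
    return revenue

rng = random.Random(0)
revenue = random_sampling_auction([20, 20, 20, 20, 5, 5, 1], rng)
```

The coin flips in the partition are exactly the randomness a derandomization must replace; the abstract's auxiliary-graph construction does so at exponential cost, which is why the polynomial-time question is left open.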

Our work shows an interesting relationship between mechanism design and several areas of computer science, including competitive algorithms, computational learning, and coding theory.

Joint work with Gagan Aggarwal, Amos Fiat, Nicole Immorlica, Jason Hartline, and Madhu Sudan.
