In Pursuit of Low-Latency Interactions on Mobile Devices

Date and Time
Monday, February 15, 2016 - 12:30pm to 1:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
CS Department Colloquium Series
Host
Kyle Jamieson

Human attention is a scarce resource, but when available, it is also wonderfully perceptive. My research asks: what does it take for mobile devices --- power-constrained as they are --- to operate at the speed of human perception? And what new opportunities emerge as a result?

Via a pair of vignettes, I illustrate two such low-latency mobile systems. The first focuses on app streaming, an emerging app execution model in which remote servers execute logic and rendering on behalf of thin clients. App streaming promises any device access to any app at any time. Unfortunately, the reality is that wide-area network latencies often exceed the thresholds above which interactive apps such as games are deemed too slow. In response, I describe Outatime, a speculative execution system for app streaming that masks network latency. In Outatime, the server renders speculative frames of possible future outcomes, delivers them to the client a full round trip early, and recovers quickly from mis-speculations when they occur. As a result, clients perceive little latency. Outatime has been implemented on two high-quality, commercially released twitch-based games. Users report strongly preferring Outatime to standard streaming, since Outatime delivers real-time interactivity as fast as --- and in some cases, even faster than --- traditional local client-side execution.

In a second example of low-latency interaction, I describe FAR, a Kinect-like device tracking system. Unlike Kinect, FAR is portable, requiring only the phones in our hands. Yet FAR performs continuous, fast, and accurate phone-to-phone localization that matches the (often very fast) speed and sensitivity of human movement. In fact, FAR's accuracy is comparable to --- and in some cases, even superior to --- that of Kinect. Lab trials and many real-world deployments indicate that FAR can fully support dynamic human motion in real time.

David Chu is a researcher in Microsoft's Mobility and Networking Research Group. His research interests are in mobile systems and applications, cyber-physical systems, sensing systems, ubiquitous computing, and applied machine learning. The main thrust of David's current work is toward low-latency, perception-aligned mobile systems. He received the Best Paper award at MobiSys 2015, a Best Paper nomination at MobiSys 2012, the Best Demo award at MobiSys 2014, and a Best Demo nomination at SenSys 2011. David's research has appeared on multiple occasions in tech news outlets such as TechCrunch, PC Magazine, GameSpot, Ars Technica, Slashdot, The Verge, Engadget, and Wired. At Microsoft, David has contributed to Windows and Windows Phone, Xbox, and HoloLens. David received his B.S. from the University of Virginia in 2004, and his M.S. and Ph.D. from the University of California, Berkeley in 2005 and 2009, respectively, while an NSF Graduate Research Fellow.
