Everyday Sensing and Perception (ESP)
Context-aware computing has long held the promise of rich, natural, personalized
interactions with both mobile devices and an increasingly instrumented environment.
Applications enabled by context-awareness range from serious health monitoring and
assisted living deployments to serendipitous gaming and social meet-up systems. The
goal of the Everyday Sensing and Perception (ESP) project is to help realize this vision by
developing a system that can infer a user's context with 90% accuracy over 90% of their
day. ESP is focusing on the perception of everyday situations that many context-aware
applications depend on. Specifically, ESP is developing the ability to infer:
- Location: Where is the user, in both absolute (latitude, longitude) as well as symbolic (Grocery Store) terms?
- Activity: What is the user doing right now in terms of physical (standing) and
object-based (washing dishes) activity?
- Social interaction: Who is the user interacting with and what role are they
acting in (teacher)?
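As a concrete illustration of the location inference above, the step from absolute to symbolic location can be sketched as a nearest-labeled-place lookup. This is a minimal sketch, not ESP's actual method; the place names, coordinates, and 75-meter radius are invented for illustration:

```python
import math

# Hypothetical labeled places (names and coordinates are made up for illustration).
PLACES = {
    "Grocery Store": (47.6205, -122.3493),
    "Office":        (47.6097, -122.3331),
    "Gym":           (47.6149, -122.3194),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def symbolic_location(lat, lon, max_dist_m=75.0):
    """Return the nearest known place label, or None if nothing is close enough."""
    best_name, best_dist = None, max_dist_m
    for name, (plat, plon) in PLACES.items():
        d = haversine_m(lat, lon, plat, plon)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name

print(symbolic_location(47.6204, -122.3492))  # → Grocery Store
```

A real deployment would of course learn significant places from the user's traces rather than rely on a hand-built table, but the absolute-to-symbolic mapping has the same shape.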
To reach a 90% level of coverage, the ESP research approach is to employ sensors
integrated into a user's mobile devices to sense their environment and how they interact
with it. ESP is investigating both low-power, low data-rate sensors (e.g., RFID tags,
accelerometers and radios), as well as high data-rate sensors (e.g., video cameras and
microphones). To achieve a 90% level of accuracy, ESP is developing state-of-the-art
machine learning and distributed computing algorithms, including:
- Joint modeling of video and audio data with other worn sensors
- Federating training across users
- On-the-fly refinement of user models with online learning
- Parallelization of machine learning algorithms
- Compressive sensing and synopsis-based reasoning for mobile devices
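One of the techniques listed above, on-the-fly refinement of user models with online learning, can be sketched with a simple mistake-driven online perceptron that adapts to a stream of per-user observations. Everything here is a toy assumption for illustration (the synthetic accelerometer feature, the two activity classes, and all parameters), not ESP's actual models:

```python
import random

random.seed(0)

# Synthetic stand-in for a worn accelerometer: walking shows much higher
# variance in acceleration magnitude than standing.
def sample(label):
    spread = 0.05 if label == 0 else 0.8   # 0 = standing, 1 = walking
    window = [1.0 + random.gauss(0, spread) for _ in range(32)]
    mean = sum(window) / len(window)
    var = sum((x - mean) ** 2 for x in window) / len(window)
    return [1.0, var]  # bias term + variance feature

def predict(w, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

# Online learning: the model is refined one observation at a time, so it
# can keep adapting to an individual user's data as it streams in.
w = [0.0, 0.0]
for t in range(200):
    label = t % 2
    x = sample(label)
    if predict(w, x) != label:  # mistake-driven perceptron update
        sign = 1 if label == 1 else -1
        w = [wi + sign * xi for wi, xi in zip(w, x)]

# After adaptation, check held-out windows.
correct = 0
for _ in range(100):
    label = random.randint(0, 1)
    correct += (predict(w, sample(label)) == label)
print(correct)
```

The same update-on-arrival pattern, with richer models and features, is what lets a context-inference system personalize itself without shipping raw sensor data off the device.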
The ESP project is also investigating a variety of application and user-interaction
implications of high-quality, high-coverage inference. We are specifically researching:
- The challenges and opportunities of context inference in the education and social-coordination domains
- The use of planning techniques in context-augmented user experiences
- The use and control of projectors, including mobile and personal video projectors
- Novel adaptive, multi-modal user interfaces for handheld devices