Abstract
This talk will expound a vision in which smartwatches (and other wrist-worn devices) are not merely isolated sensing platforms but part of a sensing ecosystem that couples multiple personal devices. While an individual smartwatch can undoubtedly capture certain gestural activities and health vitals, a far richer set of applications can be enabled by coordinated sensing across a distributed set of one or more wrist-worn devices and a smartphone. I will describe ongoing work that applies such coordinated sensing to both (a) newer gesture-based interactive and immersive interfaces and (b) improved, unobtrusive recognition of daily activities, along with the associated challenges for three key performance metrics: energy, latency, and accuracy. I will give specific examples of how we are using (i) novel distributed sensor-data pipelines to enable real-time, low-latency gesture recognition and (ii) smartphone and infrastructural sensing as effective contextual triggers for the continuous capture of commonplace urban lifestyle activities.