
MIT Media Lab: GroupMedia

Socially-Intelligent Wearables

According to Gartner, cellphones are expected to soon become the most popular consumer device on the planet. About half of the 800 million cell phones shipped in 2005 were more powerful than Pentium 1 computers. By quantifying the behavior of cell phone users, it now seems possible to predict answers to questions such as whom a user gets along with, which movie they enjoyed, how well they spoke, or even what product they might buy.

This real-time information could be used for feedback and training, to customize experiences and interactions with machines, to take images or annotate conversations, or even to connect friends and colleagues with appropriate privacy restrictions.

The GroupMedia project evolved from our work at the Wearable Computing Group, a.k.a. Borglab, driven by the need for more perceptual, socially-aware applications for cell phones and PDAs. We measure speaking style (speech feature processing), head nodding and body motion (accelerometry), and physiology (galvanic skin response) to gauge interest in conversations, the effectiveness of elevator pitches, movie-audience reactions, speed dating, focus groups, and group interaction dynamics.
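As a rough illustration of what "speech feature processing" can mean in this setting, the sketch below computes two simple prosodic cues from raw audio samples: an activity level (how much of the time someone is speaking, estimated from frame energy) and an emphasis measure (variation in vocal energy while speaking). The function names, thresholds, and feature choices here are illustrative assumptions, not the project's actual pipeline.

```python
import math

def frame_energies(samples, frame_len=160):
    """Short-time energy per non-overlapping frame.

    With 16 kHz audio, frame_len=160 corresponds to 10 ms frames
    (an assumed, conventional choice).
    """
    energies = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        energies.append(sum(s * s for s in frame) / frame_len)
    return energies

def prosody_features(samples, frame_len=160, voiced_factor=0.5):
    """Crude conversational cues from raw audio samples.

    Returns:
      activity -- fraction of frames above an energy threshold,
                  a proxy for how much the person is speaking
      emphasis -- standard deviation of log-energy over voiced
                  frames, a proxy for vocal stress/variation
    """
    energies = frame_energies(samples, frame_len)
    if not energies:
        return {"activity": 0.0, "emphasis": 0.0}
    # Illustrative voicing threshold: a fixed fraction of mean energy.
    threshold = voiced_factor * (sum(energies) / len(energies))
    voiced = [e for e in energies if e > threshold]
    activity = len(voiced) / len(energies)
    log_e = [math.log(e + 1e-12) for e in voiced]
    if len(log_e) > 1:
        mean = sum(log_e) / len(log_e)
        emphasis = math.sqrt(sum((x - mean) ** 2 for x in log_e) / len(log_e))
    else:
        emphasis = 0.0
    return {"activity": activity, "emphasis": emphasis}
```

On a clip that is half speech and half near-silence, `activity` comes out near 0.5; features like these, accumulated over a conversation, are the kind of low-level signal that interest or engagement models can be trained on.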

Human Dynamics Group at MIT Media Lab