XPod
January 1, 2004 - June 1, 2008
The XPod project introduces a “smart” mobile music player that combines awareness of human activity with knowledge of musical preferences to play contextually appropriate music. XPod uses a BodyMedia device, which has been shown to measure a user’s physiological state accurately. The device monitors several variables to determine the user’s level of activity, motion, and physical state, so the player can predict what music is appropriate at that moment. The user trains XPod to understand which music is preferred and under what conditions. After training, XPod applies various machine-learning techniques to predict the desirability of a song given the user’s current physical state.
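As an illustration only, the sketch below shows one way such a preference model could be trained and queried: a small regressor maps physiological and activity features plus song attributes to a predicted desirability score, and candidate songs are ranked under the current context. The feature names, the synthetic data, and the choice of model are assumptions for the example, not the actual XPod implementation.

```python
# Hypothetical sketch of a context-aware song-preference model.
# Feature layout (assumed): [heart_rate, skin_temp, activity_level, song_tempo, song_energy]
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Synthetic data standing in for logged listening sessions.
X = np.column_stack([
    rng.normal(90, 25, n),    # heart rate (bpm)
    rng.normal(33, 1.5, n),   # skin temperature (C)
    rng.uniform(0, 1, n),     # activity level from motion sensors (0..1)
    rng.uniform(60, 180, n),  # song tempo (bpm)
    rng.uniform(0, 1, n),     # song "energy" metadata (0..1)
])

# Fabricated rating rule for the synthetic users: energetic songs are
# preferred while active, calmer songs while at rest.
activity = X[:, 2]
song_intensity = 0.5 * (X[:, 3] - 60) / 120 + 0.5 * X[:, 4]
y = 1.0 - np.abs(activity - song_intensity) + rng.normal(0, 0.05, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small neural-network regressor: desirability of a song given the context.
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_train, y_train)
print("held-out R^2:", round(model.score(X_test, y_test), 3))

# At playback time, score every candidate song under the current
# physiological context and queue the highest-scoring one.
context = [130.0, 34.0, 0.9]                       # user is exercising
candidates = {"fast_song": [170.0, 0.9], "slow_song": [70.0, 0.2]}
scores = {name: model.predict([context + feats])[0]
          for name, feats in candidates.items()}
print("best pick:", max(scores, key=scores.get))
```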
Publications
2007
- S. Dornbush, J. English, T. Oates, Z. Segall, and A. Joshi, "XPod: A Human Activity Aware Learning Mobile Music Player", Proceedings of the Workshop on Ambient Intelligence, 20th International Joint Conference on Artificial Intelligence (IJCAI-2007), January 2007.
2005
- S. Dornbush, K. Fisher, K. McKay, A. Prikhodko, and Z. Segall, "XPod: A Human Activity and Emotion Aware Mobile Music Player", Proceedings of the International Conference on Mobile Technology, Applications and Systems, November 2005.