We describe a system that learns to recognize rich contexts, in particular geo-social locations, using sensor data from a person's Android phone along with annotations on her calendar and general background knowledge. Geo-social locations include concepts such as 'home' and 'school' and can be extended to others such as 'work' or 'a restaurant'.
Our framework combines data from the phone's sensors (GPS, Wi-Fi, Bluetooth, acceleration, proximity, etc.) with data mined from applications (e.g., the calendar) to produce features for a machine learning system. Training data was collected from several university students and staff using a system that periodically prompted each user for her true geo-social location and activity. The resulting classifier models were used to predict an individual user's context from new sensor data, and the data from a set of users was combined to create a generic model.
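The pipeline above can be sketched in a few lines: raw sensor and calendar readings are mapped to a numeric feature vector, and labelled examples (gathered by prompting users for their true location) drive a classifier. This is a minimal illustrative sketch, not the thesis system; the feature names, the scaling constants, and the choice of a 1-nearest-neighbour classifier are all assumptions made here for brevity.

```python
def featurize(reading):
    """Map one sensor/calendar snapshot to a numeric feature vector.
    Feature names and scaling are hypothetical, chosen for illustration."""
    return [
        reading["hour"] / 24.0,                          # time of day
        reading["wifi_ap_count"] / 10.0,                 # nearby Wi-Fi access points
        1.0 if reading["calendar_says_class"] else 0.0,  # calendar annotation
    ]

def predict(train, query):
    """Classify a new reading with 1-nearest-neighbour over the
    labelled training prompts. train is a list of (reading, label)."""
    q = featurize(query)

    def dist(reading):
        return sum((x - y) ** 2 for x, y in zip(featurize(reading), q))

    _, label = min(train, key=lambda pair: dist(pair[0]))
    return label

# Hypothetical labelled examples, standing in for the periodic
# self-report prompts described in the text.
train = [
    ({"hour": 23, "wifi_ap_count": 2, "calendar_says_class": False}, "home"),
    ({"hour": 10, "wifi_ap_count": 8, "calendar_says_class": True}, "school"),
]

new_reading = {"hour": 11, "wifi_ap_count": 7, "calendar_says_class": True}
print(predict(train, new_reading))  # → school
```

A generic model, in this sketch, would simply pool the `(reading, label)` pairs from many users before classification, whereas an individual model uses only that user's own prompts.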
We report on an evaluation of the individual and generic models for predicting context in the university setting. Finally, we discuss how our extended notion of context can enable a range of applications for smartphone users.