How do we make smartphones even smarter?
That is the central question driving the Smartphone Sensing Group at Dartmouth.
Smartphones are open and programmable and come with a growing number of powerful embedded sensors, such as the accelerometer, digital compass, gyroscope, GPS, microphone, and camera. These sensors are enabling new sensing applications across a wide variety of domains, including social networks, mobile health, gaming, entertainment, education, and transportation.
Application delivery channels such as the App Store and Android Market have transformed plain old cell phones into app phones, capable of downloading a myriad of applications in an instant.
The Smartphone Sensing Group is turning the everyday smartphone into a cognitive phone by pushing intelligence to the phone and the computing cloud to make inferences about people's behavior, surroundings, and life patterns.
We are developing new software technology for smartphones to sense, learn, visualize, and share information about ourselves, friends, communities, the way we live, and the world we live in.
Some of the sensing algorithms, systems, and applications we have developed in collaboration with Tanzeem Choudhury (Cornell University) and others include CenceMe, SoundSense, NeuralPhone, Jigsaw, Darwin Phones, NextPlace, EyePhone, BeWell, Community-Guided Learning, and Community Similarity Networks.
Interested in smartphone sensing? Check out our webpage and take a look at our survey on smartphone sensing.
CBS News Sunday Morning: The Neural Phone is featured as part of the cover story "The next step in bionics," aired on CBS, October 2011
Our 2011 papers on smartphone sensing were published in UbiComp, Pervasive, ICDM, and Pervasive Health.
Our work on the Neural Phone is featured in the NYTimes Magazine article "The Cyborg in Us All," September 2011
UbiComp 2011: Our paper on Community Similarity Networks (CSN) was nominated for the Best Paper Award, September 2011
Interview on the IT Conversations network about smartphone sensing, January 2011