Smile! Smartphone app is tagging you

DUKE (US) — Forget bothering to tag photographs with your friends’ names, what they’re doing, and where they’re doing it. A new cell phone application can do it for you.

Dubbed TagSense, the new app works by using a mobile phone’s multiple sensors, as well as those of other mobile phones in the vicinity.

“In our system, when you take a picture with a phone, at the same time it senses the people and the context by gathering information from all the other phones in the area,” says Xuan Bao, a Ph.D. student in computer science working with Romit Roy Choudhury, assistant professor of electrical and computer engineering at Duke University.

“Phones have many different kinds of sensors that you can take advantage of,” says Chuan Qin, a visiting graduate student at Duke from the University of Southern California. “They collect diverse information like sound, movement, location, and light. By putting all that information together, you can sense the setting of a photograph and describe its attributes.”
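
As a rough illustration of that idea (a hypothetical sketch, not the TagSense code; the class and method names here are invented), the photographer’s phone could ask each nearby, opted-in phone for a snapshot of its sensor readings at the moment the shutter fires:

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class SensorSnapshot:
    """Readings one nearby phone reports at the moment the photo is taken."""
    owner: str                          # name of the phone's owner
    accel: Tuple[float, float, float]   # accelerometer (x, y, z), in m/s^2
    light_lux: float                    # ambient light level
    sound_db: float                     # microphone loudness
    latitude: float
    longitude: float


def collect_context(nearby_phones) -> List[SensorSnapshot]:
    """Ask each opted-in phone in the vicinity for its current readings.

    `nearby_phones` is assumed to be a list of objects exposing a
    `report_snapshot()` call; that interface is invented for this sketch.
    """
    return [phone.report_snapshot() for phone in nearby_phones]
```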

Using information about a photograph’s environment may make it possible to tag the picture more accurately than facial recognition alone could. Such details about the photograph’s whole setting can then be searched at a later time.

For example, the phone’s built-in accelerometer can tell if a person is standing still for a posed photograph, bowling, or even dancing. Light sensors in the phone’s camera can tell if the shot is being taken indoors or outdoors on a sunny or cloudy day.

The app can also approximate environmental conditions, such as snow or rain, by looking up the weather at the photograph’s time and location. The microphone can detect whether a person in the photograph is laughing or quiet. All of these attributes are then assigned to each photograph.
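
A minimal sketch of how such readings might be turned into tags follows; the thresholds and tag names are illustrative assumptions rather than values from the Duke system, and `weather_lookup` stands in for whatever weather service the app would consult:

```python
import math


def attribute_tags(snapshot, weather_lookup):
    """Map one phone's raw readings to human-readable photo attributes.

    `snapshot` is a SensorSnapshot as in the earlier sketch; `weather_lookup`
    is an assumed callable returning e.g. "rain", "snow", or "clear" for a
    given location.
    """
    tags = set()

    # Accelerometer: a magnitude near gravity alone suggests a posed shot,
    # while large readings suggest motion such as dancing or bowling.
    # (A single reading keeps the sketch short; a real system would look at
    # motion over a window of time.)
    motion = math.sqrt(sum(a * a for a in snapshot.accel))
    if motion < 10.5:
        tags.add("posing")
    elif motion > 15.0:
        tags.add("moving")

    # Light sensor: very bright readings usually mean outdoors in daylight.
    tags.add("outdoors" if snapshot.light_lux > 10_000 else "indoors")

    # Weather at the photo's time and place.
    tags.add(weather_lookup(snapshot.latitude, snapshot.longitude))

    # Microphone: a loud reading can indicate laughter; otherwise mark quiet.
    tags.add("laughing" if snapshot.sound_db > 70 else "quiet")

    return tags
```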

With multiple tags describing more than just a particular person’s name, it would be easier not only to organize an album of photographs for future reference, but also to find particular photographs years later, Bao says.

With the exploding number of digital pictures in the cloud and on our personal computers, the ability to easily search and retrieve desired pictures will be valuable in the future.

“So, for example, if you’ve taken a bunch of photographs at a party, it would be easy at a later date to search for just photographs of happy people dancing,” Qin says. These added details of automatic tagging could help complement existing tagging applications, Choudhury says.
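
With tags like these attached to each picture, Qin’s example becomes a simple set lookup. The sketch below assumes a plain dictionary of photos and tags rather than any real photo library:

```python
def find_photos(photo_tags, wanted):
    """Return photos whose tag sets contain every requested tag.

    `photo_tags` maps a photo filename to its set of tags (both invented
    here for illustration).
    """
    wanted = set(wanted)
    return [photo for photo, tags in photo_tags.items() if wanted <= tags]


# e.g. "photographs of happy people dancing" from a party album
album = {
    "party_01.jpg": {"Alice", "laughing", "moving", "indoors"},
    "party_02.jpg": {"Bob", "quiet", "posing", "indoors"},
}
print(find_photos(album, ["laughing", "moving"]))  # -> ['party_01.jpg']
```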

“While facial recognition programs continue to improve, we believe that the ability to identify photographs based on the setting of the photograph can lead to a richer, more detailed way to tag photographs,” he says.

In comparisons with Apple’s iPhoto and Google’s Picasa, TagSense showed it could provide greater sophistication in tagging photographs.

TagSense would most likely be adopted by groups of people, such as friends, who would “opt in,” allowing their mobile phone capabilities to be harnessed when members of the group were together. To protect privacy, it would not request sensed data from nearby phones that don’t belong to the group.
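
That opt-in rule amounts to filtering nearby devices against the group’s membership list before any sensed data is requested; a one-function sketch, with all names invented:

```python
def phones_to_query(nearby_phone_ids, group_members):
    """Request sensed data only from phones whose owners opted into the group."""
    members = set(group_members)
    return [phone_id for phone_id in nearby_phone_ids if phone_id in members]
```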

Graduate students Chuan Qin and Xuan Bao developed TagSense, a smartphone app that automatically tags photos.

The experiments were conducted using eight Google Nexus One mobile phones on more than 200 photos taken at various locations across the Duke campus, including classroom buildings, gyms, and the art museum.

The current application is a prototype; the researchers believe a commercial product could be available in a few years.

Researchers from USC contributed to the technology, which was supported by the National Science Foundation.

More news from Duke University: www.dukenews.duke.edu
