Marketers could soon use the images you post on social media to figure out your “top-of-mind” associations with their brands.
In a new study drawing on five million such images, researchers have taken a first step toward that capability.
Eric Xing, associate professor of machine learning, computer science, and language technologies at Carnegie Mellon University, and Gunhee Kim, then a Ph.D. student in computer science, looked at images associated with 48 brands in four categories—sports, luxury, beer, and fast food. The images came from popular photo-sharing sites such as Pinterest and Flickr.
Their automated process produced clusters of photos that are typical of certain brands—watch images with Rolex, tartan plaid with Burberry. But some of the highly ranked associations underscored the kind of information that images, particularly those shared on social media, can capture in a way text cannot.
For instance, clusters for Rolex included images of horse-riding and auto-racing events, which the watchmaker sponsored. Many wedding clusters were highly associated with the French fashion house Louis Vuitton.
Both instances, Kim notes, are events where people tend to take and share lots of photos, each of which is an opportunity to show brands in the context in which they are used and experienced.
Marketers are always trying to get inside the heads of customers to find out what a brand name makes them think or feel. What does “Nike” bring to mind? Tiger Woods? Shoes? Basketball?
Researchers have used questionnaires to gather this information, but, with the advent of online communities, more emphasis is being placed on analyzing texts that people post to social media.
“Now, the question is whether we can leverage the billions of online photos that people have uploaded,” says Kim, now with Disney Research Pittsburgh. Digital cameras and smartphones have made it easy for people to snap and share photos from their daily lives, many of which relate in some way to one brand or another.
“Our work is the first attempt to perform such photo-based association analysis,” Kim says. “We cannot completely replace text-based analysis, but already we have shown this method can provide information that complements existing brand associations.”
From images to ads
Kim and Xing obtained photos that people had shared and tagged with one of 48 brand names. They developed a method for analyzing the overall appearance of the photos and clustering similar-looking images together, yielding the core visual concepts associated with each brand.
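The paper's clustering pipeline is more sophisticated than the article can convey, but the core idea, representing each photo by an appearance feature and grouping similar feature vectors, can be sketched minimally. Everything here is an illustrative assumption rather than the authors' method: color histograms stand in for the visual features, a plain k-means loop does the grouping, and synthetic "photos" replace real social-media images.

```python
# Minimal sketch of appearance-based photo clustering (illustrative only;
# the study's actual features and clustering algorithm are more involved).
import numpy as np

def color_histogram(image, bins=8):
    """Flattened per-channel color histogram as an appearance feature."""
    feats = []
    for ch in range(3):
        hist, _ = np.histogram(image[..., ch], bins=bins, range=(0, 256))
        feats.append(hist / hist.sum())          # normalize each channel
    return np.concatenate(feats)

def kmeans(features, k, iters=20):
    """Basic k-means with deterministic farthest-point initialization."""
    centers = [features[0]]
    for _ in range(k - 1):                       # seed centers far apart
        d = np.min([np.linalg.norm(features - c, axis=1) for c in centers],
                   axis=0)
        centers.append(features[d.argmax()])
    centers = np.array(centers)
    for _ in range(iters):
        dists = np.linalg.norm(features[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)            # assign to nearest center
        for j in range(k):
            if np.any(labels == j):
                centers[j] = features[labels == j].mean(axis=0)
    return labels

# Toy data: two visually distinct groups of synthetic 32x32 RGB "photos".
rng = np.random.default_rng(1)
dark = [rng.integers(0, 80, (32, 32, 3)) for _ in range(5)]
light = [rng.integers(180, 256, (32, 32, 3)) for _ in range(5)]
feats = np.array([color_histogram(im) for im in dark + light])
labels = kmeans(feats, k=2)   # dark and light photos land in separate clusters
```

In a real pipeline the histogram would be replaced by a richer visual descriptor, but the clustering step plays the same role: photos that look alike end up in the same group, and each group suggests one visual concept for the brand.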
They also developed an algorithm that isolates the portion of an image associated with the brand, such as a Burger King sign along a highway or adidas apparel worn by someone in a photo.
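The paper's localization algorithm is not described in the article, so the following is only a toy stand-in for the underlying idea: scanning an image for a known visual pattern (here a small "logo" template) and returning the best-matching region. The sliding-window sum-of-squared-differences search and the synthetic grayscale data are both assumptions for illustration.

```python
# Toy sketch of locating a known visual pattern in an image via a
# sliding-window sum-of-squared-differences (SSD) search.
import numpy as np

def locate(image, template):
    """Slide the template over the image; return (row, col) of best match."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for r in range(ih - th + 1):
        for c in range(iw - tw + 1):
            patch = image[r:r + th, c:c + tw]
            score = np.sum((patch - template) ** 2)   # lower = better match
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos

# Toy example: paste a 4x4 "logo" into a random 20x20 scene at (7, 12).
rng = np.random.default_rng(0)
logo = rng.random((4, 4))
scene = rng.random((20, 20))
scene[7:11, 12:16] = logo
print(locate(scene, logo))   # prints (7, 12)
```

Real logo detection must cope with scale, rotation, lighting, and partial occlusion, which is what makes the researchers' task much harder than this exhaustive template search.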
Kim emphasizes that this work represents just the first step toward mining marketing data from images. But it also suggests some new directions and some additional applications of computer vision in electronic commerce.
For instance, it may be possible to generate keywords from images people have posted and use those keywords to direct relevant advertisements to that individual, in much the same way sponsored search now does with text queries.
Kim will present the research December 7 at the IEEE Workshop on Large Scale Visual Commerce in Sydney, Australia, and at WSDM 2014, an international conference on search and data mining on the web, February 24-28 in New York City. The National Science Foundation and Google supported the work.
Source: Carnegie Mellon University