In recent experiments, significantly fewer women than men saw Google ads promising help getting jobs that pay more than $200,000. The findings raise questions about the fairness of targeted online ads.
The study, which used a tool called AdFisher that runs experiments with simulated user profiles, establishes that the gender discrimination was real, says Anupam Datta, associate professor of computer science and of electrical and computer engineering at Carnegie Mellon University, where AdFisher was developed.
Still unknown, he emphasizes, is who or what is responsible. Was it the preference of advertisers? Or was it the unintended consequence of machine learning algorithms that drive online recommendation engines?
“This just came out of the blue,” Datta says of the gender discrimination finding, which was part of a larger study of the operation of Google’s Ad Settings webpage, formerly known as Ad Preferences. The finding underscores the importance of using tools such as AdFisher to monitor the online ad ecosystem.
“Many important decisions about the ads we see are being made by online systems,” Datta says. “Oversight of these ‘black boxes’ is necessary to make sure they don’t compromise our values.”
The study, published in the Proceedings on Privacy Enhancing Technologies and presented this month, used the automated AdFisher tool to run 21 experiments evaluating Ad Settings, a webpage Google created to give users some control over the ads delivered to them.
What’s inside the ‘black box’?
AdFisher creates hundreds of simulated users, enabling researchers to run browser-based experiments in which they can identify various effects from changes in preferences or online behavior. AdFisher uses machine-learning tools to analyze the results and perform rigorous statistical analyses.
“We can’t look inside the black box that makes the decisions, but AdFisher can find changes in preferences and changes in the behavior of its virtual users that cause changes in the ads users receive,” says Michael Carl Tschantz, a PhD alumnus of Carnegie Mellon who is now a researcher at the International Computer Science Institute in Berkeley, California.
Previous researchers have been able to show only a correlation, not causation, between changes in Ad Settings and ads displayed to users.
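The causal claim rests on randomized assignment plus a significance test: because simulated users are randomly assigned to groups before any treatment, a statistically significant difference in the ads they later receive can be attributed to the treatment rather than to pre-existing differences. The paper's actual analysis is more sophisticated (it trains a classifier on the ads and uses permutation testing), but the core permutation-test idea can be sketched in plain Python. The per-user ad counts below are hypothetical, purely for illustration.

```python
import random

def permutation_test(group_a, group_b, trials=10_000, seed=0):
    """Estimate a p-value by relabeling: how often does a random
    re-split of the pooled observations produce a difference in
    group means at least as large as the one actually observed?"""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n = len(group_a)
    extreme = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / (len(pooled) - n))
        if diff >= observed:
            extreme += 1
    # Add-one smoothing keeps the estimate strictly positive.
    return (extreme + 1) / (trials + 1)

# Hypothetical counts of a particular ad shown to each simulated user:
male_counts = [4] * 10 + [3] * 10
female_counts = [1] * 10 + [0] * 10
p = permutation_test(male_counts, female_counts)
```

A small p-value means random relabeling almost never reproduces so large a gap, so the group assignment itself is the likely cause of the difference.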
To study the impact of gender, researchers used AdFisher to create 1,000 simulated users—half designated male, half female—and had them visit 100 top employment sites. When AdFisher then reviewed the ads that were shown to the simulated users, the site most strongly associated with the male profiles was a career coaching service for executive positions paying more than $200,000.
“The male users were shown the high-paying job ads about 1,800 times, while the female users saw those ads about 300 times,” says Amit Datta, a PhD student in electrical and computer engineering. By comparison, the ads most strongly associated with the female profiles were for a generic job posting service and an auto dealer.
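A back-of-the-envelope check shows how lopsided such a split is. If each of the roughly 2,100 impressions were equally likely to go to either group, the chance of 1,800 or more landing on one side is an exact binomial tail probability. The study itself establishes significance with AdFisher's permutation tests; the sketch below is only an illustrative calculation under that simple equal-exposure assumption.

```python
import math

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p), summed term by term in
    log space so large factorials do not overflow."""
    total = 0.0
    for i in range(k, n + 1):
        log_term = (math.lgamma(n + 1) - math.lgamma(i + 1)
                    - math.lgamma(n - i + 1)
                    + i * math.log(p) + (n - i) * math.log(1 - p))
        total += math.exp(log_term)
    return total

# Null hypothesis: each impression is a fair coin flip between groups.
p_value = binom_sf(1800, 2100)
```

The resulting tail probability is vanishingly small, which is why a 1,800-to-300 split cannot plausibly be chance under equal exposure.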
The researchers have no evidence that Google is doing anything illegal or violating its own policies, Anupam Datta says. Though AdFisher can identify discrepancies, it can’t explain why they occur without a look inside the black box, he adds. Such discrepancies could originate with advertisers choosing to target men, or with Google’s own ad-serving system.
Anupam Datta is currently working with Microsoft Research to get an inside look at Microsoft’s ad ecosystem. He says he hopes other organizations will use tools such as AdFisher to monitor the behavior of their ad targeting software and that regulatory agencies such as the Federal Trade Commission will use the tool to help spot abuses.
In addition to the findings regarding gender discrimination, another experiment showed that some changes in ads presented to users based on their browsing activity are not transparently explained via the Ad Settings page.
When the simulated users visited webpages associated with substance abuse, Google subsequently began sending them more ads for a drug rehabilitation center.
Even so, this change was not reflected in Ad Settings, making it impossible for the user to change preferences and halt the substance abuse-related ads. Anupam Datta says that is likely because the change in ads was a consequence of remarketing, in which Google lets advertisers reach users who have already visited their pages.
This finding demonstrates that the Ad Settings page doesn’t provide a complete picture of the inferences Google has made about a user. The authors noticed Google started highlighting this limitation on the Ad Settings page a few weeks ago.
In another experiment, the researchers found that adjusting Ad Settings can let users avoid some classes of ads they may dislike. Simulated users who visited online dating websites could remove that interest in Ad Settings and subsequently received fewer ads related to online dating, giving users some choice over the ads Google shows them.
The National Science Foundation supported the work.
Source: Carnegie Mellon University