Police officers consistently use less respectful language with black community members than with white community members, the first systematic analysis of body camera footage shows.
Although subtle, widespread racial disparities in officers’ language use may erode police-community relations, researchers warn.
“Our findings highlight that, on the whole, police interactions with black community members are more fraught than their interactions with white community members,” says Jennifer Eberhardt, professor of psychology at Stanford University and coauthor of the study in the Proceedings of the National Academy of Sciences.
The findings about racial disparities in respectful speech held true even after researchers controlled for the race of the officer, the severity of the infraction, and the location and outcome of the stop.
To analyze the body camera footage, researchers first developed an artificial intelligence technique for measuring levels of respect in officers’ language. They then applied it to transcripts from 981 traffic stops that the Oakland Police Department (OPD) in California made in a single month.
The data show that white residents were 57 percent more likely than black residents to hear a police officer say the most respectful utterances, such as apologies and expressions of gratitude like “thank you.”
Black community members were 61 percent more likely than white residents to hear an officer say the least respectful utterances, such as informal titles like “dude” and “bro” and commands like “hands on the wheel.”
“To be clear: There was no swearing,” says coauthor Dan Jurafsky, professor of linguistics and of computer science. “These were well-behaved officers. But the many small differences in how they spoke with community members added up to pervasive racial disparities.”
“The fact that we now have the technology and methods to show these patterns is a huge advance for behavioral science, computer science, and the policing industry,” says Rob Voigt, a linguistics doctoral student and the study’s lead author. “Police departments can use these tools not only to diagnose problems in police-community relations but also to develop solutions.”
The Oakland Police Department, like many police departments nationwide, has been using body-worn cameras to monitor police-community interactions. But drawing accurate conclusions from hundreds of hours of footage can be challenging, Eberhardt says. Just “cherry-picking” negative or positive episodes, for example, can lead to inaccurate impressions of police-community relations overall.
“The police are already wary of footage being used against them,” she says. “At the same time, many departments want their actions to be transparent to the public.”
183 hours of footage
To satisfy demands for both privacy and transparency, the researchers needed a way to approach the footage as data showing general patterns, rather than as evidence revealing wrongdoing in any single stop.
Yet “researchers can’t just sit and watch every single stop,” Eberhardt says. “It would take too long. Besides, their own biases could affect their judgments of the interactions.”
So researchers examined transcripts from 183 hours of body camera footage covering 981 stops, conducted by 245 different OPD officers in April 2014.
In the first phase of the study, human participants examined a subsample of the transcribed conversations between officers and community members—without knowing the race or gender of either—and rated how respectful, polite, friendly, formal, and impartial the officers’ language was.
In the second phase, the researchers used these ratings to develop a computational linguistic model of how speakers show respect, including apologizing, softening commands, and expressing concern for listeners’ well-being. They then created software that automatically identified these words, phrases, and linguistic patterns in the transcripts of the officers’ language.
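To picture what such software does, here is a minimal sketch of cue-based scoring: count respectful and disrespectful markers in an utterance and sum their weights. The cue lexicons and weights below are invented for illustration; the study’s actual model fit its weights to the human respect ratings rather than setting them by hand.

```python
# Illustrative sketch: score an utterance by summing weights of
# politeness cues found in it. Lexicons and weights are invented,
# not taken from the study's fitted model.

RESPECTFUL_CUES = {"thank": 1.0, "sorry": 1.0, "apologize": 1.0, "please": 0.5}
DISRESPECTFUL_CUES = {"dude": -1.0, "bro": -1.0, "hands": -0.5}

def respect_score(utterance: str) -> float:
    """Sum the cue weights found in a lowercased, de-punctuated utterance."""
    score = 0.0
    for word in utterance.lower().split():
        word = word.strip(".,!?")
        score += RESPECTFUL_CUES.get(word, 0.0)
        score += DISRESPECTFUL_CUES.get(word, 0.0)
    return score

print(respect_score("Thank you, drive safely."))   # positive
print(respect_score("Dude, hands on the wheel."))  # negative
```

A hand-built lexicon like this is brittle; learning the weights from thousands of rated examples, as the researchers did, lets the model capture subtler patterns than any fixed word list.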
In the third phase, researchers used this software to analyze the remaining transcripts—a total of 36,000 officer utterances with 483,966 words. Because the team had so much data, they could statistically account for the race of the officer, the severity of the offense, and other factors that could affect officers’ language.
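“Statistically accounting for” a factor like offense severity can be illustrated, in its simplest form, by comparing racial groups only within matched strata and then averaging the within-stratum gaps. The toy data, variable names, and numbers below are entirely invented; the study used regression models on its full dataset rather than this simple stratification.

```python
# Toy illustration of controlling for a covariate by stratifying:
# compare average respect scores within each offense-severity level,
# then average the within-stratum gaps. All numbers are fabricated.
from collections import defaultdict
from statistics import mean

# (driver_race, offense_severity, respect_score) -- invented toy data
stops = [
    ("white", "minor", 0.9), ("white", "minor", 0.7),
    ("black", "minor", 0.5), ("black", "minor", 0.3),
    ("white", "major", 0.2), ("black", "major", 0.0),
]

by_stratum = defaultdict(lambda: defaultdict(list))
for race, severity, score in stops:
    by_stratum[severity][race].append(score)

# Within-stratum gap (white minus black), averaged across strata,
# so the comparison is never made across different severity levels.
gaps = [mean(g["white"]) - mean(g["black"]) for g in by_stratum.values()]
adjusted_gap = mean(gaps)
print(round(adjusted_gap, 2))
```

With enough data, as the 36,000-utterance corpus provided, the same idea scales to many covariates at once via regression, which is what allowed the team to hold officer race, offense severity, and stop outcome constant.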
“Understanding and improving the interactions between the police and the communities they serve is incredibly important, but the interactions can be difficult to study,” Jurafsky says. “Computational linguistics offers a way to aggregate across many speakers and many interactions to detect the way that everyday language can reflect our attitudes, thoughts and emotions—which are sometimes outside of our own awareness.”
“Our findings are not proof of bias or wrongdoing on the part of individual officers,” Eberhardt cautions. “Many factors could drive racial disparities in respectful speech.”
Tone of voice
The research team is currently extending their work to analyze the language used by community members during the traffic stops and to study other linguistic features captured by the body cameras, including tone of voice. They also plan to explore the interplay of officers’ and community members’ speech as it unfolds over time.
“There is so much you can do with this footage,” Eberhardt says. “We are very excited about the possibilities.”
Eberhardt praised the City of Oakland and OPD for being open to having their data examined, and said she hopes that other departments across the country will invite similar collaborations.
“I’m hopeful that, with the development of computational tools like ours, more law enforcement agencies will approach their body camera footage as data for understanding, rather than as evidence for blaming or exonerating. Together, researchers and police departments can use these tools to improve police-community relations.”
Source: Stanford University