Dating apps are rife with ‘digital-sexual racism’

"Research finds that sexual racism, or the act of ranking 'individuals as romantic partners in a way that reinforces ideas of racial hierarchy,' runs rampant in dating," explains Celeste Curington. (Credit: Good Faces Agency/Unsplash)

A sociologist explains how racism manifests on dating apps.

Despite the popularity of dating apps among those seeking intimate connections, they can pose unique problems and may even exacerbate existing ones, says Celeste Curington, assistant professor of sociology at Boston University.

In her book The Dating Divide: Race and Desire in the Era of Online Romance (UC Press, 2021), Curington and her colleagues build on existing research on race and dating and spotlight a form of racism unique to the online dating world: digital-sexual racism.

Here, Curington breaks down key takeaways from her research and provides solutions for ways apps and their users can improve the online dating experience. (This interview has been edited for length and clarity.)

Q

How do you define “digital-sexual racism”?

A

Research finds that sexual racism, or the act of ranking “individuals as romantic partners in a way that reinforces ideas of racial hierarchy,” runs rampant in dating.

The anonymity built into the design of online dating apps, together with the contemporary context of neoliberal colorblindness (or “not seeing race”), consumerism, and the rise of new digital technologies, disguises enduring racial discrimination in intimate life.

These “individual preferences” massively and systematically segregate the internet, reinforcing categorical thinking and policing digital self-presentation, all without the need for in-person avoidance.

Q

Based on your research, how has the internet changed the dating landscape?

A

For those with access, the internet allows people to connect with others they wouldn’t ordinarily encounter day to day because of residential segregation and a lack of racial and class diversity in schools and workplaces.

The internet also provides a sort of anonymity that may be absent in people’s own social circles and the spaces they frequent, and it can provide a space for community building and belonging. The COVID-19 pandemic has also made people more reliant on online apps for social interaction than before.

At the same time, it’s too early to say that online dating has massively changed the scene. In our own research, we find that traditional gender scripts are reinforced via online dating. Heteronormative expectations around communication and presentation remain intact. Racial discrimination is widespread, if not amplified. And while more interactions are available via apps, the quality of the interactions is not necessarily great.

Most of the people we interviewed for our book complained about the “McDonald’s assembly line” effect of online dating as they swipe through profiles. Some believed that this led to superficial, scripted, and underwhelming interactions. Some enjoyed garnering interest from users, likening it to an “ego boost,” while others felt that the feeling was fleeting. They questioned why they felt they increasingly needed to be “seen” and evaluated by others to feel valued.

BIPOC users, especially Black women on mainstream dating apps, found that violent racist and sexist behavior directed toward them was front and center. Homophobia, transphobia, fatphobia, and misogyny all intersect with racism to shape the experiences (and behaviors) of users.

Digital racism is unfortunately pervasive; as online dating allows for greater interracial interactions, it also brings to light the deep sexual racism that is less visible in our everyday lives. This is no small issue. Research finds that men who use these platforms heavily view sexual racism as more acceptable. This is a public issue.

Q

Are dating app algorithms racially biased?

A

Algorithms have a key role in digital-sexual racism; they’re created by people within a social world pervaded by systemic gendered racism. New technologies encompass a range of discriminatory designs that encode and amplify inequity through mechanisms such as machine learning.

In our book, we find that all groups, including gays and lesbians, are generally most likely to ignore Black women. Asian men and Black men are also least desired. This means an algorithm may present fewer Black women, Black men, or Asian men to any given user because past usership has “taught” the algorithm that these groups are “least desirable.”

Our ability to understand how behaviors are modulated by platforms is complicated by the invisibility of the proprietary algorithms that secretly shape dater interaction on many such sites. The outcomes of machine learning reproduce and amplify structural inequalities.
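To make that feedback loop concrete, here is a minimal, purely illustrative sketch (not any dating platform’s actual code) of a ranker that scores candidates by the like rates observed in past swipe logs. Groups that earlier users ignored receive lower scores, sink down the deck, and so get even fewer chances to be seen, which further depresses their scores on the next training pass. The group labels, rates, and deck mechanics are all hypothetical.

```python
import random
from collections import defaultdict

# Hypothetical swipe history: like rates per (made-up) group label.
# Assume past users swiped right on group "C" far less often.
GROUPS = ["A", "B", "C"]
historical_like_rate = {"A": 0.45, "B": 0.40, "C": 0.15}

def simulate_round(like_rate, deck_size=120):
    """Rank a random deck of candidates by each group's observed like rate
    and count how many top-of-deck slots (the ones users actually see)
    each group receives."""
    deck = [random.choice(GROUPS) for _ in range(deck_size)]
    deck.sort(key=lambda group: like_rate[group], reverse=True)  # "learned" score
    exposure = defaultdict(int)
    for candidate in deck[: deck_size // 3]:  # only the top third is ever shown
        exposure[candidate] += 1
    return dict(exposure)

random.seed(0)
print(simulate_round(historical_like_rate))
# Group "C" gets few or no visible slots, so it collects even fewer likes,
# which lowers its learned score again in the next training cycle.
```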

Q

In conducting your research for the book, were there any statistics that were particularly shocking or surprising to you?

A

I was not surprised to find that anti-Black digital-sexual racism was so prevalent. But I found it noteworthy that digital-sexual racism was so pronounced that it outweighed key variables commonly thought to shape partner preferences, such as gender, body weight, and class. While these variables do matter, they seem to matter most within racial groups rather than across racial groups.

White women were the most discriminatory in their exclusion of Black daters, which is also an important finding. Of course, this finding is reflected in qualitative studies that find white men have more autonomy in dating interracially; white men are less likely than women to face policing and backlash from friends, family members, and random people in public.

When we think of these divisions, we should also recognize that people are doing the work to maintain them. And in the case of online dating, where the issue of opportunity is not central, white women are actively reinforcing these divisions. BIPOC users are generally anti-Black in their sorting behaviors as well.

Q

What steps can dating sites and apps take to address discrimination?

A

Emergent work seeks to create fairer, more socially just algorithms. This is an important step in the process.

Removing race filters

Some members of the public have also called for doing away with the race filter in online dating, and some companies have taken this step. But I believe that a “color-blind” approach to online dating isn’t going to solve all the issues that we track in our book. Removing the ability to filter by race does not mean that the most marginalized users will suddenly no longer face racist misogyny.

Prioritizing interest-based interactions

So where does that leave us? For one, in addition to combatting the hidden role of algorithms, apps could allow daters to connect first over shared interests. This would open an initial interaction without the focus on racialized physical features such as skin color, hair, and body type. Of course, appearance matters; in our own data, profiles lacking pictures were generally ignored. However, withholding pictures until later in the online communication process is one possible approach that would de-emphasize race in the initial sorting stage. Another step worth considering is incorporating profile statistics on how often a given user responds to people of differing demographics (a rough sketch of how such statistics might be computed appears below).
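As a thought experiment, that “profile statistics” idea could be as simple as reporting, for each user, the share of incoming messages they answer, broken down by the sender’s demographic group. The sketch below is purely hypothetical; the log format, field names, and group labels are assumptions, not any app’s real data model.

```python
from collections import defaultdict

# Hypothetical message log: (recipient, sender_group, recipient_replied).
message_log = [
    ("user_1", "Black women", False),
    ("user_1", "white women", True),
    ("user_1", "Black women", False),
    ("user_1", "Asian women", True),
    ("user_1", "white women", True),
]

def response_rates(log, user):
    """Share of incoming messages a user answers, broken down by the
    demographic group of the person who wrote to them."""
    received, replied = defaultdict(int), defaultdict(int)
    for recipient, sender_group, answered in log:
        if recipient != user:
            continue
        received[sender_group] += 1
        replied[sender_group] += int(answered)
    return {group: replied[group] / received[group] for group in received}

print(response_rates(message_log, "user_1"))
# {'Black women': 0.0, 'white women': 1.0, 'Asian women': 1.0}
```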

Making apps safer for everyone

While dating apps do often offer users the ability to block or report troublesome users, more aggressive steps need to be taken to immediately stop these users. Another issue involves data security. Some research reports that dating apps are not doing enough to protect LGBTQIA+ users who are in areas that are actively hostile to LGBTQIA+ people.

Overall, dating companies hold a tremendous amount of influence in designing how daters approach one another and go about the process of dating. We need to push companies to become far more socially conscious. For example, companies can take the lead in educating their users about sexual racism and how individual dating behaviors feed into larger, more systematic trends.

They can do more to protect LGBTQIA+ people’s privacy and safety by ensuring encryption and stopping the sale of users’ personal information to third-party entities.

They can also take the lead in hiring diverse employees who take a socially conscious approach at all levels of their companies, including design.

Focusing on real-life interactions

These changes will matter much more if we continue to work toward dismantling intersectional oppressions in our own lives and within the greater society. I’m convinced that our efforts to undo these oppressions in “real life” (housing, education, cultural representation, everyday forms of gendered racism, interpersonal relationships, etc.) will impact the oppressions that also exist, and are reinforced, online.

Source: Thalia Plata for Boston University