As Valentine’s Day approaches, couples across the country are preparing for this long-standing tradition—and there’s a very good chance they met through online dating. But while dating apps can help people find a partner (or just a fun date), they can also subject users to significant hate and harassment. Although dating apps have accrued substantial reach and influence, these companies provide very little transparency around how they keep users safe and how they moderate content. Much of the conversation around online platform accountability focuses on companies like Facebook and Google, but dating apps face many of the same issues.
The online dating sphere has changed radically since Match.com, founded in 1995, transformed the dating landscape by moving hundreds of thousands of meet cutes from cafés to chat boxes. Then came the “swipe right” apps. Grindr launched in 2009, followed by Tinder, Bumble, and many other apps that are now household names. As these apps grew in popularity, so did services catering to specific ethnic, racial, and religious communities, and even to particular interests. Looking for South Asian partners? Dil Mil is there for you. Want to find a partner whose résumé matches your expectations? There’s even an app for that.
But these apps can also put users in harm’s way. Stories of hate, harassment, sexual assault, and downright weird encounters, both on apps and during app-facilitated dates, have gained notoriety. Many of these anecdotes underscore the opaque process users must navigate when trying to report an offending account, as well as platforms’ lackluster responses to those reports. Yet dating apps provide very little transparency and accountability around how they handle safety and content issues on their services. For example, few of these platforms currently publish clear and easily accessible copies of their community guidelines, even though these policies are essential for understanding what kinds of content are permitted on a service.
Some dating apps have made progress in this regard. Tinder’s high-level guidelines prohibit nudity, sexual content, and hate speech, among other things, and bar users under 18 from the app. Coffee Meets Bagel’s guidelines include specific restrictions on photographs involving guns, drugs, and other sensitive subjects.
Uber, another platform that brings people together in the offline world, publishes transparency reports outlining the volume and nature of safety incidents, such as sexual assaults, that occur during app-facilitated interactions. (It is currently the only tech platform that does something like this.) Social media companies also publish transparency reports that outline the scope and scale of their content policy enforcement efforts, including the removal of content determined to contain hate speech, bullying and harassment, or graphic nudity. Despite calls for dating apps to follow suit, no major dating app publishes a transparency report.
Today, approximately one-third of young people in the United States say they use dating apps. These services are especially popular among certain marginalized groups, such as lesbian, gay, and bisexual adults. A failure to provide transparency and accountability around how a dating app moderates content can therefore create barriers to access and establish a system that fails to adequately protect the safety of its users. As research indicates, when vulnerable groups feel threatened online, they often begin to self-censor and may even leave an online platform altogether. Given that dating apps are prominent methods for building community and finding partners, friends, and even business connections, these companies must ensure that their services are welcoming and inclusive for all users.