Rape Is Not a Data Problem

Sexual violence is having a moment in American culture. From Florida State to Columbia, colleges across the nation are struggling with how to address campus rape. Last month, the theft of nude photos of prominent female celebrities prompted the Internet to do what it does worst, and spread them virally across 4chan and Reddit. And of course, no single recent act of violence captured the public eye as strongly as Ray Rice's punch seen 'round the world, delivered to his then-fiancée Janay Palmer in an Atlantic City elevator—and since witnessed by millions, courtesy of TMZ.

Sexual violence is a pernicious, often secretive problem desperate for creative solutions. So it’s unsurprising that many of the well-intentioned among us are considering how technology might be used to address it. In August, four undergraduate students at North Carolina State University debuted Undercover Colors, a nail polish that changes color when it comes into contact with common date-rape drugs, like Rohypnol. The polish is designed to let a woman “discreetly ensure her safety by simply stirring her drink with her finger” to detect the presence of a drug, and presumably extricate herself from the situation.

Along the same lines, Good2Go, an iPhone/Android app released in September, aims to create “hard data” about consent to a sexual encounter. The app requires its user to assert that she is Good2Go (that is, she consents to have sex). It then asks her to assess how intoxicated she is—if she’s “pretty wasted,” the app won’t permit her to consent—and sends a verification code to her potential paramour’s phone to confirm his identity. The app’s intent is to “facilitat[e] communication” around sexual consent by “creating a pause” for discussion between potential partners. (iTunes pulled the app earlier this month, saying it violated developer guidelines that prohibit “excessively objectionable or crude content.” In response, the founders have shuttered the website and are reconsidering the app’s future.)

Both Undercover Colors and Good2Go have been subject to a number of well-founded and well-articulated critiques: their clumsy, unrealistic fit with the social realities of sexual violence; their implication that women (not men) should be responsible for preventing rape; and (in Good2Go’s case) the possibility that personal information collected by the app might be sold or subpoenaed. But “rape solutions” like these also reveal a deeper and far thornier issue: the tendency to address sexual violence as a data problem.

Undercover Colors and Good2Go are technological tattletales. Both are designed to tell the truth about an encounter, with the objectivity and dispassion of a database or a chemical reaction. Tattletale solutions make sense only if we see rape, fundamentally, as a problem of bad data. But thinking about rape this way implies that what we’re most worried about is men being wrongly accused of sexual assault; that the reports women provide aren’t reliable and should be replaced by something “objective.” These technologies prioritize the creation of that data over any attempt to empower women or to change the norms around sexual violence; they’re rape culture with a technological veneer.

Even after the fact—once the act has happened—our tendency is to view sexual violence through the lens of data rather than human experience. Take the footage of Janay Palmer’s battery, subject to tremendous public scrutiny and comment. Across the board, that film has been framed almost exclusively through the lens of data and proof: Who saw the video, and when? What does it reveal about the NFL brass’s veracity and institutional policies? Whose account does it confirm, and on whose does it cast doubt? The main characters in this story are Ray Rice, Roger Goodell, and TMZ; Janay Palmer is just a woman’s face at the receiving end of a fist. Somehow the assault isn’t about her at all.

By looking at sexual assault through a data lens, technologies like these collapse complex experiences into discrete yes-or-no data points. But sex is not a singular act, and consent is an ongoing conversation: Some acts are agreed upon but others are not, and participants are allowed to change their minds at any point. Focusing on data production drives us to think of sexual violence in black-and-white terms—a dangerous oversimplification of a far messier and more nuanced reality.

Silicon Valley might see this issue as a catch-22. When technologists make silly apps, they are slammed for ignoring real social problems. When they take on real social problems, they’re criticized for treating them like silly apps. It’s encouraging to see techies trying to address knotty social issues like sexual violence. But if technology is going to intervene for good, it needs to adopt a more nuanced approach—one that appreciates that not every problem can be treated as a data problem. Laundry delivery is a data problem; rape is not.

So how can the tech world help to take on sexual violence? There are successful models out there. Consider Pivot, created by students and faculty at the University of Washington last year. Pivot aims to empower victims of human trafficking by connecting them with essential services. The tool consists of a simple paper insert, printed with resources for trafficking victims, that comes packaged with a nondescript sanitary pad. A woman can open the package and read the information alone in a bathroom stall, undetected by her captor; she can flush the water-soluble insert, saving only the phone number for a trafficking hotline (encoded to look like a set of “lucky numbers” from a fortune cookie). In a similar vein, Aspire News looks like a traditional news aggregator app—but discreetly links to resources for victims of domestic violence, and offers a secret shortcut (three taps at the top of the screen) to alert a user’s trusted contacts if she is in an emergency situation.

These tools are sensitive to the realities of sexual violence: they open channels for communication on women’s terms, and they recognize that domestic violence is often characterized by secrecy and that many women choose to seek help through networks of trust and long-term support. Most importantly, they amplify the voices of victims by giving them resources to speak for themselves, rather than subjugating their voices to a more “truthful” data source.

Will tools like Pivot and Aspire News “solve” the problem of sexual violence? Of course not; no technology can do this on its own. But they do show that technology can be used thoughtfully to address complicated social problems. To do this, it has to be sensitive to the social, economic, and institutional realities in which it’s embedded.

We need to ensure that we don’t simply create tattletale technologies that “speak for” victims by translating their complex realities into reductive, proof-oriented data points. To do so makes these “solutions” part of the problem.

This article was originally published at http://www.theatlantic.com/technology/archive/2014/10/rape-is-not-a-data-problem/381904/