Dangerous Tech: How Our Devices Discriminate

After September 11, the United States launched the Terrorist Surveillance Program to monitor potential terrorist activity on a global scale. During its implementation, the National Security Agency swept up both international and domestic communications, even though many of these conversations fell outside the scope of the initiative. Although the program was temporarily ruled unconstitutional, that ruling was overturned and the program has continued in various iterations ever since. As a result, much of our data is now open to government and corporate monitoring. While this information would ideally be used only to keep us safe, evidence shows that it is exacerbating racial, gendered and sexual biases in our culture.

There are two major ways data reinforces biases. The first occurs when data collection programs are designed—whether intentionally or otherwise—to be biased. Biometric scanners, which use physical traits to identify individuals, are one clear example. According to Shoshana Amielle Magnet, “[Biometric] technologies suffer from ‘demographic failures,’ in which they reliably fail to identify particular segments of the population. That is, even though they are sold as able to target markets and sell products to people specifically identified on the basis of their gender and race identities, instead these technologies regularly over-target, fail to identify, and exclude particular communities.”

In other words, biometric devices are often trained to classify certain features as normative—light skin, male features, slender traits—while other features are labeled as abnormal. As such, women, LGBT folks and people of color are much more likely to be misidentified or to have their bodies raise red flags in the system. When this occurs, the program needs human intervention to resolve the issue. Rather than fixing the problem, however, this often creates an opportunity for harassment and abuse. One common example is airport scanners, which allow TSA agents to see outlines of anatomical features, some of which may not be visible otherwise. Not only does this process increase harassment against women, whose bodies are always at risk of being sexualized and objectified, but it also has the potential to out individuals who are transgender or gender-non-conforming, exposing them to harassment and even assault.
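The mechanics of such a "demographic failure" can be made concrete with a toy example. The sketch below uses entirely invented numbers (no real biometric system, dataset or vendor is involved) to show how a match threshold calibrated on one group's data produces a far higher false-rejection rate for an under-represented group:

```python
# Hypothetical sketch of a "demographic failure" in a biometric matcher.
# A single accept/reject threshold is tuned on the majority group's match
# scores, then applied to everyone; the under-represented group is rejected
# far more often. All numbers here are invented for illustration only.
import numpy as np

rng = np.random.default_rng(42)

# Simulated genuine-match scores (higher = more confident the scan matches).
# Assume the system was calibrated mostly on Group A, so Group A's genuine
# scores are high and tight, while Group B's are lower and noisier.
group_a_scores = rng.normal(loc=0.85, scale=0.05, size=10_000)
group_b_scores = rng.normal(loc=0.70, scale=0.10, size=1_000)

# Threshold chosen so that ~1% of Group A's genuine users are falsely rejected.
threshold = np.quantile(group_a_scores, 0.01)

false_reject_a = np.mean(group_a_scores < threshold)
false_reject_b = np.mean(group_b_scores < threshold)

print(f"Threshold tuned on Group A: {threshold:.3f}")
print(f"False rejection rate, Group A: {false_reject_a:.1%}")  # roughly 1%
print(f"False rejection rate, Group B: {false_reject_b:.1%}")  # far higher
```

In a deployment, each of those extra rejections is a person pulled aside for the kind of human "resolution" described above.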

The second way that data reinforces biases is through misuse. Once data is collected, it can be used maliciously. For example, Facebook recently released a statement noting that it has 10 million more users in the U.S. than are identified by the census. While this discrepancy may be the result of an error, or a reflection of the number of users who create multiple accounts, it also has the potential to be misinterpreted by groups seeking to promote a specific, and possibly harmful, narrative. Notably, anti-immigration groups may presume that Facebook's extra users are immigrants and may respond by harassing or tracking down the presumed offenders. Such behavior is not uncommon when findings are left unexplained or unverified, as the recent controversy surrounding vaccinations made clear.

Even if you don’t use social media, it is highly likely that your data is being collected and misused. Smartphones and fitness trackers both record geographic data, and many of the companies behind these devices sell it to data brokers. These brokers, in turn, sell the data to advertising companies that target users with customized ads. One ad agency, Copley Advertising, partnered with a number of organizations opposed to reproductive rights to show women graphic anti-abortion ads while they were sitting in Planned Parenthood clinics. While such behavior is clearly unconscionable, it is legal within the United States. As Aaron Pressman notes, “The Federal Trade Commission last year cracked down on the use of undisclosed consumer smartphone tracking in stores, but the agency’s solution—that stores disclose the practice in lengthy and rarely read privacy policies—seems unlikely to provide much protection. The FTC’s authority doesn’t extend to nonprofit political groups.” This means that the onus is on the individual to protect themselves by disabling location tracking, using AdBlock or other privacy software and/or turning off their device.
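The sources above do not describe Copley Advertising’s actual implementation, but location-based targeting of this kind generally relies on “geofencing”: checking whether a device’s reported coordinates fall inside a set radius around a place of interest and serving an ad if they do. The sketch below illustrates that general idea; the coordinates, radius and function names are hypothetical.

```python
# Minimal sketch of how location-based ad targeting ("geofencing") can work.
# A broker receives a device's reported latitude/longitude and checks whether
# it falls inside a fenced radius around a location of interest. The
# coordinates, radius and ad decision below are hypothetical examples, not
# any vendor's actual system.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))

# Hypothetical geofence: 100 m radius around a clinic's coordinates.
FENCE_LAT, FENCE_LON, FENCE_RADIUS_M = 40.7410, -73.9896, 100

def should_target(device_lat, device_lon):
    """Return True if the device's reported location is inside the geofence."""
    return haversine_m(device_lat, device_lon, FENCE_LAT, FENCE_LON) <= FENCE_RADIUS_M

# A device reporting coordinates inside the fence would be served the targeted ad.
if should_target(40.7411, -73.9897):
    print("Serve targeted ad to this device ID")
```

Disabling location services on a device starves this check of its input, which is why the advice to turn off location tracking matters even though it shifts the burden onto the individual.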

While there are steps you can take to protect your privacy—turning off location data on your smart device, using an anonymous search engine like DuckDuckGo, using a Virtual Private Network, paying for items in cash—it can be costly or difficult to implement effective data privacy techniques consistently. As such, it is crucial for us to advocate for stronger privacy protection laws at both the state and federal level. The European Union offers a strong model for doing so; as its regulations explain, “under EU law, personal data can only be gathered legally under strict conditions, for a legitimate purpose. Furthermore, persons or organizations which collect and manage your personal information must protect it from misuse and must respect certain rights of the data owners which are guaranteed by EU law.” Implementing similar laws in the United States would offer greater privacy protections to all citizens, and particularly to the marginalized groups—women, people of color, non-binary individuals—who frequently experience discrimination.


Christina Boyles is the Digital Scholarship Coordinator at Trinity College. She is co-founder of the Makers by Mail project and the founder of the Hurricane Memorial Project. Her research explores the relationship between surveillance, social justice and the environment. Her published work appears in The Southern Literary Journal, The South Central Review and Plath Profiles, and her forthcoming work will appear in the Debates in the Digital Humanities series and Studies in American Indian Literatures.
The Fembot Collective is a collaborative of faculty, graduate students and librarians promoting research on gender, new media and technology. The Fembot community spans North America and Asia and encourages interdisciplinary and international participation.