Abortion in the Age of Surveillance Capitalism

In response to concerns about Google’s privacy violations, then-CEO Eric Schmidt declared in 2009 that “if you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” Blunt as it was, the remark made explicit what was already widely understood in tech industry circles: there is immense value in seeing, tracking, and knowing all.

His words resonate differently over a decade later, as tech giants find themselves at yet another crossroads. After the Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization in June 2022, 13 states outright criminalized abortion. Soon after, many women expressed concern over fertility and period-tracking apps, fearing that the data they share could be used to incriminate them. Experts have since pointed out that data collection extends far beyond period-tracking apps: search queries, card transactions, and even location data have all been noted as sources of incriminating evidence. As a result, the Dobbs decision has sparked a nationwide conversation about privacy and data ownership, which has translated into bipartisan attempts at data privacy legislation.

The public is primarily concerned about incrimination: sensitive health data ending up in the hands of law enforcement. On this front, while some platforms like Google have announced protections, such as deleting location data tied to abortion clinics and fertility centers, many tech giants have remained silent. But this raises the question: why do health apps collect such data in the first place? Why is our location constantly monitored? Why is every interaction, from card transactions to fridge openings, becoming digitized?

The answer lies in surveillance capitalism. Coined by the scholar Shoshana Zuboff in her 2019 book “The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power,” surveillance capitalism describes the business model behind tech platforms, data brokers, and advertisers: platforms track user data, data brokers aggregate and resell it, and advertisers use it to personalize ads and improve their efficacy. In the case of abortion surveillance, this business model creates a symbiotic relationship between corporate profit motives and the interests of law enforcement. The tools built to facilitate ad targeting have seamlessly become instruments of criminalization. Abortion surveillance is not tech gone rogue, but business as usual.

Methods of abortion surveillance

During the weeks following the initial leak of the Dobbs decision, many raised alarm over data collected by period and fertility trackers. Because these tools are used by so many people — nearly one in three women in America, according to a 2019 Kaiser Family Foundation survey — any risk of compromised user data is alarming. If seized by law enforcement, the data these apps collect to analyze menstrual cycles could be wielded as evidence against someone seeking abortion.

These period and fertility trackers need to collect some data to function, of course. However, many use the data for far more than cycle predictions. As a recent FTC settlement with the popular period-tracking app Flo revealed, these apps share user data with third-party advertisers and data brokers (in Flo’s case, without users’ knowledge or consent), who then use it to personalize advertisements. Someone who is pregnant may see more ads for maternity clothes, for example.

Flo is hardly an outlier. Health and fitness apps have a long history of relaxed regulation. Despite handling sensitive health information, most do not fall under the rigor of HIPAA. In the past, the FTC and FDA have categorized these platforms as having “low-level risk,” offering non-binding recommendations in lieu of regulation.

While period and fertility tracking apps have made headlines, they are just one of many tools that can be used to surveil abortion-seekers. Online search data, like queries for abortion medication, has been used as evidence before. In 2017, Mississippi woman Latice Fisher experienced a stillbirth; after searching her phone, law enforcement cited her queries for drugs like mifepristone and misoprostol as evidence against her. Buying these items in the real world is not risk-free either, as digital records of card transactions for abortion medication can also be seized.

Law enforcement acquires user data by seizing a suspect’s physical device, obtaining a warrant for their digital footprint, or requesting records directly from a search engine or tech platform. Geofencing, an advertising technique in which location data is continuously collected to deliver nearby product recommendations, has also become one of the most prevalent bases for warrants: so-called geofence warrants compel platforms to identify every device present near a given location during a given window.
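To make the mechanism concrete, the sketch below shows the core primitive behind geofencing: a point-in-radius check over location pings. The coordinates, device names, and haversine-based distance check are illustrative assumptions, not the implementation of any particular ad platform or warrant process.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in meters."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi, dlmb = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_geofence(lat, lon, fence_center, radius_m):
    """True if a single location ping falls within a circular geofence."""
    return haversine_m(lat, lon, fence_center[0], fence_center[1]) <= radius_m

# Hypothetical pings: (device id, latitude, longitude) near a made-up fence.
fence = (41.8781, -87.6298)  # illustrative coordinates, 100 m radius
pings = [("device_a", 41.8783, -87.6301), ("device_b", 41.9002, -87.7005)]
flagged = [d for d, lat, lon in pings if inside_geofence(lat, lon, fence, 100)]
print(flagged)  # ['device_a']: the ping that fell inside the fence
```

Whether the flagged result feeds an ad auction or a warrant return, the computation is the same; only the buyer changes.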

Beyond data requests from law enforcement, people seeking abortion can also be reported by other private citizens. Anyone can report a suspected abortion to the police, who then set warrants and subpoenas into motion. In Texas, the anti-abortion law SB 8 takes this principle one step further, empowering private citizens to sue anyone “aiding or abetting” an abortion. Another notable technique used by anti-abortion organizers is buying geofencing data en masse, acting as advertisers, and bombarding visitors to an abortion clinic with advertisements for alternatives like crisis pregnancy centers or adoption agencies. People targeted this way report seeing such ads for weeks or months after being near a clinic.

Police officers, too, have routinely created fake social media accounts, posing as “friends” of suspects, to gain access to information that could suggest involvement in an abortion. In the wake of criminalized abortion, efforts by both private actors and law enforcement agencies to obtain data implicating users in abortion will only become more common.

However, criminalization is not the only way to upend the lives of people seeking abortion. In an interview with the HPR, sex worker and critical tech researcher Olivia Snow described the potential for tech platforms to cause harm through systematic deplatforming. She has been kicked off platforms ranging from financial services to DoorDash, despite never telling them about her sex work. In her words, sex workers act as “canaries in coal mines” for surveillance oppression; the tactics institutions use to punish sex workers serve as a template for punishing other identities or actions. Just as platforms are able to detect sex work, they may be able to detect abortion through data like location, transaction amounts, and payment descriptions.

Under so many instruments of surveillance, people seeking abortion now face a labyrinth of precautions: tell no one, avoid search engines, pay only in cash, carry only a burner phone to the clinic, browse through a VPN, and never use a period- or fertility-tracking app. These conditions for obtaining an abortion free of oversight seem ridiculous, almost impossible, to satisfy.

Introducing surveillance capitalism 

Methods for state and private actors to track abortions have forced people seeking them to jump through an ever-growing number of hoops. But this dynamic is not inevitable; rather, it is the product of surveillance capitalism, which gives tech giants a powerful incentive to collect massive amounts of user data.

Google pioneered this economic model. In 2001, even as millions of people used its search engine, venture capital firms were wary of its viability as a business. After multiple failed industry attempts to profit from search through paywalls or sponsored content, Google turned to monetizing user data. It already logged user searches to improve its operations, so repurposing that data for advertising was feasible. In the years that followed, tech giants built physical devices, such as smart fridges, Nest thermostats, and the short-lived Google Glass, fitted with sensors to collect still more data.

Rhetoric eased the transition from these tech giants’ existing sources of revenue to a business model of data accumulation. When Google began harnessing user data, it described the input as “data exhaust.” Zuboff argues that this phrasing was intentional: it creates the impression that behavioral data is costless and ordinarily useless, obscuring the intangible costs of relinquishing data and privacy. In an age where all user data is surveilled, a single missed period or search for misoprostol carries enormous weight.

In some ways, though, this framing of “exhaust” is accurate, at least from the perspective of advertisers. Surveillance capitalism is imbued with radical indifference; it views people as sites of extraction and data as nondescript raw material. Current systems of data accumulation and surveillance fit this framework neatly. Anti-abortion groups buying geofencing data to target advertisements is not exceptional: it is exactly what geofencing is meant to do. Geofencing creates a stream of location data that serves as a product to be bought and sold, no differently from any other good. From the perspective of a data broker, what is the difference between a product sold to Starbucks and one sold to an anti-abortion group? Both are viable paying customers.

With the framework of surveillance capitalism in mind, we can begin to understand why data accumulation is so vital to the business models of top tech platforms. Without expansive data collection, these companies would cease to exist. Substantive data protections elsewhere, like the EU’s 2016 General Data Protection Regulation, have resulted in massive losses for tech giants and smaller sites alike. As a result, these platforms have lobbied to ensure that similar regulations never take hold in the US. Google and Facebook have devoted millions of dollars to lobbying to protect expansive data collection; Facebook, in particular, has worked to defeat data privacy legislation in Washington, New Hampshire, Connecticut, Alaska, and Montana.

Solutions and surveillance futures

The current state of abortion surveillance is bleak. Every card transaction, internet search, or location ping could be used as evidence of an abortion, with a plethora of state and private actors willing to collaborate. The lens of surveillance capitalism helps us understand how these systems came to be and why individual measures are largely ineffective: until the system that lets tech platforms and data brokers profit from behavioral surplus is disrupted, these actors have little incentive to self-regulate. But framing the current abortion landscape in terms of surveillance capitalism also points us toward solutions. One key lesson from Zuboff’s book is that surveillance capitalism is not inevitable; it is not destiny, but the product of intentional choices.

Some legal experts have pointed to strong data privacy legislation as a remedy. Executive Order 14076, for instance, pushes to bring health apps under HIPAA-style guardrails on data sharing. While an important first step, several avenues for incrimination remain. Eva Galperin, director of cybersecurity at the Electronic Frontier Foundation, pointed to the need for end-to-end encryption in her interview with the HPR. In August 2022, text messages between a teenager and her mother on Facebook Messenger were obtained by police officers and used to implicate them in an illegal abortion. Because Messenger, like many other messaging apps, does not encrypt messages end-to-end by default, those private messages were accessible to law enforcement. Galperin sees the availability of end-to-end encrypted platforms as a necessity that must be defended against legislative attacks like the EARN IT Act, which critics argue would force backdoors onto end-to-end encrypted services like WhatsApp.
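To see what “end-to-end” actually guarantees, consider the minimal sketch below, written with the PyNaCl library. The names, keys, and message are hypothetical, and real apps like WhatsApp and Signal rely on the far more elaborate Signal protocol; the point is only that a relaying platform holding the ciphertext, but neither private key, cannot read the message.

```python
# Minimal end-to-end encryption sketch (pip install pynacl); illustrative only.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"a message meant only for Bob")

# A platform that relays `ciphertext` sees only random-looking bytes;
# only Bob, holding his private key, can decrypt it.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'a message meant only for Bob'
```

A subpoena served on the relay in this scheme yields ciphertext, not conversations, which is exactly why encryption by default matters more than any individual precaution.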

But ultimately, it is the system of surveillance capitalism that needs to change. From time immemorial, people have kept secrets, doing things they “don’t want anyone to know” about. It is only recently that interlocking networks of tech platforms, data brokers, and law enforcement agencies have become the arbiters of these secrets. Until economic models for technology stop relying on the accumulation of user data, we will continue to face threats to our foundational civil liberties.

Image by Ev Henke licensed under the Unsplash License.