The Brooklyn Rail

SEPT 2023

Critics Page

Passive Acceptance and Our (Dis)Consent

Screenshot of <em>dataDouble</em> browser extension installed on Google Chrome, alongside a resultant portrait of the artist. Courtesy the artist.

Back in 2020, I started saving all the emails I received announcing changes to the terms and conditions of various platforms on which I had an account. I have since amassed emails from the usual suspects—Instagram, Google, YouTube, PayPal, Dropbox—along with notices from Indiegogo, Patreon, Zocdoc, Coursera, and more. They continue to arrive at a steady pace, enough that it’s become noticeable how often these agreements are changed as systems continue to evolve.

Like many other artists working with technology, I have been intrigued by terms of service for a while: their volume, their incomprehensibility to the average user, how most of us recoil from spending time with them. But it wasn’t until I began collecting these notices in bulk that I started to recognize another thematic consistency. The emails all strike a reassuring tone, emphasizing that no user action is needed, and letting us know that things won’t noticeably change. They offer summaries of the adjustments made, but rather than encouraging us to find out more, they almost dissuade us from going further, promising that doing nothing will still allow us to use the services. There is a huge gulf between the bare minimum “need-to-know” that the emails provide, and the fine print that is often too overwhelming to read closely.

<em>dataDouble</em> participant portrait. Courtesy the artist.

In much of my recent artwork, I have been consumed with the speed and seamlessness that we expect from digital systems. Sociologist Susan Leigh Star wrote in 1999 that the “normally invisible quality of working infrastructure becomes visible when it breaks; the server is down, the bridge washes out, there is a power blackout.”1 In today’s era of “always on, always connected” devices and networks, the idea of a “breakdown” has expanded to include even momentary delays in getting what we want. We have somehow developed sky-high expectations of technology, not least of which is that it should do what we want immediately. Gone are the days of waiting whole minutes for webpages to load over a dial-up connection; our interactions with digital systems have become increasingly fleeting, designed for repetitive, instantaneous feedback loops.

The desire for instant gratification bleeds into our relationships to terms of service. We are in such a hurry to be on these platforms that any delay—any time taken to really think about the true costs and implications of these services—hacks away at our already fraying patience. In all likelihood, we don’t actually know what, exactly, we’re signing up for when we use the internet. We do know, in broad strokes, that technology companies build their wealth on selling user data to advertisers; as the saying goes, “if you’re not paying for it, you’re the product.” While some might find that a bit reductionist, we cannot deny that we have lost control over what we are giving up when we use digital platforms. Our passive acceptance of these terms and conditions paves the way for constant surveillance, and for the warping and shaping of our online behaviors and identities to match what is deemed profitable and usable to others.

<em>dataDouble</em> participant portrait. Courtesy the artist.

And yet we shrug our shoulders, and continue to click “accept” without so much as glancing over the thing we are accepting. Taking the time to really, truly comprehend the vast amounts of data we are leaving behind with every click, scroll, and like—along with the permissions we are giving Big Tech for how to use them, and the ways we are flattened and compartmentalized as a result—seems like an impossibility. It’s too abstract to understand immediately. Here we see another feedback loop: our resignation to the power differential embedded within the agreement leads to its perpetuation, which only leads to more resignation.

In 2018, I began a project called dataDouble, which draws inspiration from Kevin D. Haggerty and Richard V. Ericson’s concept of the same name. They wrote in 2000 that data doubles, or likenesses of individuals created from their accumulated data, work as “a form of pragmatics: differentiated according to how useful they are in allowing institutions to make discriminations among populations.”2 The data double, in other words, replicates you, but not exactly. It uses all the data you give up when you click “accept” on those terms of service to construct a version of you that is warped and reshaped to fit institutional goals that may not always have your best interests at heart.

<em>dataDouble</em> participant portrait. Courtesy the artist.

It’s a big concept to grasp on its own; my project is an attempt to bring the idea closer to home. The core of dataDouble is a browser extension available for Chrome or Firefox, which asks you to upload a photograph of yourself upon install and then works in the background tracking your browsing behavior locally (none of the data is ever sent to a server or collected by me in any way). After fourteen days, the extension returns your portrait to you, but it has been altered and flattened using image manipulation techniques that correspond to your browsing—for example, high social media use results in more defined and cartoonish edges, and visits to financial sites reduce the number of hues used. The portrait is clearly still an image of you, but lacks the depth of true human experience—the ineffable qualities that form the basis of personal identity.
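The kind of mapping described above—browsing activity translated into image-filter parameters—might be sketched roughly as follows. This is a hypothetical illustration only, not the extension’s actual code; the category names, thresholds, and parameter ranges are all invented for the example.

```javascript
// Hypothetical sketch: translate per-category visit counts into
// image-manipulation parameters, loosely following the essay's
// examples (heavy social media use -> harder, more cartoonish
// edges; financial site visits -> fewer hues in the palette).
function portraitFilters(visitCounts) {
  const social = visitCounts.social || 0;
  const financial = visitCounts.financial || 0;
  return {
    // More social media visits -> stronger edge definition (0 to 1)
    edgeStrength: Math.min(1, social / 100),
    // More financial site visits -> fewer distinct hues, floor of 4
    paletteSize: Math.max(4, 256 - financial * 8),
  };
}

// Example: a fortnight of heavy scrolling and some online banking
const filters = portraitFilters({ social: 50, financial: 10 });
```

In a real extension, counts like these would be accumulated locally (for instance in browser storage) and fed into an image-processing step at the end of the fourteen-day window.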

dataDouble works to translate something we know is happening in theory into a form where we can see its effects, quite literally, on our faces. The project offers a moment of reflection on the impacts of our increasingly online lives, and a way of understanding what might happen over time as we click “accept” mindlessly while signing up for yet another platform or service. The speed, seamlessness, and convenience that we crave from technology might make our lives easier in the immediate, but as we hurry to agree to conditions we can’t see, we might be giving up more than we think in the long run.

  1. Star, S.L. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377-391. DOI: 10.1177/00027649921955326.
  2. Haggerty, K.D. & Ericson, R.V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605-622. DOI: 10.1080/00071310020015280.

Contributor

Roopa Vasudevan

Roopa Vasudevan is a new media artist and scholar making work about things we take for granted in our everyday interactions with technology. She is an Assistant Professor in the Department of Art at the University of Massachusetts, Amherst.

