Art and Technology
April 2025

Processes of Memory in the Age of Information

img1

Joseph Kosuth, One and Three Mirrors [Eng.], 1965. Two mounted photographs and mirror [mirror: 48 x 48 inches; mounted photograph (mirror): 53 9/16 x 54 5/16 inches; mounted photograph (definition): 31 1/2 x 39 3/8 inches] overall: 57 1/2 x 157 1/2 inches. © Joseph Kosuth. Courtesy the artist and Sean Kelly, New York/Los Angeles. Photo: Jason Wyche.

“We cannot conceive of a more impartial and truthful witness than the sun, as its light stamps and seals the similitude of the wound on the photograph put before the jury; it would be more accurate than the memory of witnesses, and as the object of all is to show truth, why should not this dumb witness show it?”
Franklin v. The State of Georgia, 69 Ga. 36 (1882); quoted in Susan Schuppli, Can the Sun Lie? (2014–15).

Joseph Kosuth’s Self Defined Subject (1966) is currently on view as part of the artist’s eightieth birthday retrospective exhibition, Future Memory at Sean Kelly, showcasing works from every decade of Kosuth’s career. Whatever the subject, meaning in the conceptual artist’s work is never easily found. One and Three Mirrors (1965) plays with iconic, indexical, and symbolic constructions of a mirror through the placement of a mirror, a black-and-white photograph of the mirror where it is placed, and a print of a dictionary definition. With the mirror showing our selves, what we mean partly reflects our own condition, inverted as it may be. Of course, we all know that, even though we don’t need to recall or allude to it each time; similarly with the less-than-perfect truth of sun-touched photographs. Remembering it may be more important now as we relinquish responsibility for memory to machines.

img2

Screenshot of Windows diagnostic pop-up.

I was somewhat appalled to find this as the first sentence in the Wikipedia entry for “Memory”: “Memory is the faculty of the mind by which data or information is encoded, stored, and retrieved when needed” (my italics). Politically right now, globally and locally, the concepts of history, memory, and archives are very much at stake. They are torn between divergent beliefs of how they operate and what they can or should do. Metaphors may be “just” figures of speech, but their shorthand “creeps in this petty pace” and from day to day dulls the signifying. Memory’s call and response with technology redefines its subject and object.

img3

Sean Fader, Insufficient Memory, 2020. Screenshot of Google Earth Interactive Tour.

Sean Fader’s project, Insufficient Memory (2019–20), memorializes sites where queer people were murdered in 1999 and 2000—just after Matthew Shepard’s brutal murder and President Bill Clinton’s call for legislation regarding hate crimes committed against queer people. Fader purposefully used the Sony Digital Mavica of that time period. It took ten seconds to shoot one JPG, whirring as it processed to the disk. The camera provided an hour and a half of screen use per battery charge, so in theory one could take some five hundred images; in practice, the floppy disk could only store up to 1.4MB, around forty low-grain JPGs (likely only twenty for a professional photographer), before the files had to be cleared from the disk to a computer. Fader’s pixelated prints are granular testimonies of what could be seen back then, as well as our view of the past. There are material stakes still present in our ways of seeing that Fader’s project embraces.
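The disk arithmetic behind those counts is easy to check. A minimal sketch, assuming a 1.44MB floppy and average JPG sizes of roughly 35KB (low grain) and 70KB (the larger files a professional might favor) — both sizes are illustrative assumptions, not Sony specifications:

```python
# Back-of-the-envelope check of the Sony Digital Mavica's storage limits.
# The per-image sizes are assumptions chosen to illustrate the essay's
# counts, not figures from Sony's documentation.
DISK_KB = 1440  # a 3.5-inch high-density floppy holds about 1.44 MB


def images_per_disk(avg_jpg_kb: int) -> int:
    """How many JPGs of a given average size fit on one floppy."""
    return DISK_KB // avg_jpg_kb


print(images_per_disk(35))  # low-grain JPGs: about forty per disk
print(images_per_disk(70))  # larger files: about twenty per disk
```

Doubling the average file size halves the count, which is why a professional shooting at higher quality would fill a disk after roughly twenty frames.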

In How We Became Posthuman (1999), N. Katherine Hayles addressed how “the seeming incorporeality of information … is the product of ideological forces and institutional practices that obscure the social and material bases that circulate and produce information.” When we are also treated as information—something which has prevailed since the nineteenth century but accelerated under data, platform, or surveillance capitalism—our bodies and histories become incorporeal and so do not matter. Except they do, as we see through Fader’s viewfinder.

img4

Sean Fader, Left: Documentation of Sony Digital Mavica, Right: Detail of image from Sony Digital Mavica, 2020.

Humans went from clay and dirt infused with spirit to bodies with distinct minds. In the third century BCE, the development of hydraulic engineering introduced the flow of humors governing human feeling. The 1500s offered an interest in automata, and the mechanical clocks of the 1600s made us wind-up toys with springs and gears: “a machine that winds its own springs—the living image of perpetual motion … man is an assemblage of springs that are activated reciprocally by one another,” wrote Julien Offray de La Mettrie in L’Homme Machine (1747). What a homunculus!

The investigations of the 1700s into electromagnetism made minds vibrate with radiant possibility. Then came trains and steam engines. The 1800s introduced the telegraph with the camera swiftly following, both adopted by Henri Bergson in his work on memory (1896). Metaphors emerge: emotional charge, train of thought, human wiring, circuits of analysis, mental impressions.

And then, in the 1940s, the one we fancy now: the computer.

The tech industry has terms like memory bank, memory cycle, memory controller…. There are memory types: random access, read-only, flash, registered, virtual, non-volatile, and more. Motherboard itself implies a generational relationship; the printed circuit board holds and connects system elements like the central processing unit (CPU), memory (there it is again!), and auxiliary parts, like… daughterboards (used to describe sound or video cards, and other custom items). That gendering genealogy is worth remembering, too.

From clay to computer, these metaphors serve something, but they aren’t solutions to whatever question we struggle to form. Many scientists and philosophers dispute the computer analogy, and it’s worth attending to the havoc it wreaks.

Artistic use of digital tools—those frequently hailed for their storage, archiving, and posterity purposes—can cultivate, critique, and complicate the fixity of their marketing narratives.

img5

Amelia Winger-Bearskin, 2024. Photo the artist took of the chant book at the Glencairn Museum, shown to her by the museum’s director, Brian Henderson. This is the specific chant from the book on which her artwork Glimpse is based.

Amelia Winger-Bearskin was at the Glencairn Museum enjoying the medieval illuminated notation of chants when she encountered the mere fragment that led to her work Glimpse (2024). The notation guides are suggestions based on a practice the participants would have discussed, expressive of choices and instruments, which also leaves them vague and uncertain for future readers. She had to translate the fragment available, but any correct interpretation would require knowing the year it was written, because ideas and language surrounding notions of God were particular to different moments; even dates are uncertain, since chant books gather through constant additions. Her discomfort expressing a liturgical message, and her recognition of the issues with translation, granted her the liberty to develop something inspired by, but not beholden to, the fragment. She produced an “ad hoc weird” text with the input of the museum curator and ChatGPT to fill out the fragment into a complete chant to be performed. The use of generative AI, in this instance, admirably adopts its very wrongness.

Distributed storage of archives was an early impulse for creating the internet, and that decentralized repository produces what we enjoy about it. If one server goes down, another serves us the information we seek. Information can be deleted, as those of us in the United States have recently observed, but it can also be redistributed for good or ill: resharing social media posts that damage people’s lives, or reinstating research deleted from websites.

Internet content played a large role in the data sets associated with “AI”; blockchain proclaimed permanent storage. We bemoan the loss of documents, contacts, images, files of all sorts when we can’t retrieve them from our various devices; we also bemoan our own digital hoarding. What is being held and transmitted?

img6

Amelia Winger-Bearskin, 2024, Glimpse promotional image, used to promote the sound work commissioned by the Glencairn Museum for its annual Sacred Arts Festival.

The generated content increasingly prevalent on websites makes much information we find unworthy of reading, let alone memorializing (here, a bonkers text about an Emily Dickinson poem by “Editor 2” on the Elite Skills website; here, something about Edward Hopper’s Nighthawks (1942) “written by human,” adding “no AI,” though I leave you to decide). Digital literacy is crucial and digital natives often heartbreakingly inured.

Cybernetics created the world in which we live. It’s not just that it created the computer on your desk, the “phone” that is so much more in your pocket, the heat-sensing drones, the satellites in space, the designs for traffic flows and food distribution, or management practices; it also reimagined memory and relationship. Norbert Wiener coined the term by adopting the Greek word for steersman to indicate the way the field would manage information, through control and communication.

We might not refer to cybernetics anymore, but we surely live in a cybernetic world. Despite internal disputes, the work done at the Macy conferences (1946–53) reimagined psychology with mental processes as information gates, enabling the new field of cognitive science. The participants established metaphors that govern much of how we describe ourselves, with the brain processing information, retrieving knowledge, and storing memories, enabling their comparison to computers (many of the workers doing this computation were, after all, women, whom machines could replace). The mind is not a computer.

In Beautiful Data: A History of Reason and Vision since 1945 (2015), Orit Halpern describes how mathematician Norbert Wiener believed that “human behavior could be mathematically modeled and predicted, particularly under stress.” This partly stems from Claude Shannon’s insight in “A Mathematical Theory of Communication” (1948): “If one tries to overcrowd the capacity of the audience, it is probably true, by direct analogy, that you do not, so to speak, fill the audience up and then waste only the remainder by spilling. More likely, and again by direct analogy, if you overcrowd the capacity of the audience you force a general error and confusion,” which is easier to predict.

Alarm, resistance, and exhaustion are the three stages that occur in every disease or stress of life, explained Marshall McLuhan in Understanding Media (1964).

The proliferation of information, connections, and opportunities in this machine age has by many, if not most, accounts made us oversaturated, overstimulated, overwhelmed. Ernest Becker wrote in the preface to The Denial of Death:

The man of knowledge in our time is bowed down under a burden he never imagined he would ever have: the overproduction of truth that cannot be consumed. For centuries man lived in the belief that truth was slim and elusive and that once he found it the troubles of mankind would be over. And here we are in the closing decades of the 20th century, choking on truth.

That was in 1973, over fifty years ago.

img7

Sean Fader, 2020, Allison Decatrel Inverness, Florida, Murdered, October 31, 1999, archival inkjet image.

The oft-cited “muzzle velocity” strategy of Steve Bannon aims to “flood the zone.” Many then discuss this as “information overload,” but back in 2009, Clay Shirky, a New York University media professor, proposed that the actual problem was “filter failure”: the internet devastated publishing economics, which had up-front costs but returns upon distribution, because editors discerned for readers which content deserved attention. Online dissemination enabled the discovery of niche (formerly cost-prohibitive) material, but also eliminated any mitigating costs to posting. Anything, anywhere, all the time decimated the former guardrails around information’s value as meaningful. Gatekeeping, as always, has two sides.

Ironically, the January 2025 Sixth Circuit Court of Appeals ruling that eliminated net neutrality (remember that?) means some services and sites may get more oomph than others. It’s pay-to-play time. Muzzle velocity floods, but it also permits one information stream to surge past another. Put another way: Will one billionaire be able to fund swift flows and delay the current of another? Whose news will snooze? Muzzle becomes the operative word.

Mark Leckey described, in a recent New Social Environment conversation, a walk in the park near his home shortly after COVID confinement had been lifted; the familiar stroll was transformed. He recorded his reaction: “Everything just fills me up and it’s too much.” Layered into the video Carry Me Into the Wilderness (2022), the ecstasy is terrifying, and his medieval imagery reminded me that the transportations of saints must have been awful as well as awesome. Leckey’s repetition of a social media clip in which a guy hurls himself into a glass bus-stop shelter, inserted across To The Old World (Thank You For the Use of Your Body) (2021) (seen at 1:47–2:27, or 7:12 to the end here), alongside the bystander’s “Oh my god!”, offers a strange adjacency to that earlier period’s ascetic denial of the body in hope of divine encounter. Whether information overload or filter failure, a desperate urge for some kind of clarity or enlightenment surges until exhaustion shifts to numbness. That’s no loss of sensation but a pent-up feeling that, in due course, cracks open. Recent social media videos of people screaming into pillows—or mimicking it silently—provide a catharsis for such suppressed sentiments. Leckey spoke about how singing in unison helps. We gather and breathe.

img8

Mark Leckey, Carry Me Into The Wilderness, at Sant’Andrea de Scaphis, Rome, 2022. Courtesy: the artist; Sant’Andrea de Scaphis, Rome. Photo: Daniele Molajoli.

Marshall McLuhan and Quentin Fiore contemplate disorder by comparing the medieval to their own moment in The Medium is the Massage (1967): “These are difficult times because we are witnessing a clash of cataclysmic proportions between two great technologies. We approach the new with the psychological conditioning and sensory responses to the old. This clash naturally occurs in transitional periods.” In this transitional period, one so much like the tumult of the Early Modern era that artists keep revisiting it (I find myself peeking into its histories to see what I might glimpse), I wonder if the age of information, so obsessed with control and communication, might have lost the pertinence of fragility. Though memory studies seek to determine what memory is (in part to mitigate its loss), perhaps we forgot something more profound along the way.

Just as the mind is not a computer, memory is so much more than information. By all accounts, memory includes forgetting, as well as the ability to recall. There is a danger in our frailty, the ease with which history gets reshaped or erased. At the same time, it’s in the certainty of this uncertainty that how and what we piece together matters. Our attitudes towards memory have been shaped by these technology metaphors, but that socialization may have lost something we need to re-member. Perhaps it’s not only the pieces we call bits of information that are part of the puzzle. Our technologies store information so we don’t have to, but if we no longer retain dozens of birthdays and phone numbers because devices provide them for us, what are we choosing, or neglecting, to do instead with this aptitude we call memory? Therein may be a subject ill-defined by the black mirror of our time. It might be the human question for this machine era.

Acknowledgements: My thanks to Tom Roush, M.D. for several thoughtful conversations and the College Art Association for the opportunity to engage these issues on a panel with Nathaniel Stern, Sean Fader, and Amelia Winger-Bearskin during the 113th conference in New York City on February 12th.
