The Brooklyn Rail

APRIL 2022 Issue
ArtSeen

Stephanie Dinkins: On Love and Data

Detail view, Stephanie Dinkins, A                          Smiling (2021), three GAN computer-generated prints on aluminum, each: 20 x 20 inches, and three GAN computer-generated videos, durations variable. Courtesy the artist. Photo courtesy of Queens Museum, credit: Hai Zhang.
On View
Queens Museum
March 13 – August 14, 2022

The artist Stephanie Dinkins is focused on systems-based change, and she’s a self-described techno-optimist. In December 2020, she said of AI, one of her chosen media: “I see how it can be used against communities. I see how it can be used to track us, all the things. But at the same time, I think about what is possible with the technology: how people can use it now as a site of opportunity.” She then described how it could even become a democratic survey tool, taking in many ideas, analyzing them, and sketching exciting frameworks for action.

For Dinkins, one of the main problems with AI is the existing data it uses. In 2021, she worked with a text-to-image Generative Adversarial Network (GAN), a machine learning system that creates composite images of realistic-looking human faces from widely used but implicitly biased datasets. Included in her first survey exhibition at the Queens Museum, On Love and Data, which originated at the University of Michigan Stamps Gallery, Dinkins’s series “A                          Smiling” (2021) includes three videos and three prints that show what the GAN generated when prompted to simulate a Black woman’s face by entering language about Black or African-American women smiling and about dark skin. The results are possibly triggering. One image is barely recognizable as human. Instead, it resembles an early Carroll Dunham drawing, a leathery, tree-stump-like form with two mouths and no eyes. The other two look more like human faces but misassembled, blurred, and swollen.

Installation view, Stephanie Dinkins, Secret Garden (2021, site-specifically reconfigured alongside The Panorama of the City of New York, 2022), interactive video installation with three-channel video projection, 7-channel audio, including 6 channels of interactive sound, depth cameras, computers, duration variable. Technical engineer: Sidney San Martín. Courtesy the artist. Photo courtesy of Queens Museum, credit: Hai Zhang.


Facial recognition algorithms are more likely to misidentify the faces of Black women and other people with dark skin. Pundits and scholars argue that US law enforcement databases skew towards people of color while content on the web skews white, male, and Western. Color photography, too, was initially calibrated to read white skin tones. If we rely on the same law enforcement databases or on companies like Microsoft, Google, and Meta, our dataset is shaky at best. Dinkins is interested instead in the ability of small data to build counter-narratives. For example, what data does a single neighborhood hold? What is accessible and helpful in the archive of a family or a church? Thankfully, new datasets might not have to be enormous. They might only have to be better: more deliberate, descriptive, and compassionate. Dinkins’s exhibition includes a vinyl wall text from 2021 that asks, “What does AI need from you?” This show proposes that it needs new images and better information.

To that end, Dinkins’s app Binary Calculations Are Inadequate to Assess Us (2021), available online and installed at the Queens Museum on monitors and in voting booths, asks for donations of data honoring “lives, cultures, and values often under-considered or ignored in the technosphere.” The app asks for images, and detailed descriptions of images, that illustrate abundance, flourishing, and generosity. After you upload and describe your image, you can define the color in the image from a drop-down menu with an array of choices from marshmallow to olive, almond, amber, chestnut, hickory, or obsidian. You can date the image anywhere from 1700 to 2070 and place it at any event, as appropriate, from a treasure hunt to a rally. Finally, the app asks, “What would you need to make your life exactly as you imagine it to be?”

Dinkins is also the founder of her own movement, Afro-now-ism, whose manifesto is offered in a newsprint broadsheet takeaway at the museum:

Afro-now-ism is taking the leap and the risks to imagine and define oneself beyond systemic oppression … For black people in particular, it means conceiving yourself in the space of free and expansive thought and acting from a critically integrated space, allowing for more community-sustaining work.

The manifesto, optimistic, grounded, and real, carries the full title Afro-now-ism: The Unencumbered Black Mind is a Wellspring of Possibility. In an extended version published online in Noema Magazine in June 2020, Dinkins writes about COVID-19 and racial justice movements making inequities visible and what she sees as the reciprocal human responsibility “to reconceive the systems that threaten communities rendered simultaneously hyper-visible and invisible by their perceived difference.”

Installation view, Stephanie Dinkins, Afro-Now-Ism (2021), neon, 112 x 120 inches. Courtesy of Stamps Gallery, University of Michigan.


At the entrance to the exhibition, an oval lenticular print titled Afro-now-ism / Professor Commander Justice, 2021, is framed with a gold snake and shows a dignified woman wearing a curly, white wig. She seems to brace herself, her eyes seem to narrow, and the skin at the bridge of her nose crinkles while she leans forward and screams. The same professor is seen around the corner in the interactive video installation #Say It Aloud (2021). Here, it appears she is feeling better, as if her scream helped her blow off steam. Her body might be made up of flowers or honeycomb, and the wall label compares it to layers of okra slices and seeds. Across the gallery and behind a curtain, you can record a video response to the question, “What do you need to release to move forward?” Your answer will eventually float onscreen into the installation in a computer-generated gem-like cosmic pod. All the while, Professor Commander Justice encourages you to ground yourself and liberate your mind from contemptuous thinking.

In interviews, Dinkins describes her formative experiences growing up in Staten Island, where her grandmother kept a garden as an intentional practice in community-building. A three-channel video installation, Secret Garden, 2021–22, is installed in The Panorama of the City of New York, a permanent 1:1200 scale model housed at the museum since the 1964–65 World’s Fair. Facing and looking over the highly detailed miniature city, videos of larger-than-life Black women of various ages are shown with fields of flowers and trees behind them. They seem to be on the lookout for something—fellowship or freedom, perhaps. They appear at once like travelers, guardians, witnesses, and beacons. Sound pieces emanate overlapping voices, forming a multigenerational narrative of oral histories past, present, and AI-generated, speaking phrases like: “imagine the world,” “center and sustain ourselves,” and “fully empowered communities.”

Existing AI exacerbates human bias and exclusion. Suppose more diverse groups of people help to generate AI. Suppose the datasets become likewise more diverse. We could then perhaps trust machine learning to be part of culturally inclusive and equitable systems we can use in good conscience. In Dinkins’s vision, AI is not a nightmare. It’s something we have to engage with, care for, and shape.

Contributor

Marcus Civin

Marcus Civin is a performance artist, writer, and Assistant Dean in the School of Art at Pratt Institute. He has written for Afterimage, Artforum, Art Papers, Maake Magazine, Southwest Contemporary, and Momus, among other publications.
