Boogaloo Bias by Jennifer Gradecki and Derek Curry
The interactive installation reverse-engineers facial recognition technologies to question their unregulated use by law enforcement agencies
Have you ever wondered how facial recognition technology succeeds so well at identifying suspects in criminal cases? The answer: It doesn’t.
Recent research at the intersection of artificial intelligence (AI) and surveillance has revealed problems with the use of facial recognition technology for law enforcement purposes. Exposing these problems—and their implications for criminal justice—is the aim of Boogaloo Bias, an interactive artistic installation currently on display at Science Gallery Atlanta’s 2023 JUSTICE exhibition.
Exhibit creators Jennifer Gradecki and Derek Curry, both assistant professors of art and design at Northeastern University in Boston, originally developed Boogaloo Bias for Science Gallery Detroit’s 2021 TRACKED AND TRACED exhibition. When they later applied to JUSTICE, Science Gallery Atlanta curator Floyd Hall, who had seen their work in Detroit, brought them into the Atlanta exhibition.
Boogaloo Bias gives viewers a firsthand look at what it’s like to be under technological surveillance and potentially identified as a criminal suspect. “We wanted to highlight some of the known problems with law enforcement agencies’ use of these technologies,” says Curry.
One of these problems is the use of celebrity images as stand-ins for photos of the actual suspect. A study by Georgetown Law’s Center on Privacy & Technology found that when police don’t have a high-quality image of a suspect on which to base their search, they sometimes feed their facial recognition program an image of a celebrity they think resembles the suspect to see what leads it generates.
Sometimes called “brute forcing,” a play on the cryptography term for trying every possible password combination until a match is found, this practice is not only absurd when applied to facial recognition, it’s alarming, says Gradecki. “Police can use this in their daily practice without understanding the technology, how it works, how it fails, and they can use it in these really bizarre ways with no oversight,” she says.
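In cryptography, brute forcing means exhaustively trying candidates until one is accepted. A minimal Python sketch of that idea, with a hypothetical password check standing in for the real target:

```python
from itertools import product
import string

def brute_force(check, alphabet=string.ascii_lowercase, max_len=4):
    """Exhaustively try every string up to max_len until check() accepts one."""
    for length in range(1, max_len + 1):
        for chars in product(alphabet, repeat=length):
            guess = "".join(chars)
            if check(guess):
                return guess
    return None  # search space exhausted without a match

# Hypothetical oracle standing in for a real password check.
print(brute_force(lambda pw: pw == "cab"))  # -> cab
```

The law enforcement version swaps the password check for a face-similarity score: keep feeding the system substitute images until something, anything, comes back as a lead.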
Another problem is the use of forensic sketches. Police will sometimes feed hand-drawn forensic sketches into their facial recognition programs when high-quality images of suspects aren’t available. But the accuracy of sketches can vary widely, depending on the witness and other factors. Says Gradecki: “People’s memories can be highly problematic when it comes to forensic sketch creation.”
Walk into Gradecki and Curry’s installation, and you’ll quickly grasp on a visceral level how this technology operates—and how easy it is for mistakes to occur. As you enter the gallery, you’ll see yourself on a video screen with a box around your face: the computer is searching for criminal matches. Who will it find when it compares your image with its database? Boogaloo Bias reverse-engineers current practices around facial recognition technology to give viewers this hands-on experience of being assessed and potentially matched to a criminal suspect. Using a live feed of the Science Gallery Atlanta exhibition, the program “brute forces” the generation of leads from visitors.
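Under the hood, systems like this typically follow a detect–encode–compare pipeline. The sketch below is not the artists’ code; it assumes the open-source face_recognition and OpenCV libraries and hypothetical reference images, but it shows the basic mechanics of boxing a face in a video frame and scoring it against a gallery:

```python
import cv2
import face_recognition

# Hypothetical reference gallery of "celebrity doppelganger" images.
gallery_paths = ["doppelganger1.jpg", "doppelganger2.jpg"]
known_encodings = [
    face_recognition.face_encodings(face_recognition.load_image_file(p))[0]
    for p in gallery_paths  # assumes exactly one face per reference image
]

video = cv2.VideoCapture(0)  # live camera feed, never written to disk
ok, frame = video.read()
rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

# Box each detected face and score it against the gallery.
for (top, right, bottom, left) in face_recognition.face_locations(rgb):
    encoding = face_recognition.face_encodings(rgb, [(top, right, bottom, left)])[0]
    distances = face_recognition.face_distance(known_encodings, encoding)
    best = distances.min()
    # Smaller distance = closer match; 0.6 is the library's default cutoff.
    label = f"match {1 - best:.2f}" if best < 0.6 else "no match"
    cv2.rectangle(frame, (left, top), (right, bottom), (0, 0, 255), 2)
    cv2.putText(frame, label, (left, top - 8),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 1)

video.release()
```

The numbers that flash on screen in the installation correspond to this kind of similarity score, and where the cutoff is set determines who gets flagged as a match.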
To drive home the absurdity of these practices, Gradecki and Curry have used the faces of movie characters from the 1984 film Breakin’ 2: Electric Boogaloo as celebrity doppelgängers to help the computer identify and find members of the real-life anti-law enforcement militia group the Boogaloo Bois among exhibition attendees. (The Boogaloo Bois emerged from 4chan meme culture and have participated in US protests on both the political left and the political right since January 2020.) As the computer processes the live feed from the Science Gallery Atlanta cameras, a feed that is neither saved nor recorded, numbers flash up on the video screen, indicating the confidence of each potential match between an exhibitgoer and a Boogaloo Bois militia member, as represented by their Breakin’ 2 counterpart. Surprised to discover that the technology thinks you’re part of an anti-law enforcement group you’ve never heard of? Gradecki and Curry hope the experience sticks with you.
Law enforcement officials may be as surprised as you are to discover these problems. Curry says that police do not always receive accurate information about this new technology. The companies that sell AI software to law enforcement agencies tend to make grand claims about its efficacy, touting facial recognition accuracy of up to 99 percent. “But they do those ratings in ideal conditions and lighting; not in real life,” says Curry. Law enforcement agencies, drowning in a sea of open-source data, are sold programs that obscure their shortcomings and minimize the inaccuracies that can quickly creep in. “Little things can throw the program off in a big way,” Gradecki says.
Ultimately, both artists hope exhibitgoers come away from Boogaloo Bias asking questions. What counts as an accurate match when identifying suspects? When does AI create more problems than it solves? “There are social as well as technical dimensions to this work,” says Curry. “How do you say that a face is a match or not? How do you navigate automation in decision-making?”
Questions of justice are also involved. Currently, there are no restrictions on how law enforcement agencies use facial recognition technology—no public oversight and no federal regulations. Curry and Gradecki are committed to bringing this research into the public sphere so that others can learn about what’s going on.
“We’re academics,” says Curry, “but the goal is to take our findings and turn them into something accessible through art. We want to make this information available to ordinary people. Because everyone has the right to know what these technologies promise and what they do—and don’t—deliver.”