These young activists are battling all-knowing tech

The teenagers of Encode Justice are sick of being preyed upon by supposedly helpful algorithms.

When California lawmakers passed a bill to end cash bail in 2018, the legislation seemed great on its surface. Activists have repeatedly pointed out problems with bail, namely that it preys on low-income people and communities of color. But the law, known as Senate Bill 10, was delayed from taking effect, thanks to bond industry lobbyists who successfully got a referendum on S.B.10 onto the 2020 ballot. That’s when San Jose high school student Sneha Revanur came across the debate over S.B.10 — and she realized the law would come at a great cost.

S.B.10’s sponsor, Democratic state Sen. Robert Hertzberg, described the bill as “ground zero in the fight over criminal justice reform.” But the bill had one key issue: If you get rid of cash bail but still detain people before trial, something has to replace bail in deciding who stays locked up. In California, politicians planned to use pretrial risk assessment tools, in which algorithms would determine who stayed in jail and who went free.

Supporters of S.B.10 touted the power of algorithms to solve racial bias, but the proposal left a sour taste in Revanur’s mouth. “I actually came across a previous investigation that uncovered staggering rates of racial bias in very similar algorithms,” she tells Mic. Perhaps most shockingly, the 2016 ProPublica investigation Revanur found concluded that Black defendants were wrongly flagged as likely to commit future crimes at nearly twice the rate of white defendants.

When Revanur learned S.B.10 was on the November 2020 ballot for a veto referendum, she decided it was time to organize. That summer, Revanur launched Encode Justice, a coalition of youth activists fighting to change how we understand and utilize AI. Together, she and a team of 15 other teenagers published op-eds, arranged phone banking, and hosted events to alert people to the dangers of the so-called “pretrial risk assessment tools” that would replace cash bail.

“The measure was defeated,” Revanur shares. California voters struck down S.B.10, letting cash bail stand for now. But for Encode Justice, the work was far from over. “At that point,” Revanur says, “I realized that this issue transcends the state of California. This issue transcends the issue of criminal justice. It’s really a full, multi-pronged issue that spans health care, criminal justice, voting rights, employment, and so much more — this issue of algorithms being used as tools of oppression.”

***

“Algorithms are being used as tools of oppression” is so central to Encode Justice’s ethos that the sentence is featured on its homepage. For organizers, though, the point isn’t to create a fear of algorithms as a whole. As Damilola Awofisayo, a high school senior from Virginia, points out, “The key phrase in that statement is that they’re being wielded as ‘tools.’ Algorithms can be used for good purposes, and evil, because they are tools.”

Nowadays, algorithms are inescapable. While most people think of algorithms in terms of their online impact, like deciding the content you see on social media, algorithms and their biases affect people offline, too. For example, last year, organizers warned that a new HUD rule would pave the way for algorithmic discrimination in housing. And if you’re trying to access health care, algorithms may play a key role, like deciding which programs you’re able to enroll in. In 2019, one study found that a popular health care algorithm prioritized healthier white patients over sicker Black ones for extra care.

However, getting people to understand algorithms as tools of oppression requires a massive shift in how technology is viewed. People are living under the umbrella of “technochauvinism,” Revanur explains, “in which we believe that technology provides a solution for every single problem.” Part of technochauvinism’s pull comes from the constant, false messaging that technology is inherently neutral.

People think that because algorithms are feats of engineering, “they can’t be biased, and they’re objective,” Alexandra Raphling, a high school senior from Santa Monica, California, tells Mic. But that’s far from the truth. “When we look at the demographics of software engineers,” Raphling notes, “we see that they’re primarily very privileged white and Asian men.”

“What we’ve been seeing recently is how they’re baking their own prejudice into these algorithms that are then being used and treated as these objective systems, and implemented in many different parts of our society,” Raphling says.

Algorithms dictate daily life in both overt and subtle ways. Yet even as young people are increasingly targeted by surveillance efforts, like Chicago’s gang database or student-monitoring tools like Securly Classroom, youth are often ignored in conversations about how algorithms automate and perpetuate oppression. Awofisayo herself can point to one clear moment in her life when technology caused her harm.

While attending a summer camp before her sophomore year, Awofisayo was misidentified by facial recognition software. An entirely different person’s picture was matched with her name, which kept Awofisayo from participating in the first week of programming. When it happened, Awofisayo says, “I didn’t really understand that it was something that is actually seriously wrong, and seriously wrong with this system. I kind of thought, ‘Oh, it would just happen to anybody.’”

These types of misidentifications don’t happen to just anybody, though. Multiple studies have found that popular facial recognition software is terrible at recognizing Black people — and the problem is worse if you’re both Black and a woman. For example, a 2019 study showed that Amazon’s facial recognition service, Rekognition, often classified dark-skinned women as men. Technology researcher Joy Buolamwini has shown that she literally had to wear a white mask for some software to detect her face.

Awofisayo’s experiences inspired her to join Encode Justice this summer, where she now serves as the organization’s director of external affairs. “People kind of accept [algorithms] as a way of life,” she tells Mic. “They’re not understanding that the more that these things are pervading our everyday lives and are becoming more powerful in our society, they could have a really detrimental effect.”

***

In just over a year, Encode Justice has exploded to include 250 members from over 35 states and 25 countries. Due to its size, the organization has two main structures: the central team, which includes its advocacy, education, and policy teams, and then localized organizing hubs. Throughout its growth, Encode Justice has remained true to its youth-led roots with an executive team composed of 12 high-schoolers and three college students.

Part of Encode Justice’s popularity comes from how unique it is. As Awofisayo explains, “There really isn’t any other student organization that is focused on algorithmic justice using the platforms that we know as youth. It’s not only our underlying mission but also the way that we’re approaching it, [with] a lens that takes the power youth have.” That approach has made Encode Justice a respected name among student organizations nationwide.

That doesn’t mean Encode Justice hasn’t run into some issues. According to Revanur, most of the people involved with Encode Justice now didn’t have much awareness of algorithmic injustice before joining the organization. That on its own isn’t a problem. However, Raksha Govind, a high school senior who runs Encode Justice’s Georgia chapter, tells Mic that she hears one common line when reaching out to new fellows: “‘I don’t know that much about computer science.’”

“People oftentimes are excluded from these conversations if they don’t have a Ph.D. or any sort of technical background,” Revanur explains. “People aren’t able to translate this into an issue of equity and civil rights.”

Clockwise from top left: Damilola Awofisayo, Alexandra Raphling, Raksha Govind, and Sneha Revanur.

To combat this problem, Encode Justice focuses on showing people how algorithms connect to broader issues of social justice. You don’t need a computer science degree to talk about something that affects you every day. In addition, Govind says, “There are so many interests that we incorporate into our organization that really reach out to people because they can find the niche that they really like.” For example, if you’re a writer, then you can contribute to the organization’s blog, which is edited by Raphling.

“As the most interconnected generation, we should be taking a more active stance on [algorithms], but we’ve been previously left out,” Revanur says. “It’s time ... to go more youth-involved, especially because we are going to be the next generation of regulators, activists, developers — people living with the impacts of these technologies.”

***

Encode is currently working on some new campaigns, including a project called Stand Up to Surveillance, which specifically addresses facial recognition technology. As with pretrial risk assessment tools, activists nationwide have long sounded the alarm about the risks posed by facial recognition. Encode Justice isn’t just fighting for facial recognition to be fixed; it wants an outright federal ban on government use of the technology.

“Even if we could achieve perfect algorithmic neutrality, [surveillance] would still be a flagrant violation of our civil liberties and of our rights ... to not be surveilled 100% of the time,” Raphling explains. A perfectly accurate facial recognition system would still be an incredibly sinister tool that could be used against oppressed communities.

“It’s really easy for the people in power to wield that power in unjust ways when they have the surveillance, and essentially, the tools they want to prevent us from voting, to prevent us from exercising our civil liberties in any way we can,” Raphling continues. “It’s just a breach of privacy in a way that’s not really reconcilable with our rights as humans and as Americans.”

As Revanur puts it, “[These algorithms] are dangerous when they’re accurate, and they’re dangerous when they’re not.”

While the Stand Up to Surveillance campaign has a clear focus on facial recognition, it’s also helping to educate people on just how common surveillance is. If you’re in the United States, you’re being surveilled — there are no ifs, ands, or buts about it. That surveillance takes many forms, but it’s happening. A 2019 study found that the U.S. has the most surveillance cameras per person in the world.

Encode Justice isn’t just about taking away technology. While algorithms can be used as tools of oppression, they can also be used to uplift communities. The young people getting involved with Encode are working to ensure that uplift becomes the new default. To Govind, the end goal is clear: “forming a generation who knows the boundaries and knows the limits of technology.”

“[It’s important to have] that mentality that [says], ‘Okay, this is not ethical. This is going to harm this community. It’s gonna oppress this group of people,’” Govind continues. “More than just talking about existing technologies, we’re able to implement this mindset of what’s right and what’s not in our generation. And that way when we grow up, when we build, it’s in the most just way possible.”