
Tech Lives Matter - Diversity in the Technology Industry

by Michael Szul | tags: technology, artificial intelligence, logic magazine, diversity, labor

Note: This post/concept first appeared on YouTube as a part of our Digital Shots series. We revisited the concept and expanded the dialog below, but it is still mostly a written stream of consciousness, so forgive the organization and grammar.

Ben Tarnoff has a nice essay, or pamphlet (the non-fiction equivalent of a novella), about the "making of the tech worker movement." As Black Lives Matter gathers more media attention and acquires the collective mind-share of individuals in different circles, I think it's time we turn our attention—as an industry—to how technology has also played a part in the oppression of minorities and in stunting the growth of those from diverse backgrounds (and divergent cognitive processes). Minorities are extremely underrepresented in many different areas of technology, and the literature is rife with discussions of algorithmic bias and how machine learning disproportionately affects minorities and people of color. As an industry, we need to focus on how the software we build on a daily basis is used in society and what negative impact it could have.

For a significant amount of time, being a software engineer—being a technology worker—has given us the impression that we are not "labor." That we are not the working class. We believe we are white collar. We believe we are a special segment of the industry. Software engineers and technologists often carry this assumption of "specialness"—that we are indeed somehow special and disconnected from the plight of our fellow citizens. Even more egregious is the belief that we are somehow disconnected from responsibility for the software and the applications that we build.

It goes without saying that it's great to see some technology workers starting to unite. For example, they're starting to understand that working on algorithms for facial recognition software might not be a good idea, or that working on systems that are then sold to governments for war is not really the best use of their talent. It's great to see the moral compass evolve when we talk about how technology affects the environment, because if the government will not hold big technology companies accountable, then we, the workers, have to. We've already seen some employees quit Facebook over its handling of misinformation and hate speech. We've seen how workers have affected business decisions at Amazon and Microsoft. At Google, workers managed to disrupt the company's cooperation and investment in Project Maven and began to independently organize to influence other decisions:

Organized workers at [Google] were able to get executives to drop Project Maven, the company's artificial-intelligence program that the Pentagon contracted for, and Project Dragonfly, a strategy to launch a censored search engine in China.

Google protests from the Hindustan Times

Largely, these protests are less about diversity and more about the outcomes of technology. Project Maven is billed as:

a way of maintaining "advantages over increasingly capable adversaries," the project is formally known as the Algorithmic Warfare Cross-Functional Team. Its objective is "to turn the enormous volume of data available to DoD into actionable intelligence and insights at speed" for human analysts.

Maven was a rallying cry for many Google workers, according to Tarnoff:

[…] [T]here was another important reason that Google became a hotspot for organizing. Its employees tended to have a more utopian outlook, aptly summarized by its former slogan, "Don't be evil." Employees expected executives to put ethical considerations above profit-making. When this expectation was not fulfilled, it created a sense of betrayal that fed a process of radicalization.
