Benjamin 2019

From Whiki
Latest revision as of 16:34, 2 January 2020

Benjamin, Ruha. ''Race After Technology: Abolitionist Tools for the New Jim Code.'' Polity, 2019.

“the New Jim Code”: the employment of new technologies that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era.

as this book shows, colorblindness is no longer even a prerequisite for the New Jim Code. In some cases, technology “sees” racial difference, and this range of vision can involve seemingly positive affirmations or celebrations of presumed cultural differences. And yet we are told that how tech sees “difference” is a more objective reflection of reality than if a mere human produced the same results. Even with the plethora of visibly diverse imagery engendered and circulated through technical advances, particularly social media, bias enters through the backdoor of design optimization in which the humans who create the algorithms are hidden from view

The animating force of the New Jim Code is that tech designers encode judgments into technical systems but claim that the racist results of their designs are entirely exterior to the encoding process. Racism thus becomes doubled – magnified and buried under layers of digital denial.

If private companies are creating public policies by other means, then I think we should stop calling ourselves “users.” Users get used. We are more like unwitting constituents who, by clicking submit, have authorized tech giants to represent our interests.

In examining how different forms of coded inequity take shape, this text presents a case for understanding race itself as a kind of tool – one designed to stratify and sanctify social injustice as part of the architecture of everyday life.

Who gets muted in this brave new world? The view that “technology is a neutral tool” ignores how race also functions like a tool, structuring whose literal voice gets embodied in AI. In celebrating diversity, tokenistic approaches to tech development fail to acknowledge how the White aesthetic colors AI. The “blandness” of Whiteness that some of my students brought up when discussing their names is treated by programmers as normal, universal, and appealing. The invisible power of Whiteness means that even a Black computer scientist running his own company who earnestly wants to encode a different voice into his app is still hemmed in by the desire of many people for White-sounding voices.

This is an industry with access to data and capital that exceeds that of sovereign nations, throwing even that sovereignty into question when such technologies draw upon the science of persuasion to track, addict, and manipulate the public. We are talking about a redefinition of human identity, autonomy, core constitutional rights, and democratic principles more broadly.

Many tech enthusiasts wax poetic about a posthuman world and, indeed, the expansion of big data analytics, predictive algorithms, and AI animates digital dreams of living beyond the human mind and body – even beyond human bias and racism. But posthumanist visions assume that we have all had a chance to be human.

Taken together, all these features of the current era animate the New Jim Code. While more institutions and people are outspoken against blatant racism, discriminatory practices are becoming more deeply embedded within the sociotechnical infrastructure of everyday life. Likewise, the visibility of successful non-White individuals in almost every social arena can obscure the reality of the systemic bias that still affects many people. Finally, the proliferation of ever more sophisticated ways to use ethnicity in marketing goods, services, and even political messages generates more buy-in from those of us who may not want to “build” an ethnicity but who are part of New Jim Code architecture nevertheless.

this book employs a conceptual toolkit that synthesizes scholarship from STS and critical race studies. Surprisingly, these two fields of study are not often put into direct conversation. STS scholarship opens wide the “Black box” that typically conceals the inner workings of socio-technical systems, and critical race studies interrogates the inner workings of sociolegal systems. Using this hybrid approach, we observe not only that any given social order is impacted by technological development, as determinists would argue, but that social norms, ideologies, and practices are a constitutive part of technical design

Race as technology: this is an invitation to consider racism in relation to other forms of domination as not just an ideology or history, but as a set of technologies that generate patterns of social relations, and these become Black-boxed as natural, inevitable, automatic.

It is not the facts that elude us, but a fierce commitment to justice that would make us distribute resources so that all students have access to a good educational environment. Demanding more data on subjects that we already know much about is, in my estimation, a perversion of knowledge. The datafication of injustice ... in which the hunt for more and more data is a barrier to acting on what we already know.

While inclusion and accuracy are worthy goals in the abstract, given the encoding of longstanding racism in discriminatory design, what does it mean to be included, and hence more accurately identifiable, in an unjust set of social relations? Innocence and criminality are not objective states of being that can be detected by an algorithm but are created through the interaction of institutions and individuals against the backdrop of a deeply racialized history, in which Blackness is coded as criminal.

What is privacy for already exposed people in the age of big data? For oppressed people, I think privacy is not only about protecting some things from view, but also about what is strategically exposed.

If we are to draw a parallel with the context of information technologies, racial fixes are better understood not as viruses but as part of the underlying code of operating systems -- often developed as solutions to particular kinds of predicaments without sufficient awareness of the problems that they help produce and preserve.

On closer inspection, I find that the varying dimensions of the New Jim Code draw upon a shared set of methods that make coded inequity desirable and profitable to a wide array of social actors across many settings; it appears to rise above human subjectivity (it has impartiality) because it is purportedly tailored to individuals, not groups (it has personalization), and ranks people according to merit, not prejudice (or positioning) -- all within the framework of a forward-looking (i.e. predictive) enterprise that promises social progress. These four features of coded inequity prop up unjust infrastructures, but not necessarily to the same extent at all times and in all places, and definitely not without eliciting countercodings that retool solidarity and rethink justice.

Whether or not design-speak sets out to colonize human activity, it is enacting a monopoly over creative thought and praxis. Maybe what we must demand is not liberatory designs but just plain old liberation.

If, as Cathy O'Neil writes, 'Big Data processes codify the past. They do not invent the future. Doing that requires moral imagination, and that's something only humans can provide,' then what we need is greater investment in socially just imaginaries. This, I think, would have to entail a socially conscious approach to tech development that would require prioritizing equity over efficiency, social good over market imperatives.

If, as many have argued, the rhetoric of human betterment distorts an understanding of the multifaceted interplay between technology and society, then a thoroughgoing commitment to justice has the potential to clarify and inspire possibilities for designing this relationship anew. Justice, in this sense, is not a static value but an ongoing methodology that can and should be incorporated into tech design. For this reason, too, it is vital that people engaged in tech development partner with those who do important sociocultural work honing narrative tools through the arts, humanities, and social justice organizing.