BIAS EMBEDDED IN TECHNOLOGY—AN EMANCIPATORY CHALLENGE FOR OUR TIME
Over the years, Advances in Nursing Science (ANS) has published a large number of articles addressing how bias, prejudice, and stereotyping harm the health and well-being of all people,1 but especially those who are disadvantaged by them. A number of these articles acknowledge the persistence of harmful practices and attitudes in health care in general, and in nursing in particular, calling for emancipatory responses—awareness and action—to change the status quo. This issue adds to this significant body of literature in ANS, and yet as we take in the important insights of these articles and of the broader nursing literature addressing these challenges, we cannot avoid acknowledging the intractability and persistence of the very problems we aspire to change.
The remarkable growth of information technology, particularly on the Internet and the World Wide Web, was assumed by some to be a path to creating a more equal universe—a world where knowledge would be democratized. Now, it is clear that the vision of a democratized Internet environment has faded, and in its place we see information technology weaponized as a tool for domination and power. The fact is that technological tools are not free from bias. They are created by human beings, who build into the technologies they create their attitudes, biases, and stereotypes, often in ways so subtle that most people cannot detect what is actually happening.
A recent interview posted on the Scholarly Kitchen with Safiya Umoja Noble,2 author of the new book Algorithms of Oppression,3 explores some of the challenges facing scholarly publishing and the persistence of bias built into technologies, both in content and in structure. Another notable blog post by Charlton McIlwain,4 an edited volume titled The Intersectional Internet: Race, Sex, Class, and Culture Online,5 and Sara Wachter-Boettcher's book Technically Wrong6 are additional excellent resources for beginning to understand the problems we face.
It is notable that these excellent sources are, for the most part, authored by women and people of color, and they are published on sites and by publishers that are not in the mainstream. Nevertheless, for those of us in dominant groups where dynamics of discrimination are structured and persistent, it is time to take notice and to take every action we can in the direction of change. McIlwain's post4 provides important clues about our own ongoing complicity, knowing or unknowing, in perpetuating the problem, even in the ways we access information using search engines. McIlwain's investigations show the tendency of search engine users to migrate toward sources that are produced by, and that reflect, the dominant cultural perspective, while avoiding sources that are produced by, or reflect the views of, culturally different and minority authors.
Naming, acknowledging, and understanding the subtle ways in which technology perpetuates bias is a start. Next, it is up to each of us to shift our practices and habits, and to challenge the assumptions underlying our interactions with information technologies. It is time to take notice of the writings of nurses of color, and of other scholars who are dedicated to overcoming bias and discrimination. It is time to make sure students know this literature, that we nurture habits of deep reflection on the ways we all participate in systems of injustice, and that we shift our own practices to fight injustice and racism, not sustain them.
—Peggy L. Chinn, PhD, RN, FAAN
3. Noble SU. Algorithms of Oppression: How Search Engines Reinforce Racism. New York, NY: NYU Press; 2018.
5. Noble SU, Tynes BM, eds. The Intersectional Internet: Race, Sex, Class, and Culture Online. Switzerland: Peter Lang International Academic Publishers; 2016.
6. Wachter-Boettcher S. Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech. New York, NY: W. W. Norton & Company; 2017.