Illuminating Technology’s Blind Spots: Report on the Women and Technology Keynote Address

“Dodging Silver Bullets: Understanding the Role of Technology in Social Change” with Chelsea Barabas

Chelsea Barabas delivered the keynote address of the Rewriting the Code: Women and Technology Forum on Friday, March 1. The Forum, which took place on Saturday, March 2, focused on exploring careers in technology, humanities, social change, communications, and the arts, and included several panel discussions, such as Technology and Social Justice, Making History, The Best Career Advice I’ve Ever Gotten, and Technology and Storytelling.

In her keynote address, Barabas, a research scientist at MIT, spoke about the social implications of technology and how technology can be used to make the world a better place.

Striving to understand the role of technology in social change, Barabas discussed technology’s diversity problem. The prevailing theory for the “pale and male” look of the homogeneous technology workforce is that there are not enough diverse workers with the required skills to hire. However, research shows that the “unemployment rate for Black and Native American engineering graduates [is] double that of their peers” and that “women make up 39% of the science and engineering graduates with only 15% employed in a STEM career, a rate that is half that of their male peers.” Because these potential hires have the required skills but are still not getting tech jobs, the homogeneous workforce cannot be explained by a lack of skills.

Chelsea Barabas giving the Rewriting the Code: Women and Technology Keynote Address on March 1
Photo Credit: Jenny Bagger ’19

In response, algorithmic recruitment platforms emerged to solve these hiring, recruitment, and retention issues. These platforms created comprehensive databases of coders and built algorithms on top of them that provided hiring recommendations and functioned as search engines for recruiters, treating the industry’s tendency to hire predominantly white men as an information processing problem. They seemingly allowed firms to find talent more efficiently and more accurately in places where they were not already looking.

However, these algorithms posed many questions, which Barabas raised: How are these algorithms developed? What ends up fueling their recommendations? How do we create a measuring stick to evaluate talent? Who ends up visible, and who invisible, under this algorithmic gaze?

Most of the time, the factors most relevant to these algorithms were the same ones recruiters had previously used when scoping out new talent, such as the universities candidates attended and their professional pedigrees. By reflecting the decisions recruiters made in the past, the technology simply reinforces old practices behind a “veneer of scientific objectivity and neutrality,” as Barabas put it. People often view technology as an objective tool, free of human bias, but in reality this is not the case. Maintaining this mistaken view increases the risk of obscuring society’s past discriminatory practices behind a mask of scientific objectivity.

“AI is like a child. It absorbs the default assumptions about how the world works, unless we teach it otherwise.”

Chelsea Barabas

Therefore, we must teach it otherwise. The perceived objectivity of technology creates the risk of legitimizing the biases of those who program it. Because technology inherits the blind spots of those who create it, we must expand the diversity of its creators. This means cultivating a heterogeneous workforce and embracing the necessity of diverse programmers.

“There are no silver bullet solutions to these social problems. If a technology solution seems too good to be true, it probably is. If people on your team are trying to build something and can’t see the holes in it, then your team probably isn’t diverse enough.”

Chelsea Barabas

Barabas concluded the Keynote Address with this dose of reality and call to action. Technology can provide solutions to social problems, but finding and selecting a diverse team of coders to create that technology is a social problem in itself, and one that technology has proven unable to fix.

Barabas’ Keynote Address kickstarted an enriching weekend of women discussing, learning from each other, and sharing their experiences at the intersection of technology and the humanities.

-Jenny Bagger ’19, DH Undergraduate Fellow

Chelsea Barabas’ visit was sponsored by the Class of 1963 Fund.
