“We superimpose human social values onto a mathematical system. The question becomes – Whose values are encoded in the system?”
This quote from More than a Glitch by Meredith Broussard poses a question worth considering.
In More than a Glitch, she writes from her experience in academic research and algorithm auditing, as well as her own personal encounters with algorithmic bias. She empathetically navigates an area few venture into, retelling stories from her interviews with survivors of technochauvinism and interweaving them with insights from her research. The book is an excellent gateway into the field, and it expertly decodes and builds upon other well-known books, articles, and even tweets. (As X, formerly known as Twitter, has found, once a word enters the common vocabulary, it’s not so easily removed. Language moves more slowly. Hence, I will continue to say tweets.) Personally, the book contains so many great citations and references that I’ve added a number of them to my “to-be-read” list.
The case studies she uses to support her argument are stories of algorithmic oppression, and they are not limited to any one group. A clear proponent of intersectionality, Broussard expertly navigates racial, gender, and ability bias, and by focusing on the impact of bias itself, her message is clear: the negative consequences of these algorithms fall hardest on people who lack the resources to overcome them. And to be perfectly clear: these consequences can end in premature death.
The book is an indictment of technochauvinism and the resulting “tech bro” culture. Broussard seeks to go beyond merely illustrating how technology permeates society; she showcases the bias built into these systems. However, she does not leave the reader in a complete state of despair. By highlighting her own work and the work of her heroes and colleagues, she offers a glimmer of hope. Or at least, that was the case for me.
Perhaps Broussard herself most eloquently summarizes the key takeaway:
“If we are building AI systems that intervene in people’s lives, we need to maintain and inspect and replace the systems the same way we maintain and inspect and replace bridges and roads.”
Here I agree with her. There is a human obligation to acknowledge the failings of these systems and to ensure that algorithms do not merely reflect the flawed world as it is; they must be constantly audited and reviewed so that they reflect society as it changes, and not just one viewpoint of that flawed world. Technology should adapt to a changing world.
Overall, I think the book is a good read and would recommend it, especially to those interested in data science or public policy. And personally, I look forward to tackling my new reading list.