Non-fiction · Beginner · narrative · critical · research

Code-Dependent

by Madhumita Murgia

Not enough ratings yet — via Open Library

Murgia exposes how AI systems encode discrimination and bias into our daily lives.

"AI does not see the world. It sees the data we fed it — and that data was made by us, with all our biases intact.".

Editorial Summary

Code-Dependent by Madhumita Murgia, a Financial Times journalist and AI researcher, investigates how artificial intelligence systems perpetuate discrimination, bias, and inequality across healthcare, criminal justice, hiring, and finance. Through on-the-ground reporting and interviews with affected communities, Murgia documents how machine learning algorithms—trained on biased historical data—automate and amplify systemic discrimination while appearing neutral and objective. The book distinguishes itself from other AI critiques by centering the voices and lived experiences of those harmed by algorithmic systems rather than focusing primarily on technical explanations. Murgia argues that the problem is not merely technical but structural: these systems reflect and reinforce existing power imbalances in society. Code-Dependent serves as an urgent investigation into how the push for automation and efficiency in artificial intelligence has created new mechanisms for discrimination at scale.

Perspective

"Code-Dependent does what most algorithmic harm books don't: it centers the people being hurt rather than the systems doing the hurting, giving the abstract problem a human scale that makes it impossible to dismiss. Murgia's distinctive contribution as a journalist is her on-the-ground access to affected communities across multiple countries, documenting how bias plays out differently depending on which society has deployed the system. Readers who have grown numb to abstract AI ethics arguments will find that specific human stories restore the urgency."

Similar Books

Matched by concept and theme