Code-Dependent
by Madhumita Murgia
Murgia exposes how AI systems encode discrimination and bias into our daily lives.
"AI does not see the world. It sees the data we fed it — and that data was made by us, with all our biases intact.".
Editorial Summary
Code-Dependent by Madhumita Murgia, a Financial Times journalist covering artificial intelligence, investigates how AI systems perpetuate discrimination, bias, and inequality across healthcare, criminal justice, hiring, and finance. Through on-the-ground reporting and interviews with affected communities, Murgia documents how machine learning algorithms, trained on biased historical data, automate and amplify systemic discrimination while appearing neutral and objective. The book distinguishes itself from other AI critiques by centering the voices and lived experiences of those harmed by algorithmic systems rather than leading with technical explanations. Murgia argues that the problem is not merely technical but structural: these systems reflect and reinforce existing power imbalances in society. Code-Dependent is an urgent investigation into how the push for automation and efficiency in artificial intelligence has created new mechanisms for discrimination at scale.
Perspective
"Technologists, policymakers, and anyone concerned with AI governance should read this now, as debates over the EU AI Act and algorithmic accountability intensify globally. Murgia's evidence-based reporting cuts through Silicon Valley's neutrality myths and exposes why fixing bias in machine learning requires confronting institutional racism and inequality, not just better datasets."