
Code Dependent: Living in the Shadow of AI (Summary)

by Madhumita Murgia

Imagine a knock on your door. It's a child protection agent, here to investigate you for child abuse. The accuser? Not a neighbor, not a teacher, but an algorithm that has flagged you as a high-risk parent based on opaque data points like your zip code and how often you've visited an emergency room. This isn't science fiction; it was the reality for a father in Illinois, one of the many people whose lives are being quietly dismantled by automated systems they have no power to challenge.

Your New Boss is an Algorithm

AI is increasingly used to manage, monitor, and even fire workers, creating a system of relentless surveillance and automated decision-making that offers workers neither empathy nor any human recourse.

An Amazon delivery driver in the UK is constantly tracked by an app that monitors his every move. He receives automated demerits for things beyond his control, like traffic, and can be 'deactivated'—fired—by the system without ever speaking to a human manager.

AI Inherits Our Worst Biases

Far from being objective, AI systems are trained on historical data riddled with human prejudice. As a result, they often amplify and automate discrimination against marginalized communities.

In the Netherlands, a government algorithm designed to detect welfare fraud disproportionately targeted low-income families and ethnic minorities, leading to thousands being falsely accused and financially ruined in a national scandal.

AI is Powered by an Invisible Human Workforce

The illusion of seamless AI is built on the backs of millions of low-paid 'ghost workers' who perform repetitive, often psychologically damaging tasks like content moderation and data labeling.

A content moderator in India spends his days viewing and categorizing the most horrific content on the internet—beheadings, child abuse, extreme violence—to train AI systems, all while suffering from severe, untreated PTSD.

Life-and-Death Decisions are Being Outsourced to Code

Critical decisions in healthcare, criminal justice, and immigration are being handed over to predictive algorithms, often with devastating consequences and little accountability.

A Syrian asylum seeker in the UK has his refugee claim evaluated by an AI system that analyzes his voice for signs of deception. His entire future rests on the judgment of a machine with a questionable record of accuracy and fairness.
