Automated systems don’t remove human bias; they move it, and they hide it from view.
I grew up in small-town America. The kind of place where you knew the person behind the counter at the hardware store and bought your milk from someone whose name you could actually remember. It wasn’t romantic — it was just how commerce worked. You exchanged money for goods, and nobody was quietly adjusting your price based on what phone you carried or how long you lingered in the dairy aisle.
That world is vanishing. Not because people stopped wanting it, but because a different architecture of exchange has been built on top of it — one that watches, sorts, predicts, and prices with a precision that would make any historical caste enforcer envious.
This essay is about that architecture. It draws on the work of scholars who have been mapping it for years — Shoshana Zuboff, Ruha Benjamin, Cathy O’Neil, Virginia Eubanks, and others — and it tries to connect their insights to something I think we’re collectively failing to name: the emergence of a digital caste system. Not caste in the sense of explicit hereditary ranking, but caste as a structural logic — a way of sorting people into categories that determine access, opportunity, and dignity, enforced not by tradition but by code.
The mechanism is new. The shape is old.
