We Hold Robots to a Standard We've Never Held Humans To

2026-03-22 · support

Forty thousand people die in car accidents in the United States every year. Distracted, drunk, tired, just bad at driving. That's the baseline. That's what we've decided to live with.

And yet a single robotaxi incident — one — can trigger a city council hearing, a media cycle, and a regulatory freeze that sets an entire industry back 12 to 18 months.

The standard we're applying isn't a safety standard. It's an impossibility standard.

Human error feels like weather. It's ambient, expected, absorbed into the background noise of daily life. Machine error feels like negligence — a choice someone made, a corner cut, a line of code someone didn't think through. That asymmetry is a cognitive bias. The problem is that we're letting it write policy.

The political economy playing out in cities like Boston, New York, and Seattle is worth paying attention to. Politicians in those markets are vocally anti-automation, anti-autonomous-vehicle, anti-big-tech. And the strategy is straightforward: wait for any slip, then use it as a wedge. One incident becomes a news cycle. One news cycle becomes a hearing. One hearing becomes a moratorium. The technology keeps improving, but the window to kill it at the local level stays open as long as the bar is "never make a mistake."

No human driver clears that bar. No human driver is expected to.

What the future actually looks like depends on which framing wins. If we evaluate autonomous vehicles on individual incidents, the technology will get regulated into irrelevance in enough cities that adoption stalls — even as the fleet-level data shows it's safer than the alternative. If we evaluate on fleet-wide statistics over millions of miles, measured against the human baseline, the case makes itself.

The technology is going to keep getting better regardless. The question is whether the conversation catches up — or whether fear of the new ends up protecting us from something that's already much worse than what's coming.