
Compiled Thoughts

Who's Driving the Autonomous Vehicle Shift?

How many sensors does it take to trust a machine with your life? Waymo says as many as possible: lidar for depth, radar for velocity, cameras for context, $100K+ per vehicle in redundant perception. Tesla bets on one type: cameras alone, no lidar, no radar, with billions of training miles teaching vision systems to see like humans.

The split reveals competing theories about safety. Sensor maximalists argue redundancy catches failures when individual systems degrade. Vision-only advocates insist adding sensors adds failure modes, not safety. The industry remains fractured over whether autonomous systems should verify reality through diverse hardware or sophisticated software.

Honestly, the vision-only argument never made sense to me. Why limit machines to what humans have when they can perceive more? Cost matters, but safety should come first.

Compare official California DMV disengagement filings. Switch between 2022 and 2023, filter for adverse weather or sensor faults, and see how Waymo, Cruise, Zoox, and Aurora fare on miles per intervention and edge-case density.
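The headline comparison those filings support reduces to one number per company: autonomous miles driven per disengagement. A minimal sketch of that calculation, with made-up figures standing in for the real DMV data and column layout:

```python
# Toy sketch: miles per disengagement from DMV-style filing rows.
# All company figures below are illustrative placeholders, NOT values
# from the actual California DMV dataset.
from collections import defaultdict

filings = [
    # (company, year, autonomous_miles, disengagements)
    ("Waymo",  2023, 1_000_000, 25),
    ("Cruise", 2023,   500_000, 50),
]

def miles_per_disengagement(rows):
    """Aggregate miles and disengagement counts per company."""
    totals = defaultdict(lambda: [0, 0])
    for company, _year, miles, events in rows:
        totals[company][0] += miles
        totals[company][1] += events
    return {c: m / e for c, (m, e) in totals.items() if e > 0}

print(miles_per_disengagement(filings))
```

Higher is better, but the metric rewards conservative deployment: a fleet that avoids hard routes can post impressive miles-per-intervention numbers while dodging exactly the edge cases that matter.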

The core challenge

Each company's sensor stack represents a different approach. More sensors mean more data to fuse and higher costs. Fewer sensors mean cheaper deployment but narrower perception and less redundancy when conditions degrade.
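The redundancy trade-off can be made concrete with a textbook fusion technique: inverse-variance weighting of independent range estimates. This is a simplified sketch, and the per-sensor noise figures are invented for illustration, not real sensor specs:

```python
# Inverse-variance fusion of independent range estimates (a minimal
# stand-in for a real perception pipeline). Noise values are made up.
def fuse_ranges(estimates):
    """estimates: list of (range_m, std_dev_m) pairs, one per sensor.
    Returns the fused range and its (smaller) standard deviation."""
    weights = [1.0 / (s ** 2) for _, s in estimates]
    total = sum(weights)
    fused = sum(r * w for (r, _), w in zip(estimates, weights)) / total
    fused_std = (1.0 / total) ** 0.5
    return fused, fused_std

# Hypothetical readings in metres: lidar (precise), radar, camera depth
lidar, radar, camera = (50.2, 0.1), (49.5, 1.0), (52.0, 3.0)
fused, sigma = fuse_ranges([lidar, radar, camera])

# Simulate a lidar fault: fuse only the remaining sensors
degraded, sigma2 = fuse_ranges([radar, camera])
```

Dropping the most precise sensor doesn't just lose one reading; the uncertainty of the whole fused estimate widens, which is the redundancy argument in miniature.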

Waymo: sensor fusion maximalist

Waymo layers every available sensor type. Their 6th-generation system integrates lidar, cameras, and radar into a fused perception pipeline. When one sensor fails or provides ambiguous data, others compensate.

The redundancy pays off in safety metrics. Swiss Re analysis of 25.3 million fully autonomous miles shows 88% reduction in property damage claims and 92% reduction in bodily injury claims compared to human drivers. The cost: $100K+ in sensors per vehicle, complex calibration, and massive data fusion compute.

Tesla: vision-only contrarian

Tesla uses cameras alone: no lidar, no radar (radar was removed in 2021). The argument: humans drive with two eyes, so vision should suffice if the processing is sophisticated enough. Tesla trains neural networks on billions of real-world miles collected from consumer vehicles, a data advantage no competitor matches. If successful, the approach enables sub-$1K sensor cost and rapid fleet scaling.

The argument is fundamentally flawed for robotaxis. Humans kill 40,000 people per year in the US alone; matching human performance isn't the goal. Autonomous systems must be significantly better than human drivers to justify deployment, especially when passengers surrender control. Safety-critical automation demands sensor redundancy precisely because failure modes differ from biological perception.

Cameras fail in direct sunlight glare, heavy rain, fog, and snow, exactly the conditions where lidar and radar provide critical redundancy. Without range-finding sensors, depth estimation relies entirely on vision algorithms that struggle with textureless surfaces and low-contrast scenes. The strategy prioritizes cost over safety margins: a win if vision proves sufficient, catastrophic if it doesn't.

Toggle lidar, radar, and camera coverage while the stack encounters fog, construction detours, and occluded pedestrians. Composite confidence and expected disengagement rates recalc using weights sourced from the 2023 DMV narratives.
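A toy version of that toggle logic, assuming (unrealistically) independent sensors and using invented per-condition reliabilities rather than the DMV-derived weights the interactive uses:

```python
# Hypothetical per-condition sensor reliabilities: the probability that
# each sensor yields a usable detection. Values are invented for
# illustration, not sourced from DMV narratives.
CONDITION_RELIABILITY = {
    "fog":          {"lidar": 0.6, "radar": 0.9, "camera": 0.3},
    "construction": {"lidar": 0.9, "radar": 0.7, "camera": 0.8},
}

def detection_confidence(enabled, condition):
    """P(at least one enabled sensor detects), assuming independence."""
    p_all_miss = 1.0
    for sensor in enabled:
        p_all_miss *= 1.0 - CONDITION_RELIABILITY[condition][sensor]
    return 1.0 - p_all_miss

full_stack = detection_confidence({"lidar", "radar", "camera"}, "fog")
vision_only = detection_confidence({"camera"}, "fog")
```

Even with these toy numbers the shape of the argument is visible: in fog the full stack stays near certainty because radar covers for the cameras, while the vision-only confidence collapses to whatever the camera alone can manage.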

The middle ground and alternatives

Cruise (GM-backed) pursued Waymo-style sensor fusion but paused operations after an October 2023 incident where a robotaxi dragged a pedestrian 20 feet. The failure wasn't sensor hardware; it was decision-making logic under ambiguous conditions. Perfect perception doesn't guarantee safe planning when the world presents no-win scenarios. Cruise resumed limited testing in 2024 but shifted toward gradual consumer vehicle integration.

Aurora avoids urban chaos entirely, targeting long-haul trucking on structured highway environments where lane discipline is predictable, speeds are constant, and pedestrians are absent. Easier problem, smaller market, but potentially faster path to commercial viability.

Technical and regulatory challenges

Scroll through CPUC, DMV, NHTSA, and ADOT decisions from 2022–2024. Each event links to the official order and signals how Waymo, Cruise, Zoox, or Aurora had to pause or retool operations.

The "long tail" problem is the key challenge: systems perform well for thousands of miles, then fail on rare edge cases. Proving robustness against a near-infinite space of rare events is what safety validation amounts to. The regulatory patchwork adds complexity: there is no federal mandate, and state and local laws vary. High-profile accidents erode public trust and trigger stricter regulations industry-wide.
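The scale of that validation burden can be sketched with the standard zero-failure Poisson bound. The human fatality rate used below is a rough public figure (on the order of 1 per 100 million US vehicle miles), not a precise one:

```python
import math

# Back-of-envelope: how many failure-free test miles are needed to
# claim, at a given confidence, that the failure rate is below a
# target? Zero-failure Poisson bound: N = -ln(1 - confidence) / rate.
def miles_to_demonstrate(target_rate_per_mile, confidence=0.95):
    return -math.log(1.0 - confidence) / target_rate_per_mile

# Rough US human-driver fatality rate: ~1 per 100 million miles
human_fatality_rate = 1.0 / 100_000_000

needed = miles_to_demonstrate(human_fatality_rate)
print(f"{needed:,.0f} failure-free miles")
```

With these assumptions the answer lands near 300 million failure-free miles just to match, not beat, human drivers at 95% confidence, which is why fleets lean on simulation and statistical extrapolation rather than raw road testing alone.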

Outcome determinants

Sensor maximalists pay for redundancy with capital; vision-first challengers pay with risk. You can cut sensors or you can cut risk, but physics and public trust make it hard to cut both.