
Signal Processing

Who's driving the autonomous vehicle shift?

How many sensors does it take for a car to handle real-world driving safely? Waymo stacks lidar, radar, and cameras, spending roughly $100K in sensors per vehicle to buy redundancy. Tesla centers cameras in its production systems, skipping lidar and radar, and leans on fleet-scale data to close the gap.

The split reflects different assumptions about risk and cost. Sensor maximalists argue redundancy catches failures when individual systems degrade. Vision-only advocates insist adding sensors adds failure modes, not safety. The industry remains split over whether autonomous systems should verify reality through diverse hardware or sophisticated software.

The vision-only case is still debated, especially for robotaxis, and it has never made sense to me. Why limit machines to what humans have when they can perceive more? Cost matters, but safety should come first.

The core tradeoff

Each company's sensor stack represents a different approach. More sensors mean more data to fuse and higher costs. Fewer sensors mean cheaper deployment but narrower perception and less redundancy when conditions degrade.

Sensor stack archetypes

Waymo: layers every available sensor type. Their 6th-generation system integrates lidar, cameras, and radar into a fused perception pipeline. When one sensor fails or provides ambiguous data, others compensate.

The redundancy pays off in safety metrics. Swiss Re's analysis of 25.3 million fully autonomous miles shows an 88% reduction in property damage claims and a 92% reduction in bodily injury claims compared with human drivers. The cost is roughly $100K in sensors per vehicle, complex calibration, and massive data fusion compute.

Tesla: uses cameras alone, no lidar, no radar (they removed radar in 2021). The argument: humans drive with two eyes; vision should suffice if processing is sophisticated enough. Tesla trains neural networks on billions of real-world miles collected from consumer vehicles, a data advantage no competitor matches. If successful, the approach enables sub-$1K sensor cost and rapid fleet scaling.

For robotaxis, the vision-only case is harder to justify. Human drivers still cause too many fatalities each year, and matching human performance is not the goal. Autonomous systems need to be better than humans to justify deployment, especially when passengers surrender control. Safety-critical systems usually rely on redundancy because failure modes do not align.
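The redundancy argument can be sketched with a simple noisy-OR model: if sensor misses are independent, the fused system fails only when every sensor fails at once. The per-sensor probabilities below are invented for illustration, and real failures are correlated (heavy precipitation degrades cameras and lidar together), so this is an upper bound on the benefit, not a measurement:

```python
# Noisy-OR sketch of sensor redundancy. Detection probabilities are
# illustrative, and the independence assumption overstates real gains.
def composite_detection(p_sensors):
    """Probability that at least one sensor detects an obstacle,
    assuming sensor misses are independent."""
    p_all_miss = 1.0
    for p in p_sensors:
        p_all_miss *= 1.0 - p  # fused stack fails only if every sensor misses
    return 1.0 - p_all_miss

print(composite_detection([0.95]))               # camera alone
print(composite_detection([0.95, 0.90, 0.85]))   # camera + lidar + radar
```

Three mediocre sensors beat one good one here (1 - 0.05 x 0.10 x 0.15 = 0.99925), which is the whole case for fusion; the case against it is that correlated failure modes and fusion bugs eat into that margin.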

Cameras degrade in glare, rain, fog, and snow. These are conditions where lidar and radar can add redundancy. Without range-finding sensors, depth estimation relies entirely on vision algorithms that struggle with textureless surfaces and low-contrast scenes. The strategy trades cost against safety margin. If vision closes the gap, it wins on cost. If not, the margin is thin.
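The textureless-surface failure is easy to demonstrate with a toy 1D block-matching sketch (my own illustration, not any shipping stereo pipeline): disparity search finds a sharp best match on a textured signal, but on a flat signal every candidate shift scores identically, so the depth estimate is undefined.

```python
import numpy as np

def best_disparity(left, right, pos, patch=5, max_d=10):
    """Sum-of-squared-differences block matching at one position.
    Returns the winning disparity and the margin between the best and
    second-best scores (zero margin = ambiguous, no usable depth)."""
    ref = left[pos:pos + patch]
    scores = np.array([np.sum((ref - right[pos - d:pos - d + patch]) ** 2)
                       for d in range(max_d)])
    order = np.argsort(scores)
    return int(order[0]), float(scores[order[1]] - scores[order[0]])

rng = np.random.default_rng(0)
textured = rng.normal(size=200)   # high-texture surface
flat = np.zeros(200)              # textureless surface

# right image = left shifted by a true disparity of 4 pixels
print(best_disparity(textured, np.roll(textured, -4), pos=100))  # recovers d = 4
print(best_disparity(flat, np.roll(flat, -4), pos=100))          # all shifts tie
```

Depth then follows from disparity via triangulation, so a zero-margin match means no depth at all. Lidar sidesteps this entirely by measuring range directly.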

See the strain: Sensor Redundancy Rig

The interactive below lets you experience the argument firsthand. Toggle sensors off and watch composite detection confidence drop. Switch scenarios to see how fog cuts cameras while radar stays steadier, or how night conditions flip the reliability hierarchy. The disengagement rates are derived from actual California DMV 2023 filings.
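As a rough model of what the rig computes (the reliability numbers here are invented for illustration, not the DMV-derived figures the interactive uses), composite confidence per scenario comes from whichever sensors are toggled on:

```python
# Toy version of the rig: scenario-dependent standalone confidences
# (invented numbers), combined over the enabled sensors.
RELIABILITY = {
    "clear": {"camera": 0.95, "lidar": 0.93, "radar": 0.85},
    "fog":   {"camera": 0.55, "lidar": 0.70, "radar": 0.82},
    "night": {"camera": 0.60, "lidar": 0.92, "radar": 0.84},
}

def composite(scenario, enabled):
    """Confidence that at least one enabled sensor detects,
    assuming independent misses."""
    p_all_miss = 1.0
    for sensor in enabled:
        p_all_miss *= 1.0 - RELIABILITY[scenario][sensor]
    return 1.0 - p_all_miss

for scenario in RELIABILITY:
    print(scenario,
          round(composite(scenario, ["camera"]), 3),
          round(composite(scenario, ["camera", "lidar", "radar"]), 3))
```

Even with made-up numbers the shape of the argument shows: in fog the camera-only confidence craters while radar barely moves, and at night the reliability hierarchy flips toward lidar.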

Middle ground: Cruise (GM-backed) pursued Waymo-style sensor fusion but paused operations after an October 2023 incident where a robotaxi dragged a pedestrian 20 feet. The failure was in decision-making under ambiguity, not sensors. Better perception helps, but planning can still fail in edge cases. Cruise resumed limited testing in 2024 but shifted toward gradual consumer vehicle integration.

Aurora targets long-haul trucking on structured highway environments where lane discipline is predictable, speeds are constant, and pedestrians are absent. Easier problem, smaller market, but potentially faster path to commercial viability.

Constraints and outcomes

The "long tail" problem defines the challenge: systems perform well for long stretches but can fail on rare edge cases, and proving robustness against those rare events is the core of safety validation. The regulatory patchwork adds complexity: no federal mandate, and varying state and local laws. High-profile accidents erode public trust and invite stricter regulation industry-wide.

Redundancy costs money; minimal sensor stacks raise risk. Cutting both cost and risk at once is hard.