The EV maker adopts a camera-only approach to Autopilot, but will it be enough for Level 3 driving?
Tesla will rely solely on cameras for Autopilot operation, dropping ultrasonic sensors along with radar.
Some features will ship disabled in new cars and be activated later, once cameras can fully replace the functions of the ultrasonic sensors in practice.
The EV maker has shunned LiDAR and is not seen as advancing toward SAE Level 3 driving.
A year after Tesla announced it would move to a vision-only approach to Autopilot, the automaker has revealed that ultrasonic sensors will follow radar out of its cars in favor of cameras. Ultrasonic sensors detect objects at close range and were part of the original Autopilot suite, which paired a front-facing radar with eight cameras around the car.
Now, the small sensors that are best known for detecting vehicles during parking maneuvers are on the way out, as Tesla embraces a camera-based system.
"In 2021, we began our transition to Tesla Vision by removing radar from Model 3 and Model Y, followed by Model S and Model X in 2022," the automaker said. "Today, in most regions around the globe, these vehicles now rely on Tesla Vision, our camera-based Autopilot system."
Tesla says it will remove ultrasonic sensors (USS) from the Model 3 and the Model Y first, with the Model S and the Model X losing them starting in 2023. In the meantime, Tesla Vision vehicles built without USS will be delivered to buyers with a number of systems inactive, including Autopark, Park Assist, Summon, and Smart Summon. Tesla plans to restore this functionality later via an over-the-air update, once Tesla Vision achieves parity with the ultrasonic sensors.
"Along with the removal of USS, we have simultaneously launched our vision-based occupancy network—currently used in Full Self-Driving (FSD) Beta—to replace the inputs generated by USS," the automaker added. "With today’s software, this approach gives Autopilot high-definition spatial positioning, longer range visibility and ability to identify and differentiate between objects."
Tesla has famously shunned the more-is-better approach to sensors in its Autopilot and FSD systems, rejecting LiDAR outright, which places it in a minority within the autonomous-tech industry. Other automakers and developers have instead opted to incorporate as many sensor types as possible and have helped drive down the cost of several crucial sensors along the way. Vehicles with Level 3 capability that will soon arrive on the market feature new solid-state LiDAR sensors that have become far more affordable over time.
Critics of Tesla's move have noted that a camera-only approach leaves driver-assist systems borderline useless at night, and argue that its avoidance of LiDAR is, at its core, a cost-cutting measure. Some industry observers see the removal of ultrasonic sensors as another cost-cutting step, one that shifts even more of the burden onto software that must infer depth and distance from two-dimensional images.
It's worth noting that no Level 3 or Level 4 system currently in development is slated to rely solely on cameras; if anything, automakers are packing as many sensor types as possible into Level 4 robotaxis. Some industry watchers read Tesla's camera-only plan as confirmation that Autopilot and FSD will not move beyond SAE Level 2, a limit the automaker has acknowledged to regulators in the past and one that requires constant driver attention.
By Jay Ramey
https://www.autoweek.com/news/technology/a41544080/tesla-drops-ultrasonic-sensors-autopilot/