For years, we’ve seen acts of “courage” run rampant in the tech industry, first with the removal of the 3.5mm headphone jack, then of bundled chargers and SIM card trays. But it appears this disease is spreading to the automotive industry and—surprise, surprise—it is Tesla that has caught the bug.
The company is going the whole hog on its Tesla Vision concept, relying solely on cameras to provide its driver assistance technologies. Last year, it controversially ditched the radar sensors—used for autonomous emergency braking and adaptive cruise control—on its cars, and now it is also removing the ultrasonic parking sensors. The change starts with the Model 3 and Model Y in North America, Europe, the Middle East and Taiwan early this month, and will be rolled out to the Model S and X next year.
These sensors are crucial not just for automated parking features but also for the simple act of judging the distance to an obstacle. In fact, Tesla says that for the time being, drivers will not get any audible or visual warnings when parking, so they’ll have to rely solely on the 360-degree camera system. The park assist, remote parking and Smart Summon (which autonomously drives the car from a parking spot to your location) features will also be disabled; the company says it will enable them after a “short period.”
In Tesla’s words, the features culled for the time being are:
- Park Assist: Alerts you of surrounding objects when the vehicle is traveling below 8 km/h
- Autopark: Automatically manoeuvres into parallel or perpendicular parking spaces
- Summon: Manually moves your vehicle forward or in reverse via the Tesla app
- Smart Summon: Navigates your vehicle to your location or location of your choice via the Tesla app
In its announcement, Tesla says it is “confident that this is the best strategy for the future of Autopilot [its semi-autonomous driving system] and the safety of our customers.” But there’s clearly an element of cost-cutting as well—even though the components themselves are very cheap (the technology has existed for four decades now), the company still stands to save a significant amount of money, given that each car carries 12 of these sensors and the production volumes it is chasing.
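To put some rough numbers on that (these are illustrative assumptions on our part, not figures Tesla has published): if each ultrasonic sensor, together with its wiring and fitment, costs the company in the region of US$5 to US$10, deleting 12 of them saves perhaps US$60 to US$120 per car. Multiplied by the roughly 1.5 million vehicles a year the company is chasing, that works out to somewhere between about US$90 million and US$180 million annually.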
Tesla is also claiming that the functionality of these sensors will be replaced by a wider roll-out of its “vision-based occupancy network,” a camera-based object mapping system first introduced on vehicles specced with the Full Self-Driving Beta. This gives the car “spatial positioning, longer-range visibility and ability to identify and differentiate between objects,” it said, adding that the functionality will be improved over time.
Does Tesla Vision actually work?
Tesla grabbed the headlines last year when it decided to replace the radar sensors on its cars with the Tesla Vision suite of cameras. There are eight cameras used for this system—three at the top of the windscreen, two on the front fenders, two on the B-pillars and one near the rear number plate. A “powerful” neural network is used to process the imagery from these cameras and detect road markings and obstacles.
This stands in sharp contrast to the rest of the industry, which is responding to demand for greater driving autonomy by increasing the number of sensors on each vehicle. Most notably, a few car companies (such as Volvo) are moving ever closer to introducing lidar sensors for accurate object mapping—the same objective Tesla is trying to achieve using just cameras.
The company claims that since it jettisoned radar sensors, its cars have either matched or improved their active safety ratings in the US and Europe, and “perform better in pedestrian automatic emergency braking (AEB) intervention.” But this statement is complicated by a couple of key factors.
Firstly, while the US Insurance Institute for Highway Safety (IIHS) has tested 2022 Tesla models for frontal crash prevention, it only provides scores for AEB—a feature that generally relies only on the front cameras anyway. Its European counterpart, Euro NCAP, hasn’t tested a Tesla since 2020, when it gave the Model 3 just two out of four stars for assisted driving.
The reality is that since Tesla took radar sensors away, owners have reported a spate of “phantom braking” incidents, where their cars brake hard for no apparent reason—often because the cameras misinterpret vehicles in the oncoming lane as hazards. As you can imagine, this behaviour is highly dangerous and can lead to accidents. It’s so severe that the US National Highway Traffic Safety Administration (NHTSA) opened an investigation after it received 354 complaints; that number had swelled to 758 by June.
Tesla has been in regulatory hot soup before—the company uses the terms “Autopilot” and “Full Self-Driving” to market its driver assistance features, despite the fact that none of its systems are fully autonomous; they sit at the upper end of Level 2 semi-autonomy at best. Regulators have forced Tesla to add warnings to this effect in manuals and support documents, but the terms continue to be used to this day—leading some owners to drive with their hands off the wheel.
Beyond the cost savings, there’s a very good reason why other carmakers outfit their models with a variety of sensors, and that’s redundancy. As good as cameras and vision-based object detection can be—and they can be very good—they still have their limitations. Cameras can get dirty, and unlike ultrasonic, radar and lidar sensors, they cannot measure distance directly, instead having to infer depth from the images they capture. Processing latency can also impact a car’s ability to detect a hazard in time, although this is steadily improving.
Having several types of sensor ensures that when one form of detection fails, another can step in to cover for it. And as the saying “jack of all trades, master of none” suggests, a set of specialised sensors, each doing its own job better than a bank of cameras can, is probably still the better solution—at least until the technology improves significantly over the next few years. The upshot is this: if you’re buying a Tesla anytime in the near future, be extra careful when you’re parking.