Tesla Autopilot

In October 2014, Tesla Motors began selling sedans with ultrasonic sensors placed around both bumpers and sides. For an additional fee, customers could purchase a "technology package" that used those sensors, along with a camera, a front radar, and digitally controlled brakes, to help avoid collisions by essentially letting the car take over and stop before crashing. Most of that hardware sat idle, gathering data, until a year later, when the company sent a software update to the 60,000 sensor-equipped cars it had sold in that time. The update was officially named Tesla Version 7.0, but its nickname, Autopilot, was what stuck.

The update did in fact give drivers something similar to what airline pilots use in flight. The car could manage its speed, steer within a lane, change lanes, and park itself. Some of these features, like automatic parallel parking, were already offered by other carmakers (including Mercedes, BMW, and GM), but the self-steering arrived suddenly, overnight, via a software update, and it was a giant leap toward full autonomy.

Tesla customers were excited and posted videos of themselves on the highway, hands-free, reading the paper and sipping coffee. Autopilot existed in a legal gray area, but it was a big step toward an ever-nearing future, one that will reshape not just the car and our relationship with it but the road and our entire transportation infrastructure.

None of the vehicles Tesla offers is fully autonomous, and none should be treated as such. Autopilot should be used as nothing more than a slightly smarter cruise control. The technology is, in Tesla's own words, in a "public beta test," and it has some very dangerous bugs. Drivers have reported that it does not always recognize vehicles at a complete stop and does not always alert the driver when the car cannot safely complete a maneuver.

The first recorded accident ended in the driver's death. When a semi-truck turned across a divided highway, neither the driver nor the vehicle's software applied the brakes. The car went under the trailer, which sheared off its roof, and continued nearly 300 feet until it struck a utility pole. There have been a number of other accidents involving Autopilot in multiple countries, and they should serve as a warning to owners: keep a hand on or near the wheel, and remember that Autopilot is still not fully autonomous, so the driver should pay attention to the road just as they would when using cruise control.

Autopilot does have helpful features and isn't without merit. The car can't start in Autopilot; it requires a set of circumstances (good data, basically) before you can engage the setting: clear lane lines, a relatively constant speed, a sense of the cars around you, and a map of the area you're traveling through, roughly in that order. Most drivers are bad at estimating distances to begin with and constantly try to switch lanes whenever the next one looks faster, causing accidents in the process. With Autopilot, you no longer have to stare at the bumper ahead of you in stop-and-go traffic; you can look around and watch the variety of bad decisions other drivers make, stopping and starting and stopping again, while the car accelerates and slows more smoothly than it ever could with a typical driver in charge.
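As a rough illustration of those engagement conditions, the sketch below checks each precondition before allowing the feature to switch on. The class, function name, and thresholds are invented for this example and are not Tesla's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DrivingContext:
    """Hypothetical snapshot of what the car currently 'knows'."""
    lane_lines_visible: bool     # cameras see clear lane markings
    speed_variance_kph: float    # how steady the recent speed has been
    surroundings_tracked: bool   # sensors have a fix on nearby cars
    map_available: bool          # map data covers the current road

def can_engage_autopilot(ctx: DrivingContext) -> bool:
    """Return True only when every precondition described above holds.

    The 5 km/h variance threshold is illustrative, not a real Tesla value.
    """
    return (
        ctx.lane_lines_visible
        and ctx.speed_variance_kph < 5.0   # roughly constant speed
        and ctx.surroundings_tracked
        and ctx.map_available
    )

# A steady highway cruise with clear markings and map coverage engages;
# losing the lane lines does not.
print(can_engage_autopilot(DrivingContext(True, 2.1, True, True)))   # True
print(can_engage_autopilot(DrivingContext(False, 2.1, True, True)))  # False
```

The point of the sketch is simply that engagement is gated on data quality: if any one input is missing or unreliable, the feature stays off and the human keeps driving.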

With its incremental approach, Tesla stands in contrast to Google and other companies that have small test fleets gathering data in hopes of someday launching fully autonomous cars. For Tesla, its customers and their partially autonomous cars are a widely distributed test fleet. The hardware required for true autonomy is already in place, so the transition can play out in software updates. Musk has said that could be technically feasible — if not legally so — within two years.

Fully Autonomous Hardware?

According to the electric-car company, vehicles now leaving the production line will be "hardware ready" for fully autonomous travel. Tesla's new sensor suite upgrades what was a single forward-facing camera to eight cameras that give a 360-degree view around the car. It also updates the 12 ultrasonic sensors while keeping a single forward-facing radar. Yet independent experts and representatives of competing firms say this system is still insufficient for full Level 5 autonomy, the National Highway Traffic Safety Administration's highest rating, which requires more (and better) radar, multiple cameras with different apertures at each position, and 360-degree laser-sensing capabilities.
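For reference, the snippet below restates the sensor counts described in this section as a simple side-by-side comparison. The dictionary names, keys, and the diff helper are my own shorthand, not any official Tesla specification.

```python
# Sensor counts as reported above; key names are illustrative shorthand.
AUTOPILOT_1_SUITE = {
    "cameras": 1,              # single forward-facing camera
    "ultrasonic_sensors": 12,
    "forward_radar": 1,
    "lidar": 0,
}

AUTOPILOT_2_SUITE = {
    "cameras": 8,              # 360-degree view around the car
    "ultrasonic_sensors": 12,  # updated units, same count
    "forward_radar": 1,
    "lidar": 0,                # still no 360-degree laser sensing
}

def hardware_delta(old: dict, new: dict) -> dict:
    """Per-sensor change in count between the two generations."""
    return {k: new[k] - old[k] for k in new}

print(hardware_delta(AUTOPILOT_1_SUITE, AUTOPILOT_2_SUITE))
# {'cameras': 7, 'ultrasonic_sensors': 0, 'forward_radar': 0, 'lidar': 0}
```

The comparison makes the critics' point concrete: the camera count jumps, but the radar and laser-sensing rows barely move.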

What Tesla's upgraded hardware does do is vastly improve the company's ability to pull high-quality data from its vehicles already on the road, giving it an unrivaled ability to comply with new regulatory guidelines requiring granular data about autonomous-drive functions in a variety of conditions. Whereas its competitors' autonomous-drive programs harvest data from small test fleets and extrapolate from there, Tesla has made every car it sells into an independent experiment exposed to conditions that can only be found on the open road. All this real-world data gives Tesla a unique opportunity to validate its Autopilot technology.

If the company had announced Autopilot 2.0 as another step toward an eventual fully autonomous system, this would be a good (if not earth-shattering) development. Unfortunately, that's not what Tesla did. Instead, it called its new hardware suite "full self-driving hardware" and said it would demonstrate the system's ability to drive cross-country without any human intervention. Tesla even hinted that a future feature would allow its cars to be rented out as autonomous taxis when not in use by their owners.

Though Tesla’s website noted that many of these features will need validation and regulatory approval, this caveat was lost in the hype. As with Autopilot 1.0, Tesla is again inviting a mismatch between owner/operator expectations and its systems’ capabilities.

True, human drivers are extremely dangerous, and yes, autonomous-drive technology has the potential to save lives. But there is real danger in assuming such a system is more capable than it actually is.

Tesla has taken another important step toward eventual autonomy, but it needs to dial back its public statements and sales strategy to reflect only what each update can actually do, so that it does not mislead owners and put everyone on the road at risk.
