
Elon Musk’s self-driving evangelism masks risk of Tesla autopilot, experts say


Tesla’s ambitious futurism has earned it loyal fans, but after the death of a driver using Autopilot, some say the company should be more explicit about its limitations

Elon Musk’s rockets have only ever gone to low Earth orbit, but that hasn’t stopped him from making promises about Mars. The Tesla CEO’s wild, ambitious futurism has earned him billions of dollars, fawning coverage from the tech press, and a subreddit devoted entirely to discussion of his divinity.

But it has also now landed him and his company in the awkward position of explaining why a feature they called “autopilot” should not be blamed for failing to prevent one of their vehicles from driving under a truck in May, killing 40-year-old Joshua Brown, the first known fatality involving a self-driving car.

Brown was a Tesla enthusiast with “a need for speed”, according to reports by the Associated Press. He had previously posted a video to YouTube of the vehicle avoiding an accident while in autopilot mode, which Musk retweeted.

In a blog post on Thursday about the incident, Tesla said: “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.”

The driver of the truck told the AP that he heard a Harry Potter movie being played from the wreck of the car, and the Florida sheriff’s department confirmed that a portable DVD player was found in the car, though it is still not known whether Brown was watching a film at the time of the crash.

The National Highway Traffic Safety Administration is investigating the accident.

Choice of language

Since Tesla launched Autopilot’s enhanced cruise control features in October 2014, and especially since Autopilot was updated to include lane-changing capability in October 2015, the company has been careful to stress that drivers remain responsible and liable for any collisions.

The company pointed out that this was the first fatal accident in 130m miles driven by all Tesla cars under the control of the autopilot feature, compared with a US average of one fatality every 94m miles. It also pointed out that autopilot is still in a “public beta phase”.

Joshua Brown, the first person to die in a self-driving car accident. Photograph: Facebook

But other companies have chosen more conservative descriptions for their assisted cruise control. Volvo calls its current features “semi-autonomous tech” or “park assist pilot”, which analysts say are more accurate descriptors than the “autopilot” term Tesla uses. Mercedes calls its package “driver assistance” and “drive pilot”, Toyota has a “safety sense” feature, and Honda calls it “sensing”.

The fine print of the Tesla Model S owner’s manual includes disclaimers for all manner of conditions and scenarios, including heavy rain, snow, fog, and direct sunlight, that appear to cover the circumstances of the 7 May accident.

But in January, just a few months after he chastised drivers for posting “some fairly crazy videos” of hands-free hijinks on YouTube, Musk told reporters that the Model S was “probably better than humans at this point” in highway driving. Before the updated autopilot was released, he said that the car was “almost able to go [between San Francisco and Seattle] without touching the controls at all”.

In April, he distracted fans and reporters from the news of a recall of 2,700 Model X SUVs with a joking tweet comparing autopilot to the video game Mario Kart.

Talulah Riley, Musk’s wife, shared and then deleted an Instagram video of herself driving on the highway between Los Angeles and San Diego without holding the wheel.

How autonomous?

Many of Tesla’s current owners are tech enthusiasts who are eager to try out the most cutting-edge vehicles. And when Musk products are presented as the forefront of the autonomous revolution, “the perception of some of those drivers is, ‘I’m already there. I’m already living it,’” said Karl Brauer, senior analyst with Kelley Blue Book, an automotive research company.

Far from being there, however, Tesla’s semi-autonomous technology is in what Bryant Walker Smith, an assistant professor of law and engineering at the University of South Carolina, called the “mushy middle” of the automation spectrum, where vehicle-assisted driving is increasing in scope but is not yet capable of replacing the driver completely.

Smith, who specializes in the legal ramifications of autonomous cars, said that this middle ground has unique dangers: “The difficulty comes if [issues the computer can’t handle] happen so infrequently. Say it can do 99% but not 100%; then people are not ready for the 1%. We see problems of under-stimulation. People get bored, they start texting or watching a DVD.”

Smith warned against reading too much into this single instance or blaming Tesla, pointing to the existing carnage on American roads, which kills 100 people every day: “We don’t know how many of those could have been prevented if those vehicles had automatic braking, or a Tesla Autopilot system.”

However, Brauer said it was time for Musk to shift his messaging to clearly acknowledge the limitations of the cars. “These are not self-driving cars. They are driver assist features. Honda and Toyota have been really clear about that from day one … It really is important.”

Brauer said Musk’s colorful remarks about the technology often present an exaggerated and dangerous image of the vehicles’ capabilities. “There’s been potentially mixed messages between a disclaimer in the car that says you’ve got to be alert the whole time and this thing is being beta tested, and then his comment that the car is probably better than humans.”

The term “autopilot” draws a dangerous link to flying, said Mary “Missy” Cummings, a Duke University robotics professor and former military pilot.

“Most car companies love to equate the driving experience with a cool flying experience, but they need to step away from that parallel,” said Cummings, who has been outspoken about the dangers of self-driving cars and the need for tighter restrictions. “I get to fly a fighter jet because I’m highly trained,” she said. “This accident really highlights the fact that people do not understand the capabilities and limitations of these vehicles.”

Elon Musk introducing the falcon wing door on the Model X electric sports-utility vehicle in Fremont, California, in 2015. Photograph: Stephen Lam/Reuters

How humans think

For now, the crash makes it clear that Tesla should shut off its autopilot feature for certain high-speed driving on freeways, said Cummings. “Either fix it or turn it off … The car was in a place where the computer was blind. The computer couldn’t see the environment for what it was.”

Some experts, however, said the reality is that humans often fail to follow safety recommendations and tend to do a poor job of perceiving risk. “That’s not a judgment on any individual. That’s human nature. This could be any one of us,” said Don MacKenzie, professor of civil and environmental engineering at the University of Washington.

Ragunathan Rajkumar, a professor in the engineering department at Carnegie Mellon University, agreed. “The question is whether the humans read, understand, and follow what is said on the screen, which I think most people tend to ignore pretty quickly,” he said.

“The other issue is that we humans tend to be overconfident with something based on limited experience.” He pointed to the video Brown posted, in which the system avoided a collision, as potential evidence that Brown had become overconfident in the system’s abilities.

The thinking, he said, goes: “This did something good for me, so it will be reliable and good for me in every other context.” That leap in confidence comes naturally to humans, but it doesn’t necessarily mean the technology will work in every situation, Rajkumar said. “It’s a weakness in how humans think.”

Representatives for Tesla declined to comment further.
