Musk condemns “crazy” Tesla Autopilot videos

Since Tesla’s Autopilot launched last month, we’ve seen a range of terrifying videos of increasingly stupid people abusing some seriously clever tech. Although the feature is intended to make motorway driving easier, videos on YouTube have shown Tesla Model S owners using Autopilot mode on country roads – and even with nobody in the driver’s seat.

In response, Elon Musk has vowed to prevent similar dangerous situations, saying he wants to put “constraints” in place to stop them happening again.

Elon Musk responds

Musk addressed the recent emergence of videos on an earnings call with investors, saying “I do want to emphasise, we discouraged it. There’s been some fairly crazy videos on YouTube – this is not good.”

“We will be putting some additional constraints on when Autopilot can be activated to minimise the possibility of people doing crazy things with it.”

Although he wasn’t clear on what form the constraints would take, there are several possible solutions that Tesla could implement fairly easily. One likely constraint is an occupancy sensor that would only engage the semi-autonomous features when a driver is at the wheel.

What’s more, the same GPS functionality used to navigate the car could ensure that Autopilot is only used on motorways, and not narrow country roads.
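To make the idea concrete, the two constraints described above boil down to a simple activation gate. The sketch below is purely illustrative – the function name, inputs and road classification are our own assumptions, not Tesla’s actual software:

```python
# Hypothetical sketch of the constraints discussed above.
# Nothing here reflects Tesla's real Autopilot code; the names
# and road-type labels are invented for illustration.

MOTORWAY = "motorway"

def autopilot_allowed(driver_seat_occupied: bool, road_type: str) -> bool:
    """Allow Autopilot only with a driver present and on a motorway."""
    if not driver_seat_occupied:
        # Occupancy-sensor check: no driver, no Autopilot.
        return False
    if road_type != MOTORWAY:
        # GPS-derived road classification: block narrow country roads.
        return False
    return True
```

In this toy version, `autopilot_allowed(True, "motorway")` returns `True`, while an empty seat or a country lane returns `False` – exactly the behaviour the proposed constraints would enforce.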

Should Autopilot have been released?

Tesla’s Autopilot may have highlighted the stupidity of certain owners, but it’s also emphasised the responsibility manufacturers will bear when releasing semi-autonomous features. It’s clear that, although extremely intelligent, Tesla’s self-driving software isn’t foolproof – and it needs to be if it’s going to be in the hands of the public.

To find out how far away we are from a self-driving future, read: How far away are we really from autonomous cars?

In the first video, Tesla’s semi-autonomous Autopilot is being used on a narrow, winding road rather than a motorway. Alarm bells should already be ringing.

Nevertheless, the Model S’ Autopilot feature appears to function normally at first, then suddenly disengages and hands control back to the driver. Instead of taking the wheel, they continue to film, and the Tesla veers into the path of an oncoming car. As you’d expect, the video ends abruptly after that.

In another video, the driver leaves the Autopilot mode on after leaving the motorway. The result? Not great.

Tesla Autopilot: What is it?

Launched earlier in the month to Tesla owners in North America, Autopilot allows the Model S to autosteer, change lanes on motorways, adjust speed depending on the traffic and even parallel park semi-autonomously. The software was initially trialled in August by a group of 600-700 beta testers, and uses a radar system and 12 sensors to survey the car’s surroundings.

Despite being subject to a lengthy trial period, not all of Autopilot’s mistakes have been the result of user error. A quick look at the Tesla Motors Club forums reveals several instances of Autopilot going wrong, even when used in the conditions for which it was designed.

Even before launch, Elon Musk advised caution over the new Autopilot features. Rather than relying solely on the software, the Tesla boss said the Autopilot mode was designed to improve confidence and enjoyment of motorway driving. “We’re being especially cautious at this early stage, so we’re advising drivers to keep their hands on the wheel just in case,” he added.

So pretty much the opposite of stuff like this, then.

Humans are the weakest link

When Google’s car had its first injury-causing accident, it wasn’t because of a software glitch or a sensor failure – it was caused by a human not paying attention. In the same way, the majority of “Autopilot malfunctions” are being caused by drivers failing to understand the capabilities of the Autopilot system.

But what about the genuine issues with Tesla’s autonomous software? Simply put, they’re a necessary and expected part of the development of autonomous technology. One day, cars will be wirelessly connected to each other, and their movement will be mapped and regulated by a central, cloud-based or ad hoc system. In the meantime, current autonomous systems use a combination of sensors and software to intelligently react to the road – just like a human would.

Tesla’s Autopilot uses deep learning to achieve its artificial intelligence, and this learning process is constant. With each mile covered, Tesla’s system will improve – and, in a few years’ time, it could be as good as any human driver.


“Each driver is a trainer in how the Autopilot should work,” Musk told VentureBeat. “There are some things that the vehicle may not be completely prepared for, like always recognising stop signs and traffic lights, but future versions will address these issues.” On the launch of the Autopilot features, Musk added: “Long term it will be way better than a person. It never gets tired, it’s never had anything to drink, it’s never arguing with someone in the car. It’s not distracted.”

Until then? It’s probably best to read – and follow – the instructions first.