Tesla warning over self-driving technology

By Laura Berrill
Tesla warns customers its Full Self-Driving software "may do the wrong thing at the wrong time" following a number of car crashes

Tesla has stressed that drivers must remain attentive behind the wheel of their cars, even when Autopilot is engaged. The newest version of the software was delayed by technical challenges, and it does not fully control the car; it only offers the driver assistance.

Earlier this year a Tesla motorist was spotted in the back seat of his vehicle as it travelled down a freeway near the San Francisco-Oakland Bay Bridge, and was subsequently arrested for reckless driving. Tesla cars can be 'easily tricked' into driving in Autopilot mode with no one at the wheel, according to testers from a US consumer organisation. This was revealed just two days after a Tesla crashed in Texas, killing the two men inside the car. Officials said that neither man was in the driver's seat at the time of the crash.

The Autopilot setting on a Tesla enables the car to 'steer, accelerate and brake automatically within its lane', but it is not the Full Self-Driving capability, which, once complete, is said to allow the car to make short and long-distance journeys without intervention from the person in the driver's seat. However, Elon Musk has warned that there will be 'unknown issues' and has advised drivers 'to be paranoid'.

Levels of autonomy

There are six levels of driving automation, starting at level zero, where the human does everything. Tesla's current Autopilot feature sits at level two, partial automation: although the cars can steer and control acceleration, a human must still sit behind the steering wheel, ready to take control at any time.
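
To make the taxonomy concrete, here is a minimal sketch that encodes the six SAE J3016 automation levels as a Python enum. The level names follow the SAE standard; the helper function and its name are hypothetical illustrations for this article, not Tesla code.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels (illustrative)."""
    NO_AUTOMATION = 0           # Human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # One assist at a time, e.g. adaptive cruise control
    PARTIAL_AUTOMATION = 2      # Steering plus speed control; driver must supervise (Tesla Autopilot)
    CONDITIONAL_AUTOMATION = 3  # System drives in limited conditions; driver takes over on request
    HIGH_AUTOMATION = 4         # No driver needed within a defined operating domain
    FULL_AUTOMATION = 5         # No driver needed anywhere; wheel and pedals become optional

def driver_must_supervise(level: SAELevel) -> bool:
    # Hypothetical helper: at level 2 and below, a human must monitor
    # the road and be ready to take control at all times.
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True: Autopilot still needs a driver
```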

There have been reports of a number of car crashes in which drivers using the Autopilot feature were not paying attention to the road; some of these are being investigated by the US National Transportation Safety Board.

In one incident, an Apple engineer died when his Tesla Model X, while in Autopilot mode, hit a concrete barrier; he had previously complained about the setting's tendency to veer towards obstacles. At level five, there would be no need for a steering wheel or pedals to control speed or braking. The idea is that these vehicles could be capable of doing anything an experienced human driver could do, including going off road.

 
