Tesla warning over self-driving technology

By Laura Berrill
Tesla warns customers its Full Self-Driving software "may do the wrong thing at the wrong time" following a number of car crashes

Tesla has stressed that drivers must remain attentive behind the wheel of their cars, even when Autopilot is engaged. The newest version of the software was delayed by technological challenges, and it does not fully control the car; it only offers drivers some assistance.

Earlier this year, a Tesla motorist was spotted in the back seat of his vehicle as it travelled down a freeway in the San Francisco-Oakland Bay Bridge area and was subsequently arrested for reckless driving. Tesla cars can be 'easily tricked' into driving in Autopilot mode with no one at the wheel, according to testers from a US consumer organisation. This was revealed just two days after a Tesla crashed in Texas, killing both men inside the car; officials said that neither of the men was in the driving seat at the time of the crash.

The Autopilot setting on a Tesla enables the car to 'steer, accelerate and brake automatically within its lane', but it is not the Full Self-Driving capability, which, once complete, is said to allow the car to conduct short and long-distance journeys without intervention from the person in the driving seat. However, Elon Musk has warned that there will be 'unknown issues' and advised drivers to be paranoid.

Levels of autonomy

There are six levels of driving automation, running from level zero (no automation) to level five (full automation). Tesla's current Autopilot feature sits at level two, partial automation: although the cars can steer as well as control acceleration, a human must still sit behind the steering wheel, ready to take control at any time.
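For readers who want the taxonomy at a glance, here is a minimal illustrative sketch in Python of the six SAE J3016 levels and why level two still demands a supervising driver. The class and function names are my own invention for illustration, not anything from Tesla's or SAE's software.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 driving automation levels, 0 through 5."""
    NO_AUTOMATION = 0           # human performs all driving tasks
    DRIVER_ASSISTANCE = 1       # system supports steering OR speed, not both
    PARTIAL_AUTOMATION = 2      # system supports steering AND speed; driver supervises
    CONDITIONAL_AUTOMATION = 3  # system drives, but driver must take over on request
    HIGH_AUTOMATION = 4         # no driver needed within a limited operating domain
    FULL_AUTOMATION = 5         # no driver, steering wheel or pedals needed anywhere

def driver_must_supervise(level: SAELevel) -> bool:
    # At levels 0-2 a human must watch the road at all times.
    # Tesla's Autopilot is level 2, hence the attention warnings.
    return level <= SAELevel.PARTIAL_AUTOMATION

print(driver_must_supervise(SAELevel.PARTIAL_AUTOMATION))  # True
print(driver_must_supervise(SAELevel.FULL_AUTOMATION))     # False
```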

There have been reports of a number of car crashes involving drivers who were using the Autopilot feature while not paying attention to the road, some of which are being investigated by the US National Transportation Safety Board.

In one incident, an Apple engineer died when his Tesla Model X, while in Autopilot mode, hit a concrete barrier; he had previously complained about the setting's tendency to veer towards obstacles. At level five, there would be no need for a steering wheel or pedals to control speed or braking. The idea is that these vehicles could be capable of doing anything an experienced human driver could do, including going off road.
