Tesla warning over self-driving autonomous technology

By Laura Berrill
Tesla warns customers its full self-driving autonomous software "may do the wrong thing at the wrong time" following a number of car crashes

Tesla has stressed that drivers must remain attentive behind the wheel, even when Autopilot is engaged. The newest version of the software was delayed due to technological challenges, and it does not fully control the car; it only offers drivers some assistance.

Earlier this year, a Tesla motorist was spotted in the back seat of his vehicle as it travelled down a freeway in the San Francisco-Oakland Bay Bridge area and was subsequently arrested for reckless driving. Tesla cars can be 'easily tricked' into driving in Autopilot mode with no one at the wheel, according to testers from a US consumer organisation. This was revealed just two days after a Tesla crashed in Texas, killing both men inside the car. Officials said neither of the men was in the driving seat at the time of the crash.

The Autopilot setting on a Tesla enables the car to 'steer, accelerate and brake automatically within its lane', but it is not the full self-driving capability, which, once completed, is said to allow the car to conduct short and long-distance journeys without intervention from the person in the driving seat. However, Elon Musk has warned that there will be 'unknown issues' and advised drivers 'to be paranoid'.

Levels of autonomy

There are six levels of vehicle automation, starting at level zero, and Tesla's current Autopilot feature sits at level two: partial automation. Although the cars can steer and control acceleration, a human is still required behind the steering wheel, ready to take control at any time.

A number of car crashes have been reported in which drivers were using the Autopilot feature and not paying attention to the road, and some of these incidents are being investigated by the US National Transportation Safety Board.

In one incident, an Apple engineer died when his Tesla Model X, while in Autopilot mode, hit a concrete barrier; he had previously complained about the setting's tendency to veer towards obstacles. At level five, by contrast, there would be no need for a steering wheel or pedals to control speed or braking. The idea is that such vehicles could be capable of doing anything an experienced human driver could do, including going off road.
