Tesla warning over self-driving technology

By Laura Berrill
Tesla warns customers its full self-driving autonomous software "may do the wrong thing at the wrong time" following a number of car crashes

Tesla has stressed that drivers must remain attentive behind the wheel, even when Autopilot is engaged. The newest version of the software was delayed due to technological challenges, and it does not fully control the car; it only offers drivers assistance.

Earlier this year, a Tesla driver was spotted in the back seat of his vehicle as it travelled down a freeway in the San Francisco-Oakland Bay Bridge area and was subsequently arrested for reckless driving. Tesla cars can be 'easily tricked' into driving in Autopilot mode with no one at the wheel, according to testers from a US consumer organisation. This was revealed just two days after a Tesla crashed in Texas, killing both men who were inside the car. Officials said that neither of the men was in the driving seat at the time of the crash.

The Autopilot setting enables a Tesla to 'steer, accelerate and brake automatically within its lane', but it is not the full self-driving capability, which, once complete, is said to allow the car to make short and long-distance journeys without intervention from the person in the driving seat. However, Elon Musk has warned that there will be 'unknown issues' and advised drivers to 'be paranoid'.

Levels of autonomy

There are six levels of driving automation, starting at level zero (no automation). Tesla's current Autopilot feature sits at level two, partial automation: the car can steer and control acceleration, but a human must still sit behind the steering wheel, ready to take control at any time.
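The six-level scale described above is the SAE J3016 classification. As an illustrative sketch (the level names follow the standard SAE descriptions; the lookup itself is just for reference), it can be expressed as a simple table:

```python
# Illustrative sketch of the SAE J3016 driving-automation levels.
# The level names follow the standard SAE descriptions.
SAE_LEVELS = {
    0: "No automation - the human driver does everything",
    1: "Driver assistance - steering OR speed support",
    2: "Partial automation - steering AND speed, driver must supervise",
    3: "Conditional automation - driver must take over on request",
    4: "High automation - no driver needed within a defined domain",
    5: "Full automation - no driver needed anywhere",
}

def describe(level: int) -> str:
    """Return the description for a given automation level."""
    return SAE_LEVELS.get(level, "Unknown level")

# Tesla's Autopilot, as described above, sits at level two.
print(describe(2))
```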

A number of car crashes have been linked to drivers using the Autopilot feature while not paying attention to the road, and some of these are being investigated by the US National Transportation Safety Board.

In one incident, an Apple engineer died when his Tesla Model X, while in Autopilot mode, hit a concrete barrier; he had previously complained about the setting's tendency to veer towards obstacles. At level five, there would be no need for a steering wheel or pedals to control speed or braking. The idea is that these vehicles could be capable of doing anything an experienced human driver could do, including going off road.
