Tesla warning over self-driving autonomous technology

By Laura Berrill
Tesla warns customers its Full Self-Driving software "may do the wrong thing at the wrong time" following a number of car crashes

Tesla has stressed that drivers must remain attentive behind the wheel, even when Autopilot is engaged. The newest version of the software, delayed by technological challenges, does not fully control the car; it only offers drivers some assistance.

Earlier this year a Tesla motorist was spotted in the back seat of his vehicle as it travelled down a freeway in the San Francisco-Oakland Bay Bridge area, and was subsequently arrested for reckless driving. Tesla cars can be 'easily tricked' into driving in Autopilot mode with no one at the wheel, according to testers from a US consumer organisation. This was revealed just two days after a Tesla crashed in Texas, killing both men who were inside the car. Officials said that neither of the men was in the driving seat at the time of the crash.

The Autopilot setting on a Tesla enables the car to 'steer, accelerate and brake automatically within its lane', but it is not the Full Self-Driving capability, which, once complete, is said to allow the car to conduct short and long-distance journeys without intervention from the person in the driving seat. However, Elon Musk has warned that there will be 'unknown issues' and advised drivers 'to be paranoid'.

Levels of autonomy

There are six levels of driving automation, running from level zero (no automation) to level five (full automation). Tesla's current Autopilot feature sits at level two, partial automation: although the cars can steer and control acceleration, a human must still sit behind the steering wheel, ready to take control at any time.

There have been reports of a number of crashes involving drivers who were using the Autopilot feature and not paying attention to the road; some of these are being investigated by the US National Transportation Safety Board.

In one incident, an Apple engineer died when his Tesla Model X, while in Autopilot mode, hit a concrete barrier; he had previously complained about the setting's tendency to veer towards obstacles. At level five, by contrast, there would be no need for a steering wheel or pedals to control speed or braking. The idea is that such vehicles could be capable of doing anything an experienced human driver could do, including going off-road.


