Tesla shares fall after regulators investigate Autopilot

By Laura Berrill
U.S. regulators launch formal investigation into Tesla’s Autopilot system following crashes

US federal vehicle safety regulators have launched a formal investigation into Tesla’s Autopilot system following a series of crashes that have left at least 17 people injured and one dead. This is according to documents filed by the National Highway Traffic Safety Administration.

Report covers Tesla models with Autopilot built between 2014 and 2021

Autopilot is Tesla’s limited self-driving feature that still requires a human driver to supervise and operate the vehicle. Records of crashes associated with the system go back to January 2018, and the NHTSA said it had identified 11 crashes in which Tesla vehicles have “encountered first responder scenes and subsequently struck one or more vehicles.” Posted yesterday, the report covers an estimated 765,000 Tesla vehicles across models built between 2014 and 2021.

Shares in the company closed down 4.32% on Monday following the NHTSA announcement.

The formal investigation comes just months after the NHTSA and the National Transportation Safety Board said they were looking into the company following a crash in Texas. Recent months have seen several probes into Tesla’s Autopilot, including an investigation opened in March after a Model Y using the system reportedly struck a stationary police car.

Driver responsibility when using Tesla Autopilot

The document reads: “Most incidents took place after dark and the crash scenes encountered included scene control measures such as first responder vehicle lights, flares, an illuminated arrow board, and road cones. The involved subject vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes.”

When engaged, Tesla’s Autopilot system maintains the vehicle’s speed and lane centering, but it does not make the vehicle safe to operate without a driver behind the wheel. Drivers remain responsible for identifying roadway obstacles and responding to maneuvers by nearby vehicles sharing the road.

In April, two people were killed in Texas when a Tesla crashed, hit a tree and went up in flames. Tesla has pushed ahead with technology it terms self-driving, increasing the driver assistance capabilities in some of its cars last fall despite criticism from some safety regulators who questioned whether the technology had been sufficiently tested.
