A Tesla crashed and caught fire on I-35 near Humble, Texas, on 20 April, killing the driver, a 39-year-old woman, and injuring six others. The driver died at the scene.
Although the NTSB has identified the driver, the agency is still investigating what caused the crash. This week, an agency official said the vehicle’s Autopilot function had been engaged at the time of the crash.
Though investigators have described the death as potentially Autopilot-related, the official said the agency is awaiting additional information before formally attributing the fatal crash to the feature. The National Highway Traffic Safety Administration (NHTSA) has not yet announced a review or identified any safety defects as a result of the crash.
Although Autopilot has figured in four other fatal crashes, in those cases it was not confirmed to be in control at the moment of impact: either a person had taken over from the system, or the system was unable to fully perform its function. That is why the NTSB does not classify those deaths as Autopilot fatalities.
The NTSB’s investigation will also examine why the Tesla’s software has failed to reliably detect pedestrians in its parking and lane-departure detection systems. Tesla previously disclosed that it had launched its own investigation into this issue.
If confirmed, this would be the first fatality caused by a Tesla with Autopilot engaged at the time of the crash. Another recent Tesla crash involved both Autopilot and human error: a driver failed to yield the right of way, and a tractor-trailer subsequently smashed into the rear of the Tesla, injuring the three people in the car.
The NHTSA and the NTSB are still working to determine the cause of the crash, but it is clear that Tesla faces growing regulatory scrutiny over driver safety and its Autopilot feature.