Despite the chatter on the web about the May 7 fatal crash of a Tesla operating on a Florida highway under the control of its Autopilot software, nothing definitive can be learned about the event until NHTSA completes its investigation and issues its conclusions and recommendations. That is, unless Tesla releases more of what it already knows from the remote monitoring understood to be active at all times on all Tesla vehicles of the type that crashed.

The Florida Highway Patrol crash report and the witness accounts reported since the crash became public knowledge on June 30 provide a few clues as to what happened. While descending a slight grade at highway speed, the car struck a semi-trailer crossing the highway; the impact took off the top half of the car, and the bottom half continued at a good rate of speed for hundreds of yards. The driver, Joshua Brown, a U.S. Navy veteran, was dead when authorities arrived.

We do not know whether Mr. Brown became aware of the impending collision at any point before it happened, though Tesla reports that neither he nor the car applied the brakes. Nor do we know whether he tried to exercise any other control options available to him in the final split seconds.

Much key information about the incident has not been revealed, most importantly the speed of the Tesla in the last minute of the driver’s life (estimated in the FHP report as 65 mph) and in the immediate aftermath, as the car continued to its final resting place, perhaps still under automated control. Tesla has revealed virtually none of the many status variables recorded for the crashed vehicle in its final minutes and seconds. Presumably they are all available, given that the evolution of Autopilot is guided by intensive remote monitoring of how drivers use it, both those experienced in its use (as Mr. Brown was) and those less experienced. Tesla did go on record that the sensor-software combination, including radar and video, that is supposed to see most obstacles on the road ahead did not interpret the semi-trailer as an obstacle.

Much could be learned from how the capabilities of the car’s Autopilot are described to owner-operators through various sources of information. Investigators need to analyze how the driver and his vehicle behaved over the several months of experience they had together. Mr. Brown made two dozen YouTube videos with commentary during the months he was obviously enjoying his new Tesla while testing how it performed. These videos are important clues in the investigation now underway, especially considering his multiple comments on the limitations of Autopilot’s capabilities. The blood drawn from Mr. Brown after his death will likely have been tested for evidence of impairment.

Apart from this crash and this driver, there is much other evidence about the capabilities and limitations of the Autopilot in its operation since late 2015. Numerous examples have been informally documented on the web.

This event and the pending results of the investigation will bring new pressure (and new uncertainty) on the U.S. government, and on other governments worldwide, to regulate the process of developing and testing cars that can drive themselves, as well as the features of the cars themselves. In the aftermath of this tragedy, which is now almost two months old, we urge NHTSA and Tesla to proceed expeditiously with the follow-on investigation.

At the same time, we urge all others to withhold judgment on what this incident means for development and deployment of new automated driver assistance capabilities.

Tesla is the exception here. Since Joshua Brown left behind documented video evidence that he was a skilled user of the Autopilot in his Tesla, it is incumbent upon the car maker to explain immediately to existing and future owners of this feature why it has not already suspended Autopilot on all deployed vehicles, something it could have done the day it learned of this crash from its vehicle monitoring. Tesla surely must have data and logic on hand that support its decision not to do so. NHTSA should order Tesla to release this information and, if it is not forthcoming (perhaps because of pending lawsuits), order Autopilot disabled nationwide until its investigation of the crash is complete.

In the balance between progress in vehicle automation and the immediate safety of all roadway users, immediate safety is paramount. Should a second crash occur in the wake of the first, traceable to a similar set of circumstances, the effect on our ability to pursue long-term progress in vehicle safety and automobility would be crippling. We must continue developing automation, but we must do so prudently.

 

Related:

http://www.nytimes.com/2016/07/02/business/a-fatality-forces-tesla-to-confront-its-limits.html

3 comments on “How NHTSA should respond to the Tesla Autopilot crash in Florida”

  • The U.S. National Transportation Safety Board (NTSB) on September 12, 2017, released a report on its investigation into this crash: https://www.ntsb.gov/news/press-releases/Pages/PR20170912.aspx

    Quoting some of the findings in the NTSB’s report:

    The Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

    The Tesla driver’s pattern of use of the Autopilot system indicated an over-reliance on the automation and a lack of understanding of the system limitations.

    If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

    The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement.

    Tesla made design changes to its “Autopilot” system following the crash. The change reduced the period of time before the “Autopilot” system issues a warning/alert when the driver’s hands are off the steering wheel. The change also added a preferred road constraint to the alert timing sequence.
