Ten accidents would have been caused by the car's software

Jan 22, 2016 11:53 GMT

Google's monthly report for its self-driving car project is in, and according to data recorded by the onboard computers, the car's human drivers intervened 13 times between September 2014 and November 2015 to prevent what would otherwise have been an accident.

In June 2015, Google decided to adopt an open and transparent policy regarding its self-driving car project. Its initial report showed that the car's AI was quite good at avoiding accidents and that the accidents it was involved in were not the computer's fault, but mostly happened because of the car's human driver or other traffic participants.

Back then, the company also agreed to put out monthly reports on the project's status. A few days ago, December's report was released, and it included data on all the traffic events that made the car's AI disengage from autonomous mode and switch back to manual control, as well as on the incidents in which the driver forcibly took control of the car.

The car's AI has needed driver corrections 341 times since September 2014

According to Google's telemetry data, 272 of these disengagements happened because the technology failed to correctly assess traffic conditions and take appropriate action.

When this happened, the driver needed to put their hands on the wheel and steer the car out of the situation. Drivers took an average of 0.84 seconds to take control, and the autonomous mode could be re-engaged soon after.

There were also 69 situations where the driver needed to take "immediate" control of the vehicle because of potential safety-related incidents, such as the car breaking traffic laws, cyclists and pedestrians walking into traffic, and incorrect perception of traffic lights.

All these incidents were analyzed at Google's headquarters, where the company's engineers used powerful simulation software and concluded that 13 of these 69 incidents would have ended in "contact."

In two cases, the car would have collided with traffic cones, and in three other cases the self-driving car would have collided with another vehicle, had the human driver not intervened.

Google says that 10 of the 13 incidents were its technology's fault, while the other three would have been caused by other traffic participants.