I’m intrigued and a bit puzzled by (Mobileye CEO) Amnon Shashua’s comments in this clip:
Here’s the full video. The technology part of Shashua’s talk starts around 48:30:
The part I find puzzling is that Shashua says the mean time between fatalities for human drivers is about 1 million hours. I’m feeling sleepy today so I might be getting this math wrong, but, at least in the U.S., a driving fatality occurs on average 1.13 times per 100 million miles or, by my calculation, once per 88.5 million miles. If the average time between fatalities is 1 million hours, then, by my math, that would imply an average driving speed of 88.5 miles per hour, which seems impossible. What gives?
AAA’s statistics imply an average U.S. driving speed of 37 miles per hour. If we assume this is correct, that would mean there’s a fatality every 2.4 million hours of driving. (Since 88.5 million miles per fatality / 37 mph = 2.4 million hours per fatality.) Is Shashua just rounding this number to the closest power of ten? Is he using non-U.S. statistics?
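The arithmetic above can be double-checked in a few lines. The 1.13 fatalities per 100 million miles and 37 mph figures are the ones cited above; everything else is derived:

```python
# Figures cited in the text
fatalities_per_100m_miles = 1.13
avg_speed_mph = 37

miles_per_fatality = 100e6 / fatalities_per_100m_miles  # ~88.5 million miles

# If humans really averaged 1 million hours between fatalities,
# the implied average driving speed would be:
implied_speed_mph = miles_per_fatality / 1e6
print(round(implied_speed_mph, 1))  # 88.5 -- implausibly fast

# At a 37 mph average speed, the mean time between fatalities is instead:
hours_per_fatality = miles_per_fatality / avg_speed_mph
print(round(hours_per_fatality / 1e6, 1))  # 2.4 (million hours)
```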
Shashua says the mean time between failures (MTBF) for Mobileye’s automatic emergency braking (AEB) system is “once every tens of thousands of hours of driving”. Presumably, this means 20,000+ hours.
Per NHTSA, the average miles between injuries on U.S. roads is 1.2 million miles. At an average speed of 37 miles per hour, a system’s mean time between injuries would have to be 33,000+ hours to match or exceed human safety.
The average miles between collisions in the U.S. is about 500,000 miles. At an average speed of 37 miles per hour, a system would need a mean time between crashes of 14,000+ hours to match or beat humans.
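The same conversion, applied to the injury and collision rates above (again assuming the 37 mph average speed):

```python
avg_speed_mph = 37
miles_per_injury = 1.2e6     # NHTSA figure cited above
miles_per_collision = 500e3  # figure cited above

hours_per_injury = miles_per_injury / avg_speed_mph
hours_per_collision = miles_per_collision / avg_speed_mph

print(round(hours_per_injury))     # 32432, rounded up to 33,000 in the text
print(round(hours_per_collision))  # 13514, rounded up to 14,000 in the text
```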
Another way to do these calculations is just to assume Shashua’s figure of 1 million hours per fatality is correct and then use the ratio between fatalities, injuries, and collisions from NHTSA. Injuries are 75x more frequent than fatalities, so a machine needs a mean time between injuries of 14,000+ hours (i.e. 1 million hours / 75). Collisions are 177x more frequent than fatalities, so the machine needs 6,000+ hours between collisions.
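The ratio-based version of the calculation, taking Shashua's 1-million-hour fatality figure at face value and scaling by the NHTSA frequency ratios quoted above:

```python
hours_per_fatality = 1e6       # Shashua's figure, taken at face value
injuries_per_fatality = 75     # injuries ~75x more frequent than fatalities
collisions_per_fatality = 177  # collisions ~177x more frequent

print(round(hours_per_fatality / injuries_per_fatality))    # 13333 hours between injuries
print(round(hours_per_fatality / collisions_per_fatality))  # 5650 hours between collisions
```

Note that 1 million / 75 is closer to 13,000 than 14,000 hours; the text rounds up to the more demanding threshold.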
From the perspective of injuries and collisions, a system that fails “once every tens of thousands of hours of driving” sounds pretty good. It’s possible Shashua is being excessively conservative by using the rate of fatalities rather than injuries or collisions.
Also, it seems arbitrary for Shashua to multiply the 1 million hours figure by 10 in an attempt to exclude fatalities caused by drunk driving and distracted driving. Isn’t it precisely one of the benefits of machines that they don’t get drunk or distracted? Why exclude this safety advantage from consideration?
Rather than the ~1000x improvement Shashua says is needed, I would argue that it’s no more than ~100x, since I don’t agree with Shashua’s decision to 10x the figure in order to exclude drunk and distracted driving.
Based on the injury and collision rates, maybe something more like a ~10x improvement is needed. A mean time between failures of 200,000 hours would be significantly better than humans if ≤100% of failures caused collisions, if ≤100% of failures caused injuries, and if ≤5% of failures caused fatalities.
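The 200,000-hour claim can be checked against the human benchmarks derived earlier (~14,000 hours per collision, ~33,000 hours per injury, ~2.4 million hours per fatality), using the worst-case severity fractions stated above:

```python
mtbf_hours = 200e3  # hypothetical machine mean time between failures

collision_fraction = 1.0   # worst case: every failure is a collision
injury_fraction = 1.0      # worst case: every failure causes an injury
fatality_fraction = 0.05   # worst case stated: 5% of failures are fatal

# Each implied machine rate beats the corresponding human benchmark:
assert mtbf_hours / collision_fraction > 14e3   # vs ~14,000 hours/collision
assert mtbf_hours / injury_fraction > 33e3      # vs ~33,000 hours/injury
assert mtbf_hours / fatality_fraction > 2.4e6   # 4M vs ~2.4M hours/fatality
print("200,000-hour MTBF clears all three human benchmarks")
```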
Even a 2x improvement would be better than the human average in the U.S. if the ratio of machine-caused injuries and collisions to machine-caused fatalities were the same as the ratio of human-caused injuries and collisions to human-caused fatalities.
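A rough check of the 2x claim, under two assumptions not stated explicitly above: that "2x" means doubling the ~20,000-hour AEB figure to 40,000 hours, and that every failure is a collision whose severity is distributed like human crashes (177 collisions : 75 injuries : 1 fatality, per the NHTSA ratios used earlier):

```python
# Assumption: 2x improvement over the ~20,000-hour AEB MTBF
hours_per_machine_collision = 2 * 20e3  # 40,000 hours

# Assumption: machine failures have the same severity mix as human crashes
hours_per_machine_injury = hours_per_machine_collision * 177 / 75
hours_per_machine_fatality = hours_per_machine_collision * 177

assert hours_per_machine_collision > 14e3   # human: ~14,000 hours/collision
assert hours_per_machine_injury > 33e3      # human: ~33,000 hours/injury
assert hours_per_machine_fatality > 2.4e6   # human: ~2.4M hours/fatality
print("2x over the AEB MTBF would beat all three human benchmarks")
```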
Please let me know if I’ve made a math error or reasoning error. I could be missing something.