Oliver Cameron: We Now Live in a Driverless World

Oliver Cameron, the CEO of Voyage (a self-driving car startup testing minivans in a private city in Florida), just published this blog post about Waymo’s driverless rides:

I’m curious what he sees as their unique contribution. He predicts all of the “me too” companies will disappear, but I only see three unique approaches: Waymo’s intense data collection and structure, Tesla’s NNs at extreme scale, and Comma (Tesla, but open source and aftermarket).


Voyage is distinct in that it’s operating at low speeds within gated communities in Florida and California. Whether that will result in a long-term technology or business advantage is a fair question.

Waymo’s feat is certainly noteworthy, and history making. The Kitty Hawk allusion is à propos. The possibility of urban driverless passenger transport is no longer in doubt. Still, I can’t shake the feeling that I’m watching a magician on stage. I’m impressed with the performance, but wonder what tricks were used to create the illusion. The fact is, Waymo One cars do their magic on a carefully prepared stage. Until they can do the same on the streets in my neighborhood, I’ll reserve heavy applause.

Waymo AMA on Thursday:

David Silver (Udacity):

Waymo’s response to the key question of what makes its vehicles safe enough to be driverless is, essentially, “trust us”.

And so far that works, at least for Waymo, which has done virtually everything right and caused no significant injuries, much less fatalities, in its ten years of existence.

Were Waymo to continue that trend indefinitely into the future, “trust us” would continue to suffice.

Presumably, though, as Waymo ramps up miles and riders, collisions and injuries will happen. At that point, “trust us” probably won’t seem so sensible.

Note: I think Waymo actually has caused a serious injury, but that was back in 2011.

The car went onto a freeway, where it travelled past an on-ramp. According to people with knowledge of events that day, the Prius accidentally boxed in another vehicle, a Camry. A human driver could easily have handled the situation by slowing down and letting the Camry merge into traffic, but Google’s software wasn’t prepared for this scenario. The cars continued speeding down the freeway side by side. The Camry’s driver jerked his car onto the right shoulder. Then, apparently trying to avoid a guardrail, he veered to the left; the Camry pinwheeled across the freeway and into the median. Levandowski, who was acting as the safety driver, swerved hard to avoid colliding with the Camry, causing Taylor to injure his spine so severely that he eventually required multiple surgeries.

The Prius regained control and turned a corner on the freeway, leaving the Camry behind. Levandowski and Taylor didn’t know how badly damaged the Camry was. They didn’t go back to check on the other driver or to see if anyone else had been hurt. Neither they nor other Google executives made inquiries with the authorities. The police were not informed that a self-driving algorithm had contributed to the accident.

The only confirmed serious injury in that incident was from the human intervention. (By the way, side note: do NOT ride in a Tesla while using Smart Summon. It makes similarly herky-jerky, painful motions because it assumes nobody is in the vehicle, and could cause serious whiplash.) And “boxing in” a car trying to merge isn’t the responsibility of the car already on the freeway. It should read “Merging Camry failed to yield to traffic.” It might be polite to let someone in, but it’s not your responsibility to make space.

The blog post says that, in 2020, Waymo plans on “continuing to responsibly scale our fully driverless operations in our early rider program and begin offering that experience to more of our riders.”

The Verge rides in a driverless Waymo: