Two predictions about Tesla and autonomy

There are a number of potential bottlenecks that could slow down Tesla’s progress on autonomy (as I wrote about here). It might be a long while yet until we see Tesla’s data advantage trickle down to Autopilot and to the “Full Self-Driving Capability” software package.

But I predict that by the end of Q1 2022 (i.e. by March 31, 2022) Tesla’s partial autonomy features will operate in more environments (e.g. city streets), cover more driving tasks (e.g. turning at intersections), and handle those tasks more competently than any competing commercial Level 2 system that’s available in production cars from GM, Volkswagen, Toyota, or any other car manufacturer. The difference will be clear enough that reviewers like Consumer Reports and Motor Trend will agree Tesla’s system is the best.

I also predict that by the end of Q1 2022 the popular narrative in the media, among stock analysts, and among consultants that Waymo is the furthest ahead on full autonomy and that Tesla is one of the furthest behind will be called into doubt, if not disproven. By that time, Tesla’s demos (if they’re still just demos) will be at least on par with Waymo’s demos, if not superior. (Note: it’s possible that the prospect of full autonomy in general will be called into doubt, in which case the perception of Waymo leading and Tesla lagging might still disappear.)

What would change these predictions is if another company (such as GM, Mobileye, or Waymo) started emulating Tesla’s large-scale production fleet learning approach or if Tesla for some reason stopped pursuing its current approach (such as bankruptcy, although I consider that unlikely). As long as Tesla remains the only company pursuing this approach and as long as it keeps expanding its fleet by 50,000+ cars per quarter, these predictions hold.

I’d like to post these predictions on a site like Long Bets, but the minimum bet is $200 plus a $50 fee. Anyone know a good alternative?


Buy stock. I always say: if people think FSD will be ready by the end of 2020, they shouldn't buy FSD, they should buy TSLA.

Yeah, I already own stock, but I also want the GLORY of being right (or the ignominy of being wrong).

P.S. I don’t know how to predict when full autonomy will happen. I just feel confident predicting that the large-scale production fleet learning approach will make progress faster than the small-scale test fleet approach.

Today Tesla has 1000x more vehicles than Waymo (600) and over 400x more vehicles than there are self-driving test vehicles in the United States (1,400). If the gap between Tesla and Waymo and between Tesla and the U.S. as a whole continues to widen, then I predict that 2.5 years from now Tesla will have made more progress on autonomy than any other company.
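A quick sanity check on those ratios. Note that the Tesla fleet size is only implied by the post, so the ~600,000 figure below is an assumption derived from the stated 1000x multiple, not a number from the post itself:

```python
# Fleet sizes: Waymo (600) and U.S. test vehicles (1,400) are cited in the post.
# Tesla's total is implied by the "1000x Waymo" claim, so it's an assumption here.
tesla_fleet = 600_000   # assumed: 1000 x Waymo's cited 600 vehicles
waymo_fleet = 600       # cited in the post
us_test_fleet = 1_400   # cited: self-driving test vehicles in the U.S.

print(tesla_fleet / waymo_fleet)    # 1000x Waymo
print(tesla_fleet / us_test_fleet)  # ~429x, i.e. "over 400x" the U.S. test fleet
```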


In my opinion, the best way to get the glory is to publish your thoughts in various places and let your friends, family, and the general public know :sweat_smile:. What you’re doing now is a good way to go about it.

When/if you are right, you can always point back to the article. If you’re wrong, then you can hope that everyone has forgotten about your predictions and has moved on to other things!


Ah, but the rush of winning is heightened by the risk of losing!

This is the conundrum that equity research analysts face, but in reverse! People love to trash them: they catch all the flak when they're wrong and receive none of the credit when they're right. Unfortunately, that comes with the business of making public predictions. That dynamic causes people to become much more cautious and guarded with their bets. The risk/reward changes when there is more on the line, like your reputation.

With that said, I've had moments where I've regretted NOT making my thoughts public (e.g., I once wrote a pretty long article about shorting Facebook shortly before they blew up their quarter, but never published it), but I have not often regretted publishing too much.


When do you think Tesla is going to reach Level 3?

I don’t know if Level 3 really makes sense. If I’m not mistaken, Level 3 means the driver has to take over occasionally, after a short warning. How do you ensure that the driver will respond in an alert state of mind in a timely fashion? Doesn’t that end up just being Level 2?

The big difference between L2 and L3 is that L2 you have to pay attention and L3 you don’t.
My guesses:

  1. People already think Autopilot is L3 for at least a few seconds. Those are the people putting weights on their steering wheels.
  2. Tesla already implies this when the nag comes only every half mile. In stop-and-go traffic the nag is roughly once a minute. So L3 for a minute?
  3. No nag for more than 2 minutes in ideal conditions (freeway, stop-and-go traffic): Tesla will enable this in 2.5 years.
  4. No nag for more than 5 minutes: 3+ years.

You can also put a brick on your gas pedal; that doesn't mean it's safe to do so. To me, this is insanely dangerous behaviour.

I think it’s dangerous to interpret Autopilot as Level 3. Tesla says it’s not.

Thanks for all your analysis strangecosmos! Here and elsewhere… you are the best.

The question for now, I think, is: will Tesla's autonomous capabilities be sufficient by Musk's target date (end of Q4 2020) to enable an at least basically functioning AV taxi fleet (owners who enlist, plus a smaller number of Tesla-owned cars, and perhaps microcars or whatever is made by a partner; scale!)?

I think there is little doubt that the answer is yes, because FSD is not required for this: the car only needs to limp along to the next user, and could be aided by remote drivers if it gets confused. The driver can then drive the car themselves or use Autopilot. But a functioning shared AV taxi fleet will have a much greater effect, on personal mobility anyway, in the near future than being able to sleep in your car on the highway.

L3 makes the most sense to me. Watch a movie. But take over for construction zones within 20 seconds.

L2 makes the least sense and most ripe for abuse. You have to be instantaneously ready.


Is this a problem inflicted by the culture of finance, i.e. is it analysts and professional investors trashing each other? Or is it people outside of professional finance (e.g. financial media, retail investors) taking pot shots?

Outside mostly, although institutional investors sometimes as well (though not to the same degree). Mostly media, retail investors, etc. When you're taking a public stance on an issue that people are betting will go the other way, it can get pretty ugly.

Someone who spoke with an insider told me what Tesla’s definition of feature complete is:
25% of daily commutes can be done without driver intervention.

If they can get to 5% this year I’ll be blown away. I would wager that <5% of NOA trips can be completed without driver intervention.
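That "feature complete" metric, the fraction of commutes completed with zero driver interventions, could be computed from trip logs along these lines. This is a hypothetical sketch; the log format and field names are assumptions, not anything Tesla has published:

```python
# Hypothetical trip log: each entry records driver interventions on one commute.
trips = [
    {"id": 1, "interventions": 0},
    {"id": 2, "interventions": 3},
    {"id": 3, "interventions": 0},
    {"id": 4, "interventions": 1},
]

def intervention_free_rate(trips):
    """Fraction of trips completed without any driver intervention."""
    clean = sum(1 for t in trips if t["interventions"] == 0)
    return clean / len(trips)

# "Feature complete" by the claimed definition would mean this reaches 0.25.
print(intervention_free_rate(trips))  # 0.5 for this toy sample
```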

Do you use Navigate on Autopilot often? I’d be curious to know your experience.

Confirmation-free NoA is wrong as often as it's right, so I've given up on that. Regular NoA is an interesting alpha-quality science experiment. In my experience it misses probably 30% of exits, needlessly suggests lane changes on straight interstates with no interchanges to "follow the route," and veers into oncoming merge lanes. So it's really only comfortable to use in the HOV lane, unless there are HOV-lane side exits, at which point it sometimes slows to 40 mph in the HOV lane, thinking it's on the frontage road.

I've never had a single NoA trip where I didn't need to intervene, even excluding on-ramps, which it also whiffs on nearly every time.

It loves to merge into fast-approaching cars. It won't wait for an opening before turning on the blinker: even when traffic is dense and there's nowhere to go anytime soon, it starts blinking immediately instead of waiting for an upcoming gap.

It can't stay in the lane on sharp turns above 60 mph. It doesn't see turn speed limits either, so it doesn't work if there is a 55 mph corner. It hugs the wrong side of the lane and gets super close to semis.

Its cut-in detector is still mostly worthless, so it continues to accelerate toward cars that are well into valid lane changes.

In stop-and-go it leaves a full car-length gap, and it takes a full 1.5 seconds to start moving again after traffic starts. And with v10, the acceleration jerk as it realizes it's way behind is super high. Stop-and-go is actually lumpier than before.

These are all problems that would be relatively easy to handle with mostly traditional programming. These aren't edge cases; they do a lousy job with bread-and-butter freeway driving.


90% of my daily commute is on Tesla's Autopilot. It includes back roads (curvy) and highway. I don't pay too much attention, but the car continues to flash a signal to keep my hands on the wheel. There are a few tricky spots where I take control and don't trust Autopilot.

The autopilot is slowly getting smarter. It is like watching a child learn.


I have experienced some of the problems you’ve listed, but the dire impression you leave is counter to my own AP experience.

I've driven long distances on interstates using NoA in no-confirmation mode, with average-speed-based lane changes and no lane-change warning other than the sound of the turn signal prior to making a lane change.

The performance of AP depends a lot on traffic density. The thicker the traffic, the more I have to override AP's decisions. Personally, I'm impressed with its lane-changing ability, but it's not perfect. I sometimes have to push on the accelerator to hasten the maneuver. A move into the passing lane with a car approaching from behind at a much higher speed used to be a real problem. It doesn't happen as often now, but when it does I will cancel the lane change before AP aborts the maneuver itself, which can be a bit disconcerting.

Merging with on-ramp traffic works pretty well. AP tends to slow down if there's a car right alongside. This works when the driver of the other car takes the hint, but not so well when that driver slows down too and AP doesn't respond by accelerating to make room behind. I'd also like AP to move over if the left lane is clear. I think this is something that is being learned.

I rarely miss exits or interchanges these days. Where I do have to intervene, the error is consistent, which makes me suspect an error in the map data AP depends on to navigate. It could also be due to GPS location inaccuracy. Map errors may also be why AP gets confused executing driver-initiated lane changes on some sections of secondary divided highways for no apparent reason; the symptom is consistent, extreme tentativeness in executing the maneuver on certain sections of those roadways.

AP is getting better and better at driving through construction zones. I used to have to disengage when entering these areas, but lately it manages quite well even when the lane markings are unclear. Lane positioning also seems improved on wide lanes and when passing semis.

I’m still using AP hardware 2.5, so I think some of the difficulties AP has are simply due to the limits of its processing capacity. Approaching cars and trucks are rendered on the UI quite well at times, and at other times, not well.

The rate of AP improvement has been slow to date, which justifies a prediction of Q1 2022 for FSD. I would have put the date a year earlier, but that's based on what I know about Tesla's AP development strategy plus a large dose of wishful thinking.