Elon Musk, quoted at 11:16 in his first Lex Fridman interview (April 2019):

Well, there’s a lot of things that are learnt. There are certainly edge cases where say somebody’s on Autopilot and they take over. And then, okay, that’s a trigger that goes into our system that says, okay, did they take over for convenience, or did they take over because the Autopilot wasn’t working properly.

There’s also like, let’s say we’re trying to figure out what is the optimal spline for traversing an intersection. Then, the ones where there are no interventions are the right ones. So you then say okay, when it looks like this, do the following. And then you get the optimal spline for navigating a complex intersection.

FSD Beta testers are reporting that the system struggles to find the correct spline for left turns at intersections. But, by the logic Elon describes above, we can surmise that their interventions are helping label the correct spline.
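The labeling loop Elon describes can be sketched in a few lines. This is purely illustrative, not Tesla's actual pipeline: the `Traversal` record, its fields, and the `label_traversals` helper are all hypothetical. The idea is simply that traversals with no takeover (or a takeover judged to be for convenience) are kept as examples of the correct spline, while takeovers caused by Autopilot misbehaving mark the spline as wrong.

```python
from dataclasses import dataclass

@dataclass
class Traversal:
    """Hypothetical record of one intersection traversal."""
    spline: list[tuple[float, float]]  # sampled (x, y) path through the intersection
    intervention: bool                 # did the driver take over?
    for_convenience: bool              # takeover judged a convenience, not a failure

def label_traversals(traversals: list[Traversal]):
    """Split traversals into positive and negative spline examples.

    No intervention, or a convenience-only takeover, counts as a
    correct spline; a takeover caused by Autopilot misbehaving
    counts as an incorrect one.
    """
    correct = [t.spline for t in traversals
               if not t.intervention or t.for_convenience]
    incorrect = [t.spline for t in traversals
                 if t.intervention and not t.for_convenience]
    return correct, incorrect
```

Aggregating the `correct` set across many drivers is, in this framing, how the fleet converges on the optimal spline for a given intersection.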