We’re not sure how “shadow mode” really works, or even whether it’s running in HW2 Teslas right now. According to verygreen on TMC, there are apparently trigger conditions, like suddenly changing the steering angle by more than a certain amount, that cause the car to upload some kind of data to the cloud. Maybe this data could be used at Tesla headquarters to simulate the situation and see how Autopilot would have handled it.
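To make the trigger idea concrete, here’s a minimal sketch of what a steering-angle trigger might look like. This is purely illustrative: the function name, the threshold, and the per-sample framing are all invented, not anything known about Tesla’s actual firmware.

```python
# Hypothetical upload trigger: flag moments where the steering angle
# jumps by more than some threshold between consecutive samples.
# The threshold value here is an arbitrary assumption.
STEERING_DELTA_THRESHOLD_DEG = 15.0

def find_trigger_events(steering_angles_deg):
    """Return sample indices where the steering angle changes sharply."""
    events = []
    prev = None
    for i, angle in enumerate(steering_angles_deg):
        if prev is not None and abs(angle - prev) > STEERING_DELTA_THRESHOLD_DEG:
            events.append(i)  # in a real system: queue a data snapshot for upload
        prev = angle
    return events

# Example: a smooth drive with one sudden correction at index 3
angles = [0.0, 1.0, 2.0, 25.0, 26.0]
print(find_trigger_events(angles))  # → [3]
```

In a real car this check would presumably run continuously on streaming CAN data, with the flagged moments bundled into a snapshot (camera frames, radar, controls) for upload.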
At Tesla HQ, if the engineers have all the sensor data from a given moment on the road, they can feed that sensor data into an instance of Autopilot and observe the output it gives. Autopilot can’t tell the difference between real, live sensor data and replayed, pre-recorded sensor data. It would then output some actuation of the steering wheel, accelerator, and brake, which can be compared with what the human driver actually did in the same situation. This is just a hypothetical idea of how “shadow mode” might actually work in practice.
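The replay idea above can be sketched as a small harness: a stand-in policy function receives frames of recorded sensor data (it has no way to know they aren’t live) and its outputs are diffed against the logged human controls. Every name, field, and threshold here is a made-up assumption for illustration, not Tesla’s actual interface.

```python
def autopilot_policy(sensors):
    """Stand-in for Autopilot: it only sees sensor data, so it cannot
    tell whether the frame is live or replayed from a log."""
    # Toy logic: steer back toward lane center (field name is assumed)
    return {"steering": -0.5 * sensors["lane_offset_m"],
            "throttle": 0.1,
            "brake": 0.0}

def replay_and_compare(logged_frames):
    """Feed recorded frames to the policy and flag disagreements
    with the human driver's logged steering input."""
    disagreements = []
    for t, frame in enumerate(logged_frames):
        ap = autopilot_policy(frame["sensors"])
        human = frame["human_controls"]
        steer_diff = abs(ap["steering"] - human["steering"])
        if steer_diff > 0.2:  # assumed disagreement threshold
            disagreements.append((t, steer_diff))
    return disagreements

log = [
    {"sensors": {"lane_offset_m": 0.0}, "human_controls": {"steering": 0.0}},
    {"sensors": {"lane_offset_m": 1.0}, "human_controls": {"steering": 0.3}},
]
print(replay_and_compare(log))
```

The interesting part for Tesla would be the disagreement list: moments where Autopilot would have acted very differently from the human are exactly the cases worth studying (or using as training signal).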
If this isn’t how shadow mode is supposed to work, and it’s really something that runs live in the car, noticing differences between its own steering, accelerator, and brake outputs and what the human is doing with the controls, then perhaps shadow mode will have to wait until HW3 and Software V10 alpha.