I've experienced this before - not only with games such as Tetris (I could "feel" my brain working in a different way) - but also when looking at trees etc. while engrossed in the Lindenmayer systems [1] I was working with at the time.
I assumed it was basically your brain adapting to a "new" reality/situation and engaging the pattern-matching parts that work best for the challenge at hand. Afterwards it remains on just in case you need it again, like a warm boot.
As I type about it, I realize it's likely related to things like anxiety - useful in some situations (such as actual danger) but it remains "on" when it doesn't have to and becomes intrusive.
After I tweaked the tree-outline drawing algorithm for this side project for about 20 hours over 3 days ( https://ajuc.github.io/outdoorsBattlemapGenerator/ ) I've been seeing these outlines EVERYWHERE :)
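For anyone who hasn't run into them: an L-system is just parallel string rewriting. A minimal sketch (the classic "algae" example from Lindenmayer's original paper, not the parent's actual tree algorithm):

```python
# Minimal Lindenmayer-system rewriter: apply the production rules to every
# symbol of the current string in parallel, once per iteration.
def lsystem(axiom, rules, iterations):
    s = axiom
    for _ in range(iterations):
        # symbols without a rule are copied unchanged
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Algae example: A -> AB, B -> A
print(lsystem("A", {"A": "AB", "B": "A"}, 4))  # -> ABAABABA
```

Tree-like shapes come from interpreting the resulting string as turtle-graphics commands (draw, turn, push/pop state), which is where the "seeing outlines everywhere" effect kicks in.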
I am happy to take more comments from those who know Teslas better than I do, but let's base our critiques on the available information (especially when it is indicated in the article).
This appears to be a different take based on the speedometer value. Also, he turned the system on 3 seconds before the wall, while giving the other system much more time. More importantly, why didn't he use FSD? Why is he using legacy Autopilot?
He was originally testing the Automatic Emergency Braking feature, but it couldn't stop in time for even a single obstacle, so he cut them some slack and used Autopilot instead.
I've watched that a couple of times and I'm pretty sure he accidentally cancelled Autopilot by jerking the wheel with his hands. His hand movement lines up with the Autopilot disengagement. I've done that several times myself; the wheel is deliberately sensitive to torque as a manual disengagement mechanism.
The footage he posted on X does not fully match the in-car footage on YouTube, so there were evidently multiple runs. It seems unlikely that the mirror wall could be repaired, so I am left to conclude that he kept driving the Tesla at the wall until he crashed through it (manually). That said, I would not be super surprised to find that a carefully crafted mirror wall would confuse Autopilot; that seems to be a logical weakness of a vision-based system.
Thanks for that link. I didn't see the unedited video since it's hidden on Twitter.
Two things to point out:
* The main video clip where he engages Autopilot at 39mph is clearly a different run/different clip. Not inherently bad, just a bit weird, and it seems to imply he enables Autopilot substantially earlier than he actually does in the unedited clip. I don't know why he would activate it so late, and it's possible that affects the outcome slightly.
* The disengagement is clearly from him jerking the steering wheel. You can see him jerk the steering wheel slightly to the left, and the Autopilot disengagement chime is precisely timed with that.
Neither of these is really meaningful to the end result, I suppose. By the time he jerks the wheel and causes the disengagement, the car definitely doesn't have enough distance to stop, so it wouldn't have changed the outcome of crashing into the wall.
That said, I don't think it's a particularly fair comparison to use basic Autopilot instead of their FSD software. Cases like the dense fog or heavy rain could have had different outcomes with FSD, since it slows down in adverse weather conditions. I'd be interested to see how more standard TACC/lane-keep solutions from other competitors fare in these tests. I suspect most would also fail them.