Your experience on road trips mostly matches mine (2023 Model Y, HW4), but on local trips in the Midwest I'd estimate I make at least one safety-critical disengagement a month (under a thousand miles). The frustrating thing is that it's difficult to mentally model where it struggles, and there are often behavioral regressions between releases. Maybe this is what it's like to be a driver's ed instructor, constantly being driven around by a young driver with different strengths and weaknesses.
The past month has been particularly bad, where it’s struggled with timidity and hallucinations. Some recent disengagements I remember:
- It started to brake and pull over twice due to flashing (nonemergency) lights on the other side of a stroad
- Yesterday it had a green right turn but braked harshly when an opposing car started initiating their unprotected left, infuriating the car behind us (then it randomly activated and cancelled its turn signal twice while trying to decide which freeway lane was correct, and the same car following us really got road rage)
- Today it braked harshly in a roundabout when another car was safely starting to creep in
Those were split between my wife and me driving, and neither of us was able to react quickly enough to cancel the braking, which was unwarranted and confusing. Fortunately no one has been tailgating us during these events, and the worst damage FSD has caused us was just curbing a wheel while parking.
Overall the Tesla and FSD are quite impressive and I love having it, but it's too easy to get a false sense of security, and it's plausible to me that it could require *more* driving experience to use FSD safely than it does to just drive the Tesla normally.
Oh, I forgot another disengagement: last week it drove past some new but covered traffic lights; once it got close, I guess it recognized them, assumed there was no electricity or something, and decided to slam on the brakes.
Ah, thank you for these examples! I've been driving around town in Portland today and seeing some errors but nothing as bad as you've described and nothing safety-critical.
I guess all of your examples are incorrect or phantom braking, which can be dangerous, I agree. The glass-half-full take is that it has false positives but no false negatives that either of us have seen. Namely, no cases (so far) where it needs to brake but doesn't. Previous versions of Tesla's soi-disant self-driving certainly had that problem.
Yeah, I won't be considering buying a Tesla until Elon is 100% out of the company.
A few more notes from the original trip. I had misplaced these when I wrote this post originally.
1. More instances of oblivion to speed limit signs. (I hear this is improved in FSD version 14.3.)
2. Small demerit for thinking that a car up ahead in the right lane of a divided highway, in the glare of the sun, was oncoming traffic. It moved to the left lane to pass, saw that car, and you could just feel it thinking "oh shit" and scurrying back into the right lane. It repeated that a couple times. In the car's defense, with the glare of the sun it kinda did look like oncoming traffic. Just that I, with my human brain, could deduce that it wasn't really. The car was like the Memento guy about it. But also in the car's defense: better safe than sorry!
3. Bigger demerit for seemingly not seeing an 18-wheeler's turn signal and passing it when it would've been politer and safer to wait.
4. Presumably similar to the tar snake situation, but from earlier in the trip and I'm not sure what it might've seen on the road: another instance where it seemed to deftly dodge a ghost.
5. I mentioned this one vaguely in the post and it was just a navigation issue, but I was amused by how Bethany put it: driving us across the wrong side of the railroad tracks in the dark to murder us in a cornfield. (Maybe you had to be there.)
6. The self-parking was better than the internet had led me to expect but there were cases where it would awkwardly back into some obscure spot when there were many perfectly good spots it could've pulled straight into.
How much does the driving quality depend on the hardware? Like a 3-year-old Model X vs. a brand new Model 3 or whatever?
You mean the self-driving? I think it's a drastic difference, but perhaps mostly because the new software version (v14) is only available for the new hardware (HW4).