I just tried FSD last night in suburban Dallas for the first time, in light traffic, and it was harrowing. It drove in the wrong lanes, almost hit a trash can, accelerated way too fast on side streets, and made a right turn at a red light without stopping or even coming close to slowing down. This was a 3-mile drive.
I've been using autopilot on the freeway for years and that's been mostly fine—but I'm never going to use FSD on streets again.
I’ve been using FSD for a month and my experience with it has been mostly great. I was skeptical initially because of the prevalence of this sort of comment online but it didn’t match my experience. There are predictable situations where it can’t navigate but you get used to them and anticipate them. Otherwise, it is a nice upgrade in functionality over EAP that generally makes the car nicer to operate.
I disengaged twice because I was terrified of what it might do but turned it back on. I'm generally a very early adopter with these kinds of things (I'm the type of person to always turn on "Beta" mode regardless of what I'm working with). So I have a high tolerance for things not working the way they should.
This is unbelievably terrible though. I really regret purchasing it.
Isn't that why lots of jurisdictions have constraints on the learning driver (e.g. graduated licensing of some sort) and/or visibility requirements (e.g. the car has to have a "learner" sticker of some sort) so that other drivers know?
We should require a test of one's ability to drive based on some basic standards before issuing a license. We should come up with a series of rules for what happens if someone does not adhere to these standards, as well as a mechanism of enforcement if they violate those rules.
They probably are to some extent, and you know where the liability would lie should something bad happen. What about this case? There is no sense of responsibility, or realization of the danger they are introducing at scale.
Even when they actually admit that they have failed at it [0]. I am not sure if they are aware of the doublespeak in this admission.
[0] Failure to realize a long-term aspirational goal is not fraud.
I expect that's exactly why you're paying to be a beta tester: they can keep your money and not deliver anything.
The one time I really beta tested a for-profit product, not only did I get it for free, I actually got a rebate on the final product (it was PyCharm, and JetBrains gave me a free license for a year, which they've earned back many times over as I've renewed yearly ever since).
Though I guess the early access I got from Kickstarters was kinda like paying for a beta, in a way.
I don't think 'beta tester' is a recognized class in consumer law. You're a customer, a merchant, a manufacturer, or a bystander. Besides, for something that costs that kind of money you can simply expect it to work.
This thread and the other comments on this post are amazing. One cannot sell a car without a seatbelt, but Tesla can use its customers to beta test a dangerous system that drives a whole car around.
I have FSD Beta. These uncut YouTube videos were a large part of why I bought into FSD Beta, and I will say they show it in a much better light than my on-the-ground experience. I'm not even accusing the posters of selecting only good videos to begin with (although I'm sure that's also happening) - but the videos really don't do justice to the "in the car" feeling.
As a specific example, when the car suddenly slams on the brakes, especially at slow speeds, it doesn't look like a big deal on these GoPro cameras - but it feels like a much bigger deal when you're in the car and you feel the g-force of your body reacting to the sudden unexpected deceleration. Some of the more transparent and honest reviewers like "Chuck Cook" even add an overlay showing these forces in real time on their videos - but seeing a number briefly spike is a very different experience than feeling it live.
I would say FSD is getting closer to "safe" (not there yet though imho) but it's still very far away from "comfortable".
If a cop had been behind me, I would not have been surprised if they'd pulled me over thinking I was drunk. The car kept driving erratically, switching lanes and turning the turn signal on and off. If you had been next to me, you'd have been fearing for your life.
If we assume I just got repeatedly unlucky and this was a 1:1000 situation, would that be acceptable to you?