Well, that's moving the goalposts. They can do it; whether they're allowed to or not doesn't matter. My friend's Tesla, I've seen it in action. It goes on the highway. It goes on all the streets, even hilly ones, unlike Waymo. It can even find a free parking spot in the lot and fully park the car. All while he does nothing at all, with his hands on his lap. So yeah, it totally can self-drive.
If that's how you're using it, then you're doing it wrong, because you're supposed to be supervising the drive the entire time. It's not self-driving if you're supervising. And you're required to supervise because... they're not confident it can actually self-drive.
Where did I say I wasn't supervising? Are you saying the self-summon is the unsupervised part? Because I can flip that logic on its head and say: if they weren't confident it can summon to me, why would they implement the feature?
The charge is that Tesla can't do self-driving successfully. If they could, you wouldn't need to supervise, as is the case with e.g. a Waymo taxi. That they require you to supervise is an admission that their system is not sufficient for self driving, i.e. they're not doing it successfully.
Waymo taxis always operate in geofenced areas with full HD maps down to the centimeter. I can't take one to Tahoe like I can with my Tesla. My point is: my Tesla drives itself, all the time, and I never have to disengage. People can shout technicalities all day, and cite regulations where Tesla fails because it "doesn't have an operator that can take over," but for all intents and purposes, my car drives itself from my door to Tahoe and back without me having to take over a single time.
If that's not full self-driving, but Waymo's geofenced $45 ride down the street is, we just disagree.
What you mean is for your intents and purposes. Others in the thread have pointed out specific intents and purposes for which Tesla's approach fails: driving when you don't have to pay attention. Which is the core function of self-driving, so not being able to do it is kind of the whole thing.
> If that's not full self-driving, but Waymo's geofenced $45 ride down the street is, we just disagree.
Geofenced or not, self-driving is about not having to pay attention while the vehicle is in motion. If you have to supervise it, that's a very different thing from a system that will allow you to look at your phone or sleep.
Tesla's approach of trying to drive everywhere instead of a geofenced area is part of why their system is failing to deliver self driving.
Trying to do one thing well before expanding the performance envelope is good systems engineering practice. But Tesla has been testing their systems widely on the general public, which has tragically resulted in deaths. This is why, at the end of the day, Tesla requires you to supervise their system while you operate it.
Are the Teslas in the Vegas Boring Loop still driven by humans? If so, how is it that Tesla seems unwilling to assume liability for what has to be one of the simplest driving tasks?
If it crashes, it's my fault. At every point I'm supervising.
Except in self-summon, and if it sideswipes a car on the way out, it's obviously still my fault. That's just never happened to me.
Where in my sentence did I say I wasn't fully in control of the situation? I just said I very, very rarely even have to disengage.
On the very rare occasion that I do disengage, it's not really that the car is going to put me in a life-threatening situation; it kinda just stops and tweaks out a bit. Mainly at some super weird triangle intersection in some of the small towns along the California coast.
Honestly, I've come to "feel" the car after using it. I'll disengage if I have even a shadow of a doubt it's not going to work, and in situations where I've seen it "fail" before. It might have accomplished it, but instead I just drive through the weird intersection and re-engage.
This has already turned into a rant, but one last point: have you ridden in the other cars in Austin? They do the same thing. When the car tweaks, or thinks it might tweak, they patch over to a human who takes control of it.
Why can't you add anywhere to that list? If the car drives itself, why is it geofenced to only the places Google has HD-mapped down to the centimeter? My car can drive itself to Tahoe; can a Waymo?
Waymo's strategy is to be extremely cautious, slowly improving the system and increasing its scope over time, with the goal of establishing self-driving cars as a long-term viable solution. They know they need to earn the trust of many people. Therefore, they geofence to locations where they have an understanding with local politicians and government, nearby support facilities, and high-quality map data.
Tesla chose a different strategy, and it's hard to collect enough data to know exactly how safe their system is.
That is the difference between "safety first" and "my personal convenience first", which again boils down to insurance liability.
As long as you pay for the people you injure and kill on the way, you can let your Tesla drive you anywhere, even if it can't do it. You can let it try, and maybe it fails. That's totally fine. You will be held liable, but you get to enjoy your trip to Tahoe.
Self-driving companies that have to pay for the systematic faults of their systems will usually behave differently.
Because Waymo is not stupid enough to take liability for a situation they aren't extremely confident about. Tesla is also not stupid; they just don't take liability, period.
> Disengagements occur when the self-driving system is deactivated with control handed back to humans because of a system failure or a traffic, weather or road situation that required human intervention.
> Waymo, for example, drove 352,545 miles in the state during the period with only 63 disengagements. Cruise vehicles drove about a third as many miles, at 127,516, and had 105 disengagements.
> The third best performance came from Nissan Motor Co, which drove 5,007 miles and had 24 disengagements, meaning that its vehicles had disengagements on average every 208 miles.
Notice that Tesla isn't even included. That's because they don't actually have full self driving tests ongoing like this. Just the half-assed version they beta test with their customers on public roads.
Just curious, but if it really were capable of full self-driving, why don't you think Tesla would have it certified as such? Being first to market with a true self-driving vehicle would be a huge business win.
I don't think it's fair to call a car self-driving if the self-driving disengages itself every time it's about to get into a nasty wreck because of its own actions. On its face it's "self-driving except when it's not", and the "not" times you, of course, cannot predict.