When the self-driving car killed a pedestrian several years ago, the initial sentiment on this site for the first few hours was essentially "those dastardly pedestrians, darting into traffic at the last second, how are you supposed to avoid them?" It took several hours for enough information to percolate through for people to realize that the pedestrian had been slowly and quite visibly crossing the road, and that neither the self-driving car nor the safety driver did a thing to react.
Another thing to keep in mind is that video footage is much lower quality than what we can see with our own eyes. At no point in the video can I clearly identify what the debris is, but it's evident that the humans in the car can, because they're reacting to it seconds before it's even visible to us in the dash-cam-quality footage. I will freely accept that many drivers are in fact bad drivers, but if a carcass (I think?) lying in a lane is visible more than 10 seconds out, anyone who can't avoid it needs to have their license revoked.
(Assuming I know which accident you're referring to) The car that killed the pedestrian in Florida wasn't using supervised Full Self-Driving; the driver was using Autopilot (which was basically adaptive cruise control at the time).
Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations, and some debris large enough to damage your car showing up from out of nowhere after several hours of boring driving along a largely straight highway with little traffic is definitely one of these situations. But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...
I'm not convinced. The debris is clearly visible to the humans a long way off and the adjacent lane is wide open. Avoiding road debris is extremely common even in more congested and treacherous driving conditions. Certainly it's possible that someone texting on their phone or something might miss it, but under normal circumstances it could have been easily avoided.
100% it would have. One of the main things the LiDAR system does is establish a "ground plane", which is the surface on which the car is expected to drive. Any holes or protrusions in that plane stick out like a sore thumb to a LiDAR system; you can see them in the raw data without much of a feature detector, so detecting them and reacting is fast and reliable.
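To make that concrete, here's a minimal sketch of the idea in Python, assuming you already have an (N, 3) point cloud in the vehicle frame (z up, metres). The RANSAC fit and the 15 cm threshold are my own illustrative choices, not any vendor's actual pipeline:

    import numpy as np

    def fit_ground_plane_ransac(points, n_iters=200, inlier_tol=0.05, seed=None):
        """Fit a plane z = a*x + b*y + c to a LiDAR scan with RANSAC."""
        rng = np.random.default_rng(seed)
        best, best_inliers = None, -1
        for _ in range(n_iters):
            sample = points[rng.choice(len(points), size=3, replace=False)]
            A = np.c_[sample[:, 0], sample[:, 1], np.ones(3)]
            try:
                params = np.linalg.solve(A, sample[:, 2])  # (a, b, c)
            except np.linalg.LinAlgError:
                continue  # degenerate (collinear) sample
            pred_z = points[:, 0] * params[0] + points[:, 1] * params[1] + params[2]
            inliers = np.sum(np.abs(points[:, 2] - pred_z) < inlier_tol)
            if inliers > best_inliers:
                best, best_inliers = params, inliers
        return best

    def protruding_points(points, plane, min_height=0.15):
        """Return points sticking more than min_height metres above the ground plane."""
        a, b, c = plane
        height = points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c)
        return points[height > min_height]

    # Usage sketch: whatever survives the filter is a candidate obstacle to
    # cluster and track, long before a camera-only stack might resolve it.
    # cloud = load_lidar_scan(...)              # hypothetical loader
    # obstacles = protruding_points(cloud, fit_ground_plane_ransac(cloud))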
Contrast with Tesla's "vision-only" system, which uses binocular disparity along with AI to detect obstacles, including the ground plane. It doesn't have as good a range, so with a low-profile object like this it probably didn't even see it before it was too late. Which seems to me a theme for Tesla autonomy.
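Some rough numbers on why range is the weak point for stereo. The baseline and focal length below are guesses for illustration, not Tesla's actual camera geometry:

    # Back-of-envelope: stereo disparity shrinks with distance.
    focal_px = 1000.0    # focal length in pixels (assumed)
    baseline_m = 0.3     # spacing between the two cameras (assumed)

    for depth_m in (20, 50, 100, 150):
        disparity_px = focal_px * baseline_m / depth_m
        print(f"{depth_m:>4} m -> disparity ~ {disparity_px:.1f} px")

    # At 150 m the disparity is ~2 px, so sub-pixel matching error turns into
    # tens of metres of depth error, and a ~15 cm tall object spans only a
    # pixel or two of height, leaving little to separate it from the road.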
In addition to detecting the object, Waymo has to make some determination about the material. Rigid heavy metal = slam on the brakes and/or swerve. Piece of tire or plastic bag = OK to run over if swerving or hitting the brakes would be more dangerous. Really hard problem that they're concerned about getting right before they open up highway driving.
LiDAR is also good for that because you can measure light remission and figure out how much of the LiDAR energy the material absorbed. Different materials have different remission properties, which can be used to discriminate between them. It's a compounding advantage because we tend to paint road line markers with highly reflective paints, which makes them blindingly obvious to a LiDAR.
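A toy illustration of that, building on the ground-plane sketch above; the per-return intensity scale and the 0.7 threshold are made up for the example:

    import numpy as np

    def lane_marker_candidates(points, remission, plane,
                               height_tol=0.05, min_remission=0.7):
        """Pick near-ground returns bright enough to be painted line markers.

        points:    (N, 3) xyz in the vehicle frame
        remission: (N,) per-return intensity, normalised to [0, 1]
        plane:     (a, b, c) ground plane, e.g. from the RANSAC fit above
        """
        a, b, c = plane
        height = points[:, 2] - (a * points[:, 0] + b * points[:, 1] + c)
        on_ground = np.abs(height) < height_tol   # returns lying on the road surface
        bright = remission > min_remission        # retroreflective paint returns more energy
        return points[on_ground & bright]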
Stereo vision might also have helped to assess its size, but from the pre-crash display at 0:07 it looks like the Tesla didn't see the object at all, which is a bit surprising since it was large enough, and very clearly looked like exactly what it was - a big chunk of metal.
The car reacted the opposite of how a human would. If you saw something unidentified ("road kill?") in the distance, you'd be focusing on it and prepared to react according to what it turned out to be. With an empty lane beside you, I think most drivers would steer around it based on size alone, even before they realized exactly what it was (when emergency braking might be called for).
> and we should not tolerate self-driving systems that are as good as the worst of us
The person you replied to didn't do that, though:
> But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...
I think they meant the person you were responding to never claimed that the person they were responding to said that we should tolerate self-driving systems that are no better than the worst of us, not that the person that the person you were responding to was responding to never said the thing you very clearly directly quoted.
I think you might have misunderstood someone here. The person you quoted made a generic statement about what we should expect from an autonomous vehicle, but never said (nor implied imho) that the person he responds to didn't expect the same.
Do they? "Many humans" would hit that? The humans in the car spotted the debris at least 8s before the impact. I don't think any humans would hit that in broad daylight unless they were asleep, completely drunk, or somehow managed to not look at the road for a full 10s. These are the worst drivers, and there aren't that many because the punishment can go up to criminal charges.
The argument that "a human would have made that mistake" backfires, showing that every Tesla equipped with the "safer than a human driver" FSD is in fact at best at "worst human driver" level. But if we still like the "humans also..." argument, then the FSD should face the same punishment a human would in these situations and have its rights to drive any car revoked.
Or they would hit it if they were busy fiddling with the car's autodrive system. These humans would have avoided it had they not wasted time speculating about whether the autodrive system would save them. They would have been safer in literally any other car that didn't have an autodrive.
I've been driving on local and highway roads for 30 years now and I have never come across a piece of debris so large that driving over it would damage my car. Seeing that video, I don't have high confidence that I would have dodged that hazard - maybe 70% sure? The thing is, usually there is plenty of traffic ahead that acts very obviously different in situations like this that helps as well.
All that to say that I don't feel this is a fair criticism of the FSD system.
> I have never come across a piece of debris so large that driving over it would damage my car
More likely you simply drove around the debris and didn't register the memory because it's extremely unlikely that you've never encountered dangerous road debris in 30 years of driving.
I think it's probably because of mostly driving in enough traffic that other cars would have encountered any critical objects first and created a traffic jam around an impassable section.
Honestly no, not in the middle of the road, but plenty on the side. The only things I come across in the middle of the roads are paper bags or cardboard for some reason.
But also, I doubt you would break your sway bar running over some retreads.
Driving I-5 up to Portland I had to dodge a dresser that was somehow standing upright in the middle of the lane. The truck in front of me moved into the other lane, revealing that thing just standing there, and I had to make a quick adjustment similar to what this Tesla should have done. Teslas also have lower bellies; my Jeep would have gone over the debris in the video no problem.
> All that to say that I don't feel this is a fair criticism of the FSD system.
Yes, it is, because the bar isn't whether a human would detect it but whether a car with LiDAR would. And without a doubt it would, especially given those conditions: a clear day, a flat surface, and a protruding object are a best-case scenario for LiDAR. Tesla's FSD was designed by Musk, who is not an engineer nor an expert in sensors or robotics, and therefore it fails predictably in ways that other systems designed by competent engineers do not.
I don't disagree with that characterization of the technical details. However, I felt the task those drivers set out on was asking a different question: how good would the FSD system be at completing a coast-to-coast trip? I don't think that can be answered after a single, highly unlikely accident without a lot more trials.
Imagine there was a human driver team shadowing the Tesla, and say they got T-boned after 60 miles. Would we claim that human drivers suck and have the same level of criticism? I don't think that would be fair either.
If you don't disagree on the characterization of the technical details, then you must realize how very fair it is for us to criticize the system for failing in the exact way it's predicted to fail. We don't need 1000 more trials to know that the system is technically flawed.
What if there is no debris the other 999 times, and the system works fine? The video does not give me that information as a prospective Tesla customer. This looks like a fluke to me.
Those 999 other times, the system might work fine for the first 60 miles.
This is a cross-country trip. LA to New York is roughly 2,776 miles, not counting charging stops. The car crashed within the first 2% of the journey. And it wasn't a small intervention or minor accident, either.
How you could possibly see this as anything other than FSD being a total failure is beyond me.
>asking a different question: how good would the FSD system be at completing a coast-to-coast trip?
>They made it about 2.5% of the planned trip on Tesla FSD v13.9 before crashing the vehicle.
This really does need to be considered preliminary data based on only one trial.
And so far that's 2.5% as good as you would need to make it one way, one time.
Or 1.25% as good as you need to make it there & back.
People will just have to wait and see how it goes, if they do anything to try to bring the average up.
That's about 100:1 odds against getting there & back.
One time.
Don't think I would want to be the second one to try it.
If somebody does take the risk and makes it without any human assistance though, maybe they (or the car) deserve a ticker-tape parade when they get there, like Charles Lindbergh :)
It does look like lower performance than a first-time driving student.
I really couldn't justify 1000:1 with such "sparse" data, but I do get the idea that these are some non-linear probabilities of making it back in one piece.
It seems like it could easily be 1,000,000:1 and the data would look no different at this point.
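For what it's worth, here's one crude way to put numbers on that, assuming critical failures arrive at a constant per-mile rate and taking the single observed crash at ~2.5% of the trip as the only data point (both big assumptions):

    import math

    trip_miles = 2776                     # LA to NYC, roughly
    crash_at_miles = 0.025 * trip_miles   # failed ~2.5% of the way in, ~70 miles

    # Treat critical failures as a Poisson process with rate lam per mile;
    # one crash in ~70 miles gives a crude point estimate lam ~ 1/70.
    lam = 1.0 / crash_at_miles

    p_one_way = math.exp(-lam * trip_miles)   # chance of finishing one leg
    p_round_trip = p_one_way ** 2

    print(f"P(one way)    ~ {p_one_way:.1e}")     # ~ 4e-18
    print(f"P(round trip) ~ {p_round_trip:.1e}")

    # With one observation the rate estimate is wildly uncertain, which is the
    # point: the data is consistent with 100:1 odds and with odds far worse.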
As a prospective Tesla customer, this one test tells you that Tesla's FSD is not always able to identify or avoid objects in the road large enough to significantly damage your car, in situations where humans can identify the object from a significant distance away. Running 999 other tests where there are no objects in the road does not improve your understanding of Tesla's ability to handle objects in the road. Ideally you'd actually want to run 999 more tests with objects in the road to see if Tesla fails every time. If it identifies and avoids the object 99.9% of the time, then you could say this particular test was a fluke.
Now you can certainly argue that "objects in the road" is a category of failure mode you don't expect to happen often enough to care about, but it's still a technical flaw in the FSD system. I'd also argue it points to a broader problem with FSD because it doesn't seem like it should have been all that hard for the Tesla to see and avoid the object since the humans saw it in plenty of time. The fact that it didn't raises questions for me about how well the system works in general.
Tesla in 2016: "Our goal is, and I feel pretty good about this goal, that we'll be able to do a demonstration drive of full autonomy all the way from LA to New York, from home in LA to let’s say dropping you off in Time Square in New York, and then having the car go park itself, by the end of next year" he said on a press call today. "Without the need for a single touch, including the charger."
Roboticists in 2016: "Tesla's sensor technology is not capable of this."
Tesla in 2025: coast-to-coast FSD crashes after 2% of the journey
Roboticists in 2025: "See? We said this would happen."
The reason the robot crashed doesn't come down to "it was just unlucky". The reason it crashed is because it's not sufficiently equipped for the journey. You can run it 999 more times, that will not change. If it's not a thing in the road, it's a tractor trailer crossing the road at the wrong time of day, or some other failure mode that would have been avoided if Musk were not so dogmatic about vision-only sensors.
> The video does not give me that information as a prospective Tesla customer.
If you think it's just a fluke, consider this tweet by the person who is directing Tesla's sensor strategy:
Before you put your life in the hands of Tesla autonomy, understand that everything he says in that tweet is 100% wrong. The CEO and part-time pretend engineer removed RADAR thinking he was increasing safety, when really he has no working knowledge of sensor fusion or autonomy, and he ended up making the system less safe. Leading to predictable jury decisions such as the recent one: "Tesla found partly to blame for fatal Autopilot crash" (https://www.bbc.com/news/articles/c93dqpkwx4xo)
So maybe you don't have enough information to put your life in the hands of one of these death traps, but controls and sensors engineers know better.
> I take it that you're saying that if we provide the Tesla a road which contains nothing to hit, it won't hit anything?
Not quite. I am saying that basing the judgment on a rare anomaly is a bit premature. It's a sample size of 1, but I base this on my own driving record of 30 years and much more than 3000 miles where I never encountered an obstacle like this on a highway.
> Also, not interesting
I would have liked to see the planned cross-country trip completed; I think that would've provided more realistic information about how this car handles with FSD. The scenario where there is a damn couch or half an engine on the highway is what's not interesting to me, because it is just so rare. Seeing regular traffic, merges, orange cones, construction zones, etc. etc. now that would have been interesting.
Here's an edge case I'm sure everybody has seen very rarely, but that's still not as uncommon as you think. Watch the video by Martinez if the top video is not correct:
Now 2018 might have been a record year, but there have been a number of others since then.
Fortunately for us all, drivers don't have to go through Houston to get from CA to NY, but you're likely to encounter unique regional obstacles the further you go from where everything is pre-memorized.
As we know, 18-wheelers are routinely going between Houston and Dallas most of the way autonomously, and a couple weeks ago I was walking down Main and right at one of the traffic lights was one of the Waymos, which are diligently memorizing the downtown area right now.
I'll give Tesla the benefit of the doubt, but they are not yet in the same league as some other companies.
> It's a sample size of 1, but I base this on my own driving record of 30 years and much more than 3000 miles where I never encountered an obstacle like this on a highway.
How is that relevant? You may not have personally encountered this precise circumstance, but that doesn't mean anything.
If you were to encounter this same scenario, however, it is a near certainty that you wouldn't crash into it. And yet the self-driving car did. That's what matters.
> I would have liked to see the planned cross-country trip completed
I mean once the car significantly damaged itself, it's not like it can continue.
Big credit to the people running the experiment for letting it run and show the failure. Many vloggers might've just intervened manually to avoid the accident, edited that part out of the video, and continued on to claim success.
Unless you're on your phone, with that clear of a view and that much space, 100% you would dodge that, especially in a sedan where your clearance is lower than a truck.
> Seeing that video, I don't have high confidence that I would have dodged that hazard - maybe 70% sure?
Really? The people in the video clearly identify a large stationary object in the road a good 7 seconds before hitting it. You don't exactly need lightning quick reflexes to avoid hitting something in that scenario. Maybe more importantly, the Tesla did not seem to see the object at all at any distance. Even if you don't think you could have avoided it, do you think you would have entirely failed to see it and driven right into it at full speed? Because that's what the Tesla did.
Such an event might not be super common, but that doesn't make it an unfair criticism of Tesla's self-driving. Even if humans have never seen a large object in the road before, or react the wrong way when they do, they are generally capable of recognizing it when it happens and at least considering that it's something they should act on. The fact that the Tesla can't do the same makes this an area where FSD is categorically worse than humans, and "avoiding stuff in the road" feels like an area where that deficit matters, even if there usually isn't stuff in the road.
No way. I call in road debris on the freeway once every couple of months. People swerve around it and if it’s congested, people swerving around it create a significant hazard.
The meatbags riding in front saw the debris about 800 feet away, if my napkin math is right. Uncommon occurrence or not, a computer never seeing it at all seems like an unacceptable level of performance.
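The napkin math roughly checks out, assuming highway speed around 75 mph and the 7-8 seconds of lead time mentioned elsewhere in the thread (both assumptions on my part):

    # Rough check of the "800 feet" figure: speed in ft/s times seconds of warning.
    speed_mph = 75                        # assumed highway speed
    speed_fps = speed_mph * 5280 / 3600   # = 110 ft/s
    for lead_s in (7, 8):
        print(f"{lead_s} s of warning -> ~{speed_fps * lead_s:.0f} ft")
    # 7 s -> ~770 ft, 8 s -> ~880 ft, so "about 800 feet" is plausible.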
> Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations, and some debris large enough to damage your car showing up from out of nowhere
I read this comment before seeing the video and thought maybe the debris flies in with the wind and falls on the road a second before impact, or something like that.
But no, here we have bright daylight, perfect visibility, the debris is sitting there on the road visible from very far away, the person in the car doing commentary sees it with plenty of time to leisurely avoid it (had he been driving).
Nothing unexpected showed up out of nowhere, it was sitting right there all along. No quick reaction needed, there was plenty of time to switch lanes. And yet Tesla managed to hit it, against all odds! Wow.
My impression of Tesla's self driving is not very high, but this shows it's actually far worse than I thought.
> Well, I have to admit that your friend has a point. Humans are bad at reacting quickly and correctly to unexpected situations,
This was not one of those situations.
> and some debris large enough to damage your car showing up from out of nowhere after several hours of boring driving along a largely straight highway with little traffic is definitely one of these situations.
Again, this was definitely not one of those situations. It was large, it was in their lane, and they were even yapping about it for 10 seconds.
> But a self-driving system worth its salt should always be alert, scanning the road ahead, able to identify dangerous debris, and react accordingly. So, different pair of shoes...
This is what humans already do (and if we didn't do it, we'd be driving off the road). Based on what you're saying, I question whether you're familiar with driving a car, or at least with driving on a highway between cities.
Literally, the passenger saw it and leaned in, and the driver grabbed the steering wheel, seemingly to brace himself. That object on the road was massive, absolutely huge as far as on-road obstacles go. The camera does not do it any justice - it looks like it's 3 feet long, over a foot wide, and about 6 or 7 inches high lying on the road. Unless a human driver really isn't paying attention, they're not hitting that thing.
Bullshit. Some humans might hit that because they weren't paying attention, but most people would see it, slow down, and change lanes. This is a relatively common scenario that humans deal with. Even the passenger here saw it in time. The driver was relying on FSD and missed it.
I don't think FSD has the intelligence to navigate this.
I don't love Tesla (though I would like an electric car). I don't think it's unlikely that someone driving could have hit that or caused an even worse accident trying to avoid it.
However, I'm sure a human driver in the loop would have reacted. Having the driver sit there watching a machine do its thing 99% of the time, while being expected to step in to save that kind of situation, is a horrific misreading of human nature.
Those humans saw the debris. When a human is actively at the wheel, what happens next is that the driver should check all the mirrors, decide whether to change lanes or brake, and execute, rather than do anything rash that could lead to a movie-like multi-car pileup. Hitting the debris is the least dangerous course of action if there are cars all around. That looked like an empty road, but who knows.
By the way, playing a lot of racing video games is great training for dealing with that sort of stuff, except maybe for getting good at mirrors. I've been in a few dangerous situations, and each was just the ten-thousandth averted crash. No thinking, only reflexes.
I highly recommend people take a high performance/race driving course if they can. I did a single day one which involved high speed maneuverability trials designed to be useful in emergency scenarios (swerving, braking, hard turns) followed by a few laps around a racetrack.
It's one of the best ways to figure out what it feels like to drive at the limits of your car and how you and it react in a very safe and controlled environment.
Did your friend make any mention that the passenger saw it hundreds of feet away and even leaned in as they headed directly towards it? The driver also recognized it and grabbed the wheel as if to say "brace for impact!".
Obviously, in this particular case the humans wouldn't be hitting that. The people in the video have clearly seen the object, but they didn't want to react because that would have ruined their video.
Even if they did not understand what it was, in the real world when you see something on the road you slow down or maneuver to avoid it, no matter if it is a harmless piece of cloth or something dangerous like this. People are very good at telling when something is off; you can see it in the video.
Yes. Humans would. Which is why the car should be able to handle the impact. My Honda Civic has had worse without issue. The suspension should be beefy enough to absorb the impact with, at worst, a blown tire. That the car has broken suspension says to me that Teslas are still too fragile, built more like performance cars than everyday drivers.
With millions of Teslas on the road, one would think that if that were true we would have heard something by now. My absolute worst car quality-wise was a Honda Accord, and I've owned shitty cars, including a Fiat. My most reliable car was a Honda Civic before I "upgraded" to a brand new Accord. I abuse my Tesla and so far no issues, driving on some of the worst roads in the country. I must hit 100 potholes per month and have blown a tire already. It's not a fun car to drive like a GTI (which I also own), but it's definitely a solid car.
Cars with "bad" suspension tend to survive potholes. A car with slow-to-move suspension will see the wheel dip less down into the hole when traveling at speed. But that is the exact opposite behabior you want when dealing with debris, which requires the supension to move up rather than down. "Good" systems will have different responce curves for up than down. Quazi-luxury cars fake this by having slow suspension in both directions, to give the sense of "floating over potholes".
Then you were not listening [1]. Tesla covered up 150,000 suspension defects, affecting approximately 5% of cars, with literally thousands of actual failures in operation. They blamed their defective suspensions on the customers for years and required over 30,000 of them to pay for repairs of defective components that, had they not been covered up, would legally have been Tesla's responsibility to fix on their own dime.
A company actively covering up things that egregious can only be assumed to be doing things even worse regularly.
> A friend of mine who loves Tesla watched this video and said "many humans would have hit that".
The very same video demonstrates this is not true, since the human in the video sees the debris from far away and talks about it, as the self-driving Tesla continues obliviously towards it. Had that same human been driving, it would've been a non-issue to switch to the adjacent lane (completely empty).
But as you said, the friend loves Tesla. The fanboys will always have an excuse, even if the same video contradicts that.
As a counterpoint, my cheap little Civic has hit things like that and hasn't broken a thing. Heh.
The best time was a very badly designed speed bump that I didn't even hit at high speed, but one side was ridiculously inclined compared to the other, so the Civic's entire front end just smashed right into the pavement and dragged for 3 feet. I wouldn't be surprised if I went into a body shop and found the front end tilted upwards by a few centimeters. lol
Timestamp 8:00-8:30. Your Civic is not hitting that and surviving any better than the Tesla. It just got luckier. It may be easier to get lucky in certain vehicles, but still luck based.
That's laughable. Any human who couldn't avoid a large, clearly-visible object in the middle of an empty, well-lit road should not be operating a vehicle.
That's not to say that there aren't many drivers who shouldn't be driving, so both can be true at once, but this is certainly not a bar against which to gauge autonomous driving.
Question - isn't P(Hitting | Human Driving) still less than P(Hitting | Tesla FSD) in this particular case [given that if this particular situation comes up - Tesla will fail always whereas some / many humans would not]?
The question is whether avoiding the obstacle or braking was the safest thing to do. I did not watch the entire test, but there are definitely cases where a human will suddenly brake or change lanes and cause a very unsafe situation for other drivers. Not saying that was the case here, but sometimes what a human would do is not a good rule for what the autonomous system should do.
An enormous part of safe driving is maintaining a mental map of the vehicles around you and what your options are if you need to make sudden changes. If you are not able to react to changing conditions without being unsafe, you are driving unsafely already.
> That a human was still in the loop in addition to a computer and both missed it.
Listen to the audio in the video. The humans do see it and talk about it for a long time before the car hits it. Had a human been driving, plenty of time to avoid it without any rush.
They do nothing to avoid it presumably because the whole point of the experiment was to let the car drive, so they let it drive to see what happens. Turns out Tesla can't see large static objects in clear daylight, so it drives straight into it.
They had poor incentives. The driver wanted to use only FSD for the video, which is dumb.
But that doesn't help disprove that it's entirely the computer's fault. They could have taken action had they been acting as rational drivers.
From my perspective, I think many people would have failed to take action. Swerving or hard braking at those speeds is very dangerous, and many things on the road, like roadkill or bags, can be driven over.
I say the fact that you can hear them discuss the object well before hitting it yet clearly not try to avoid it means they did not actually "miss it", which is also why my first comment in this thread was in response to the notion that "a human did hit that..."
These drivers hitting the object while intentionally not intervening does not actually provide information as to whether other drivers not running the same "experiment" would've hit it.
Yes many human drivers would hit it. The bad ones. But we should want driverless cars to be better than bad drivers. Personally, I expect driverless cars to be better than good drivers. And no, good drivers would not hit that thing.
Hopefully we get there. Waymo hasn't even started highway testing yet, but maybe they will be sufficiently better that cameras in cheaper cars simply aren't worth it. Or maybe a driver will always be in the loop, and cross-country FSD-only trips are a dumb goal.