7. I don't give a fuck that fatal accidents happen and are caused by self driving cars. It would be insane to think it would be otherwise.
Humans kill 1.35 million people while driving cars globally every year. We're terrible drivers. If I see data that a global fleet of self driving cars will only kill 1.30 million people instead of the current human driven standard of 1.35, then I'm going to continue to rabidly support the transition and heap shame on the heads of any luddites who resist with faulty emotional appeals, who therefore are indirectly supporting the status quo of unnecessary pain and suffering.
We absolutely should hold self driving cars to a standard higher than human drivers, but that's not a very high bar. And a handful of anecdotal cases don't give us any useful data to compare and contrast the two options.
A lot of those 1.35 million deaths are the fault of the driver. If we hypothetically replaced all cars with self-driving ones and 1.30 million people still died, that would mean the risk for those who don't drink and drive and/or are generally more attentive and responsible than the average driver would increase significantly.
So no... a terrible suggestion.
> are indirectly supporting the status quo of unnecessary pain and suffering.
There are many measures which would result in a significant decrease in traffic deaths (globally) which we choose not to adopt (not to mention that newer cars are way safer than those 10-20+ years old). Those measures would overall be both cheaper and easier to achieve than self-driving. They are not as cool and shiny though, so we can just ignore them and still blame people for 'supporting the status quo of unnecessary pain and suffering'...
It is not sufficient if, in aggregate, self-driving cars have fewer accidents. If you lose a loved one in an accident where the accident could have been easily avoided if a human was driving, then you're not going to be mollified to hear that in aggregate, fewer people are being killed by self-driving cars! You'd be outraged to hear such a justification! The expectation therefore is that in each individual injury accident a human clearly could not have handled the situation any better. Self-driving cars have to be significantly better than humans to be accepted by society.
We don't make policy or design decisions as a civilization based on whether individuals are going to be emotionally outraged. We make those decisions based on data that leads to the best average outcome for everyone.
We make those decisions based on data that leads to the best average outcome for everyone.
That's great news for everyone who needs an organ transplant. You, anonporridge, are among 50,000 Americans who've been randomly selected as organ donors. Your personal donation of all your organs will save at least 10 lives and cost only your own, leading to the best average outcome for everyone.
> We don't make policy or design decisions as a civilization based on whether individuals are going to be emotionally outraged.
I feel like we're living in very different democracies. My provincial government just offered to pay just over a third of a billion dollars for a hockey arena so they could have a better shot at winning an election. That's entirely playing to people's emotions.
So next time there is a pandemic we should just bomb the city it originates in before it can spread, got it. Both human emotions and logic play parts in policy making.
This is such a naive take. Bills get passed based on emotional appeal, not data. That's why politics is chock full of "it's for the children"/ "don't let the terrorists win" rhetoric.
I'm not sure this is really a universal opinion. The main difference between this and every other scenario we already do this kind of trade with today, e.g. medical professionals, is whether it's another person that is, on aggregate, better. That will definitely have some sway for some but certainly not everyone. "Someone will be upset" is also not the same thing as "society won't accept it". E.g., many people are upset with medical professionals, and there are plenty of cases of them being plain worse than a random person's guess, but the vast majority of society still relies upon them.
That said, I agree it has to be more than "beats average"; just how much so, and why, may be wildly different depending on who you ask. I suppose that's the crux of the debate, not that there is just one obvious and well-known fact about acceptance some people are missing.
So people killing people is ok, but software killing people is way out? I have seen plenty of human-made accidents that were very readily avoidable - one that springs to mind is a colleague who killed himself and his three passengers by driving down the wrong side of the M1 at 180 mph while absolutely slaughtered (pun intended).
I mean, you’re saying that he couldn’t have handled that any better?
How would doing something irresponsible be different with self-driving? He would have just overridden the car's control and still driven at 180 mph; and if some other poor soul was also caught up in the accident, there is no way a self-driving car could have avoided a collision with a car suddenly appearing in another lane at that speed.
> It is not sufficient if, in aggregate, self-driving cars have fewer accidents.
Morally, and in terms of our own personal opinions, it should be sufficient, even if emotionally and to broader society, it isn't. We as individuals should not be advocating for the modality that maximizes the number of deaths, regardless of other trivial factors like status quo bias.
> We as individuals should not be advocating for the modality that maximizes the number of deaths, regardless of other trivial factors like status quo bias.
I'm not sure it's that simple. Traffic deaths are not entirely random; there are actions you can take to decrease the risk for yourself and the other people in your car. If the total number of deaths only marginally decreases, the chance of death for some people (those who don't drive while intoxicated, don't use their phones, are more attentive, etc.) will increase.
Also, the state will have to grant legal immunity to car manufacturers so that they can't be sued into bankruptcy. That wouldn't give them much incentive to make their cars safer...
"Minimizing deaths" is probably too simplistic, but not by much. If we can be reasonably confident that replacing human drivers will lead to 30% less traffic deaths, I think it would take some pretty large extenuating circumstances for me to not want that to happen.
> Also the state will have to grant legal immunity to car manufacturers so that they couldn't be sued to bankruptcy. That shouldn't provide them too many incentives to make their cars safer..
Indeed we would need to be careful not to make the wrong incentives.
But there are already many measures which could decrease traffic deaths by up to 30% or so. They are expensive and/or inconvenient (though not even close to how expensive replacing all cars with self-driving ones would be), and we choose not to implement them for those reasons.
For instance, ban all cars made prior to 2008 or so, combined with massive investment in public transport (which would decrease average miles driven; e.g. many EU countries have far fewer traffic fatalities per 100k population, but about the same when adjusted for distance driven). That should get us about 30% if not more, and we don't even need self-driving cars...
So you must be pretty down on Tesla given their refusal to release anything but the most cherry picked data to backup their grand safety claims, right?
> If I see data that a global fleet of self driving cars will only kill 1.30 million people instead
Sure, if I see that then you are right! But: accidents are mostly caused by irresponsible human drivers, not by good ones, and self-driving cars have trouble handling normal situations, let alone dangerous drivers.
So by replacing randomly from the pool of good+bad drivers you won't meaningfully decrease accident rates; you might even increase them. Call me when we can selectively replace bad/irresponsible drivers! But then there's no need for self-driving either. Incorporating the actually useful features like automatic emergency braking, lane assist, etc. into "normal" cars will give you all the benefits of self-driving without the problems.
Ironically (and I'm not suggesting this is you), many folks who support such an approach (Elon included) balked at the idea of shutting things down during the height of Covid to reduce unnecessary deaths as a result of Covid (for example, hospitals being unable to help patients due to being overwhelmed with high Covid case counts).
I for one agree, though: if self-driving can reduce deaths on aggregate then that's a good thing. I hope that self-driving cars give better signals to pedestrians (maybe emitting noises less startling than car honks) - I think this would vastly improve safety when a self-driving car is navigating a parking lot or busy downtown roads.
Hey you know we aren't allowed to mention our pro-corporate astroturfing here!
It is kind of mind-boggling that it's against the rules to discuss that billion-dollar companies might make accounts to drive sales/help their reputation. I've seen small companies doing it. I think it's foolish to ignore that big companies can do it so well it's basically impossible to prove.
Now what is your excuse for the 4,285 human-driver caused fatalities that occurred in CA alone in 2021? How about the 1,370 caused by those driving under the influence?
> How about the 1,370 caused by those driving under the influence?
How exactly is it solved by self-driving? Your self-driving car won’t suddenly turn reality into fast&furious or whatnot and won’t be able to prevent an accident caused by someone else being grossly irresponsible with their car, and unless you mandate literally everyone to switch cars I don’t see it helping at all.
The problem is those assholes that drive that way — unless you can selectively replace them (at which point we could just ban them from driving and be happy), you won’t be actually ahead.
He bought a social network out of a sense of right-wing grievance and unbanned far-right activists and self-described neo-Nazis and white nationalists. That's a good place to start; there's more if you go looking for it.
If you don't believe it you might be? My stance is based on my experience as an active twitter user where racism, antisemitism, eugenics, and trans elimination rhetoric have gone from frequent to constant since he took over. There is an openly genocidal nationalist movement brewing in the US and musk is in the replies of their prominent figures dropping thinking-face and "interesting" a couple times a week. Some of them were banned and he unbanned them when he took over. Not sure how you judo that observation into echo chamber.
Tbf there are some people, actually most people, who don't know enough about him and his grift, and the media played along with his Iron Man image for a long time.
What's a Tesla-supported account? Is it anyone who doesn't share HN echo-chamber hate for Tesla, likes their products and thinks they have a great future? If so, where should I go collect my earnings?
By not reading past the word "Autopilot" in the product description and ticking the Accept Terms box instead of having their personal lawyer review it, like the rest of us do.
> instead of having their personal lawyer review it, like the rest of us do.
Wait, do you all have personal lawyers that you pay to review every (or any) EULA you accept? I clearly do not make enough money to hang out on this site.
Well, I generally find it much harder to remain attentive when using an autopilot. I assume I'm not unique in this regard (i.e. it's an "autopilot", but technically you must pay as much attention to what's happening as if you were driving yourself. What would be the point of such a feature? Well... companies obviously bring this up when something bad happens, and not in their marketing material).
The problem is the car doesn't let you ignore it. You need to perform some kind of "hey, are you paying attention?" input on the steering wheel what seems like every 30 seconds or so (not sure of the exact numbers). Maybe the driver just happened to doze off in that intervening 30 seconds... at which point, he would have crashed anyway.
It is literally orders of magnitude harder to drive on roads than to fly, and it is very likely AGI territory. This cute little robot-vacuum logic won't cut it and won't get better on its own.
I'm amused that there are stories for both "self driving cars are impossible", and "there are too many Waymo self driving cars on the streets in my neighborhood".
This. There isn't really anything _Tesla_ specific about this situation. It just happened to be a Tesla vehicle, but the actual "automated driving system" involved is made by a completely different company, and is present in many other vehicles - https://en.wikipedia.org/wiki/Mobileye#Comparison
If my Mazda has a failure I don't think "huh, I wonder if this is a 3rd-party design that was bought and integrated into the system." It's still on them to verify the quality and make sure that their marketing matches the reality of the product. The origin of the design doesn't matter that much if I'm pancaked on the back of a firetruck.
> isn't really anything _Tesla_ specific about this situation
My Subaru has lane-keeping tech. It's an absolute nag about my keeping my hands on the wheel and eyes on the road. I've driven a Tesla, and it's more lackadaisical.
It nags a lot on the Model 3/Y in Europe unless it's very slow stop-and-go traffic. Similar to other brands. It will even put you in "jail" mode if you don't react quickly to the nag too many times during a drive (hasn't happened to me, as I'm obviously the most perfect driver).
In my experience Tesla is pretty strict, especially now with the driver camera able to tell if you are watching the road. With the new driver cam it cares less about you applying pressure to the wheel but it's still pretty nagging.
My Volvo on the other hand (which has Lane Keep Assist with Adaptive Cruise Control) will gladly just slowly drift into the other lane sometimes (not always) if I don't correct it.
Edit: who the heck downvotes someone sharing their personal experiences? It's not like I'm fan boy-ing out. Sigh.
Isn't the "autopilot" feature Tesla specific? Don't forget that Mobileeye cancelled their partnership with Tesla because they thought the feature was dangerous.
Early cars, such as the one in the article, used Mobileye's technology for "autopilot". Tesla only integrated their own UI with that technology until the partnership ended and they built everything from scratch.
When I was young I got a tech job in Spain that was great, but it required me to work until night, so I would drive home at 1-3 AM.
I had three incidents with crazy drivers that nearly killed me. In the last one I had more than a 50% chance of dying. I am alive by sheer luck.
I was so upset that I went to the police to report it. They told me it was routine: the guys coming out of the discos at that particular spot were so full of drugs and alcohol that there were accidents every weekend. There were raids and people were detained, but they kept doing it.
I quit the job and told my boss to look for someone who didn't need to drive there.
I can't wait for the day when people can kill themselves with drugs if they want, but let their cars drive for them and not kill others.
This was an old 2014 model with basically advanced traffic aware cruise control. I wonder how many Toyotas get into accidents with regular cruise control. The figure of merit is number of accidents per mile...
"from zero miles an hour to 155 miles an hour and so it'll detect if there's a car in your blind spot if you've got a highway barrier on one side if there's something you might you know move into in any way possible"
My wife's (ex-) 2019 Toyota CH-R had warnings for vehicles in blind spots while driving (or parking) and gave nice feedback through the steering wheel if, when above 50 km/h, you'd steer out of your lane without using your blinker. It was very convenient and a nice safety feature without claiming to be an auto-pilot.
Not my car, but a family member's car does that with the steering wheel and I hate driving it. Granted, I live in the country, but I want to be able to go over the line slightly to avoid something in the road without it trying to shove me back.
Why does anyone care if it isn't killing bystanders? Self driving is the same as other vehicle performance enhancing features in one major respect. More risk, more reward.
If you purchase and use the feature, you should be aware that vision systems have limitations and operate accordingly.
IMHO, they should remove the guardrails and tune the system to prioritize bystander safety. Otherwise, I'm going to stick to my motorcycles.
Is this one of the models that had its radar removed or disabled? I'm not saying working radar would have prevented this accident, but working radar would have prevented this accident.
You believe that someone should be able to drive a car, cause bodily harm to others, and relevant information about those events would be private, not discoverable by interested parties? If so, why?
State law controls access to crash data records and the NTSB is broadly entitled to access them. You may also wish to research the concept of "implied consent". Your ability to drive on public roads comes with compromises to absolute privacy rights.
The problem with this is people trust it. This stuff will be sketchy until there is car-to-car and road-to-car communication. Getting that infrastructure in place to the last mile will never happen.
That said there are aspects of that tech now that could be utilized to improve safety in all new cars and it doesn't have to be configured to "drive" a car, but instead assist the driver.
Yeah, I can't believe people trusted something called "Full Self Driving" and "Autopilot". I mean, come on, at what point should marketers be held accountable for how the things they say will be reasonably interpreted?
This is probably a large underestimate given that Tesla tends to be very aggressive about not sharing all their data and not calling things Autopilot-related when they probably should.
That being said, even if we 10x this, I would guess that number is still pretty low. The comparison should not be relative to 0; the comparison should be relative to the number of equivalent human driving hours. How many Teslas on the road are out there, how many hours/miles have they driven, and how does that stack up against average human driving?
Obviously those stats can be gamed, but I'm generally inclined to say that as long as there isn't some crazy increase then its a step in the right direction.
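To make that comparison concrete, you normalize both death counts by exposure (miles driven) rather than comparing raw totals. A minimal sketch; every number here is made up purely for illustration, and `fatalities_per_billion_miles` is a hypothetical helper, not from any real dataset:

```python
# All figures below are invented for illustration -- not real crash data.
human_fatalities = 42_939   # hypothetical annual traffic deaths
human_miles = 3.2e12        # hypothetical total vehicle-miles that year

assist_fatalities = 17      # hypothetical deaths with driver-assist engaged
assist_miles = 9.0e9        # hypothetical miles driven with it engaged

def fatalities_per_billion_miles(deaths: int, miles: float) -> float:
    """Normalize a raw death count by exposure (miles driven)."""
    return deaths / (miles / 1e9)

human_rate = fatalities_per_billion_miles(human_fatalities, human_miles)
assist_rate = fatalities_per_billion_miles(assist_fatalities, assist_miles)

print(f"human:  {human_rate:.2f} deaths per billion miles")
print(f"assist: {assist_rate:.2f} deaths per billion miles")
```

The point of the parent comment is that only this kind of exposure-adjusted rate, not a raw count of incidents, tells you whether the system is a step in the right direction.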
> the comparison should be relative to number of equivalent human driving hours
Not necessarily. This would be true if autopilot made the same kinds of mistakes that humans do - the kinds of mistakes that other humans (other drivers, motorcyclists, bicyclists, pedestrians) are on the lookout for. That is to say, we're very experienced with the kinds of mistakes humans make and the conditions in which they make them, and so we can defend ourselves against them. If autopilot makes all-new mistakes under totally different conditions, then it removes that ability to defend oneself and leads to even more issues. We humans have a keen sense for how other humans will react in a given situation.
Do we have the ability to defend ourselves? How do you defend against a drunk driver? Lots of people die to random car crashes because they aren't able to defend themselves.
The promising thing I've always thought about self-driving is that while it may make stupid, unpredictable mistakes (drive into a flashing firetruck at 60 mph), it does this less often than a typical human driver makes typical human casualty-inflicting mistakes. Moreover, we can build and improve to eliminate firetruck ramming; we unfortunately have not been able to eliminate drunk driving.
We've been unwilling to eliminate drunk driving. Breathalyzer ignition lock? Problem solved. Self-driving cars are an extremely poor solution (complexity and cost) to solve this simple problem. People have fought tooth and nail to enable drunk driving - and they've won. Just like they're fighting and winning for mass shootings. Even so, drunk drivers are largely predictable. People know when it's not a good time to be on the streets.
> We also receive a crash alert anytime a crash is reported to us from the fleet, which may include data about whether Autopilot was active at the time of impact.
> To ensure our statistics are conservative, we count any crash in which Autopilot was deactivated within 5 seconds before impact, and we count all crashes in which the incident alert indicated an airbag or other active restraint deployed. (Our crash statistics are not based on sample data sets or estimates.)
> In practice, this correlates to nearly any crash at about 12 mph (20 kph) or above, depending on the crash forces generated. We do not differentiate based on the type of crash or fault (For example, more than 35% of all Autopilot crashes occur when the Tesla vehicle is rear-ended by another vehicle). In this way, we are confident that the statistics we share unquestionably show the benefits of Autopilot.
Every answer to your questions is on the link below:
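The counting rule quoted above can be sketched as a couple of simple predicates. This is only a rough reading of the quoted methodology, not Tesla's actual code; the `CrashEvent` field names are hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class CrashEvent:
    autopilot_active_at_impact: bool
    # Seconds between Autopilot deactivating and impact; None if it
    # was never engaged on this drive. (Hypothetical field.)
    seconds_since_ap_deactivated: Optional[float]
    airbag_deployed: bool

def is_reportable_crash(e: CrashEvent) -> bool:
    # Severity threshold from the quote: an airbag or other active
    # restraint deployed (which Tesla says roughly means >= 12 mph).
    return e.airbag_deployed

def attributed_to_autopilot(e: CrashEvent, window_s: float = 5.0) -> bool:
    # Counted against Autopilot if it was active at impact, or was
    # deactivated within `window_s` seconds before impact.
    return e.autopilot_active_at_impact or (
        e.seconds_since_ap_deactivated is not None
        and e.seconds_since_ap_deactivated <= window_s
    )
```

Note that the 5-second window only catches last-moment disengagements; a crash 30 seconds after handing back control would not count, which is one of the ambiguities people argue about.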
The Tesla safety report has never been a good-faith attempt at communicating their stats.
They compare Teslas to every single other vehicle on the road. The average Tesla owner is already in a demographic that would have dramatically lower accident rates than the average. The average age of a Tesla vehicle also dramatically lowers it. Then you add in the fact AP is skewed towards highway use and disengages in tricky situations...
Before you even factor in AP you're already starting with the picture Tesla wants to paint.
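The selection effect described above can be illustrated with made-up numbers: even a system exactly as safe as the average driver will look roughly twice as safe if its highway-heavy miles are compared against the whole fleet's road mix. Every rate and mileage share below is invented purely to show the shape of the bias:

```python
# Hypothetical crash rates per million miles, by road type:
fleet_rates = {"highway": 0.5, "city": 2.0}

# Hypothetical share of miles driven on each road type:
all_cars_mix = {"highway": 0.30, "city": 0.70}   # the general fleet
autopilot_mix = {"highway": 0.90, "city": 0.10}  # AP skews toward highways

def blended_rate(rates: dict, mix: dict) -> float:
    """Exposure-weighted crash rate for a given mileage mix."""
    return sum(rates[road] * share for road, share in mix.items())

# Same per-road safety, different road mix -> very different headline rate:
print(blended_rate(fleet_rates, all_cars_mix))   # baseline over all roads
print(blended_rate(fleet_rates, autopilot_mix))  # AP's highway-heavy mix
```

With these numbers the highway-heavy mix comes out more than twice as "safe" without the system being any better on any given road, which is why an apples-to-apples comparison has to stratify by road type (and driver demographics, vehicle age, etc.).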
There's no shortage of Tesla owners in topical HN threads saying they never/rarely use Autopilot for any number of reasons, ranging from it's bat-shit crazy to phantom-braking still being a frequent occurrence.
So you really need the number of Teslas being operated by autonomous functionality, not the gross number of Teslas on the road, to make any kind of judgement here. And even within that set, if you had the data, it needs to be normalized for operating in ideal conditions vs. regular cars in ideal conditions.
The autonomous problems have seen enough coverage that everyone is aware at this point. Even Joe Rogan has talked about not really using those features of his Tesla multiple times on his very popular JRE podcast. What proportion of Tesla owners have the cars just because they're the fastest and most social-statusy EVs on the market, despite the autonomous b.s. and just ignore those features?
We don't know how many Teslas on the road have the optional FSD package. You would need to use that number, not the number of Teslas on the road.
I don't know if that is a lot or not. 17 while self-driving was engaged sounds like a lot, though: not all Teslas have self-driving, and not every trip has it engaged.
Humans generally don't crash on the highway? You must live on a different planet. Every single day humans are doing absolutely batshit insane maneuvers on the highway. I can't recall a single time it's been a Tesla on autopilot that has done something that scared the shit out of me. It's really astonishing we let humans drive at all.
1. It's not the current production model.
2. The driver should have been paying attention.
3. We're not sure self-driving was on.
4. It's a beta version.
5. The correct measurement of hazard is some other metric that makes Tesla look better.
5. "No new technology has been fully accepted by society until it has killed someone".