
    > All it takes is for something non-routine to happen, such as a car ahead reacting to an animal, or swerving as the driver reaches for something or spills coffee on themself, or a wheel coming off a car (I've seen it happen to a car in front of me), or a car crossing the center median in the opposite direction (which left my ex-boss hospitalized for 6 months)
Inherent in this statement is the assumption that in such events a human would necessarily do better than the machine. Each of these is an extreme case, and I doubt that most human drivers would be able to react in time to avoid an accident or damage.


Humans can make assumptions about humans better than machines can. For example, a trailer in front of me had a tire blow out and was rapidly slowing. I checked my mirrors and was being tailgated too closely to hit the brakes. There was a car in the lane to my right, and another in that lane just behind me.

I merged hard right before it was clear. I assumed the car on my right would do the same, and that the driver further behind would brake in time. They either made the same assumptions as me or took my cues and acted accordingly, and everything turned out OK.

It is pretty amazing how people can coordinate on-the-fly.


It depends on how quickly it happens, and how closely they are paying attention. We all know those accident-prone drivers who are never "at fault" - just very bad at avoiding accidents!

However, the human has the major advantage of having a brain and being able to understand the consequences and potential outcomes of a situation in real time as it unfolds. I doubt most autopilots would understand the situations I mentioned - certainly not unless they were specifically pre-programmed/trained into the system. Would an autopilot even see what is going on inside another car if the driver is bending down below the dashboard, or fighting with a passenger, for example?


Here's the thing: the technology will get faster and better.

Your average driver is already at the limits of human capabilities with no room for improvement unless they go train to drive like F1 racers. So comparing humans to the current state of tech seems silly given that the tech will assuredly one day be faster than a human.


The F1 example is totally wrong. The F1 controller runs at 10 kHz hard real-time, the road-car controller at 500 MHz, and a human driver can react in 30 ms at best. An F1 driver is nothing compared to the controller, whilst the normal car driver has comparable reaction times.
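
As a rough back-of-the-envelope check (a sketch only; the 10 kHz loop rate and 30 ms reaction time are the figures claimed above, not measurements):

    # Back-of-the-envelope latency comparison, taking the figures above
    # at face value (10 kHz control loop, ~30 ms human reaction time).
    f1_loop_hz = 10_000              # claimed F1 controller rate, hard real-time
    human_reaction_s = 0.030         # claimed best-case human reaction time

    controller_period_s = 1 / f1_loop_hz   # 0.0001 s, i.e. 0.1 ms per cycle
    ratio = human_reaction_s / controller_period_s
    print(f"controller cycle is ~{ratio:.0f}x shorter than a human reaction")
    # -> controller cycle is ~300x shorter than a human reaction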

You cannot train an F1 driver for highly dynamic events at 20k rpm and 4,000 Nm of force at the axles; you need automated controllers for that. You can train him for simple things, like gear changes (5 ms) and hitting the braking points right. But an AI will be at least 10,000 times better than a human at this.

You need a slow human brain for the stupid mistakes instead.


> Your average driver is already at the limits of human capabilities with no room for improvement unless they go train to drive like F1 racers.

This is nowhere near true. The skill range from the average driver (who knows nothing about car control or reacting correctly) to F1 drivers is vast. There is a huge gray area between these extremes, and safety would improve if people chose (or were forced) to get some training.

Even something as basic as taking a car-control clinic once a year would improve the average driver's skill and safety by a huge amount.


I don't think response time is the issue here - at the moment the computer doesn't have the intelligence or learning ability of a human, so the human is going to be safer in situations the computer was not programmed to handle. I expect we will eventually get human-level AI (though maybe not very soon), and if that is used for the autopilot, then the computer could have the safety edge of faster reaction times, provided the in-car hardware is fast enough.


> the tech will assuredly one day be faster than a human

I'll be happy to take it seriously after that happens. In the meantime, I'm sticking with my belief that unreliable automation is worse than no automation.


> Your average driver is already at the limits of human capabilities with no room for improvement unless they go train to drive like F1 racers.

You just dreamed up this statement and immediately believed it.


The question is: who is liable for the accident in these circumstances? Because, to date, the legal situation is such that you would be heavily advised to keep your autopilot off.


Mercedes takes liability for it. That's their marketing pitch for the feature.


People tend to muddle through.

If we had the same lawyers available that Tesla does, humans too would not be responsible for much of anything.



