I've actually met plenty of people who say "oh, they just need to have fewer accidents than humans and we're good".
No, that's absolutely not true. Take a recent example: someone made a Raspberry Pi-controlled insulin pump. Insulin is incredibly dangerous if you get the dose wrong, so building an insulin pump on hardware that doesn't conform to the highest safety standards is simply not acceptable. You know what the person behind it said when this was pointed out? That it doesn't matter, because the Raspberry Pi is still going to kill fewer people than currently die from incorrect injections due to tiredness or simple mistakes. That reasoning doesn't hold - even if such a machine lowered the overall number of deaths from incorrect insulin injections, no one would ever allow it onto any market. No company would ever argue its way out of "a poor hardware choice led to the death of Mr. Smith" by saying "hey, but actually, our machine kills fewer people than would die naturally from similar causes each year, so you can't hold us accountable, right??".
To me, it's the same with autonomous cars - they cannot merely "have fewer accidents than humans". They need to have zero accidents, or they won't be acceptable. That's why the bar is so high. If you're in a situation where the choice is between hitting a pedestrian or driving under a semi and possibly killing everyone in your car, no one is going to blame you for doing either - our primitive brains will go with whatever seems most logical in the moment, and you can blame anything on adrenaline. Computers don't have that luxury. They have to make a calculated choice - and then whoever makes them (the computers) has to live with that choice. The computer chose to hit the pedestrian - now the company that wrote its code is being sued for millions. No matter how they frame it, that's not a situation anyone wants to be in. Of course I'm going off into theoreticals here, since we don't actually have this problem yet. But I'm sure it will become a real problem, and it will need to be solved one way or another before widespread adoption.
It's more an economic issue than a moral or technical one.