
This is by far the correct outcome. Given how long Tesla has been giving users the impression that the software would be superhuman (since 2013 with "Autopilot", and separately since 2016 with "Full Self-Driving"), they're overdue for the consequences, even if in this case they were only found partially liable.

The driver took his fair share of the blame for actually making the decision, but the driver's expectation that a feature called "Autopilot" would intervene is exactly the kind of marketing gimmick that needs to be penalized.

It shouldn't have taken injury and death to get to this point.



This article is about Autopilot, which is separate from FSD and is essentially equivalent to cruise control in this context.


So why is it called Autopilot if it's just cruise control? And what about "automatic piloting" doesn't include the part where the thing drives itself?



