
You seem to think Tesla’s system is perfectly safe. I’ve seen nothing that indicates that—just the opposite, in fact.



Could you reference an FSD safety incident?

I know there have been some autopilot accidents but that’s not this system.


Do a web search for “Tesla fsd beta scary” for video examples.


The only examples I could find were of people uncomfortable with the automation, but not scared of an accident. Also no actual accidents. Do you have a direct link?


My contention is only indirectly with the safety of the system, and mostly with the labeling and messaging. A cruise control system that is unsafe to leave unattended but isn't called "Fully Self-Driving" or even "Autopilot" is fine. There's no ambiguity. Everyone knows that cruise control is a dumb system that will happily crash you into the wall.

However, I take issue with calling and marketing something "Fully Self-Driving" and then adding paragraphs of text explaining that it's actually not fully autonomous, in fact not supposed to be autonomous at all: the driver must keep their full attention on the road and hands on the wheel, and any other mode of operation is dangerous. If this were something benign, I'd just consider it disingenuous, shady marketing (which seems par for the course for corporations), but in the case of a two-ton vehicle it's not just their own customers at risk, but everyone around them.

Yes, you and I know what this system really is and can approach it responsibly. But we see so much evidence that many people have no scruples (willfully or not) about ignoring this, assuming that a system called "Fully Self-Driving" must be good enough to let them unpack a lunch in their lap.




