
It only has to be less likely to cause that issue than a paralegal to be a net positive.

Some people expect AI to never make mistakes when doing jobs where people routinely make all kinds of mistakes of varying severity.

It’s the same as how people expect self-driving cars to be flawless when they think nothing of a pileup caused by a human watching a reel while behind the wheel.



In the pileup example, the human driver is legally at fault. If a self-driving car causes the pileup, who is at fault?


My understanding is that the firm operating the car is liable in the fully self-driving case of commercial vehicles (Waymo). The driver is liable in supervised self-driving cases (a privately owned Tesla).


Well, maybe its wheel fell off.

So, the mechanic who serviced it last?

...

We don't fault our tools, legally. We usually also don't fault the manufacturer or the maintenance guy. We fault the people using them.


Any evidence it's actually better than a paralegal? I doubt it is.



