There are major financial, legal, and security ramifications for clients in many cases. An AI that can't properly deal with ambiguity and hallucinates an outright reckless idea 1% of the time is completely unacceptable.
Writing code, sure. A human ultimately reviews it. I suspect in the legal world a lot of legal writing can also be automated to some degree. But strategic decisions, designs, etc. very much need a human pulling the trigger.