I think it's probably a good idea that GPT4 avoids legal or psychological tasks. Those are areas where giving incorrect output can have catastrophic consequences, and I can see why GPT4's developers want to avoid potential liability.
I wasn't asking for advice. I was in fact exploring ways to measure employee performance (using GPT as a search engine) and criticized one of the assumptions in one of the theoretical frameworks based on my own experience. GPT retorted "burnout", I retorted "harassment grounded in material evidence", and it spat out a generic block about how it couldn't provide legal or psychological guidance. I wasn't asking for advice; it was just one point in a wider conversation. Switching to GPT-4-0314, it proceeded with a reply that matched the discussion's main topic (metrics in HR); switching back to GPT-4-0613, it output the exact same generic block.
Yes, considering those are fields where humans have to be professionally educated and licensed, and also carry liability for any mistakes. It probably shouldn't be used for civil or mechanical engineering either.
It's ridiculous. We should have access to a completely un-neutered version if we want it; we should just have to sign for access if they're worried about being sued.