>In a lot of contexts, whether someone leaned on an LLM--lightly or heavily--is sort of irrelevant
This isn't one of those contexts. It's called peer review for a reason. You don't get to outsource your duty to a machine or some random person. It's explicitly you that others have vested their trust in.
>The output is either good/reasonable or it's not.
In the world of human beings, that isn't the only thing that matters. Reminds me of Zizek, who pointed out that the end result of the "AI revolution" isn't going to be machines acting like humans, but the reverse: humans LARPing as machines. Humans as obtuse as robots, rather than the other way around.