I agree you don't want the LLMs to have correlated errors. You need to design the system so they maintain some independence.
But even with humans, the two reviewers will often be members of the same culture, share the same biases, and may even report to the same boss.
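A minimal sketch of what "maintaining some independence" could look like, assuming a generic `call_model` helper (a placeholder for whatever inference client you actually use, not any particular API): draw the two verdicts from different model families with independently worded prompts, and treat disagreement as the signal to escalate rather than averaging it away.

```python
def call_model(model: str, prompt: str, text: str) -> bool:
    """Placeholder: send `prompt` plus `text` to `model`, return its yes/no verdict."""
    raise NotImplementedError("wire this up to your own inference client")

# Independently worded prompts, so the two reviewers are less likely
# to share the exact same framing and blind spots.
PROMPT_A = "You are reviewer A. Does this change introduce a problem? Answer yes or no."
PROMPT_B = "You are reviewer B. Independently audit the change below and flag any problem."

def independent_review(text: str) -> str:
    # Different model families + different prompts to reduce correlated errors.
    verdict_a = call_model("model-family-one", PROMPT_A, text)
    verdict_b = call_model("model-family-two", PROMPT_B, text)
    if verdict_a and verdict_b:
        return "flagged by both reviewers"
    if verdict_a or verdict_b:
        return "reviewers disagree: escalate to a human"
    return "passed both reviewers"
```

This only reduces, not removes, the correlation: models trained on overlapping data can still fail the same way, which is the same caveat as two humans from the same culture reporting to the same boss.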