I'm a huge proponent of ADTs as a more comprehensible way to write code than some of the alternatives. But I do have to agree with you that there isn't really evidence that they're a basic mental model.
However
What we do see is a bunch of mathematical disciplines that end up creating properties like AND, OR, Universal, Existential, Implication, and a few others. They show up in places like set theory, type theory, category theory, various logics, lattice theory, etc.
Now, maybe they're only copying one another and this is more of a memetic phenomenon. Or maybe they've hit upon something that's important for human comprehensibility.
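For what it's worth, here's a hedged Haskell sketch of how those same connectives surface as type formers, in the usual Curry-Howard reading (the names And, Or, Implies are mine, just for illustration):

    -- AND: a product type -- evidence for both halves.
    type And a b = (a, b)

    -- OR: a sum type -- evidence for one half, tagged with which.
    type Or a b = Either a b

    -- Implication: a function -- turn evidence of a into evidence of b.
    type Implies a b = a -> b

    -- Universal: parametric polymorphism -- this works "for all" types a.
    identity :: a -> a
    identity x = x

    -- (Existentials show up too, via the ExistentialQuantification extension.)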
That recurrence would be the 'evidence' of the positive effect of ADTs (scare quotes because it might just be math memes and not fundamental). But we can also think about what I feel is legit evidence for the negative effect of lacking ADTs.
Consider what happens if, instead of having the standard boolean logic operators and, or, not, xor, we only have the universal not-and operator. Now a straightforward statement like A && B || C becomes (((A !& B) !& (A !& B)) !& ((A !& B) !& (A !& B))) !& (C !& C). It's more complicated to tell what's actually supposed to be going on, AND the '&&' simulation can get intertwined with the '||' simulation. The result is that requirements changes or defect fixes end up modifying the object-level expression in a way where there is no longer any mapping back to standard boolean logic. Comprehensibility approaches zero.
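To make that concrete, here's a minimal Haskell sketch (the names nand, direct, expanded are mine, purely for illustration) of the same A && B || C written directly and as its nand-only expansion:

    -- The only primitive we allow ourselves: not-and.
    nand :: Bool -> Bool -> Bool
    nand a b = not (a && b)

    -- Rebuilding the usual operators out of nand alone.
    notN :: Bool -> Bool
    notN a = nand a a

    andN :: Bool -> Bool -> Bool
    andN a b = nand (nand a b) (nand a b)

    orN :: Bool -> Bool -> Bool
    orN a b = nand (nand a a) (nand b b)

    -- "A && B || C" written directly...
    direct :: Bool -> Bool -> Bool -> Bool
    direct a b c = a && b || c

    -- ...versus mechanically expanded into raw nands (the expression above).
    expanded :: Bool -> Bool -> Bool -> Bool
    expanded a b c =
      nand (nand (nand (nand a b) (nand a b))
                 (nand (nand a b) (nand a b)))
           (nand c c)

    -- Both agree on all eight inputs:
    -- and [ direct a b c == expanded a b c
    --     | a <- [False, True], b <- [False, True], c <- [False, True] ]
    -- ==> True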
And we've seen this happen with interfaces and inheritance being used to implement what would otherwise be a relatively simple OR property (with the added benefit that pattern matching on ADTs often comes with totality checking, which isn't something you can do with interfaces, since an interface can always gain another implementation, up to and including ones loaded at runtime).
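As a hedged sketch of that last point (the Payment example is invented, not something from this thread): a closed sum type plus pattern matching gets an exhaustiveness check from the compiler, which an open interface hierarchy can't give you.

    {-# OPTIONS_GHC -Wincomplete-patterns #-}

    -- A simple "this OR that OR the other" expressed directly as a sum type.
    data Payment
      = Cash
      | Card String        -- card number
      | Transfer String    -- account id

    describe :: Payment -> String
    describe p = case p of
      Cash       -> "paid in cash"
      Card n     -> "paid by card " ++ n
      Transfer a -> "paid by transfer to " ++ a
      -- Delete any branch above and GHC warns that the match is
      -- non-exhaustive. With an interface/base-class encoding there is
      -- no closed set of cases to check against, since a new
      -- implementation can always show up later, including at runtime.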
Appearing in symbolic reasoning tools we have invented doesn't really support them being how brains work, though? This is akin to saying that gears are how nature works because gears are everywhere in how we build things. I could maybe buy that with "friction" being a fundamental thing, but it feels like a stretch for the other.
Now, I should add that I did not mean my question to be a criticism of them! I'm genuinely curious about evidence that they are a basic building block. It feels safe to say they are a good building block, and those aren't the same thing.
As an easy example of them not being basic building blocks, I can't remember ever seeing anything like them in any assembly instructions for things. Put together a batting net for the kids: lots of instructions, but nothing algebraic in this sense. Looking at recipes for food: nothing algebraic, really? Maybe I can squint and see some, but it would be hard. Exercise plans? Music lessons? Playbooks for a sport?
Again, though, I /do not/ intend this as a criticism of them. Genuinely curious on any investigation into them.
I think you're right to point out that it's too strong a claim to say that sum types are a basic building block of thought, although I believe they are very useful in coding regardless of that claim.
There is still the ongoing debate about how much human perception and human reason are shaped by cultural forces vs. universal forces (where the latter asserts that humans reason in the same or similar ways).
There's evidence, for example, that certain optical illusions don't work across cultures (I seem to remember that people in Western countries have a tendency to mentally group things into rectangular boxes). The exact balance between cultural and universal forces isn't known, and I doubt we could say anything about sum types in that regard.