Having done some research into human attention, I have to agree with Hommel et al.: no one knows what attention is [1].
In current ANNs, "attention" is quite well defined: a weighting of some variables computed from other variables. But anthropomorphizing such concepts muddies things more than it clarifies. The same goes for calling interconnected summation units with non-linear transformations "neural networks".
But such (wrong) intuition-pumping terminology does attract, well, attention, so it gets adopted.
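To make the "weighting some variables based on other variables" definition concrete, here is a minimal sketch of scaled dot-product attention in NumPy. The names (queries, keys, values) follow the standard Transformer formulation; the shapes and random inputs are illustrative assumptions, not anyone's specific model.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # Similarity scores between queries and keys, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Each query gets a probability distribution over the keys/values:
    # this is the "weighting of some variables based on other variables"
    weights = softmax(scores, axis=-1)
    # Output: a weighted sum of the values for each query
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))  # 2 queries of dimension 4 (illustrative)
K = rng.normal(size=(3, 4))  # 3 keys
V = rng.normal(size=(3, 4))  # 3 values
out, w = attention(Q, K, V)
```

Nothing here involves a spotlight, selection, or awareness; it is a differentiable table lookup with soft weights, which is exactly why borrowing the word "attention" invites more intuition than the math supports.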
[1] https://link.springer.com/article/10.3758/s13414-019-01846-w