
Having done some research into human attention, I have to agree with Hommel et al.: no one knows what attention is [1].

In current ANNs, "attention" is quite well defined: a scheme for weighting some variables based on other variables. But anthropomorphizing such concepts indeed muddies things more than it clarifies. The same goes for calling interconnected summation units with non-linear transformations "neural networks".
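To make the ANN sense concrete, here's a minimal numpy sketch of scaled dot-product attention, the mechanism usually behind the term in transformer papers. It's purely illustrative; the function and variable names are my own, not from any particular library:

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Q: (n_queries, d), K: (n_keys, d), V: (n_keys, d_v)
        # Weight the rows of V by how well each query matches each key.
        d = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d)  # similarity of every query to every key
        # softmax over keys: each query's weights sum to 1
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V  # weighted sum of the values

    # toy example: 2 queries attending over 3 key/value pairs
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(2, 4))
    K = rng.normal(size=(3, 4))
    V = rng.normal(size=(3, 4))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4)

That's the whole trick: one set of variables (queries and keys) determines the weights applied to another set (values). Nothing about it requires a cognitive reading of "attention".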

But such (wrong) intuition-pumping terminology does attract, well, attention, so it gets adopted.

[1] Hommel, B., et al. (2019). "No one knows what attention is." Attention, Perception, & Psychophysics. https://link.springer.com/article/10.3758/s13414-019-01846-w
