
Native and non-native (even if fluent) speakers recruit different areas of the brain when speaking or comprehending a language. This is now well known from fMRI studies. http://www.sciencedirect.com/science?_ob=ArticleURL&_udi... From the abstract: "Listening to comprehensible but non-native language seems to demand more networked co-processing." Another one: http://www.sciencedirect.com/science?_ob=ArticleURL&_udi...

Non-native speakers recruit more brain regions, suggesting that more "concepts/whatever" are being invoked in the formulation or parsing of a sentence. With this in mind, it would seem that even people who assume they are thinking in a certain language are only accessing a post-conceptual process (language formulation). Conceptual relations between objects or other concepts may still have been put in place in ways influenced by the language in which you picked them up.

For the same reasons as above, verbal encoding of empathy is probably not the succinct representation the brain uses; you just happen to have conscious access only to the verbal encoding. I would guess, though, that something like the Implicit Association Test would uncover more. http://en.wikipedia.org/wiki/Implicit_Association_Test
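For what it's worth, the IAT's core measure is simple enough to sketch. A minimal version, assuming the standard D-score form (difference in mean response latencies between the two critical blocks, divided by the pooled standard deviation); real IAT scoring also trims extreme latencies and penalizes errors, which this toy version skips:

```python
from statistics import mean, stdev

def iat_d_score(compatible_ms, incompatible_ms):
    """Simplified IAT D score from two lists of response latencies (ms).

    A positive score means the "incompatible" pairing was slower,
    i.e. a stronger implicit association with the "compatible" pairing.
    """
    pooled = compatible_ms + incompatible_ms
    return (mean(incompatible_ms) - mean(compatible_ms)) / stdev(pooled)

# Made-up example latencies, just to show the shape of the computation:
compatible = [650, 700, 720, 680, 710]
incompatible = [900, 950, 870, 920, 880]
print(round(iat_d_score(compatible, incompatible), 2))  # 1.84
```

The point being: the association shows up as a latency difference you can't introspect on, which is exactly why it can reveal more than the verbal encoding you have conscious access to.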

As for your Spanish, you will stop using the English-to-Spanish pre-processing crutch once you get more fluent (with practice and repetition).
