
The comparison with programming languages is extremely misleading because, unlike natural languages, programming languages do differ in expressive power. An algorithm coded in Haskell or Lisp may well take 5x less code than an equivalent implementation in C or Java. On the other hand, when a text is translated from one natural language to another, the result is typically about the same length, and typically there is even an almost one-to-one mapping between individual sentences.
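To make the programming-language half of that concrete, here is the usual kind of example (a sketch, not a benchmark): a functional quicksort in Haskell fits in a couple of lines, while an equivalent in-place version in C or Java needs explicit loops, index arithmetic, and swaps, and typically runs several times longer.

    -- Quicksort over any ordered type; an in-place imperative equivalent
    -- in C or Java is usually several times longer.
    qsort :: Ord a => [a] -> [a]
    qsort []     = []
    qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]

    main :: IO ()
    main = print (qsort [3, 1, 4, 1, 5, 9, 2, 6 :: Int])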

To anyone interested in the relationship between language and thinking, I recommend an excellent book, "The Stuff of Thought" by Steven Pinker. One chapter is devoted to the Sapir-Whorf hypothesis, or more specifically, to debunking it.



Two things. First: again, I don't espouse the strongest form of Sapir-Whorf. The interpretation of differences in speed of access to concepts, based on spreading activation, is solid, however. When I learned Sapir-Whorf we were given the original interpretation and a brief account of how it has been reinterpreted more recently -- but it still goes by the name of Sapir-Whorf. I am not making a case about the possibility of expression; I am making a case about the salience of expressions in different languages. It should be easy to show this: take two languages, find two words of identical meaning, and measure the response speed to comprehension of each word. All other things being equal, the higher-frequency word will be accessed faster. Hence, if one culture says "apple" more often than "orange", apple is more salient in that culture, and you would then expect more efficient access to apple-related concepts.
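As a rough sketch of the effect I mean (the direction -- more frequent words are retrieved faster, roughly with the log of frequency -- is well established; the numbers below are made up):

    -- Toy illustration of the word-frequency effect, not a fitted model:
    -- lexical decision time falls roughly with the log of corpus frequency.
    -- Intercept and slope are invented for illustration.
    responseTimeMs :: Double -> Double
    responseTimeMs freqPerMillion = 700 - 40 * log freqPerMillion

    main :: IO ()
    main = do
      -- Hypothetical frequencies: a culture that says "apple" far more often
      -- than "orange" should show faster access to "apple".
      putStrLn $ "apple  (200 per million): " ++ show (responseTimeMs 200) ++ " ms"
      putStrLn $ "orange (  5 per million): " ++ show (responseTimeMs 5)   ++ " ms"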

Second, what makes you think that natural languages do not differ in expressive power? Certain things are more easily said in one language than in another, just as some constructs are easier to express in one programming language than in another, even though both can produce the same results.

I also don't know where you get "typically the same length" from. Steven Pinker wrote about the relative length of English in one of his books. I don't remember which language he was comparing it to, but English came out longer. Pinker explained this along the lines of English being more "fault tolerant" because it has more "redundancy" (my words).


This is probably more a function of the fact that natural languages change over time, so if a common task is hard to talk about, new forms appear that make it easier. Programming languages see very little change, and changes that introduce ambiguity or break backwards compatibility are rarer still.

Stuff like this is why I became interested in PL theory in the first place actually.



