Dunning-Kruger and other memes (2015) (danluu.com)
243 points by ikeboy on June 27, 2016 | 143 comments


"The less someone knows about a subject, the more they think they know." I think that's an exaggeration and understood by most people to be one.

What is, however, absolutely clear from the original study is that incompetent people have a highly inflated assessment of their own abilities: in all four experiments (!), people whose actual performance was around the 10th percentile rated themselves at the 50th to 70th percentile. This is perfectly in line with eg. Urban Dictionary's definition[1]:

"A phenomenon where people with little knowledge or skill think they know more or have more skill than they do."

http://www.urbandictionary.com/define.php?term=Dunning-Kruge...

[1] Chosen intentionally because this is a "popular" source, not an academic one.


One thing about Dunning-Kruger that I've never heard mentioned is self-esteem affecting one's answer. As in, if you explicitly mark a low score for yourself you're both divulging to others and admitting to yourself that you think you're a failure in that category. So imagine you're presented with one of these forms... what incentive/preventative is given to not let your ego's self-preservation instincts kick in and put you down as "average" (50%+) for anything and everything?

Basically, I can't imagine how you'd account for the societal matter of bravado, and I posit that it has an influence on the outcome of DK experiments. E.g. if you tried to subvert bravado and had some kind of reward at the end to say "Hey, if you successfully guess your percentile for a given skill we'll give you a cookie" (which of course would incentivize you to put "0%" and intentionally bomb the test.)


They didn't ask people for their evaluations before the test; they asked after the test, and after it had been scored, to see where people thought their score ranked relative to the rest of the people who had taken the test.

So, people already knew how many things they had gotten right and wrong when making their judgement. They just judged that most other people had also gotten a similar number wrong.


They can't possibly know the makeup of other test takers. I could be 95th percentile of the total population but still 10th percentile in a test room.


Congrats, you've discovered the other half of the Dunning-Kruger effect. The smartest people in the test will assume others might know more than they do, because they know enough to know they aren't experts.


So?


It doesn't matter what causes the wrong estimate. It's still someone with a lack of ability presenting himself as if he has ability, which he will also do, for the same reasons, when more is at stake.


Maybe you could ask participants to rank their proficiencies in a collection of skills. Then they could be tested on some or all of those skills to see if their ranking is correct.


What I think is most interesting about D-K is how it is referred to as universal, as an aspect of human psychology in general, but in fact the original studies were done in America on American college students and give quite different results when repeated in other cultures.


Do you have sources on that? If you read the article, it claimed that that might not be true.


Ah, actually, no, not beyond the one dismissed in TFA.

I'd have to modify my previous post, then: "What I think is most interesting about D-K is how it is referred to as universal, as an aspect of human psychology in general, but in fact the original studies were done in America on American college students".

As far as I can tell, TFA does not seem to claim that there isn't a cultural dimension to D-K, just that we don't know what it is because no studies are cited. I would still say that it is doubtful that the studies on Cornell students generalize to the entire population of the world when it comes to these things...


Do not forget that this very "self esteem" thingy is almost exclusively American. The other nations do not even think in such terms.


I can't even process what you're saying. What culture doesn't acknowledge self respect, and one's place in society relative to others?

Certainly there are small communities everywhere (even in America -- after all, they were prolific in America's infancy) that sought the creation of hyper-communal, idyllic towns... but... those never scale because, well, people are selfish. But still. I just can't process what you're saying. Maybe I'm blinded by offense.


> What culture doesn't acknowledge self respect

But we're talking about self-esteem, which everyone is supposed to get for free. Self-respect, like the respect of others, must be earned. Equating the two is a quintessentially modern-American mistake (I'm a modern American, and I made this mistake myself for years), and I think that's what 'sklogic calls out above.


> What culture doesn't acknowledge self respect, and one's place in society relative to others?

What culture (besides the Northern Americans) would so blindly equate self respect to self esteem? The others understand better that you can respect yourself even without the overblown, unrealistic views on your own abilities and virtues.


> The others understand better that you can respect yourself even without the overblown, unrealistic views on your own abilities and virtues

I don't think self esteem means what you think it means.


>>Do not forget that this very "self esteem" thingy is almost exclusively American. The other nations do not even think in such terms.

This over-inflated sense of importance, or "self esteem thingy", may be exclusively American, but not anywhere close to all Americans subscribe to it. There are 330+ million people of various ages and backgrounds living in the US. I don't see how painting with such a broad brush like you have been does anything but inflame debate.

My question is this though, as much as people like bandying DK around, has it not been replicated? Specific to you, has the effect not been tested anywhere outside North America?


You're saying Americans are the only people in the world who care about themselves? Other countries are full of purely rational actors who attach no emotional value to their own abilities? Have you ever met a real person outside of America?


> You're saying Americans are the only people in the world who care about themselves?

No, I'm saying that only Americans are so conditioned to value their "self esteem". There is no emphasis on this stupidity in the other cultures.

It's all about the emphasis: http://faculty.washington.edu/jdb/articles/Cai%20et%20al.%20...


Ok, only Americans have over-inflated self esteem, that's a different statement. Still wrong, but at least it's a common stereotype.

And I don't think that study supports your comment at all.

> Supporting this contention, our Chinese participants reported liking themselves every bit as much as our European American participants.

> This finding supports our claim that cultural differences in self-esteem arise from cultural differences in self-evaluations, with people from East Asian countries evaluating themselves less positively than people from Western countries.

> Thus, even though cognitive self-evaluations are lower in China than in America, they are not less predictive of global self-esteem. This finding suggests that global self-esteem is experienced similarly across dissimilar cultures

This study finds that self esteem is equally prevalent across the two cultures, and as far as I can see says nothing about how much the participants value their self esteem.


> Ok, only Americans have over-inflated self esteem,

Again. Only Americans are conditioned to value and cherish their self esteem. That was my statement.

Other cultures do not put any emphasis on self esteem (and many consider it shameful even to talk about).


Fine, I would be interested to see evidence of that if you have any. The study you linked was not related, and that doesn't match up with my own experiences with non-Americans.


The study is related - it shows an attitude to self esteem, while you're apparently trying to compare the self esteem itself.


Do you have a quote that describes that? Because that's not what I saw at all. The basic summary is that while Chinese people tend to evaluate their skills more modestly than Americans, their emotional self esteem is just as high. I didn't see anything analyzing how they value their self esteem, or how it affects their behavior.


What does that mean?


Americans are conditioned from early childhood to value their "self esteem". The others do not even care.


The way I've come to see it is "The less you know about a subject, the less you think you don't know."

In other words, you have no solid grasp on the field and so cannot make an effective guess at your ability in the field - so go for middle of the road and that's possibly your best statistical guess...


In a way, it's pretty intuitive. Beginners are looking down the rabbit hole and think "well, I've got this far, so it can't go that much deeper, can it?" They've got a CRUD app up and running and doing stuff - I remember I felt pretty invincible at that point.

A prime example from experience: "oh I know development. I use PHP includes all the time".

The biggest humbling exercise I do is to think back 6 months, 1 year and so on. I remember how much I thought I knew and what I produced (some stuff I'm still maintaining too, which is humbling in and of itself). Realising how much I've learned between those times and realising I've barely made a dent both shows me how little I know and lights the "I must learn more" fire.


While on the other end of the scale:

"The more you know about the subject, the more you don't know."

Which explains the upper quintiles topping out at a 70th-percentile mean self-rating.


> What is, however, absolutely clear from the original study is that incompetent people have a highly inflated assessment of their own abilities: in all four experiments (!), people whose actual performance was around the 10th percentile rated themselves at the 50th to 70th percentile.

I'm not sure this result is as surprising or as explanatory as D & K advocate.

If I ask you to estimate anything you have no experience measuring, then you'll probably ballpark around the average, maybe drift a bit above or below average based on how favorable you feel towards the thing you're measuring. That's exactly where everyone lands in the responses.

It's not like people in the 10th percentile are suddenly rating themselves in the 99th percentile or anything, most people are taking a blind guess that they're doing ok, pulling their weight, just a bit above the middle of the pack.

It shouldn't be surprising that this isn't correlated with performance, because "performance" and "evaluating the entire field of performers" rely on different skills and knowledge. You can be good at a thing and have never judged anyone else to rate them, let alone judged a sufficiently random sample of the population to get a sense for the average skill there.

We shouldn't have ever expected these things were correlated to begin with.

If you want to improve self assessment, don't just become more humble and self reflective, don't practice the subject in question, just go watch several random performers. Ranking what quartile you're in relies as much on knowing how random people do as it does on self evaluation.


I'm sure there are certain fields where the amount you realise you do not know is proportional to how much you do know.


> The pop-sci version of Dunning-Kruger is that, the less someone knows about a subject, the more they think they know.

Author's take on Dunning-Kruger is a strawman. I haven't seen that version be a "meme". I also dislike this single-word rejection, the calling it a "meme", of how people talk about DK; it's like name-calling or something. Unwarranted and arrogant dismissal. It's one thing to be wrong; it's another to be wrong and then also haughty about it. I feel like most people that I've seen bring DK up understand what the implications of it are. My favorite thing I've read about it is that the less competent you are in something, the less able you are to gauge competence in that something.

> In two of the four cases, there’s an obvious positive correlation between perceived skill and actual skill, which is the opposite of the pop-sci conception of Dunning-Kruger. A plausible explanation of why perceived skill is compressed, especially at the low end, is that few people want to rate themselves as below average or as the absolute best.

In one sentence he's declaring someone else's opinion on it as pop-sci, then offers his own similarly silly take on it. Oh wait, he showed some charts. I did like seeing that people with more skill saw themselves as better than people with less skill, but conjecture on what people want to think of themselves? That's pop-sci.


> Author's take on Dunning-Kruger is a strawman.

That wasn't the author's take on DK, that was the author's take on the pop sci version of DK, very different. It is definitely a straw man, one which David Dunning has also publicly criticized, so the author's in good company.

Here's an episode of This American Life where Dunning claims that he created a meme (his word), but that people misuse DK as an epithet targeted against stupid people, missing the fact that this applies to all of us:

http://www.thisamericanlife.org/radio-archives/episode/585/t...

If you want to see how most people talk about DK, do what David Dunning does: run a quick search on Twitter. I don't think the median depth of understanding of DK is quite as high as you expect...


>Author's take on Dunning-Kruger is a strawman. I haven't seen that version be a "meme".

The latter doesn't make it a strawman. Perhaps you just haven't read the same posts/comments the author has read -- or as many.

Here:

1) Top voted definition from the urbandictionary: "A phenomenon where people with little knowledge or skill think they know more or have more skill than they do."

2) Article on OpenCulture website, titled "John Cleese on How “Stupid People Have No Idea How Stupid They Are” (a.k.a. the Dunning-Kruger Effect)".

3) Article in the online outlet unbiased.co.uk: "This behavioural concept (the discovery of which won the Ig Nobel Prize in 2000) describes the tendency of people who know very little to believe they know a lot. "

4) LinkedIn post: "Charles Darwin once said that ignorance tends to beget more confidence than knowledge. In a nutshell, it explains the Dunning-Kruger effect, which is a cognitive bias where incompetent individuals tend to overestimate their skill, cannot perceive the magnitude of their own inadequacy and will admit to their lack of ability only when they are trained in that particular skill."

5) Another popular blog: "The Dunning-Kruger Effect: Are the Stupid Too Stupid to Realize They’re Stupid?"

6) "The Dunning-Kruger effect represents a cognitive bias under the influence of which relatively unskilled individuals suffer from illusory superiority, mistakenly assessing their ability to be much higher than it really is."

I could go on for ages...

>I feel like most people that I've seen bring DK up understand what the implications of it are.

You'd be surprised.

I actually don't get your comment; it's like you feel defensive and are trying to defend a psychological finding from being called a "meme".

First, it's not like the DK effect is a person and will feel bad.

Second, the internet is rampant with people who casually mention the DK effect in comments and posts and articles while not understanding fully what it is about.

>In one sentence he's declaring someone else's opinion on it as pop-sci, then offers his own similarly silly take on it. Oh wait, he showed some charts.

Some charts based on the original research, and more faithful to its conclusions and reporting than the pop-sci adaptation the author talks against. So?


It is a strawman. Your quotes don't say what the author wrote: "the less someone knows about a subject, the more they think they know". And again later: "In two of the four cases, there’s an obvious positive correlation between perceived skill and actual skill, which is the opposite of the pop-sci conception of Dunning-Kruger." He is specifically talking about an expectation of an inverted relationship - a negative correlation - and his whole argument is based on that.

By his definition, the "meme" would mean people believe that those who are among the worst in the world at something would estimate their ability the highest, and those who are among the best would estimate their ability the lowest (on average).

Do you really think people believe this? That Olympic gold medalist swimmers would put themselves in the first percentile of fastest swimmers, while people who've never entered water would put themselves in the 99th (well, not necessarily 1st and 99th - maybe it's 3rd and 97th, but the medalists would choose the lowest number, and the non-swimmers the highest)? I don't think I know a single person who believes this.

I have huge respect for Dan Luu but he screwed up here and created a classic strawman.


The OP quotes diagrams that clearly show that incompetent people tend to overestimate their competence, and the most competent people tend to underestimate their competence. Almost all your quoted definitions match that data, so I don't think they're incorrect.

The strawman the OP is trying to dismantle is that people's self assessment declines as they gain competence. Apparently not; from that data self assessment is mostly flat. But a flat line (self assessment) intersecting with a rising line (competence) will still show the symptoms of the common definition of DK.


Maybe I am just misunderstanding the point you are trying to make, but none of the six examples you are citing is obviously getting the Dunning-Kruger effect wrong. They are all along the line that people with lower skills overestimate their skills, none obviously claims the wrong thing, i.e. that people think their skills exceed those of people that are actually more skilled.


> The internet is rampant with people who casually mention the DK effect in comments and posts and articles while not understanding fully what it is about.

Sort of a meta-DK effect with regard to people's understanding of how DK really works, as it were.


Re strawman:

I saw someone making a very similar point to the one this post critiques, which reminded me of this post and prompted me to post it. It's definitely out there as something people believe.

See e.g. this comic http://www.smbc-comics.com/?id=2475

Edit: or this one http://orig04.deviantart.net/6527/f/2012/283/9/7/dunning_kru...


The SMBC comic does not actually claim to illustrate the Dunning-Kruger effect.


No, but it makes the same claim that the post here says is popular and wrong.


"Willingness to opine" is not identical to "self assessment of capability".

I am also not sure any SMBC comic can be reasonably considered to be making a claim of anything...


In addition to coldtea's examples, DK is particularly thrown around in competitive gaming communities, such as Reddit/Twitch around Dota 2 and League of Legends. I admit that it may not be as prevalent in the mainstream as other popular memes, but it's definitely there.


I've seen its use in those communities also. Usually the person applying it thinks highly of their ability to identify a DK scenario. And, if someone professes not to understand something, DK is rarely applied in their favor. Bringing it up is effectively just a meaner way to say, "I think you're wrong."


    > Usually the person applying it thinks highly of their
    > ability to identify a DK scenario.
This is a critical component of the DK epithet, often accompanied by a tacky link to the wiki page.

https://hn.algolia.com/?query=dunning%20kruger&sort=byPopula...


This article touches on one of the highest-payback practices I've developed over the last few years: going to the original sources. I am constantly rewarded by this; typically finding out that the downstream analysis misunderstood some aspect, latched on to only a fraction of the whole story, or willfully misrepresented it by speculating on absent data or by inserting a plausible narrative for items that fit a private agenda.


It's very scary.

Almost every time I check something in the news or public opinion, it's either wrong, misinterpreted or too simplified. Often it's just people talking from their ass. It's amazing how we can work as a society.


"Almost every time I check some thing in the news or public opinion, it's either wrong, misinterpreted or too simplified"

I'd attribute that to the decline of real (read: professional) journalism, not an actual decline of society. People have been talking from their asses probably for as long as they could speak, but the Internet allowed them to bypass the usual bullshit meters.


It may be worth reading "The Chief", a bio of William Randolph Hearst then. Follow that with outright political slander in newspapers in the 19th Century.

Even Walter Cronkite might oughta not have said what he did w.r.t. Vietnam. More time to pull out would have saved lives, and many, many people have felt very badly about that. That one is complicated.

No doubt that the sheer quantity of "news" has dragged more muck off the bottom of the barrel, but the good old days weren't so good.


You only remember the journalism being better because you remember the good stuff and forget just how much crap there was in a newspaper from 30 years back. The Internet surely allows more to be published and by anyone, but generally "journalism" in the past was just as focused on "how do we sensationalize something to sell papers" as current "news" is on "how do we get more views with no effort?"


The static typing example seems weird to me. I did not read the entire linked summary but only the first five and last three papers discussed there and they mostly hint at at least some positive effect for static typing but the author of the summary essentially just dismisses the results for various reasons. I am not saying that all the judgments in the summary are necessarily wrong but overall that summary seems a pretty strange basis for saying that static typing is worth nothing. And the author of the submission is also the author of the summary.


The conclusion I made from that summary is not about the value of static typing one way or another, but just that the empirical evidence on the matter is unclear and uniformly weak.

Perhaps this is just a matter of will and funding, but I suspect it's because the underlying question is poorly specified and—like most human issues—extremely hard to study. To me, it's not obvious that asking about static typing in isolation from other parts of language design is meaningful at all.


It is of course only anecdotal evidence, but I know no developers using dynamically and statically typed languages that would deny the benefits of static typing per se. It's just a matter of fact that you get rid of a certain class of bugs that you have to otherwise cover with (unit) testing. There are of course drawbacks. If your language of choice does not offer type inference you have to write some additional boilerplate code, but that may be countered to some extent by good tools with autocomplete and the like. You usually also get better refactoring tools with static typing. Then again it is sometimes harder to make something work because the type system is too restrictive and some duck typing would be really handy. It is a bit of a trade-off: some bugs go away, some code is harder to write. I guess that is what you are hinting at with looking at static typing in isolation. Nonetheless I have yet to see a developer who confidently and unconditionally prefers dynamic typing over static typing, but maybe that is just some bias because of the places I work.


> I know no developers using dynamically and statically typed languages that would deny the benefits of static typing per se.

I guess technically you're right here, because what I would do is deny that you have the ability to define "static typing" (or "dynamic typing") in a meaningful way.

> It's just a matter of fact that you get rid of a certain class of bugs that you have to otherwise cover with (unit) testing.

Well, I do occasionally see static-type refugees coming to Python and writing code littered with "assert isinstance(..." checks and the like. But sooner or later they either learn to stop doing that, or go back to their safe space with horror stories about how they had to do all that manual checking. People who stick around have learned you don't need to do that stuff, and I can't think of any code I've personally written in, say, the past five years which used tests or runtime type checks to try to catch type-related bugs. In fact over the past ten years I can only think of one time when a static type checker would've caught a bug that got as far as being committed.
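To be concrete about the pattern being described, here's a minimal hypothetical sketch (the function and checks are invented for illustration, not taken from any real codebase):

    # The "assert isinstance" style some newcomers carry over from static languages:
    def total_price(quantity, unit_price):
        assert isinstance(quantity, int), "quantity must be an int"
        assert isinstance(unit_price, float), "unit_price must be a float"
        return quantity * unit_price

    # The idiomatic version just uses the values; a genuinely wrong type
    # surfaces as a TypeError from the operation itself, usually in tests:
    def total_price_idiomatic(quantity, unit_price):
        return quantity * unit_price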


"I can't think of any code I've personally written in, say, the past five years which used tests or runtime type checks to try to catch type-related bugs. In fact over the past ten years I can only think of one time when a static type checker would've caught a bug that got as far as being committed."

This is clearly a case of "you don't know what you don't know." I strongly suspect that there are many things you deem "not something static typing would catch" that I have been catching with static typing. From the trivial (passing bid price where ask is expected) to the simple (keeping track of which array this index is supposed to point into) to the tricky (restricting certain behaviors to certain threads). And those three examples were from my last big C project - no esoteric type system involved (though some of those make it even more powerful and even more convenient).

I won't go so far as to argue that extensive use of static checking is always a win, but it is a tool I find very effective. And it looks very little like "assert isinstance" strung about a Python codebase.
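For what it's worth, the bid/ask case doesn't need an esoteric type system even outside C. Here is roughly the same idea sketched with Python's typing.NewType (the names are invented for illustration); both types are plain floats at run time, so it takes a static checker such as mypy to reject the last line:

    from typing import NewType

    Bid = NewType("Bid", float)  # distinct checker-level names for the same float
    Ask = NewType("Ask", float)

    def place_order(ask: Ask) -> None:
        print("ordering at", ask)

    bid = Bid(101.5)
    place_order(bid)  # mypy: incompatible type "Bid"; expected "Ask"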


> In fact over the past ten years I can only think of one time when a static type checker would've caught a bug that got as far as being committed.

Then you really don't know how static types can help you.


> I guess technically you're right here, because what I would do is deny that you have the ability to define "static typing" (or "dynamic typing") in a meaningful way.

A statically typed language is a language where the compiler uses type information about expressions at compile-time to check the validity of a program, otherwise it is a dynamically typed language.


This is exactly the fallacy that dynamic proponents are constantly running into. Do not help them by reinforcing such claims.

Static typing bears important semantics far beyond a mere "validity checking" and compile time optimisations. And this fact is overlooked far too often.


What, then, would be a better definition in your opinion?


In a static type system, all of the expressions and expression-like constructs of your language may have a constraint attached to them, which guarantees that the value this expression yields has certain properties. In a static type system there is also a well-defined set of rules describing how such constraints are transformed when expressions are combined in a certain way.

In a dynamic type system, no such constraints exist (besides for the constant literals, of course) and there are no rules for combining the constraints.

These definitions should cover all the spectrum.
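To make the constraint-combination part concrete, a small sketch using Python annotations, with a checker like mypy standing in for the static system (the example is mine, not from the parent):

    def double(n: int) -> int:  # constraint attached to any expression double(...)
        return n * 2

    x: int = double(21)  # combining rule satisfied: int result meets int constraint
    y: str = double(21)  # rejected statically: the int constraint on double's
                         # result conflicts with the str constraint on y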


> These definitions should cover all the spectrum.

This gets close but I think it doesn't yield a conclusion that Java is statically typed, and I think most people want a definition that concludes Java is statically typed.

(unless you allow only fairly trivial "constraints" in which case you're no longer distinguishing between categories of languages, since essentially any language can satisfy a trivial-enough constraint on its expressions)


I think it's reasonable to draw the line somewhere between Java and, say, Python in the kind of constraints that a type system can enforce. Java does not have a particularly sophisticated type system, but I wouldn't be surprised if I could get some reasonable help from it (I haven't written any Java in years, and much Java in even longer...).


Just remember that one of the reasons why the JVM does all sorts of fancy/mind-boggling runtime heuristics and optimizations is precisely because it's not possible, at compile time, to always determine a usefully-specific type.

(and to really go off the deep end, in Java it's not possible to determine even what the universe of possible types is at compile time, which is why I like it as a counterexample to people who think they can meaningfully define "statically typed" and have it manage to include Java)


> it's not possible, at compile time, to always determine a usefully-specific type.

You seem to be using "type" here to mean "runtime representation", which is a common use but is only one specific case of what is being discussed above. As I understand it (and remember that my Java is rusty), it is entirely possible to statically guarantee that whatever a given expression evaluates to, it satisfies a particular interface (moreover, it was deliberately labeled as doing so). That may not be "usefully-specific" for an optimizer (or more carefully, it may leave some things vague that will be more usefully specified at runtime; we haven't established that the level of specification is of no use), but it certainly seems to be a constraint of the sort that sklogic was talking about above, and it seems like something that could be useful to me as a programmer.

I actually think that "representational types" - constraints that tell us explicitly about how the data is represented in memory - are the place where the dynamically typed languages got it right. As long as it's internally consistent, I don't care exactly how the data is laid out, and automatic systems can do a plenty good enough job figuring that out (except at system boundaries where I need to worry about interchange, or if I care an unusual amount about performance). If representation is not something I care about, I should not be wasting time/attention talking about it. Where, IMO, this goes wrong is overgeneralizing to say, "... and since that's all types are good for, types are not useful." That's not all types are good for.

> (and to really go off the deep end, in Java it's not possible to determine even what the universe of possible types is at compile time, which is why I like it as a counterexample to people who think they can meaningfully define "statically typed" and have it manage to include Java)

How is the same thing untrue of C? or Haskell? If I write code that interacts with data generated by a dynamically linked module, that data might include newly defined types. This isn't a problem for the theory - we've still internally verified the constraints we're checking in each module, and potentially also at the interface (depending a bit on ABI).


> it certainly seems to be a constraint of the sort that sklogic was talking about above

Well, this is the thing. sklogic and others seem to think there's some sort of clear bright line dividing, say, Python from Java, where in Python they'll claim you can only deduce a constraint like "is of type object" ahead of time but in Java you can deduce something more specific.

And setting aside the falsity of that (even on Python 2, which doesn't have the annotation syntax or library support to do type hinting, you can still deduce a lot of specific information AOT), I'd argue with the concept of a clear dividing line and say instead that there's a spectrum. And I say this precisely because in languages like Java you can end up only able to deduce something will be of, say, type "Entity", which while better than "Object" still feels less like a qualitatively different thing to me and less useful for expressing the kind of rich constraints hardcore static-type folks talk about. Yet people seem to accept it as a statically typed language.

> How is the same thing untrue of C? or Haskell? If I write code that interacts with data generated by a dynamically linked module, that data might include newly defined types.

It can happen in other languages, but Java specifically does a lot of heavy lifting to work around this kind of issue, and some of that is specific to the features and quirks of Java. There's a reason, after all, why people non-jokingly refer to the JVM as the best dynamically-typed language runtime ever developed.

But again what we end up talking about is less a clear binary of "this language is dynamic, that language is static" and more of a spectrum of different combinations of features and design choices which sort of gradually blend into each other as you look at them side-by-side. Which is why I like to argue with the idea that we can meaningfully define "static" or "dynamic" typing.


> I can't think of any code I've personally written in, say, the past five years which used tests or runtime type checks to try to catch type-related bugs.

All bugs are type-related, with a sufficiently advanced type system.


Sure, with certain type systems you can write an "add" function and describe its type such that you can verify it really does return the sum of its arguments.

But all you've really accomplished is to move the potential bug to a different layer; before it was possible to write incorrect logic in the function and have a bug, now it's possible to write an incorrect type specification and have a bug. So now you need a type system to let you describe the behavior of your type system (and it has to be a separate thing from the type system). And then you're back where you started, just now with an extra layer of architecture.
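A toy sketch of that layering point, in Python notation: the simple type specification below is fully satisfied while the behavior is wrong, and a richer specification would just be one more place to make the same kind of mistake:

    def add(a: int, b: int) -> int:
        # The type specification is satisfied: two ints in, one int out...
        return a - b  # ...but the logic is wrong, and these types can't see that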


Hi, I confidently and unconditionally prefer dynamic typing. Just to let you know we exist. Please don't forget that static typing introduces the possibility of compile time defects. Even statically typed languages often support reflection, which is another way of getting the benefits of dynamic types. The easier metaprogramming is, the happier I am.


Are you claiming that compile time defects are a flaw in the programming experience? Because running buggy code is a feature?


No, I am pointing out that compile time defects count as a type of defect- see for example the accounting in the CMU SEI PSP: http://www.sei.cmu.edu/reports/00tr022.pdf


Compile time reflection has absolutely nothing to do with dynamic typing.


Compile time reflection? Reflection as implemented in most languages allows you to determine the type of an object at runtime.


If you read the paper, the study was performed on Ph.D. students and the language was ANSI C. >.>

Make of that what you will.


I think it's more that - for obscure internal and path-dependent reasons - people divide into cohorts who are lukewarm about static typing and cohorts who are quite for it.

His bit about it is purest hyperbole - "those who oppose static typing are like unto AGW deniers." As they say in Haskell circles - it's all fun and games until you invoke the I/O monad...


Because the author has an agenda, he's a dynamic typing and unit testing proponent. Do not expect anything even distantly resembling any kind of an objective science from someone who clearly is not interested in facts.


This is pretty much a personal attack and those are not allowed here.

Also, what you're saying doesn't match my recollection of luu's writings on this topic, which is that he's mildly in favor of static typing but changed his mind somewhat after looking at the dismal state of the evidence.


Who is not interested in facts? Did you even read the linked article? If you read nothing else, at least read the summary. He makes very very clear that he is a static typing proponent.


The type system question is different from the psychological examples: the problem is not people misinterpreting evidence, but that reliable empirical evidence simply does not exist. Papers on the matter are sparse, completely uneven and full of methodological issues.

Personally, I'd argue that "statically typed" vs "dynamically typed" does not even make sense as a single question. There's more difference between Haskell and Java than between Java and Python, and an experiment comparing two identical languages with and without static typing won't tell us much beyond those two languages. (I recall seeing at least one paper that did this; it's probably worth reading, but not for making broader conclusions.)

Moreover, there simply isn't a compelling way to measure most of the things that programmers actually care about like expressiveness, productivity or safety. Existing measures (like counting bugs, lines of code over time, experiments on small tasks often performed by students... etc) are limited, full of confounding variables and quite indirect in measuring what people actually care about. I've looked through various studies and experiments in software engineering and while some are compelling, many are little more than dressed-up anecdotes or "experience reports".

It's especially hard to study these things in the contexts that matter. What we care about is experienced programmers who've used specific technologies for years applying them in teams, at scale. What's easy to experiment on is people who've just learned something using it on tiny tasks. Observational studies that aim at the broader industry context are interesting but hard to generalize because of confounding variables and difficulty of measurement.

In the absence of this sort of evidence, people have to make decisions somehow, and it's not surprising that they overstate their confidence. We see this in pretty much everything else that doesn't have a strong empirical basis like questions around organizing workplaces, teams and processes. Just look at opinions people have about open offices, specific agile processes or interview procedures!

Another side to the question is that languages inevitably have a strong aesthetic component, and talking about aesthetics is difficult. But you're certainly not going to convince anyone on aesthetic matters with an experiment or observational study, any more than you can expect to accomplish anything like that in the art world!


Something I didn't realize before is that, meme or not, Dunning-Kruger tested perception vs skill on basic tasks - the kind of thing where, if someone asked me, I might easily mistake my own ability, since it's something I'd feel like I should know how to do.

Ability to recognize humor isn't what I'd even call a skilled subject matter, and it's not something we learn in school or normally get exposed to graded metrics or comparisons against other people.

These aren't highly skilled subjects like Geophysics or Law or Electrical Engineering or Art History. I'd be willing to bet it's a lot easier both to self-identify a lack of ability and to admit it, the more skill-intensive the subject is.


I like to think SMBC[1] presents a more accurate graph of confidence vs knowledge, but I don't know enough to really speak about it.

[1]: http://www.smbc-comics.com/?id=2475


Quantifying the subtleties of knowledge as experience increases is difficult. For instance, one might understand the details but not compositional complexities. Or vice versa. But comparing the two situations is difficult and inextricably contextual.


John Oliver did an awesome bit recently on scientific studies and how popular conceptions of them, especially media portrayals, completely distort the results.

https://www.youtube.com/watch?v=0Rnq1NpHdmw


I'm curious as to why nobody here has commented on the OP's claims about "Hedonic Adaptation". I've been told by various sources that this is the way the brain works, even in my recent biology class, where the teacher would say that "dopamine sensitivity" was to blame.

It seems like a really big deal to me if he's right, and could really change your outlook on life.


D-K is one of my favorite patterns. This is the first time I've seen these charts. Some questions about methodology:

The x-axis shows quartile, not score results. If the range was between 80 and 90%, then all participants were accurate in assessing their ability as "above average". [EDIT] I doubt that's the case, but would rather see scores.

How was the self declared expertise in "humor" judged? That seems pretty subjective. Maybe the subject is hilarious to his or her friends.

Did the subject know what the examiner's definition of "logical reasoning" is? Was that street logic or discrete structures? What if the subject was able to glance at the test questions, and only then answer the question as it pertains to the test? How would the results change?

Grammar is idiomatic. In some places "over yonder" is contextually concise. Other grammatical forms may never occur. How is self-assessment over tacit expertise judged? Maybe another glance at the test?

Maybe Dunning-Kruger shows that there is a disconnect in how examiner and subject interpret a question? Maybe, it is a matter of saving face in saying that you're above average? Maybe, because the subjects are college students, that they actually are above average? Or maybe, these are above average participants that aren't quite sure of the question, so they say that they're above average?


The idea that there's an inverse relationship between how much someone thinks they know about a subject and how much they actually know is pretty timeless. When people refer to Dunning-Kruger, I take it as shorthand for that phenomenon rather than a reference to results from a specific study done in 1999.

I may be misremembering, but when I first saw references to it on Slashdot, etc., it was from people reacting in amusement that someone was able to quantify and measure what seemed like such a commonly experienced aspect of human behavior. If someone had done an academic study on the increased likelihood of friends having scheduling and availability issues around weekends in which one friend was moving to a new house but was too cheap to get movers despite having plenty of money to do so, it would've gotten a similar response. :)

Since then, it's just been convenient having a name ("Dunning-Kruger", that is) for a concept that was widely understood but didn't have shorthand for referring to it. I'm not surprised that the study itself wasn't definitive and airtight.


One thing I never see in the income/happiness studies is - Is this just for a single person, or is it for a family? And if for a family, then what size is that family? I can see being happy earning 75k/year and being single, but not so much if I have eight other family members to support with that same salary. Is there some sort of "number of people being supported on this income" adjustment to the income/happiness studies?


At least one study uses household income [1]. The effect isn't adjusted for family size in any way I can see. Do note that the linked study differentiates between 'enjoyment of life' which they estimate starts to plateau around $75k / household and 'life satisfaction' which keeps going up with earnings [2]. That difference may explain much of the supposed controversy as outlined in the original post.

[1] http://www.pnas.org/content/107/38/16489.full#T2

[2] http://blogs.wsj.com/wealth/2010/07/02/money-can-buy-satisfa...


This article had more assumptions in it than examples of assumptions it was complaining about.


> Apparently, there’s a dollar value which not only makes you happy, it makes you as happy as it is possible for humans to be.

> If people rebound from both bad events and good, how is it that making more money causes people to be happier?

I saw graphs that proved happiness causes money. What did you see?

disclaimer: I am trying to be snide on the internet. What I mean to say is that I was confused by the use of the word "cause".


I see graphs that proved that being born into a financially well-off family with access to good education might have something to do with factors involved in the graph.

(Appreciated your humor. Am attempting a mildly humorous speculation about a plausible cause for both factors. It's probably less humorous than I think it is, but as someone lacking skill in humor, I overestimate my own hilarity.)



Dan Ariely and his team have done some great work on the "happiness" meme, and they generally support the popular notion that there are massively diminishing returns to accruing wealth. Yes, (as this post shows) happiness does continue to increase as you accrue wealth, but there are other things that you can do - including giving money AWAY - whose returns on happiness and satisfaction do not diminish. The point is, if you take a long view on life and what to focus on, getting to a certain level of financial stability should take a high priority, but becoming incredibly wealthy should not.


Is this a clever ruse to test whether we'll read the cited sources? ;^)


All the income/happiness data seems to stop shortly after $64k. Hardly evidence that there is no plateau.


If you click on the link "is robust across every country studied, too", there is data up to and above $500k for the US.


Was ok up until type systems. Please stop citing this pathetic "empirical study" already, it's totally unscientific.


If that one is not, are there any scientific studies then?

Strongly typed languages require me to do more work upfront, to satisfy their type checker. They must necessarily reject programs that would work correctly. In this process a lot of mistakes are eliminated, and this gives me more confidence that the result will work. I like that way of working. But does it produce more robust code? Is it more productive? It feels like it, but that doesn't mean it's true.


"Strongly typed" is not the same thing as "statically typed". Most dynamically typed languages are strongly typed, too. The distinction between static and dynamic type systems comes from whether type errors are caught at compilation time or run time.

Which basically settles the question for me as a programmer, anyway. Eliminating the possibility of a class of run time failures -- how can that not be a good thing?
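As a toy illustration (assuming Python, with an external checker such as mypy standing in for the compile-time step):

    def greet(name: str) -> str:
        return "hello " + name

    greet(42)  # dynamically: a TypeError is raised only when this line executes;
               # statically (e.g. under mypy): rejected before the program runs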


I meant statically typed, thanks.

The question to me is not whether type checkers are useful tools, but at what point they become a hindrance. If I may rephrase your question: The programs rejected by the type checker, how can they not be bad programs?


There's an interesting one here:

http://macbeth.cs.ucdavis.edu/lang_study.pdf

TL;DR: Looks like FP reduces bugs, and statically typed FP reduces them a bit more, but there isn't enough data for the more interesting fine-grained conclusions.

Also Perl results are interesting (unsettling?).


Interesting, thanks.


Given a huge skill gap and inability to factor out methodology differences, I cannot see how such a study can be done at all.

Anecdotes are the best we have. Far better than a pseudoscience with agenda.


Science can be analyzed for how pseudo it is, anecdotes cannot.


Anecdotes are case studies. In social sciences, for example, it's often the only thing you have. Do not dismiss this kind of evidence when you do not have any other options.


Not all anecdotes are case studies, but all case studies are anecdotal evidence.

The effect of type systems is probably not that big. Otherwise the proponents of type systems would have had an easy time proving it.


I love that the people who come into the comments to argue about the benefits of static typing seem to have totally missed the point that the post argues that you need evidence, not just beliefs.


And they seem to continue the false axiom of assuming "static" == "strong" and "dynamic" == "weak".

FFS, some people should finally learn.


> FFS, some people should finally learn

I realize you were commenting in a neighborhood with a whole lot of broken windows, but please don't do this.


Empirical evidence is nearly impossible in this area.

On the other hand, we have a solid theory, not some "beliefs". If you want to dismiss the entire PL theory, you have to try really hard to justify such a stupid move first. The problem is, most of the dynamic proponents know next to nothing about the PL theory anyway.


> Empirical evidence is nearly impossible in this area.

Then stop arguing as if you have evidence. It's really that simple.

Present something that is not derived from opinion and speculation and make this an argument that is not subjective.

For what it's worth, I prefer static, strong type systems, and I was recently dreaming out loud about strong, static typing in Erlang with a colleague. I don't confuse my opinion and speculation about what's good and what's not with fact, though, which is the big difference.

> If you want to dismiss the entire PL theory, you have to try really hard to justify such a stupid move first.

It's a fact that there exists no definite proof of the objective superiority of static, strong typing. I don't need to "dismiss the entire PL theory" (what a silly thing to even say; not all PL theory is concerned with types).

You've come exactly 0.0% closer to showing any kind of evidence, empirical or not, and have only speculated more (on the value of static, strong type systems and of the skill and knowledge of people who disagree with you).


> Please, do present the solid theory that is not simply derived from speculation and opinion.

What the theory shows is enough to claim superiority:

1) Dynamic typing is a subset of static typing. This thing alone is enough.

2) Static typing provides more semantic options in both compile and run time, meaning that you can do more diverse things. Also quite a strong claim for superiority.


My friend, I am one of the biggest proponents of static typing that you will find, and let me say you are talking absolute nonsense. Your argument is very poor indeed and your attitude is setting back the social cause of promoting statically typed languages.


> 1) Dynamic typing is a subset of static typing. This thing alone is enough.

This is like saying that more syntax is better. No, cutting away from something can make it better. This argument is not at all enough to claim superiority.

(C can be considered a subset of C++. Which is better?)

> 2) Static typing provides more semantic options in both compile and run time, meaning that you can do more diverse things. Also quite a strong claim for superiority.

"More diverse things" is ill defined. Which are they and why are they a net win? This is not at all a strong claim for anything, except "There is more".


> No, cutting away from something can make it better

What?!?

You can build a dynamic type system on top of a static one. The opposite is impossible. What else is there to even talk about?

> "More diverse things" is ill defined.

It is very well defined. Static (i.e., compile time) metadata allows constraints to be inferred at compile time. Dynamic metadata is useless for deriving constraints. A very obvious consequence of this observation is that there will always be far more boilerplate with dynamic typing than with static.


> You can build a dynamic type system on top of a static one. The opposite is impossible. What else is there to even talk about?

We are talking about the value of different kinds of type systems and using them. Being able to build a dynamic one on top of a static one says very little about whether or not dynamic or static typing is better for actual usage. On top of this lots of languages have added gradual typing, so this idea that you cannot take a language that is not statically typed and add a type system seems misguided.

> A very obvious consequence of this observation is that there will always be far more boilerplate with dynamic typing than with static.

I hope you realize that this is not at all what reality looks like.


> We are talking about the value of different kinds of type systems and using them.

Exactly. And you're apparently suggesting that there may not be a single case where you may want static constraints. Kinda very strong position, needs very strong proofs indeed.

> gradual typing

Gradual typing IS a static typing, period.

> you cannot take a language that is not statically typed and add a type system seems misguided.

What?!?

You cannot build a gradual typing system on top of a dynamic one.

> this is not at all what reality looks like.

I can only conclude that you do not know much about the reality if you think so.


> Exactly. And you're apparently suggesting that there may not be a single case where you may want static constraints.

No, I have consistently asked for objective proof that static typing is a net win over dynamic typing, something you have yet to even address. I don't know if you're intentionally misrepresenting my argument or if you're simply misunderstanding it, but I think you should re-read this whole thread.

As I've said previously, I prefer static strong typing, but I'm also in touch with reality and to present my opinion and speculation as some kind of fact isn't something I'm interested in.

> I can only conclude that you do not know much about the reality if you think so.

If we're jumping to conclusions I'd like to conclude that you think all PL theory is type theory and that you're ignorant of every other bit of it (and also that you're the type of person to think your every opinion is fact. I think both of these have been on display in this thread, so I actually think that's a stronger conclusion than the one you've drawn).


Sorry, cannot reply down the thread, so I'll put my answer here:

> This is not necessarily true: Static typing quite often requires you to satisfy the type system

We're talking about static typing in general, not some particular implementation of it.

Any static type system with an "anything" type (think of the System.Object in .NET, for example) allows a transparent fallback to dynamic at any time.

So, claiming that "there is a cost" is an outright lie.

> I haven't stated that dynamic typing is better, but I have stated that people claiming one or the other need to have proof.

You know, there is a little funny thingy called "logic". And one of the most common tricks in logic is a proof by contradiction. When you're asking for a proof that static typing is superior, the simplest way is to start with "let's assume dynamic typing is superior". This is exactly what I did. Unfortunately, you could not follow.

> If your programs are as airtight as the "proof" you've given here, I'm not sure I ever want to use them.

It's understandable that a person who do not know much about type systems in particular and PL theory in general also apparently does not know much about proofs and logic in general. After all, type theory and proof theory are indistinguishable.
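Python's typing.Any plays exactly this "anything" role under checkers like mypy, if a sketch helps (the method name below is made up):

    from typing import Any

    def hotspot(x: Any) -> Any:
        # Under Any the checker imposes no constraints here: a transparent,
        # local fallback to dynamic typing inside an otherwise checked program.
        return x.whatever_method()  # accepted statically, resolved at run time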


> You know, there is a little funny thingy called "logic". And one of the most common tricks in logic is a proof by contradiction. When you're asking for a proof that static typing is superior, the simplest way is to start with "let's assume dynamic typing is superior". This is exactly what I did. Unfortunately, you could not follow.

Condescending, but not to be confused with correct. I'll try as well:

Given your obviously limited knowledge and familiarity with English I can understand that you seem to have issues understanding my basic argument, but I'll restate it for you:

If you are trying to claim something as superior, you need to provide actual reasons for it, not just speculation.

I hope you followed that.

> It's understandable that a person who do not know much about type systems in particular and PL theory in general also apparently does not know much about proofs and logic in general. After all, type theory and proof theory are indistinguishable.

It's actually not understandable that someone who claims to have a lot of knowledge in type systems and type theory, as well as logic, to provide "proof" that in no way proves what was asked for. It's also surprising that someone who claims to be so well versed in PLT essentially says it's all type theory.

It's understandable if a person with reading comprehension issues would have problems reading this post, so if you have any questions regarding it (or the previous posts), feel free to ask.


It is very childish and stupid to respond to a proof with a shit like "no, this is not a proof".

> The idea that static type systems are better to use (in general) because you can make dynamic type systems on top of them is simply not something you can just say and then have taken as fact.

Oh, I did not realise you're so incompetent (although I should have guessed after your epic fail with gradual typing). Do I have to prove that 2+2=4 too?

Once again: dynamic typing is a subset of static typing and therefore it is less powerful. Period. You cannot do anything about this fact.

Also, funny that you did not respond to my accusation that you believe that type systems are only for "validity checking". Which suggests that I was right.


We bent over backwards not to ban you, gave you lots of warnings and cut you tons more slack than we usually do. You're aware of how unacceptable it is to post comments like this to HN, and still you did it repeatedly in this thread, turning a large section of it into a toxic waste dump.

Obviously, your account is now banned. If you don't want it to be banned, you can email us at hn@ycombinator.com, but please don't do that until you're sincerely committed to never spoiling HN like this again.


You're the one spoiling HN with your insults and disruption.


This kind of moderatorial bickering does not belong on HN.


And now you're disagreeing by downvoting.


Please stop editing your posts 40 minutes after you initially post them in order to muddle the post history. You're literally removing entire posts and adding completely new things, even as responses to responses. Work on your netiquette.

> Also, funny that you did not respond to my accusation that you believe that type systems are only for "validity checking". Which suggests that I was right.

No, not responding to something does not mean "you were right". I think strong static type systems are useful in many ways and I've already told you repeatedly that I prefer them. I have literally no care in the world what you assume about me.

Can you confidently say that you've given even one clear reason why it's better to use static typing over dynamic typing in this thread? You've gone on one rant about the fact that you can build a dynamic type system on top of a static one, but haven't actually presented even one fact about why it's better to use static typing over dynamic typing.

No one, in their next project, is going to build a dynamic type system in their static one, and then jerk off over that fact, when they could just start working on their ACTUAL project. Would you recommend to someone choosing between static and dynamic typing to start with a static one and then build their dynamic one on top of that?

Your posts are full of bullshit, stupid assumptions and personal attacks. I only wish you'd realize what a disservice it does to acceptance of your opinion, and how hard it is to agree with you even while holding much of the same opinion, because you're so insufferable.


Do you feel like you should be able to make any nonsensical argument you want and people should take that as fact/proof? The idea that static type systems are better to use (in general) because you can make dynamic type systems on top of them is simply not something you can just say and then have taken as fact.

Also, I should add, I think you lost the "this is childish" call when, two posts into this thread, you made assumptions about everyone saying dynamic typing was useful (they obviously are not on your level, amirite?).

You started this thread off by saying empirical evidence of which is better for practical use is almost impossible to get. You then proceeded to argue your position as fact, knowing that you had no evidence to support it. If you want to present your opinion as fact, then feel free to find actual facts to support it.

Get over yourself.


You and sklogic took this thread into an inner circle of flamewar hell. This is a poster child for the kind of discussion we don't want here, which pollutes HN for everyone else. Please don't do this again.

If you feel provoked, catch it before it drives you into an angry back-and-forth. That's not easy, but it's something we all have to work on in order not to destroy this site.


This post is exactly what I come to HN to avoid: Moderators abusing users for no reason, disrupting discussions in the process.


> Please stop editing your posts 40 minutes after you initially post them in order to muddle the post history.

HN does not allow continuing a thread below a certain threshold.

> I think strong static type systems are useful in many ways and I've already told you repeatedly that I prefer them.

This is not what I was talking about. Learn to read.

My point is that you do not understand what it means for a bigger type system to provide new semantics for the language. You still fail to understand it, obviously, because this is a central point of my proof, which you failed to comprehend.

> No one, in their next project, is going to build a dynamic type system in their static one, and then jerk off over that fact, when they could just start working on their ACTUAL project.

Take a look at pretty much any code in static languages - it is almost always doing exactly this: various degrees of dynamic typing on top of static. Sometimes it is ugly, sometimes it is done the right way (LLVM is a good example of this approach).
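
For what it's worth, the usual shape of "dynamic on top of static" is a tagged union inspected at runtime; a minimal TypeScript sketch, with all names invented for illustration:

    // A hand-rolled dynamic value: the tag is checked at runtime,
    // which is exactly what a dynamic language does on every operation.
    type Dyn =
      | { tag: "num"; value: number }
      | { tag: "str"; value: string };

    function add(a: Dyn, b: Dyn): Dyn {
      if (a.tag === "num" && b.tag === "num") {
        return { tag: "num", value: a.value + b.value };
      }
      throw new TypeError("cannot add " + a.tag + " and " + b.tag);
    }

    console.log(add({ tag: "num", value: 2 }, { tag: "num", value: 2 })); // { tag: 'num', value: 4 }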

> Would you recommend to someone choosing between static and dynamic typing to start with a static one and then build their dynamic one on top of that?

Even you have somehow heard something about gradual typing - which is exactly an example of this.

> Your posts are full of bullshit, stupid assumptions and personal attacks.

Omg. I'm only responding to your attacks. You're the uninformed and incompetent side in this argument, not me.


> HN does not allow continuing a thread below a certain threshold.

Click on the time (for example "2 minutes ago") to open the post and reply there.

> My point is that you do not understand what it means for a bigger type system to provide new semantics for the language. You still fail to understand it, because this is a central point of my proof.

And my point is that you haven't answered why using a static type system over a dynamic one is a net win for people, which is the entire point. If there is no empirical evidence, don't argue as if there were. You yourself admitted that there is no evidence for it, so why are you arguing from a contradictory position?

> Even you have somehow heard something about gradual typing - which is exactly an example of this.

No, the practical flow of doing that is exactly the opposite: starting from dynamic typing and imposing types when they are needed, likely after an exploratory phase. You seem to be arguing from a much more disconnected viewpoint where you take the position that the platform's underlying representation is what matters, while I am arguing from a practical perspective (what should the programmer choose for his next project?).
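
TypeScript is a concrete case of that flow: code starts as effectively untyped JavaScript, and annotations are imposed once the data shapes settle. A minimal sketch (the function and types are invented for the example):

    // Phase 1: exploratory and effectively dynamic (plain JS, implicit any):
    //   function area(shape) { return shape.w * shape.h; }

    // Phase 2: the shape of the data has settled, so a type is imposed.
    interface Rect { w: number; h: number; }

    function area(shape: Rect): number {
      return shape.w * shape.h;
    }

    console.log(area({ w: 3, h: 4 })); // 12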


> Click on the time (for example "2 minutes ago") to open the post and reply there.

Shit. Is it a new feature? I do not remember this working before.

> And my point is that you haven't answered why using a static type system over a dynamic one is a net win for people

I am not interested in diverting the discussion from the topic. The topic was that static typing is superior to dynamic typing - i.e., more powerful and more flexible.

How does it translate to a "net win"? I do not care, honestly. There are far too many factors other than the language features.


> I am not interested in diverting the discussion from the topic. The topic was that static typing is superior to dynamic typing - i.e., more powerful and more flexible.

> How does it translate to a "net win"? I do not care, honestly. There are far too many factors other than the language features.

I'm glad we established that you don't care about the net win of this, so that we can agree that we're not at all talking about the same thing.

The net win is the entire point. I don't care how something is done if it's not a net win for me and my projects. It's irrelevant to say something is better in theory if there is absolutely no proof of it actually being better.

I can love macros, but would I go ahead and assert that a language with macros is absolutely better than a language without? No, they have a cost associated with them and their misuse makes things worse. Hence, they're not objectively superior to anything else. This goes for almost any feature, until you can prove a net win.

(Edit: The same goes for my point about C++ earlier. There are more things in C++ than there are in C, but most people would argue that C++ has too many things and that many of them actually make things worse. Hence, having more things in C++ could be considered a net loss.)


> they have a cost associated with them and their misuse makes things worse

Omg. My assessment was correct, after all.

> Hence, they're not objectively superior to anything else

Of course they are.


> Omg. My assessment was correct, after all.

If your assessment was that I prefer being pragmatic and that I believe most things have a cost, then yes. I think macros can be done right (in Scheme, etc.), but even then they have a potential cost.

> Of course they are.

And this is obviously where we definitely diverge. I think macros are great, but I acknowledge that you can't make that statement without big disclaimers. You don't seem to care about being pragmatic, so for you it's more clear cut.


I provided you with a proof. Can you not follow such trivial logic?

Let me repeat it again, slowly:

1) Dynamic typing is a subset of static typing. With static typing you can do everything that is possible with dynamic typing, at no additional cost, while the opposite is not true.

2) Static typing is far more than mere "validity checking", as you apparently seem to believe. These advanced semantic properties cannot be added on top of a dynamic type system, so even suggesting that a dynamic type system may be somehow superior is automatically declaring that under no circumstances will you ever need any of these properties.

Is it so hard to follow?!?
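
For concreteness, a TypeScript sketch of both points, with all names invented for illustration: `unknown` embeds dynamically typed values behind runtime checks, while exhaustiveness checking is a purely compile-time guarantee:

    // (1) Dynamic-in-static: `unknown` forces runtime inspection,
    // mirroring what a dynamic language does implicitly everywhere.
    function len(x: unknown): number {
      if (typeof x === "string") return x.length;
      if (Array.isArray(x)) return x.length;
      throw new TypeError("no length");
    }

    // (2) Static-only: the compiler proves every case is handled;
    // nothing remains to be checked when the program runs.
    type Shape =
      | { kind: "circle"; r: number }
      | { kind: "square"; s: number };

    function areaOf(s: Shape): number {
      switch (s.kind) {
        case "circle": return Math.PI * s.r * s.r;
        case "square": return s.s * s.s;
      }
    }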


> With static typing you can do everything that is possible with dynamic typing, at no additional cost [...]

This is not necessarily true: Static typing quite often requires you to satisfy the type system, and is often an exercise in what essentially amounts to paperwork. There is a cost.
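
A small illustration of that cost, assuming TypeScript: a function a dynamic language would accept as-is needs both a union annotation and a narrowing check before it compiles (names invented for the example):

    // The annotation and the narrowing here are the "paperwork";
    // a dynamic language runs the equivalent code with neither.
    function show(item: number | string): string {
      if (typeof item === "number") {
        return item.toFixed(1);
      }
      return item.toUpperCase();
    }

    console.log(show(2));       // "2.0"
    console.log(show("hello")); // "HELLO"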

> 2) Static typing is far more than mere "validity checking", as you apparently seem to believe. These advanced semantic properties cannot be added on top of a dynamic type system, so even suggesting that a dynamic type system may be somehow superior is automatically declaring that under no circumstances will you ever need any of these properties.

I haven't stated that dynamic typing is better, but I have stated that people claiming one or the other need to have proof. Dynamic typing is very rarely claimed to be better, whereas static typing is quite often cited as better, even when it's only opinion.

If your programs are as airtight as the "proof" you've given here, I'm not sure I ever want to use them.

Have your opinion and know that I share it (mostly), but also know that it's an opinion and that you'd do well in not confusing it with fact. Strong, static type systems are nice, but to say they're superior to dynamic type systems is an opinion and to present it as anything else is a lie.


> The problem is, most of the dynamic proponents know next to nothing about the PL theory anyway.

This is either a "triumph" of theory over reality, or an insult to a whole group of people.

In short: A lot of people like dynamic typing, and can be productive in it. If the "theory" you cite says static typing is better, the "theory" needs to be changed to reflect reality. If, on the other hand, you're wrong about the theory and/or about dynamic typing enthusiasts knowing it, you should apologize.


> A lot of people like dynamic typing, and can be productive in it.

"A lot of people can be productive in it" does not establish that nothing else could be better. I can be productive in bash, but I think we all agree "stringly typed" is not as good as most other approaches to programming language design.

Even "a lot of people find themselves to be most productive in it" doesn't tell us much, as other factors could very well dominate (most significantly familiarity of language and/or paradigm, but I'm sure we can both think of plenty of other candidates).


> I can be productive in bash, but I think we all agree "stringly typed" is not as good as most other approaches to programming language design.

That depends on the language, and how central text is to what it does.


That honestly surprises me. When text is central, that sounds like exactly when you most need to be able to organize your data without worrying about whether a delimiter might occur in some content...
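
The delimiter hazard in a nutshell; a TypeScript toy, but the same failure happens with whitespace splitting in bash:

    // Stringly typed: join and then split on a delimiter that can
    // legitimately appear inside the data itself.
    const files = ["notes.txt", "my file.txt"];

    const joined = files.join(" ");      // "notes.txt my file.txt"
    const recovered = joined.split(" "); // 3 items: the data is corrupted

    console.log(recovered.length); // 3
    console.log(files.length);     // 2 -- the typed array never had the problem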


Note that I'm not saying that a stringly typed language mightn't, in some cases, be the best choice. Just that it will be the best choice for reasons other than being stringly typed, and a language that allowed better organization of your data while offering similar affordances would be an improvement.


[flagged]


This comment breaks the HN guidelines. Please post civilly and substantively, or not at all.

Note how much better it would be with just the first paragraph.

We detached this comment from https://news.ycombinator.com/item?id=11994296 and marked it off-topic.


But type systems do help. You don't have to go far to notice the shortcomings of any large enough project written in python, ruby, javascript, etc., whereas a project of equivalent scale written in c#, typescript, java, dart, etc. is much easier to maintain and debug. So given enough discipline and enough good programmers I agree that there isn't much difference, but in practice this is not the case, and having the compiler double-check your work helps a lot.
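
As a toy illustration of the compiler double-checking your work, assuming TypeScript (names invented): a misspelled property is a compile-time error, where plain JavaScript silently yields undefined:

    interface User {
      id: number;
      fullName: string;
    }

    function greet(user: User): string {
      // Accessing `user.fullname` would fail to compile here
      // ("did you mean 'fullName'?"), whereas plain JS would
      // happily produce "Hello, undefined" at runtime.
      return "Hello, " + user.fullName;
    }

    console.log(greet({ id: 1, fullName: "Ada Lovelace" })); // Hello, Ada Lovelace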



