All the examples in the post are Haskell. I think static typing is good, but Haskell is bad. Let me quote Ben Lippmeier's words from his PhD thesis:
"I am thinking of a data structure for managing collections of objects. It provides O(1) insert and update operations. It has native hardware support on all modern platforms. It has a long history of use. It's proven, and it's lightning fast.
Unfortunately, support for it in my favourite language, Haskell, appears to be somewhat lacking. There are people that would tell me that it's not needed, that there are other options, that it's bad for parallelism and bad for computer science in general. They say that without it, programs are easier to understand and reason about. Yet, it seems that every time I start to write a program I find myself wanting for it. It's called the store.
The mutable store, that is. I want for real destructive update in a real functional language. I wanted it for long enough that I decided I should take a PhD position and spend the next several years of my life trying to get it."
With great reluctance, over the last year I have started to agree with your position on Haskell.
For many years I have had a lot of fun with Haskell, doing some NLP and web stuff with Yesod and Scotty.
However, I am so much more productive in a concise dynamic language like Ruby that I am re-evaluating my fondness for Haskell. Also, to a small degree, good IDEs like RubyMine catch some simple errors for you.
I, like Elben Shira, hope for radical new technologies that make me feel like I am developing in Ruby with the safety and speed of Haskell. I am looking forward to the future.
Well, Haskell certainly does have mutable storage. It just all lives in the IO monad, because it's mutable...
Anyway, code that patches immutable data, compiled by GHC, behaves much like the plain mutable storage you'd write in C. It may not always be simpler to write (though for complex data it normally is - and why are you writing so much simple data anyway? Is it because of your language's limitations?) but it is checked by the compiler, which is a great bonus.
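For the record, a minimal sketch of what mutable references look like in practice (IORef comes from base; the counter is just an illustration):

    import Data.IORef

    main :: IO ()
    main = do
      counter <- newIORef (0 :: Int)   -- allocate a mutable cell
      modifyIORef' counter (+ 1)       -- destructive, in-place update
      readIORef counter >>= print      -- prints 1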
Well, er, Haskell does have destructive update. You are misunderstanding Ben's point somewhat. "We present a type based mutability, effect and data sharing analysis for reasoning about such programs [containing mutable data]".
How am I misunderstanding? I am quoting exact words, after all. Re IORef and STRef, let's quote some more:
"Representing effects in value types is a double edged sword (p. 42). Haskell has fractured into monadic and non-monadic sub-languages (p. 44). Monad transformers produce a layered structure (p. 45)." These are section titles, be sure to read the entire sections which explain in details what the title means.
I can't work out why you're being downvoted. A knee-jerk reaction to anything that is critical of Haskell, maybe? Unfortunately that does seem to be very common on HN.
Ben Lippmeier makes a very good point and is being quoted in full here. Haskell does have issues, and people are trying to fix those issues. Whether they can be resolved isn't known yet, and the resulting language may be different from Haskell. Haskell is a long way from some pinnacle of perfection, and it is very much not a silver bullet for solving problems. Not that anything in computer software is a silver bullet [1].
Erlang is probably not the best example. Programmers using Erlang are almost certainly of a different caliber than those using Ruby and Python. I don't think it's very productive to study dynamic versus static typing without also controlling for the experience and discipline of the programmer. This is an element that many of the academics want to ignore because it's difficult to quantify, but I wouldn't trust any argument that excluded the human element, because it's ultimately humans who use these tools. I still haven't seen anyone address Elben Shira's original point about the difficulty involved in understanding the return values of a function: what's really in that immutable request map? How do I use it?
Given experienced and disciplined programmers, the type system probably isn't as important. But given the quality of programmers who are using these languages in the real world, a good type system does seem to help.
That's the thing, experienced programmers are also better able to use the type system to encode program invariants, beyond just making sure you don't pass a string where an int is expected.
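To give a flavour of what encoding an invariant means here, a minimal sketch (a hand-rolled version of Data.List.NonEmpty from base):

    -- A list that is non-empty by construction.
    data NonEmpty a = a :| [a]

    -- A total head: no runtime check needed, the type guarantees an element.
    neHead :: NonEmpty a -> a
    neHead (x :| _) = x

A function taking NonEmpty a simply cannot be handed an empty list, so a whole class of runtime errors is ruled out before the program ever runs.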
I've heard various CS teachers say that, when presented with Haskell as a first language, students took to it like a fish to water and could start writing useful code really fast.
My personal experience is that Haskell isn't really all that complex - when you stick to the core, it's actually fairly simple, with a small set of well-thought-out features (compared to Scala, for example, which I love but which has a lot of features). Strangely enough, what I find hard when dealing with Haskell is the syntax, because it's so different from all the languages I've been taught or taught myself in the past. That shouldn't be a problem for people who have little to no previous experience and tackle it with a fresh mind.
What you said is purely anecdotal. Here's another anecdote I think is worth mentioning (it is from a Haskeller who has spent years teaching Haskell to kids):
> this experience has taught me that whenever functional programmers claim that certain things are only unintuitive because programmers have had their minds polluted by non-Haskell programming languages... they are usually wrong.
> Contrary to the more common claims you see about how purely functional programming is easier and more intuitive for students without previous imperative programming experience, that's surprisingly not the case. Even in situations where I don't expect it, I'm constantly fighting everyone's urge to think about situations first and foremost in imperative terms.
I think it's safer to assume that functional approaches have inherent complexity compared to imperative approaches. YMMV, of course.
The age of dynamic languages will be over when you see a modern static language that supports self-modifying runtime environments as well as Common Lisp or Smalltalk presently do.
This is an area where the state of the art in static languages (Haskell, Idris, Agda) falls flat, and I suspect that's why (other than ignorance) we see so many comparisons from Haskell advocates between Haskell and the likes of Python, Ruby, or worse. Such comparisons paint dynamic languages as offering only brevity through the omission of type signatures (an advantage for which Haskell can largely compensate) while completely ignoring the capabilities that more sophisticated dynamic languages offer.
>The age of dynamic languages will be over when you see a modern static language that supports self-modifying runtime environments as well as Common Lisp or Smalltalk presently do.
An example of such a language is Java. Its runtime is highly dynamic. Class loaders allow you to create new classes, pre-process bytecode, and do many other things.
The JVM has supported the kind of environment you describe for more than a decade. Fire up IDEA or Eclipse, put a breakpoint in your code and modify it at runtime to your heart's content.
You can even rerun portions of your code after you've modified it (rewind it), something that even Smalltalk never supported.
> The JVM has supported the kind of environment you describe for more than a decade. Fire up IDEA or Eclipse, put a breakpoint in your code and modify it at runtime to your heart's content.
The JVM is a virtual machine targeted by many languages, some of which are dynamic, and Java isn't a "modern" static language from the standpoint of type theory and provability.
In Pharo Smalltalk, for example, you can pause running code in the debugger and modify it using syntax the language doesn't actually support, and then, in the same runtime, open a class browser and modify the compiler to support the new syntax before finally accepting your changes in the debugger and resuming execution. You can swap every reference to one object with references to another object using become:, effectively swapping the identities of two objects globally at runtime.
Perhaps all of this is possible in Haskell and the like, but it's still to be demonstrated.
> You can even rerun portions of your code after you've modified it (rewind it), something that even Smalltalk never supported.
None of these links cover what I was discussing. The closest is part 4 ("restarting") but this section only describes code that can retry after a failure.
I was specifically talking about debuggers that let you rerun code after you've modified it without having to restart your application, something that Java debuggers have allowed for almost ten years now.
I continue to think that neither Lisp nor Smalltalk offers this functionality; can someone prove me wrong?
> I was specifically talking about debuggers that let you rerun code after you've modified it without having to restart your application, something that Java debuggers have allowed for almost ten years now.
Java got this from Lisp and Smalltalk. The first and third links cover what you're talking about. For Pharo, which is graphical, just open the image, open the debugger on some code and modify it to your liking before clicking "Proceed," or alternately, pick an earlier stack frame and do the same.
Did you even read this paragraph:
> In Pharo Smalltalk, for example, you can pause running code in the debugger and modify it using syntax the language doesn't actually support, and then, in the same runtime, open a class browser and modify the compiler to support the new syntax before finally accepting your changes in the debugger and resuming execution. You can swap every reference to one object with references to another object using become:, effectively swapping the identities of two objects globally at runtime.
This is a formal way to do homoiconicity, and frankly much better than codewalkers and CPS-conversions and similar tricks.
I now think that everything you can do with macros in CL you can do in Haskell too, and it will be cleaner. Other good examples are lenses and traversals; they are generalizations of generic references of a kind that CL can only envy.
Someone should really rewrite On Lisp and Paradigms of Artificial Intelligence Programming in Haskell... (and also Design Patterns from the GoF). I think it would prove this point.
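To make the lens claim concrete, a minimal sketch using the Control.Lens library (the Person and Address types are made up):

    {-# LANGUAGE TemplateHaskell #-}
    import Control.Lens

    data Address = Address { _city :: String } deriving Show
    data Person  = Person  { _name :: String, _address :: Address } deriving Show
    makeLenses ''Address
    makeLenses ''Person

    -- Lenses compose, so a nested "update" of immutable data reads like a path.
    move :: Person -> Person
    move p = p & address . city .~ "Berlin"

The composed reference address . city is a first-class value you can pass around and reuse, which is the generalization of setf-style places being alluded to.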
Just in case anyone is reading the blogs in a hurry, keep in mind that neither Dmitri Sotnikov's post nor the one he mentions by Maxime Chevalier-Boisvert actually tries to refute Elben Shira's carefully worded prediction: "This is my bet: the age of dynamic languages is over. There will be no new successful ones."
Instead of making a case for "new successful dynamic languages" and why such new ones would emerge, both authors are defending the existing benefits of today's dynamic languages.
Elben Shira is saying that he predicts the next wave of successful languages will have static checking but also look & feel like a "dynamic" one, made possible by smarter compiler technologies such as type inference. (Maxime Chevalier-Boisvert also mentions the new Crystal language as an example of this.)
If one is itemizing all the great points of Python/Ruby/Javascript, that's orthogonal to what Shira is betting on.
Oh, I definitely think the prediction is absurd. I didn't realize that wasn't obvious from my post.
The problem I outline is not that static typing involves additional boilerplate. It's the inherent additional cognitive load it introduces. A smarter compiler isn't going to help you there.
The fallacy of Elben's argument is in the assumption that there will be advances only in the field of static typing.
Meanwhile, people working with dynamic languages are improving their tools at an astonishing pace. For example, working with an editor like Cursive provides many refactoring options you could only expect from a static language a few years ago.
The biggest benefit, though, comes from interactivity. Dynamic languages lend themselves extremely well to providing live coding environments. This allows for tools like the REPL and things like Figwheel https://github.com/bhauman/lein-figwheel and this kind of workflow will be difficult to match for static languages.
It's obvious you think it's absurd and given your heavy involvement in Clojure, your reaction is not unexpected. But you are still not addressing the main claim of the articles (that no new dynamically typed language will ever emerge again and be successful) and also missing why statically typed languages have won (it's not just correctness guarantees, it's also the automatic refactorings they enable, which are impossible to achieve with dynamically typed languages, and performance).
Clojure and its family of languages will be a thing of the past a few years from now (Clojure is not even a thing of the present, actually, given its minuscule mind share on the JVM).
Elixir (and Erlang itself, actually) are definitely the vanguard of dynamically typed languages at the moment. I think this is because they aren't just convenient to write "normal" code in (like Ruby, Python, JS, etc.), they also come with major advantages in writing concurrent and distributed code, which is increasingly important. BEAM is also a particularly good runtime with better performance (especially memory) characteristics than many of its dynamically typed competitors.
I think this is the counter-argument to the original claim; new dynamically typed languages can be successful by offering convenience for important functionality. Even so, I think type annotations to aid static analysis (which Elixir has and JS will likely have in the future) will be an important feature.
I think the bar has simply been raised for new languages with both static and dynamic type systems. The former can no longer be boilerplate-y, inflexible, and inconvenient, and the latter can no longer be statically uncertain and hard to reason about. This is a good thing.
I am addressing the main claim of the article quite clearly. Static typing adds complexity and there's no evidence that the added complexity results in any tangible benefits.
Therefore, my prediction is that dynamic typing isn't going anywhere unless some empirical evidence appears that clearly demonstrates the advantages of static typing. Seeing how long static typing has been around I wouldn't hold my breath on that.
If you think that static languages have won then you must live in a very special bubble seeing how pretty much all of the web runs on JavaScript.
Having such strong convictions in absence of any empirical evidence is simply faith and static typing seems to just be a religious experience for some people. Many advocates are like bible thumpers coming to your door and telling you to accept monads into your heart. Naturally these are the loudest elements of the community who live in their own little echo chamber.
I liked your article and generally think you have lots of good thoughts, so I want to give you some color on why I think you're getting downvoted here (though I didn't downvote you), so that you don't just think it is because of "bible thumpers".
You seem to be missing that jasod's claim was that the "main claim of the article" is "that he predicts the next wave of successful languages will have static checking but also look & feel like a "dynamic" one made possible by smarter compiler technologies such as type inferencing." You haven't said anything in this thread about "the next wave of successful languages" one way or the other.
> If you think that static languages have won then you must live in a very special bubble seeing how pretty much all of the web runs on JavaScript.
That's true, but has nothing to do with the claim about "the next wave of successful languages" because javascript is clearly part of the last wave (or really, the one before).
You seem to be arguing (in an increasingly heated tone) a broader point (which may be right or wrong) in a thread attempting to focus on a more narrow one.
I'm certainly not aware of any. Pretty much all statically typed languages use the REPL as a toy on the side. It's not part of the actual development process. You're not running the REPL in the actual app you're working on and its usefulness is extremely marginal at that point.
A language like Haskell pretty much precludes having REPL integration, as you can't just run any top-level expression without going through main.
Well, you are aware that the same Haskell you cite has a REPL environment, with on-the-fly code patching, and that it does not require all expressions to "go through main" anyway, right?
I simply have no idea how anyone would use a REPL to do main development (I never liked the idea anyway). I use it just for early stage testing and debugging assistance, just like I do in Python, Lisp, Prolog... and certainly not Perl - this one has a useless REPL.
Let me give you an example of my typical day working with Clojure. I will start the app then connect my IDE to its instance. When I work on any code I write it in the context of the running application. I have access to all the state that's been built up, I can leverage user sessions, database connections etc.
Let's say I'm writing an endpoint to serve some data to the client. I would go to my db namespace, write the function to query the db, run the code and see what it's doing. I'll modify and run it until it does what I want. Then I might go write a controller and do the same thing, and so on.
At each point I have my code running within the context of the app; anything I write in the editor can be evaluated, modified, and inspected.
This workflow simply doesn't exist in Haskell last I checked.
I'm not aware of any simple way to get up and running like you are describing in Haskell, no. But it's unclear whether you are saying that such a thing cannot exist in a statically typed language, or whether it just does not at the moment. Can you clarify?
In the case of Haskell it's due to the fact that you can't just evaluate any top-level form without going through main. I'm sure it would be possible to design a statically typed language that's REPL-development friendly, but none exists currently.
Ok, I either don't get what you are saying, or you never really tried Haskell's REPL for real.
I've probably run most of the modules I've written in Haskell on the REPL, normally without importing the main one, because I use it for early-stage testing. I've run DB interfaces, network abstractions, and, of course, lots and lots of pure stuff. Just like you describe doing for Clojure.
Ok, I don't use an IDE that runs stuff automatically at the REPL. In Haskell I import the interesting module and run it. There is at least one IDE that connects to ghci, but I don't like it and don't know how far it takes debugging (at a minimum, it evaluates your code and tells you the type of anything - I've tried this).
This is simply misinformed. GHCi can evaluate/run IO actions. Syntax: to run an action and throw away the result, simply type an expression of type IO a: it will be evaluated and executed. If GHCi can show values of type a, it will (unless a = ()). To bind the value, say var <- expr.
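A minimal sketch of such a session (the Db module and its functions are hypothetical; only :load/:reload and the bind syntax are real GHCi features):

    ghci> :load Db.hs                  -- load a module containing IO code
    ghci> conn <- connect "localhost"  -- run an IO action and bind its result
    ghci> fetchUsers conn              -- run an IO action; GHCi prints the result
    ghci> :reload                      -- pick up source edits without restarting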
People create languages all the time. Some languages (like PHP) were just created for problems that the creators had, not for some overriding empirical reason. Dynamic languages also tend to be easier to make. There's nothing to suggest people will stop creating languages and then adopting them, when they are tailored toward specific problems. Any reasoning as to why they might not appear is hand waving for attention.
Reread the article. It doesn't say such languages will not emerge (I certainly hope people keep creating languages all the time); it says they will not be successful.
I'd rate Elixir right now only as "up-and-coming". It is not yet an acceptable second-tier choice. Even in an environment where Ruby or Python (or the several other similar second-tier languages) are accepted without a second thought (as opposed to an environment where only first-tier languages like Java, C#, C++, or C are acceptable), Elixir will still result in some questions being asked that you had better have a good answer for.
This is not a criticism of the language, simply descriptive of its current position. Go, which is substantially larger at the moment from what I can see, is just barely cracking that boundary that I described, and probably still has a good year or two before it's quite there.
Elixir is currently on a good trajectory, but I would definitely be concerned that trying to be a Ruby on top of the Erlang VM is going to make it a non-trivial challenge to get to the next level of usage, precisely because the Ruby space is covered, covered, and covered again by mature, existing languages. This is, again, not a claim from me that it can't succeed (indeed, I wish it all the best), just a description of its challenges. It is growing rapidly in its little niche, but it may have a serious challenge getting to the next level.
And, back on topic, when this came up on Reddit, I'd observe that Elixir is the only dynamically-typed "up-and-coming" language I can think of. Nim is statically-typed. Rust is static. Go is probably on the way out of "up-and-coming" but it's static.
I also observe that the dynamic languages are all adding static type support, even when that support is inevitably compromised by the dynamic portion of the language. In the meantime, I don't see any static language trying to move in a dynamic direction... in fact, many of the static languages are moving to be even more static! The net flow of feature work right now is strongly in the static direction.
Point. And it's a non-trivial implementation, from the look of it. A lot of static languages have "a thing that can be passed around as a 'dynamic' value but has to be cast back to a static type to do anything", but the C# feature is much more than that.
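For comparison, the simpler "cast it back" pattern looks like this in Haskell (a minimal sketch using Data.Dynamic from base):

    import Data.Dynamic

    box :: Dynamic
    box = toDyn (42 :: Int)      -- erase the static type

    unbox :: Maybe Int
    unbox = fromDynamic box      -- cast back, checked at runtime

Everything useful still requires the cast back to a concrete type, which is exactly the limitation being contrasted with the C# feature.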
Also, I observe on the docs page this little note that says "As of Scala 2.10, defining direct or indirect subclasses of this trait is only possible if the language feature 'dynamics' is enabled."
Almost all the static languages have some sort of dynamic feature already, and generally have since day one. The mere existence of "dynamic" features isn't my point. The question is, how often are they used, and is the language tending towards more of them or less of them? I don't do Scala, so I can't speak to it, but does every little Scala program use this dynamic package, or is it something rarely reached for, and even perhaps considered a code smell by the community? If it's the latter, then it's not a huge notch in the "dynamic" belt.
I'm discussing the first derivative, not the current state of the world.
I don't think it's considered a code smell, but it is intended to be used only for cases where the problem you're modelling really is dynamic. Basically, in cases where you end up having classes with methods like

    def lookup(name: String): Any

you can instead extend the Dynamic trait (implementing selectDynamic), meaning that instead of having calls like this:

    myObj.lookup("foo")

you can do this:

    myObj.foo
So no, Scala isn't really trying to become more dynamic, but it has a feature for dealing with problems that are dynamic anyway.
Erlang is reliable because it's fault tolerant. The Strong Static Typing camp wants software that is reliable because it is faultless.
Elixir has Erlang's philosophy on the problem. I think most of the wailing and gnashing of teeth is because most languages (even most statically typed ones historically) have just completely punted on the notion of language support for writing software that doesn't fall over in a stiff breeze. The more we build layers of software between humans and their actual goals the more this is going to come to a head.
And you are falling prey to the Nirvana fallacy [1].
Yes, we will never be able to prove that a program is 100% correct but that doesn't mean we shouldn't prefer solutions that bring that number up (statically typed languages) over solutions that don't (dynamically typed languages).
Just because you can't do something perfectly doesn't mean you shouldn't try to do it as best you can.
Yes. Real-world programming is an entirely different thing from academic research.
In the real world, it may well be the case that while you're screwing around with a baroque type system, your competition has iterated three or four times and your milkshake has been drunk.
Of course your compiler will never be able to prove the correctness (or lack thereof) of all programs.
The relevant question is: what share of useful programs can our compilers help us create? This is not answered yet, and you seem to have never even thought about it.
Fun fact: the Busy Beaver function applied to the universe yields a finite number, and seeing as we need to leave some space to actually run the program, we can conclude that in fact any program that can be written can be proven to halt. Granted, in the case of "for(;;){printf("hello");}" that halt is the heat death of the universe, but it's still theoretically tractable.
I love Ruby, Python and JavaScript. I believe I am more productive with dynamic languages than with static languages like C# or Java. The abstractions in dynamic languages (at least for me) allow me to write less code, with less confusion and fewer classes to remember.
With so many libraries supporting my development, I usually tend to write much less code using a dynamic language. Also, most of the code tends to be isolated, with clear interfaces exposed to other parts of the code. Writing tests is simpler too, compared to writing them in static languages.
Maybe large projects with hundreds of thousands of lines of code could benefit from statically typed languages. But for the majority of projects (the 80-20 rule), dynamic languages offer cleaner syntax, a simpler coding experience and faster time to market (my opinion).
On syntax alone, my assertion is that this is just because dynamic languages evolve faster, because they are easier to build.
E.g. one person can buy the dragon book(s), use parser/lexer libraries, get an AST, and then start an interpreter that just stores objects as maps and evals things dynamically (see the sketch below).
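A rough sketch of that objects-as-maps core (in Haskell for consistency with the rest of the thread; the Value type is invented for illustration):

    import qualified Data.Map as M

    -- A toy dynamic value: primitives, plus objects as string-keyed maps.
    data Value
      = VInt Int
      | VStr String
      | VObj (M.Map String Value)
      deriving Show

    -- Dynamic field access, checked only at runtime.
    getField :: String -> Value -> Maybe Value
    getField k (VObj m) = M.lookup k m
    getField _ _        = Nothing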
But very few people are going to write a new/novel type system on their weekends.
So that's why, IMO, all the sexy/modern syntax shows up in dynamic languages first, and then takes ~a generation or two to reach static languages, where you need a dedicated team to put the effort into the type system + compiler + expected tool set (IDEs/etc.).
(E.g. see Scala, Kotlin, Rust, etc., which have all, IMO, achieved the "syntax as nice as Ruby" level, but took large teams of people to do it).
> see Scala, Kotlin, Rust, etc., which have all, IMO, achieved the "syntax as nice as Ruby" level
I only have reasonable experience with Rust from that list, but... no. I've seen enough Scala to know that it's not true there, and I know it's not true for Rust.
That's not to say that static languages can't have nice syntax - Crystal is a great example if you're trying to match Ruby - but I don't think your examples are great.
I simply question writing projects using a monolithic style in the first place. Anytime somebody brings this up as an advantage of static typing I get a little suspicious. In fact, it could be argued that static typing gives the user enough rope to hang themselves with.
With dynamic typing you're much more likely to break things up early precisely because you know you have to keep things in your head.
On the other hand, static typing allows you to just keep adding code and it makes sure that the code will compile and run. However, at some point you end up with a large and complex system that you don't really understand. All you know is that it's self consistent. Understanding what the business logic is doing and why can be quite difficult and the fact that it type checks does little to help you there.
Coincidentally, this is the reason that static typing proponents are obsessed with refactoring. Once you realize what a mess you've gotten yourself into, you have a ton of code to refactor and there's no way you can keep track of it all in your head.
With dynamic languages, refactoring tends to happen in much shorter cycles and introduces much less cognitive load.
Sorry, but I have a large, complex problem. I just can't have all the business logic in my head at the same time.
Now, I could just write the logic down as logical statements on paper, and check my code against it... Or I can write the logic down as logical statements on the computer, and have it check my code automatically. Pick your poison.
You mean you'll never build a monolith. In practice, I've been working on big systems in Clojure for the past 5 years. They just happen to be broken down into manageable components you can work on and reason about in isolation.
less confusion, fewer classes, more libraries, cleaner interfaces and easier testing.
Of those, classes and libraries obviously apply per language, not per typing discipline; confusion was rhetorically tied to the number of classes, so again, that's about the language, not the typing.
tormeh's right, the comparison was between languages, not between typing.
I think the place dynamic languages currently occupy will, in the future, be dominated by hybrid languages.
Having no type annotations, even in highly dynamic languages, leads to lower maintainability. However, some type systems are too complicated and over-engineered, the best example of which is Haskell (compare it to Agda or Idris, which can do more but have a much, much simpler type system).
I think gradual typing is definitely likely to become the norm. It allows you to develop things using the dynamic style and then add type annotations where appropriate. It also avoids the problem of mixing the proof into your solution.
I think the main problem with types isn't that they slow you down. In my experience they actually don't, and the reverse is actually true. The main problem is that static languages are often too complicated for non-professional software developers, i.e. designers, system administrators, data scientists, etc., to be motivated to learn them. They just need to write the code, and when they do it themselves, the stuff gets done much faster and much more efficiently.
The inherent complexity of static typing is precisely my argument. You have to demonstrate that the additional complexity actually adds value in the long run. This seems like a prerequisite before you start extolling this approach.
>I would say Java (the native syntax) is over engineered. The runtime is a beautiful thing.
It's not over-engineered. Making the syntax this way was a design goal. AFAIU, the authors of the language thought that when you specify as much as possible in syntax, the code is easier to maintain. That's why it's so popular in the enterprise world, where code often lives for decades.
> when you specify as much as possible in syntax, the code is easier to maintain
That's not true in all cases. Most importantly, the number of abstractions baked into the language is a combination of workarounds (rather than fixes to the type system) and whatever the JLS contributors' motivations were. Some of that has been useful, some of it has not. It's over-engineered.
When people write Lisp code, they invent new dynamic languages all the time. Inventing and implementing new languages (a.k.a. writing macros) is just part of the regular course of doing business in Lisp. And these languages that people invent when writing Lisp tend to be dynamic, because they tend to compile to Lisp. But of course they don't have to be.
I suppose no rational person believes dynamic languages will disappear completely. Also, I think both dynamic and statically typed languages exist on a spectrum. It is not very productive to talk in generalized terms like "the end of dynamic languages".
The post is kinda like saying, you don't really need to understand notions of convergence or measure to be able to do integration. It's true, in a sense, and it's faster to just learn Newton's formula and get the job done. But you may get into problems without good foundations.
I like dynamic languages, especially Python and Clojure. But programming with types is important to get things right, on the correct, formal basis. (I think functional programming is a misnomer, it should really be called algebraic-oriented programming, but I digress.) Once you see how to do it correctly it's perfectly OK to mimic the same abstraction in dynamic language.
You can reason and prove code to be correct in a dynamic language. What static typing gives you is compiler assisted proving. The cost of the compiler assistance is that you have to express the code in a way that often makes it more difficult for a human to understand.
> often makes it more difficult for a human to understand.
Does it though? I guess it depends on the type system.
For example, when I find it hard to understand Haskell code, it's not because of the type system, it's usually that the compiler is doing too much magic for me (with type classes, and automatically lifting things into monads).
I find that having the types actually makes the code itself easier to understand, because I can ask the compiler (usually through the REPL) to tell me what kind of thing can go where. It makes that kind of information more local.
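For example, asking GHCi for the type of an expression (these queries are against the standard Prelude, so the output shown is what a recent GHC reports):

    ghci> :type traverse
    traverse :: (Traversable t, Applicative f) => (a -> f b) -> t a -> f (t b)
    ghci> :type fmap (+1)
    fmap (+1) :: (Functor f, Num b) => f b -> f b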
You can do it, but you don't. It's not a language thing, it's a culture thing.
A language like Haskell forces you to do it. It is more effort, but this effort pays off in getting composability and edge cases right. The end result is not actually harder to understand; rather, you had not fully understood the problem in the beginning (by glossing over technical details).
Ah but the thing is that the payoff is simply assumed. Nobody has actually shown this claim to be true. You spend more effort and you hope that there is a payoff. Yet you have things like Cabal that's as flaky as anything written in PHP. Clearly static typing isn't some magic pixie dust that all of a sudden makes all your code work correctly.
> Clearly static typing isn't some magic pixie dust that all of a sudden makes all your code work correctly.
Static typing itself - no. What is important about Haskell is the equational reasoning (algebraic thinking, if you will) you can do, not static typing. Static typing is just a vehicle to be able to pull it off.
There is a payoff already - see my comment to Lisp guy above. We could also talk about how monads made LINQ great, or how streams in Java have to be more complicated than necessary because the language wasn't very well formally defined.
And there will be a bigger payoff in the future, when we learn, for example, how to type a whole big software component (like a web service) correctly. So far it's an art driven by engineering intuition, but formalization will help us build more robust systems.
The payoff is also assumed based on experience with other branches of mathematics (as I was alluding to in my first comment), where, historically, formalization has always paid off. That doesn't mean you cannot get things right informally, but it's usually fiendishly difficult.
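To make equational reasoning concrete, here is the standard map fusion law, with the usual induction sketch in comments (nothing here is specific to any library):

    -- Claim: map f (map g xs) == map (f . g) xs
    --
    --   map f (map g [])     = map f [] = [] = map (f . g) []
    --   map f (map g (x:xs)) = f (g x) : map f (map g xs)
    --                        = (f . g) x : map (f . g) xs   -- induction hypothesis
    --                        = map (f . g) (x:xs)
    fuse :: (b -> c) -> (a -> b) -> [a] -> [c]
    fuse f g = map f . map g   -- equal to map (f . g); GHC's rules can rewrite it as such

Purity is what makes each step of that calculation valid; with unrestricted side effects the law simply doesn't hold.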
Again, you're making a lot of assertions in absence of any evidence to support them. One would think that this would be easily demonstrable seeing how long Haskell has been around, and yet no such evidence exists to my knowledge.
I think there is plenty of evidence (of how types help you design better APIs), you just don't want to accept it. I already mentioned LINQ and lens; others would be parser combinators and FRP (Observables in particular).
"I am thinking of a data structure for managing collections of objects. It provides O(1) insert and update operations. It has native hardware support on all modern platforms. It has a long history of use. It's proven, and it's lightning fast.
Unfortunately, support for it in my favourite language, Haskell, appears to be somewhat lacking. There are people that would tell me that it's not needed, that there are other options, that it's bad for parallelism and bad for computer science in general. They say that without it, programs are easier to understand and reason about. Yet, it seems that every time I start to write a program I find myself wanting for it. It's called the store.
The mutable store, that is. I want for real destructive update in a real functional language. I wanted it for long enough that I decided I should take a PhD position and spend the next several years of my life trying to get it."
http://benl.ouroborus.net/papers/thesis/lippmeier-impure-wor...
While I admire him, I think the better solution is to give up Haskell.