
Working on a large Scala codebase, I quickly learnt that 'compile time' in one language does not always happen before 'run time' in another language.

What matters is not the phase in which the bug is found, but how close in absolute time finding the bug is to writing it.



Agreed, and we have to continue improving in every way: dynamic, static, hybrid. I'm just saying that I have not seen this dynamic enlightenment in larger projects. I have only seen the pain of runtime errors that other (static-language) teams never had. Sure, if you had written a test for it, you wouldn't have had it either, but types force you to think about it while writing. So yes, you are 'done faster', but the fall-out (and again, of course, YMMV) of statically preventable bugs popping up in Sentry at 3 am is not great. Not necessarily bugs the types would have caught directly, but ones you would have thought about more because you had to define the types, which I think is what the parent poster misses too. Without types I just tend to think less and try more, which, again for me, is a bad state, but YMMV.

But sure, I am biased, as my experience with statically typed langs has been good since I moved from asm/basic in the 80s to Pascal/C (they were an improvement over asm/basic and my first experience with types; not saying you should use them now, or not).


The view I've heard expressed is that deep thought on a piece of code reduces bugs. Whether that takes the shape of religious TDD, rigorous proofs, or detailed type design doesn't make much of a difference.

I used to be fully bought into types, but I've since realised that they have a number of downsides that in many cases more than offset their benefits:

1. Ergonomic type systems require a lot of work to happen at compile time, which slows down iteration time (one of the more important things for programming, in my view). Saving the source and seeing the result almost immediately in a browser is one of the big advantages of web development.

2. Types are almost always written in a second, much less powerful DSL and then sprinkled distractingly through the code that actually does the work. I prefer the way Haskell does this: separating the type signature out onto its own line rather than mashing the two different languages together.

3. Higher levels of abstraction tend to become very hairy in many type systems (although not all). This ends up meaning that people who like types often restrict themselves (unconsciously) to less abstract programming. They notice the time they're saving by avoiding some kinds of bugs, but they don't see the time they're wasting by being unable to talk at a higher level of abstraction. Another way this shows itself: types are very rarely first-class objects in strongly typed languages, making it very difficult to write code that operates on, or understands, types.
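
To make the first-class point concrete: most mainstream languages only get partway there. TypeScript, say, lets ordinary generics compute over the shape of a type, but the types themselves still aren't runtime values. A sketch with invented data:

```typescript
// Generic code parameterized over the shape of a type: `keyof T` lets
// the compiler check the key and infer the result type, but `T` itself
// never exists at runtime.
function pluck<T, K extends keyof T>(items: T[], key: K): T[K][] {
  return items.map(item => item[key]);
}

const rows = [{ id: 1, label: "a" }, { id: 2, label: "b" }];
const ids = pluck(rows, "id");       // inferred as number[]
const labels = pluck(rows, "label"); // inferred as string[]
```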

4. Type systems open up opportunities for type-driven architecture astronauting, which is yet another unproductive rabbit hole you can go down. There was an interesting study of different teams solving problems in different languages. The differences between teams using the same language were much bigger than the differences between languages, but the team that made the slowest progress (without having a particularly low number of bugs) was the one that leant hardest into encoding everything in the type system.

5. Type systems encourage code-generation build pipelines, which again slow iteration and make everyone's life miserable.

6. Type systems reflect an incorrect model of the world: user input, network input, and file system data are not typed. I've had real misery with web server frameworks that refuse to acknowledge that they don't know every possible thing the web client might send them, and that they can't slot it all into a predefined type. I think this is the same kind of error we made with OO systems: thinking we could fit the world into a predefined inheritance hierarchy.
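
To make the mismatch concrete, here's a sketch in TypeScript (field names invented): the honest option at a boundary is to admit the input is `unknown` and narrow it explicitly, rather than pretending the client is guaranteed to send a predefined type.

```typescript
type Order = { id: string; quantity: number };

// The boundary admits it knows nothing about the value yet: `unknown`
// can't be used until it's been narrowed by explicit checks.
function parseOrder(raw: unknown): Order | null {
  if (typeof raw !== "object" || raw === null) return null;
  const r = raw as Record<string, unknown>;
  if (typeof r.id !== "string") return null;
  if (typeof r.quantity !== "number") return null;
  return { id: r.id, quantity: r.quantity };
}

// Anything the client sent that we didn't anticipate is rejected here,
// not deep inside a handler that assumed a predefined type.
```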

7. Type systems encourage a static view of the world. The types of things can change under you dynamically (e.g. the structure of a table in a database), but in most typed languages you can't cope with that correctly without shutting down and deploying entirely new code.
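
A sketch of the problem in TypeScript (names invented): when a table's columns are only known at runtime, the code has to consult a runtime schema value instead of the compiler.

```typescript
// A static row type can't describe columns discovered at runtime, so
// the schema becomes an ordinary value the code checks against.
type ColumnType = "string" | "number";
type Schema = Record<string, ColumnType>;
type Row = Record<string, unknown>;

function conformsTo(schema: Schema, row: Row): boolean {
  return Object.entries(schema).every(
    ([col, ty]) => typeof row[col] === ty
  );
}

// If a migration adds or retypes a column, only the runtime `schema`
// value changes; nothing needs to be recompiled and redeployed.
```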

8. Related to that, it's hard to imagine using a strongly typed language with the live-image approach of Smalltalk, or the one sometimes used by Lisp systems. This means that the popularity of strongly typed languages is killing valuable and interesting approaches to building complex systems that emphasise observability, interaction, and iteration as a way of understanding them.

There are genuine advantages to typed languages, but many of the advantages touted as unique to them can be provided by advanced linting and IDEs (IntelliJ was surprisingly capable on plain JS + JSDoc). You can also ameliorate some of the disadvantages of untyped languages, while keeping the benefits, by deliberately programming in a fail-fast way.
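
The fail-fast style can look like this (TypeScript sketch, hypothetical helper and function names): check assumptions loudly at the point of entry instead of letting a bad value propagate.

```typescript
// A tiny fail-fast helper: crash immediately and loudly when an
// assumption doesn't hold, rather than returning garbage.
function invariant(cond: boolean, msg: string): void {
  if (!cond) throw new Error(`invariant violated: ${msg}`);
}

function averagePrice(prices: number[]): number {
  invariant(prices.length > 0, "averagePrice needs at least one price");
  invariant(prices.every(p => p >= 0), "prices must be non-negative");
  return prices.reduce((a, b) => a + b, 0) / prices.length;
}
```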

I'm sure that type systems have their place. The research I've come across on empirical studies suggests that while there may be positive effects they are small, which does not at all mesh with the extreme partisanship I generally observe. Yes, type systems gain you something, but there seems much less awareness of what you lose.


I am positive about a lot of these points for the future, especially the performance points; that's moving forward fast. But yes, compilation is often pretty slow; I'm not that bothered by it for my work, though. Also, linters work well for statically typed languages too; I usually don't have to compile for hundreds of lines of code. If the editor does not complain, it'll probably all work fine. Like I said, do what works for you, but I think at least a good mix will get you more benefits.

6. People mention this more often, but I just don't see how it works; you cannot program without knowing what data you are getting. Sure, the world is not typed, but at the moment you are going to use the data, it is typed, be it in your logic, your head, or actual types. Any webserver can go lower level and give you a ByteStream, but when you finish parsing that, you still have types. You might not know them upfront, so you use ByteStream for a bit, but once you know, you bake in types and the world is nicer. Imho :) Not sure why that's a difference?

7. This is an issue where? I know it's Erlang's domain, but microservices/docker/k8s/ci/cd/lambda/functions/.../all modern crap do this (redeploy, killing the previous instance(s)) with any code, always, including dynamically typed code. So it sounds like a niche?

8. Agree with this; we should experiment and research these things and continue building them. I work with Lisp/Clojure as well and like it, I just miss types often. I never suggested it's all crap; I'm just looking where benefit comes from.



> you cannot program without knowing what data you are getting

Types almost always overconstrain. Each part of your code relies on some very specific properties of your data, yet most type systems end up restricting your function to work only with data that meets a whole bunch of other properties your code doesn't actually care about.
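
A TypeScript sketch of the point (types invented): the first version demands a full User record even though it only reads one field; the structural version states only the property the code actually relies on.

```typescript
type User = { id: number; name: string; email: string; createdAt: Date };

// Overconstrained: callers must produce a complete User, even though
// the function only ever touches `email`.
function domainOfUser(u: User): string {
  return u.email.split("@")[1];
}

// Narrower: the parameter type names exactly the property the code
// depends on, so any value with an email works.
function domainOf(x: { email: string }): string {
  return x.email.split("@")[1];
}
```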


> Types almost always overconstrain

Not in my experience; in practice there are so many basically stringy types.

> Each part of your code relies on some very specific properties of your data,

So then you either have a type that exposes just the properties you need, or you have different types for different functions.

> yet most type systems end up restricting your function to only work

Again, I don't understand this statement; someone implemented the types to fit the data for the functions they needed. How does the 'type system' restrict anything?


There's a lot of things here, some about old-school types (Java, C), others about modern ones. I don't think most are fundamental, even though some are common experiences today.

#1 is fundamental. (Yet people somehow live with the JS ecosystem, which is slower than GHCi.) It's supposed to become an ever smaller problem, since computers keep getting faster; but I don't think we've put everything we can into types yet, so I expect it to get worse in the near future.

#2 and #3 are about old-school types.

#4 Oh yeah, they can. But they can also help a lot with team coordination. Powerful tools enable you either way; whether you harm yourself or take advantage of them is your choice.

#5 Failures in type systems are what encourage code generation. Expect that to keep improving, but slowly.

#6 That's why there's always a parsing stage between input and processing. You deal with input errors at the parsing stage, and with processing errors at the processing stage. Most communities around dynamic and old-school languages do the industry a great disservice by mixing those; it explodes error handling into something intractably complex.
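
A sketch of that two-stage split (TypeScript, invented domain): the parsing stage owns the input errors and yields a clean type; the processing stage takes that type and has no input errors left to handle.

```typescript
type Celsius = { kind: "celsius"; value: number };
type ParseError = { kind: "error"; msg: string };

// Parsing stage: all input error handling lives here.
function parseCelsius(input: string): Celsius | ParseError {
  const n = Number(input);
  if (Number.isNaN(n)) return { kind: "error", msg: `not a number: ${input}` };
  return { kind: "celsius", value: n };
}

// Processing stage: by construction it only ever sees valid readings,
// so no input error handling is mixed into the logic.
function toFahrenheit(c: Celsius): number {
  return c.value * 9 / 5 + 32;
}
```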

#7 Hum... you are holding it wrong. Do not encode the variants in your types. Instead, use the type system to get every invariant out of the way, so the variants stand clear. (And yeah, there are plenty of libraries and frameworks out there that try to encode the environment into types. That deeply annoys me... But anyway, if you do that, take the types as requirements upon the environment, not as its description. Those are different in very subtle ways.)
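
A sketch of getting an invariant 'out of the way' (TypeScript, hypothetical id format): validate once at construction, and let the type carry the guarantee downstream.

```typescript
// A branded type: structurally a string, but only obtainable through
// the validating constructor, so the invariant travels with the type.
type UserId = string & { readonly __brand: "UserId" };

function toUserId(s: string): UserId | null {
  // Assumed format for the sketch: non-empty lowercase alphanumeric.
  return /^[a-z0-9]+$/.test(s) ? (s as UserId) : null;
}

// Downstream code needs no re-validation; the type already guarantees it.
function lookupKey(id: UserId): string {
  return `user:${id}`;
}
```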

#8 This shouldn't be fundamental. AFAIK not many people are trying this, and the few who do face a Sisyphean task of keeping their code up to date with mainstream changes. I do hope people make progress here, but I'm not optimistic.


> Yet people somehow live with the JS ecosystem that's slower than GHCi.

I think a lot of people, including the parent, equate speed of ecosystem and iteration with web development and instant reload of web pages. When other systems allow fast iteration, it goes unnoticed unless it's for web dev. Luckily, a bunch of those 'impossible' systems have it now too, like [0].

[0] https://ihp.digitallyinduced.com/blog/2020-08-10-ihp-live-re...


Web development is an example. Fast iteration is the thing that I like. I have so far associated fast iteration with dynamic languages, and in my experience most fast-iteration systems are indeed dynamic.

But maybe that simply reflects a concern of the relevant communities. If strongly typed language systems start adding fast-iteration approaches and achieve a similar level of quick iteration, that will definitely address one of the things I dislike about them. I haven't written significant amounts of Haskell since 2000, but back then what you could do interactively was very restricted.

At the end of the day, the compiler is doing a bunch more work in strongly typed languages. It's like taking a bunch of your verification infrastructure and saying 'these must run before you're allowed to see the result of what you wrote'. It will necessarily be slower, although with work maybe not so much slower that it matters.


> Fast iteration is the thing that I like

> haskell since 2000,

Things changed a lot in 20 years.

Thanks for noting down something about your age; I have always been a bit ageist about 'fast iteration', as I never met someone close to my age (I've been devving professionally for 30 years this year) who cares much about it. I am not a very good programmer, but a very experienced one, and I'm consistently faster at delivering than my 'fast iterating' younger peers, as I simply know what I'm going to type beforehand. I don't need many iterations to get it right, and I have enough experience to know that I'm close to what we need once it compiles. The people who just type/run 1000 times a minute get stuff done, but it's not the way I would ever like (or liked) to work.

> It will necessarily be slower,

GHCi is fast, but other avenues can be explored as well, like creating a real interpreter just for development, the way Miri does for Rust. For faster iteration on logic you forgo some of the type benefits, but when you are done iterating, you compile and voila. I guess the merging of incremental compilation, JITs, interpreters, etc. will evolve into something that might not run optimally but gives blazingly fast iteration, up to perfect performance after deployment. And anything in between.


> #2 and #3 are about old-school types.

There absolutely are approaches that don't fall foul of my complaints, but when you say 'old-school types' I think you're talking about Java and non-inferred types.

I was including other, more modern languages in my criticism. Scala, for example, ends up with pretty hairy types very quickly for higher-level code. So much so that they made the documentation system lie about the types to make them easier to understand.

And most currently popular languages don't give you runtime access to types or allow you to treat them as first class.

The languages that allow you to deal with types with the same language you write code in are not remotely mainstream. So unless by 'old school' you include all mainstream languages then I disagree.


Yes, I meant types systems like Java's.

I was thinking about unusable code, caused by the need to write way more down in brittle types than you save on the coding itself. There are indeed problems with complex types.



