Ask HN: Why do mainstream dynamic languages not ship with compilers?
1 point by divs1210 on June 24, 2020 | 7 comments
I often wonder how one-man projects like Gambit and Chicken Scheme ship with static compilers that produce small, efficient binaries, yet in the world of mainstream languages like Python/Ruby/JS this is basically unheard of.

Do Python/Ruby/JS programmers not like small, fast binaries?



I think you'll have a deeper appreciation of the issues when you make an honest attempt at a Python-to-C compiler yourself. The result will be full of compromises and trade-offs.

Other attempts like Cython already exist. Perhaps you believe they missed something obvious? If so, finishing your own compiler would be an excellent way to advance the state of the art.
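For a concrete sense of what the existing approach looks like, here's a minimal sketch (the file name and build command are only illustrative): Cython will compile plain, unannotated Python like this into a C extension, but the generated C still calls the CPython runtime for every object operation, so most of the dynamism cost remains.

    # fib.py -- plain Python that Cython can compile to a C extension
    def fib(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    # Build (assuming Cython is installed):
    #   cythonize -i fib.py
    # The resulting extension module is importable from Python, but without
    # type annotations most of the benefit of static compilation is lost.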


I'm pretty much on the verge of doing this.


You are assuming that a compiler for a dynamic language would produce small, fast binaries. A simplistic compiler for a dynamic language can assume so little about the program that the resulting binary would consist mainly of calls to support code written in C, which is pretty much what we already have in CPython, minus the small interpretation overhead.

Even PyPy[0], which isn't a simple compiler by any means, is only 4.4 times faster than CPython by their own measurements.

In short, implementations like CPython aren't slow because they are interpreted; they are slow because the language is dynamic. There's a lot happening "under the hood".

[0] https://www.pypy.org/
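To see why a compiler can assume so little, consider a small illustrative example (the class and names are made up): even a plain a + b has to be dispatched at runtime, because the meaning of + can change while the program is running, so compiled code ends up making the same generic runtime calls the interpreter does.

    class Meters:
        def __init__(self, value):
            self.value = value
        def __add__(self, other):
            return Meters(self.value + other.value)

    a, b = Meters(1), Meters(2)
    print((a + b).value)   # 3 -- '+' dispatches to Meters.__add__ at runtime

    # The meaning of '+' can be changed at any moment, even after "compilation":
    def weird_add(self, other):
        return Meters(self.value * other.value)

    Meters.__add__ = weird_add
    print((a + b).value)   # 2 -- same expression, different behaviour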


Scheme is highly dynamic and can be compiled to surprisingly small and fast binaries. SBCL can produce blazing fast (though not small) binaries for highly dynamic Common Lisp programs.

The claim that Python is slow because of its nature rather than its implementations is flawed.

PyPy is pretty fast. Python compiled to Chicken Scheme / SBCL should be in the same ballpark, if not faster.


> Scheme is highly dynamic and can be compiled to surprisingly small and fast binaries.

Yes, after almost 60 years of compiler development. PyPy may get there in another 40 years.

> The claim that Python is slow because of its nature rather than its implementations is wrong and flawed.

The Python core developers don't lack brain power. If it were just a matter of poor implementation, CPython would be much faster than it is. Or do you claim the core developers are incompetent?

> PyPy is pretty fast.

The PyPy people only claim 4.4 times faster than CPython, which is itself about 20 times slower than C on comparable code and much slower in many cases, so "pretty fast" is questionable. Still, in 40 years' time maybe CPython will match the fast languages, but I doubt it.
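Figures like these usually come from micro-benchmarks along the following lines (a sketch only; the numbers vary wildly with the workload and machine, which is why every "N times faster" claim needs qualifiers):

    import timeit

    def count(n=1_000_000):
        # A tight arithmetic loop: the kind of code where interpreter
        # and dispatch overhead dominates.
        total = 0
        for i in range(n):
            total += i
        return total

    # Run the same script under CPython and under PyPy and compare timings;
    # a C version of the same loop is the usual baseline for "20x" figures.
    print(timeit.timeit(count, number=10))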


Turns out there IS a third-party optimizing static compiler for Python:

https://github.com/Nuitka/Nuitka
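For anyone curious what that looks like in practice, here's a minimal sketch (the script name is made up; --onefile is one of Nuitka's standard options):

    # hello.py -- a trivial program to feed to Nuitka
    def main():
        print("hello from a compiled Python program")

    if __name__ == "__main__":
        main()

    # Build (assuming Nuitka and a C compiler are installed):
    #   python -m nuitka --onefile hello.py
    # This yields a self-contained executable, though it bundles the CPython
    # runtime, so the result is not "small" in the Scheme sense.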

Also, a 4.4x speedup in PyPy's case is enormous. It roughly translates to 4x less energy, 4x fewer resources to manage, 4x lower operational costs, and so on.

The goal is not to have the fastest implementation, but a fast enough implementation that can be distributed easily.


Probably because compilation to native code and linking against a native runtime were not part of the project's original goals, and the designers were more interested in matters such as the grammar.



