Ah, yes, then it becomes a question of real tradeoffs, I guess. There are a lot of domains in which I will happily trade a reasonable amount of performance for nice language features, though.
That's true. My favourite language is Ruby, which is an exercise in how to frustrate a compiler writer (in a sign of deep-seated masochism, I'm working on an ahead-of-time compiler for Ruby), and much of that frustration certainly is a result of providing nice features.
But many of the worst offenders in such languages also have little to do with "nice features" and more to do with not thinking about compilation. E.g. one of the big issues with compiling Ruby is that classes can generally be re-opened at any time.
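To make that concrete, here's a minimal sketch of re-opening a core class at runtime: String gains a new method long after its original definition, and every existing string picks it up immediately.

    # Re-opening String: this can happen at any point during execution.
    class String
      def shout
        upcase + "!"
      end
    end

    puts "hello".shout   # => "HELLO!"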
It's an awesome feature for a lot of things. But it also makes it incredibly hard to formally reason about Ruby code.
What does "1+1" mean? You can't know without knowing whether or not someone has overridden Fixnum#+, and that can happen anywhere. So even if you know that 1+1 is 2 now, after the next method call you don't know any more, unless you've determined exactly what the intervening call was and traced everything it might have called. Oh, and you need to take threading into account: executing 1+1 in a tight loop, expecting it to mean the same on every iteration? You can't safely do that, as someone might be redefining methods in another thread. And even if you determine that that can't happen now, the moment anyone calls eval() on data you can't statically determine (or one of its cousins, such as "require"), you're back to square one.
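For illustration, this is all it takes to change what 1+1 means (on Ruby versions of this era, where integer literals are instances of Fixnum); please don't do this in real code:

    puts 1 + 1    # => 2

    class Fixnum
      def +(other)
        42        # every Ruby-level Fixnum addition now returns 42
      end
    end

    puts 1 + 1    # => 42; the "same" expression, a different meaning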
Of course, 99% of the time nobody has a sane reason to do that, and in the 1% (more like 0.001%) of cases where it happens, it's almost always because someone is proving a point, and nobody would care about breaking the proving-a-point cases. Outlawing the overriding of methods on the basic system classes, or even "just" specifying a subset of basic methods that implementations may treat as immutable, would make optimising Ruby code immensely easier with very little practical impact. Ruby has the woefully underused "freeze", but that applies only to objects; a method-level "freeze" or "final", carefully applied to a few dozen rarely-if-ever-overridden methods, would make a lot of difference. It'd even be fine to supply a way of overriding it, as at least then you'd be signalling clearly that you're about to do something that requires disabling certain optimisations.
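To illustrate the gap: Object#freeze exists and works per object, but the method-level equivalent being wished for here does not exist; the freeze_method call in the sketch below is purely hypothetical.

    s = "constant".freeze
    begin
      s << "!"
    rescue RuntimeError => e
      puts e.message     # => can't modify frozen String
    end

    # Hypothetical, NOT real Ruby: pin a method so an implementation
    # may safely assume it will never be redefined.
    # Fixnum.freeze_method(:+)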
Providing restricted versions of "eval" (so you can provide guarantees about what will still be the same after eval), and the same for "require", would also make a huge difference. E.g. if I had a way of loading new code and being sure that it only defined new classes rather than modifying existing ones, I would be able to aggressively inline a lot of things in situations where I could determine types statically. Without it, I can still do inlining, but I need to do a lot of work to verify when changes have occurred and potentially fall back on a much slower path, which seriously increases the complexity and reduces the benefit of inlining.
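Lacking such a restricted eval, here's a sketch of the kind of check an implementation is forced into instead: snapshot a method before running eval and compare it afterwards, to decide whether optimised code relying on it has to be invalidated. The survives_eval? helper is my own illustrative name; a real compiler would hook redefinition rather than poll like this.

    # Returns true if klass#name is still the same method definition
    # after the block (e.g. an eval) has run.
    def survives_eval?(klass, name)
      before = klass.instance_method(name)
      yield
      klass.instance_method(name) == before
    end

    code = 'class String; def +(other); "hijacked"; end; end'
    p survives_eval?(String, :+) { eval(code) }
    # => false: any code that inlined String#+ is now invalid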
The strength of language designs coming from people who are experienced at writing compilers is that you rarely see those kinds of situations, because the first thing a compiler writer would tend to do is declare a lot of corner cases like the above "undefined behaviour" at a minimum, and then proceed to implement optimisations that would utterly break them.
While Ruby is my favourite language, my favourite influence when it comes to writing compilers is Niklaus Wirth, who, from his first work on successors to ALGOL 60, consistently worked to reduce the complexity of his languages with each iteration (he usually removed more than he added in each new language). One might very well believe Wirth is too much of a purist, but while his languages tended towards the minimal, his overall principle is worth considering: he only wanted to retain functionality with proven benefit that he also understood how to implement efficiently. A lot of languages could benefit from extensive pruning of functionality on that basis.
(Incidentally, for my Ruby compiler, if/when I finally get to the stage where these things matter, I do intend to deviate from Ruby and declare a whole lot of things that people don't generally ever do, but that should technically work, as "undefined behaviour unless you specify the --generate-horribly-slow-but-compatible-code switch" or something to that effect.)
Thank you for the interesting read. I do get your point, and I don't think I was ever really of another opinion. It's interesting to read about a very concrete problem in a contemporary language.