Welcome to Bloggers and the Principle of FUD. Take this post with a pinch of salt.
It's been trendy lately to pick holes in Ruby in various inaccurate ways, but here's a quote from Matz about that POLS business:
Everyone has an individual background. Someone may come from Python, someone else may come from Perl, and they may be surprised by different aspects of the language. Then they come up to me and say, "I was surprised by this feature of the language, so therefore Ruby violates the principle of least surprise." Wait. Wait. The principle of least surprise is not for you only. The principle of least surprise means principle of least my surprise. And it means the principle of least surprise after you learn Ruby very well. For example, I was a C++ programmer before I started designing Ruby. I programmed in C++ exclusively for two or three years. And after two years of C++ programming, it still surprised me.
You can pick up almost any programming language, not learn it properly, and then pick holes in its various inconsistencies or flaws (my own is over Python's ambiguities over methods vs functions for various things - but I know I'm not a Python developer!) That doesn't make it interesting or insightful. Every language has dark corners and Matz has been more than forthcoming about Ruby's.
This article has numerous curious assumptions and sloppy practices. Local variables named the same as methods? And very few Rubyists, in my experience, would be confused by the key example in the memoization section. Who really thinks || checks whether a variable is bound rather than doing a logical or? How is it a "what's worse" that instance_eval changes self to instance_eval's receiver? That's its entire reason for existing! The second example under "Local Variables Strike Yet Again" is flat out wrong, even by the author's own admission. Ruby 1.9 has been the production release for almost three years!
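For reference, a minimal sketch of the memoization idiom in question (the class and names here are made up for illustration): `||=` assigns only when the left-hand side is nil or false, and `||` is an ordinary logical or, not a bound-variable check.

```ruby
class Config
  def settings
    # ||= runs the right-hand side only the first time; an unset
    # instance variable evaluates to nil rather than raising an error.
    @settings ||= { "verbose" => false }
  end
end

c = Config.new
c.settings["verbose"] = true
c.settings["verbose"]  # still true; the hash was built only once
```

One known wrinkle: if the memoized value can legitimately be nil or false, `||=` recomputes it on every call, which is why some code checks `defined?(@settings)` instead.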
A-bloody-men. I read the post and mostly went "What?". If you're naming local variables the same as methods in the same scope then you're just asking for trouble. Mostly because when you come back and read it three months later it's harder to pick through.
I think the only related thing I've ever hit and gone WTF at is that `foo = foo` sets foo to nil. But AFAIK that's due to the implementation of MRI/YARV, not all Ruby runtimes.
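For the curious, this is easy to reproduce; as far as I can tell it comes from the parser registering the local before the right-hand side is evaluated, so the fresh local reads back as nil:

```ruby
# At parse time, `foo =` creates the local variable foo (initialized
# to nil), so the foo on the right-hand side refers to that fresh local.
foo = foo
foo.nil?  # true
```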
So yeah, as a professional rubyist, reading that post was mostly just me going "WTF, why would you ever think or do that". As Peter says, take it with a pinch of salt.
Ditto. I think it's problematic for a programmer to rely on a language to resolve ambiguities for you. If you see or suspect undesired ambiguities in your code, resolve them yourself. Best of all, make every effort to avoid introducing them in the first place.
I'm reminded of a programmer I read about who refused to learn any language's operator precedence. Instead, he always grouped expressions with parentheses to make sure they were evaluated in the order he desired. (I don't know that I'd do that myself, but he had a point...)
I mostly agree. Apart from: "not learn it properly, and then pick holes in its various inconsistencies or flaws". Flaws may be subjective, but inconsistencies aren't. If you find two similar things provided by default, with very different behaviour then it's not a case of not knowing the language. It's inconsistent and that's a problem in one way or another.
"Welcome to Bloggers and the Principle of FUD. Take this post with a pinch of salt."
Actually every time somebody uses the term "FUD" I have to take THEIR comment with a huge sack of salt.
Can't you just state your counter-claims without resorting to accusing the other guy of a conscious attempt at misguiding people?
Also, "FUD"? Really? Someone posting on some Ruby shortcomings (or anything) is spreading "fear", "uncertainty" and "doubt"? Because you just had to use a relic term of the 90's desktop wars against Microsoft, right?
I read the post, and haven't felt any particular fear.
Matter of fact, I was rather enlightened by some of the examples.
Also I take offense to the silly line of argument here:
> "You can pick up almost any programming language, not learn it properly, and then pick holes in its various inconsistencies or flaws".
Really? One picks holes "in its various inconsistencies or flaws" if he has not "learned [the language] properly"?
What sense does that make? A language either has "various inconsistencies or flaws" or it does not.
If it has --and you don't argue that Ruby hasn't--, one has EVERY F*N right to "pick holes" at them. And this has nothing to do with "not learning the language properly".
If the language has "inconsistencies or flaws" what does "learning it properly" mean? To learn to work around them?
Yep, as usual, prepare salt before you read _any_ programming languages blog post.
My case was against Ruby back when Rails was hot. Ditto with JavaScript nowadays.
Ruby 1.9 has been out for production for almost 3 years? Cool, how come it's slower than 1.8.7? (Not trying to offend Ruby, but I use Rails 3.1 with Ruby 1.9, and running unit tests is doggone slow.)
Ruby 1.9 has been out for production for almost 3 years? Cool, how come it's slower than 1.8.7?
It's not. File loading in 1.9.2 is notoriously slow and you're probably encountering a common case where mid to large scale Rails 3 apps are loading so many files at runtime that you're hitting it. Everything else is bright and breezy. The story behind much of this: http://www.rubyinside.com/ruby-1-9-3-faster-loading-times-re... - Most people are experiencing significant speedups with 1.9.3 so far but it hasn't had its final release yet.
That aside, speed isn't everything. Python 3 was (is?) slower than Python 2 for quite some time, despite being in production.
A lot of Ruby is kind of like the traffic circle. Circles can be a superior intersection of 5 roads (if traffic volume isn't high, you never have to stop), but they trip up nearly all new drivers.
I've found that Ruby is a very right-brain language in the sense that when you analyze it at a low level, things don't seem to make sense. But at a high level, it often does exactly what you'd want. Variable hoisting in 1.9 is the perfect example of this.
OP claims to have been a "Rubyist", but it sounds like he's still fighting the system and hasn't embraced it yet. It's like when people who are new to Lisp complain about the parentheses or prefix notation. If you just go with it, down the road you may find that these are some of its best features.
The first issue isn't really surprising once you understand how Ruby treats variables. I wrote about this a while ago for a more understandable case - basically, Ruby initializes variables to nil as soon as a left-hand assignment is encountered during a scope's parsing, regardless of context or reachability.
def foo
  if false
    bar = "baz"
  end
  lambda { bin = "bang" }.call
  return local_variables, bar
end

def foo_two
  if false
    bar = "baz"
  end
  lambda { bar = "bonk" }.call
  return local_variables, bar
end

puts foo.inspect
puts foo_two.inspect
[["bar"], nil]
[["bar"], "bonk"]
Obviously the code that initializes bar never gets run, but Ruby initializes a local variable "bar" anyway, since a left-hand assignment to it appears in the method. The lambda defines a new scope, so its own local variable assignments (like bin) don't leak into the containing scope, unless the variable was already defined in the outer scope at the time the lambda was created. That is the case for bar in foo_two, which is why the assignment inside that lambda writes through to the outer bar.
I'm really torn on the "no parens for func calls" thing. It's so damned handy in the majority of cases, but it breaks one of my personal rules: "call things what they are." If it's a function call it should look like one. Then this unwelcome surprise wouldn't be a surprise at all. But, alas, I'm bound by Gödel too and am not 100% internally consistent.
I saw Neal Ford's "Abstraction Distractions" recently and a lot of it resonated with me. One idea I liked was the general rule that an abstraction should solve 80% of your cases easily, and get the hell out of the way for the other 20% so you can go further down the abstraction tree and solve them yourself. I believe this one does.
You have a point that function calls should look like function calls, but keep in mind that, in Ruby, juxtaposition is how function calls look. The same goes for quite a few other languages as well (Haskell and ML off the top of my head, but there are others).
Yeah, I know it's not totally rational. Groovy does a trick where a Groovy'd Java class hides all the setters and getters and makes them look like property accessors. (C# does that too, I think?) And I like that, so I'm not necessarily arguing one way or the other; or at least I'm not arguing with the blog article. I have this argument with myself all the time! =)
I seem to have met a lot of Rubyists recently searching for better abstractions. Like the OP, many go to Lisps, especially Clojure (the OP went to Chicken Scheme, which I've never heard of before), and do away with whitespace/precedence/associativity parsing issues.
Others look at static typing with inference and/or better module systems (OCaml, Haskell, Scala, F# if you already own Visual Studio). With these, whitespace/precedence/associativity parsing issues can be amplified or mitigated, depending on whose code you're reading, but once you absorb the language's Gestalt or Sprachgefühl they should trend toward mitigation.
I'm still amazed that neither Python nor Ruby allow you to require declaration of variables. Perl really got this right with strict and the "my" keyword. If you try to use an undeclared variable under strict mode, the compiler throws an error.
It's because Ruby scopes sigil-less variables locally by default. It's like using "my" everywhere for free. "x = 10" in Ruby creates a local variable. By default, "$x = 10" in Perl is creating a global package variable which, in most day to day cases, isn't particularly desirable.
Perl's "my" really patches over a dangerous default. Perl gives you the global by default. Ruby gives you a local by default. I'd much rather have my defaults be sane than be forced to make declarations to fix things up. Which is why Ruby became my primary language after 8 years of Perl.
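To illustrate the difference with a toy sketch (the method name is made up):

```ruby
def set_local
  y = 1  # plain name: local to this method, gone when it returns
end

set_local
defined?(y)  # nil; the local never escaped the method

$g = 2       # globals require the $ sigil, so they stand out in code
```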
> It's because Ruby scopes sigil-less variables locally by default.
I'm pretty sure autarch knows that.
I come from the Python world (and have done quite a bit of Ruby as well as a few other languages) and have come to dislike all implicit scoping: it is less readable and always ends up leading to annoying (and hard-to-track) bugs. This is magnified in Python because its scoping has historically been wonky (there used to be no lexical scoping at all, only function-local and module-global), which led to `global` and, in Python 3, `nonlocal` (yay, two crappy keywords and statements instead of a single clear one).
Whatever the case, these things will trip people up and provide little to no gain. Explicit lexical scoping is clearer and leads to fewer bugs, both from shadowing an existing variable and from accidentally overwriting one. It's also pretty easy to check statically in a lexically scoped language.
Interestingly, that (along with first-class blocks, keyword messages and the cascading operator) is one of the very nice things Ruby threw away from its (limited) Smalltalk ancestry.
I suppose it's stuff like this that leads to the dozens of abandoned language implementations that litter the world of software like balled up hamburger wrappers outside a truck stop.
It's enough to drive a man back to Scheme. Why do all of these sh++ languages get so much right and so much wrong? My personal preference from the list is Python, but it's full of irritants as well.
At least there are tools for python which will catch misnamed variables:
$ pychecker a.py
Processing module a (a.py)...
Warnings...
a.py:3: Local variable (somevar) not used
a.py:4: Local variable (someothervar) shadows global defined on line 1
a.py:5: Local variable (someothervar) shadows global defined on line 1
a.py:5: No global (some_func) found
not every problem with scoping can be caught that way, but hey... can't have everything.
I think you missed the point. Ruby and Python syntax conjoins two concepts: variable declaration and variable binding. This increases the likelihood of subtle naming errors, e.g.:
my_variable = 10
# a few lines later...
my_varable = 15
Some kind of "var" keyword makes the problem disappear and makes the programmer's intentions clearer, e.g., when creating temporary variables inside the scope of a conditional or a loop.
In the case of Perl, strict mode also makes it clearer when the programmer wants a lexically ("my") or dynamically ("local") scoped variable, a feature woefully missing in both Ruby and Python.
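In Ruby the typo above passes silently, which is exactly the hazard being described (variable names made up for illustration):

```ruby
my_variable = 10
# a few lines later...
my_varable = 15   # typo: silently creates a brand-new local, no error

my_variable       # still 10; the bug surfaces later, far from its cause
```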
If x is initialized to an integer (or more accurately, any object that has a method "+" defined that knows how to accept a Fixnum parameter), it'll work fine. If it's not, you'll get "No method + for nil", since what's happening is that you're trying to invoke a method named "+" on the variable named "x" with the parameter "1".
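A quick sketch of both outcomes:

```ruby
x = 1
x += 1      # sugar for x = x + 1; Fixnum#+ handles the argument, so x is 2

y = nil
begin
  y += 1    # y = y + 1, but nil has no #+ method
rescue NoMethodError => e
  e.message # along the lines of: undefined method `+' for nil:NilClass
end
```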