It's because Ruby scopes sigil-less variables locally by default. It's like using "my" everywhere for free. "x = 10" in Ruby creates a local variable. By default, "$x = 10" in Perl creates a global package variable, which, in most day-to-day cases, isn't particularly desirable.
Perl's "my" really patches over a dangerous default. Perl gives you the global by default. Ruby gives you a local by default. I'd much rather have my defaults be sane than be forced to make declarations to fix things up. Which is why Ruby became my primary language after 8 years of Perl..
> It's because Ruby scopes sigil-less variables locally by default.
I'm pretty sure autarch knows that.
I come from the Python world (and have done quite a bit of Ruby as well as a few other languages) and have come to dislike all implicit scoping: it is less readable and always ends up leading to annoying (and hard-to-track) bugs. This is magnified in Python, where scoping has historically been wonky (there used to be no lexical scoping at all, only function-local and module-global), leading to `global` and (in Python 3) `nonlocal` (yay, two crappy keywords and statements instead of a single clear one).
Whatever the case, these things will trip people up and really provide little to no gain. Explicit lexical scoping is clearer and leads to fewer bugs, both from shadowing an existing variable and from hooking into (and overwriting) it unintentionally. It's also pretty easy to check statically in a lexically scoped language.
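To make the complaint concrete, here's a small Python sketch (names invented for illustration) of the `nonlocal` dance, and of how leaving it out silently binds a different variable instead of rebinding the one you meant:

def make_counter():
    count = 0

    def bump_wrong():
        count = count + 1   # assignment makes count local here; calling this raises UnboundLocalError

    def bump():
        nonlocal count      # explicitly hook into the enclosing binding (Python 3 only)
        count += 1
        return count

    return bump

counter = make_counter()
print(counter())  # 1
print(counter())  # 2

With explicit declarations, the difference between "new variable" and "rebind the outer one" is spelled out at the assignment site rather than inferred.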
Interestingly, that kind of explicit scoping (along with first-class blocks, keyword messages and the cascading operator) is one of the very nice things Ruby threw away from its (limited) Smalltalk ancestry.
I suppose it's stuff like this that leads to the dozens of abandoned language implementations that litter the world of software like balled up hamburger wrappers outside a truck stop.
It's enough to drive a man back to Scheme. Why do all of these sh++ languages get so much right and so much wrong? My personal preference from the list is Python, but it's full of irritants as well.
At least there are tools for Python that will catch misnamed variables:
$ pychecker a.py
Processing module a (a.py)...
Warnings...
a.py:3: Local variable (somevar) not used
a.py:4: Local variable (someothervar) shadows global defined on line 1
a.py:5: Local variable (someothervar) shadows global defined on line 1
a.py:5: No global (some_func) found
Not every problem with scoping can be caught that way, but hey... can't have everything.
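The a.py being checked isn't shown above; a file along these lines (a purely hypothetical reconstruction, with names lifted from the warnings) could draw similar complaints:

someothervar = 1                            # a module-level global, defined on line 1
def work():
    somevar = 2                             # line 3: assigned but never read
    someothervar = 3                        # line 4: shadows the global from line 1
    someothervar = some_func(someothervar)  # line 5: shadows again; some_func is never defined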
I think you missed the point. Ruby and Python syntax conjoins two concepts: variable declaration and variable binding. This increases the likelihood of subtle naming errors, e.g.:
my_variable = 10
# a few lines later...
my_varable = 15
Some kind of "var" keyword makes the problem disappear and makes the programmer's intentions clearer, e.g., when creating temporary variables inside the scope of a conditional or a loop.
In the case of Perl, strict mode also makes it clearer when the programmer wants a lexically ("my") or dynamically ("local") scoped variable, a feature woefully missing in both Ruby and Python.
Perl's "my" really patches over a dangerous default. Perl gives you the global by default. Ruby gives you a local by default. I'd much rather have my defaults be sane than be forced to make declarations to fix things up. Which is why Ruby became my primary language after 8 years of Perl..