Mathematica 10.0 for Mac OS X x86 (64-bit)
Copyright 1988-2014 Wolfram Research, Inc.
In[1]:= FullForm[x->y*z^w]
Out[1]//FullForm= Rule[x, Times[y, Power[z, w]]]
In[2]:= x\[UnionPlus]y\[RightTee]z
Out[2]= x ⊎ y ⊢ z
There's no reason your programming language shouldn't be able to tell you what these things are called in English without an external reference.
The name can be arbitrary, as long as it's unambiguous. By the way, here's what Mathematica says when I copy-paste that and enter x∘y: SmallCircle[x, y]
Also fun:
In[3]:= SpokenString[x\[SmallCircle]y]
Out[3]= "SmallCircle of x and y"
This sums up what I dislike most about Scala (which is an even worse offender in this regard). A rule of thumb: if a 10-year-old wouldn't be able to intuitively read the operator, use words.
People say that. But it's totally misguided IME. You're talking about scalaz.
Normal Scala code has fewer symbols than Ruby aside from function lifting (the underscore).
In fact, you've got generics (square brackets), curly braces for blocks, fat arrows for call-by-name, function literals, and pattern matching. Parens for methods and tuples. Underscore for function lifting. Cons (double or triple colon). Stabby arrows as an optional shorthand infix notation for creating pairs, AKA Tuple2.
Every once in a blue moon you might see :+ to append a single item.
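To make that concrete, here's a minimal sketch of that everyday symbol set in plain Scala (all names invented for illustration):

```
// The everyday Scala symbol set described above (names invented for illustration):
case class User(name: String, age: Int)          // parens for parameters and tuples

def logged(msg: => String): Unit = println(msg)  // fat arrow for call-by-name

def describe(users: List[User]): List[String] =  // square brackets for generics
  users match {                                  // curly braces for blocks
    case Nil          => List("nobody here")     // fat arrow in pattern matching
    case head :: tail =>                         // cons to destructure a list
      s"${head.name} and ${tail.size} more" :: Nil
  }

val pair     = "answer" -> 42                                      // stabby arrow: a Tuple2
val names    = describe(List(User("Ada", 36))).map(_.toUpperCase)  // underscore for lifting
val extended = names :+ "one more item"                            // the occasional :+
```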
You can write a whole heckuva lot of Scala and never use much more than that. If you're facing symbolic overload then you should probably reevaluate the libraries you're using.
The only ones I use frequently that add anything else are spray-routing/akka-http's Route.~ to concatenate routes and json4s's JObject.~ to concatenate two objects together.
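For context, here's roughly what those two tildes look like in use; a sketch assuming the usual json4s DSL and akka-http directive imports:

```
// json4s: ~ builds a JObject field by field (assumes the JsonDSL import)
import org.json4s.JsonDSL._
import org.json4s.jackson.JsonMethods.{compact, render}

val json = ("name" -> "joe") ~ ("age" -> 35)
compact(render(json))  // {"name":"joe","age":35}

// akka-http: ~ tries the next route if the first one rejects
import akka.http.scaladsl.server.Directives._

val route =
  path("ping")   { complete("pong") } ~
  path("health") { complete("ok") }
```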
Yes there's a world of libraries that add a lot more, but they're entirely optional. You aren't a bad programmer if you do or do not use scalaz. (At least I don't consider myself a bad one, and I don't.)
Scala itself doesn't actually extend all that far beyond C#. The FUD around symbols in Scala is just that. Add cons, the occasional stabby arrow, and the underscore, and you've got the extent of Scala's additions you'll find in your average library.
I just do not get how people walk away with this impression. Especially Rubyists, where ivars, cvars, pipe characters, two different Hash syntaxes, etc. are in everyday code.
IME Scala developers in-training don't actually have much trouble with this stuff. What they have trouble with is pattern-matching structure and function composition.
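For illustration, a minimal sketch of those two things (all names invented):

```
sealed trait Shape
case class Circle(radius: Double) extends Shape
case class Rect(w: Double, h: Double) extends Shape

// Pattern matching on structure: destructure each case
def area(s: Shape): Double = s match {
  case Circle(r)  => math.Pi * r * r
  case Rect(w, h) => w * h
}

// Function composition: chain small functions into a pipeline
val trim:  String => String = _.trim
val toInt: String => Int    = _.toInt
val parse: String => Int    = trim andThen toInt

parse("  42 ")  // 42
```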
But Ruby can't do either.
In fact, other than the ability to invent more complex DSLs with more symbols I'm not actually sure that there's anything Ruby does better. Or more consistently.
The problem is that having Scalaz and its ilk underpinning a lot of libraries means that, if you need to go down the debugging stack, you find yourself in squiggle soup pretty quickly.
Then I'd go back to the issue of: You might be using the wrong libraries if you find that frustrating.
I don't think any of the libraries I'm using in my current project utilize any Typelevel stuff to a noticeable extent. I have seen it in my classpath, but I haven't bothered to see where it's coming from, and I've never run into any of it despite using Cmd+B very frequently in IDEA.
You can go a pretty long ways with Scalatest, Akka, json4s, and maybe some commons stuff for REST service development. Play development isn't all that much different though.
If you're deep in science, maybe your library needs are different. But then again, I would assume fewer complaints about mathy concepts in that case. Maybe that's off-base.
I like the litmus test that a ten-year-old should be able to read them, which means that +, *, / and - are fine but =~> and other nonsense should be banned.
Most people who write software are not children. I certainly hold my doctor, accountant, mechanic, etc to a higher standard. Why should we expect software developers to be no better than children?
& is maybe allowed. Sixth graders don't know about bitwise operations, but they know & means "and". That's stretching things a bit, though.
Is [] allowed for indexing? When I learned about arrays in sixth grade... No, wait, I didn't learn about arrays in sixth grade. For that matter, I didn't learn about functions using (): those were only for grouping expressions. Functions were just written like "sin 90". Also, since I didn't learn about code in sixth grade, I obviously didn't learn about using {} to delimit code blocks, and the comma was only used in prose, not in mathematics.
Having said that, it's unclear why programming languages should be limited to operators that we can guarantee "sixth grade" students (whatever that happens to mean in your part of the world) will have heard of.
What actual benefit could it bring, other than (maybe) making it easier for somebody who hasn't learned a language to guess the meaning of a piece of code in that language? Is that benefit worth making code twice as long and rendering half the keyboard useless?
#include <stdio.h>

int main(void) {
    printf("hello, world\n");
}
Should, I assume, be:
the main procedure is:
the message is "hello, world" plus a new line;
display the message;
stop.
Anything else will be too confusing for the many sixth graders currently working professionally as programmers.
The benefit is limiting the cognitive overhead of a language. The translation you describe is a task that must be done every single time by every single person who works with the code. Especially when discussing the code with another person, you end up reading things out loud. Every single time I encounter a bit of ascii-art code, I have to rack my brain for "in this context, what is the word that this symbol translates to". It gets especially bad if you work in multiple languages.
Also, I despise the logic of "because it exists on a keyboard, it should be used". C is a terrible offender in this regard, with * and & used for no good reason.
Ten years old, or sixth grade, or whatever: it's an arbitrary threshold, so don't get caught up in the distinction. "90% of high school graduates" would set a similar bar, and probably be less controversial.
I wonder at what point people started to expect that search engines can instantly provide all the answers to everything worth knowing. The correct place to find a word you don't know is a dictionary. The correct place to find a symbol in code that you don't know is the definition in the source code (or language reference if it's baked in), or simply ask someone who knows.
Asking someone you know is not an approach that scales in the large, and suggesting that all answers can be divined from source both assumes a level of mastery that cannot be expected (e.g. the difficulty of reading the source of C++'s Boost library vs. reaching for the extensive documentation) and denies the existence of questions such as "when is it appropriate to use this operator?" or "what is the worst-case runtime complexity of this operation?"
This seems like it could have been prime territory for a bit of surrealist/meta fiction, with gradually stranger descriptions of operators turning into a short story of some kind.
You are perfectly fine importing the module Control.Lens.Combinators, which has the full functionality of lens except the operators. There's nothing you can do with operators that you can't do with normal functions named with words.
Oh, definitely. Haskell's approach to operators and overloading is exactly the Right Way: operators are just functions with weird syntax, and you can add your own. No need to make bit-shifting do IO.
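The same idea holds in Scala, incidentally; a minimal sketch (type invented for illustration) of an operator being nothing more than a method with a symbolic name:

```
// An operator in Scala is just a method with a symbolic name.
final case class Vec(x: Double, y: Double) {
  def +(other: Vec): Vec = Vec(x + other.x, y + other.y)
  def plus(other: Vec): Vec = this + other  // word-named alias, same functionality
}

Vec(1, 2) + Vec(3, 4)       // Vec(4.0, 6.0)
Vec(1, 2).plus(Vec(3, 4))   // identical
```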
Specifically, sex makes many people uncomfortable. Sure, it's a natural function, but it's not a topic many people are comfortable talking about openly. Not me, but you can understand how this would be a sensitive enough topic for enough people that they'd leave it out, no?
! -> "not truthy". So !! means "not 'not truthy'" which is truthy.
Truthy is something that isn't exactly a true boolean but will equate to true(thy) when evaluated with this type of comparison. Things like undefined, null, and empty string will not be truthy.
edit: typically, if you want to test for truthiness, you'll just evaluate it like so:
if (my_object) {
  // truthy
} else {
  // not truthy
}
Does anyone know a time when you'd use !! instead of just the object in the evaluation? I'm not a Ruby dev, so I don't know specific cases for needing !!
One typical use for !! in JavaScript or Ruby is where a function/method returns a boolean value that indicates the presence or absence of an object.
You could just use:
return myObject;
Since that value will be truthy or falsy, any code that tests it will generally work, as your if statement example demonstrates.
But there are two advantages to doing this instead:
return !!myObject;
1) Now you are returning an actual boolean value, not just a truthy/falsy value.
2) Sometimes more importantly, you are now releasing this function's reference to the object, instead of possibly keeping that reference around much longer than needed and preventing it from being garbage collected.
Generally when you prepare a value for serialization. In my ~8 years of Ruby I've only used !! a few times. I think all those times had to do with configuration and/or serialization. For example:
```
my_option = !!ENV['MY_OPTION'] # is false when not set
puts "my_option is: #{my_option}"
```
I like the name "stabby lambda", but I dislike its use. It seems less readable to me. Then again, I also think the Ruby 1.9 Hash syntax is less readable, so maybe my judgment is suspect.
http://search.cpan.org/dist/perlsecret/lib/perlsecret.pod#SY...