
Javascript can do it all:

  function fac(x){ return x==0?1:x*fac(x-1); }
  a = fac(8);

or a one-liner:

  a = (function fac(x){ return x==0?1:x*fac(x-1); })(8);
Both ways spit out 40320


The Y combinator allows the definition of anonymous recursive functions. These have names ("fac").

Personally, I have never found much of a reason to use the Y combinator. I typically use the method you used in the second example; I declare a named function within the scope of the anonymous one. I find it to be a readability win.


You can still do a Y combinator in JavaScript. In http://www.mail-archive.com/boston-pm@mail.pm.org/msg02716.h... I not only show how to do it, but I show one way that someone reasonably could come up with the construct.

But I agree with you that Y combinators are not generally a good approach for working programmers because there are clearer ways of accomplishing the same thing. However I've heard that in some functional languages they can be very useful as a pattern to compile language constructs to.
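For reference, a compact JavaScript sketch of the construct being discussed (the names here are my own, not from tilly's post): a Y-style combinator that builds a recursive function out of a "maker" that never refers to its own name.

```javascript
// Applicative-order Y combinator: the maker receives the finished
// recursive function as an argument instead of naming itself.
function Y(maker) {
  return (function (x) {
    return maker(function (n) { return x(x)(n); });
  })(function (x) {
    return maker(function (n) { return x(x)(n); });
  });
}

// The factorial logic itself is anonymous: it recurses via `self`.
var fac = Y(function (self) {
  return function (n) { return n === 0 ? 1 : n * self(n - 1); };
});

fac(8); // 40320
```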


I also did something like this. It's fun to port lisp constructs to Javascript.

http://foohack.com/tests/ycomb.html


Does this count as an anonymous recursive function?

a = (function(x) {return x == 0 ? 1 : x * arguments.callee(x - 1)})(8);


Full anonymity, thanks to tilly (separated for better understanding):

  builder   = function(x){ return function(n){ return x(x)(n); }}
  factorial = function(f){ return function(n){ return n==0?1:n*f(f)(n-1);} }
  alert(builder(factorial)(8))
One-liner, just seed factorial into builder:

  r = (function(x){ return function(n){ return x(x)(n); } } (function(f){ return function(n){ return n==0?1:n*f(f)(n-1);} }))(8);


40320. (They do.)


"Its first netbook, the Nokia Booklet 3G, will use Microsoft’s Windows software and Intel’s Atom processor"

There, doomed.

If you don't extend your arms to the open source community, don't expect us to open our arms when you need us.

Fuck you Nokia.


Huh. Nokia is pretty much the #1 sugar daddy of open source in the mobile tech business. In the past year, they have:

* Bought Symbian and open-sourced the entire OS.

* Bought Trolltech, relicensed the Qt toolkit to LGPL, and developed new Python bindings from scratch.

* Poured investment into Maemo, including 3G phone support (check out the upcoming Nokia N900).

They're just testing the waters with this new Windows 7 notebook. (Did you notice that the screenshots are mostly showing Ovi Suite 2.0 which is Qt-based and will be available on Linux as well?)


[dead]


Wow, what a nice guy you are. I bet you're great for dinner parties.


The analyst quoted in the article thinks that it may just be an experiment for now. Maybe using Windows is the quickest way for them to get to market and test their product. Just because they chose Windows for this first launch doesn't mean they won't use open source for netbooks in the future.


I just can't wait for my browser to become my OS - it would be a game-changing scenario - it won't matter what is pre-installed, you'll be able to switch your OS in a second ... it won't even matter what h/w you are using (PC on Intel, Mac on Intel, Linux on whatever) ....

i wonder how much a h/w platform or pre-installed s/w would matter in one's life when a browser is your new OS and almost all your apps are on the cloud !!!


Who wrote the browser? The window manager? The compiler, assembler, and linker? What about computer games? I don't see us doing Crysis in Javascript any time soon.

The web is great for some things, but some things aren't everything.


I bet most computer users never play Crysis.


I bet most computer users have never used a bash shell. I still prefer the world with it to a hypothetical one without it. Same goes for high end computer games.


whatever i said is from common endusers perspective and not from YC news reader's(who are mostly techies) point of view - just like the way we switch our browser right now ... we will be able to do the same with the OS!! or even better, there will be multiple browser OS preinstalled, you pick what you want....

as far as h/w platform is concerned, it would be too early to make a call. we may think right now that intel will lead the market/platform as they now support both (PC & MAC) but i wont be surprised if there are new players... especially from taiwan and china!!!

And even if intel dominates the microprocessor market, I really wonder how a common man would differentiate a MAC laptop from Lenovo's thinkpad when all they have installed is the same Browser-OS (chrome, safari etc.) on the same platform (intel), with the same specs (say 4GB RAM, 5 USB ports, 2 HDMI, same display card... etc.) and almost everything else is on the cloud... I guess then it would be all about speed, power and exterior look/design of the laptop??!!!


Please, exactly, tell me what a browser-OS is. Are we talking about something like Chrome OS, which looks like it will be an OS whose primary (edit: or even sole?) purpose is to run a browser? Then it's basically just a crippled version of what we already have. Or would we actually be talking browser-as-an-OS-properly, where some hypothetical browser decides to take up the mantle of process management, drivers, etc?

I don't doubt that lots of interesting things will pop up on the web. But "browser-as-that-thing-I-spend-a-lot-of-time-in" is very, very different from any sane definition of "browser-as-an-OS".


Google is planning an OS, but in the form of a browser...

visit: http://googleblog.blogspot.com/2009/07/introducing-google-ch...


Arrington? Is it you?


Crockford should stop claiming he invented JSON.

Brendan Eich did. JSON is JavaScript, part of it, its own data structure: it is JavaScript Object Notation.

What Crockford did was to spread its adoption among other languages, building libraries and encouraging people to use them.

To Caesar what is Caesar’s...


If you watch his talk he doesn't claim he invented JSON; he goes out of his way to say he discovered it -- because it was already there. And he even says that he wasn't the first to discover it either -- as many other people figured it out around the same time.


"An influence was Rebol, a shame it’s not more popular"

Then the title is misleading. Rebol as an influence to what?

We should ask that question to Brendan to see what influenced HIM to design javascript as he did.


Javascript is extremely heavily influenced by Self: http://en.wikipedia.org/wiki/Self_(programming_language)


The first thing he says is that he didn't invent JSON. He also mentions who used it as such a year earlier. Around 8:30 he also explains why he created the website: to create a "standard" out of it, etc.

It would be better if you watched the video before commenting.


JSON as an interchange format was a unique invention. It may not be a very substantial invention, but it was still a new idea.


(defun foo (n) (lambda (i) (incf n i)))

def foo(n): lambda i: n += i

Give me two powerful languages with the same features and I'll pick the latter for its syntax.

So no, lisp may be the best right now, but it won't be the best forever, its syntax is its best friend and worst enemy.

Something lispers will never understand. Love is blindness.


In this case it's more that ignorance is blindness. That syntax is a feature. The fact that Lisp code is expressed in Lisp data structures makes it easy to write programs that write programs.
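The point can be gestured at even in JavaScript by representing code as nested arrays, s-expression style (a toy sketch of my own, not anything from the thread): because the program is an ordinary data structure, another program can rewrite it before it runs.

```javascript
// A tiny evaluator over nested arrays: ["*", 2, ["+", 1, 3]] is a
// program that is also plain data we can inspect and transform.
function evalExpr(expr) {
  if (typeof expr === "number") return expr;
  var op = expr[0], a = evalExpr(expr[1]), b = evalExpr(expr[2]);
  if (op === "+") return a + b;
  if (op === "*") return a * b;
  throw new Error("unknown op: " + op);
}

// A "macro": a function that rewrites programs before evaluation,
// here expanding ["double", x] into ["*", 2, x].
function expand(expr) {
  if (!Array.isArray(expr)) return expr;
  if (expr[0] === "double") return ["*", 2, expand(expr[1])];
  return expr.map(expand);
}

evalExpr(expand(["double", ["+", 1, 3]])); // 8
```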


Even not as a real lisp guy that's one of the things that I appreciate about lisp and the only time that I've done anything particularly powerful with it: the fact that it's basically a parse tree makes it really interesting for genetic programming.


It seems that the fact that Lisp is a parse tree is what makes it a limiting case. That is, if there's only one canonical structure for representing programs, and Lisp makes it explicit, then any other language that makes it explicit is going to be isomorphic to Lisp.

I'd like to know, though: is that true? Is an abstract syntax tree the only good way to represent programs as data? Has any other expressive way to do this been invented, or do programs always boil down to an AST after the syntax has been stripped away?

I say "expressive" because I can think of at least one other code-as-data representation: binary. But that isn't suitable for programming. If there is another representation that is suitable, but isn't isomorphic to a parse tree, then presumably one could derive from it an analog to Lisp that doesn't simply reduce to Lisp. Until then, though, I think the point that PG made semi-jokingly in the essay, that any language as powerful as Lisp will be a variant of Lisp, has to hold true.

Some day I'd like to research this question (or better, someone on HN could just tell me the answer). If other meaningful representations are possible, surely some must have been invented by now. And if they aren't, then the AST is a considerably more remarkable discovery than people realize.


I don't think that a parse tree is intrinsic -- it's just something that maps well to the languages that we actually use: of course you could translate everything to a Turing machine's tape, but that wouldn't be particularly useful.

What would really interest me would be exploring whether human logic fundamentally reduces to a parse tree because of something to do with the language faculty of the mind, and whether the programming languages we create tend toward it simply because they are built out of the same logical faculty.


Yeah, that's the point. Other representations are possible (binary), but only one has proven suitable for programming. This is presumably because of the way the human mind works. It would be remarkable if it were the only good representation for this. It would also be the reason why Lisp is the limiting case among programming languages, at least when it comes to metaprogramming. But I repeat myself :)


> Has any other expressive way to do this been invented, or do programs always boil down to an AST after the syntax has been stripped away?

That is actually an excellent question.


I was thinking Rebol might be a good place to look.

http://news.ycombinator.com/item?id=780853


Dear Paul, when I say its syntax is its best friend, that's exactly what I'm talking about.

Now, say I fork lisp and change one little thing:

(defun foo (n) (lambda (i) (incf n i)))

to this:

[defun foo [n] [lambda [i] [incf n i]]]

or even this:

{defun foo {n} {lambda {i} {incf n i}}}

or how about this?

<defun foo <n> <lambda <i> <incf n i>>>

Hmm, trees are powerful, no matter how they are expressed huh?

What if we can represent the same trees using colons and commas as delimiters?

:defun foo:n, :lambda:i, :incf n i.

Maybe spice it up a bit using periods as recursive closing delimiter.

See? still powerful trees!

Between common lisp and colon lisp, I'd still pick the latter.

Love is blindness.


Sure, trees are powerful no matter how you delimit them. Most Lisp hackers would be fine with using {}, [], or <> if they had to. What Lisp hackers like is programming in lists, not parens per se.

I don't see what you're trying to claim. Initially you seemed to be saying that you preferred conventional syntax to sexprs. I pointed out that this meant you had to give up macros. You reply that you could use other characters to delimit sexprs. Sure, but so what?


Taking a closer look at my macbook keyboard, of all the lisp forks, I'd go with square brackets [lisp] for convenience.

Ergonomics win.


If that's the only thing stopping you, just tweak your editor's keymappings.


You're assuming your conclusion: that the languages are equivalent in power and differ only in syntax. If that were true, then sure, one would simply make an arbitrary choice based on taste. But it isn't true. It's profoundly untrue.

The syntax in that first example you quote isn't arbitrary. It's essentially the only notation that exists for representing the structure of a program explicitly in language. Is that valuable? Uh, yeah. It's incredibly valuable. It's worth a lot more than syntax in many cases.

By contrast, the syntax in the second example is arbitrary: the delimiters and operators could just as easily be organized differently. Perhaps this particular organization makes certain programs clearer, but it loses the ability to represent programs in general.

If that doesn't make sense, think of the fairy tale of the goose that lays golden eggs. Everybody's obsessed with their particular golden egg. Lisp - precisely because of the difference visible in those two examples - is the goose.

There's nothing in this that PG didn't already express in his comment. But what you wrote deserves more than one protest :)


That Python code shouldn't work. It needs a 'return', assignments aren't expressions, and assigning to a nonlocal variable in the absence of a declaration just creates a local. (Might be nice if Python did look more like that, though!)
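For contrast, the closure in the original Lisp snippet translates directly to JavaScript, where assignment to a captured variable just works and assignments are expressions (a sketch of mine; `foo` follows the Lisp original):

```javascript
// Port of (defun foo (n) (lambda (i) (incf n i))):
// returns a closure that adds i to the captured n and returns the sum.
function foo(n) {
  return function (i) {
    return n += i;
  };
}

var acc = foo(10);
acc(5);  // 15
acc(5);  // 20 -- the captured n persists between calls
```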


"I am starting to think Dijkstra is about the worst thing ever happening to software development."

"0 based for spatial coordinates, 1 based for lists and character positions."

Sums up my thoughts pretty well.

The real reason we do zero-based counting in computers is because of bits, speed and laziness.


And, of course, it provides an easy way to separate REAL PROGRAMMERS (TM) from everybody else.


In the same snarky tone, I'd say it helps to easily identify masochists in a herd of brilliant programmers.


Better get used to it, soon it will be used on the server side too.


I don't care if computers think in 0s and 1s or if arrays should start at 0 because it is the way computers think.

Computers were made to serve us and as such they should translate all their inner thoughts to more human consumable data.

Five is 5, not 101.

So no, numbering should start at ONE, even if most programmers have already been hardwired to start counting from zero.


We have this convention for the benefit of humans. Indexing starting at 0 has the convenient property that when we modulus an arbitrary number by the size of a range, the result is a valid index into that range. This comes up when implementing things that map a larger set onto a smaller set, such as buffers or hash tables.

We could still achieve the same effect if we started indexing at 1, but it would require more code, and it would be less clear.
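A concrete sketch of that modulus property (JavaScript, with invented names): with 0-based indexing, `k % size` is always a valid index, whereas 1-based indexing would need `((k - 1) % size) + 1`.

```javascript
// Fixed-size ring buffer: the k-th write lands at k % size,
// with no +1/-1 adjustment anywhere.
function makeRing(size) {
  var data = new Array(size), count = 0;
  return {
    push: function (x) { data[count++ % size] = x; },
    toArray: function () { return data.slice(); }
  };
}

var ring = makeRing(3);
[1, 2, 3, 4].forEach(function (x) { ring.push(x); });

ring.toArray(); // [4, 2, 3] -- the 4th write wrapped around to index 0
```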


"We could still achieve the same effect if we started indexing at 1, but it would require more code, and it would be less clear."

Really?

How about dealing with strings?

"123456789" how many chars we have? 9

char[1] = 1

char[2] = 2

:

char[9] = 9

Now, let's try the C approach:

"123456789"

char[0] = 1

char[1] = 2

:

char[8] = 9

where is char "1"? at zero index!

how many elements we have?

last index plus one!

see? there is already confusion in place, or as you say, more code and less clear...


"0123456789"

  ch[0] = 0;
  ch[1] = 1;
  ch[2] = 2;
  :
  ch[9] = 9;

  ch[1] = 0;
  ch[2] = 1;
  ch[3] = 2;
....


Please re-read his question: "how many chars we have?" "how many elements we have?"

  ch[1] = 0;
  ch[2] = 1;
  ch[3] = 2;
  ...
  ch[10] = 9;
This is about having the Nth element of the array with the index N rather than N-1.


That's not code, that's English.

When we start indexing from 0, the only time we need to do a +/- 1 is when we need to index the last element. All other times (iterating, mapping) we don't need to.

Also, starting indices at 1 would change the idiomatic iteration to:

   for (int i = 1; i <= SIZE; ++i)
I find that the <= and >= operators add more to my cognitive load than the strict relation operators < and > because there's an or in there. I don't find the alternative to that idiom any better:

  for (int i = 1; i < SIZE + 1; ++i)
Your original argument was that we should make it easy for humans to understand, not computers. I think that starting indices from 0 is easier for humans to understand because it simplifies the code we must write and read.
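The half-open convention being argued for can be put in code (JavaScript; `xs` is a made-up array): the loop bound is just the length, and splitting a range at any point needs no +1/-1 anywhere.

```javascript
var xs = [10, 20, 30, 40, 50];

// 0-based, half-open: i < xs.length visits exactly xs.length elements.
var total = 0;
for (var i = 0; i < xs.length; i++) total += xs[i];

// Splitting [0, n) at m gives [0, m) and [m, n); each piece's size
// is simply (upper - lower), with no off-by-one adjustment.
var m = 2;
var left = xs.slice(0, m);   // [10, 20]
var right = xs.slice(m);     // [30, 40, 50]
```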


No, your language limitations don't have to force the rest of the world to accept them.

As I already said, some languages use the simpler construct:

For i=1 to N

And C could easily use:

for(i=1;i==n;i++)

just by changing the way the loop condition works.

So as you say, it is all about the language.


No way I would trust a language design with your semantics for this syntactic construct: for(i=1;i==n;i++)


But he was making the point that there are valid human reasons for zero-based indexing - e.g. keeping the lower bound within the natural numbers; having the upper bound be the number of elements in the preceding sequence, etc. The "way computers think" doesn't enter the discussion.


"keeping the lower bound within the natural numbers; having the upper bound be the number of elements in the preceding sequence, etc."

Read your post and understand the same can be said of using one-based indexing.


That's false. You'd be forced to sacrifice the exclusive upper bound, which would then force you to sacrifice the difference between the upper and lower bounds being the number of elts in the collection. Unless the lower bound was exclusive, in which case it could be an unnatural number.

No, the only way there's no lump in the carpet is his way.


Look at your hand and start counting your fingers while naming them:

[1] thumb

[2] index

[3] middle

[4] ring

[5] pinkie

so we have a simple range [1..5]

which can be represented in so many ways in computer programs:

for i=1 to 5 print finger[i]

for(i in [1..5]) print finger[i]

my first finger[1] is my thumb

my last finger[5] is my pinkie

how many fingers we have? as many as the last index in the list: 5

-

now, C programmers like to count this way:

for(i=0;i<5;i++) print finger[i]

where finger[0] is thumb

and finger[4] is pinkie

how many fingers we have?

as many as the last index in the list plus one: 4+1

how human-like!

And that's why you get so messy when trying to get the string position of a substring:

if(pos>-1) exists, since pos=0 means the substring is in the starting position

again, how human-like!

But how dare I argue with C programmers without being burned at the stake?


What makes you think people who 0-index arrays and prefer half-open intervals count any differently? Is this argument directed at a four year old?

4+1=5 never enters into it. What a rubbish argument. I might as well complain that your [a,b] has (b-a+1) integers in it. (5-0)=5 so the half open interval has 5 things in it.

What's the measure of a real interval 1<=x<=5? 4. 0<=x<5? 5.


Don't try your cheap tricks on me.

We go from 1 to 5, so the math would be 5-1 +1

You go from 0 to 4, the math would be the same, 4-0 +1

Using 5 as your upper bound, then starting from 0 to 4 is the same as me using 6 as my upper bound then going from 1 to 5.


for i=1 to 5 print finger[i]

for(i in [1..5]) print finger[i]

You're using an inclusive upper range here. Not that that's necessarily wrong, but it isn't standard for the reason Dijkstra mentioned (how many fingers do I have? 5 - 1 = 4. Oops.)


"how many fingers do I have? 5 - 1 = 4. Oops."

Upper-Lower+1, easy. Whenever the lower is 1, just the upper is enough.

In your case you go from 0 to 4, how's that different from how many fingers you have? 4-0=4 Oops.


Congratulations; you have successfully introduced a whole new way to have off-by-one errors.

People are used, due to forty years or more of tradition in the great majority of major languages, to zero-based indexing. Tossing that away Because It's Inelegant won't do much good to anyone, especially when you take into consideration the fact that an awful lot of math is more convenient with 0-based indexing.


Oh, it will be hacked for sure.


When I was younger I used to wake up at 4pm and code till early 6am, pure zen.

Now that I am older I try to go to bed the same day I wake up.

The older I get the earlier I want to wake up, the less I want to sleep.


Having a chat frame in your web page is really easy if you use WebSockets and can run a simple chat engine on your server (some hosting services don't allow sockets at all).

Since WebSocket is not available everywhere yet, use a Flash interface (pure ActionScript, no visuals) exposing the same surface: onopen, onmessage, onclose and send(), to communicate with the server. Then when WebSocket is available, all you have to do is replace the Flash object with the real thing.

  mychat = new WebSocket(chatserverurl);
  mychat.send("Hello");
  mychat.send("Can you hear me?");
  mychat.send("Am I getting through to you");
Don't get confused: the chat client is pure HTML, not Flash, and can be easily spiced up with some jQuery effects.
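The swap-the-transport idea above can be sketched as a thin wrapper written only against the WebSocket-shaped surface, so the thing underneath (a real WebSocket, or a Flash shim) is interchangeable. Everything here is an invented illustration, not a real library:

```javascript
// A chat client coded against the WebSocket-style interface only.
// `makeTransport` decides what actually carries the bytes.
function ChatClient(makeTransport, url) {
  var sock = makeTransport(url);
  var log = [];
  sock.onmessage = function (msg) { log.push(msg); };
  return {
    say: function (text) { sock.send(text); },
    history: function () { return log.slice(); }
  };
}

// A loopback transport for testing: echoes every send straight back.
function echoTransport(url) {
  var t = { onmessage: null };
  t.send = function (text) { if (t.onmessage) t.onmessage(text); };
  return t;
}

var chat = ChatClient(echoTransport, "ws://example.invalid/chat");
chat.say("Hello");
chat.say("Can you hear me?");
chat.history(); // ["Hello", "Can you hear me?"]
```

In a browser you would pass a factory that returns `new WebSocket(url)` instead of `echoTransport`; the client code itself never changes.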


We've used Strophe to implement a pure-javascript/html chat like this (with an XMPP server on the other end, and jetty handling proxying of non-blocking asynchronous comet long-polling requests to the XMPP server).

XMPP supports this natively via its HTTP binding specification, and there are free open source servers such as openfire that will work with Strophe out of the box.

http://code.stanziq.com/strophe/ http://www.igniterealtime.org/projects/openfire/index.jsp

