Rust is a terrible language. It has an identity crisis: it thinks it's a high-level language but also a systems-level language. You can't be both. The Unix philosophy always wins: do one thing well. C is the perfect systems language, and Java/C#/Lisp/Python etc. are the cream of the high-level languages.
Rust has always clearly labelled itself as first and foremost a systems programming language. It strictly maintains performance and safety as its top priorities. It doesn't go out of its way to be easy to learn or to use, compared with higher level languages. Doesn't seem to me like there's any identity crisis.
Why can't you be both? The "high level" features Rust offers to make it feel like a modern language are just abstractions, nothing that makes it directly inferior to C. Allowing for modern constructs without sacrificing performance is a good thing.
This is like Nokia sticking to keyboards on their phones. Worst decision ever made. Their interface will be out of date five minutes after it's made, with no way of updating.
GitHub uses libgit2 https://github.com/libgit2/libgit2 (in C), but I'm pretty sure they have a lot of wrappers around it (maybe a Ruby gem with native extensions, maybe using FFI).
Edit: the Ruby gem is https://github.com/libgit2/rugged, which has a native extension written in C to bind libgit2 functions to Ruby methods.
They almost surely did not implement Git in Ruby, but merely wrote bindings to it. I remember reading through their Ruby source code for this (Edit: found it, it's called Rugged). But even then, you need a more efficient API layer for the binding to be practical for programmers, hence libgit2.
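For anyone curious what that lower layer looks like, here's a minimal, hedged sketch of calling libgit2 directly from C: open a repository and print the ref that HEAD points to. This is illustrative only, not GitHub's or Rugged's actual code; Rugged essentially wraps calls like these as Ruby methods.

    /* Minimal libgit2 sketch: open the repo in the current directory
     * and print what HEAD refers to. Build with: cc demo.c -lgit2 */
    #include <stdio.h>
    #include <git2.h>

    int main(void)
    {
        git_libgit2_init();

        git_repository *repo = NULL;
        if (git_repository_open(&repo, ".") != 0) {
            fprintf(stderr, "not a git repository: %s\n",
                    git_error_last()->message);
            git_libgit2_shutdown();
            return 1;
        }

        git_reference *head = NULL;
        if (git_repository_head(&head, repo) == 0) {
            /* e.g. "refs/heads/main" */
            printf("HEAD -> %s\n", git_reference_name(head));
            git_reference_free(head);
        }

        git_repository_free(repo);
        git_libgit2_shutdown();
        return 0;
    }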
Everything you say is correct, and may even be a word to the wise, as it were -- but there are always new command-line apps made by people who haven't been thoroughly exposed to the true Unix/Linux/POSIX way, and need to have that kind of thing explained. [1] [2] [3]
Which is to say that your comments are not self-evident to everyone, by any means.
Example: less(1) was eventually fitted with the -R flag to pass ANSI color escapes through to the terminal, making it friendly to piping from highlight-producing apps instead of its default, highlight-unfriendly handling of control/escape sequences.
Behavior that is perfectly benign when writing to a file sometimes requires a little more thought to be simultaneously friendly to pipes, except for the simplest of plain-text apps.
(I'm not trying to make a big point, but I just rediscovered 'less -R' out of necessity today. So, I was pondering some of these issues)
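To make the pipe-friendliness point concrete, here's a minimal sketch of the standard POSIX idiom: check isatty(3) on stdout and only emit color escapes when talking to a terminal, so pipes and redirected files get plain text. The program and its message are made up for illustration.

    /* Only colorize when stdout is a terminal; pipes get plain text. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        int color = isatty(STDOUT_FILENO);  /* 1 if stdout is a tty */

        if (color)
            printf("\033[1;31mwarning:\033[0m something happened\n");
        else
            printf("warning: something happened\n");

        return 0;
    }

Run it directly and you get color; run it through a pipe (e.g. ./demo | cat, a hypothetical name) and you get plain text -- which is exactly the behavior that keeps downstream tools like plain less happy.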
Worth a re-read for everyone (and I worry that these things may be in the process of being forgotten):
[4] Raymond didn't invent any of this, but he explained it very nicely, IMHO.
Edit: P.S. the above is somewhat meandering, and is sure to be criticized because simple equivalence of piped and file input is the most important thing, which I agree with -- but I was mixing that with a secondary point because of what I was doing today. Sorry. "I did not have time to make this letter shorter" etc.
Everything coming out now went through him. Do you really think it only takes a year to develop products? The maps team had been working on the new maps for years. I guess Apple should have invented warp drive in the last year too.
I think they would have released it. It seems they had little choice. It was either release it or renegotiate their contract with Google since it is almost up.
It isn't his business acumen or presentation skills that people miss. It is his ability to inspire people to do better, his passion for doing something amazing.
In many ways John Lennon was the same. Read a biography of him and listen to outtakes from Beatles recordings to see what I mean.
Imagine if we all had that kind of passion and devotion to being and creating the best. Imagine if we all created such amazing things that Steve Jobs and John Lennon looked... normal.
Or going to the bathroom, driving their cars, sleeping... For example, I've only seen the Godfather once. My electric toothbrush, on the other hand? I use that every day! Take that Coppola!
I think Jobs himself proved that software isn't art, and "hackers" are not artistes. I submit the following conjecture: The complaints about the app store approval process would be nothing compared to the shitstorm if Apple started blocking songs and films from iTunes. Remember, Jobs is the one who, if he had gotten his way, would make all software go through a central committee before you could use it. ¡Viva la Revolución! Also, eff the Beatles. Put on some Wilson Pickett or Motown, or your pick of their better contemporaries.
I would spend a little extra time up front and save many, many expensive rewrites later on. Use C from the start and you won't need to rewrite; just scale with hardware.
This argument that using Ruby (or any language that makes programming easy but runs like treacle) just to get something out the door a few months earlier is bullshit. I would rather release something a few months later that I didn't have to totally rewrite ten times down the line.
The reason why "getting it out the door" is so important is because nobody behind the door generally has any idea about what people will really pay for. By making it fast (read: cheap) to make, if you find that you don't get it right, you can try again, and again, and again.
Now, you can argue that "they shouldn't be releasing without knowing they will be successful", but if they have the ability to see the future, they should just take that VC money and put it in the market. People like Steve Blank and Eric Ries have a lot of evidence that running most startups (software startups in particular) with a "build it and they will come" attitude is rife with failure.
Now, if you are just building for fun, that's different. I knew a guy building a Dropbox clone in C. More power to him (though I wouldn't have touched it, because I think it is too hard to get security right all the time in a language like C).
I can turn around a project in a few months easily, and all my _running_ software will be C or C++. The secret is leverage and code generation (where I will use a scripting language).
This would only make sense if you're working for an established company that can burn a few extra months early on. For a startup the runway might be too short. In this case you have no idea if your product is even going to exist a year later so it seems a bit premature to be worrying about the possibility of rewrites.
My experience is that software always takes much longer than expected. If your startup will fail because you ran a few months over, you may as well go and put all your money on black at the casino. You need to get into a position where you can run a year over and still survive, i.e. have other income while working on your startup.
Wow!! Your comment encapsulates the thinking of every wannabe startup loser that ends up smoking crack on some beach in San Francisco.
Will making it faster increase the chances of more people using it? Also, this attitude of building shit products (yes, software is a product) because, well, the chances are it will fail anyway, is precisely why so many startups fail. This is why you get one crapola startup after another...
"Show HN: Crap.ly -- we built this in 3 weeks using Ruby, it will scale up to 20 users before craping out, its a twitter scraper that connects to app.net and diaspora showing how much Money your Kick-starter project has made and includes quotes from Paul Graham about how to build a successful startup, because he has built so many"
Imagine if Linus had built Linux using fucking Node.js.
Imagine your shock when you build your craptastic app in C++ and wonder why it's not any faster than a scripted app. I'll give you a hint ahead of time: most bottlenecks aren't processing time, they're DB or IO limited.
Not strictly true. In the bigger companies, everyone knows that DBs (which is really just a special case of I/O) and I/O are slow (we tend not to hire them if they don't - one of my favoured systems & arch. interview questions[1] is on memory/storage hierarchies). The more interesting problem is what ELSE you've traded off to eliminate I/O on the common path - and in many languages it turns out to be memory fragmentation and/or garbage collection time. In Java, stop-the-world GC pauses of multiple seconds are not unheard of, and it's no fun being Oracle's test bed for undocumented GC features. It's also no fun when you find out you can't restart your multi-gigabyte heap JVM because the underlying OS has fragmented RAM so badly it can't alloc that heap in a single contiguous chunk.
This is something newbies repeat to justify their choice of a piss-poor implementation. Sure, with 10 users hitting your Rails app it might perform the same as Nginx with a custom C handler talking to a database written in C, but increase the load and soon you're handing Amazon thousands a month for EC2 nodes while I am still on a small Linode. The funny thing is, because of code gen and experience, I can probably write my app faster than you can while dicking around with framework after framework.
App speed matters some. But not nearly as much as a lot of other things. And when speed is the biggest problem, your users can tell you and the fixes are pretty straightforward.
The hard things to solve require a lot of user-facing iteration. Basically, the faster you can try new things, the more likely you'll get product-market fit before you run out of money.
My feelings on Rails are decidedly mixed, but fast prototyping is one of the things they got right.
What makes you think you know the problem space enough to make a C implementation that you won't have to rewrite 10 times? Writing in C certainly doesn't stop any rewrites at least where I work.
This. I'm as hardcore a C programmer as they come, and I still find it better to start with Python and reimplement in C once the application becomes familiar and finite. It's faster, easier, and cheaper.
The basic notion is that every bit of code is better the second time it's written, and C development is just too slow for the first iteration.
Why are they moving from Ruby if it's just "string manipulation and database access"?
The problem, folks, is that as soon as you get a large number of users, every compromise you made by using a toy language or database is magnified 1000 times. Google didn't implement in some scripting language just to get to market a few months early.
BackRub is written in Java and Python and runs on several Sun Ultras and Intel Pentiums running Linux. The primary database is kept on an Sun Ultra II with 28GB of disk. Scott Hassan and Alan Steremberg have provided a great deal of very talented implementation help. Sergey Brin has also been very involved and deserves many thanks.
-- Larry Page, page@cs.stanford.edu
----------------------------
There's something to be said about rapid prototyping and evaluation.
The point is that most companies don't get to the point that the difference between C++ and Python matters. Worrying more about the business and less about the technology will be more likely to see you succeed than worrying more about the technology and less about the business.
I don't believe all companies can survive with a Python or Ruby solution. I do think that, as technologists, we worry too much about the "optimal" solution to technical problems when, in most cases, businesses are made or lost on people problems. People problems are really hard because there are few "right" answers. Instead, it is an optimization game, and optimization games require agility.
If you are so amazing at C or C++ that you can iterate amazingly quickly with them, then use them! That will give you a leg up later. I've been developing software for 15 years and have used everything from C and C++ to Java to Python to Objective-C, and I've seen a massive difference in my ability to iterate with each of them.
Optimize for what works best for you, but don't be surprised if you choose C++ and a competitor who doesn't care about the "perfect" solution runs circles around you in the market because they chose something different (even if they re-write in C++ in 10 years, after they've stolen all of your customers).
How about worrying about both equally? If you still don't get it, see Diaspora: great idea and lots of buzz, but the implementation was shit and it was DOA.