When someone says "I learned how to ride a bike over the weekend", the response is never "I don't see you doing tricks in a half-pipe, so you really only 'learned how to pedal down the street'. Don't call that learning how to ride a bike."
If "knowing how to code" means that you've built webapps, databases, embedded systems, mobile apps, operating systems, console games, and a compiler and mastered 19 programming languages, then very, very few people in world are coders.
Hacker News, stop being pedantic downers, and instead applaud someone for picking up a new skill in life (and trying to help others learn as well).
Depending on one's perspective, one might consider this more analogous to claiming you can ride a bike when, as it transpires, you only learned to ride with training wheels. You can write the analogy however you like to suit the point you're trying to make.
I don't think it's unreasonable for folks to suggest that "Build web apps" is maybe a more apt title than "code" for what's described here. It would not harm this well-intentioned and well-executed article to do so.
The problem is one of terminology. But ignoring that issue doesn't make it go away.
People talk about "coding" or "programming" or "software engineering" as if such simple terms can encompass the huge range of complexities involved.
Coding is on the same scale as, say, carpentry. One might say "I learned carpentry over the weekend" and it would be obvious that what was meant was "I learned some small amount of wood working over the weekend" rather than "I became a master carpenter over the weekend".
Agreed. A lot of pedantry and smartest-person-in-the-room-ism going on here. People around here seem so jaded. Out in the real world (i.e., not Silicon Valley) a professional programmer graduates from a CS or CIS program, gets a job doing mindless coding somewhere like a local government office, and writes ASP, .NET, or some form of C all day. They get decent at maybe three languages, stagnate, and become dinosaurs. That's why so many systems suck when those guys retire and the new guys come in. I think that sucks and it should be avoided, but my point is: what does make you a real coder/programmer/whatever you want to call it? Is it getting paid, or is it knowing the ins and outs of the latest cool new thing? Because that's what it seems like everyone is getting at.

I have a deep respect for everyone here who I know knows far more than I could hope to know, but at this moment I'm a little disgusted by how out of touch everyone seems. I'd submit that there is a definition of a "real" coder, but unfortunately it cannot be quantified. You know it when you see it. There's a combination of subjective skill, clout, and aloofness that characterizes the kind of real programmer touted around here, but in the real world they're a lot like the cheap guys in India that a lot of people like to deride. I didn't know programmers could be such snooty hipsters.
Agreed, but you have to start somewhere. I think one of the things that turns lots of people off from CS education, and programming in general, is how long it takes to get from just starting out to making something cool.
Of course, of course. All I know how to do is web development (besides some Mathematica for research and dicking-around). This isn't like I am standing on top of an intellectual mountain, shouting down at the ignorant masses gathered below, "Stop worshiping your false golden Rails idols".
If I could add a stage though, it would probably be something like "Cluster 5: Why does my website suck?" Sure it's there, it's up on the WWW, but a website made by someone at Cluster 4 isn't something that is going to make you money.
Cluster 5 is when all the CS comes in. You have to learn about algorithms and data structures and regular expressions and what is actually going on behind the scenes. Yes, you can make a website that does the basic simple "of course, here's an exhaustive search" solution to the given problem. But can you make it fast? Or beautiful? Or, heaven forbid, intuitive? Can you fix it after you try and fail to make it the above three? That's the stage that never ends, the one that doesn't take a week of hunkering down with a textbook to get through.
Cluster 5 is the stage that CS education tries to start everybody at, without any prior experience. It works for the imaginative people who can make the leap to why this matters, but it leaves most people behind. With web programming, you actually have motivation to learn these weird things about patterns of electrons.
Because, honestly, who wants to make websites that suck?
I agree with this entirely, and I fully understand that web development represents a tiny slice of coding generally and that what I've presented is a tiny fraction of what you need to learn to be a good web developer.
I plan to edit the post a little to reflect what people have said. Hope the title wasn't too misleading.
There are two things: computer science and programming. You can learn one without being competent at the other.
Programming is programming. I don't care if it is just Javascript on a web page, or if it is C. The key is being able to step through the code and think through exactly what it is telling the computer to do.
When I started teaching my son to program we'd play games like "tell me how to get home from here," and I would pretend to follow his instructions to the letter, showing him how the exacting nature of computer instructions work. Every time he'd use a mistaken preposition we'd pretend we got stuck somewhere, or in the river or something. It was a lot of fun.
Computer science is different. Yes, it's helpful if you are in certain fields, but let's not confuse it with programming.
I think learning to develop websites with languages like Python and Ruby is a relatively easy way to get into coding. I remember learning to program with C++ and seeing int main()... it just threw me into a world of confusion.
Agree... the title is misleading. I opened up the page and quickly rushed to close it before any coworkers saw something about web development and assumed I knew anything about it. We only have one web interface on our product, so it's not like it's a high risk, but it's a lot like AIDS: once you get flagged with web dev, it never goes away.
Finally, a "learning to code" article that actually mentions version control and its importance.
Of course it's an extra thing to learn when all a beginner wants to do is get something cool (often a website or game) working quickly. But what happens when that beginner wants to experiment? "Fail fast" is meaningless if those fast failures irreparably break that simple, cool website or game that used to work. Fail fast means the ability to reset (or even revert) or stash or branch and keep moving.
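For instance, the whole safety net can be a handful of commands (a rough sketch using git, since that's what most beginners will meet first):

    # snapshot the version that works
    git commit -am "simple cool site, working"

    # try the risky experiment on its own branch
    git checkout -b experiment
    # ...hack away; if it all breaks:
    git checkout master && git branch -D experiment

    # or set aside uncommitted changes without losing them
    git stash

None of these touch the last known-good commit, which is the point: failure stays cheap.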
I agree that the title should be "Learning to develop web applications".
"Web Application Development" is only a subset of "Application Development" which is again only a subset of "Coding/Programming".
There are many kinds of application development - web application development, desktop application development, mobile application development, embedded application development, game development, etc. Web application development is just the most popular kind.
Also, not all coding is done to develop applications. There is a difference between a program and an application. Sometimes people code to write utility programs, e.g. shell scripts, which aren't applications.
I believe undergraduate courses should be designed to provide solid fundamental knowledge of computer science and how various fields of computer science relate to each other while postgraduate programs should provide specialized education.
Yeah, you're definitely right. I chose to include "coding" over "web development" in the title only for concision/aesthetic purposes. Thanks for clearing this up for other people. Hope it wasn't too misleading.
For HTML/CSS specifically, use Chrome or Firefox's Inspect Element to take a look at web pages. It's especially instructive to examine things like 1kb grid, where the CSS itself is designed:
http://1kbgrid.com/
I studied programming in college...and while I liked the assembly-code + controller projects, I wish there was more focus on web development...because the one thing that has gotten me excited about programming is the potential to share (i.e. brag) about my work.
Making clever paint/tic-tac-toe/sudoku programs in Java/C++ is a good exercise... but I lost interest in programming when I couldn't see it being used for more day-to-day practical tasks.
One specific topic I wish had been taught was regular expressions. In my first year of comsci classes, regexes were in the optional chapters of the book. Knowing about them would've led me to do more data-processing/information-gathering type programming, which is the track I am on now.
Plus, they're just damn plain useful in every coding context, even for project code-search-and-replace.
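For instance, a throwaway snippet in Python (the names here are made up, but the re module is representative of regex support in most languages):

    import re

    line = "user=alice status=404 time=0.8s"

    # pull key=value pairs out of a log line with one pattern
    pairs = dict(re.findall(r"(\w+)=(\S+)", line))
    print(pairs["status"])                     # "404"

    # camelCase -> snake_case, the classic search-and-replace
    name = re.sub(r"([a-z])([A-Z])", r"\1_\2", "getUserName")
    print(name.lower())                        # "get_user_name"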
It was my understanding that most CS programs include a course covering formal languages and grammars. At my school it was called (creatively) Theoretical Computer Science.
Learning to break apart a regex and rebuild it as a state machine is where the real magic happens.
Any decent CS program should contain such a course, sure, but it is probably not in the required curriculum, and it's probably not taken by very many students.
As for names, at SFU (my alma mater) it's called "Formal Languages and Automata". Well, there are actually about half a dozen courses covering parts of this material, but that's the closest match for what you described.
> Learning to break apart a regex and rebuild it as a state machine is where the real magic happens.
That's useful for sure, but modern regex libraries have facilities that can only be translated into a pushdown automaton and you end up using those facilities a lot, even if you don't realize it.
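For example, backreferences (shown here in Python, though they're common to all Perl-style engines) already match things no finite state machine can:

    import re

    # group 1 must be repeated verbatim -- the language of
    # "doubled words" is beyond any finite automaton
    doubled = re.compile(r"^(\w+)\1$")

    print(bool(doubled.match("abcabc")))   # True
    print(bool(doubled.match("abcabd")))   # False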
I actually do think the web is a good place to start, especially with a backend in Python or Ruby. You get to quickly see the results of what you're doing, which will spur you to learn more.
I definitely do not think that books, tutorials or classes are the way to learn programming at first. The thrill of discovering the concept of recursion on my own really hooked me onto coding. I remember googling "a function calling itself" semi-hoping that no one else knew about this yet...
I did Neuroscience and BME undergrad, so I had only taken the one required Java class (which I couldn't stand at the time; it was just too messy). I'm glad I didn't take more CS in college; it might have turned me off for good. Yes, once you get to where you're fluent, I think classes (or books, if you're self-motivated) in algorithms, data structures, and other hard CS are essential. But to get to that point, think of a creative project and try to build it. You'll really fall in love with the language you use, and you'll understand the real reason why people program: because it makes you feel like a god.
Great post and a lot of useful information. Particularly relevant for me as I'm a noob starting out on my first project, much like described in the post.
I've followed a very similar roadmap in learning bits of HTML/CSS (highly recommend the Head First book), JavaScript, version control, Ruby, and doing Hartl's 'Rails Tutorial'.
I didn't get a job at Pivotal Labs in SF like the author (which I'm guessing would be a pretty awesome learning environment) but I've managed to convince the 'Pivotal Labs' of the Philippines to take me in & provide mentorship ;) If anyone is interested I've just started keeping a log of each step along the way at http://www.ralphy.tumblr.com. (Will update with more posts soon).
So far, it's been tough and I'm realizing how little I know in the whole spectrum of Rails, let alone web development. Having said that I'm already building stuff on the web I can actually see and show my friends, which is kinda awesome.
I am following exactly what this guy says, but it took me a long time, over three years, to finally settle on what's important for this kind of career. I revisited C so that I can keep up with git code and learn from it; I bought a Rails book, and I am trying, although very slowly, to do anything that comes to my mind in Ruby. This is not about Rails, it's about a healthy shortcut. I was not lucky enough to read something like this sooner. This is excellent advice, and I commend him for saying it.
I have always wondered how much CS coding knowledge applies to web apps, and how necessary it is to take some of the killer math courses offered in university to code web apps proficiently. And if there is that big a difference between coding and web apps, wouldn't universities and students be better off learning how to code web apps, since they seem to be in dominant usage anyway?
Btw, I am not a technical person, hence the curiosity. Insights would be much appreciated.
> how much of CS coding knowledge applies to web apps
That depends on your web application and what programming problems you need to solve. If you only need to glue together other people's code and components, then not so much. If you need to write your own framework, crunch ratings data for recommendations, custom binary serialize data for your ux layer, or do complexity analysis of some backend algorithms, then a CS major will help you a lot.
> shouldn't universities and students be better off learning how to code web apps since they seem to be in dominant usage
Oh no they're not. The vast majority of software being produced is for internal corporate use, helping businesses on the inside.
Also, universities do teach web apps, it's just that they're not very complex or interesting problems. HTTP requests go in. HTTP responses go out. You have a framework in between that abstracts this, and there you have it.
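A minimal sketch of that in-between layer, as a bare Python WSGI app with no framework at all (wsgiref ships in the standard library):

    from wsgiref.simple_server import make_server

    def app(environ, start_response):
        # the request comes in as a plain dict...
        body = ("You asked for " + environ["PATH_INFO"]).encode()
        # ...the response goes out as status, headers, and bytes
        start_response("200 OK", [("Content-Type", "text/plain")])
        return [body]

    make_server("localhost", 8000, app).serve_forever()

Everything a framework adds (routing, templates, sessions) is layered on top of exactly this exchange.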
That is very true for CRUD apps that make up the majority of web apps. And if you need to do something more complex with a web app, like analyzing data, the CS knowledge required is no different than if it was an offline app, i.e., a CS degree would be very helpful.
However, I think that for the majority of programming jobs out there, a full scale CS degree is a bit overkill.
If you're hiring someone to produce CAD drawings, you don't need a full fledged engineer or architect; you can find someone with a certificate from a tech school.
I wonder if programming will eventually go that route: 4-year degrees for Software Engineers, and tech school certs for Software Technicians.
A few weeks ago I was listening to a show on NPR about how to get into programming (I believe one of the founders of Codecademy was a guest) and one of the guests said something that has been running through my head since:
5-10 years from now, entry/junior-level programming will be a blue-collar job.
Think back to 1992. 20 years ago. Using the 'web'. Well, Gopher sites, and Veronica searches. Then fast forward to 2002. How much changed? How much was new, and how well did the signal hold up to the noise? We went from pure text, which was mostly well-written, to a heavily graphical environment. Pages didn't render so hot, but some of the crazier websites were just about to do some really impressive things. Webmail was becoming a cute way to check your mail real quick when you weren't at your regular computer.
Now you're in 2002 and are warping to the present day. Now how much was new? Somewhere in between it got fast and diverse enough to replace television. Browsers got real good at all rendering the same thing. The client side code started getting robust. The noise level is at an all time high. Advertising runs rampant. Fraud is prolific and all sorts of infrastructure is integrated and allows all sorts of use and misuse. Add to this the upcoming significance of mobile computing.
Now with this cadence of the imagination, pole-vault yourself into 2022. It's ten years from now. How is the signal compared to the noise? How is the 'web' doing? How are the programmers that created it? Their upcoming replacements have never known a webless world. Few were trained in college or vocational centers; most simply picked the skills up as part of the natural landscape of growing up. Programming has had another 10 years of abstraction. Graphical environments let programmers metaphorically build entire program flows the same way a call center employee reads a script. Creativity within these environments is stifled, undesired, expensive. But it pays the bills and keeps the net flowing back at home.
The blue collar programmer of 2022 goes home from their job, fatigued from a day of pointing their fingers at 4 foot glass displays. They feel a relief when they get home to an 8 foot display, where they can run some white collar's program for the remainder of the night. The program is a mix of video games and social networking. It's how a person from 2012 might have felt hanging out at a cheap bar with a few ipads and college buddies. The experience keeps its user distracted and content, the perfectly designed program to contain the 8.3 billion people ever expanding for space and resources.
You can do quite a lot without knowing much official computer science theory. What CS theory does is teach you what's been done already and what's possible. That helps you avoid some classic pitfalls and keeps you from wasting time doing things someone else already solved.
Great, in that case wouldn't it be easier to simply pick up a book or two, or listen to a few open course lectures, to get a referential understanding of the classic pitfalls in CS, while focusing hardcore on web languages like Python :) and learning "Python the Hard Way" :)
To give an example: I'm working on something that is on one level a simple web app. But then, I needed to code what amounted to a concurrent merge sort (merging multiple streams of information (and their histories) that are each ordered but not relative to each other — think Twitter streams, since that's what they were.)
Because I have a degree in computer science, that theory background meant I could quickly recognise the type of problem I was trying to solve and apply standard stuff I had learned to it. Without that theoretical background, I would have developed something that worked, but it would have taken me longer, it wouldn't work so well, and would probably break in unpredictable ways :)
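Not the parent's actual code, but as an illustration of "standard stuff": Python's standard library already has a lazy k-way merge, which is the textbook answer once you recognize the problem:

    import heapq

    # three streams, each ordered internally but not relative
    # to each other (think timestamped tweets)
    a = [1, 4, 9]
    b = [2, 3, 10]
    c = [5, 6, 7]

    # k-way merge via a heap: O(n log k), and lazy
    print(list(heapq.merge(a, b, c)))
    # [1, 2, 3, 4, 5, 6, 7, 9, 10]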
I think saying that web apps are in "dominant usage" right now is a bit unsubstantiated--there are plenty of jobs that do not involve web apps.
Also, you have to remember that universities are not about preparing you for a career. You do not need 4 years of general education if your only goal is to produce web apps quickly.
Finally, non-trivial web apps--those doing something complicated or needing to scale significantly--do need CS knowledge, and even simple ones doubtlessly benefit from it.
Computer Science is an academic subject; famously, the computer scientist Dijkstra didn't have a personal computer for a very long time. So you're right that you don't need CS for web apps.
You also don't need to be a surgeon to put in a few stitches (though you will have very nice stitches if you are one).
In other words, if all you can do after a Computer Science course is web apps, ask for your money back. As you alluded to, simple web apps have never been easier, but there are lots of highly interesting and difficult coding problems outside of that niche (think search engines, aircraft software, operating systems, and the list goes on).
Thanks for all the replies below; they clarify quite a bit. Here is another question: how much of the "incredibly difficult" math actually comes in handy and serves real coding purposes, other than honing your ability to think logically? Math can be incredibly conceptual at times, and as someone with a psychology background, I know that a person can be not very good at math due to an inability to think conceptually, yet be downright logical on matters they can grasp practically. What do you guys think?
While it depends on what you mean by "incredibly difficult", I would say that if you are doing any work with 3d modelling, it would be necessary to understand basic linear algebra. These concepts will transfer directly - if you are representing your model with points in 3-dimensional space, then displaying those points on a screen is a projection map, and can be described (and quickly calculated) with a matrix. This is a typical example.
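A toy sketch in Python with NumPy (an orthographic projection, which just drops the z coordinate; a perspective projection is the same idea with a different matrix):

    import numpy as np

    # projection onto the xy-plane as a 2x3 matrix
    P = np.array([[1, 0, 0],
                  [0, 1, 0]])

    points = np.array([[1.0, 2.0, 3.0],    # one 3D point per row
                       [4.0, 5.0, 6.0]])

    # every point projected in a single matrix multiply
    print(points.dot(P.T))
    # [[ 1.  2.]
    #  [ 4.  5.]]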
Another common example would be a course in number theory. A common topic in number theory is public-key encryption, much more in-depth than is sometimes covered in other courses. In addition, the techniques that you learn in such a course can easily be applied to any "number-crunching" you would need to do. Here is an example:
Problem 1 from Project Euler (http://projecteuler.net/problem=1): Find the sum of all the multiples of 3 or 5 below 1000. A brute-force program for this is simple, but it is much faster if we first notice some patterns: (3 + 6 + 9 + ... + 999) = 3 * (1 + 2 + ... + 333). But then (1 + 2 + ... + 333) = 333 * 334 / 2 (this would be easily recognized by someone who has taken a number theory class), and so the sum of all of the multiples of 3 below 1000 is 3 * 333 * 334 / 2. Similarly, for 5, there are 199 multiples, so the sum of all of these is 5 * 199 * 200 / 2. But then we must subtract the multiples of 15, since we double counted them: there are floor(999/15) = 66, with sum 15 * 66 * 67 / 2. So the answer is (3 * 333 * 334 + 5 * 199 * 200 - 15 * 66 * 67) / 2 = 233168.
That was a particularly trivial example - number theory will give you a lot more than some parlor tricks like the above. But this example took a loop through n numbers and reduced it to a single line of simple arithmetic. And for someone who understands number theory, this algorithm is as quick to think of as it is for a typical CS student to think of the brute-force algorithm.
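Both versions side by side, in Python (purely illustrative):

    # brute force: touch all thousand numbers
    print(sum(n for n in range(1000) if n % 3 == 0 or n % 5 == 0))

    # closed form: sum of multiples of k below N is k*m*(m+1)/2
    # with m = (N-1)//k; inclusion-exclusion removes the overlap
    def mult_sum(k, n=1000):
        m = (n - 1) // k
        return k * m * (m + 1) // 2

    print(mult_sum(3) + mult_sum(5) - mult_sum(15))   # 233168, same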
I would also venture that a course or two in "Abstract Algebra" would be both beneficial and accessible to CS majors. This is less immediately practical than linear algebra in the above example, but closer in its style of thought to programming than, say, calculus, topology, or differential equations. The abstraction (which many math majors complain about) would be fairly straightforward for a CS major, I would think.
I have seen many good programmers who are terrible at math. Being bad at math doesn't mean someone is less logical or less intelligent; it comes down to the student's interest in the subject. For me, math was always boring (other than algebra or geometry) because I was unable to see the reason for learning it. I always used to think: why in the hell am I supposed to learn this boring calculus? Where am I going to use it? So, bottom line: I'm not good at math, but I am a decent programmer if not a good one. Math is not necessary to be a good programmer, but it certainly gives you an edge in some specific fields of computer science - for example, computer graphics.
I used the exact same approach, except I went with Django and DotCloud. I think Rails people have it even easier with better guides - and possibly documentation.
Great article! I wish I had read this when I first started coding. I am sending this to all my friends who are now asking how they can learn to build web apps.
To learn about how a web server actually works, I'd suggest setting up your own VPS on Linode. They've got a great library of information to help someone get started: http://library.linode.com
If you're coming from a Windows background and don't have any Linux/Unix command line experience, though, this will be daunting.
I spent all of yesterday trying to upgrade ruby/rails on my linode for the first time. Kept running into errors and eventually decided to just start over with a clean Ubuntu install. That kind of demoralizing experience isn't a problem when starting with Heroku.
RubyGems and Linux (especially Debian / Ubuntu) are like oil and water.
Don't install Ruby libraries through apt-get, and also don't install them through RubyGems globally (i.e. never use "sudo" for installing gems).
Instead install gems for the local user only, with the help of RVM. Then to upgrade, if everything breaks, you can just delete your RVM directory and start from scratch again.
Of course, this brings with it a whole other can of worms, because then it's your responsibility to upgrade your libraries with the latest security-related fixes (which otherwise would have been handled by a simple "aptitude safe-upgrade", which can be a cronjob or something). But then you've got the same responsibility on Heroku.
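Roughly, the workflow looks like this (a sketch -- exact RVM invocations vary by version):

    # install a ruby under ~/.rvm, owned by your user
    rvm install 1.9.3
    rvm use 1.9.3 --default

    # gems now land in ~/.rvm too; note: no sudo
    gem install rails

    # the nuclear option when everything breaks
    rm -rf ~/.rvm    # then reinstall from scratch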
As someone in the process of learning Rails myself I absolutely agree with your blog post. Especially your admonition to begin working on a project of your own much earlier than when you feel ready. That is the surest way to reinforce what you may be learning through a tutorial.
The thing is... you probably also need rather high aptitude to make that much progress in 6 months. These blog posts never mention that because the authors are too modest.
jQuery's a great choice for beginners learning to program for the first time. Two related frameworks, underscore and d3, both have jQuery-inspired APIs and use higher-order functions extensively. These frameworks teach lessons in functional programming that transfer easily to Python, Lisp, Haskell, and even mathematics.
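For instance, here's the same filter/map idiom in Python -- a toy example, but it shows how directly the style transfers:

    users = [{"name": "ada", "active": True},
             {"name": "bob", "active": False}]

    # underscore's _.filter and _.map, in Python clothing
    names = list(map(lambda u: u["name"].title(),
                     filter(lambda u: u["active"], users)))
    print(names)   # ['Ada']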
Well, consider all of the things that have to be in place for the web app to get from where it is served, to where you're reading it.
The web app itself is typically going to be running on a framework, written in another language - for example, a site might use Rails, which is written in Ruby.
This has to be served by a web server - Apache is a popular choice - which is written in C.
The webserver runs on a machine, which might be running, say, Linux, as its operating system. The Linux kernel is written in C.
Now the page has to be transmitted across the network - hopping across the globe from one router to the next. The hardware that does all this switching is in microchips, which are often designed in a language called Verilog. (This is my own little niche area)
The browser that you're using to view the page might be running on a laptop running Windows, which is largely written in C++.
There are countless other fields of "coding", too.
So to summarize, writing web apps is a subset of a vast and complex field of study - and to follow this roadmap (which is probably a good way to learn what the author wanted to learn) will not teach you anything about most of the other things one might want to code.
For what it's worth, in setting out on your journey to learn about programming (in the broader sense), I think you could do worse than to start with C and Scheme.
No, I'm saying that knowing C and Scheme is a great foundation for future computer scientists and engineers! They lay a solid foundation upon which much can be built.
But probably not a good place to start if, like the original author, you're mainly interested in developing web apps.
Coding/programming encompasses many programming languages, frameworks, platforms, etc.
Web applications use a subset of these, mostly markup languages (HTML/CSS) and some web languages (JavaScript/PHP). They are deployed to servers running HTTP software, and are built for handling web traffic.
If "knowing how to code" means that you've built webapps, databases, embedded systems, mobile apps, operating systems, console games, and a compiler and mastered 19 programming languages, then very, very few people in world are coders.
Hacker News, stop being pedantic downers, and instead applaud someone for picking up a new skill in life (and trying to help others learn as well).