If you are writing basic CRUD apps, 12 weeks of good instruction can make you (somewhat) productive. One of my former employers (ThoughtWorks) had a 6-week 'immersion' programming training, which ramped people just out of school up to the point where they could contribute to a production app. Of course, they had plenty of ongoing mentoring and supervision, a great company culture (for an enterprise services firm), and so on, but it can be done, for some definition of 'done'.[1]
But the more theory/Computer Science knowledge your job requires, the less relevant these boot camps become. I personally think of bootcamps as a borderline scam engaged in a 'selling shovels in a gold rush' business.
Any halfway motivated individual with a laptop and an internet connection can learn their syllabus for free, but it takes a certain type of individual to do that: one who is not dependent on external discipline, can tolerate ambiguity, etc.
And if I'm hiring, even for a CRUD job, I would be very dubious about bootcamp resumes (just a personal opinion, YMMV). They seem to combine the worst of credentialing and frivolous 'education'.
[1] Whether that ultra basic knowledge is worth the insane amounts charged by some of these 'boot camps' is a different issue altogether. As a completely self taught developer, I would never pay those amounts for that amount of learning, but hey different strokes for different folks.
If you look at the graduates of the top bootcamps, these are folks who studied music theory at Harvard, physics at Yale, math at Stanford, EE at UMich, mechanical engineering at UC Berkeley, etc. These are very smart folks who have proven through their education and careers that they can dive into new challenges. They have often already taken a few online courses. The acceptance rate is very low (5% or less) and the job placement rates are very high (over 90%). These aren't diploma mills, and the connections, intensive study, preparation for interviews, negotiation, resume review, peer support group, help finding positions, etc. are things you simply don't get on your own. The good bootcamps provide significantly more than just learning programming.
I'm a technical college graduate, so I have no pretensions about this stuff. But I view the bootcamps as part of the current frothiness and dysfunction around front-end development in general.
The idea that one is prepared to grapple with software applications development without AT LEAST 2-3 years of intense study is absurd to me. After school, I still needed a year of interning to get up to speed.
I study CS at UC Berkeley. If you want to be a web developer, a mobile developer, a product engineer, or something like that, go to a bootcamp. That, plus 2-3 years of work experience, is a much better deal than studying CS in college, even Berkeley, and even if you get scholarships.
But I would like to point out that many of these criticisms don't apply to top schools like Berkeley. Berkeley's intro CS classes are all taught by lecturers, and the lecturers are given tenure (i.e., they are evaluated solely on teaching). As a result, I have had a nearly uniformly positive experience with the quality of teaching in CS, extending to, and especially including, professors who are also leading researchers.
There are also more areas than building apps. If you're interested in distributed systems, robotics, machine learning, finance, etc., then going to a great CS school like Berkeley is a fantastic choice. There is truly cutting-edge, industry-leading work being done in distributed systems, computer architecture, machine learning, deep learning, and databases at Berkeley, and undergrads are able to contribute to it.
I think your assertion that you need to study at a higher education institution to do something besides building apps is sadly true.
That statement, however, has nothing to do with whether or not the form of education at colleges is better than bootcamps at training individuals to become good at some task set X.
It does say something about the barriers to entry in those other fields. For instance, in robotics, money for hardware and access to cutting-edge research and journals are the minimum requirements for anyone interested in jumping into that field, and this jump is made easier for students who have access to all of this via their university.
I would hypothesize that the form of intense applied study that bootcamps offer is better suited than the traditional 4 year CS degree for training individuals in any field, whether that be research in theoretical computer science or building a distributed database system, mainly because of the higher degree of specialization, applied project based approach, and dedicated mentor based guidance in the bootcamp model.
This topic is fascinating. I think we have so much more space for vastly improving the output of the education process at every level of the system. But I think a lot of improvement domains will depend on the degree to which we can alter and manipulate the current status quo in industry and the system itself.
Okay, I agree in the abstract that "intense immersive study" is a better form of education than any other. This is really obvious. But the breadth and depth of knowledge required in these areas is huge. (E.g., for distributed systems, you'd ideally understand things at all levels of the computing stack: OS, networking, databases, C/C++, data structures, algorithms, some basic discrete math.) This could easily require a "bootcamp" to be a year long. Would it really be a bootcamp then?
Also, the world experts in all those areas are at research universities. Now, it's true you could probably offer them a lot of money to teach bootcamp courses, but one of the best routes to developing skills in these areas is to do actual research in these fields. Research takes a long time (> 1 year), so this goes back to my original point.
I was speaking with a friend of mine who recently graduated med school about this topic. An insightful point he made was that with some careers, you need a lot of education before you can start working, like being a doctor. In other areas you're able to be productive in a workplace with less "education" than you find in a university program.
As a former bootcamp instructor, I have found it amazing to watch the careers of some of my students. Bootcamps are more about setting up a student's learning trajectory than making them a computer scientist. I think most software developers would say that this is the kind of career which requires continuous improvement and learning; if you give a student the tools to learn and improve, then you've set them up on a good path.
Can you become a software engineer in 12 weeks? Probably not, but I've seen a good number of my students become fairly successful engineers within 1-2 years of completing the program I worked at.
We're painting in broad strokes as there are exceptions to every trend. But in my experience, bootcamp graduates are more like tinkerers and university graduates are more like engineers.
Most bootcamps don't cover computational complexity, for example. Or functional programming, or object-oriented design, or music or ethics or philosophy. Instead, most bootcamps focus on tools rather than ideas.
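To make the "ideas vs. tools" point concrete, here's a toy sketch (my own illustration, not drawn from any bootcamp syllabus) of the same computation written imperatively and then functionally:

    # Summing the squares of the even numbers, two ways.
    nums = [1, 2, 3, 4, 5, 6]

    # Imperative style: mutate an accumulator step by step.
    total = 0
    for n in nums:
        if n % 2 == 0:
            total += n * n

    # Functional style: describe the result as a pipeline of transformations.
    total_fp = sum(n * n for n in nums if n % 2 == 0)

    assert total == total_fp == 56

The point isn't the syntax; it's that a curriculum built around ideas teaches why the second form is easier to reason about, not just which framework to type it into.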
As a CS/EE grad who worked as an EE for 7 years and then several more as a fullstack developer (I got sick of the EE world), serving as an individual contributor, lead, and manager, I have to completely disagree with this characterization.
Honestly I think being a good engineer is 10% domain training (CS Degree etc), 45% learning good practices on the job from more senior people, and 45% individual temperament. I have a degree from a very good CS program and yes I learned a lot about languages, theory, algorithms, logic, etc and I use about 1% of it.
I've worked with tons of CS, EE, etc. grads from good programs who were terrible engineers because, even though they had a more formal background, they were sloppy, didn't practice decent engineering rigor, didn't follow good practices for code management, weren't good or methodical collaborators, etc.
In the last few years I've hired several bootcamp people and several CS grads with varying degrees of experience and the best people are just the best people, independent of their education, because of their passion and discipline for their craft. Of course on day one an AppAcademy grad doesn't know jack but 2 years later they have every opportunity to be as good or better than your average CS grad with the same experience.
My college was far from bad, and covered lots of theoretical[1] ideas. I'd be surprised if 10% of graduates understood them. Unless you are very abstract and passionate, I believe 5 years aren't enough to connect all the dots.
[1] Surprisingly, we didn't have a complexity course per se. We had optimizing compilation, type theory/interpretation, abstract algebra/cryptography, all forms of relational algebras/calculus[2], formal systems/symbolic processing, etc. More than 50% of students failed, 25% struggled to get points, and 25% kept their heads above water.
[2] Which is funny: some concepts like the transitive closure of relations and DB normalization seemed hard and like idealized toys, but much later I found that the first is similar to an abstract recursive fold (and folds are nice to have), and normalization gives you an 'object' modeling theory better than most floating OO principles you read in books.
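To make the fold analogy concrete, here's a minimal sketch (my own illustration; representing the relation as a set of (a, b) pairs is an assumption on my part). The closure is built by repeatedly folding newly derived pairs into an accumulator until a fixed point is reached:

    def compose(r, s):
        # Relational composition: (a, c) whenever (a, b) in r and (b, c) in s.
        return {(a, c) for (a, b) in r for (b2, c) in s if b == b2}

    def transitive_closure(r):
        # Fold newly derived pairs into the accumulator until nothing changes.
        closure = set(r)
        while True:
            extended = closure | compose(closure, r)
            if extended == closure:
                return closure
            closure = extended

    # 1 -> 2 -> 3 yields the derived pair (1, 3) as well.
    print(transitive_closure({(1, 2), (2, 3)}))
    # Prints the three pairs (1, 2), (2, 3), (1, 3); set order may vary.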
Well, I cover that stuff. I think it's foundational to learning to program, rather than just assembling toy projects. Exposing students to the real implications of their high-level code can only make them stronger, better programmers. They don't need to write an OS to do app development, but they sure as hell need to know their big O, how to do estimation, their data structures, and some fundamental algorithms and techniques. We try to give them the tools to continue exploring as well: those who are interested tend to follow up with MOOCs, having been well versed enough in programming that they can use videos and exercises to learn on their own.
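The big-O intuition I mean is the kind that tells you which container to reach for. A toy illustration (my example here, not our actual course material):

    # Membership in a list is O(n); membership in a set is O(1) on average.
    import time

    items = list(range(1_000_000))
    as_set = set(items)

    t0 = time.perf_counter()
    found_in_list = 999_999 in items   # linear scan over the whole list
    t1 = time.perf_counter()
    found_in_set = 999_999 in as_set   # single hash lookup
    t2 = time.perf_counter()

    print(f"list: {t1 - t0:.6f}s  set: {t2 - t1:.6f}s")  # set lookup is far faster

A student who understands why the second lookup is cheap can pick the right data structure without memorizing framework trivia.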
I'm a recent graduate of Fullstack Academy, and I also happen to have a bachelor's in Computer Science. While at Fullstack we didn't spend weeks on end going over object-oriented design or algorithm design, we did have lessons on them. And I found the material to be delivered in a better format than in any of my computer science classes.
I'm sure not all CS classes are created equal, but they are usually the same length and meet only a few days a week. The difference is that at a bootcamp, you're there 10 hours a day, 5 days a week.
"The relationships built between bootcamp instructor and student are deeper than those between college professors and their students. The advice is solid too, because instructors come from the actual industry the students are trying to get into."
The relationships exist in academia too, but the students have to want them. As a PhD student with actual industry experience, I make sure I have room for at least 3-4 undergrads during any given quarter. We love research in my lab, but we also respect mentorship.
I really liked this article, and Liz seems like she is on the right side of progress. I recently helped a friend at Berkeley with her work, which led me to find out that she had never learned Java yet was expected to make a mini text editor. She explained that this was Berkeley, and not everyone was meant to pass. Liz presents things that really touched my heart: tinkering rather than jumping through hoops.
However, from working at a bootcamp and speaking to students who went to others, I can say they are expensive, disorganized, and built in a way that makes students "feel satisfied" rather than prepared and hungry. GA in particular apparently hand-holds immensely rather than having students sweat it out a bit. At 5 am, no one is there to help you or tell you what to search. Additionally, those involved in this space are usually working with big investor money and paying big in order to squeeze dollars out of college students or people who don't have the money to pay. The only person who worked with us over forty quit because he didn't want to be a tax collector. Hack Reactor seems to be the best one out there, but they are more interested in seeing who will survive than in making sure everyone can swim.
Bootcamps are imperfect but certainly in demand. I do believe they are a good mutation in the education field, cutting out fat and creating cultures. But there is still a lot of work to be done before they earn my full praise.
> project based learning, and relationship-based learning
You could also go to a university that emphasizes those aspects of learning (e.g. http://www.olin.edu/). Admittedly, there aren't many of them, but I feel like it's the best of both worlds.
I generally take these types of articles with a grain of salt. The authors are selling their services.
I've seen two types of bootcamp outcomes:
1) Data Science - For better or worse, the 3-4 data points I have did not produce great results. All but one struggled to get jobs. The one who did get a job, with help from his teacher, struggled to keep it. The bootcamp just wasn't long enough to teach what's needed.
2) Programming - Only one data point: I had a person who was a non-technical resource at a software company who used a bootcamp to get into a tech support (not development) role. He said it was much harder than his (non-technical) college degree. I'm inclined to agree, but it only moved his career forward nominally.
I'm not a fan of much of college education, but the jury is still out on whether bootcamps will supplant it.
The tendency to characterize educational institutions as 'factories' for producing intellectual laborers (or knowledge workers, to use a synonymous but less harsh-sounding term) is deeply unsettling to me. Although this piece mainly presents the author's arguments for bootcamp-contra-university in terms of her positive cultural experiences, these aren't presented on the strength of their own merits. The argument is (as usual) that the author's favored approach would increase the efficiency of the technology for transforming unskilled laborers into skilled laborers. Although this might seem like a hair-splitting point, I think that this view of education causes larger problems than irritating continental-philosophy-reading, FreeBSD-loving overall curmudgeons like yours truly.
Education in America was never intended to produce laborers. In fact, the "liberality" of those arts which form a traditional university curriculum doesn't refer to their tendency to attract war protesters, communists, or critical theorists: these "liberal" arts are the arts of free men (free as in speech, not beer, yada yada), who would need a deep understanding of history and culture to prepare them for participation in political life. Historically, this freedom was enjoyed only by clerics and aristocrats. The hope of democratic societies is to extend political agency to ever-broadening sectors of society, and to base political agency upon merit, rather than social caste. Universal education, it was argued, would produce the citizens that a large democracy would need, which would compensate for the fact that receiving such an education reduces one's available time for labor. One might recall that summer vacations were conceived to alleviate the imposition that schooling represented on a family's ability to avail themselves of the labor of their offspring [1].
Now would be a reasonable time to point out to me that political agency and economic welfare aren't exactly cleanly separable goals, and that I'm still actually just splitting hairs, and that jobs are important by golly! Point taken. But here's the punchline: I would argue that our confusion about the purpose of higher education has caused us as a society to waste a huge amount of effort 'fixing' universities to accommodate the needs of industry, rather than investing in new strategies - I think that a concerted effort to create an apprenticeship system for training IT workers (and probably other kinds of workers too) would better suit the needs of businesses, as well as the tastes of more restless students, at a vastly reduced expense. Freed from the pressure to teach industry-ready but highly transient skills, universities could be preserved as places to learn deeply, ready for students who have a knack for academic work, or have become interested in the theoretical aspects of their practical work. As it stands, the standoff between vocational and traditional perspectives is undermining our ability to teach both theory and praxis.
It's also led to the illusion that the quality of an education can be more or less straightforwardly quantified using crude economic heuristics. Maybe nobody actually believes that this is a good way to evaluate education, but the demand for such quantitative measures nevertheless persists -- to the great detriment of students and faculty. Since no individual element of a University experience can be easily correlated with these "measurements" of educational excellence, the approach by business-oriented administrators seems to be something like "let's see how much we can cut without making people too angry [2]." Of course, neither the constant cutting of less 'profitable' programs, nor the withering wages of the underclass of adjuncts, has stopped university administrations from availing themselves of the near-infinite price elasticity afforded them by the financial aid system -- after all, it is all about the bottom line, isn't it?
[1] Everything about this paragraph is pretty drastically oversimplified. For a fuller exposition of these ideas as they were expressed in the relevant historical period, check out John Dewey's Democracy and Education, published in 1916.
[2] I don't mean to imply that University administrators are moustache-twiddling evil capitalists out to suck the education system dry. It's just that, faced with the task of administering a complex system with difficult-to-measure outputs, people who are trained in business and generally encouraged to view their task in business terms seem to settle into this approach.
>>" Freed from the pressure to teach industry-ready but highly transient skills, universities could be preserved as places to learn deep, ready for students who have a knack for academic work, or have become interested in the theoretical aspects of their practical work."
As a person who was immersed in academia (even to the extent of writing a peer-reviewed article and serving as an adjunct college instructor for a period of time), I had a slightly negative view of academia for that reason alone. It appeared to be a self-referential entity, where the only people reading your research papers are other researchers who are only interested in writing their own research papers. The 'outside world' doesn't quite respect your theoretical research fully (as you are not a productive member of society actually helping out other people), and the academic world itself doesn't quite care enough about you either (too obsessed with promoting their own theoretical research).
I think universities need to lean more towards vocational skills, or to make their research more "vocationally" relevant. That way, laymen will actually see a use for academics, instead of seeing them as prestigious eggheads. Theory has its place, but its place is as part of a vocational curriculum. Nothing more. I also support even more measurement of the quality of education, because without any level of accountability, you do not know whether what you're doing is actually working. I can write and teach all I want, but if I'm not sure whether what I'm doing actually has any impact in the long term...then what's the point?
I left academia because I knew that adjuncting would be a dead-end job and that administrations prefer to hire adjuncts over full-time faculty, but it doesn't take a genius to see that low-paid teachers are going to have a negative impact on a future curriculum, even on the measurable metrics. In fact, it is possible that adjunct instructors provide the "breathing room" that allows the full-time faculty to continue to churn out research papers (although their replacements may wind up being more adjuncts). The fact that academia itself doesn't want to hire its own children suggests even further that academia needs to focus on vocational skills, so that academics can transfer out into a "post-academic" career.
As for me, I am still interested in the "theoretical aspects of [my] practical work". Maybe I will one day find a use for it. But I am unlikely to find an outlet for it within the insular and corporate nature of present-day academia. For now, theory is just a hobby.
>> The fact that academia itself doesn't want to hire its own children suggests even further that academia needs to focus on vocational skills, so that academics can transfer out into a "post-academic" career.
Or perhaps it suggests that University administrators should stop treating their professoriate as a factory for research "products", where one must continually publish to avoid being fired.
But "should" does not at all mean "will". We live in an imperfect world, and must deal with reality..or find some way to change that reality. Maybe the real solution is for us to create a "new academia" to compete against the "old academia".
I agree, thus my suggestion that we try and move towards apprenticeships. But I think we should avoid framing apprenticeships/bootcamps/whatever as a replacement for universities, and instead frame them as an alternative.
Addendum: imagine how self-referential and masturbatory Turing's 1936 paper would have seemed from the perspective of the time, especially given that it was written during the depths of the Depression.
I realize that it's a stretch to imagine university bureaucrats changing their ways. That's why I think that academics and educators should tenaciously promote the value of education for its own sake, rather than trying to attach the value of education to some economic measurement. It's the use of this kind of rhetoric by academics themselves that unsettles me - for bureaucrats it's simply par for the course.
I don't believe in "education for its own sake", though that may be because I still associate it with the "publish or perish"/mindless research attitude. I still like learning stuff for fun though, and I may eventually come around to supporting "education for its own sake", but I have to first believe it, and I cannot do so now. Practical knowledge is still better than theoretical knowledge.
I still think the solution lies outside "old" academia, though on a different track that can also run parallel to the "vocational" track of apprenticeships/bootcamps. Academics can simply publish their findings online, via comments, blogs, and open-access academic journals... and can also attempt to teach their ideas (also online). Even research grants could be replaced by Patreon and crowdfunding. A system where academics and educators slowly promote their ideas and gain followers/supporters is one that has the potential of being slightly more just than the old system.