By university, many students have already set their minds against a career in tech. We need to reach kids earlier, when they are motivated by curiosity instead of social pressures.
This came up the other day. A 19-year-old kid came to my door working for the electric company. We ended up talking a bit and he asked what I did. I explained I'm a designer/developer and I work from home. For him, that was a total mind-blow, and he explained how he had always wanted to be a game developer. I suggested that he should try, and that his current job was just a temporary thing he could use to invest in getting better.
The thing that bothered me, though, was that he didn't even think that he could do that, purely based on where he was from and what he saw around him (he alluded to being from the south side of Chicago). Stuff like Codestarter is a great step toward teaching kids of any background that they can do this stuff, just like everyone else.
Personally, I'd like to see something like this but with a more down-to-earth "here's what they told you and here's why it's wrong" lesson up front.
What is the drive behind this "get everyone everywhere to code!" initiative?
I just don't get it. You don't see this kind of thing in other industries, do you? Why are so many programmers intent on getting everyone from every demographic, rich poor, black white, interested not interested, into coding?
Forgive me, but it just seems like there's this misconception that everyone should know coding like they know algebra. I understand you people like it, so do I - that doesn't mean it's for everyone.
This comment may get downvoted, but I assure you it comes from a place of genuine curiosity.
There's a similar push for basic financial literacy in schools - most people are not going to grow up to manage a hedge fund, but that doesn't mean that they shouldn't learn about compound interest, debt, inflation, supply & demand, budgeting, equities, and business fundamentals. Managing your personal finances is a basic skill, and people who don't have it are severely hobbled today.
Many lawyers and civil rights organizations create "Here's the basics of the law" pamphlets, websites, and HN comments. Things ranging from who owns your IP when you moonlight, to what are your rights when the police pull you over, to what your responsibilities as a homeowner are, to what you should consider before getting married. These are not a substitute for trained legal advice, but they are hopefully enough to keep you from doing things that will cause legal problems later. And of course, 7th grade social studies in most American public schools teaches the foundations of the legal system and how laws are made and ratified.
Basic mental health is increasingly taught in public schools, along with what a healthy relationship looks like. A 16-year-old kid is obviously not going to be as good at regulating their emotions and perceiving those of others as a trained therapist, but particularly given the epidemic of domestic violence, it's been deemed important enough that we train our kids in basic psychology.
Everyone is expected to be able to write and communicate well, whether they're a journalist or not. We get decades of schooling on this, many colleges make "You will be able to write when you come out of here" a key selling point, and illiterate people tend not to do so well in modern developed nations. Journalists get more practice with this, but it's still considered to be a basic skill.
The progression for all of these is that the skill is initially associated with only a small number of practitioners, but eventually touches enough of people's day-to-day actions that the population requires a common base of knowledge to function. There was a time, a couple hundred years ago, when knowing how to read & write was not common knowledge, and was largely restricted to lawyers and clergy. There was a time, barely 50 years ago, when common financial skills were pretty much restricted to business owners (you could argue that among people below about the 80th percentile in education, financial literacy is still uncommon). Emotional intelligence is currently a very scarce commodity. But the pattern is that as a skill becomes more fundamental to daily lives, the basics of that skill start getting taught to a broad base of people.
I'm in my 30s and relatively new to choosing programming as a career. I didn't learn it because someone told me I needed to. I saw (and see) real social and business problems that I wanted (and want) to work on.
The problem I see with the push for everyone to learn code is that it doesn't just expect people to have basic literacy with computers, but wants large numbers of them to choose programming as careers, if only they could be convinced how rewarding it is (financially or otherwise).
To use your examples, everyone should have basic literacy in finance, the law, mental health, and writing. But not everyone should become a lawyer, a finance professional, a psychiatrist, or a novelist. In fact, for many of the aforementioned professions, you could argue we have too many people practicing them already.
Similarly, you could argue we don't need more coders who only want to do it because someone told them it's financially rewarding or is the right thing to do.
The point is that software literacy is becoming more relevant to everyday life. The point of a mathematics class is not to create professional mathematicians, it's to provide the concepts of mathematics to students. I'm not a mathematician, but I use at least trigonometry (mostly through polar coordinates) every couple of days to solve something a lot of other people would just let get out of hand.
I see software literacy as a tool which will become increasingly relevant as the world starts wanting more diverse software. You will start seeing florists, dentists, architects, and foremen writing programs to solve issues which are not complex enough to warrant a professional software engineer, or which require so much domain knowledge that they couldn't be written by most software engineers.
That's really what this is about, in my opinion; at least personally I wouldn't be so misguided to think that the point of software literacy is to churn out a country full of specialist software professionals.
The point is that basic understanding of programming is basic computer literacy. The gist is that if you can't automate with a computer, you aren't really able to use the computer as a tool.
One could argue that if you're able to use the computer to catalogue and enhance data, turn data into information (like visualizations) -- then you're able to use a computer. A lot of this can be done in Excel (or another powerful spreadsheet program). But then most will also tell you that spreadsheets are visual programming tools. And that you might need to do (and automate) transforms on data that still need some form of "actual" programming -- be that Visual Basic, PowerShell, Awk/Sh -- or "proper" programming in a "proper" language.
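As a small, hypothetical illustration of the kind of data transform that starts in a spreadsheet but is easy to automate in a "proper" language, here's a Python sketch that totals a column of CSV data (the sales figures are made up for the example):

```python
import csv
import io

# Hypothetical data that might otherwise live in a spreadsheet column.
data = "month,units\nJan,120\nFeb,95\nMar,143\n"

# Parse the CSV and sum the "units" column -- the spreadsheet SUM(),
# but now scriptable and repeatable over any number of files.
reader = csv.DictReader(io.StringIO(data))
total = sum(int(row["units"]) for row in reader)
print(total)  # 358
```

The same few lines work unchanged whether the file has three rows or three million, which is where scripting starts to pull ahead of pointing and clicking.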
Another side to "everyone should know how to program" (a side I find less interesting) is the growing need for software engineers, as we move to computerize more and more both public and private sector "systems". I think the lack of engineering talent is real, but that's not my main motivation for showing kids how to code -- just as I don't teach kids how to do math because I hope they'll work as mathematicians.
Math helps you think, helps you invent, and so can computers. If we want the next generation to be impatient for the kind of programs that we had, like Sketchpad, or even an environment like Smalltalk, then I think teaching them to program is a good start. It's also a great way to show them how crappy (and simplistic, as opposed to simple) many of the "apps" and games are.
For more, see e.g. "Doing with Images Makes Symbols" by Alan Kay.
I take it more as an update of the educational system. I recall a lot of stuff I learned back in high school that was plain obsolete in our postindustrial society. It made perfect sense for my dad to learn basic metalworking, since a lot of people used to work at factories back then, but now it does not.
You see it even more with the sciences. Anyone else find it interesting that the classes taught in a traditional U.S. high-school curriculum - algebra, trig, calculus, mechanics, combustion, fundamentals of life support - are precisely those needed to send rockets into space? And the curriculum itself dates from the 50s and early 60s, when our primary national priority was beating the Soviets and putting a man on the moon?
Meanwhile, the STEM skills that are utterly necessary in today's world - statistics, linear algebra, logic, computer programming, electrical engineering, chaos theory, ecology, molecular biology - are largely absent from the traditional curriculum.
But then, that's the problem with top-down planning. You end up with systems adapted to the prevailing conditions at their birth, and nobody willing to stick their neck out to change them.
I think with programs that are trying to encourage kids to get into any profession, you're going to highlight the benefits. There can be some good perks to being a developer, whether it's good salaries or flexible work environments. I don't think there is anything wrong with stating that, and I don't think the ultimate message is supposed to be "hey, you guys have to become software engineers now because it's so great". At least that is definitely not the message I try to give to kids, and I don't know anyone who does. In the end it's about empowering kids to make their own informed decisions.
Obama told people they shouldn't just use an app, they should build one. I've taken plenty of web dev and programming courses (including a couple of university courses), but still have doubts about my ability to build something production-ready on my own.
Encouraging people to play with code and telling them they're going to help fulfill a shortage of software developers are two different things.
If people were held back by not being able to build production-ready web apps, we'd have far fewer apps. We'd probably have had far fewer SQL-injection and other silly vulnerabilities as well -- but on average, broken tools are better than no tools. Broken tools can be improved.
For what it's worth, I still think BrandonM's comments are spot on. But I think it was also obvious that easily "dropping" files "in the cloud" was a great service to those who weren't inclined to, or couldn't, set it up for themselves.
You don't have to build something production ready. Part of learning about programming is learning how it helps you with smaller things.
You don't have to write a web server in Perl, you can write a script that saves people from editing some horrible text file in Word to remove extra lines.
Similarly, some apps are complex 3D games, and others are timers that let you know when your egg is cooked, or your break is over.
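The small cleanup script mentioned above -- fixing a text file full of extra blank lines, rather than editing it by hand in Word -- might look something like this in Python (the sample lines are illustrative):

```python
def squeeze_blank_lines(lines):
    """Collapse runs of consecutive blank lines down to a single blank line."""
    out = []
    previous_blank = False
    for line in lines:
        blank = line.strip() == ""
        if not (blank and previous_blank):
            out.append(line)
        previous_blank = blank
    return out

# A messy document as a list of lines, and the cleaned-up result.
messy = ["Dear Sir,\n", "\n", "\n", "\n", "Thank you.\n"]
print("".join(squeeze_blank_lines(messy)))
```

To run it over a real file you'd wrap this in `open()`/`write()` calls or pipe the file through `sys.stdin`/`sys.stdout`, but the whole job is a dozen lines -- exactly the scale of problem the comment above is talking about.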
> You don't see this kind of thing in other industries, do you? Why are so many programmers intent on getting everyone from every demographic, rich poor, black white,
You see diversity pushes in many industries - nursing, construction, education, and many many more. You don't see those programmes because you don't hang out in places where they're discussed.
> interested not interested, into coding?
People can't know if they're interested until they've tried it. If you're going to let people try coding it seems like a good idea to let them have a good, realistic, example.
> You don't see this kind of thing in other industries, do you? Why are so many programmers intent on getting everyone from every demographic, rich poor, black white
Can't say I completely buy this argument. When's the last time President Obama addressed the nation urging more people to go into nursing, or construction? Maybe 'diversity pushes' happen in other industries, but they don't seem to hold a candle to the intensity behind this one. Why, is my question. What's the 'greater good' that everyone sees behind an entire planet of coders? Why not push everyone to be doctors? Push everyone to learn virology? Surely, there's more utility there? Why code?
> People can't know if they're interested until they've tried it. If you're going to let people try coding it seems like a good idea to let them have a good, realistic, example.
I think your comment here is a generic statement, but I still have to mention - it's not completely true. For example, I knew very positively that I was interested in coding, well before I ever wrote a single line of code. I think it's very possible to have genuine interest in an industry, before you've ever worked in it. That said, I suppose I understand the benefit of exposing young people to another career possibility... But again, why coding is being pressured so much as opposed to other industries, I don't understand.
Why, is my question. What's the 'greater good' that everyone sees behind an entire planet of coders? Why not push everyone to be doctors? Push everyone to learn virology? Surely, there's more utility there? Why code?
This is a good question.
As microcolonel states elsewhere in this thread,
The point is that software literacy is becoming more relevant to everyday life. The point of a mathematics class is not to create professional mathematicians, it's to provide the concepts of mathematics to students.
I don't think anyone would disagree. So clearly one reason is that software literacy, computer fluency -- it's all part of life in the 21st century. But why coding, specifically? The cynic in me raises his hand to posit that some of it may simply be a result of an unfortunately wide sense that "computer stuff" all lives in one big bag. And in that big bag we have pervasive entities like Google, Microsoft, Facebook, and Apple. We also have venture capital, flashy IPOs, Bill Gates, your mom telling her friend that you're "good with the computer", the aloof and touchy priesthood of IT guys at the office, the technician who fixed your FiOS, someone who made a million dollars selling an app, your cousin's addiction to Candy Crush, Jesse Eisenberg sparring in his bathrobe over hundreds of millions of dollars, and the cryptic, cinematic, powerful -- albeit unrealistic -- representations of programming everywhere from "24" to "The Matrix".
So there's mystery and money and young people and bootstrapping and sudden shocking success. There's little about the trades, factories, apprenticeship, slogging through an expensive education, missed opportunities, irrevocable career paths, layoffs, or two percent raises.
I think most people know it's not all really that way. But it's a powerful undercurrent. And when the manufacturing jobs evaporate, or the giant bank lays off 10,000 workers, one's gaze can't help but be pulled over to that big bag. Even the President's gaze. Most people don't understand the bag, but part of it -- this coding thing -- well, we've heard that's something a determined person could even pick up on their own. Maybe even at home. And hey, maybe some of those people will be mysteriously, fortuitously successful. Join the information economy. It's a convenient place to lay some hope. It's a pleasing, if specific, answer to the complicated question of what to do with obviated workforce cohorts. We used to have "retraining" and "job counseling". Now we have "learn to code" as well.
I can't say it's a bad thing. Some will find a passion or predilection, and a new direction. It's good, but part of me believes the reasoning behind it is reductive, convenient, and wishful thinking at best.
No, I don't think the push towards greater coding literacy is about people becoming professional programmers. It's easy for programmers to forget something that is very special about what we do - we automate our work. Every programmer spends at least some time writing scripts to analyse trace outputs, unit tests to automate testing, preparing macros for their editor of choice etc. As this activity is very similar to our main activity, which is actually writing the code for a real product, we often just consider this to be part of the process. But the thing is that just about every office worker has activities that could be automated. Even if it's just learning how to automate MS Office by using VB scripts, or learning how to shift files around your filesystem based on rules using Automator on a Mac, you can greatly improve the efficiency of workers. Just think of the sort of analysis that knowledge workers could run if they just knew enough to write a webscraper!
This is the sort of literacy that the masses need. They need to learn algorithmic thinking, and the basics about how to get to an environment on their computer where they can actually get started. They need to know enough programming vocabulary so that they can Google for more information.
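A toy version of the scraping idea mentioned above can be built from Python's standard library alone -- here the "page" is an inline HTML snippet so the example runs without network access, but a real scraper would fetch it with `urllib.request`:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect the href of every anchor tag fed to the parser."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(value for name, value in attrs if name == "href")

# A stand-in page; a real scraper would download this instead.
page = '<p>See <a href="https://example.com/report">the report</a>.</p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['https://example.com/report']
```

That's roughly the level of programming vocabulary in question: enough to know that "parse the page, pull out the links, do something with them" is a thing one can ask a computer to do, and enough to Google the rest.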
Beautifully put. I do think "learn to code" is society's attempt to bottle up the magic of the tech boom. Maybe in the 60s it was "learn polymer chemistry".
I think there is a reasonable sense that coding is becoming an important skill for large sections of the general public, well beyond IT professionals. It's essentially an ability to turn the machines that make the world work to your own hand, and to your own purpose. That's something that is enormously powerful in all sorts of fields, and widespread knowledge is probably good for society.
Programming (especially open source programming) is also relatively unique amongst professional careers in its low financial barrier to entry - in theory, you just need time, shelter, and access to a cheap computer to become genuinely prosperous, so it does seem reasonable to be a bit evangelical to marginalized groups. Although this works alongside the slightly embarrassing knowledge that the actual demographics of programming are to one degree or another wealthy and white.
It's also important because knowledge of programming gives the citizenry, and even the political class themselves, the knowledge they need to make informed political decisions about how computers and digital technology should be deployed. It's difficult to know where the line should be drawn if you don't understand the potential of the technology, and it's very difficult to stand up and object if you do not understand exactly the nature of what is being suggested, because of the risk of being made to look a fool. It seems likely that the lack of discussion about digital issues is partly down to many politicians hardly being able to manage their own email.
> in theory, you just need time, shelter, and access to a cheap computer to become genuinely prosperous
AND the desire... I think that's what a lot of evangelists seem to leave out of the equation.
personally, I think learning code is almost identical to learning to play a musical instrument. It is a very solitary development and THAT'S what is the biggest barrier to entry.
a lot of people just don't like working alone on things...imo
Well, that is changing, because technology now allows us to communicate with each other from distant locations. There are websites like HackHands where you can code with a mentor simultaneously. So even the "loneliness" barrier to coding is dissolving.
I disagree. Although that technology is wonderful, you will still always need the kind of drive and determination to figure things out for yourself.
It's one thing being shown how to do something; it's another thing to work it out for yourself, in much the same way as studying maths or physics, etc.
No doubt, but that is then true about nearly anything - art, music, film, photography. At the end of the day, we are all alone, but that isn't an excuse to dissuade you from doing anything, especially not coding.
"Because computer talent has to become cheaper for the tech industries to have even better margins"
Anyway, the larger percentage of tech companies are now at a post-innovation stage, moving to small incremental updates and profit maximization. Star programmers who could build everything will start becoming legends.
The same way they pushed people to STEM to have a wider selection of cheap candidates from their top 5-10%. They only need the cream but to have enough of the cream they need to waste a lot of milk. The milk is all these people that go into an area without loving what they do or without having a lot of talent and determination. They are the ones that lose in the end, becoming the worker bees of the hives. Simple supply and demand. It also gets a lot worse for the good ones too.
I think you completely missed this part of the comment:
> The thing that bothered me, though, was that he didn't even think that he could do that, purely based on where he was from and what he saw around him.
There are many many kids at the primary and secondary school level, and especially minorities and kids from poorer backgrounds, who don't know they could possibly even get into this line of work. It's not like saying you want to become a doctor or a lawyer, which requires a ton of money and time for a post graduate education. They could get started coding now and relatively cheaply, they just don't have the right people directing them and showing them the tools to do it.
Having volunteered at the high school levels teaching kids html/css/js it's very clear there is a total disconnect between what these students could be doing with technology and the ability of our school system to provide it. What really upsets me about these sorts of comments is it really comes from a background of ignorance.
I don't mean any disrespect by saying that, but have you tried talking to kids at this level to see what the situation might be like in public schools that don't offer any resources for learning how to code? And if you had that experience, would you have the perception that we aren't failing students miserably by not getting them more comfortable with technology? If you have, then great, share your experience, but I find it troubling that people can write it off as simply "oh, coding isn't for everyone" when you really don't have much experience to draw from other than how you learned your own skills as a developer.
Clearly not every profession is for everyone, but we need to be giving more kids a chance, and our schools have not been up to the challenge outside of teaching the standard subjects in their curriculum that we've been teaching for hundreds of years. It's good to keep in mind that cheap computing and widespread internet access didn't really arrive until the mid-1990s. It's barely been 20 years, and it's going to take some time for our schools to catch up, but you first have to admit there is a problem.
In this case, this is a person who wants to code, so I'm not sure your point applies.
In the event of a person who's not specifically interested, I'd argue that very basic programming is useful for learning logic, getting a more realistic idea of how computers work (and don't work), and it's useful because of how much work in almost every industry is done on a computer.
I'm an MD, and I'm not a developer. Knowing some basic programming has helped me analyze data and automate a few tasks in the past, but more importantly it helps me identify which problems can be solved with software even if I'm not the one that builds it.
I'm not sure that I interpret the fervor behind this worldwide code push as simply targeted at encouraging people to develop a 'basic programming' knowledge. It seems like there's a far stronger drive to get people into this as a CAREER.
I hear what you're saying though, there's definitely benefit to knowing programming. My point is, it's no more beneficial than many other disciplines which AREN'T getting national attention.
Firstly, since Google/Facebook/miscellaneous mega-startups, there have been people with enormous wealth and power who got there through the 'engineering track', and love developing software, even if they don't do much of it any more. There are nurses who find their jobs intensely rewarding, but CEOs of big medical companies are not typically drawn from their ranks. So there are evangelists for code with powerful public platforms for their statements.
Secondly, the smarter of the SJWs realise that - in terms of diversity and so on - there's a supply shortage further down the pipe. So a disproportionate number of do-gooder startups/tech non profits are focused on teaching more kids from different backgrounds the basics at an earlier age, since that is a sincere concern among the people who found such enterprises. (Throwing money at such things is a cheap, convenient PR move for the likes of Google, as well.)
Thirdly, as others have pointed out, teaching somebody to code requires a light capital investment up-front, relative to the value of the skill. There's no way to become a doctor without years of med school; you can start throwing code around as soon as somebody gives you a refitted chromebook and tells you to type "print 'hello world'".
Fourthly, hype around the Valley and the tech industry today.
Fair point, but the net result has been a shift in basic computer education from "how to use Microsoft Office" to "learn a bit of coding" which seems like a huge step in the right direction.
I don't think of it as an industry. It's more like knowing how to read and write or how to cook or do first aid that we should endeavor to at least give kids a taste for what it's like. Computers are so ubiquitous now that it's the equivalent of Home Ec half a century ago. There's a lot of office jobs that we don't think of as traditional programming jobs, but would still benefit from having a basic skill in automating or scripting repetitive tasks.
One side effect from this is that the people who love it and have a knack for it could figure it out much earlier, and we'd probably end up with more, better full-time programmers.
I'm also intrigued by this. One of the responses below mentioned how other industries do similar things. Now let me give you my perspective. I have been semi-technical for many years, but my background is rather bookish: translation and publishing. More recently I've been working as a technical translator at a software company. Even though I had tried to learn PHP at some point, I didn't make much progress. But now I've finally understood programming and learnt Python. Let me tell you something: it took my work to the next level. For a professional programmer, what I do might seem easy, but learning Python has had a real impact on the way I work. I can now, for example, do things like extraction of common words, BUT filtered through a word stoplist and a spell-checking library. Proof-reading is super fast now. I can get all the adjectives from a text. And each day I learn more. I want to become a computational linguist, so I can offer new, different services (I feel I MUST do this, especially facing advances in machine translation). But I don't think all people must have the same orientation, so I don't know if I should say "all translators should learn Python". It works for me, but maybe not for other translators, for example, literary translators.
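The word-extraction task described above can be sketched in a few lines of Python; the stopword list here is a tiny illustrative stand-in for the real stoplists a translator would use:

```python
import re
from collections import Counter

# Hypothetical miniature stoplist; real ones run to hundreds of words.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "on", "in"}

def common_words(text, n=5):
    """Return the n most frequent words in text, filtered through a stoplist."""
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(n)

sample = "The cat sat on the mat, and the cat slept."
print(common_words(sample, 1))  # [('cat', 2)]
```

Swap the sample string for a whole translated document and this becomes a terminology-frequency report, which is roughly the workflow described in the comment above.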
1. You can start programming early. You probably can't (or shouldn't) do metalworking at 12 but you can program. There are few hard prerequisites. You can safely learn by trial-and-error (I'd say the crucial lesson is how to trial-and-error).
2. The "official" education path to programming is unsatisfactory to non-existant. You're much better off tinkering on your own as a teenager than waiting for CS 101 in college (or Pascal in high school if you're lucky).
I have a strong belief (but no hard data) this is worse than in other industries. The cheap excuse is "programming moves too fast for educators to keep up"; I also blame CS departments for denying that CS != programming and that most of their students came for the programming, not CS.
I have no doubt the above is heavily biased by my own experience. I'm factually aware some people start programming late, e.g. 30s, and quickly become very good at it; but I don't even understand how that works — it doesn't fit my model of the world.
I wonder if people who start late regret it and/or feel getting kids to code is cool, or disagree completely and don't see the point...
I see it as "Let's make it easy for kids to learn programming", so that those who find it interesting are aware, inclined and develop further interst, if they choose to.
I see it less as "kids developing an awesome Android app", and more as "you can do something simple and interesting with programming"... It's more similar to dancing or Taekwondo classes that kids can easily go and try out.
Where are you from where everyone knows algebra? I understand you people like it, so do I - that doesn't mean it's for everyone.
edit:
This is perhaps a bit too curt, but I don't understand how comments like this generate conversation - they would be better served by the poster googling "why learn how to code" or "why should children learn how to code" and responding to the specific arguments that other people have made rather than saying:
"Forgive me, but it just seems like there's this misconception that everyone should know coding like they know algebra. I understand you people like it, so do I - that doesn't mean it's for everyone."
in which you could easily substitute 'reading' for 'algebra' and 'algebra' for 'coding':
"Forgive me, but it just seems like there's this misconception that everyone should know algebra like they know reading. I understand you people like it, so do I - that doesn't mean it's for everyone."
If the same argument can be used for not letting women drive as for not teaching kids to code, it's because what you've said is essentially a no-op.
One of the reasons I'm a programmer now is because before the cold war ended, it wasn't even a question whether children should be taught how to program, and they started me on Logo and Basic when I was a lot shorter than I am now.
>Why are so many programmers intent on getting everyone from every demographic, rich poor, black white, interested not interested, into coding?
The mention of class and race here are disturbing to me, as if these are your candidate filters to decide who to teach coding to. It would never occur to me to mention wealth and race during a question about what people should be taught.
The straw man that I'm projecting onto you here is: "Why teach classes of people who are destined from childhood to be low-paid service workers anything other than grill, fryer, and point-of-sale register skills?"
Hm, not sure I see where I said it was for everyone?
To answer the bigger question, though, I'd say it's a premonition of sorts in regard to where kids will find the most success in the future (compared to investment). It's not that EVERYONE MUST CODE, but rather, if they're interested, they shouldn't be discouraged/should have help getting access to the tools they need to do so.
Further, I can't really speak for other industries but I think it's really cool to be a part of an industry that cares enough to want to give as many people as possible access to it. That says a lot.
At this point it's almost a literacy thing. Basic knowledge and breaking that fear of computers would be enough. At that point people can decide if they're genuinely interested based on knowledge instead of (probably) irrational fear. For some people it really is a matter of just breaking that initial intimidation factor. There's more than enough time wasted in K-12 to not put at least an intro class in there for a year...
I remember getting scolded in my 7th grade (required) home economics class because I had the pan's handle in a dangerous position while cooking eggs. The pillow I made was the shit though, and I did learn how to make muffins, so it wasn't a complete loss. But you've got to agree an Intro to Python/Java class would be a much better use of time.
And then there was that one time I broke my arm playing flag football in one of my countless years of "physical education". Teach kids diet and exercise for that "subject" to be remotely beneficial in my opinion, with an emphasis on the diet. That's lifelong knowledge. I don't play flag football anymore, and I still don't have full range of motion in my left arm (it's nothing bad at all, just kind of annoying on occasion).
Or those two years I spent in French. I remember my French name was Pascal. I can tell you my name and count to three but unfortunately can't remember how to tell you bye, or anything else. Everyone should at least be introduced to code like they are to algebra. I was introduced to woodworking, close to a decade of random ass activities in phys. ed., clay classes, jewelry classes, and more I can't even remember.
A semester, or better yet a full year, is more relevant in today's society. And frankly, it's a skill that will give a subjectively higher quality of life than whatever else they were considering. Breaking the initial intimidation factor, like I said, could be all it takes. Stupid not to do that when so many other almost-mind-bogglingly stupid things are required.
I know my 6th/7th grade cousins were amazed this year when I told them I was making an Android app. I told them they could too and there are plenty of sites to start, told them a few, but I know they didn't believe me. I think kids around that age would love making a simple HTML5 game or something. Accomplishing that at a young age would change how they think about tech for the rest of their lives. I know my cousins will never get exposure to that outside of school because their blue collar father (not saying anything bad, my dad is blue collar as well) has them playing sports year round - football, basketball, baseball, and soccer. Now that's awesome, but over the top. Those aren't lifelong skills the same way coding can be. I hate to even limit it to 'coding', it's knowledge that can help apply logic to all areas of life while potentially creating plenty of opportunities.
> I had the pan's handle in a dangerous position while cooking eggs. ... But you've got to agree an Intro to Python/Java class would be a much better use of time.
Even as a programmer, you still eat 3 times a day, right? There you get your utility. I think cooking and personal finance classes should be much more prominent. The thing that was useless for me was Chemistry. The subject I got the most out of was English (I am not a native speaker).
Pah, chemistry is just useless because you can't make fireworks any more without risking ending up on a no-fly list or worse ;-)
On a more serious note, I'd not remove anything from the parent's list of classes -- I'd even defend the breaking of his arm (in the sense that I'd rather have some kids break arms than have playtime/phys. ed. be too coddled).
It's really sad that he didn't feel he got anything from learning French. I suppose that's the price one pays for having "won" the language lottery (it was a little like that with German 70 or so years ago, I'd imagine). Perhaps French should be dropped from American schools and be replaced by a language that actually is more of a gateway into an alternative culture and history, like Arabic or something...
It's not so much that I'm asking people to become professional programmers, any more than a psychology teacher is telling her students to become psychologists or a math teacher is telling his students to become mathematicians.
Students should at least be aware of it, and understand it's not MAGIC and doesn't require you to have little friends and knowledge of hacking traffic lights in order to build a useful tool for yourself or others.
It is seen in other industries, they are just more localized. For example, it used to be the case that in coal mining towns or automotive industry towns, there was great pressure to get everyone working for the local coal mine or auto plant.
The difference between the web/tech and older industries is that the web is literally everywhere. You can get to work in tech anywhere there is a computer connected to the internet.
Coding teaches logic as a discipline. Logical children tend to move toward STEM careers. The intent isn't so that every child becomes a career computer programmer; only that coding is the cheapest, easiest way to teach children the foundations of logic they will need later to succeed in analytical pursuits.
Thing is, for people who understand its benefits (which are great), it's hard not to advocate. I remember the father of a friend in grade school giving advice to learn languages (spoken languages). He knew a few and had obviously witnessed the benefit for life and business. 20 years later, that was good advice.
That's a great anecdote, and you've brought up one of our primary hypotheses. A few weeks ago I attended a CoderDojo event in Virginia to deliver our first batch of Codestarter laptops, and it's always amazing to see the demographics. Mostly kids under 12, 50/50 boys/girls and this event was about 90% African American. Seeing the diversity at this age and comparing with the makeup of the professional software industry paints a pretty interesting picture. You can read about the event and see some photos here: http://blog.codestarter.org/post/92559298800/first-delivery-...
In this case, it's being compared to the normal demographics of people doing programming/CS related stuff.
The demographics here are pretty stark: the percentage of women, and the percentages of hispanic and black people in programming and other tech related jobs (and studying it at the high school/college level) are way, way below their representation in the overall population.
So "diverse" can be used (as another poster mentioned) to refer to "non-white", as in the States it's often a byword for that, but I think in this case it's being used as a comparison to the expected demographics.
As a much more off topic explanation of why "diverse" gets that use in US dialogue: the US has serious problems talking about the fact that racism and inequality along race lines are still incredibly prevalent; many people would prefer to think it ended after the civil rights movement in the '60s (and that is largely how it is taught in the public school system). Saying something along the lines of "I was impressed by the number of black people/African Americans who attended" or even "I was impressed by how many non-white people attended" isn't politically correct in some circles (and can open you up to accusations of racism), despite the fact that it mainly acts to highlight underlying inequalities along race lines rather than actively perpetuating and enforcing them. However, "diverse" is used as a byword to mean the same thing without actively highlighting any particular racial inequality, which is more palatable for some people. (In my opinion, more palatable mainly for the people who are uncomfortable acknowledging the problems or who would rather not address them, but that's more social commentary than an explanation of the use.)
Thank you, and everyone in the other replies, for the explanation, I have a better understanding of the situation now.
In western Europe, Belgium, we would use 'diverse' when there are many nationalities among the attendees (ie: Italians, Moroccans, Turks, African, etc.[0]). Note that I purposely gave southern nationalities to highlight how 'diverse' is also a byword for us: we wouldn't use the word if the attendees were Norwegians, British or German (we would use 'European').
[0] Although nowadays all those people are sadly put under the broader and less subtle 'Arab' or 'Muslim' label.
I meant diversity across all the CoderDojo events that I've seen. This one happened to be primarily African American, but in aggregate, the breadth of kids that show up to learn to code is really amazing compared to what we see in colleges and the profession in general.
Probably because the population they recruited participants from had a similar distribution.
"Diversity" as a slogan is somewhat misleading, because what these initiatives try to do is to counterbalance negative selection of characteristics like skin color, parental income etc which shouldn't matter in coding.
Also "minorities" has become a difficult term because of spatial segregation, demographics, blending, and even the decline of racial prejudices (which is a good thing) blurs the lines of who's the majority and who's the minority in a particular issue.
I think calling this group diverse is a slightly sloppy way of saying that this group would increase the diversity of the overall Computer Science community. Also this group was diverse in gender.
The problems exposed show that being a distributor is harder than it seems:
> The “developer mode” boot screen is scary and unhelpful.
Then install your own bootloader? You can do this by accessing the hardware (http://www.chromium.org/_/rsrc/1381990807648/chromium-os/dev...) and removing the write-protect screw.
Of course the chromeOS team is making sure it's not easy to bypass their security features, and it has the (desired?) side effect of preventing this kind of mass-scale re-purposing.
> Kernel updates wipe out the custom kernel modules.
Then ship your own kernel. Being a distributor means taking responsibility for what's running on your hardware. Make your own repositories, recompile the kernel (you can even automate it), and while you're at it, add all those packages you're installing in the script (hello, .deb package downloaded over http), and VERIFY THEIR SIGNATURES with apt.
About the kernel, you could also make sure the upstream kernel supports the trackpad, and then make sure the (intermediary) distributors pick up the associated patches, so it's a burden off your plate in the mid/long term. You could also probably pay Canonical (or anyone else) to do this job.
I love the C720. My dad got me one for Christmas, and one for himself. I used crouton to install Ubuntu, but I like your approach better. I work on the Intel XDK team and am pushing the C720 as a great development machine since we provide cloud-based builds for hybrid apps. You should look into the XDK for Codestarter: https://software.intel.com/en-us/html5/tools
Crouton is pretty nice, and I experimented with that approach as well. A big reason we chose dual-boot is to give kids a playground in which they can really explore their laptops. I fondly remember the joy of installing every operating system I could get my hands on, and a full, independent install of Ubuntu demonstrates that this is possible. Thanks for the tip on XDK, I'll give that a look!
One other thing that I've learned the hard way with Crouton is that it's too easy for kids to delete the Ubuntu partition. It's doubly true when several kids use it.
I have fond memories of when we got our Amiga 2000. Must have been around 1986 or so (which would've made me 7 years old). My oldest brother spent half a day installing the OS onto the gigantic 40 MB SCSI hard drive. When he was finally done, I spent about half an hour managing to bork the entire system, so he had to do a full reinstall... good times.
Still remember he told me that "it shouldn't have been possible to do that." -- by which he meant that while it wasn't much of a surprise that I'd mucked up the install -- it shouldn't have been possible to do it the way I did. Still don't remember what I did, just that it involved a lot of dragging and clicking. I doubt I'd be able to do the same thing today ;-)
Even in developer mode, ChromeOS provides a very limited set of tools, no package manager, and a mostly undocumented environment. It's not really appropriate to use as the base of a developer machine. Ubuntu, on the other hand, has all of the right things to make it easy to use and extensible for devs.
I think that in long term it will take much less effort to customize ChromeOS than Ubuntu. Initially it will be harder but after that it will require next to no maintenance. Changing the kernel sounds much more complicated than finding a way to install a package manager. Google might also be willing to participate in this in more ways than just helping set up the environment, which could be very beneficial for the whole project.
Since this _totally_ undermines the security concept, some disassembly is required (open the case, turn a screw, close the case), which is described in the ChromeOS developer documentation.
Yeah, I saw this technique, but it requires violating the warranty (opening the case), and we'd really like kids to be able to get their laptops fixed for free if something goes wrong. Still great that it's possible to do, and can make a fun project for kids with our laptops.
I have an Acer C720 myself but put ElementaryOS on instead of Ubuntu - runs like a dream and looks great to boot.
A good thing about the C720 is that the SSD is upgradeable fairly simply[0], though it's easy enough to stick an SD card in as a stopgap.
I picked up a refurb (not that you'd know to look at it, feels brand new) for £130. Wouldn't get an 8.5hr lightweight laptop for anything close to that elsewhere.
I can't say for sure unfortunately as my only previous experience of Ubuntu on a Chromebook was via Crouton on a Samsung which, being ARM, was somewhat limited in what you could do with it.
Elementary on the other hand I run natively via dual boot. It's nice and slick, and I much prefer Pantheon to Unity, but I can't really give you a direct comparison on this device.
I don't have a Chromebook, but I do have an HP netbook running an AMD APU. Elementary OS does run better than Ubuntu on it, but granted, it has a decent video card. I've no idea why it was so much smoother in regular usage, but it is!
I quite like the work you've done here! Most computers in a school are ill-equipped for programming needs.
I'm a bit concerned that this laptop is designed with an experienced programmer in mind. I don't think the *nix Terminal is friendly to noobs, considering it comes with no safety nets. Additionally, a teacher who is unfamiliar with Ubuntu is likely to have a hard time helping students. I tweeted a few more thoughts here: https://twitter.com/dshankar/timelines/497091774792228864
In its current state, perhaps this laptop would be better suited to college students who are already familiar with the basics? At the middle school/high school level, students & teachers need a friendlier introduction to the programming world, not the real & scary one that is modern-day Linux. At the college level, however, students need tools that will prepare them for the industry, and Codestarter packages all the tools one will likely use.
Thanks! We're targeting an age range of 7-17 years old, and distributing these laptops to kids that are attending learn-to-code programs like CoderDojo. For the very young, most will choose to boot ChromeOS and use online programming tools like Scratch, which is totally great. Installing Ubuntu gives more advanced kids a place to go further, with no limitations, and that's why we do it.
Another thing to consider is if kids did something bad from the command line to catastrophically mess things up, it's easy to wipe the drive and do a fresh install.
That itself is an invaluable learning experience. Knowing that breaking the software does not break the computer only helps pique a child's curiosity. (I first installed Slackware Linux when I was 12.)
Exposure to the command line at an early age never hurts, as long as it doesn't get in the way.
I've had some experience with middle schoolers and Scratch, as well as with my 10yr old. FWIW, my school district has access to iPads, Windows desktops, and Chromebooks in varying quantities and frequencies through the grade levels.
Once the middle schoolers were in the Scratch shell, it was all about Scratch. (Well, that and browsing the Scratch repos for interesting projects.) That would work just as well on Linux as Windows. Progressing to Python with this group didn't work as well on Windows as it would have on Linux, because the terminals were infuriating for both the teacher and the students. (Also, there would have been a ton fewer issues getting stuff installed.)
My 10 yr old started seriously using a shared Ubuntu desktop at 8, did some Scratch, then at 9.5 started on Python, and is now on Arduino. He's just fine with the terminal, for what little he needs to actually do with it.
When I was a kid exploring my brand new desktop, I broke it several times by deleting system files or wiping out my partition. It was a great learning experience to reinstall my OS, that also boosted my courage in trying things out.
So I really don't understand this drive for safety features. If you want to teach kids to program it kind of defeats the purpose.
You must be young. 20+ years ago all of us were using shells/terminals just fine. Have people already forgotten that everyone used text-based operating systems before GUI OSes came around? If I could learn basic CP/M commands on my dad's VT180 when I was 6, I think people learning to code can figure out Ubuntu.
Firstly, the POSIX shell is not at all unfriendly (terminal != shell), and introducing it earlier can only be beneficial. If you get scared by a shell, you probably shouldn't be in programming.
If a 7 year old gets scared of a shell, that does not mean they should not be learning programming. It seems reasonable to introduce shells early, but we should be aware of the risk of scaring off children with unintuitive interfaces.
If we managed to load games from cassette tape on a c64 (or a vic20), a modern posix shell should be a walk in the park. In my experience, a kid will learn any number of rote actions required to launch a game. And then later work out what those actions meant.
Completely agree. I got a commodore 64 and a 286 when I was 6. My dad taught me how to start my games in DOS so I could start those up by myself when I woke up early in the morning and they slept in. I started working through the BASIC manual that came with my C64 when I was 7 and if/then/else were some of my first English words that I learned. No need to dumb it down too much, kids will figure it out.
1) This shell is not required to play games, so that motivation is not relevant to this discussion.
2) survivorship bias.
I think starting with something approachable like Scratch then advancing on to a posix shell in Ubuntu is a great idea, but I take issue with the attitude that if a child is scared by a shell they should not learn to program.
Sure, I agree with both 1) and 2) -- I just don't see how "being scared of a shell" is a valid point. If the kid is given the thing and told to go wild, I doubt most kids would be "scared". Adults, maybe. But not kids.
Anyway, the kid doesn't have to know that the shell isn't prerequisite to play games -- until they figure it out on their own. My point was more that children are adept at overcoming obstacles that stand in the way of play.
> My point was more that children are adept at overcoming obstacles that stand in the way of play.
I agree with that point entirely, but that does not mean that all children will overcome all challenges. My point is that we should not give up on a child just because they struggle with a shell. The reason I think shells are not the best introduction is that they cannot be figured out by exploration. They require specific knowledge that cannot be deduced because it is fundamentally arbitrary. That information can still be found online, so it is not impossible to discover, but I think it is ridiculous to give up on a 7 year old because they did not like exploring a shell.
Is that a real risk? I mean, we aren't talking about breaking a leg. In my experience, as a father, the husband of a wife that works at a kindergarten and as an ex-child ... the best thing about being a kid is the courage you have in exploring new things. Take the opportunity of exploration away, in the name of safety and the effect may be the opposite of what you want.
And actually, for better or worse, many of us as kids learned programming in "unsafe" environments. So the onus is on you to prove that the alternative is better. Surely a study could be made.
For the driver problem: You should look into DKMS. That's a better stop-gap than holding back the kernel modules. What DKMS does is to re-compile your kernel module each time the kernel is updated. It can also move aside kernel modules with the same name in the base kernel. See http://linux.dell.com/dkms/manpage.html
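To give a feel for it, a minimal DKMS setup for a trackpad module can be sketched like this (the module name, version, and paths are illustrative; the module source and its Makefile would live in /usr/src/cyapa-1.0/ alongside this dkms.conf):

```
PACKAGE_NAME="cyapa"
PACKAGE_VERSION="1.0"
BUILT_MODULE_NAME[0]="cyapa"
DEST_MODULE_LOCATION[0]="/updates"
AUTOINSTALL="yes"
```

Then `dkms add -m cyapa -v 1.0` registers the module and `dkms install -m cyapa -v 1.0` builds and installs it against the running kernel; with AUTOINSTALL set, DKMS rebuilds it automatically whenever a new kernel is installed, which is exactly the update-survival behavior wanted here.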
I did come across DKMS during my research on solving the problem, and spent some time trying to figure out how to get it set up, but fizzled out in my attempts to get it working. Do you know of any more extensive tutorials on the matter? Real-world examples would also be hugely helpful. I couldn't find much. Thanks!
I really love Arch; even though I don't use it on a daily basis anymore, the community wiki is the best resource I've found for guides on how to do a large number of kernel-related tasks that work in most other distros.
It's good to see Minecraft mentioned--a friend of mine recently ran a two-day Saturday class on beginning programming in Python, using Minecraft on Raspberry pis.
Kids had a lot of fun making functions to do cool things in Minecraft. Gamedev is probably the most rewarding way to learn programming.
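To give a flavor of it, the kind of function kids write in such a class is mostly simple coordinate logic. This sketch is purely illustrative (in the real class the returned positions would be handed to the Minecraft API's block-placing call; here we just build the list so the logic is visible):

```python
# Beginner exercise sketch: compute the block positions for a staircase.
# Each step moves one block forward (x) and one block up (y).
def staircase(start_x, start_y, start_z, steps):
    """Return one (x, y, z) block position per step, rising diagonally."""
    blocks = []
    for i in range(steps):
        blocks.append((start_x + i, start_y + i, start_z))
    return blocks

print(staircase(0, 0, 0, 3))  # -> [(0, 0, 0), (1, 1, 0), (2, 2, 0)]
```

The payoff for the kids is immediate: loop logic on screen becomes a staircase in the game world.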
That's rad. Minecraft is such a great gateway for getting kids into programming, it's awesome to see educators taking advantage of that. Does your friend have their curriculum available anywhere? I'd love to take a look at how they ran it.
I don't think it's unethical (according to whom?).
There is a problem however - children need to be educated in matters of licensing, which is why I believe starting with an open-source editor makes sense. The real power with open-source is that as a developer you've got control over it and that it never dies. We wouldn't even have this discussion if we were talking about a proprietary programming language with no open-source clone.
So there's a risk associated with picking something proprietary - those risks are sometimes worth it, Sublime Text is surely a good editor and much friendlier than Emacs and if it dies or gets too expensive, then the lock-in is not that great, but children at some point need to learn about licensing and the pros and cons of the choices they make.
Unethical as in free software proponents believe that you ought to be able to run, study, modify and distribute software (kind of like basic rights such as food, water, shelter etc.). Of course you're free to disagree if that's ethical or not.
Imo, it's already too expensive for the age group they're targeting (7-17), though. But I agree with you that instead of just starting them with a proprietary text editor that nags you, it's better to educate them about licensing beforehand. Otherwise, it'll just perpetuate the notion that the 'nagging' dialog doesn't matter, until they find out when it's probably too late (as in, they're already locked in due to habit).
What if they agree that free software is ethically better? They might not (they probably don't care), but shouldn't we at least tell them what they're getting into just in case?
Yes, I totally agree that children should be educated in matters of licenses, as I've said. They need to understand the pros and cons. I also believe that they should be taught not to pirate stuff. Piracy is partially responsible for a lot of things, among them:
1) developers are not paid accordingly to their contributions and I'm thinking here especially of indie games developers that suffer the most from this
2) open-source is not as popular as it should be - for example, whenever a Gimp vs Photoshop thread comes up in which Gimp is said to "suck" by comparison, it infuriates me immensely, since I bet that a large percentage of Photoshop users are pirating it, and so Gimp doesn't get the credit or the attention that it deserves. Actually, piracy limits competition, since this is a problem for all Photoshop alternatives, not only for open-source ones.
3) monstrosities like DRM which end up punishing well behaved people, since DRM doesn't really work except for lock-in purposes, but hey, companies get away with it because piracy
4) closed platforms that mandate the installation of apps from app stores, because safety and piracy
In general I do not agree that piracy is like stealing, but it does real harm and children should be taught not to pirate, primarily out of respect for other human beings.
If you can come up with an economic model which will support developers doing the hard work of producing something with a well-thought-out user experience, feel free to suggest it. Do you know of any besides targeted advertising or corporate sponsorship?
How would you suggest developing a text editor with a well-designed yet powerful user experience for a beginner?
> If you can come up with an economic model which will support developers doing the hard work of producing something with a well-thought-out user experience, feel free to suggest it. Do you know of any besides targeted advertising or corporate sponsorship?
Don't be lazy. There are many, many projects that thrive with _zero_ cash flow, many of which are found in ChromeOS already. There is no reason for there to be a single drop of proprietary software on a laptop unless the user opts for it.
I'm going to hijack this comment to add that your opinion that money is the only reason to develop software is one of the worst things to ever happen to computing. If creativity and desire to help the world isn't enough for you, pass the torch to somebody else, because they will do a better job.
I imagine a world where closed source software is looked down on in disgrace as unsafe and nonsensical. Soon we're going to see more and more successful closed source software companies switch to an open source model. One day, hopefully, people will have the knowledge to realize we can code everything ourselves. 90% of our goods and services can be augmented or automated. We could and should have 3-hour work days and we should be retiring after 4-5 years.
In a world where we all had 3-hour workdays or some sort of guaranteed minimum income, my argument that "significant software projects require some sort of revenue stream in order to pay developers" becomes invalid. We don't yet live in that world.
> your opinion that money is the only reason to develop software
I don't hold that opinion. I hold the opinion that money is the only way to make developing software a full-time endeavor. Otherwise, I'd have to find some other way to support myself. As it stands now, I can make my living writing proprietary software and try to use the time I have left over to write free software. But I also have to go grocery shopping, spend time with my wife, help my family out with things, maintain relationships with friends, understand and contribute to policy discussions in my community, exercise, sleep, cook, eat, and many other things besides software that contribute to a healthy life. Without being paid to write software, I would spend as much time on it as I spend playing the Irish drum... which happens to have been 0 for the past month. I'm sad about this. As I'm sad that I can't spend all day working on OpenHatch. You seem to be asserting that creativity and desire are enough for someone to engage in significant amounts of labor, however enjoyable. I ask you: How do you pay your monthly expenses? How do you think most developers of software, libre, gratis, or otherwise, do so?
I'm writing this while running Chrome in an Xmonad window on top of Ubuntu GNU/Linux. I'm not ignorant of Free Software.
For absolute beginners, they should start on codecademy or something.
When really starting with coding, you want something with autocompletion and easy library access for lookups. For Python I think Ninja-IDE is really good: it covers all the necessary ground, is cross-platform, and has the Python docs built in - something IDLE or any terminal-based editor won't have.
For general purpose editors, Geany / Gedit, and Kate / Kwrite give you all the tools you need to build software in a multi-application environment once you graduate from the first stages. Having used both, I still vastly prefer Kate to Sublime for how easy it is to theme languages how I want, the vim keybinds, the huge number of plugins that include things like git diffs and documentation lookup built in, etc.
And then you can graduate to full IDEs when you want to just "do work", like MonoDevelop, Eclipse, NetBeans, IntelliJ, KDevelop, Qt Creator....
I don't think that's related to the substance of my comment (so you should consider this independent) but Light Table and Atom are probably the most beginner-friendly. vim is actually very easy to learn if you don't need to unlearn Emacs-style editing as well.
It is strange that you think it is unrelated to the substance of your comment. That indicates we are slightly talking past each other, because "What text editor should a beginner use?" is central to my comment. I'll outline how, since I seem to have been insufficiently clear.
Jck argued against the use of a particular text editor on the grounds that it was proprietary. I defended the use of a proprietary editor in general on the grounds that:
1) It is important that the editor a beginner uses be well-designed.
2) Good design comes only from skilled people spending significant amounts of time on something.
3) In order for a person to spend significant amounts of time on something, they need to have something paying their monthly expenses. That could be a company's salary, student stipend, family, or kickstarter.
4) The most reliable way that I can see to get the money for a project like a text editor to support the monthly expenses of its core developers is by charging for the license to use the software.
If there is a more well-designed-for-a-beginner editor that is not proprietary, then my point #4 is false or irrelevant because some project has figured out a way to support development without blockading users from the source code.
These are both fair points, and we'd like to offer free Sublime Text licenses to recipients of our laptops, I just haven't had time yet to explore the best way to make that happen. Once Atom has a competent Linux version, we'll likely switch to installing that instead, as it's open source but still very friendly for novice developers.
Huh, a bunch of us at Canonical use Atom with Unity, I think you might be missing the .desktop file if you're pulling it from github? Make sure you have one for atom in /usr/share/applications. I took a quick scan of the github repo and didn't see one, so you might want to just grab the .desktop and icon from the PPA package.
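For reference, a minimal .desktop entry of the sort the PPA package would provide looks roughly like this (the Exec path and icon name are illustrative and depend on where the package installs things):

```
[Desktop Entry]
Name=Atom
Comment=A hackable text editor
Exec=/usr/bin/atom %U
Icon=atom
Type=Application
Categories=Development;TextEditor;
```

Dropping a file like this into /usr/share/applications (or ~/.local/share/applications) is what makes the app show up in Unity's launcher.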
I've never used a .desktop file for atom and had no problems launching it until a recent update from the master repo. Guess I'll try installing the PPA package to see if that works.
If you're on OSX, there's Lime Text, which is Open Source http://limetext.org . Linux cross compile is apparently working and a Windows version is in the works as well.
Thanks for posting this, just picked one up. I've got the Samsung Chromebook (running crouton), which is great, but I would like to have something x86 as well...
Yep, that's not a bad value, but $30 extra is still quite significant to us, and the smallness/lightness and battery life of the Chromebooks is hard to beat.
Touché. Though, to be fair, it's hard to "accidentally" do it, as there is a subsequent confirmation screen. The problem is one of ignorance, upon first receiving the laptop, and not knowing how to boot into an operating system. Once that's been settled, then it's perfectly safe and convenient to use.
This made me think, at what scale is this project happening?
I'm guessing a lot of companies would be willing to give you discounts or perhaps even free machines (buy one, get one free?) to get the positive PR by being associated with this kind of thing.
We've currently funded 146 laptops over the last few months, but we'd love to be distributing thousands every year and are currently in conversations with a variety of companies to see if we can get sponsorships going. That would potentially allow us more flexibility around what machines we use.
You guys should work with CodeDay (http://codeday.org) and StudentRND (http://studentrnd.org) to help students at the CodeDay events learn how to code even if they don't have laptops!
Maybe sponsor one of the local events with a certain number so they can open up registration to kids who otherwise couldn't come due to the lack of a laptop.
I've heard rumors that this is the case (at least for the Acer C720), though I can't confirm it. I did call Acer to speak with a sales rep and learned that Acer's cost to produce is nearly $250 (if their information can be trusted), so it seems that someone is subsidizing the cost.
That Toshiba represents just about everything wrong with inexpensive laptops. It's large, heavy, has a terrible screen, slow HDD, poor battery life, and underpowered CPU.
Would Processing (processing.org) make a good addition? It has a lot of example 'sketches', with audio as well as graphical code examples. There are lots of example projects around, some ambitious[1], exhibit websites, plenty of teaching material available, and peer-reviewed/published stuff (which might help with school committees). It comes with an integrated IDE and docs.
Might act as a bridge between the simplified graphical environments and Java/Eclipse while developing code concepts further.
I know I can donate, but why can't I buy one through you? If you've already done the effort to develop the product, why not sell it to the "savvy developer looking for an inexpensive laptop to travel with", reduce your per-unit costs, and make some profit you can pour back into the project?
We'd very much like to, but we're only a few months into this endeavor, and we want to make sure that our product is really polished before we start selling them in quantity. We'll get there! In the meantime, if you're willing to put $300 towards a ready-made Codestarter laptop, I'd be game to put one together for you and have any proceeds go towards the program. If you're interested, email me: tom@codestarter.org.
Their main goal is to get these laptops into the hands of children that can't afford them. They are making the script to do this available for free along with a tutorial. A "savvy developer" could pick the laptop up at Best Buy, Amazon, or Walmart for $200 and run this script.
I'm actually doing this to my C720 right now. I've run it with crouton up until recently. When I first did it, the Ubuntu side could see the sound card and I could access USB serial ports to program my radios, but after some update to ChromeOS, that stopped working. So having a full dual-boot setup would be really good for me, and this script looks like one of the easiest methods I've seen.
I suppose they could do a program similar to what OLPC did, where you paid the cost of two devices: one would go to you and one would go to a child. I see he suggested a $300 price tag, which is like buying 1.5 C720s, so it would be a very similar scheme. I am, of course, keeping in mind that they also have to compensate people for their time to do the install. I assume it can't be automated, at least without opening the case and voiding the warranty.
I'm a mentor for a Lego robotics team and have been for years - this year's season will mark my eighth year of Lego Robotics. I love to help the kids learn and I even try to teach them (to some extent) how to code.
This looks great, and I'd much rather get four of these for my team than a new laptop for myself. However, it's crucial to have Mindstorms software, which only runs in a VM [0]. Can you test if it's at all possible (speed-wise) to run the NXT-G software in a VM? I don't care if it's a bit sluggish or if it takes five minutes to start as long as it's usable.
If so, this would be absolutely revolutionary. So many teams don't have enough computers or have to share one computer.
If you don't have the software or an NXT, I'd love to buy one from you guys, test it out, and evangelize it to everyone I meet. My kids would go crazy for another computer.
When I bought this Acer C720, it was an experiment; at first I was skeptical.
Now it's my main dev machine: it's so light I can carry it wherever I go, the battery lasts 18 hours (18 HOURS, CRAZY!) if I'm only reading PDFs, the 2GB of RAM and the dual core are just fine (though not tempting for running virtual machines), and with the SSD it performs better than more powerful machines with HDs. Oh, and I just love the 12" format.
(Of course I have thrown away ChromeOS, now I run elementary OS)
I, too, got an Acer C720 as an experiment, and it has also earned its way into being my primary machine. Not only for casual browsing, but development too.
I use ChromeOS in developer mode and chromebrew (http://skycocker.github.io/chromebrew/), plus installing some binaries and building other dependencies by hand. My current projects include node, ruby, java, and golang. Some days I miss having Sublime, but Caret is a mostly competent stand-in.
I would love to get one of the Dell Chromebooks, but they are next to impossible to find.
I think it's just perfect: the zero-bloat of elementary OS allows me to get my multitasking work done flawlessly, keeping firefox+sublime+atom+a-lot-of-terminals open without a glitch.
Also, the dual-core Celeron is fast at compiling things for elementary dev (of course, just don't think of running Gentoo on it), while moderating the desire to play games.
About the 2GB of RAM: as devs we are used to systems with 8/16 GB today, but if you can accept that RAM is a finite resource, it isn't an issue.
Anyway, I try to preserve my SSD, so I keep my ~/Downloads folder in RAM. At first I also disabled the swap volume, but the system just froze too often, so I re-enabled the swap, reduced the swappiness, and plan to move it to an SD card.
And trust me, keeping the Downloads folder in RAM is one of the best things you can do to reduce the bloat of files in a system: you learn to copy away the things you really need and leave the others there; at the next reboot, they'll be gone.
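For anyone who wants to replicate this setup, it comes down to two small config tweaks — a minimal sketch, assuming Ubuntu-style paths (the tmpfs size and swappiness value here are arbitrary illustrative choices, so tune them to your machine and username):

```
# /etc/fstab — mount ~/Downloads as a RAM-backed tmpfs (contents vanish on reboot)
tmpfs  /home/YOUR_USER/Downloads  tmpfs  defaults,noatime,size=256M  0  0

# /etc/sysctl.conf — swap less aggressively to spare the SSD (Ubuntu's default is 60)
vm.swappiness=10
```

Then `sudo mount -a` and `sudo sysctl -p` (or just reboot) to have both take effect.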
This was a huge problem for me as well. If I couldn't apt-get something, I was stuck. Also, it was hard for me to install things locally since my directories were all mucked up from the hacky installation. It ended up being more trouble than it was worth.
The Intel Celeron processor is what finally made me interested in the latest Chromebooks. For development, ARM is still a pretty painful thing to have to deal with.
I think crouton is a much better-integrated experience than dual booting. You can set up the Ubuntu chroot exactly the same way you have in this, but you don't have to reboot if you want to watch Netflix or use Chrome OS. With the crouton integration extension (synchronized clipboard, open links from Ubuntu in Chrome OS), Chrome OS will feel like "just another app" (even though Ubuntu is the "app"/chroot).
Another benefit of using crouton (with an SD card) is that if the user accidentally hits space during the boot warning and developer mode is disabled, they can simply re-enable developer mode and nothing is lost in the Ubuntu chroot. Unfortunately, the C720 has an extremely slow SD reader, so you'd probably want to go with the Dell Chromebook 11 (same processor, ~90MB/s SD reader, better battery and build quality, $70 more) if you wanted to go this route.
It is very nicely integrated, and that may very well be the best method for a more experienced developer. For our recipients, the simplicity of the dual-boot setup is nice and makes it easy to understand what's going on under the hood.
Also, I totally faced the same problems, which is why my Chromebook is basically bricked now - when people press space, all of the changes are erased... there needs to be a more user-friendly way to make this process work. I'd be interested in collaborating on this process, perhaps - shoot me an e-mail at zach (at) nycda (dot) com
The offline editor also lets you import extensions, which means that if you're adventurous, you can hook up Scratch to your NXT, your Arduino, your LEAP Motion, your Sphero, your Kinect, anything. That should be pretty exciting for new coders. A friend of mine's speaking about this at the Scratch conference now-ish.
One of our SDK devs broke his MacBook Air and got one of these as a temporary replacement. It seemed to work really well; he used it over a week-long sprint. Wish they came with more RAM, though.
The original Acer version had 4GB. I tried to buy one a month or two after they were released, but the 4GB versions ended up being sold out because of the switch to the current 2GB version. So I never ended up getting one.
Thinkers also write code, but they have a perfect understanding - breadth and depth - of the application, the environment, the tools they use, the purpose...
They have full autonomy and initiate things.
They have passion and basically do not need any formal education in CS.
Doers receive requirements and implement them using the framework they have been told to use.
Most people who have been told to do CS because it is financially rewarding or a good job end up in this category. It is any-office-job for them.
I have the Acer C720 and installed Linux on it. It is an excellent development machine for what it is. Cheap but sturdy, light enough to carry places that I'm afraid to bring my Chromebook Pixel, cheap enough that I wouldn't be too bothered if I were to lose it instead of the Pixel, and fast enough to do most of what I want.
Are you sure you won't run into limitations of 2GB RAM? That's the thing that worries me most about these Chromebooks, if the SSD is indeed expandable. I routinely run over 2GB on 32-bit Ubuntu with regular office and internet browsing tasks; sometimes over 4GB.
I'm wondering: technically you're supposed to buy Sublime Text. Are you just running the (infinite) trial version of Sublime, are you purchasing a license for every laptop, or do you have some kind of sponsorship from the guy behind Sublime?
I'd love to offer free licenses of ST and am working on a way to do that. Until then, the trial will suffice for a lot of kids, and pre-installing it means they can easily try it out. But yes, a better long-term solution is needed.
Everything was going fine UNTIL.... one accidental press of the space bar and you wipe everything out! Yikes. :) Still looks great, but something tells me this would be in my kids' hands for about 3 days until everything accidentally got wiped out.
To be fair, pressing SPACE gets you a 2nd confirmation screen that must also be accepted, before your fate is sealed. And even then it just places you back into normal ChromeOS mode. The installer can easily be re-run to get you back to Codestarter-factory-settings, and is actually a pretty nice learning experience all on its own. But, yeah, those are all rationalizations for what is a non-ideal experience. I'm hoping to find a good solution to the problem.
But what about all your saved work? Reinstalling and losing everything could be a real tear jerker for kids. In fact, it's unlikely they would start over, possibly resulting in giving up on it completely given the emotional side effects.
Maybe the solution should have a cloud backup option built in that uses Dropbox or something to clone all your data. Then at least you are just a couple-hour restore away from being back online.
The mechanisms to completely wreck Windows are simpler than the same process on Linux.
At least on Ubuntu, you need to enter a root password to delete /usr. On Windows, until fairly recent versions, deleting /Windows/sys32 was as easy as a few Explorer clicks.
But even then, you can still just delete a few arbitrary DLLs and wreck the system all the same.
Create a guest login for the kids and they can hardly mess up anything.
There's a big difference between hitting space bar and OK and losing everything, and finding a way to delete the windows system folders, considering they are usually hidden to begin with.
Sorry for my lack of knowledge, but will this leave ChromeOS side-by-side with Ubuntu? If so, is this because it's technically not possible to only run Ubuntu, or because of other (legal, warranty, etc.) issues, or simply by choice?
Our installer will leave you with the ability to boot either ChromeOS or Ubuntu. It is possible to wipe ChromeOS completely and boot just to Ubuntu, if that is your goal.
Why must it run Ubuntu? Especially if you're gonna turn around and make Chrome (not Chromium) the default browser anyway?
Is it for Minecraft (which, by the way, encourages a construction mindset, not an engineering mindset, since Minecraft takes numerous liberties with the laws of physics)?
That seems like the only good reason, because you can already learn to code just fine on Chrome/Chromium.
Coming from an almost totally Windows background: how restrictive is 2GB of RAM going to be in Ubuntu? Because I'm not even sure a Windows machine would function on 2GB these days, haha.
For browsing the web and simple development tasks, 2GB is sufficient. More is better, but if you're working on a budget, then 2GB will get the job done.
Yep, and still totally possible by just booting into ChromeOS and firing up Chrome. But we want these kids to have the opportunity to explore a full operating system and see what makes the machine tick. It's also really important to us that kids can learn to program without a full-time internet connection (not all of our recipients have internet at home), and Ubuntu allows for that.
I'll probably get a bunch of hate for this, but I honestly think this is a bad idea. My main concerns are the phrases "It must run Ubuntu", "It must have Google Chrome as the default browser", and "It must support Minecraft", and I have minor concerns with the solutions to the other statements.
* Ubuntu: why Ubuntu? Because it is _popular_? That's a bad reason, and it's magnified by the fact that this is supposed to be a teaching tool. How can you justify teaching kids to do something because it's popular?
* Must have Chrome: I read this and laughed. As a fairly large supporter of the Free Software Foundation, I am frankly appalled that you want to use Chrome when there are many better and freer alternatives. There is, of course, Firefox, which is free, but if your real desire is the WebKit rendering engine or V8, then there is always Chromium or DWB.
* Minecraft: I play Minecraft, and I think it's fun. I write mods, and I run servers. But to include video games on an educational computer is repulsive and you should be ashamed. So what if kids like it? Kids like refined sugar too; let's just give that to students as a way for them to enjoy food.
Some other points:
The wide variety of programming languages is good, but the list you provided is noticeably lacking C (the most widely used language) or any systems language. Programming is about much more than stupid hello world programs and writing silly kiddie games, and should be treated as such. I also noticed a lack of purely functional languages, which bothers me more than I can effectively express in words.
The point about the lack of internet is a good one, but you completely missed the existence of man pages, and virtually all compilers work offline as well.
Having a great editor is fantastic, and I use sublime text (I even paid $70 for the license!), but as many people have said, you shouldn't. That $70 is much better spent on better hardware for the computer, and shipping with unlicensed software is wrong. Vim and Emacs are both more than capable of doing EVERYTHING sublime does and are much lighter weight, and free.
Including the paragraph about the 'custom sidebar' is laughable. Aside from the fact that I dislike Unity and that the sidebar can be configured by students already, I don't think it's appropriate to coerce students into using the software YOU want them to. That flies so hard in the face of the Unix philosophy.
* The function keys are easy to remap. I suggest looking into sxhkd as a replacement for the bloated tools you listed. It's free software and can be found on GitHub; it is written by the same author as bspwm.
* Developer mode screen: so reformat the laptop. I did it with Arch and it works fine and has no 'scary screen', as you put it.
* Trackpad support is not the responsibility of the kernel, and a kernel update will not in fact remove support. This problem can easily be solved by distributing modified versions of the apt-get remote upstream lists.
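For anyone curious what the sxhkd suggestion looks like in practice: bindings live in a plain text file at ~/.config/sxhkd/sxhkdrc. A minimal sketch (the keysym names and the xbacklight/amixer commands below are illustrative placeholders - check what your keyboard actually sends with `xev` before copying these):

```
# ~/.config/sxhkd/sxhkdrc — example hotkey bindings
# brightness keys
XF86MonBrightnessUp
    xbacklight -inc 10
XF86MonBrightnessDown
    xbacklight -dec 10

# volume keys
XF86AudioRaiseVolume
    amixer set Master 5%+
XF86AudioLowerVolume
    amixer set Master 5%-
```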
My final remark is that every time I hear someone refer to themselves as a 'coder' rather than a programmer, I die a bit inside. Please take this profession seriously and don't encourage what are commonly known as 'script kiddies'.
Let's remember that Codestarter is giving these laptops to kids who are basically just learning how to use computers.
1) I think Ubuntu is a great choice as a first Linux distro precisely because of its popularity: it means there is a large community of people on the Internet who can help, and there is already so much information written about the common issues one might encounter.
3) Because gaming is fun, and Minecraft has also shown its capabilities as an educational platform, so I feel it's always better to provide a way of learning that is also fun. Kids learn by playing all the time, and when we can slip a few educational aspects into gaming, all the better.
4) I don't think introducing things like C or functional languages would do much good when teaching kids to program. Yes, programming is much more than just "stupid hello world programs and writing silly kiddie games", but languages like Python and Ruby are completely valid choices for writing real stuff, especially in the web app world (think of Django and Ruby on Rails). Also, it is easier to get started with languages that provide a REPL to play with, without having to first learn how to compile stuff just to get basic things done. It lowers the barrier to entry.
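To make the REPL point concrete: a complete first Python program needs no compiler, no main() boilerplate, and no type declarations, and each line can be typed interactively with an immediate result (a purely illustrative snippet, not from any particular curriculum):

```python
# Each of these lines works on its own in the Python REPL:
greeting = "Hello, " + "world!"      # string concatenation, no ceremony
doubled = [n * 2 for n in range(5)]  # a whole list built in one expression

print(greeting)  # Hello, world!
print(doubled)   # [0, 2, 4, 6, 8]
```

Compare that with the edit-compile-run loop a C beginner has to internalize before seeing any output at all.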
5) I don't think man pages are such a great resource for people who are just learning stuff they have never dealt with before. They are a really compact and even a bit cryptic way of presenting information (and I'm not saying that is a bad thing - once you learn them, they are great - but they shouldn't be a priority when learning programming).
6) I don't know how often you have seen people try to start using vim or emacs? When people struggle even to exit the application, it's not the most welcoming experience. Again, having editors like Sublime Text or Atom lowers the barrier to getting started, because you can focus on the most important part - learning how to program - instead of wasting time learning your tools, which you can do later.
Ubuntu's popularity is actually a benefit - popular things have more resources, both in terms of software/documentation, and in terms of eyeballs that can help out when things go south.
In addition, Ubuntu is a good choice because its target audience is naive users. If you want to be a hardcore developer, you can use Ubuntu, but you're not its target audience. It also has official support lasting for years on appropriate releases, which not all distros offer.
You seem to be personally offended by a lot of personal choices - for example, calling preinstalling text editors 'coercion'. I wonder what this 'coercion'-free system you're imagining looks like? Is it just a boot screen that says "insert SD card with distro installer of your choice"? Then you can install whatever you like, with no-one pushing any particular software on you.
Except it's often exaggerated. I doubt Ubuntu is even the most popular distro these days. Also, lately Ubuntu (or rather Canonical) diverged from the global Linux community with efforts like Mir and so on. So they are quite controversial, and using them as a recommended distro for new Linux users is now more questionable than before.
As for popularity, whenever I encounter projects on the web, they almost always have an ubuntu .deb (usually specifically ubuntu and not debian), with redhat/centos being in second place. It's not as common to see installation sections for other distros. At a conference, when a speaker does the who-uses-what-distro question, more hands go up for ubuntu than others. I'm not sure which distro you're thinking of when you suggest others are more popular, to be honest.
In any case, the OP's point was that choosing on popularity was a bad thing - selecting another popular distro in place of distro X would not change that line of reasoning. I disagree, for the reasons I gave: more willing hands to help out, more appropriate resources available. Given the OP is an FSF fan, maybe they go for gNewSense... which doesn't have much mindshare. "I'm having problems with my gNewSense install, can you help?" isn't going to hit as many resources as a popular distro (and on a tangent, why would you ever name your distro a homophone of the word 'nuisance'?)
> I encounter projects on the web, they almost always have an ubuntu .deb
Ubuntu got a lot of hype in the past, and Canonical put effort into PRing it. However, that doesn't really mean it's actually the most used; I never saw any conclusive studies on that. It surely is the most PRed/hyped - or rather was, until recently. That hype has cooled off somewhat lately because of various controversies like Mir. Mint is commonly perceived as the most popular distro these days (I'm not a Mint user, for the reference; I use Debian), though again - it's hard to prove globally.
I don't equate PR with the actual size of the user base. It's probably more proper to use the term "most used" than "most popular" for this kind of evaluation. Making a choice based on PR / perceived popularity works to some degree, but it's not a precise method, and therefore it can be disputed.
> What do you think is a more naif-friendly distro?
I'd recommend Mint, or maybe some KDE-centric distro for new users - openSUSE, for instance.
Mint is popular, but its popularity is largely because it's basically a skinned, traditional-desktop version of Ubuntu - it's binary compatible, so the various .debs advertised for Ubuntu also work on Mint. So in terms of resources available, the two use the same resources (Ubuntu PPAs work fine in Mint, for example).
But in terms of user experience? Depends on what you want. I prefer a traditional desktop (Debian/KDE myself), so I'm not fond of Unity and loathe guh-nome 3. But for a naive user just learning how to deal with computers, Unity might be good - it's pretty simple, and works the same on touchscreens or with a mouse. That would be something worth testing, actually - do the kids using these laptops work better with a Unity-style interface or a traditional desktop? Which one allows them to use their computers more effectively?
> do the kids using these laptops work better with a Unity-style interface or a traditional desktop? Which one allows them to use their computers more effectively?
Hard to say. I personally think KDE is better for both purposes (experienced users and beginners). Unity style UI is too limiting and I don't like the direction it takes.
> Mint is popular, but its popularity is largely because it's basically a skinned, traditional-desktop version of Ubuntu
It's more than that. Mint is also not managed by Canonical, and therefore they are distanced from related controversies. They are sticking to the global community in choices like Wayland and so on.
The Unity shell is very heavy and unlikely to perform well on a Chromebook. It's also very much non-standard in terms of APIs, is not a good cross-distro citizen, and will soon diverge even more from the base as Canonical skips the Wayland project for their own purposes.
Contrary to your third point, having Minecraft on the machine is brilliant.
Kids are playing Minecraft whether you want them to or not. They'll find a way to play it on a tablet, on an Xbox, PlayStation, iPod, what have you. Having it on this particular machine gives them an incentive to be in close proximity to relatively fun coding tools. In other words - come for the Minecraft, stay for the coding.
It also opens kids' minds up to "how was this game made?" If you juxtapose the game with the tools that teach you how to make it, kids will be exponentially more likely to invest time and interest in those tools. Plain and simple. Not to mention that Minecraft is inherently beneficial for that creative, engineering mindset - seems like an obvious, fun pairing.
I came to respond about the Minecraft point, but you've already said what I would have. I think it's folly to presume that learning is better when all the fun is removed. Thanks!
> Some other points: The wide variety of programming languages is good, but the list you provided is noticeably lacking C (the most widely used language) or any systems language.
Er, perhaps I'm being thick, but doesn't Ubuntu ship with a C compiler anyway? All three provided editors have C modes. It's there if you want it, or if somebody wants to teach it.
Besides which, you seem to have just completely missed the point of the whole enterprise. It's a machine for primary/elementary education in coding, not university level CS101. The first language mentioned is Scratch, for crying out loud. No, systems programming is not relevant for this particular use case.
When you get all sniffy about "script kiddies" and "silly kiddie games", does it enter your head that you're talking about 10 year olds? There'll be plenty of time for pointer arithmetic and kernel patches later.
There's a short edit window; annoying, but it prevents aging threads from being defaced or comments being changed to misrepresent responders so I guess one has to learn to live with it.
Google's entire purpose for developing chrome was to push the internet forward with better technology. By running chromium, you not only get this (since chromium gets certain features before chrome), but it is also free software.