
I think Atwood is right, PCs are on their way out simply because with tablets there is so much less that can go wrong. He mentions tech support in the article: THAT is why they will win, regardless of whether or not they will fully replace PCs, or whether or not they are better suited to writing emails. Millions of people who can't be bothered with anti-virus updates and the like will just find it a much more pleasant experience: "it just works" indeed.

(I'm a Mac user and I find myself spending a lot less time on tech support type issues than back in the day when I was using Linux and Windows, still it's hard to deny that iPads are currently the epitome of simplicity, compared to their power.)

And that's what I'm worried about: "No user serviceable parts inside". I'm concerned the very reason tablets will take over the PC market will also mean that tomorrow's kids' experience is very different from, and I would argue poorer than, my experience when I got my first computer, a Commodore VIC 20.

Ironically, my VIC 20 also was very user friendly. You turn it on, it's on. You put in a diskette -- well, it didn't load automatically but making it load was a very simple incantation, and once the game started, it was started. Things were simple and worked, for the most part, quite well. No tech support needed. But the BASIC shell was there right at your fingertips -- hell, the manual came with example BASIC programs.

Quibbling over the merits of BASIC aside, it was a simple experience but it was also tremendously open.

Things have gotten much more complicated since then; network connectivity in particular introduced previously unknown threats to users, so of course this is a simplified comparison. Still, I wonder: is giving up the freedom to tinker with the system really the price tomorrow's consumers will have to pay in order to buy simplicity?

And if you're thinking about replying with links to IDEs running on the iPad: no, something that runs in the cloud and has no direct access whatsoever to the internals of the machine does not qualify as a modern-day replacement for this experience, no matter how sophisticated.




I think the whole Tablet vs. Computer question is pretty dumb in the first place. It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.

A tablet and a PC simply correspond to two totally different uses of media. A tablet is consumption-centered, whereas the PC is able to produce complex goods and services that can then be consumed on a tablet. Will the tablet push PC usage down to a very low level in some areas? Probably! Will it replace a PC for every usage? Are you kidding me: Development, 3D Effects, Advanced Audio Processing, Advanced Photo Processing. That's just as stupid as saying that Photoshop is dead because Instagram can do sepia...

PS: I will absolutely never buy a tablet, I fight every day to stay on the creator side of society, with a tablet I would just give up!


It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.

I wouldn't call it a war, but Atwood (who, by the way, is neither an evangelist nor a journalist with an agenda) makes the point that tablets are about to take over large shares of the PC market. And my point was that this way, people (non-hackers) get even further removed from the "close to the metal" experience that I and others in my generation used to have.

With all respect, I think you and dozens of other posters in this thread are missing the point by "defending" the notebook and saying it won't get fully replaced. Of course it won't, but that's not an interesting question. The interesting question, in my mind, is: how can we avoid the TV-ization of computing? How can we make sure future generations don't equate a computer with a consumption device rather than a productive device? How can we make sure they can find out that this "magical device" isn't actually driven by fairy dust? Whether or not some share of the population still uses notebooks doesn't matter very much at that point, when the default for the millions and billions of non-hackers out there is a tablet.

Fortunately there are initiatives like the Raspberry Pi. It's telling that one of the main initiators of that project, David Braben, is also an old-school guy who grew up with computers in the 80s. Or so I'm guessing -- at any rate, he wrote some of my favorite 80s games, Elite and later Frontier. But I fear that if you plot the Raspberry Pi's sales curve onto Atwood's graph, it won't look anything like the iPad's.


Don't worry, you're full of respect ;).

It's maybe a philosophical question, but who doesn't have an agenda? I mean, when you buy a device, don't you want to convince others that you haven't bought a stupid toy or a useless object? Everybody has an agenda; even someone who just buys an iPad has an incentive to convince others that his device is great. The only ones who don't have an agenda are the ones who genuinely don't care.

I have an agenda: I am trying to push people to see computers as something other than a tool to consume media and slack around with. But given the current design of tablets, especially Apple's iOS devices, it's getting really hard to see them as anything other than a "pure-consumption" product.

To continue on the philosophical level, I think humans deep down aspire to do very little while receiving a lot of pleasure. The most productive of us have managed to channel this desire towards building systems that, once they are working, will provide us with a lot of rewards for little additional work (the startup!). In a way the self-driven programmers have achieved a Post-Slacker era in their minds, and god knows how hard it is to keep that up. The iPad/iPhone/iPod Touch/Android phones are enablers of the slacktivist part in every one of us, and that's one of the big reasons for their success.

Sure, the Raspberry Pi is great, but, again, it's in the realm of post-slacktivism and thus it will probably NEVER be popular. That doesn't mean it won't be successful.

I hope this comment will find interested minds!


I think tablets will replace "PCs". But I think it won't be as bad as you think: you'll still have your dev environment, your mouse, keyboard, and touchpad all working in harmony. Here's a snippet from another post I made (discussing mobile vs desktop OSes; hope it provides some alternative viewpoint):

"I'm pretty sure this is the direction all OS's we will be using are going to end up in the coming years. Some people will cry and yell and holler that there is no way something can be made for both keyboard, mouse, and touch screen. I see this as simply a failure of vision. That's absolutely the way things are going to be going, and it's coming sooner rather than later. (I actually think the OSX launchpad is pretty close to allowing this implementation) Devices are getting smaller and smaller. I use a Mackbook air. but guess how it gets hooked up at home? That's right, wireless keyboard and mouse, nice big monitor, I never see the actual computer. It's only a matter of time before my laptop gets replaced by a tablet that has comparable hardware specs. The OS allow for my normal desktop interfaces, along with a nice touch screen interface. I'll use it to pick out movies to play on my TV, from my couch. I"ll use the same device to write code at my desk (with big monitor and keyboard). My kids will use it to play video games, both mobile, and on the TV. I'll use it to send email in the backseat of a car moving at 65mph. (As a wireless comms guy, I fully appreciate the technology it takes to perform that last action.) But the bottom line is, It's going to happen. Sooner rather than later."

I guess my point is, mobile and desktop can blend perfectly. Those that don't need a monitor or keyboard won't get one; devs and graphics guys will. Technology will keep shrinking the size of processing power. Could you ever have imagined something as small as an MBA being a full-on computer?

Last bit, and my only reservation about this whole "movement": I hope they keep it more open, more like OSX, versus an overly walled garden à la iOS. But the signs are there; I think it can be done, and done very well, it's only a matter of time.

edit: It seems your biggest problem is that these devices are built for consumption rather than creation. And I'll agree, that's my biggest reservation about the mobile concept; I guess I could have been clearer about that. I'd like my iPad (not that I actually have one) to be more like OSX, where I can hook up a monitor and keyboard and go to town on my OS, while being able to switch into "launchpad" mode while mobile.


I'm interested. I agree.

Apple's devices (and other embedded devices) will never replace PCs - and exactly for the reasons you state. Thus the Raspberry Pi and the PC (the PC platform) have a bright future.

Apple's devices will live until Apple EOL's them.


Well, things will need to be produced somehow; I imagine this will still be done with some form of computer. Whether or not you keep the keyboard has little to do with the "freedom" of the device.

The question is what the difference will be between the consumption devices and the production devices.

Maybe the devices for creating content become specialized pieces of equipment, priced out of reach of the general market, and you get to touch one for the first time when you turn 18 and start your computer science course. Or perhaps, with the increasing viability of virtual machines, you simply download a program that gives you all of your power tools, but inside a sandbox, so if you mess it up you can just re-image and go again.

There will also be, as you suggested, "hacker hardware" like the Raspberry Pi. I think the key for these is to make sure the costs are low and that they are available to kids in school.

The "hacker spirit" can overcome many obstacles. Hell in the 70s nobody had a computer at home and the original hackers used to break the law by breaking into companies computer systems via telephone lines just to play around with them.


Thanks for this optimistic reply. You're right, the "hacker spirit" has proven time and again that it can overcome any obstacle. Those who explicitly seek out open hardware will always be able to find it.

My concern is just that the lure of simplicity (and parents' paranoia) will mean that kids will be more likely to end up with a closed system rather than an open one, and consequently be deprived of the ability (and, more importantly, the inspiration) to tinker. But maybe you're right that hacker souls will always seek out systems that allow them to do what they want to do, and it won't make a difference in the end. I hope so!


Yes, I think inspiration is the key here. I was largely lucky in this regard, though: I had very liberal parents who let me have a computer that I could mess around with, as well as unfiltered internet etc.

Most people in my peer group at the same time were only allowed limited access to their home computers and often were not allowed to install any programs on them etc.

Really though, I think it is in the government's best interest to make sure that kids are inspired to tinker with things if we want to stay competitive with the BRIC countries in terms of creating and innovating.

I can only speak as a British person here, but I think that from Alan Turing to BBC computers (and now the Raspberry Pi), the "hacker mentality" is very much a part of our national DNA, and it would be a great shame if that was lost.


I would rather buy my child their own computer than have her mess up mine, but I guess that was not as affordable a decade ago.


I wouldn't call it a war, but Atwood ... makes the point that tablets are about to take over large shares of the PC market.

No he doesn't. He says that everyone now owns a PC, so innovation is moving to post-pc devices.

I think the term "post-pc" causes many geeks to project a vast ethical struggle onto the tablet market. Chillax, game consoles didn't kill off programming, this is no different.

Personally, I prefer the term "portable device" but I guess that's not as link-baity.


>I think the term "post-pc" causes many geeks to project a vast ethical struggle onto the tablet market. Chillax, game consoles didn't kill off programming, this is no different.

The iOS locked-down norm will likely migrate to the conventional computing world. Mountain Lion's Gatekeeper establishes a default behavior of not being able to run unsigned software. This gives Apple the power to censor software and is a step towards App Store-only software on the desktop.


> This gives Apple the power to censor software and is a step towards App Store-only software on the desktop.

And that will lead to more open platforms being where consumers find the most innovative software, as well as being where commodity software is cheaper and/or better because competition will necessitate it.

Locking down your platform to prevent crapware that users don't want from getting onto it is one thing. Locking down your platform when noncrapware that users do want is also available, but only on someone else's gear, is how you turn Apple back into what they were before the iPhone: a footnote.

Of course, this does mean those developing for more open rival platforms will have to actually produce noncrapware, not the poor excuse for software many places ship today. It's about time our industry grew up and stopped pretending that shipping junk and charging for it is acceptable anyway.


What about virtual machines?

Let's say your OSX is locked down, but you can get a virtual PC app from the store that allows you to install Linux without affecting your OSX setup.

Now you can use your Linux VM to program anything you want and share it with a whole community of others who run Linux inside a VM.

Perhaps at this point this whole community grows big enough that it becomes the main target market for companies who produce "power user" software.


I would like to see a network enabled virtual machine approved for the iOS App Store, or the basic human decency of allowing sideloading on iOS, before trusting Apple with the OS of the future.

Now, Tim Cook has shown a good track record of undoing the most extreme abuses of Steve Jobs' megalomania (supporting employee charitable contributions, paying dividends to investors, admitting imperfections in factory working conditions), so there is hope that he will do the right thing for humanity on this issue too.


>how can we avoid the TV-ization of computing? How can we make sure future generations don't equate a computer with a consumption device rather than a productive device?

Why do we need to concern ourselves with this? The Desktop already is "tv-ized" for most home users. People who are interested in producing will continue to do so but the majority most likely never will be and trying to make them will only turn them against us further. We should just get out of their way and make it as easy as possible to do what they want to do.


Fortunately there are initiatives like the Raspberry Pi

With its USB ports, video acceleration, web browser... The RPi has more in common with a PlayStation than a BBC Micro.


>Will it replace a PC for every usage? Are you kidding me: Development, 3D Effects, Advanced Audio Processing, Advanced Photo Processing...

No. But for a large proportion of the PC market this doesn't apply. They only ever used their PC for email, YouTube, etc. So for what is probably the majority of PC users (i.e. not you, me, or most HN readers) this is 'Tablet vs Computer'.

To call the argument 'dumb' is dumb. Make your point without attempting to dismiss the argument in such a cheap manner.


I hope the barrier to entry to being a producer will stay as low as it is now. I worry that as PC demand goes down, the cost to get started in programming will go up. I started to learn how to code web sites while procrastinating one afternoon during finals week; I discovered Apache on my MacBook. Getting past the barrier to entry was almost accidental, because I had all this great stuff on my computer waiting for me to discover it. Those experiences will be more rare in the post-PC era.


Funny you mention that, but in a world totally dominated by iPads the price just to have your application available by legal means is $99. When I began to code (around 12-15), $99 was WAY above anything I could imagine. I don't know if my parents would have allowed me to spend such a sum. Hopefully you can get started and publish for Android for free!

So, who knows what the future holds for the cost of content creation... Again people often ignore the long-term externalities they produce when they make their choices, buying a PC or a tablet being one of these choices.


Are they ignoring the "long-term externalities" or is it just impossible to actually know what they are? Or maybe they just disagree and are putting their money where their mouth is. To even expect a non-programmer to consider this issue strikes me as pretty crazy.

I bought an iPad and I don't think that it's going to affect distribution of software in any meaningful way, besides making it easier for developers to reach users through the app store. For one, anyone can deploy an app to Heroku for free. You can host plain html, css, and js for free. You can learn how to program an iOS device for free with tons of great guides and materials online. When you want to distribute it, there is a hurdle to clear, but (and this might be a cultural thing), $100 as a 12-15 year old is not such a wild sum. It also makes my experience browsing the app store better.


Xcode is free (as in beer): https://developer.apple.com/xcode/index.php

You only need to register in the iOS Developer Program if you plan on distributing your apps in the App Store.


You have to join (and pay) if you want to run your apps on hardware -- the free download only lets you run in the simulator.


I don't know when you were 12-15, but $99 at 2010 prices was about $34 in 1980 and $23 in 1975 [1].

[1] http://www.westegg.com/inflation/infl.cgi
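
Taking the linked calculator's figures at face value, the implied cumulative inflation works out roughly as:

  $99 / $34 ≈ 2.9x since 1980, i.e. about 3.6% per year over 30 years
  $99 / $23 ≈ 4.3x since 1975, i.e. about 4.3% per year over 35 years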


I'm 25. Maybe I was born into a family that was conservative about spending, but I would have had to seriously push my parents for weeks to obtain $60 worth of something as abstract as an SDK (like they'd even know what that is...)!

And for families that live paycheck to paycheck: they often have a computer, but affording a $100 license can be very painful. I was in that situation!


True, but it's only recently that there has been so much good development software available for free.

When I was first looking at computers, a computer cost $1000 and a copy of MS C++ was something like $300.


True, but it's only recently that there has been so much good development software available for free.

I absolutely disagree. When I was 12 (in 1994) I became interested in Linux because it had so many development tools available. I thought computers and programs were magical, and Linux/bash/Perl/gcc et al. made it possible for me to learn programming. And Slackware Linux could be obtained for 10 guilders or less.

My generation became hackers through GNU and Linux, just as the generation before used MSX, C64, or a ZX machine with a free BASIC interpreter.

IMO, the sickening development is not so much that Mac and iOS developer accounts cost $99 per year. It's that the world (Apple and Microsoft) is slowly moving to a model where there is a gatekeeper who decides what gets in and what does not. As a bonus the gatekeeper gets 30% of every purchase. I can sympathize with the need to provide a 'trusted' source of software, but it should also be possible to install whatever the heck you like.

What would the Internet be if it followed this model?


I hadn't used Linux in 1994; my first experiences with it were around 1998, but I remember that getting it to install and work correctly with my hardware, as well as getting X to work, was no mean feat.

So I imagine that would have been quite a large hurdle for somebody with a casual interest in learning to program to jump through in 1994. I also remember paying somewhere in the region of $100 for my Linux distribution then (SUSE, I think).

Even once you had installed it, you would be compiling binaries with GCC that targeted Linux/glibc, so you wouldn't have been able to share many of your creations with the rest of the world apart from a slim minority of Linux users.

I'm not sure what the best compromise is between having a "trusted" source of software and being able to install what you want.

Most people seem quite happy with the app store lock-in; in fact, most people I know who own Android devices are not even aware of its sideloading feature, they just get stuff from the Android Market.

Of course, if Apple becomes overly restrictive then they do risk damaging their own ecosystem to the benefit of competing platforms.

Unfortunately there does seem to be more popular support than I had previously expected for the Internet to develop something closer to this model. I recently watched a documentary on cyber-bullying in which groups of parents were calling for a government authority to be able to control the content of social networking websites, as well as remove any anonymity from the Internet.


I hadn't used Linux in 1994; my first experiences with it were around 1998, but I remember that getting it to install and work correctly with my hardware, as well as getting X to work, was no mean feat.

X was no worry - our machine only had 4 MB of RAM, so I was practically forced to use pseudo-terminals. The learning curve for Linux distributions was not that steep. Slackware, especially in those days, was orders of magnitude less complex than current distributions. I picked up a cheap bargain UNIX book, which was enough to get started.

Even once you had installed it, you would be compiling binaries with GCC that targeted Linux/glibc, so you wouldn't have been able to share many of your creations

Well, sharing my creations with the world wasn't really possible anyway, since we had no internet connection. Besides that, for me the magic was in creating a program.

Besides that, your comment is factually incorrect, since a DOS port of gcc was fairly quickly available (DJGPP). In fact, IIRC Id Software's Quake was later compiled with it.

I'm not sure what the best compromise is between having a "trusted" source of software and being able to install what you want.

Me neither. I see how it is beneficial for some family members and friends to have a controlled software ecosystem. Also, App stores improved usability a lot.

On the other hand, if kids only get their hands on devices that are controlled completely by corporations, how will the next generation of hackers learn?


I think maybe they will learn different things.

I imagine virtual machines, both cloud and local, will become a commodity at that point, so whilst they may not be taking their iPad apart or replacing its software, it could potentially provide a dumb terminal to an infrastructure of disposable Linux instances, all loaded with state-of-the-art FOSS.

I can see them doing things like building mashups of their social data and possibly using the next generation of Arduino-like devices to create real-world interfaces.

They will still "hack"; just their building blocks will be different. Hell, in the 70s you probably weren't a real hacker if you weren't a whizz with a soldering iron; how many of the RoR-type hackers today practice that?


It's really weird to hear someone say that - because I grew up in the 80s with a computer that had a commercial-quality (for the time) assembler built-in! Sure you could buy Pascal or Lisp too, but out-of-the-box you got the same tools the pros were using, and all the documentation with it too.


Well, truthfully, my first computer (an Acorn BBC) had a BASIC interpreter built in; I think you could also do some assembler out of the box, although I never tried that at the time.

Even my first DOS PC had QBasic pre-installed, but it had no ability to make .EXE files, which was what you needed to be able to do if you wanted to submit your software to shareware libraries.

It seems that at that point Microsoft wanted a divide between "toy" programming languages like QBasic and "professional" ones like Visual Basic, which cost money.

If you wanted to create "proper" windows software you needed to fork out some cash. Contrast that with now where you can download Visual Studio Express along with all documentation, compilers etc for free as well as a whole load of open source languages and libraries.

I now make my living as a programmer and I use almost no commercial tools at all; I doubt many people were doing such a thing in ~1993.


Thanks for reminding me why I make more money now than I dreamed possible as a child, yet can't seem to afford anything beyond AmazonBasics and store-brand cereal.

Inflation is such an insidious way to pick the pockets of the working class. I wish the government would tax cash and equivalents instead of inflating the currency.


There's nothing inherent in the nature of tablets or iOS that prevents them from being used for learning programming. It's purely an issue of the availability of suitable apps.

Have a look at Codea (http://twolivesleft.com/Codea/) for an example of such a tool.

(Disclaimer: I know the guy who wrote it, but I genuinely believe it's a very cool environment for learning programming and writing basic games.)


Agree with your point overall, but look at it this way: there were lots of people who were, until now, using a computer just as a "consumer" of media, internet, music, information, etc... They had to get a PC because there was nothing else that filled that role. Now, for THESE people, a tablet makes more sense: it fits their needs. So the conversion we may be seeing right now is among those who were simple consumers in the first place.

Creators who need to consume and to create will of course prefer to have a desktop or a laptop on top of a tablet. And there may not be much evolution on the desktop or laptop side because there is no need for it: the desktop environment has had dozens of years to evolve, and its concepts are solid for its intended usages.


I don't think the point is that the market will shift entirely to tablets. It is more that, given an upbringing within a "consumer" household, the individuals there will have little opportunity to experience, and gain familiarity with, something that can produce. You and I will continue to use PCs; our kids and our friends will likely be using tablets more and more.


The idea of the general-purpose mechanical vehicle died a long time ago; the idea of the all-purpose computing device should also. While these ARE general-purpose machines as far as their internals go, even with PCs and laptops different tools are required to perform different tasks.

Artists use Wacom pads and Cintiqs to "create", audio experts use mixing boards and MIDI instruments, and 3D artists use even more specialized tools (this is my profession, so I happen to know more about it than the others). A lot of what is happening with tablet computers is that the keyboard/mouse combination, once a first-class input device used by coders and office employees, is becoming a peripheral just like all the others I have been talking about.


Great comparison. If you look at things like Garageband on the iPad, you can certainly make music with the touchscreen only. But any musician would probably still be more productive using the real instrument as the input device. Same with office work: yes, you can write on the virtual keyboard, but you can still attach a physical one if you want.


>It's not a versus, it's a peaceful cohabitation; only some evangelists and/or journalists want to push us to believe there's a war, but that's really not true.

The issue is: lots of people who never should have had desktops had to get them because there was no other option. It's not that the desktop market will die or go away; it's simply that it will contract to what it should have been all along: a very small home market and a large business one.


The thing we need to look at differently is that there are different types of creator. Damon Albarn recorded a massive chunk of one of the Gorillaz albums on an iPad; David Hockney sketches on one.

There are ways in which tablets have massive creative potential. The interface, for instance, allows very direct interaction for some mediums, and something like Garageband has a very shallow learning curve (and price point), which will potentially pull in a lot of people who would never have looked at that sort of thing on a desktop or laptop.

I think it's important to differentiate between the things that allows developers to be creative and the things that allow others to be so.

To say I'm on the side of the creator and am anti-tablets is to subscribe to a very fixed and narrow definition of what enables creativity.


It's not impossible to remain a creator while using an iPad (or other tablet) - at least not when it comes to development. There are certainly people who are happily developing on an iPad (albeit with a more powerful remote backend)[1]. I think we are likely to see even more of a shift in this direction as these devices become even more capable.

[1] http://yieldthought.com/post/12239282034/swapped-my-macbook-...


Ok, granted, it seems to work but when you look at the configuration:

  iPad 2 (16Gb, WiFi)
  Apple wireless keyboard
  Stilgut adjustable angle stand/case
  iSSH
  Linode 512 running Ubuntu 11.04
  Apple VGA adapter

This is basically the same configuration as an iMac! To take an example: most of the programmers I know, when doing some serious coding, have 2+ monitors, because it's always good to have something running on the side and see the result live. Or just because you need to compare two files. On such a small screen, that's close to impossible. Also, a lot of a programmer's time is spent browsing through long files; with an iPad and, again, a small screen, that's a nightmare.


This is just not really that serious. There is not going to be widespread adoption of serious programmers doing this in anger. I promise. It's not going to happen. It's fitting a square peg in a round hole. This attitude wants me to have this overly complicated setup to have a really horrible version of what I can do swimmingly well with a little Thinkpad or whatever notebook you like. Done and done. No nonsense.


To be fair, that one article is the only evidence I have seen thus far of anybody using an iPad as their primary development machine.

Wonder how he is getting on with it?


the iPad is a _great_ couch, porch, or kitchen computer. Just because you buy one doesn't mean it needs to replace everything else :)


The question is what makes the thing a tablet or a PC for the purpose of what we're discussing. We can probably all agree on the point that a thing with a bigger screen and a keyboard will always be a big part of the mix.

But, something could fit that description while being more similar to a tablet. I think that's the point Atwood is making.


I apologize in advance for the dreaded car analogy but:

I'm concerned the very reason tablets will take over the PC market will also mean that tomorrow's kids' experience is very different from, and I would argue poorer than, my experience when I got my first computer

I don't think that's any more of an issue than today's car owners having a poorer experience because they don't have to worry about setting the fuel/air mixture in their carburetors like I had to with my first car. I think you are right that the closed environment is a barrier, but I live in one of the poorest cities in MA, and yet there is no shortage of computers on the curb on trash day that can happily run Linux/NetBSD/etc.

The young hacker culture will live on.

Ironically, my VIC 20 also was very user friendly. You turn it on, it's on. You put in a diskette -- well, it didn't load automatically but making it load was a very simple incantation,

"Simple" to who? You and me? Sure. Random guy on the street? Doubtful.


I don't think that's any more of an issue than today's car owners having a poorer experience because they don't have to worry about setting the fuel/air mixture in their carburetors like I had to with my first car.

Sorry, I didn't express myself very clearly. By "poor experience" I wasn't referring to usability but to the lack of opportunities to tinker with the system and make it do things outside of what its creators intended.

"Simple" to who? You and me? Sure. Random guy on the street? Doubtful.

Surely the iPad is easier to use than the VIC-20. Then again, the VIC-20 came with a manual and it wasn't very long. I'm confident every VIC-20 owner read it and even the dumbest of us didn't have much trouble figuring out what command to type to load the disk.

Come to think of it, I guess I was confusing things: I only had a disk drive later, with the Commodore 64; games for the VIC-20 came on a cartridge, so no typing was required at all, just plug it in and turn on the power.


Computers are becoming more reliable and less user serviceable in the same way that cars did.

My dad tells me stories about how he and his father used to service the family car, replace parts etc. My father and I used to do the same thing with the family computer.

I hope my future son and I get to play with the family 3d printer.


As systems become more reliable (computers, cars, operating systems), the need for expertise is reduced. That's a godsend for those of us who used to spend days (weeks?) of every year providing computer support for relatives whose systems were constantly self-destructing.

I found this (recent) article in the NYT interesting:

http://www.nytimes.com/2012/03/18/automobiles/as-cars-are-ke...

Little details like this are what start to make a big difference:

"“The California Air Resources Board and the E.P.A. have been very focused on making sure that catalytic converters perform within 96 percent of their original capability at 100,000 miles,” said Jagadish Sorab, technical leader for engine design at Ford Motor. “Because of this, we needed to reduce the amount of oil being used by the engine to reduce the oil reaching the catalysts.

“Fifteen years ago, piston rings would show perhaps 50 microns of wear over the useful life of a vehicle,” Mr. Sorab said, referring to the engine part responsible for sealing combustion in the cylinder. “Today, it is less than 10 microns. As a benchmark, a human hair is 200 microns thick."


"hell, the manual came with example BASIC programs."

I think part of the problem with people and computers is that most wouldn't even bother to read a manual no matter what was included in it.

People expect it to be so intuitive that they don't even have to figure anything out. As though this highly complex machine was purpose built for their own expectations. How far would today's typical computer user have gotten on your VIC 20 if they put a diskette in the drive and nothing happened? The incantation that seems so simple doesn't just summon itself.

So there's an entire market for highly intuitive and trouble free systems that is now being catered to. But there are tradeoffs in building these systems. They're not as flexible and can't be used for all the things that a PC can be used for.

I just bought a tablet and have been trying to figure out a use for it that justifies the price (lying in bed watching YouTube doesn't cut it). It doesn't replace my PC, and it doesn't even replace my phone. I don't see it replacing anything. It actually does the opposite: overall it increases the ubiquity of computing devices in people's lives.


Computers of that era were very simple to use. I never had a VIC-20, but in Britain "BBC" computers made by Acorn were popular, especially in schools.

If you wanted to start the computer up you pressed the button on the keyboard and it was booted and ready in under a second, ditto for switching it off (no hibernate/shutdown/suspend etc).

If you wanted to run a program , you put the floppy disk in the drive and typed "RUN" then pressed enter.

I literally learnt to use one at age 6 without much trouble, so did my school peers. Even our aged school teachers could operate them.

They were pretty much impossible to screw up, since the entire OS was loaded onto a ROM chip, so no viruses, config files, etc.

Even BASIC was very easy to use; the shell and the BASIC interpreter were the same thing. I don't think there were many people who didn't know how to do:

  10 PRINT "SCHOOL SUXXX"
  20 GOTO 10

They were also very extensible machines: you could add joysticks, mice, and modems, and there was even an analog output board you could plug into the back to control robots, Arduino-style (we were doing this at age 8!).

The idea of being "computer literate" didn't really spring up until Windows 95 (or maybe 3.1), when people suddenly had to worry about C drives and "programs" etc.

I remember moving from programming text games in BASIC on my BBC to trying to understand OOP and Visual Basic for Windows, and thinking "why does everything have to be so complicated?". To be honest, I think if I hadn't had that taste of magic from the BBC I wouldn't have had the will to carry on learning to program.


Yup. I remember back when I was a kid and I wanted to make the jump from writing console/text-based games to using the full graphical power of the Mac. The transition was brutal: going from BASIC with no pointers to MPW Pascal, which used handles(1), not even pointers, was just impossible. It required a deep understanding of how computers work, which the BASIC interpreter available with most 8-bit home computers carefully hid from the budding programmer.

1. Handles, for those that don't know, were, to use the C idiom, double pointers, e.g. char** or Window**. The entire MacOS API was based around them, because it allowed the memory manager to compact memory without having to tell you - it would change the value of the pointer to the memory, but as you only had a pointer to the first pointer, you wouldn't even notice the change.
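
To make that concrete, here is a minimal C sketch of the idea; the ToyNewHandle/ToyCompact names are made up for illustration (the real Toolbox call was NewHandle, and compaction happened inside the Memory Manager):

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>

  /* A handle is a pointer to a master pointer. The memory manager owns the
     master pointer and may move the underlying block, updating *h behind
     your back; code that always dereferences through the handle never
     notices the move. */
  typedef char **Handle;

  /* Toy allocator: hand back a handle to a fresh block (illustration only). */
  Handle ToyNewHandle(size_t size) {
      Handle h = malloc(sizeof(char *));
      *h = malloc(size);
      return h;
  }

  /* Toy "compaction": relocate the block and fix up the master pointer. */
  void ToyCompact(Handle h, size_t size) {
      char *moved = malloc(size);
      memcpy(moved, *h, size);
      free(*h);
      *h = moved;
  }

  int main(void) {
      Handle h = ToyNewHandle(32);
      strcpy(*h, "hello");   /* always go through the handle     */
      ToyCompact(h, 32);     /* the block moves...               */
      printf("%s\n", *h);    /* ...but the handle still finds it */
      return 0;
  }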


I remember studying Fortran in college in 1993 and deciding to write a program (using no dynamically allocated memory, because I think we were still studying Fortran 77) that would solve simultaneous linear equations with arbitrary numbers of variables.

The Microsoft Fortran compilers we were using on, I think, 386s, would allocate all memory at compile time, and therefore the more memory requested, the larger the executable. I discovered that this led to serious issues on the 5.25in disks and that I was better off writing BASIC on my C64 for that task...


People are perfectly capable of learning all sorts of fancy computer stuff if they want to. Generally they don't want to. My mom taught me all sorts of stuff about DOS (if you have autoexec.bat, it will get run automatically on boot). She knew that because she had to know it. But now, she doesn't have to know it, she just wants to do something and has my dad fix it. So she barely knows how to use it.


I think the current "PC" paradigm needs to die because the things that we do with our computers are just so different now.

Regards the "no serviceable parts" issue, there is nothing stopping manufacturers from building systems that are more open in addition of course it may be the case that kids grow up without access to one of these. However that was probably also the case in the 80s for the most part, most of the kids I knew growing up didn't have a computer but they probably did have a SNES or Sega Genesis.

Of course what might happen is that all the geeks and hackers buy the "open" hardware and build cool things with it, once the general population sees these things then perhaps they want the open version too.

Free-market economics dictate that competition will create innovation, after all, and there is only so far you can lock a platform down without stifling that.


I've been thinking about this for a long time: "And that's what I'm worried about: "No user serviceable parts inside". I'm concerned the very reason tablets will take over the PC market will also mean that tomorrow's kids' experience is very different from, and I would argue poorer than, my experience when I got my first computer, a Commodore VIC 20."

And I found the answer in automobiles. They are as essential to us as communication devices will be to everyone tomorrow. And yet only a few know even the fundamentals of how an engine works.

It's a problem society has to tackle by embracing STEM, but it's not as problematic as it seems, IMHO.


I think this is a classic example of us old timers wishing our kids got the same experience as us. Back then, you HAD to spend time fiddling with the hardware. Today's kids are much more empowered. Since everything just works they can spend their time creating much more awesome stuff than we ever had the chance to do.


The BBC makes a big deal of "digital natives" (kids who grew up with the Internet) and "digital immigrants" (the old folks).

What this model completely overlooks is that the digital immigrants built it, and the digital natives merely use it to watch videos on YouTube and poke each other on MySpace. Kids these days are consumers, not producers.



