This is what I personally feel is just missing these days. I was never part of the leading edge of hobby computing.
I was very far off, geographically, growing up a teenager in India. But back then, 1998 - 2006 ish, I feel the Internet was a lot more about experiments. I would read up on some software or hardware group all the time. LUGs were way popular. I used to be on chat rooms from MIT or other unis every other day.
Nowadays most people I see in that age group are happy with a YouTube video on their smartphones. The general-purpose nature of computers is simply not visible. Not even to the level that I, a curious teenager, had experienced in those years from so far away in India.
We have too many tall-walled gardens now.
Update: I should add that I am happy with how powerful computers are these days, but they are not aimed at learning about or tinkering with the actual device or its software. For the mainstream, they are just a way to consume media.
Software is economically worthless without making it artificially scarce. A lot of "software innovation" is not about building better software, but building software you can charge for, which means "walled gardens", patents, DRM, license servers, anti-tampering mechanisms, and the like. In essence all content (music, movies, software, books, etc) follows the same pattern.
Note that profit-driven entities can give away software, but only if it supports another artificially scarce good. Google rents space on its search tool, as Facebook rents space on its social media tool. Ad space is scarce, and made more valuable the more Google and Facebook give away (which is proportional to the attention they capture).
But devs generally don't like artificial scarcity, so the platforms need a path in for them. I think Apple demonstrates this most clearly: the iPad and iPhone are software consumption devices, but the Mac is also a software creation device, and therefore is generally more open, free, and changeable.
> Software is economically worthless without making it artificially scarce. A lot of "software innovation" is not about building better software, but building software you can charge for, which means "walled gardens", patents, DRM, license servers, anti-tampering mechanisms, and the like. In essence all content (music, movies, software, books, etc) follows the same pattern.
You remind me of Cory Doctorow's "The Coming War on General Computation" keynote, which, interestingly, didn't anticipate that streaming and similar cloud-based services would become the DRM of the last decade.
Or, as it turns out, the rising appeal of non-general purpose devices that are more efficient.
I think you mean a different definition of economic worth than the GP post. One meaning is the developer's ability to sell the software, the other meaning is the value derived by using the software.
I maintain an open source project that sits in a very small niche. Based on downloads, stars on GitHub, and comments on forums, I estimate that hundreds or thousands of people find it useful. I do get the occasional donation, but if I calculated dollars per hour that I have put into this project, it would probably be in the single digits.
"Economically worthless" is a very bad phrasing, but the concept is very real (economists go with "non-excludable", but that has an underwhelming impact on laypeople): software authors have a really hard time capturing the value they create.
I've always thought of it as the turnstile in front of the theme park. Without that turnstile, the theme park is economically worthless no matter how good it is.
And not only that... but a theme park without an entry fee is still a money-maker via snacks, drinks, souvenir cups, and extra-cost rides and line-skipping.
In much the same way that a company can produce an open-source product and charge for support, feature development, consulting, data feeds and sometimes SAAS operation.
I just mean you can't charge for it directly. "Utility" is perhaps what you mean. Note that the most important achievements (scientific and artistic) in human history were economically worthless for their authors, but have had enormous indirect positive economic benefits.
Yes, but that's not what the parent meant; they meant it in a money-making way, and that is something big companies realized a long time ago and capitalized on. The general trend has been to create attractive gardens, get people used to them, and then raise the walls. In the end, even open source has started to make a bit less sense to its creators and maintainers, who are people and need to eat too.
Control of where Linux development is going is indeed economically worth a lot. Copies of the Linux kernel are near enough to worthless for the distinction not to matter.
>Software is economically worthless without making it artificially scarce.
If that were the case no software would be written except by hobbyists as a kind of joke. Software provides value other than being scarce and it's really the maintenance of the software that people are wanting (those paying attention at least) not the software itself.
> Software is economically worthless without making it artificially scarce.
Worthless to whom? Maybe for corporations who want to make billions by exploiting a legal monopoly. For the rest of humanity, abundant software is extremely valuable.
The difference is that those people who are happy with YouTube videos wouldn't have been tinkering with computers in 2006. You have to remember that tinkering with computers was and still is a nerdy niche. Let's say there are 1000 people in the world. Of those, 10 are hackers (in the "tinker with computers" sense, not in the malicious sense), 20 are enthusiasts, and the rest are neither. In 1990, you'd have 15 people on the net talking about computers and doing stuff with them. Everyone else is watching TV, reading books, whatever. In 2006, you have 100 people: 10 hackers, 20 enthusiasts, 70 people who found cat videos. In 2021 you have 990 people on the net: the same hackers and enthusiasts but also a whole bunch of people who are here for the cat videos. Does that detract from what the 10 hackers are doing? I don't think so. In fact, it adds opportunity for the hackers and enthusiasts.
We didn’t lose the hacker spirit. Go look at hackaday.com to see what people are working on. We simply gained an audience.
You are creating a false dichotomy of cat videos vs. hacker. I've watched a cat video or two. I've also spent the vast majority of my life programming (coming up on 40).
My point is, video games were my gateway to programming. I got into mods and then expanded from there. I'm sure I am not alone. It's these natural paths that spark curiosity which are being cut off.
Is there any evidence that these paths are being cut? Mod communities for video games seem like they are thriving. Look at the sheer array of Minecraft mods, or mods for nearly any popular game on Steam.
Combine this with popular and easy game making software like RPGMaker, or even the ease that someone can set up a basic game on Unity. Beyond this, there are programs like Scratch and Processing that allow for fun and relatively easy computational tinkering.
I doubt the natural paths of curiosity have shrunk, though they may look different. Rather, the number of people using computers has massively increased, and the number of computer "tinkerers" is just a smaller proportion of people of the larger whole.
I am oversimplifying the situation. There are more than 1000 people in the world, and everyone can enjoy cat videos. My main point is not that the world is divided into hackers and cat video lovers. (There are two types of people: those who divide people into two categories and those who don't :)).
My point is that there is a small percentage of the population who are makers and a large percentage who are consumers. The Internet of 30 years ago was mostly populated by the makers so it felt like everyone was a maker. Now it is much more representative of the real public because the consumers joined. It now feels like it’s mostly consumers because in the world’s population consumers outnumber makers by a large percentage. However, and this is the crux of my point, the absolute number of makers did not go down and in fact went up. You won’t find many makers in your Facebook feed, in relative numbers. But maker communities are bigger, better, easier to find, and easier to join than ever before. Just because consumers have joined the makers does not mean that things are worse. They are better. The rest is nostalgia.
Why the negative outlook? The world now is way more awesome for hackers than it was back then!
1. You can find any information you want, in an instant.
2. Hardware is very cheap, computers can be tiny, you can order components from anywhere
3. Robotics and electronics are way cooler now, with drones etc.
4. Software development is way more productive with all the libraries and tools we have right now.
5. When I was a kid, games were made by 1 or a few people. It quickly evolved into big studios. But nowadays, a solo or tiny team can release super successful games.
> 2. Hardware is very cheap, computers can be tiny, you can order components from anywhere
In the past I could get any component I might need from a local store in 20 minutes. Now I've got a 2-week to 3-month turnaround on shipping from Shenzhen.
In the past, there were only enough components to fill a few shelves. Also, you can get many, many components from local stores now, if you're willing to pay the 4x-10x premium.
Not many people are, so local stores aren't as common.
In 2010, the local store had more components than I can order at reichelt, and was cheaper than aliexpress (if you include shipping).
I really miss having a place (back then we even had a dozen of them within a few km) with a huge selection carrying every component you might ever need, and with enough technical expertise that you could show your circuit diagram and get feedback on how to improve it. This is something only seen in Shenzhen today.
Now I just don't do any small projects anymore as there are no stores in the entire state left, and the extreme premium in shipping and waiting time of aliexpress just isn't worth it anymore.
>I really miss having a place (back then we even had a dozen of them within a few km) with a huge selection carrying every component you might ever need
That sounds very atypical. I live outside a major US metro and I know of one good computer store in the area (Micro Center). I imagine there are a few small places that carry electronic components but I'm not sure how many. And it wasn't much different 10 to 20 years ago unless you count Radio Shack, which wasn't really all that great.
I count RadioShack, because it was great. As late as 2016 they had their transistor, resistor, capacitor buckets and speaker wire. You couldn't get arbitrary ICs there, but for basic components or last-minute bodges they were key. And now they're no longer carried.
>You can get many many components from local stores now, if you're willing to pay the 4x-10x premium
Well. No. RadioShack is closed. Fry's is closed. If I needed a resistor, I honestly wouldn't know where to go. Probably AliExpress or DigiKey. There's no local stores left.
Don't get me wrong, I am not talking about myself here. I am still an absolute curious fellow. I just started learning woodworking, I am 37 years old now. I start learning something new each year.
I meant the current trend of digital products which are, in my opinion, more like a TV than "computers".
I have also been disillusioned of late with the computing hobbies that have fueled me most of my life. I'm still heavily involved with computing for my employment, but there's just something that doesn't spark my curiosity as much as it used to.
Interesting that you mentioned you've moved to woodworking, as I found a way to rekindle my curiosity by exploring older technology as well. I've dived head first into antique clock and watch repair.
I took up painting, as I'm also involved with computers as my day job. The contrast is wonderful: there is freedom in it, a sense of limitless possibilities, and no pressure or hard deadlines. I've often said to myself that if I had gone to art school, I'd probably be interested in computers now. I also became disillusioned with computers a while ago, and I think there is a limit to any interest without taking a break. But recently, about a year ago, I started to become interested in programming again; one thing was Racket/Lisp/Scheme and the second was Linux. I'm a new Linux user and I am very stoked about it.
What, if any, resources did you use to learn woodworking? In-person courses and hackerspaces are not available in my area right now and I wonder if there are any good online classes. And by that I'm not talking about those quick-cut YouTube videos where you need a workshop worth 50k in machinery and tools to properly follow "this one easy trick" ;-)
If you have any resources you can share/recommend, I'd be very curious also ;-)
Sounds like you might like Wood Working for Mere Mortals (https://woodworkingformeremortals.com/). I haven't taken any of the courses Steve offers but I'm a huge fan of his YouTube videos.
Rex Krueger has the most budget-conscious channel I know of (https://www.youtube.com/c/RexKrueger). His "Woodworking for Humans" series starts out with just needing three tools and bootstrapping your way to more.
Personally I started with restoring discarded and thrifted furniture. The initial outlay was just for some glue, varnish, and sandpaper.
> 5. When I was a kid, games were made by 1 or a few people. It quickly evolved into big studios. But nowadays, a solo or tiny team can release super successful games.
And games will look exponentially better considering the tools we have at our disposal now. Also, selling millions of units of one game is "possible" even for solo developers (though it's obviously a superstar kind of probability). In the 90s, reaching that kind of market was impossible without the support of a huge corporation.
From my perspective I think it might feel that way because back then it was mostly tech-curious people who used the internet. Now it's mainstream and everyone is connected, which means most of the content is produced for mass consumption. Back when I got my first computer and got online, in around 1996 I think, computers were still quite expensive and I would have killed to have access to the information and tinker boards/cheap computers that are available today.
I clearly remember my friends' parents being scared of me since I used to open up PCs and see what components were in them.
I have been using laptops that can not be easily opened/upgraded for the last 4-5 years. Devices are being made mainly for people to consume media and that's it. I think computers of the earlier era and now are not at all comparable. The devices now are not really "computers" as they used to be. They are opaque all-soldered-and-sealed boxes. How will you inspire anyone to tinker with them?
Adding a graphics card to a box was such a cool thing for all of my gamer pals. Video cards, audio cards, or any upgrades would naturally involve casual talks about tech. Who does that these days, unless you own and upgrade a desktop?
I think you're comparing the wrong things. When I was a kid, I tinkered with clocks and watches. I thought digital ones were opaque boxes and remember calling them boring. The mechanical clocks were much more fun and reconfigurable.
Does that mean we've sadly come to the end of the era of hobbyist friendly alarm clocks? Maybe, but so what? We then got computers instead. Now we have Raspberry Pis for kids who see cellphones as boring black boxes. There's always something, and it's fine if it's not the same thing it used to be. The same thing happened with cameras - people used to build their own cameras out of bits of wood and develop their own pictures from glass plates. But cheap film cameras eliminated that hobby from the mainstream. It's fine. Technology moves on. Now we have 3D printers that enable kids to make things they never could have before without access to an expensive milling machine.
I never liked to tinker with hardware. Tinkering with software was more fun, and the hardware that I can run my self-made software on is now more powerful than ever. Even an iPad, or rather, especially an iPad.
Maybe we could make an Internet for Age 35+ or so. Remember when one could take over Yahoo chats with Javascript? With containers and stuff, the Internet could be a lot of fun again. App battles in the park by old timers on the chess tables.
The mainstream was never going to be part of the tinkerers and hackers. But the depressing thing is to what extent today's hackers have become consumers.
But we've still got Linux and a lot of hardware support for that. I'll likely be the last bastion of general purpose computing.
I was a computer geek from about 1981 onwards, starting with Commodore PETs at high school. These machines were uniquely accessible. You could do surprisingly creative graphics using only the graphic characters hardwired into the machine (no bitmaps, colours or sprites). But these characters were printed right on the keyboard, and could be used in normal BASIC "PRINT" statements to do quite performant animations and such, without any advanced programming skill.
That was as good as it got. The difference between a beginner dabbling with his first PRINT loop and an advanced assembly language programmer wasn't that great because the machine was so limited; you could go from one extreme to the other in skill in a year or two if you were interested enough. My natural response to seeing the "MISER" adventure, after solving it of course, was "I have all the programming skills to make a game like that", and I did so.
And while then as now, only <1% of the general population was interested enough to get good at programming these things, another 10-20% was interested enough to hang around, dabble, try the latest cool programs that came down the pipe, or were made by the 1%. I had people playing a game that I made that consisted of one (long) line of BASIC.
Then, it seemed, most of the rest of the population (the other 80-90%) got computers too, Commodore 64s mostly where I lived. And still, even if only a tiny minority actually programmed their own stuff, it felt like a part of a vibrant scene, you could always show off your stuff.
With modern machines, even without the discouragement of locked down operating systems, missing accessible programming languages and such... there is just no hope of making something "cool" you can impress your friends with. So the tiny minority that does learn HTML5, Javascript, Arduino programming, what have you, is relatively obscure and seems to be a shadow of its former self. It isn't really. It's just that 99.99% of the computing power now in the hands of the population will never be tinkered with.
>there is just no hope of making something "cool" you can impress your friends with
I believe there is hope of making cool graphics effects with GPU shaders. GPU tech is still advancing similar to how CPUs were back then, so people have not yet explored all the possibilities of the hardware. You can do impressive things just by experimenting even if you don't have a lot of theoretical knowledge. Sites like shadertoy.com make it easy to get started.
> With modern machines, even without the discouragement of locked down operating systems, missing accessible programming languages and such... there is just no hope of making something "cool" you can impress your friends with.
Every modern computing machine I know allows you to add a very rich set of coding tools, other than unjailbroken mobile devices perhaps. Yeah, poor access to mobile sucks, but if you feel the need to hack, you won't let walled gardens stop you.
Web scraping is an insanely rich source of data for building cool tools. Then you might add a bit of simple machine learning, like a voice-based interface or a GAN-based video face-melter or visualizing web-scraped results on a map. These sorts of tricks were hard or impossible to do 10 or 20 years ago. But not today.
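For what it's worth, here's a minimal sketch of the kind of scraping starter I mean, in Python, assuming the third-party requests and beautifulsoup4 packages and a placeholder URL (swap in whatever site you actually care about):

    # Fetch a page and list its links -- a tiny starting point for
    # building a scraped dataset you could later map, chart, or feed
    # into a model. The URL below is just a placeholder.
    import requests
    from bs4 import BeautifulSoup

    resp = requests.get("https://example.com")
    resp.raise_for_status()  # fail loudly on HTTP errors

    soup = BeautifulSoup(resp.text, "html.parser")
    for link in soup.find_all("a"):
        # Print each link's visible text and its target URL.
        print(link.get_text(strip=True), "->", link.get("href"))

From there, piping the results into a map view or a pretrained model is mostly glue code.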
If you want to hack, I'd start by immersing yourself in Linux. Or Raspberry Pi. Better yet, both.
You are looking at this through the eyes of someone who is already comfortable with their understanding of computers and software. To a kid with no prior knowledge, the entry barriers might appear impossibly high.
No, they're lower than ever. They're one google search away from a huge amount of usable resources, youtube videos, web IDEs that they can use on their locked down iPad / School Chromebook and more.
And to do anything non-trivial requires years of tediously building up a knowledge of all the abstraction layers and dependencies between you and the machine. For a long time there is nothing you can do that a million others haven't already done far better and made readily accessible.
I can personally attest to what a massive turn off this is. I grew up in an age when computers were a part of everyday life, but their inner workings were hidden behind a mountain of ugly obfuscation. If you find the rare trivial task like making stamped websites fascinating, good for you. But for people like me who couldn't care less, it takes some sort of external impetus to actually discover their interest in computing. In my case it was a half assed mandatory programming class in engineering school where I found out I had a talent for it, and discovered an interest in the inner workings of things I had been taking for granted all my life.
Just because resources are easy to find doesn't mean anybody cares for finding them.
The percentage of people who tinkered and explored, and did all the fun stuff back then is probably still the same as now. But back then most of the "normal" people didn't use the internet at all, and now use it for youtube and facebook.
Classic PCs and even some laptops, especially with open operating systems, are still fantastic tools for learning and tinkering. There are just more options now for people who simply want to consume media without thinking too much about everything else. The PC enthusiast scene is getting bigger every year.
Nonsense. Linux makes many details about computers much more explicit than Windows does. Even installing Ubuntu teaches you something about how computers boot, what partitions are, and what file systems do. Linux also makes it easy to get started with programming at the OS level -- thanks to the Internet it is way easier to find information about how everything works.
So from a learning-experience perspective, a kid tinkering with Linux might gain more insight into how computers work than, say, on Windows or Mac.
Agreed. I still remember my first foray into installing Linux. The PC wouldn't boot afterwards and I had no one to shepherd me through it. These days if I want to get information on how to get something working, 9 times out of 10 someone has already posted it for your benefit.
Some things are worse, but some things got infinitely better.
That's today, but it wasn't so initially. You'd learn about your monitor's scan modes, how the disk is structured, dumping bytes between devices, optimizing for load by placing bytes on the outer sectors of floppy disks, using RAM as a disk to speed things up, and whole lot of "useless" things that build machine sympathy.
Yes, but it's also true that drivers and peripherals are generally a lot more complicated today and hardware services more federated across layers of dependency and security than in the days of DOS or Windows 3. Reverse engineering today's Windows 10 wifi-based encrypted RAID disk array is a good deal trickier than a 30 year-old DOS FAT IDE drive ever was.
Well, you still have easier and more difficult distros. You can go the easy route and play with Ubuntu or Fedora, or you can approach Linux from the hard side and try Arch or Gentoo.
I feel very similar. It took less than 2 decades to go from tinkerers, creators, and just general curiosity to somewhat mindless consumption of videos or photos.
When the Covid wave started last year, we had to assess who has the necessary equipment at home to be able to work remotely. A lot of people don't even have a computer at home; all they need is a phone or a tablet, which are created for consumption, not for being creative with them.
It's worth noting that the article is about the development of specialized hardware rather than walled gardens. In many cases this can be hugely beneficial. Not only has it allowed for the development of energy efficient hardware to watch videos on a smartphone, the same concept has allowed for the development of efficient encoding hardware for video to serve that market. Likewise, the energy efficient processors that enable media consumption have other applications. In the end we carry around a tremendous amount of general purpose computing power in our pockets.
Walled gardens and specialization are somewhat different concepts. The former is primarily concerned with restricting how hardware is used, while the latter is mostly concerned with optimizing hardware for specific applications. While there are times when these two concerns can overlap, this need not be the case.
This can be seen historically. Early microprocessors were developed for integer calculations, with floating point operations being done in software or with an optional floating point coprocessor. Floating point was integrated, and became a fundamental part of our notion of general purpose microprocessors, later in the game. The path taken in computer graphics was much more complex, but it is also worth noting that it started off as specialized hardware. Experimentation resulted in GPUs becoming a generalized tool. Recent developments have encouraged them to diverge into a multitude of different specialized tools. Both of these examples illustrate how specialization has broadened what can be done, without creating an environment that restricts what can be done.
While I agree that walled gardens encourage media consumption at the expense of experimentation or creation, this is more of a cultural thing. There is far more money to be made in the former than the latter, so that is what the market focuses upon.
Except that the algorithm is optimized for revenue and attention-grabbing content (the opposite of university material), and will gradually steer watching habits in that direction.
I hear people saying this a lot, and I admit I sometimes think it myself. But I'm not sure how true it is, as opposed to just a perception.
> Not even to the level that I, a curious teenager, had experienced in those years
I think therein lies the rub. Teenagers curious about computers aren't very common. And even back then (early 2000s), my experience in Europe was the same [0]. Most of the people in my school didn't really care about computers. They had other interests and pursuits.
The difference today is that computers are much more widespread because they allow people to do much more diverse things. At the time, they would maybe type up a report or something and that's it. There wasn't Facebook et al., so many people didn't spend any significant amount of time in front of the computer. Idle time would mostly be spent in front of the TV or hanging out with friends.
I think the reason this perception comes up is that at the time, people who "spent time on computers" were doing so out of curiosity and interest in learning about them. So the shortcut is that since now many more people are spending a good chunk of their day in front of the computer, that must mean they're as interested in them as the "curious teenagers" of our time. That's simply not true. Just look at what they do on the computer. It's mostly idle scrolling on some form of social media.
And, to address the subject of the OP, I'm also wondering whether there's an actual decline in the "general-purposeness" of the computer, as opposed to there just being more "non general-purpose" computers.
There's a definite decline in the percentage, and maybe even in absolute numbers if we look at people who used to use computers because they had no alternative but who are now better served by smartphones or tablets and also for uses which need "specialized processors" (say ML).
I draw a parallel to my feeling whenever I see discussions about iOS being locked down and people expecting the whole of the HN crowd to be up in arms about it. In my case, I was one of those curious teenagers of yore. I still love tinkering with computers, etc. But phones? I just can't be bothered to care about them. Can I make calls and check some maps? It's great!
I have an iPhone and treat it pretty much like a dishwasher. I barely have any apps installed. Neither allows me to install whatever random app I want. It's not an issue for my dishwasher, and frankly, nor is it for my iPhone. Whatever "general computing" I feel like doing, I prefer doing it on an actual computer.
Of course, there are other philosophical considerations in this discussion which I understand and can get behind, but the point is that, to many people, a computer is an appliance. They want it to do certain things. Does it do that? Great! If it doesn't, it's not fit for purpose. They simply do not care if some guy somewhere would like to do something with it but can't because Microsoft / Apple are locking it down.
---
[0] I went to school in the suburbs of Paris, so many of the kids' parents were pretty well-off and many were even working in computer-related fields, so there wasn't any access problem. Practically all of them actually had at least one up-to-date computer at home.
I don't think it's curiosity about computers - it's more general curiosity about how certain kinds of systems work. It isn't even about appliances and mass consumption. That's a relevant view, but not a very revealing one.
Computers are an idealised example of an organised system which is assembled/operated by solving puzzles. Some people enjoy solving those kinds of puzzles. Most people don't. The first kind of person doesn't really understand the second kind of person - and vice versa.
Something interesting and disturbing has been happening over the last twenty years. The Internet stopped being a technological puzzle and became a social and political puzzle.
This kind of "cultural engineering" - where the currency is attention, influence, patronage, and sometimes actual spending, not puzzle prowess - appeals to a completely different kind of person. And these people have been driving the development of the Internet for a while now.
People who like solving computer puzzles have been completely blindsided by this and continue to act as if technology is still somehow primary - when it isn't, and hasn't been for a while.
Lots of technologies start out appealing to technologists until they mature enough to reach mainstream adoption. It's not disturbing, it's a sign that they were successful! Cars did the same, and aeroplanes, and electricity. I'm happy the internet is now as common and easy to use as electricity. Geeks can move on to the next interesting immature technology. Perhaps cryptocurrency or machine learning or whatever. The world is full of exciting things.
Cryptocurrency has now pretty solidly transitioned from hobbyist/enthusiast to monetization through centralization. Which is a little ironic but that’s how it goes.
> Computers are an idealised example of an organised system which is assembled/operated by solving puzzles. Some people enjoy solving those kinds of puzzles. Most people don't. The first kind of person doesn't really understand the second kind of person - and vice versa.
Plenty of puzzle lovers don't care about computers at all. Plenty of good programmers and tinkerers don't care about puzzles of other kinds.
> Something interesting and disturbing has been happening over the last twenty years. The Internet stopped being a technological puzzle and became a social and political puzzle.
The “internet” as a thing is still capable of the same things it was 20 years ago. The proportion of people using the internet who care more about the social and political puzzle rather than the technological is what dramatically increased.
> But phones? I just can't be bothered to care about them. Can I make calls and check some maps? It's great!
I understand where you are coming from, and it makes 100% sense if you own a PC/laptop as well. But guess what? In a lot of places the majority of population (including younger people) do NOT have them. Their first and only experience of computers is a mobile phone. Even though "teenagers who care about computers" is a low number, now a lot of them won't realize that something like a general purpose computer exists (until way later, if they end up using a computer at work).
I grew up being fascinated with how useful and flexible computers are and fell in love with creating software. I'm quite sure that would not have happened if I had been born during the last 10 years.
I do get your point, but I don't agree it supports the thesis that there's a "regression" happening.
Broadly, I could see two situations in which people only get a phone and have no computer: either "poor" countries, where they can't afford a computer and will get a phone that's better than nothing, or very affluent ones where they'll just get a phone or tablet and not care. Although I'm not sure that this second category has no computer.
While I do sympathize with this, and I think that it would more likely than not allow people to discover this curiosity by, as you say, being fascinated with how flexible phones would be, my point is somewhat different: this isn't "going back". Phones never were open. And I'm pretty sure that the people who can't afford a computer now couldn't afford one in the 90s.
As such, for me, this is more of a lost opportunity to progress rather than a regression.
[for simplicity, I'm using the term "computer" as a desktop/laptop computer]
> Phones never were open
They certainly were more open than today. It was much easier to fiddle around with the software with smartphones that Nokia used to have, for example. Both iOS and Android were easier in that respect, too. So there's clearly been a regression there. However, if you meant "phones were never as open as PCs", ah yes then I agree.
> As such, for me, this is more of a lost opportunity to progress rather than a regression.
I would agree with the spirit of that, but the problem is that a lot of computers are actually being "dumbed down" so that they are more similar to phones simply because a larger population is used to them. For example most young people might not find it weird if some day MS or Apple make it painfully difficult to install PC/Mac apps from outside their stores.
You're right we are not there yet (yay?), but we seem to be moving towards that.
>It was much easier to fiddle around with the software with smartphones that Nokia used to have
Uh, as a young person during that time, I think there may have been one kid in my high school who was the right combination of nerdy and well-off to get a smartphone. It was still cool if your flip-phone came with any internet-connected features at all.
Is it really any more limiting? They can make programs in Scratch or other web apps on their phone. It's not "real" native software but it wasn't in the 1980's either. Kids then used BASIC interpreters. Compilers for "real" programming languages like C were unaffordable. Even assembly language and machine language were often inaccessible because computers didn't have an assembler or hex editor and you couldn't get new software like that without spending money.
You can’t use Scratch on your iPhone, and the built-in basic programming was far more “native” than any similar smartphone equivalent we have today. My TI-89 calculator remains way more programmable than my smartphone.
I’m not sure why it has become so common on HackerNews to defend this trend and/or pretend it’s not a trend.
Isn’t that the wrong comparison though? If you want to run scratch just use something other than an iPhone.
When I was a kid we had 8bit micros. They were programmable but they were also eye wateringly expensive. So it was a struggle to own anything you could program at all.
Also they all came with closed source, proprietary OSs.
I used to spend a fair amount of time just staring at adverts for things I couldn't afford and that would be obsolete before I could.
Now you can get a whole computer given away free on the front cover of a magazine and it will do a pretty good job of running scratch for you.
ML wasn't totally inaccessible by that metric, it was quite common to store ML in DATA statements and POKE it into the desired memory locations. Depending on the platform it would either be directly called with SYS or CALL type statements, or it'd be vectored through USR() (or perhaps the ampersand in Applesoft).
That's true. I was mainly thinking of a struggle I had trying to create a binary .com file containing ML. The BASIC had no function for writing an arbitrary binary file. In hindsight, I suppose you could use one of those ML techniques you mentioned to do file I/O but that would have seemed out of reach in difficulty to me as a teenager.
> Most of the people in my school didn't really care about computers
When I was in university in the Netherlands studying CS in the early 90s, many (I would say most) didn't care about computers; they were there because 'money later' or 'making games'. By far most never wrote a line of code. It was frightening (to me) how much they had to catch up as the lectures were fast paced. Many changed to psychology or law in the first 2 years.
I think there is another issue: at the time when most of us got involved with computers, they were devices on which almost anything could be done but hardly anything had been done yet. This attracted people who were explorer types, but also people who wanted something on the computer that wasn't there yet and so were pushed to develop it themselves.
Now we live in an age where multiple versions of nearly everything seem to be available. "Just buy what you want, don't learn to make it" would seem to be the imperative of the day.
You are right overall, but the smartphone does not inspire tinkering at any level. It feels closer to the TV to me than the computer. Computers used to be boxes which even non-developers would open up to upgrade RAM, HDDs, etc.
That encouraged tinkering to a high degree. I remember having lengthy talks with gamers or videographers about what components to use to build a PC. The more computers become closed boxes which cannot be opened, the less reason casual onlookers will have to even configure them.
The industry is simply not giving people a box to open.
> The industry is simply not giving people a box to open.
The industry is, it’s people who don’t want it.
The sealed boxes are some combination of more resistant to damage, longer lasting, and smaller in form factor than a modular box can be; the sealed software is less prone to malware and blue screens of death.
The cloud makes things work like an appliance, rather than having to worry about dying drives and backups and figuring out port forwarding on your router. It also helps companies extract maximum rents on their product, but the point is it came with some benefits that consumers do value.
I don't think the problem is curiosity around computers. Back in the day, computers were just a novelty and weren't yet essential to most jobs. It's fine if people didn't care then.
Nowadays however most white-collar jobs and daily life tasks involve computers at some point, and knowing how to use & program them would make these tasks more efficient - almost too efficient, to the point where it goes against corporate interests (it turns out society has normalized and encourages business models that bank on artificially created or maintained inefficiency), which is partly why general-purpose computing is on the decline now.
This looks spot on. As I understand it, what you're saying is that folks curious about computers shouldn't expect others in general to be curious about them too, despite their ubiquity?
> But back then, 1998 - 2006 ish, I feel the Internet was a lot more about experiments. I would read up on some software or hardware group all the time. LUGs were way popular. I used to be on chat rooms from MIT or other unis every other day.
> Nowadays most people I see in that age group are happy with a YouTube video on their smartphones. The general-purpose nature of computers is simply not visible.
Have a look at GNU/Linux phones recently developed:
The challenge with technology these days is that while the capabilities have gone in one direction, the culture has not necessarily gone in the direction many of us who came of age during an earlier era wish it might have. To some extent, I guess this is just a part of what happens when cutting edge things go mainstream -- no frontier lasts forever, sadly.
You can't compare your back-then self to the general population now and assume it's a good comparison between generations.
Plenty of your peers were not in those chat rooms. They did not interact with tech like you did, or did not interact with tech at all. They had different hobbies and interests, different habits than you.