In the early '90s, Apple had big plans for end-user programming. Larry Tesler, the head of Apple's Advanced Technology Group, gathered people from ATG and elsewhere at Apple who were working on aspects of end-user programming for a project code-named Family Farm. The idea was that most of the pieces needed for end-user programming were already working or close to it, and that with a few months of work they could integrate them and finish the job.
The project sputtered when (1) it became clear that it was going to take more than a few months, and (2) Tesler was largely absent after he turned his attention to trying to save the Newton project.
AppleScript was spun out of the Family Farm project, and William Cook's paper [1][pdf] includes some of the history, including its relationship to Family Farm and HyperTalk, HyperCard's scripting language.
AppleScript shifted the focus away from creating something that end users without previous programming experience could easily use, and toward application integration. I was a writer who worked on both Family Farm and AppleScript, and I was both surprised and hugely disappointed when AppleScript was declared finished when it was obviously not suitable for ordinary users. I assumed at the time that there had been no usability testing of AppleScript with ordinary users, but Cook's paper claims there was. All this was even more disappointing in light of the fact that so much more could have been learned from the success of HyperCard and HyperTalk, and that the HyperCard developer and champion Kevin Calhoun was just around the corner.
The Wikipedia article on HyperCard [2] gives the history of the product, including the pettiness of Steve Jobs in cancelling it.
While the HyperCard discontinuation in 2004 can be viewed as petty, that was quite late in its life. I think this article neglects many of the developer gifts that came back into the Apple fold via the NeXT merger, which Steve Jobs can legitimately take some credit for. Cocoa was a major development which is completely ignored here. Also, the evolution of the Worldwide Developers Conference and its supporting materials is not covered in the article either.
While it was great that the NeXT UI frameworks were made available to a wider audience via OS X and Cocoa, similar ideas were already mainstream in the Windows world via Visual Basic and Delphi, which were far more widespread in terms of bringing programming to the people.
I finally started writing AppleScript a few months ago when I realized GPT-4 can write it for me, which eliminated almost all of my previous frustrations with it: https://til.simonwillison.net/gpt3/chatgpt-applescript
I feel like scripting with non-standard languages (not Python, not JavaScript) is one of the killer features of GPT-4. It was trained on a large amount of this material and can spit out functional scripts for small tasks (which you can then chain together) incredibly quickly.
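As a small illustration of that chaining idea, a generated snippet can be run (and wired into a larger Python workflow) via osascript. This is just a minimal sketch; the one-line AppleScript here is a made-up example rather than anything from the linked post:

import subprocess

# A tiny AppleScript snippet of the kind a model might generate for you
script = 'tell application "Finder" to get name of every window'

# osascript is macOS's command-line runner for AppleScript
result = subprocess.run(["osascript", "-e", script],
                        capture_output=True, text=True, check=True)
print(result.stdout.strip())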
ChatGPT is bad at AutoHotkey (the Windows automation scripting language), which comes down to there being two major, largely incompatible versions of the language in circulation, and a dearth of code examples out on the Internet.
ChatGPT can even show you how to do this in Python BTW!
from ScriptingBridge import SBApplication  # macOS Scripting Bridge, via PyObjC

# Look up the Notes app by its bundle identifier
notes_app = SBApplication.applicationWithBundleIdentifier_("com.apple.Notes")

# Walk every folder and print each note's title and body
for folder in notes_app.folders():
    for note in folder.notes():
        print(f"{note.name()}\n{note.body()}\n")
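(For what it's worth, that snippet assumes PyObjC is installed; the ScriptingBridge module comes from the pyobjc-framework-ScriptingBridge package, if I remember correctly, and macOS will ask for permission to control Notes the first time it runs.)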
That's interesting, and would be doable for small scripts that I could fully understand. But I would be hesitant to use it for larger jobs, where it might end up misunderstanding me, and I wouldn't necessarily realize there was a mistake. AppleScript is very powerful, and I wouldn't want to risk it deleting/moving/editing files. Perhaps in the future, when people have used it for this purpose for years (without incident), I would trust it more.
> You can't write a BNF for it because its syntax depends on what program you're trying to control.
More importantly, AppleScript operates by sending messages to and receiving messages from those external programs, and thus has no insight into their current state/progress unless it's specifically communicated back to AppleScript.
Many programs would just return control to AppleScript immediately after getting a command, even if the operation requested took a lot of time. There was no way to register callbacks (at least no obvious way), so you had to put in boilerplate to check for a file at a location to know when a long-running process was complete.
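For anyone who never had the pleasure, the workaround amounted to a polling loop. Here's the same idea as a minimal Python sketch, purely to illustrate the pattern; the path, timeout, and interval are invented for the example:

import os
import time

def wait_for_file(path, timeout=300, interval=2):
    """Poll until `path` exists or the timeout expires."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(path):
            return True      # the long-running app finished and wrote its marker file
        time.sleep(interval) # no callbacks available, so just keep checking
    return False

# e.g. after telling some app to export a report:
# wait_for_file("/tmp/report_done.txt")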
The challenge level between writing any two scripts could vary wildly. A program's script dictionary might technically allow you to write a script but actually making it workable or reliable was fantastic amounts of effort.
My recollection of the English-like nature of AppleScript is that there would be multiple ways to express something, and more than one of them would work, but it always seemed totally unpredictable which would work and which wouldn't. That's actually a bit worse (more frustrating) than the way you're describing it. Overall, I agree with you that it was unsatisfying for both professional programmers and non-programmers, and that's why it failed to take off.
In the late '90s, I really wanted to learn how to make a computer game. Our family's only computer was some kind of Power Mac. I managed to find my way into a basic copy of CodeWarrior (because it looked like a programming tool based on a computer catalog). But we only had dial-up internet that had to be used with permission because it hogged the phone lines. And nothing built in to the manual really explained enough to do much besides some basic stuff involving printing characters to the screen, and some keyboard input. If the manual had any kind of tutorial sufficient to make snake or even just a really bad version of asteroids or galaga or something I'd probably have never put down the keyboard. But instead, I kind of lost interest because I assumed I was missing something you needed to make graphical programs. Another barrier was that what internet time I did have mostly led to Windows stuff. Not that I really knew how or what to look for.
The first program I was ever able to write that involved things actually moving on a screen came a couple of years later and was written on my graphing calculator, solely because the calculator came with a big manual that documented all kinds of functions you could use in a program. And so long as you were writing only a few pixels at a time, its built-in BASIC language was just barely good enough.
> And nothing built in to the manual really explained enough to do much besides some basic stuff involving printing characters to the screen, and some keyboard input. If the manual had any kind of tutorial sufficient to make snake or even just a really bad version of asteroids or galaga or something I'd probably have never put down the keyboard. But instead, I kind of lost interest because I assumed I was missing something you needed to make graphical programs.
This sounds a lot like myself as a kid around 1999/2000, after getting my parents' Performa tower as a hand-me-down.
I'd become aware that it was possible to build one's own programs, but exactly how was extremely murky. I grabbed various development tools mentioned on the internet, but as you note tutorials were nonexistent and IDEs, consoles, etc may as well have been ancient tablets written in an archaic language.
Finally I stumbled across REALBasic, I think maybe on a MacAddict shareware/demo disc, which was analogous to Visual Basic in the Windows world (though unlike VB, RB could compile binaries for both platforms). That was far more intuitive… just drag a button onto your window-canvas and double-click the button to edit its code. Examples were also much more common and easy to find on the internet and so before too long I was hitting the ceiling of what was possible in RB.
That segued nicely into Cocoa development when OS X rolled around, which brought Project Builder and a similar (but far more extensive) WYSIWYG UI builder in Interface Builder. That took a lot longer to become capable with, but led me to where I am today as a mobile dev.
Sounds very much like my experience, except that I was using Delphi. Non-existent documentation (complicated by me not speaking English at the time), no friends or relatives using computers for anything other than office documents (and very few having or using them at all), no access to the internet, and a friendly neighborhood software store that sold "pirated" copies of Delphi 7 for something like $2. That was an excellent environment to start in, precisely for the reasons you described: double-clicking on any element created an event handler for its "main event", and you could take it from there.
But: Object Pascal/Delphi is a quite powerful language and you can write really complicated stuff with it, compiled into a relatively small statically linked executable that worked just about everywhere without the need to install anything. (300-400 KB binaries seemed large at the time; little did I know what awaited us 20 years down the line.)
The need to dig and learn everything yourself definitely had some impact and provided some lower level knowledge of things many newer developers don't seem to know or care about, but I very much prefer what we have today where you're not expected to learn to read what looks like ancient Egyptian from scratch to write a basic calculator. Good riddance to those times.
Sounds like the opposite of my experience with Java in 1996. I had some great thick book on it (as was the style at the time), and after chapters on language basics, streams, collections, etc, it quickly got on to GUI programming with AWT. AWT is pretty horrible in many ways, but it did make it very easy for a teenage novice to put up a window with some buttons in it, and start drawing lines and text.
I immediately started working on an ambitious space battle game. I got to the point where you could click to set waypoints for a ship, and it would try to fly a course through them using Newtonian physics, but the navigation was incredibly naive and always just accelerated towards the next waypoint, so eventually the ship was going so fast it crossed over the waypoint in a single time step, and would turn round and come back to it, inevitably missing it again, and so basically going into perpetual powered orbit around it. I then abandoned the project.
REAL Basic was actually an amazing program. Easily throw together a cross-platform GUI and one click to create a self-contained executable. Remarkable that we struggle with this in modern stacks.
I also got a strong interest in programming around 1998/1999 and also had only a Mac at home. I emailed the guy who authored a chess application (his email was in the splash screen) over our dial-up connection. He told me I would need to learn C or C++ and I had no idea how to act on that (I was 8 or 9 at the time, living in a very rural area).
4 or 5 years later I got my own laptop and installed Linux on it. Then I started learning bash and tried (and failed) to install Linux From Scratch. That did get me to a point where I could program.
This is my story too. I remember playing with BASIC on an Apple II and how it was a revelation, but then a big void in the '90s up until a graphing calculator in high school. I was so frustrated; programming seemed a parallel world I had no idea how to reach but was deeply attracted to. I had fun with HyperCard and ResEdit but I knew I was just scratching the surface. When I finally grabbed a copy of CodeWarrior it was way too complex for me, and console-like C programs too limited. If only something like Unity had existed back then...
Turbo Pascal, Delphi, QuickBasic (not so much QBasic), and Visual Basic had amazing built-in documentation for learning how to do things in a pre-internet way. That's what got me started.
For me it was learning to ROM-hack using hex editors, because that was the only information available online in my native language, mostly from people translating NES/SNES/Mega Drive/Game Boy ROMs.
I only got into actual programming several years later, by joining a computer course during high school.
Shortcuts is one of those rare products that has been acquired and not only hasn't been ruined, but has improved tremendously in its first- and third-party app integrations.
I was initially turned off by the visual programming style because it was mind-bending. Now I use it a lot because it is mind-bending and flexes my brain while also being highly useful.
If for some reason I don’t want mind-bending, I can hop into Pythonista and get stuff done. Or, thanks to the tight integration, integrate it with Shortcuts.
As a programmer, I hate Shortcuts. I really can’t believe 10 instructions can take multiple seconds. I can’t believe Apple decided this product is ready for prime time.
I feel like this article undersells Hypercard. From my perspective as a young teen BBS kid in the early '90s, everyone and their mother (literally) were making Hypercard stacks. A friend of mine in 9th grade made a Spanish flash card quiz game that got a ton of downloads. My dentist made a stack that tracked patient records and visits and allowed you to click on a tooth on a diagram and enter data about it. There were games, both amateur and professional (MYST is the most famous example). Hypercard really did seem like programming for the people.
By comparison, the early web was much weaker visually, interactively, and programmatically, but the interconnectivity of the web was the killer feature and eventually, the web became the programming platform of the people.
But man, it was really easy to get up and going with something quickly in HyperCard, and so many of the paradigms that modern web developers are familiar with today were in place in the late '80s with HyperCard: components, components inside components, components sending and listening to events, programming individual components' behaviors, etc.
My high school had so many classrooms running everything with HyperCard. The most intriguing were decks connected to VCRs to play videos. My computer science teacher had his laserdisc player connected to some of his decks. It was amazing what HyperCard was.
I had a Mac from around 1993 to 1997. I even wrote articles for minor league Mac journals.
My impression is that Microsoft was founded to bring programming to the people, and the Mac was created to bring a great software experience, which I appreciate. But Apple (represented in my mind by Steve Jobs) didn't want his platform to be flooded by crappy software. Even HyperCard never had his wholehearted support (I used it extensively, despite its odd and limited language), and I think Apple was glad to finally dump it.
If you had a reason to write crappy software, as I did, you ended up with Visual Basic. What I mean by "crappy" is software that solves a problem reliably, but can't be scaled. For instance, installation might require help from the author, and entering a bad value might make it crash, but a small handful of tolerant users can handle these issues.
The solution today, with open source, is that the people bring programming to the people.
Open source is completely impenetrable to most of the population.
Between an impressive list of incompatible languages, the vagaries of Git, the complexities of command line access, the reliance on some version of the Unix folder structure - with endless confusion between /etc, /local/etc/, /usr/local/etc and on and on, the weirdnesses of package managers, and completely unfamiliar editors like Emacs and VSCode, the scene might as well be proprietary.
VB at least made it possible to write software in a reasonably accessible and integrated way. So did HyperCard. So does Excel.
The real problem with Automator, AppleScript, etc, is that they're hopelessly underdocumented and the tools are mediocre.
If Apple had made them more prominent and documented them thoroughly they'd have been far more successful - perhaps in an industry-changing way.
But they seem more like side projects that someone spent a lot of time and a little money on. Clearly Apple was never seriously interested in them.
That is such hyperbole. I have no formal IT background and I taught myself the Linux command line, Python, and Emacs in the course of a single year after I heard about this new Linux thing that was free to install. This was 2000, and from communities like Slashdot I saw there were plenty of other nerdy young people taking up the CLI and other Linux stuff, just out of curiosity and without any connection with the IT industry.
Sure, none of this software will ever appeal to the broad masses*, but since the 1990s programming has become so much more accessible to the general public, because of both the free-of-charge software and the free-of-charge documentation. Learning to program on earlier platforms could be very expensive.
* Still, I have heard rumors of a corporation in the 1980s where the secretarial staff – middle-aged women with no formal computer-science education – wrote some custom Emacs Lisp and/or Awk. Maybe learning even relatively arcane stuff can be done by anyone if their salary depends on it.
I would still disagree. 80s home computers pretty much all came with BASIC built-in, up to the PCs with QBASIC. The documentation was right there, and you had everything you needed to just boot up and start programming. Many even booted up within the programming language by default.
And at the time, most programs were text-mode. I/O was only keyboard, screen, and disk (and maybe mouse, joystick, and serial port if you wanted to get fancy). When your program started you had full control of the entire system. There was no multitasking, no 100+ processes running in the background, no GUI architecture layered over an OS subsystem. So you only needed to learn a very, very few things to be able to build software roughly equivalent to professional store-bought software. Simple logic and data structures and 3-6 types of I/O, and that was it.
From there it was a short step to something more powerful like Turbo Pascal or Visual Basic, again both of which were pretty much "batteries included" everything you need in one tool.
Today's landscape, with its enormous stacks, build tools, Cambrian explosion of languages, frameworks, and libraries, layers upon layers of abstraction and indirection between you and the machine, plus countless dependencies, package managers, containers, etc., all of which are constantly changing and shifting out from under you, is baffling even to professional developers.
Luckily, it is easier to find documentation, tutorials, and other actual people to ask about stuff. But also a lot of that is quickly outdated, faddish and evangelical, just assumes that you already know 500 other things, or assumes that you know absolutely nothing, so only covers the barest basics.
So it's really still more difficult than just picking up a book about C, QBASIC, Turbo Pascal, or Visual Basic and getting started was back in the day.
I've mostly self taught in programming since around 1981, and I've helped several friends and colleagues learn. My impression is that it's more difficult, like you say, but not prohibitively so.
People just instinctively steer clear of the "professional" tools and documentation, and choose their battles. Even well into the Windows 3.1 era, a lot of people who programmed stuck with MS-DOS text mode. Today, our code runs in Jupyter notebooks. We get stuff done. If it needs a framework or a container, we just don't go there.
There's a mild suspicion amongst amateurs, that the professionals are creating the complexity for their own entertainment or job security. It doesn't seem to make the software better (most "users" think that software is getting worse and worse), or less costly to write and maintain.
To put it more charitably, the struggles of commercial programmers are not invisible to us. I work in a department adjacent to a large programming team. By not attempting to write commercial-scale software, we avoid many if not most of the headaches.
There was at least one popular home computer which was never just text mode but booted into a permanent bitmapped graphics mode, directly into a BASIC line editor. Entering a command would execute it immediately; entering it with a line number in front would add that line to the program.
So right after boot, a single command could draw a line or a circle. One was not even supposed to type the letters of a command, but had to enter it like on a pocket calculator: the commands were printed on the keyboard, in different colors, so just looking at the keyboard one directly saw all the available commands.
The barrier there was learning to enter the mode needed to reach the appropriate color, and learning the syntax and the meaning of the commands.
Every user learned to enter at least one command and its syntax: the one which allowed loading any program from the tape. It was a full command with the empty string parameter meaning "match any name": LOAD ""
People are managing to cobble Python/Jupyter installations together using online tutorials, writing, and sharing useful code. The open source community brought us those tools. Today, all of the "top" programming languages have open-source toolchains.
I agree about the portability of VB, HC, and Excel code, including spreadsheets. At least, VB6. ;-) But people are managing with the more fragmented ecosystems of open source tools such as Python toolchains.
Interestingly tools like ChatGPT can make some of these slept-on tools more usable. "Write me a script using __ that sends me notifications whenever __ happens"
> My impression is that Microsoft was founded to bring programming to the people, and the Mac was founded to bring great software experience, which I appreciate.
Apple did develop a native version of Basic that let you create Basic programs that took advantage of the Mac UI, but Microsoft forced them to cancel it as a condition for continuing to license their Basic for the Apple II.
> Apple's original deal with Microsoft for licensing Applesoft Basic had a term of eight years, and it was due to expire in September 1985. Apple still depended on the Apple II for the lion's share of its revenues, and it would be difficult to replace Microsoft Basic without fragmenting the software base.
> Bill Gates had Apple in a tight squeeze, and, in an early display of his ruthless business acumen, he exploited it to the hilt. He knew that Donn's Basic was way ahead of Microsoft's, so, as a condition for agreeing to renew Applesoft, he demanded that Apple abandon MacBasic, buying it from Apple for the price of $1, and then burying it.
> He also used the renewal of Applesoft, which would be obsolete in just a year or two as the Mac displaced the Apple II, to get a perpetual license to the Macintosh user interface, in what probably was the single worst deal in Apple's history, executed by John Sculley in November 1985.
It seems like you're reacting to the phrase "bring programming to the people" occurring in the same sentence as Microsoft. I don't think GP was trying to present Bill Gates as having been on some kind of altruistic moral crusade. I think the point GP was making is that Microsoft was founded on compilers for hobbyists, whereas Apple was thinking about end user experience.
Maybe the phrase "to the people" is confusing things because it suggests some kind of noble, high-minded motivations on the part of Gates. That wasn't how Gates (or Jobs) thought. Actually, the whole idea of a company having some social responsibility mission was not something that existed at that time. These guys wanted to sell a product that people would pay money for, that's all. In that era, this was perfectly socially acceptable and completely normal.
The statement "Microsoft was founded to bring programming to the people" is true, but it may be a little unclear if people are reading it as suggesting something other than the profit motive. However, for a long time, Microsoft really did view its own success as dependent on providing customers ways to write and sell their own software. That is in fact empowering to the customer. I'm pretty sure Gates did care about this because it was what was driving sales of his products.
> They created a Mac native Basic at launch, and only Microsoft's anticompetitive arm twisting kept it from being released.
Wow, that almost makes me want to cry.
I discovered personal computers when the computer basically felt like a programming language.
I might be wrong, but if the practice of shipping a friendly language installed on every computer out of the box had held on, the evolution of “people’s” languages might have been a huge thing to this day.
Not disagreeing with your point about MS, but it's not a conspiracy theory that Macintosh dev tools were nonexistent. Apple's solution was "buy a Lisa". (well, maybe they considered this a 3rd party opportunity.) No indication they even wanted homegrown BASIC type stuff.
No, not at all, the folklore story doesn't even address Apple's strategy. Their developer relations were "Apply Here" back then. Macintosh shipped with zero developer tooling, not C/C++, not Pascal, not BASIC.
And, we have no idea what Donn's BASIC was actually like ... it probably wasn't "Visual Basic", or even a platform. Just better than Microsoft's halfazz port of their micro basic. (which did have some quickdraw commands.)
If you'd like to see what Donn's BASIC was like, you can! The Internet Archive has a version that runs in-browser [1].
Although the Folklore article says there were two books describing Donn's BASIC, I believe there were at least three: Introduction to Macintosh BASIC, Using Macintosh BASIC, and The Macintosh BASIC Handbook. All three are available at vintageapple.org [2].
> we have no idea what Donn's BASIC was actually like
We literally know exactly what it was like. Dartmouth was already using a late beta of the software for a Basic class when Microsoft forced Apple to cancel the release of the final version.
From that story it sounds like MacBasic wasn’t a priority at all for Apple, even on the Mac team. The fact that they let MacBasic slip from the launch window is a pretty big red flag. I’d be interested to hear why Bryan Stearns decided to leave (I feel the author is saying a lot by not touching on this more).
I think the idea that Microsoft or Apple were founded for any reason but money is pretty much a joke. Every action and reaction in the story revolves around money and how to get it.
I suspect it might have been a timing/market positioning thing.
By 1984, "boots to BASIC" was reserved for low-end kit: C64, Atari XL, and increasingly Apple II. The IBM PC and clones were rapidly defining what a high-end personal computer was, and for them, the BASIC was a useless appendix (Cassette BASIC on IBM kit) or a bullet point at the back of the pack-in software list (BASICA/GW-BASIC)-- it might be there, but commercial boxed software was the primary selling point.
> Gates and others at Microsoft had gone on the record repeatedly saying they intended for Windows and the Macintosh to be sufficiently similar that they and other software developers would be able to port applications in short order between the two. Few prospects could have sounded less appealing to Sculley. Apple, whose products then as now enjoyed the highest profit margins in the industry thanks to their allure as computing's hippest luxury brand, could see their whole business model undone by the appearance of cheap commodity clones that had been transformed by the addition of Windows into Mac-alikes. Of course, one look at Windows as it actually existed in 1985 could have disabused Sculley of the notion that it was likely to win any converts among people who had so much as glanced at MacOS. Still, he wasn't happy about the idea of the Macintosh losing its status, now or in the future, as the only GUI environment that could serve as a true, comprehensive solution to all of one's computing needs. So, within weeks of Jobs's departure, feeling his oats after having so thoroughly cowed Digital Research, he threatened to sue Microsoft as well for copying the "look and feel" of the Macintosh in Windows.
> He really ought to have thought things through a bit more before doing so. Threatening Bill Gates was always a dangerous game to play, and it was sheer folly when Gates had the upper hand, as he largely did now. Apple was at their lowest ebb of the 1980s when they tried to tell Microsoft that Windows would have to be cancelled or radically redesigned to excise any and all similarities to the Macintosh. Sales of the Mac had fallen to some 20,000 units per month, about one-fifth of Apple's pre-launch projections for this point. The stream of early adopters with sufficient disposable income to afford the pricey gadget had ebbed away, and other potential buyers had started asking what you could really do with a Macintosh that justified paying two or three times as much for it as for an equivalent MS-DOS-based computer. Aldus PageMaker, the first desktop-publishing package for the Mac, had been released the previous summer, and would eventually go down in history as the product that, when combined with the Apple LaserWriter printer, saved the platform by providing a usage scenario that ugly old MS-DOS clearly, obviously couldn't duplicate. But the desktop-publishing revolution would take time to show its full import. In the meantime, Apple was hard-pressed, and needed Microsoft -- one of the few major publishers of business software actively supporting the Mac -- far too badly to go around issuing threats to them.
> Gates responded to Sculley's threat with several of his own. If Sculley followed through with a lawsuit, Gates said, he'd stop all work at Microsoft on applications for the Macintosh and withdraw those that were already on store shelves, treating business computing henceforward as exactly the zero-sum game which he had never believed it to be in the past. This was a particularly potent threat in light of Microsoft's new Excel spreadsheet, which had just been released to rave reviews and already looked likely to join PageMaker as the leading light among the second generation of Mac applications. In light of the machine's marketplace travails, Apple was in no position to toss aside a sales driver like that one, the first piece of everyday Mac business software that was not just as good as but in many ways quite clearly better than equivalent offerings for MS-DOS. Yet Gates wouldn't stop there. He would also, he said, refuse to renew Apple's license to use Microsoft's BASIC on their Apple II line of computers. This was a serious threat indeed, given that the aged Apple II line was the only thing keeping Apple as a whole afloat as the newer, sexier Macintosh foundered. Duly chastised, Apple backed down quickly -- whereupon Gates, smelling blood in the water, pressed his advantage relentlessly, determined to see what else he could get out of finishing the fight Sculley had so foolishly begun.
> One ongoing source of frustration between the two companies, dating back well into the days of Steve Jobs's power and glory, was the version of BASIC for the Mac which Microsoft had made available for purchase on the day the machine first shipped. In the eyes of Apple and most of their customers, the mere fact of its existence on a platform that wasn't replete with accessible programming environments was its only virtue. In practice, it didn't work all that differently from Microsoft's Apple II BASIC, offering almost no access to the very things which made the Macintosh the Macintosh, like menus, windows, and dialogs. A second release a year later had improved matters somewhat, but nowhere near enough in most people's view. So, Apple had started work on a BASIC of their own, to be called simply MacBASIC, to supersede Microsoft's. Microsoft BASIC for the Macintosh was hardly a major pillar of his company's finances, but Bill Gates was nevertheless bothered inordinately by the prospect of it being cast aside. "Essentially, since Microsoft started their company with BASIC, they felt proprietary towards it," speculates Andy Hertzfeld, one of the most important of the Macintosh software engineers. "They felt threatened by Apple's BASIC, which was a considerably better implementation than theirs." Gates said that Apple would have to kill their own version of BASIC and -- just to add salt to the wound -- sign over the name "MacBASIC" to Microsoft if they wished to retain the latter's services as a Mac application developer and retain Microsoft BASIC on the Apple II.
> And that wasn't even the worst form taken by Gates's escalation. Apple would also have to sign what amounted to a surrender document, granting Microsoft the right to create "derivative works of the visual displays generated by Apple's Lisa and Macintosh graphic-user-interface programs." The specific "derivative works" covered by the agreement were the user interfaces already found in Microsoft Windows for MS-DOS and five Microsoft applications for the Macintosh, including Word and Excel. The agreement provided Microsoft with nothing less than a "non-exclusive, worldwide, royalty-free, perpetual, non-transferable license to use those derivative works in present and future software programs, and to license them to and through third parties for use in their software programs." In return, Microsoft would promise only to support Word and Excel on the Mac until October 1, 1986 -- something they would certainly have done anyway. Gates was making another of those deviously brilliant tactical moves that were already establishing his reputation as the computer industry's most infamous villain. Rather than denying that a "visual display" could fall under the domain of copyright, as many might have been tempted to do, he would rather affirm the possibility while getting Apple to grant Microsoft an explicit exception to being bound by it. Thus Apple -- or, for that matter, Microsoft -- could continue to sue MacOS's -- and potentially Windows's -- competitors out of existence while Windows trundled on unmolested.
> Sculley called together his management team to discuss what to do about this Apple threat against Microsoft that had suddenly boomeranged into a Microsoft threat against Apple. Most at the meeting insisted that Gates had to be bluffing, that he would never cut off several extant revenue streams just to spite Apple and support this long-overdue Windows product of his which had been an industry laughingstock for so long. But Sculley wasn't sure; he kept coming back to the fact that Microsoft could undoubtedly survive without Apple, but Apple might not be able to survive without Microsoft -- at least not right now, given the Mac's current travails.
You're exactly right that Apple has always had an ambivalent relationship with software developers. On the one hand, they needed them for their products to have any value for consumers. On the other hand, Jobs clearly saw low-quality, third party software as something that could taint the product Apple was selling. They really saw app developers as something very dangerous and difficult to control. Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.
I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.
>Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.
> I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.
That is wrong on so many levels. I grew up in a Windows environment in the '90s to '00s and had the best video games in the world at my fingertips, while Mac users at the time had what, some high-quality photo editing software? Good for them I guess.
So unless you don't count video games as software, Gates was right. DOS and Windows being the main platforms for the hottest video games of the moment was what led to the popularity of the PC over Macs at the time. I think Doom alone was responsible for millions of PC and MS-DOS sales.
Yeah, Jobs's NeXT machines and software were cool, UNIXy and all, but to what end if nobody but the wealthiest businesses and institutions bought them, in numbers you could count on a few hands? In fact, Jobs understood from the failure of NeXT and the success of the DOS PC that you need to get your devices cheaply into the hands and homes of as many casual consumers as possible (hence the success of the colorful and affordable iMac G3) and forget the premium business workstation market.
I wrote the comment and I too grew up in the MS-DOS/Windows world. My family had a Mac and an Apple ][e, but I didn't use them as much as the various PCs.
As I say in another comment, I think Gates's view was right for that time. The Jobs view was needlessly limiting the potential market for his machines. Gates didn't make the machine, but he was making a thing that was sold with every machine, and he understood that more apps means more reasons why people might want to buy a PC.
One problem with the Gates view today is that if you make it easy for anyone to write and distribute any kind of app, you end up with not only a lot of low-quality software but even negative-quality software like ransomware and malware. It's surprising that so many people want to write that stuff, but apparently they do. Every modern platform, including Windows, has gone to extreme lengths to block this stuff, even though the more laissez-faire approach would be to just ignore it and assume other 3rd parties will write programs to block it. The problem was that malware was an existential threat to the entire Windows platform, and Microsoft couldn't afford to just let the market sort that one out.
I believe the Jobs view is the correct one for the present era. Every platform has app stores, code signing, DRM, and other kinds of security and non-security restrictions. It's certainly easier to run random code on Windows 10 than macOS Ventura (or especially iOS), but no platform vendor is completely app neutral these days.
I agree with you 100%. I think maybe the problem is that the platform vendors think it's too difficult to explain fine-grained access controls to end users, whereas an app store is dead simple to explain. And, as you observe, an app store makes money whereas ACLs do not.
Indeed, I think Jobs's approach to software development was too far ahead of its time; Gates's pragmatic approach proved to have more leverage for computer/platform sales and growth.
In the present era, running random code is trivial - JavaScript, via a web browser. It runs in a sandbox, which limits what it can do, but it's a really big sandbox and you can do really useful things within it.
Apple has always gone for the premium end of the market, and with vastly increasing wealth inequality, that's where the money is these days. You can see this is in other luxury industries which make incredible amounts of money considering how small their markets are (in terms of numbers of participants).
This focus on high-quality software has also encouraged _better_ software developers to build in Apple's ecosystem. Even today a lot of the best software is only available for Mac or iPhone.
I agree with FirmwareBurner's sibling. All personal computers were expensive luxury items at that time. Most of them cost as much or more than a used car, which people would have considered much more useful than a computer.
Apple's machines were the most expensive, but it wasn't because they were higher quality (that was not the general perception outside the Apple world). It was because Apple refused to allow clone makers into the market, so they were manufacturing in tiny volumes compared to Dell and Compaq and a million other random names like Packard Bell (Sears's store-brand PC) that probably no one remembers.
> Even today a lot of the best software is only available for Mac or iPhone.
I really don't see that at this point, and I do use my Mac every day. Most of the software I use is also available on Windows and Linux, and anything that isn't has a clear Windows or Linux equivalent.
The only software I use that is absolutely only available on a single platform with no equivalent is actually Windows software. I'm not a gamer, but that's apparently a whole category that is Windows-only.
I'm curious what Mac software you use in 2023 that is only available on Mac.
> I'm curious what Mac software you use in 2023 that is only available on Mac.
Ulysses, Bear, Things, Reeder, Mela, Alfred, MindNode, just to name a few. These apps are incredibly well designed, and have no equivalents in the Windows or Linux worlds. I know because I also have a PC and I’ve looked very hard for replacements.
Additionally, apps like 1Password, Scrivener, iA Writer, and Arc Browser started their life on the Mac. Some social media networks, like Instagram and VSCO, were iPhone apps before they released on other platforms. Because Apple users are generally not averse to spending money on software, all the really good software reaches the Mac and iPhone a long time before it becomes available on other platforms.
iCloud itself is something that no other platform can compete with. I never have to create a separate account for any app in the Apple ecosystem because they just sync using iCloud without me doing any extra work. When I install them on my phone later, my data is already there. The Windows/Android world have no equivalent of this.
Apps really are better in the Apple world. I blame Microsoft for underinvesting in native Windows UI toolkits.
> I'm not a gamer, but that's apparently a whole category that is Windows-only.
Not anymore, really. Proton has done an amazing job at making the huge library of Windows games out there work on Linux, so at this point it's a pretty good contender. (Hence the popularity of the Steam Deck.)
We were talking about the past. And back in those days when computers were expensive, most people and companies were writing SW for the most popular platform out there, which at the time was whatever had the best bang for the buck that home users could afford: Commodore, Sinclair, Amiga, MS-DOS, etc.
>Even today a lot of the best software is only available for Mac or iPhone.
Again, we are talking about the past. TikTok might be the hottest app today, but back then it was Doom.
I see a lot of poor people with iPhones, Apple Watches, EarPods and such. These are what you'd call an "affordable luxury", and probably a bargain when you consider you might go through five $300 Android devices in the time you get out of a $1000 iPhone, struggling with the Android all that time.
But the psychology is weirder in that rich people know a "luxury" watch costs upwards of $10k, so it is quite the trick to convince the hoi polloi that a $500 watch is a luxury device at the same time.
I've noticed also that poor folks are also very aware of expensive wireless plans from Verizon but seem to have little awareness of wired internet plans, free Wi-Fi, etc.
... and will your Android actually charge when you plug it into a charger?
I've had roughly five Android devices (tablets and phones) that had the curious problem that I'd plug them into a charger and it was always 50-50 whether they would actually charge. This would happen when the devices were new and starting from zero (often it would take several tries to charge a device for the first time) and still happens when the devices are in regular use. All the time I leave one plugged into the charger overnight and... nothing. Plug it into the same charger again and maybe it charges.
I've noticed the same problem w/ Sony devices both cameras and the PS Vita if they are used with non-Sony chargers.
Not if they are kernel-related, and only since APEX became a thing in Android 10.
However, it doesn't change the main point, most phone users care about updates on their phone as much as they care about updating any other device with a CPU in their home.
I meant that most vulnerabilities in the wild that pose a threat to users and end up actually being exploited by malware, are the ones usually covered by updating Google Play services.
Yes, kernel vulns will always be uncovered, but how often does the average user encounter malware that exploits that attack vector?
The biggest attack vectors on Android are Google Play services and your browser. Keep those up to date and you're good even with older phones.
Jobs' view sounds essentially the same as Nintendo's view in the wake of the video game crash, which eventually became the unanimous model in that industry. With iOS they also got to enforce it like a console, with a whitelist of acceptable software.
I was so used to this view by Nintendo that it was hard to believe seeing a lot of really low-quality games in the Nintendo Switch store nowadays. The worst part is that it's still not easy to publish there (or get access to the SDK in the first place) in an independent way, without a publisher or a history of already successfully released games.
Yeah, I think a big part of how Gates and others came into their view was that there were so many examples like Nintendo where a platform vendor had artificially limited the third party ecosystem and it constrained the success of the platform as a whole.
Basically, the Gates view was the more "correct" one for the 1980s/1990s. At that time, most people did not own personal computers, even many businesses didn't, and growth came from on-boarding these non-users. The more apps available on your platform, the more likely a non-user would be to see a reason to buy a computer at all. Also, the platform Gates was selling was much cheaper to try.
Today, everyone owns at least one computing device. Platforms aren't competing for non-users (there are none left), they are competing for users of other platforms. The Jobs view works better in that world since it improves platform quality perceptions.
By the way, there is no particular reason why the computer field had to go in this direction. If you look at the history of aviation, there were a lot of kit airplanes in the 1920s and 1930s because people thought that over time more and more people were going to want to own an airplane. That turned out to be wrong. Most people are content to pay somebody else to transport them.
SaaS is the computing equivalent of an airline, and it's very popular, but it hasn't fully displaced individual computer ownership. We'll see what happens long term.
The writing is on the wall though, isn't it? Even companies that you'd expect to go all-in on user freedom and ownership have locked down their platforms and ratcheted up the service offerings. Not a single FAANG company is willing to take a definitive stand in defense of their users. They all just see a bottom line waiting to be butchered.
And for most users, I'd wager SaaS has displaced computer ownership. Stuff like the App Store, Play Store and even game consoles are ultimately a service unto themselves. Even disregarding that, the "happy path" of modern software is paved far away from user empowerment. People are generally more willing to pay for a subscription than to switch to a more convenient hardware platform.
I'm really not sure. Five years ago, I thought that SaaS would completely replace local computing. Now, I'm less certain.
Yes, SaaS is very popular and there are far fewer local applications now. However, there still seems to be some value in local applications because it's thought to be a better end-user experience. Look at Slack. It started out as a website, pure browser-based SaaS. Over time they developed a local app, which is basically the same JavaScript but packaged up as a local Electron app. This seems to be regarded as superior to the web version by most people.
Consider Zoom. There was initially a native client and an equivalent web version. They seem to be actually killing off the web version now -- access to it is a non-default option for the meeting creator, buried on a screen with 100+ meeting settings nobody understands or reads. They apparently want to be a local-only program.
As long as there is demand for local apps, people will continue owning general purpose, re-programmable microcomputers. Maybe the vendor can lock you out of some low-level stuff like Intel ME or the BIOS, but these platforms are still allowing people to download a compiler and run their own programs directly on the metal.
I'm not sure what direction it will ultimately go, but my hunch is that if it were possible for SaaS to 100% displace local computing, it would have already happened.
Perfectly fair response. I think your examples are also good evidence that people like native software, or at least the appearance of a cohesive platform. Apparently "non-native" apps like an SPA or WASM tool will probably turn most normal users off. Good enough native support can make a web client redundant, even if 'native' means Javascript with OS-native controls.
To counter that though, I think we have to define SaaS concretely and ask what a more service-driven world would look like. I consider SaaS to not only be the paid components of an OS, but also the "free" services like Google and the App Store. Holistically speaking, our OSes are saturated with services; Find My, Android Notify services, OCSP/Microsoft/Google telemetry, online certificate signing, and even 'assistants' that fail to assist without internet. What more could we possibly put online?
It's a bit of a rhetorical conclusion to leave things on. I'll certainly see worse examples in my lifetime, but the status quo is bad enough in my opinion.
I agree with you about the absurd degree of service saturation. I've discovered many things I cannot do without an Internet connection because one trivial step in the process absolutely requires communication with some network endpoint. I found one app where, if you block its ability to send analytics reports back to Mixpanel, it will run 3 or 5 times but after that refuse to run again until an Internet connection is available. I thought it was absurd, since it proves the developers actually considered offline use cases but decided that they couldn't allow that to go on indefinitely.
Anyway, sure, let's leave it there and see where things are in 5 years. It'll be interesting to find out!
Only if the users can be satisfied by the supposedly higher-quality software, produced by devs willing to pay the Apple tax and unafraid of being Sherlocked.
There are only two smartphone platforms, and both of them have app stores and make other efforts to control the overall end user experience. What do you disagree with?
> Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.
Gates's view was also that, once the market had sorted the good from the bad, Microsoft would, if desired, just implement whatever killer apps third party developers had discovered and make them part of Windows, cutting the floor out from under the third party developers. In other words, Gates viewed third party developers as doing market research for Microsoft that Microsoft didn't have to pay for.
> ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform
Windows is still well ahead of both macOS and iOS in terms of market share, so I think it's more that Jobs and Gates had different definitions of success.
Yep, and Apple does this too. Spotlight in macOS (the OS-wide search behind the little magnifying glass in the top right) was a feature that Apple blatantly ripped off from a popular early-2000s third-party extension. Adding it to the OS killed that company overnight. I forget their name, but someone else might remember.
All platform vendors seem to do this, which is unfortunate.
Twice: The usage of a giant text input box as a launcher or command palette was pioneered on the Mac by LaunchBar and Quicksilver and later Alfred, etc. Spotlight for a long time was only a menu extra.
> Gates's view was also that, once the market had sorted the good from the bad, Microsoft would, if desired, just implement whatever killer apps third party developers had discovered and make them part of Windows, cutting the floor out from under the third party developers. In other words, Gates viewed third party developers as doing market research for Microsoft that Microsoft didn't have to pay for.
Reminds me of how Amazon sees third-party sellers.
Maybe that is the standard abusive strategy of platforms that hold a dominant market position.
> Jobs clearly saw low-quality, third party software as something that could taint the product Apple was selling.
I mostly disagree with this; so long as the low-quality software isn't pushed by the platform, it's not that much of a danger to the reputation. Lack of quality software is a much larger danger, and it's not easy to tell ahead of time where the quality will come from. The big problem is when you are running your own app store and it becomes packed with garbage.
It feels like they eventually managed to enforce this when the App Store came along, rejecting “useless” or “redundant” apps. Eventually they gave up on that idea because they loved seeing the number of apps rise, even if those apps were, in the end, low-quality software.
If they didn't have hidden first-party API stuff that third parties have difficulty leveraging, this moniker would feel more useful/relevant, IMO. In the current context they use these labels to protect their workflow from other entities.
When I set up computers for family members I've used things like Windows 10 S mode BECAUSE it limits the software that can be installed. They are too low-skill to know which applications to trust.
At my work, I similarly have gotten secure administrative workstations deployed with again, very limited lists of software. These users being compromised has such extreme consequences that one cannot trust any application to run on these computers.
So I can certainly appreciate the need to be very exclusive with the software allowed on a device. Yet I see this exclusivity being rather anti-consumer in practice. iOS is the most secure operating system I've ever seen for the average person, and yet things like adblocking are behind a paywall. Neutering Adblock for the sake of security is considered acceptable. Blocking alternatives to WebKit may reduce the overall attack surface area Apple contends with and it also limits the functionality of the phone since v8 is the de facto standard. Blocking alternatives to the App Store absolutely enhances security while also forcing consumers to pay more. Then you get into things which have nothing to do with security truly, like how you can only make "Music" your default music player on iOS, not Spotify.
I really appreciate what Windows did with "S mode" because it's optional yet provides benefits to consumers in niche situations. I similarly appreciate the enterprise features of Windows to configure a laptop which has limited access to software to enhance security. Apple on the other hand forces the exclusive use of their software to benefit themselves at the expense of their customers. It is unacceptable and I would vote for anybody pledging to outlaw it as monopolistic abuse. Software exclusivity is useful, shoving it down people's throats is abhorrent.
Personally I’m not annoyed at low quality software - I just avoid it. What I draw a bright red line at is intrusive/abusive software (essentially malware). I’m glad apple enforces a standard with regard to software behavior that is acceptable versus not.
The 128K Mac that came out in 1984 might have been the first microcomputer that didn't come with some kind of BASIC, machine language monitor, or other tools for developing software. That is, the point of every micro up until then was that you were going to program it yourself. In some sense it was a real step backwards, people losing control of their computers, and one that Apple has continued with iDevices.
At first you could not do any software development for a Mac at all on a Mac; instead, you were expected to get a Lisa if you wanted to develop for the Mac. Part of that was that even though the Mac had twice the memory of most computers at that time, the GUI ate up a lot of it, so you could just barely run the bundled MacWrite and MacPaint. Part of the reason Jobs insisted on keeping the Mac monochrome until the 1987 Mac II was to conserve memory. (When it did come out, the Mac II was priced closer to a Sun workstation than even an IBM PC AT.)
I remember that, it's why I didn't get a Mac. Although it was a few years later for me.
The only computer that I'd used up until then was an Apple II that had been donated to our school. I managed to talk my parents into buying one, and they took me to the store but I couldn't find one. Asked the salesman and he said they didn't sell them anymore but now they had these new Macintosh computers.
I looked at the rinky-dink black-and-white screen with a mostly blank desktop, just a few icons you could drag around or click on. Couldn't even figure out how to get to the command line, let alone into BASIC. When I asked the salesman he said that you couldn't, and he seemed utterly baffled that someone would actually want to use their computer that way.
So I left and went and bought a PC clone instead. Plugged it in and booted it up and I was coding immediately. PCs just worked, and let you do whatever you wanted. Macs were kind of toy computers at the time.
That philosophy still kind of holds to this day, though less so now. I use a Mac for work, and luckily they did finally make it possible to get to a command line and to program on it. At some point, they realized they had to give some power to it or else become utterly irrelevant. But you can tell that it's still not intended for power users. It still can't run most software, and what it does have is limited and simplified, from the OS on up, and still defaults to making things difficult except for the simplest use cases.
> Even HyperCard never had his wholehearted support
When HyperCard was introduced Steve Jobs didn't work at Apple. When he came back, he killed it along with many other projects he had no role in, like the Newton.
I say this as someone that loved the Newton. The Newton was a millstone for Apple at the time. Just like printers, scanners, and cameras.
The Newton had several problems. The first was that it was intended to be an entirely new platform, distinct from the Mac. There was zero overlap between the runtime environments of the two platforms; nothing you made on a Mac was ever going to run on a Newton. This would stretch already-thin third-party developers even thinner.
The second major problem is it had cheaper competition with a better market fit in the form of Palm. A Palm Pilot was half the price of a MessagePad and did most of the same tasks. It also actually fit in a pocket which meant it could be carried around and used as intended.
A third problem was that its OS was an older model lacking memory protection and preemption. By 1997 it was clear that multitasking, protected-memory OSes were the future, if for no other reason than increased stability in day-to-day operation. Rebuilding the Newton OS with those features would be a major project.
The MessagePads were bulky and expensive. They were too big to fit in a pocket, meaning the only way to carry them was a bespoke case or a briefcase. They weren't that capable, so a true road-warrior worker was just going to get a laptop. Their target market was John Sculley: executives who didn't want to tote around even bulkier laptops.
The Newton didn't make a lot of sense as a product and killing it off with the rest of the Apple peripherals made complete business sense.
I don't disagree. I have a soft spot for the Newton because I had a lot of respect for Larry Tesler, who got Apple interested in Common Lisp for a while as part of the Newton project. CL was used to invent Dylan which was originally intended as the Newton programming language. Dylan ended up like the Newton: The invention itself didn't have much impact but the project and the people who worked on it moved computer science forward in many important ways.
Oh, and ARM Ltd., the company behind what we now call the ARM architecture, was spun out of Acorn (with Apple and VLSI as partners) in large part because of the Newton project.
The Newton had been out for about three years by the time the US Robotics Pilot debuted in 1996. However, the first really good Newton was the MessagePad 2000, and that arrived right around the time the PalmPilot was released.
I was rooting for the Newton but at the same time, I found myself mostly using my Palm Pilot (and later my Handera 330) while my MessagePad sat on my desk unused.
> My impression is that Microsoft was founded to bring programming to the people
Why do you think that? I was not around in that period, but the impression I get is that the foundation of Microsoft[0] was antithetical to "bringing programming to the people".
Long before entering the operating system business, Microsoft was a company dedicated to making tools for programmers, like compilers and interpreters for many languages, for the CP/M operating system and for most kinds of early personal computers.
Tools for programmers remained a major part of their products throughout the MS-DOS era, until the early nineties. The documentation for their compilers and libraries was excellent, and many people learned much about programming from the Microsoft manuals (or from reverse engineering their programs).
Only after the transition to Windows, and their success in suppressing the competition to MS Office through their control over the development of the operating system, did the programming-tools business progressively become only a small part of Microsoft's products, and programmers only a small part of their customers.
All you need to do is head over to archive.org and look at a Byte magazine from "back in the day". It was filled with ads and reviews for programming tools like Turbo C and Turbo Pascal. It was a golden age, and I was there for it.
That famous Gates letter is not inconsistent with the idea of large numbers of hobbyist programmers. He cared a lot about the piracy issue, but he was never trying to gate who was "allowed" to be a programmer.
Microsoft's first product was a BASIC interpreter. It was all about bringing programming to the masses. Saying "to the people" implies some kind of empowerment, and I don't think that was ever quite as much a part of Microsoft's identity (although it was there in some form). "To the masses" is a better way to describe it in my opinion.
We had an original IBM PC (not a clone, genuine IBM), and I actually don't recall it doing that. Maybe that was on some models and not others?
The way I accessed BASIC was to boot up PC-DOS (the IBM-branded version of MS-DOS) and then run its BASIC environment, which worked a lot like the later QBasic (it would have been BASICA at the time). Later I switched to MS-DOS 3.21 and its GW-BASIC, and then from BASIC to Turbo Pascal and Turbo C. Great times!
They sold software tools at the beginning and further down the line used the tools to become the platform of choice for corporations. They controlled the OS and the tooling and this gave them great advantage in business software market share.
You could get the Windows source (or parts of it) from MS if you needed to work on low-level stuff, and MSDN was, and is, miles better than Apple's developer documentation and tooling.
Bringing programming to the people does not mean giving your work away gratis. The fuss over the increasing use of non-free but source-available licenses instead of OSI-approved FOSS licenses shows that people are coming around to Gates' viewpoint.
> Apple (represented in my mind by Steve Jobs) didn't want his platform to be flooded by crappy software.
This kind of shocks me.
When you go through Macintosh Garden and look at all the software (there was a ton of it), a hefty portion of it looks like the '80s/'90s equivalent of App Store fart apps: "porno" apps with interfaces and interactivity that could have been constructed in PowerPoint.
So that (gatekeeping) didn't work. And then they extended the life of all of that garbage with the Blue Box.
> My impression is that Microsoft was founded to bring programming to the people, and the Mac was founded to bring great software experience
They were both founded to make money. Microsoft was never close to the people and the "great software experience" on the Mac, if it existed, was limited to rich people.
Actually, rich people at that time were probably the least likely to own a computer. If you're talking about somebody who lives in mansion and has a butler and a chauffeur, that type of person was not buying computers and would not have had any interest in them. (I don't know what those people did with their time, but it sure as hell wasn't computing.)
Personal computers were expensive by middle class standards [1], and the sweet spot was educated professionals like doctors and lawyers because they could afford it and it was interesting/useful to them and their families. A rich person who inherited an oil fortune would have been able to afford it, but not interested. A teacher would have been interested, but unable to afford it.
[1] At a time when a decent, no-frills used car would have been around $1,000, you had something like this: ~$700 for a more limited VIC-20 or Z80-based machine, ~$2,000 for a PC-clone, and maybe double that for the Macintosh. For most buyers, it was a big but not impossible expense.
Yeah, that's totally fair. I think I misunderstood because the word "rich" in that era was really reserved for people with butlers and yachts, and excluded those who were merely highly paid professionals like doctors and lawyers. But the word "rich" is used somewhat differently today; for most people it probably includes highly paid professionals as well as CEOs and oil-fortune heirs.
Also, at that time, there was not this extreme divergence in income that you have today. A lawyer might have made more money than a police detective, but the difference wasn't really that extreme and often these two people lived in the same neighborhood. Since that doesn't seem to be true anymore, it makes sense that the definition of "rich" has shifted.
I think that Apple wanted to appeal to makers at the beginning to get traction, but what they really wanted was more consumers. The hardware is great and long lasting, but it's for the elite who can afford it, and most people on HN are in that elite. Ultimately, apart from raising the bar in user experience, it's just another money-making corporation. And that does not make the world a better place.
Yeah, the whole Apple cult thing has never made much sense. I remember talking to some environmentalists in the mid-nineties who cared a lot about reducing industrial activity but were also strongly advocating the Mac. I tried to ask why they felt so strongly about that, given that any computer is basically a non-essential luxury good that requires a lot of toxic metals to produce [1], whereas pencil and paper has very low environmental impact and even some recyclability. There wasn't a clear answer. I think it's irrational.
[1] at that time, electronics used a lot of lead, cadmium and mercury. It's less of an issue now.
Officially, their dev tools cost money. Here is the 1997 pricing for the various editions of Visual Basic 5.0 and Visual C++ 5.0: $99, $499, and $1,199.
Unofficially, almost nobody ever paid for that stuff in my experience and it was ridiculously easy to get for "free." As was the norm in those days there was zero copy protection. You could sail the pirate seas, you could buy one "legit" copy and share it with the whole org, you could get the MSDN subscription and get "evaluation" editions of literally their entire dev tools catalog on a big stack of CDs with zero limitations besides the legality of the situation. I'm sure that with minimal effort it was easy to get a hold of Mac dev tools for free as well. But Windows was so ubiquitous, as was "free" Microsoft software, and you could build a nice little PC for cheap.
I always wondered why Microsoft charged for that stuff in the first place. Were they actually making money on it directly? Did they sort of want to give it away for free, but were wary of antitrust implications?
Apple had a different strategy. Perhaps this is an incorrect perspective, but it seemed like they just didn't care about independent garage and small-business developers. They left things up to third parties. In a lot of ways this was better -- it probably led to a richer ecosystem? For example, looking at the Microsoft strategy, it's not hard to see that they drove Borland into the grave.
This was a little bit later in the timeline, but I remember attending an event in Philadelphia about F#, and I walked out of it with a licensed copy of Visual Studio .NET and SQL Server. They were just handing them out. I had an earlier copy of Visual Studio from when I was in college, and I was working in .NET at work, but no one at the event knew that. It was just - thanks for showing up to hear about functional programming, take some DVDs on your way out!
It always felt to me like there was a culture of openness in the world of Microsoft in a way that didn't exist in Apple culture. You had to pay to use dev tools on a Mac. You want an FTP client on a Mac? Buy Cyberduck or Fetch. My impression of Apple was that everything was proprietary and had a price. Whereas I could cobble a computer together that ran Windows, and there was oodles of shareware and freeware as well as professional tools. You have full access to the registry and file system in Windows, and you could very easily hack the bejesus out of the OS if you wanted to. It was great for tinkering. Everything was backwards compatible, to the point where I had Windows 10 running on a 2005-era Dell laptop that had come with Windows XP, and I had managed to upgrade legitimately without paying (I think the Windows 8 beta converted into a full Windows 8 upgrade, free).
Today, I'm typing this from a 2021 Macbook Pro with USB-C ports - when I travel for work, I bring one charger, and it charges my laptop, my phone, my earbuds, even my mirrorless camera. When I need software, I can usually find something using Homebrew. The value you get for your money on a Mac is much better, but it's still a steep barrier to entry, IMO - even though I'm in a much better position today, and bought this without breaking a sweat. There's a lot of tinkering-related things I miss about the Microsoft ecosystem, but I've largely moved out of the weeds for my day to day work on a computer. All the software I was using on my Windows machine is multi-platform now, and the performance and battery life on these Apple native chips is hard to ignore. As a developer, it's just as simple, if not easier now, to build on Macs - ever since OSX opened up the Linux ecosystem to these devices. That, in conjunction with superior hardware, finally convinced me to switch after at least three decades of being staunchly Microsoft.
Hahaha, I hate it, but I'm doing that thing where I agree with the other 50 things you said but point out the one thing I don't agree with:
> The value you get for your money on a Mac is much better, but it's still a steep barrier to entry, IMO
Is it that steep? A refurb M1 Macbook Air straight from Apple with 1 year of AppleCare is $850. $1,189 if you want 16GB + 512GB.
That is more money than a lot of people can afford, especially outside of the US.
But boy, it's so much cheaper than ever. That's like, two Xboxes (+ controllers + a game or two) so it feels within the reach of a broad swath of the population.
Admittedly... you can get a nice-enough Windows/Linux developer desktop or laptop with a little Craigslist or FB Marketplace effort for like, $150.
I'm not excusing it, but people had to pay for access to a computer in the first place, and ways of funding software development without selling either hardware or software, such as advertising, were not really prevalent yet.
Microcomputers greatly expanded private access to computers, and Microsoft brought programming to microcomputers.
"HyperTalk was both limited and limiting, as most came to discover"
People did amazing things with HyperCard. Anyone who says this probably wasn't there or wasn't paying attention. I knew people who put whole museum collections, including audio, into kiosk-like stacks.
The problem with HyperCard became complexity and geography: at some point you couldn't find anything, and changes were impossible to control.
The thing is, that's the same problem people have today. Visual Basic had the same essential problems.
In fact, you can argue that programs like Object Master and the THINK series were the genesis of today's IDEs (even though OM was based on the Smalltalk object viewer, from what I remember). Back then programmers mocked IDEs.
I said it in another comment, but just wanted to reinforce this. The things people -- professionals and amateurs alike -- did with Hypercard were amazing.
I think Hypercard faltered because of the confluence of a number of things, primarily:
(a) Apple's years in the "desert" and Jobs' return, losing focus and then regaining it in a laser-like cut-all-that-is-not-essential reinvention, and
(b) the advent of the web; cgi; php; mysql, etc. -- soon devoting effort to a Hypercard stack would be isolating oneself on a small and shrinking (relatively speaking) island
Someone needs to leak one of the early development versions of HyperCard 3, which was supposed to involve some kind of heavy integration with the newest version of QuickTime.
The Apple II BASIC prompt brought programming to the people. Around 1984 Byte magazine published an article about a hot language, C, that was used to create special effects in Hollywood. Not long after, C was available in the Macintosh Programmer's Workshop (MPW), where many started using it. Objective-C was built on C. Most "programming by the people" was Excel macros, on both PC and Mac. I agree the Mac didn't bring programming to the people, but it helped.
The number one thing that brought programming to the people was Excel. Multiple orders of magnitude more people have written formulas and macros than have ever used a “real” programming language.
Sadly, often with the totally wrong tool for the job. Using Excel as a database... sigh. So many times at work I've been asked to help out some hobby bob who painted themselves into a corner and made a "tool" that ended up in production but was architecturally completely inadequate.
Especially in the days of 65536 rows and 256 columns this was a disaster waiting to happen. And also the 'Can whoever has orders.xls open please close it, URGENT!!' emails. Both issues are now fixed but it's still the wrong tool for the job.
I think many companies are taking away any sort of self-determination by design.
There's no "teach a man to fish," or even "give a man a fish." It's "sell him a fish." Hopefully a fish subscription.
Apple is absolutely horrible in this respect. If Apple gives you a tool to use with their ecosystem, the tool has a license agreement and is carefully controlled, with the emphasis on control.
I wish Python on the Mac were more of a thing, but even that ends up crippled.
Programming on the Mac was the second-worst experience I'd had, second only to a terminal Lisp system that had no backspace and thus no ability to fix a typing mistake. Both were in fact worse than the punched cards where I first learned to program.
Edit: Mac programming was in 1984; Lisp in 1983; Punched cards in 1980
I think programming was a fairly miserable experience on all platforms before about 2000. In the early days, there wasn't enough surplus computation available to devote any of it to programmer ergonomics. Today we have IDEs that can make accurate autocomplete suggestions, compilers with detailed error messages (and even suggestions on what the mistake probably is), shells that tab complete really effectively, etc. Before ~2000, there just weren't enough spare cycles to do any of that stuff particularly well.
The biggest advantage I found when learning on DOS and Windows in the 1990s was the overall lack of distractions. Having no smartphone, YouTube, or Reddit/Hacker News let me fully concentrate.
Yeah, I totally agree. Computers for a long time were not a communications device the way they are today. You could buy a modem and use it to communicate, but that was a distinct activity, and once you exited the terminal program you were once again by yourself.
Now, the communications function is this always on thing that never goes away. It's interwoven with every other activity you do. That makes the whole experience much different.
I had Turbo Pascal and Turbo C. The compilers were insanely fast compared to what had come before. But I still think that programmer ergonomics sucked. There was no syntax highlighting, definitely no autocomplete, and it couldn't necessarily find all the syntax errors on the first pass, so you'd fix a few errors, recompile, and there'd be a new set of errors. That's my recollection anyway.
It was 1991's Turbo C++ 3.0 (they merged Turbo C and Turbo C++ with that product) that had syntax highlighting and code folding, so it wasn't the 80s :-). Hard to believe, but Turbo C came out in 1987 and was pretty primitive. I seem to remember the early version of Turbo C would just hard stop on the first syntax error. Fun times.
Borland introduced syntax highlighting in Turbo Pascal 7 for MS-DOS, if memory serves me right, while all their products for OS/2 and Windows 3.x provided it from the start.
Autocomplete came around the Borland C++ 4.0 timeframe, or something like that.
I don't think we knew what we were missing though. I just remember how fast it was and being amazed that it came with four inches of printed documentation.
> second only to a terminal Lisp system that had no backspace and thus no ability to fix a typing mistake.
To be fair, the 'proper' way to deal with that is probably to write some Lisp code to patch the running editor to add backspace functionality. That's not really reasonable for someone (presumably) still learning Lisp, though. (And chances are it wasn't a true Scotsman^WLisp system and didn't support that anyway.)
IMHO this article writes off Hypercard way too fast. It was a surprisingly capable language, especially when you start adding modules. It's a huge shame that Apple crippled it so quickly by only shipping Hypercard Player with most machines. Had they kept the full development environment in the base OS it might still be around today.
Yeah, I wonder how much longer HyperCard would have survived if they had (1) kept bundling it with Macs and (2) added proper integrated color support from day one when the Mac II came out.
I think they did bring programming to the people by way of the app store. Maybe they didn't make it so your aged granny can write an automation script, but that more people than ever were inspired to learn programming. They made two languages and a proprietary hardware/software/distribution stack that brought a lot of people into the world of software development. I know so many people a generation younger than me who learned programming specifically to make apps for the app store. People who would probably not have gone into development but for the iPhone. I'm sure Apple did not intend to mint new programmers, they intended to sell hardware and make money on software, but the net effect is more programmers than ever as a result of their choices.
That may be true, but the point was about empowering the end user. Making programming popular through the App Store (arguably) is a separate (also important) topic.
Empowering the end user to not look at the computer as a black box that they have no idea about how it works is quite freeing and mind-expanding.
In the long term, I do believe there is a real risk for programming to become a niche where you end up asking for permission from the h/w vendor before you are allowed to write code on the device. We're slowly heading in that direction with app stores being bundled with the OS, and we may end up in a situation where you can only install s/w through the app store. And only "authorized" persons can download IDEs and dev tools. A large population that has no concept of programming is likely to not oppose this because the vendor will throw the security/privacy boogeyman at them, about "unauthorized" developers writing software that can harm them.
There's a lesson here for everyone who thinks they have a new no-code solution. There's a long history of these solutions that never caught on, and the best outcome you can hope for is Filemaker, Access, or VB. It works for simple things, but it quickly escalates to needing a "real" programming language.
This is a really silly framing device wrapped around a decent high-level review of end-user focused dev environments.
Hypercard was hugely influential and very widely used. Yes, it ignored the internet, but that was inevitable based on its origins. Heck, 13 years isn't a bad run for any piece of software, especially one that crossed the no internet/internet divide.
The people that want to program will do so no matter what the end-user environment is. Most people just don't want to.
It's also worth considering that HyperCard was very effective at allowing users to build or customize useful tools for themselves while writing little to no code. You could write "programs" with HyperTalk, but you could also just pick through the extensive examples HyperCard shipped with and kitbash something, or make searchable databases with little more than a card background with some fields. A small amount of scripting could go a long way, because the surrounding environment and tools did heavy lifting for you.
The HN audience skews toward programming per se as the ultimate expression of power and flexibility with a computer, but HyperCard was accessible and empowering in very different ways from QBASIC or an iPython notebook.
Yeah HyperCard came with a stack called “Button Ideas” with a bunch of ready-made buttons you could copy and paste into your stack with no coding needed
> The people that want to program will do so no matter what the end-user environment is. Most people just don't want to.
This is something I would expect to hear from an Apple executive, and would hope never to hear from a commenter on Hacker News. The whole point here is that we should be fostering environments that make it easy to program should people want to, not assuming the default is "no one wants to program" and making it as difficult as possible, or making it difficult through laziness (by not prioritizing developer experience).
What we DON'T want is what Apple, Google, and Microsoft currently push hard, which is essentially: let's make all personal computing devices black boxes that cannot be modified and that end users don't even own, but rent from their overlords. No thank you.
I’m on board with making programming as accessible and easy as reasonably possible (though I do think there’s an unsurpassable upper limit to that), but that’s different from programming or even somewhat technical tinkering being the norm and expected. The former is achievable, the latter is a setup for failure.
The reason for this line of thinking is that in my experience, some individuals will simply never have the mindset required to be technically inclined, let alone be able to program. The various people I’ve encountered who freak out at the sight of an alert dialog and won’t even read what the dialog says no matter how many times it’s explained to them come to mind.
So while I would agree that programming should always be within easy reach I would not expect more than a tiny minority to ever reach for it.
> Software engineering is no more magical. Some people like to build things. Most do not.
This misses the point. There are considerable advantages to Free and Open Source software even if you never modify it, the most obvious being that it tends to deter software vendors from adding user-hostile functionality such as tracking. (It isn't perfect in guaranteeing this, but it's a strong start.)
For more on these second-order advantages: [0][1]
> A couch is more of a black box than a smart phone.
In what sense?
With a smartphone full of proprietary software, it's extremely difficult to find out what it's up to. Same goes for modern smart cars. [2][3] Even if the software is benign, can you be sure about its future updates? There are no such concerns for a couch.
> People that want to build their own furniture will do so no matter what the end-user environment is. Most people just don't want to.
I would rather have 10% of the population programming than 5%. Replace those numbers with whatever is more accurate, but the point stands.
If we can create devices and operating systems that make programming easier, why not do that? Why purposefully make it more difficult, or make it difficult by not prioritizing developer experience? Do we want a society of consumers, or one of empowered and informed users?
> I would rather have 10% of the population programming than 5%.
Why? I'd rather have a society where everyone is able to achieve their full potential, whatever that may be. If only 1% (or 0.1%) of people are uniquely good at writing programs, then that's who should be writing programs.
Maybe what you're saying is that we live in a world where writing programs is an essential skill, so let's make sure it's not unnecessarily difficult for artists to write programs. But with sufficiently good tools, I don't really see why an artist would spend time writing programs instead of more directly creating art.
Creating apps -- native or otherwise -- has never been easier, so I think we're doing pretty well there. Devices are indeed locked down but I would guess that most people, even programmers, don't care as long as they can create most of the user interfaces they can imagine.
I'd argue more things are easier to do than ever, yet the paths to distributing them have narrowed. Before, you could ship floppies from your garage, partner with a publisher, go shareware, or sell to local shops. Now you have to pay gatekeepers, or train users to do dangerous things.
I fully agree with you, but with one exception: I think Hacker News has very much become a place of love for black boxes. Many people here love and praise Apple's "it's not a computer, it's an appliance" approach.
Certainly there are black-box-ophiles on here, but there is much more awareness of the problems of black boxes on HN than most places. After all, the post you're replying to went out of its way to bring up the problem of black boxes. If this were Reddit or Twitter/X, would there even be a person around to bring up this issue?
Overall interest in computing freedom seems to have declined from where it was a couple decades ago. I've been surprised by that and I have a hard time understanding why it happened. I mean, if you show a carpenter two hammers, the only difference being that one of them can only drive nails from a specific vendor while the other works on any nail, he or she would instantly recognize how significant that difference is in terms of overall usefulness. How is it that computer users intuitively understood this in the past, but somehow the level of awareness and understanding seems to have declined from where it was? I don't get it.
> Overall interest in computing freedom seems to have declined from where it was a couple decades ago.
Did it decline, or did the demographics change? Us old-timers are probably still as interested in computing freedom as we were back then, but there are a lot of new people entering the computing world, whose first (and perhaps only) experience with computing was with more closed systems like modern smartphones (and even desktops nowadays are more closed than they were back then).
Hypercard was mostly sidelined by the time the Internet went mainstream. By 1995 development on Hypercard was long dead. You really can't blame it for not embracing web pages.
There is an old what-if scenario where HyperCard got the ability to load stacks over the web. I personally think this would have been totally awesome and a disaster at the same time. HyperCard had no security model built in. It's basically advanced JavaScript years before it appeared in browsers.
> The people that want to program will do so no matter what the end-user environment is. Most people just don't want to.
As an example, take Excel and Python scripting. Almost anyone who uses the former can pick up the latter yet that isn't what typically happens. There is effectively a chasm between the two. Why is that?
Almost everyone who uses python would say they're programming (or 'scripting') - but very few people who do very similar things in Excel would say the same.
Excel is like Factorio or Minecraft, you end up "automating" things you were doing by hand and it doesn't "feel" programatic, and you can very easily "peek and poke" the memory of the "program" as you go along.
Hypercard (and maybe Flash, I never used either) might be somewhere in-between, but closer to Excel.
PowerPoint is almost programmable, but nobody uses it that way.
(There's a similar disjuncture amongst the Linux/Mac users where they will write bash/zsh scripts but not count it as programming.)
> you end up "automating" things you were doing by hand and it doesn't "feel" programatic, and you can very easily "peek and poke" the memory of the "program" as you go along.
I don't see how that's different from mucking around with Python locally on your own machine?
The lack of any mention of Quartz Composer is really sad. I saw so many absolutely amazing things done in Quartz Composer by people who knew almost nothing about programming. I recall Facebook even talking about using it as a tool for prototyping UI animations.
I used to use it to inspect HID devices because it was SO unreasonably easy to do so.
It wasn't developed by Apple, but Metrowerks Codewarrior[0] was my serious introduction to programming.
CodeWarrior was an IDE for the classic Mac OS that shipped with an application framework (PowerPlant), and I learned a ton about programming, and OOP in particular, from reading through that framework's code. It was revelatory and motivated me to go on to discover and explore the Gang of Four's Design Patterns and Alexander's work more broadly.
I always thought the branding for that product was terrible and embarrassing. I even renamed the binary to "HackLikeADork" on my Mac when I needed to use it. I cringed at the thought that someone of the opposite sex might glance at my screen and see that I was trying to be a "CodeWarrior".
When I was a wee one in the 90s (a teenager, actually :D ) I tried to get interested in programming my Mac. I had the opportunity to learn how to program (Turbo Pascal) in school and wanted to do something with that on my Mac. I vaguely remember that getting access to a Pascal compiler was a bit out of reach. At the time there were MPW and CodeWarrior, and a ton of documentation in the form of the Macintosh Toolbox references (I excitedly printed out a number of Inside Macintosh volumes). However, nothing stuck. I think all I had was an evaluation version of CodeWarrior, and MPW was a bit hard to come by (and dated anyway). It would have helped to have a tutorial or something, but I got very impatient with it and gave up for a few years, until I started with web development using a purchased toolchain (and serving it on classic Mac :D )
I was lucky to have my own computer, but I grew up with a mentality from my parents that they would not buy something I was probably only going to look at or play with once. With a thick enough price tag on compilers/IDEs, that crushed the interest for a while. It wasn't until my brother bought licenses for certain software that things became a bit more available. However, it was eventual access to Free software that changed everything for me. It's hard to put a price on something as valuable as free (in both senses) access to the software I built a career out of; its worth goes well beyond the exchange value of software licenses.
Things have changed a bit, and accessing any classic Mac stuff is easier now, so I have been a little curious again about making some silly classic Mac application, if that's easily possible through emulation. HyperCard, AppleScript, etc. never interested me, but they very much had their purpose.
There's a modest sized outpost for "programming [ed: and other manipulations] to the people" interests at https://malleable.systems . Huge fan.
Starts off with the perfect Lick quote,
> The user wants open software, software that can be modified, and that can participate in a progressive improvement process.
I feel like software is still in some pre-history phase, a time before you can just open up the editor & look at the flow-based view of where data came from & went, before you can openly hack around with that flow. Software all seems so hard & opaque to users, not at all in its soft form when it is delivered. I believe in a gradual softening of software, where we advance into malleability.
HyperCard & AppleScript are two bright points in this history. But today they seem mostly historical. Apple had a significant presence in this drive to enable users to augment their software, but like the industry as a whole, that vision has mostly rusted out.
HyperCard was everywhere on the Mac in the late 1980s. It was so much more accessible than most anything comparable on the PC. Its short life was due to other better database choices coming out — such as FileMaker or 4D — and, of course, the advent of the World Wide Web, which was, at that time, a collection of HTML pages and CGI scripts often written in perl.
Speaking of, its creator Bill Atkinson famously said that his big mistake with HyperCard was to not have cards connect to one another through a network. Imagine what that could have been!
Surprising for the OP to neglect mentioning MacOS X's bundling of Terminal.app (2001) and the built-in window it opened to UNIX on MacOS. While I now use iTerm2, Terminal has been steadily maintained since. But I agree, when compared to the Apple ][ series, and to NeXT, Apple did not court developers to the Mac well during the 80s and 90s.
Funny enough I’m literally writing an article at this very moment that touches on this subject a bit.
I’m surprised the author mentioned Swift Playgrounds but not Storyboard. Storyboard was promoted as a way to build apps without touching code. It obviously failed, as anyone who has used it could predict, but that was definitely part of the messaging around it.
AppleScript gets a bad rap for being obscure, but the language itself isn't to blame (mostly) -- when I was an AppleScript developer, 97% of my headaches came from applications with weird dictionaries, not the language itself.
One time I spent half a day trying to figure out how to add a page to QuarkXPress, only to eventually find that documents didn't contain pages; they contained spreads, and spreads contained pages.
That was 100% not shown in the standard dictionary display in the Script Editor tool; the UI tool I used for creating interfaces for AppleScript apps had a better dictionary display, and that revealed the answer to me.
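For the curious, the eventual fix had roughly this shape. The terms "spread" and "page" are exactly the containment described above, but the precise phrasing Quark's dictionary wanted is long forgotten, so treat this as a hedged reconstruction rather than verified QuarkXPress code:

tell application "QuarkXPress"
    tell document 1
        -- Reconstruction from memory; exact dictionary terms may differ.
        -- Pages don't hang off the document directly; they live inside spreads.
        make new page at end of spread 1
    end tell
end tell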
The "natural language" syntax, combined with very poor tooling compared to any popular language, makes coding in AppleScript a dreadful experience. "Natural language" doesn't help you much in reducing ambiguity while learning, and of course the awful documentation doesn't help people either, so discoverability is terrible.
They should've made the automation system accessible from any language instead of inventing this one (and the Mac had a few scripting languages).
The GUI paradigm has much more potential, but it's tedious for anything but simplistic stuff, so it should sit on top of scripting so you can switch back and forth.
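For what it's worth, the Apple event layer underneath isn't completely locked to AppleScript: macOS ships an osascript command that runs scripts from the shell (and can run JavaScript for Automation via -l JavaScript), so the same application dictionaries can at least be driven from ordinary scripts and other tools. A minimal sketch, assuming a stock macOS home folder with a Downloads folder in it:

-- save as count_downloads.applescript, then run from Terminal with:
--   osascript count_downloads.applescript
tell application "Finder"
    -- Count the items sitting in ~/Downloads via Finder's dictionary.
    set n to count of items of folder "Downloads" of home
end tell
-- osascript prints the script's result to stdout.
return "Items in Downloads: " & n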
They blew it right from the start. I loved my Apple ][ and was excited when Apple introduced the Mac. But I looked at it, and it was impossible to write software for it unless you had access to a Lisa. (A few companies had products like "Hippo C" but you couldn't write a GUI app in it!) I thought that a computer you couldn't write your own software for was obscene.
So I switched to the IBM-PC and never looked at Macs again.
"Programming" as a concept is more accessible to people than it's ever been. Some years back I tried to break into React development (I'm an iOS developer, but only with native code and wanted to expand my skills) and I had built a simple app in a language I'd never used before with tooling I'd never used before up and running in a couple of hours. I'm sure my prior experience in software helped some, but I cannot overstate how completely different the tooling was between one and the other.
If you want to build websites, there are more tools and options than ever. Basic HTML editing and markdown files paired with something like Jekyll, hosted on Amazon S3 is basically free, minus the cost of a domain, and is done with simple HTML and text files, with massive community-driven documentation and support. If you want a fancier website, PHP/MySQL are extremely mature, well documented, with probably billions of code samples available and who knows how many libraries. If you want to build apps, Swift playgrounds is excellent, and Swift itself is an extremely flexible and comprehensible language, and Kotlin on Android is quite similar as I understand it. If you want to automate your daily life, Macs have a whole selection of shells, and PHP can work there too. Windows meanwhile has batch files, powershell scripts, and if you're willing to do a little work, bash can be had too.
And more and more and more: 3D printers with G-code automation, Arduino microcontrollers, Raspberry Pis to make dumb electronics smart on your own terms. I could come up with examples for the rest of the day if I wanted to and probably not repeat myself.
And anytime you run into a problem, you can slam an error message into Google, and pretty reliably find something/someone that can help you, and many of them absolutely will because we all love this stuff and love solving problems and helping each other out.
This notion that Apple has failed to bring programming to the people is fucking bizarre to me. Programming is with the people. A lot of people don't wanna do it, that's fair. A lot of people don't have the skills, or the desire to get them, completely understandable. But to say it's inaccessible is just wrong. It's all out there. Whatever you want to do, there is probably a solution to it if you're willing to put the time in, as has been the case with basically any other skill for all of human history.
> "Programming" as a concept is more accessible to people than it's ever been.
I really don’t think this is true. The programs and web sites we interact with nowadays are far more complex than they used to be, so building something feels far less approachable.
I feel lucky that my first exposure to programming was on 8 bit machines in the 80s and then got to learn web programming in the PHP era.
It provided a gentler introduction than starting afresh nowadays would.
> I really don’t think this is true. The programs and web sites we interact with nowadays are far more complex than they used to be, so building something feels far less approachable.
I mean, sure, you can't build your own Amazon or Facebook in an afternoon. But that is not the sole domain of programming.
Just like building a model RC airplane for your own enjoyment involves a lot of similar principles to designing a Boeing 767, but one is magnitudes more complex to pull off. But also, few individuals find themselves wanting a Boeing 767, and instead want a model RC airplane.
If you're going to start with HyperCard, you can't leave out SuperCard and LiveCode, both still around today. And if you include SuperCard, you have to mention that its creators went on to create Flash. And then of course there's Director, mTropolis, iShell, and Apple Media Tool, and many other tools that started on Mac.
Automator was great. I used it to steal my sister's copy of an unofficial translation of the Twilight sequel, which she kept on a password-protected CD. All it took was piecing together a few visual blocks that would back up any PDF on a newly mounted disc. A ten-year-old kid could figure it out in two hours.
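For a sense of how little that workflow really involves, the copy step boils down to something like the Finder script below. This is a rough sketch: the volume name is made up, the destination folder is assumed to exist already, and the "run this when a disc mounts" trigger is the Automator (or folder action) glue around it:

-- Hypothetical volume name; substitute whatever the mounted CD is called.
set discName to "STORY_CD"
tell application "Finder"
    -- Assumes a "PDF Backups" folder already exists in the home folder.
    set backupFolder to folder "PDF Backups" of home
    -- Copy every PDF found anywhere on the disc ("entire contents" is recursive
    -- and can be slow on large volumes).
    duplicate (every file of entire contents of disk discName whose name extension is "pdf") to backupFolder
end tell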
The whole point of the Mac was that you didn't have to write your own code for it. It was a consumer device with modeless editing and an emphasis on the GUI, not the terminal window.
The only people I ever met that programmed for the Mac were converting PC programs over to it.
Think Pascal was a revolutionary IDE for its time. I remember the absolute joy of coding on a Mac IIcx, with the Apple vertical page display and the three volumes of Inside Mac by my side.
This was true in the early years of the Macintosh, when it was seen as less accessible and less friendly to programming enthusiasts compared to other personal computer platforms of that era. However, the landscape has changed significantly since then, and the Mac now offers robust development tools and a thriving ecosystem for developers and programmers.
Maybe the people share some responsibility here? It seems like Apple tried again and again, but none of those pretty powerful tools attracted enough of a following to be worth maintaining. If they had, they would have been improved with every release, fixing their early shortcomings.
By contrast, MS-DOS batch files were far from powerful, but everyone had a few to automate repeated daily tasks. It seems to be just a loss of any willingness to do a little upfront work for long-term time savings.
Apple didn't bring programming to the masses, for the following reasons:
- Just too expensive. The Apple Macintosh in January 1984 cost $2,500 - that's about $7,000 in today's money!
- Too complex to get started. The amount of knowledge required to write anything really useful on that machine in 1984 was a huge barrier to entry - you had to learn Pascal (and later Object Pascal), the then-new GUI concepts of the Toolbox, and somehow get your hands on a development environment (initially that meant a Lisa; later MPW or CodeWarrior, neither of them cheap) and learn its UI.
- Documentation was poor. The system shipped with nothing mentioning any of this or the hardware underpinnings.
The Mac was designed as an end-user 'consumption' machine, not a development machine. Nothing wrong with that - as the market showed, it was highly capable and successful in its intended role.
The company (in the US at least) that brought programming to the masses was Commodore.
- IT WAS CHEAP! With the release of the Commodore 64 it brought a super-sophisticated system at bargain-basement prices. By early 1985 the C64's price was $149 (https://en.wikipedia.org/wiki/Commodore_64) - that's $420 in today's money. A complete system with a 1541 disk drive & monitor cost a total of $549 (https://www.mclib.info/Research/Local-History-Genealogy/Hist...) or $1,547 in today's money. A computer with 64K of RAM, 16-color graphics, a revolutionary 3-voice synthesizer capable of speech, and a massive software library and peripherals.
- It came with everything you needed to write code. The system booted into a BASIC interpreter REPL, which also served as its command line. No additional software needed.
- The info that it shipped with was phenomenal. The manual that came with the computer actually gave you a nice tutorial on the basics of writing code, with examples of color graphics, sound generation, sprites, and accessing peripherals (https://www.commodore.ca/manuals/c64_users_guide/c64-users_g...). You could write to Commodore and they would actually send you a schematic of the main board and expansion ports, as well as a dump of the ROM so you could look at the actual code that the system used to do what it did.
It was an amazing time to be a young Commodore 64 programmer back then. Heck even the company's slogan was "Computing for the masses, not the classes". And with the Commodore 64 they lived that slogan.
The interesting thing is that today's Apple is even friendlier to the budding developer than Commodore was - with the Mac mini line and the Unix underpinnings of macOS, there isn't a better beginner developer system than a Mac today.
Apple, in many ways, learned its lessons and adapted. Commodore, unfortunately, did not. So the Apple Mac of today is bringing programming to the masses.
I bought all of my nieces and nephews a Mac mini last Christmas and showed them how to use the terminal and Python, and they are on their way.
I think the gold standard for this was Visual Basic, especially combined with MS Access. And they didn't touch on FileMaker. A lot of the one-person, in-house applications I've come across were wrappers around databases more than anything else. For example, I ported an application written in Access and VBA to track counts of organisms in stream beds to the Web for a county government. Frankly, I'm not sure I made their lives better, except for letting their manager's manager check a box. And there are a bunch of small applications I've run across, for things like property management, that are little more than a GUI over a database.
In a previous job, I was looking after the in-house FileMaker application which basically managed CRM, finance, and sales for a small company (around 60 users). The company got bought at some stage and the application was replaced by Oracle counterparts. From what I've heard, many people missed the FileMaker DB.
I like Howard's blog. However, in this post and many others, he seems to regard the Mac as a continuous story line from 1984 to the present.
I actually think the Apple of today is completely different company than the Apple of the late 70s / early 80s, and the two companies have basically nothing in common at this point. The way I see it, before the iPhone, Apple had never had any truly mass market, "it's for everybody" product line. They had a long series of niche products. The iPhone really changed them, in ways good and bad.
Nevertheless, I appreciate Howard's perspective, even if I'm not totally bought into it.
Yeah, the iPod was a major step along the way to the iPhone. I guess the difference to me is that MP3 players were never something that literally everyone was going to feel a need to own. Not everyone listens to music or podcasts, but everyone needs to communicate. It's a size-of-market thing, the difference between "our market is anybody who listens to music" vs "our market is all human beings." A successful product in that second category feels much different to me than a successful product in the former category.
At the time, though, it really felt like everyone would one day own an MP3 player. The entire world had a Walkman. The MP3 player was like a 10x better version of that.
The Walkman was a hugely successful product, but I don't think that everyone owned one. It was popular with teenagers, exercise/fitness enthusiasts, and a few other categories. It wasn't something that a businessman would have owned, or an elderly person, or a poor person. Today, however, all of those people own smartphones.
I really think it was the iPhone that changed Apple.
Everyone had a little radio/tape deck/CD player. I had like a half dozen of these devices laying around my home. You go on a walk and everyone has one of these things strapped to their hip. I assume business people still wanted to listen to the baseball game while taking the commuter rail home.
Wikipedia cites an article in Japanese saying that about 220 million Sony Walkman devices were produced before discontinuation in 2010. Even if you quadruple that for other brands, you're not even close to 1/4 of the world at the time they first came out.
I don't think you will ever get full numbers. People in Kazakhstan or Kenya in the 1980s definitely had this sort of thing, but I doubt it would have been a first-party Sony Walkman product. A portable tape deck/radio is cheaper than a larger stereo.
> the Mac as a continuous story line from 1984 to the present
It's a reasonably continuous storyline for the end user, which is what matters given the topic. Personal computers themselves weren't a mass market, and neither were smartphones at first, so you could just as well look at it as, say, "Apple's long-term vision of mass-market computing succeeded". But it's not clear how either of these interpretations would change this blog post much.
Well, everyone is going to have their own perceptions, and maybe some end users do see it as a continuous evolution. For me personally, I saw a discontinuity in the 2010s where suddenly the platform was becoming very locked down: features like SIP prevented you from modifying anything that was part of the OS (though you could turn it off) and code signing appeared to be the first step toward preventing end users from executing any program not blessed by Apple. There was a while in the late 2010s where many of us were pretty sure that Apple was on the verge of prohibiting users from running any code they wrote themselves. There would be some kind of byzantine approval process like for iOS that would reject random, personal utilities not useful to anyone besides the author.
Apple kept denying this was their plan, but kept moving closer and closer to doing this. However, they then appeared to stop around 2020 and haven't renewed those efforts, so I guess we are safe?
> There was a while in the late 2010s where many of us were pretty sure that Apple was on the verge of prohibiting users from running any code they wrote themselves.
Other than arguments made on message boards, I don't think there was really any concrete evidence of this; the technical/product reasons for what they were (and are) doing were clear. So yeah, I think that's a discontinuity of perception.
As for me, the Mac did bring programming to me, simply through its durable chassis, well-working memory management, and relatively stable OS.
- I can lift a MacBook with one hand, creating tension on the chassis. When I do the same with a Lenovo T14 Gen 1, it shuts off. When I'm excited and in the middle of programming, just grabbing the laptop to bring it closer to me kills my work.
- Stable firmware, stable startup, stable wake from sleep without losing progress
- Can swap dozens of gigabytes
- Doesn't get corrupted after updates
- Usable trackpad
- Color-calibrated high-res screen since 2012
For me it’s these purely usability centered aspects that enable me to even do programming. Automatic sleep/hibernate features don’t work with the other models at all.
As a non-American, the biggest problem with Macs and "the people" was that they were too expensive. No young people could afford them. The only people who really used them were designer types and a few rich arty people. Everyone else used PCs.
And Commodores, Ataris, TRS-80's... 8-bit home computers (and some 32-bit ones) survived long after the introduction of the Mac. What really killed them was the cheap PC clone.
All we needed was for vendors to ship development tools with the OS. Nobody ever did that. Apple could include Xcode on every Mac, but they don't. macOS and Windows... neither of them come with a code editor.
The benefits are obvious. The downsides (disk space?) are not. Why didn't anyone ever do this?
EDIT: Nevermind. Xcode used to ship on the MacOS discs.
They did! Back when there were Mac OS X install discs, in the 2000s, every Mac came with the Xcode developer tools on disc as an optional install.
That's the only reason I'm a programmer today. So I would say that in an important sense, the Mac did bring programming to the people. And back in those days, there seemed to be a lot more amateur Mac developers than there are now (and possibly more professional Mac developers too).
The article doesn't mention it, but there was also NewtonScript. It was a language for the Newton that looked like JavaScript, with sharable top-level PDA objects for contacts, notes, handwriting, etc.
I didn't read the article, but you can't blame Apple for people's stupidity. When I got my first MacBook, programming was literally the first "app" I installed, lol.
[1] https://www.cs.utexas.edu/~wcook/Drafts/2006/ashopl.pdf
[2] https://en.wikipedia.org/wiki/HyperCard