Alan Kay answers “What was it like to be at Xerox PARC when Steve Jobs visited?” (quora.com)
123 points by vshan on Oct 28, 2017 | 55 comments


I got the tour of PARC in 1975, years before Jobs. Smalltalk wasn't really working yet, but some Altos and Ethernet were running. Kay described Ethernet as an "Alohanet with a captive ether". They had a file server and a laser printer starting to work. So what they had was the beginnings of an office environment of the 1990s.

Kay's point at the time was that this was what you could do if you were willing to spend a large amount of money per user on hardware. It wasn't cost-effective yet. Someday it would be. Xerox was willing to spend the money, so when the hardware became cheap enough, they'd own the market.

This wasn't the first GUI. They'd downsized the 1968 Mother of All Demos to a machine that sat alongside a desk.[1] When Engelbart did that, it required a room-sized mainframe, a video link from the mainframe to the demo site (a very big deal in 1968), and a sizable crew of support people to keep it all running. All to support one user.

This technology reached production with the introduction of "workstations". The first was the Terak, in 1977, which ran an interpretive environment called the UCSD p-System, a byte-code interpreter for Pascal. Fast compiles, slow execution, basic graphics. Perhaps an ancestor of Turbo Pascal for DOS. PDP-11 technology underneath. I used one briefly. 1979 brought the Three Rivers PERQ, which was a lot like an Alto and was sold as a commercial product. It was another interpretive Pascal system, built around its own microcoded CPU, and was later sold by ICL in the UK. I never saw one; it was considered something of a dud.
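To make the "fast compile, slow execution" trade-off concrete, here is a toy stack-machine loop in Python with the general shape of a p-code interpreter. The opcodes are made up for illustration; real p-code was a compact byte encoding of Pascal operations.

  def run(code):
      stack, pc = [], 0
      while pc < len(code):
          op, arg = code[pc]
          if op == "PUSH":
              stack.append(arg)
          elif op == "ADD":
              b, a = stack.pop(), stack.pop()
              stack.append(a + b)
          elif op == "PRINT":
              print(stack.pop())
          pc += 1  # every instruction pays this dispatch overhead: hence "slow execution"

  run([("PUSH", 2), ("PUSH", 3), ("ADD", None), ("PRINT", None)])  # prints 5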

When the Motorola 68000 came out, it was clear to a lot of people that there was finally a big enough microprocessor to build a workstation. At last, a 32-bit address space. (Or at least 24; early machines didn't use the high byte.) The Apollo Domain, first shipped in 1981, was the first of those. That was the first real workstation. It had its own OS, its own networking (a coax token ring), its own file system (exclusive-use locking across the network worked, unlike UNIX), and its own GUI. Apollo had their own MMU and their own paging system, which was really hard because the 68000 couldn't cleanly resume a faulted instruction. It was way ahead of anything else at the time. But the small organization behind it just had too much work to do building all that software and hardware. When I used one, it was clear this was the future, but it wasn't there yet.

Then came Sun, and a whole slew of UNIX workstations. UNIX was really the wrong tool for the job, but it was available and cheap. (Not yet free.)

Meanwhile, Apple was trying to get a low-end workstation working. The Apple Lisa (1983) was the result. It was sort of a cost-reduced Apollo - its own OS, its own MMU (a whole board, which ran the price up), and its own GUI. Also its own disk drive, the one time Apple tried to build a hard drive in house. That didn't go well. The Lisa was impressive and useful, but it cost too much to go mass market. The real UNIX workstations came with big screens, and the Lisa had a dinky screen like DOS-type computers. Meanwhile, IBM was eating Apple's lunch, replacing the obsolete Apple II with dumb DOS machines. No GUI, but enough compute power to run a spreadsheet and, importantly, a hard drive. Now people could do basic business work.

Apple's response, the Macintosh, was the world's greatest toy computer. It shipped in 1984 with 128K of RAM, one floppy drive (you needed two to get anything done), an operating system with no CPU dispatcher, no memory protection, and a nice GUI. But no hard drive. It was really slow and spent most of its time reading floppies and displaying a wristwatch "wait" cursor. It almost killed Apple. Sales were very low; Apple had planned to sell about 47,000 units a month, but only sold about 5,000. What saved Apple was the first desktop laser printer, the LaserWriter, in 1985. (PARC had a laser printer in the mid-1970s, but it was based on a big copier engine and bigger than a desk.) In 1986, Apple introduced the Macintosh Plus, which could, at last, support an external hard drive. That was the first successful product in the line, and it launched the desktop publishing industry.

In parallel to all this, there was another line of technology, now forgotten - "word processing". Wang and IBM were the big players in this. This started with typewriters with some memory, around 1971, and by 1977, Wang had the Wang Office Information System. This involved many semi-dumb terminals connected to a shared unit with a CPU and file server. Those could in turn be networked, and documents sent around the system. Very cost-effective, because each terminal was cheap. This was a huge win for offices which did a lot of document preparation, and Wang was, for a while, a very successful company.

So Xerox tried to move into that space with the Xerox Star, in 1981. This brought the Alto technology into an office environment. Worked fine, cost too much. Like the Wang system, it was very closed. This was deliberate. Word processing and office systems were used by secretaries and clerks. They had to Just Work. Exposing end users to the internal complexity of the system was considered a bad idea.

The cultural change which brought system administration to the masses wasn't seen coming. It was inconceivable in the early 1980s that clerical people and small business operators would have to worry about what was going on inside the box. But as the DOS-type machines got more powerful and moved into offices, for about two decades everyone had to become a sysadmin. Apple tried to hide more under the covers, but it didn't really work all that well in the early years.

Today, of course, the complexity has mostly been put back in a sealed box, and you can give a Chromebook-type tablet to a 5-year-old and they'll be able to work it.

[1] https://en.wikipedia.org/wiki/The_Mother_of_All_Demos


One of the early goals for workstations was a "3M" machine - a megabyte of RAM, a megapixel of display, and a megahertz instruction rate. That was about the point at which GUI workstations started to be really useful. The Alto and the original Mac were both below that level. The Apollo Domain, the Sun I, and the Macintosh II were slightly above it.
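(Some back-of-the-envelope arithmetic on why those three numbers travel together: at 1 bit per pixel, a megapixel display needs 1,048,576 / 8 = 131,072 bytes, i.e. 128 KB of framebuffer, an eighth of that megabyte of RAM; and repainting a million pixels at interactive rates is roughly what a machine executing on the order of a million instructions per second can keep up with.)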

(The price of RAM was a big problem in those days. All the early DOS machines and Macs were RAM-starved. The workstation people plowed through that problem with money, and RAM was a big fraction of machine cost. Now it takes a gigabyte to run Hello World, but, whatever.)


I think it was MIPS[0] rather than MHz.

[0] https://en.wikipedia.org/wiki/Instructions_per_second#MIPS


Just to help with a few facts (easy to find for those who care). http://worrydream.com/EarlyHistoryOfSmalltalk/

Smalltalk started working in 1972, the Alto in 1973, and Smalltalk on it a month or so after the Alto came alive. The GUI came more from other parts of ARPA than NLS, but there were a few NLS elements. Cheers.


I used Apollo Domain machines in the early 90s at university. We had a lab full of them; they were impressive even then. I saw the World Wide Web for the first time on one of those (NCSA Mosaic). I never looked into its history. Everything was upgraded to HP PA-RISC workstations shortly after, which seemed like starships at the time. Thanks!


I cut my teeth as a sysop on Wang systems - very interesting machines. The part Wang played in bringing these concepts to the masses seems to be a missing element in histories of computing and GUIs in general.


That's why it takes two types of people to make something successful and lasting: one to come up with and do the invention, and one to take it and make a product that affects the world. Neither does much without the other. Xerox had all the amazing tech and did absolutely squat with it. Jobs saw a demo and made it an actual thing you could use, however many errors he made in the process. The same is true of Steve and Steve: one did the tech and one made it a company. Rarely can one person do both and be successful.


The one example I know of where the two skills were found in the same person was Satoru Iwata of HAL/Nintendo.

Brilliant programmer/hacker, tech and user-experience visionary, and a capable corporate chief.


Pretending that Apple is as important as PARC seems to me to be poorly thought out. It's mistaking what is visible for what is important.


>Jobs saw a demo and made it an actual thing you could use,

Xerox's systems were complete and readily usable, but too expensive. Jobs simply tried to make something similar but cheaper. At first Jobs failed as well (the Apple Lisa); then the Mac was easier to sell due to being lower priced. But the Mac also didn't sell well at the beginning; paradoxically, it started selling well after Jobs was fired.


> Rarely can one person do both and be successful

Agree, but that doesn't sell as many books or movies I guess.


More detail on changing the system while running, and a comment at the bottom from Alan Kay retelling this same story:

http://www.righto.com/2017/10/the-xerox-alto-smalltalk-and-r...

"Dealers of Lightning" also a has a chapter on Job's visit: https://www.amazon.com/Dealers-Lightning-Xerox-PARC-Computer... Try the "look inside" for "Steve Jobs Gets His Show and Tell".


If you are interested in this era of computing, I recommend The Dream Machine: J. C. R. Licklider and the Revolution That Made Computing Personal, which covers a lot of the ground leading up to the famed Xerox PARC GUI demo, though the demo itself is barely more than a footnote in the context of the rest of the book.

https://www.amazon.com/dp/B01FIPHEXM/


I thought it was interesting that the two objections Steve Jobs raised so he "could feel in control" (smooth scrolling, and selecting text with an outline) are actually normal UX items and have been ever since.

Maybe this is a difference between someone totally focused on feel and experience versus pure computer science.


You missed the story and the history, and who knew what.

Many of the GUI ideas had been around in the ARPA community before Parc, including scrolling. The bit-map display on the Alto brought a few more possibilities -- such as overlapping windows, and "display any image". We had done both smooth scrolling and outlined text selection years before (remember that Steve's visit was in the 6th year of the Alto working). We used the line-by-line scrolling in Smalltalk in '79 in part because it was easier to read during the movement, and because a list would always show the whole top line. There was plenty of compute power to do either.

What Steve was reacting against is the normal way we portray selections today, namely highlighting (not at all what you said above). We found the highlighted selection to be more discernable than the outline.

I think this is a case where you didn't take the trouble to find out the facts and are instead projecting your beliefs onto a situation at which you weren't present. I'm calling you on it because there is much too much of this kind of commentary in most forums.


I haven't seen many systems that use an outline for selection. Alan indicated that their original version inverted the selection (he uses the word "complement") as do most systems today.


Thank you, I believe I misunderstood the article's description of the change he was talking about.


Most systems today use highlighting.


... of which "inverting the text" is an implementation
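A sketch of why complementing the selection was the cheap, natural choice on bitmapped hardware like the Alto: on a 1-bit display, highlighting is just XORing a mask over the pixels, and XORing again restores them. In Python, with a framebuffer row modeled as an int bitmask (a toy example, not Alto code):

  def invert_span(row, start, end):
      # flip the pixel bits in [start, end): text becomes highlight and back
      mask = ((1 << (end - start)) - 1) << start
      return row ^ mask

  row = 0b0000111100001111
  sel = invert_span(row, 4, 12)          # select bits 4..11
  assert invert_span(sel, 4, 12) == row  # a second XOR undoes the first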


Based on Alan's answer and Alan's reading/knowledge of Steve Jobs, Steve's questions were just more ad-libbed challenges to try to appear to be the alpha smart person in the room.


The Smalltalk used in this demo was my personal favorite (-78)

Is there a document that shows the differences between Smalltalk-78 and Smalltalk-80, and what made Smalltalk-80 worse?


Wish I knew! One difference is that -80 had metaclasses. (They weren’t in -76 but I’m not sure about -78.) I’d say metaclasses are more complication than they’re worth.
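For readers who haven't met the idea: a metaclass is the class of a class, so per-class behavior and state live one level up. Python inherited the concept from the Smalltalk-80 lineage; a minimal sketch (the Counted example is invented for illustration):

  class Counted(type):
      # a metaclass: its instances are classes, so it can intercept
      # instance creation for every class built from it
      def __call__(cls, *args, **kwargs):
          cls.count = getattr(cls, "count", 0) + 1
          return super().__call__(*args, **kwargs)

  class Point(metaclass=Counted):
      pass

  Point(); Point()
  print(Point.count)  # 2 -- per-class state managed by the metaclass

The extra level is powerful but, as the parent says, arguably more complication than it's worth for everyday code.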


I think Smalltalk-80 was adapted to be portable to non-PARC systems: only standard ASCII, only Unix-style file system, etc.


After more or less six years of running this demo, a designer (Steve) steps in and makes a couple of suggestions that very much improved the experience, suggestions that took only a few seconds to implement, in Alan's own words. Alan seems to pooh-pooh Steve's visit, but it highlights exactly why the computer industry needed Jobs.


Do you feel that the value of Steve Jobs is underappreciated? In the mainstream media he has received perhaps more attention and praise than anyone else in the computer industry. He is widely considered to be a genius and given nearly sole credit for a wide range of innovations.

I think it's okay for the guys who invented the GUI to also get a small fist bump.


In mainstream media, Jobs is a god amongst men. In technical circles, he's demonised. The truth, as always, is somewhere in the middle, and the HN audience in particular is one that probably needs reminding that he did have valuable input to contribute.


"The truth, as always, is somewhere in the middle"

This statement ticks me off to an unreasonable degree. The truth is complicated. Any clear spectrum drawn between two points is an artificial construct. There is no reason to suppose that the truth would just happen to lie in the middle of an artificial spectrum. In fact, if it did on average, that would be incredibly strange.

Further, the world is full of morons who espouse clearly ridiculous things, and the truth doesn't lie somewhere midway between sanity and insanity.

Jobs was a great businessman who excelled at turning other people's ideas into piles of money, not at making the world a better place. People like Kay changed the way people interact with computers.

Jobs isn't halfway between god and demon; he's just not that important.


I'm not sure what you mean. There are plenty of people here that recognize Jobs' positive contributions. In fact, most HN threads about Jobs devolve into acolytes fighting detractors.


I think the demonization you're referring to is rooted in the fact that he receives or takes credit for innovations he helped refine but definitely didn't invent, whereas the inventors receive little to no credit or appreciation for their contributions.


> The truth, as always, is somewhere in the middle

I have found [irrespective of this case] that this statement is just not true. Often, the truth is not in the middle but at one of the extremes, or even beyond them.


I don't discount Jobs' influence, but I think people tend to dismiss the degree to which luck is involved. Jobs certainly made plenty of mistakes, but was in the right place, at the right time, enough times to overcome them.

I tend to think if you take Jobs, or Gates, or any other icon out of the picture, somebody else would have filled the void. And the end state might be slightly different, but not that much. Maybe Kildall, CP/M, and GEM would have filled part of the Apple void, for example.


Every successful person has had some lucky events. Dismissing a person's accomplishments because they got lucky in some way is disingenuous. Unlucky people get hit by cars and never fulfill their potential.

Jobs defined how mass-market 8-bit computers looked, for example[1]. GEM wouldn't exist without the Mac, and, well, CP/M was actually the first choice, but some people have one bad day.

If the giants of the industry didn't exist, then we'd be living in an alternate reality with different pillars to support later people. I often think about Rome. They had all the technology to move to something like the steam engine. Maybe someone in the empire got close and just had some bad days, but it never really got done until much later. People who see something different are not interchangeable. You might get close, but all the experience that brought someone to a point isn't going to be duplicated. Parallel inventions happen, but even they are not exact duplicates (e.g. a different notation for calculus).

1) It's almost an iPhone-style before-and-after display, how 8-bit computers looked before the Apple ][ and then after. This in no way says anything about my opinion of which 8-bit computer was the best.


A slightly more "we're-living-it" example is to look at Elon Musk and Tesla. Mr. Musk has succeeded in convincing the auto industry to make electric cars. Not all by himself, mind you; certainly the EU emissions regulatory environment had something to do with the "convincing", but neither were those regulations dreamed up in a world where Tesla did not exist.

Fisker tried and failed at around the same time Tesla was making the roadster, so while there is certainly an element of luck to Elon Musk's success, saying it's all due to luck undercuts the fact that he's also worked very, very hard to get where he is - there are stories about him sleeping in the factory so he can do QC inspections himself.

I'm doubtful that electric cars would be in the same position they are in today if this single individual, Elon Musk, had not existed, but that's impossible to prove. There were hobbyists who were converting Mazda Miatas and Honda Civics to electric motors; without Mr. Musk, would one of those hobbyists have "made it big" and managed to drag the auto industry into the future?


But this is still potentially "luck". Or perhaps "luck" is a very bad term. Imagine that Musk had a twin named Allen who happened to start a company called "pay-now", and "pay-now" was hit with a lawsuit related to the patent on e-shops and had to shut down. Allen would never have started Tesla. Would we even know about Allen? He wouldn't even be rich. His behavior, timing, actions, etc. were identical to Musk's, but his outcome was greatly different. There are probably several Allens out there. Now consider: what if Musk were identical, but had never started PayPal at all? Instead, after studying electric drive trains for 20 years, he made a proposal to investors that an electric car company should be started. This Allen would be MORE qualified than Musk to start an electric car company, but, lacking capital to back the investments, he would be LESS likely to succeed!


I hate to break this to you, but the American myth, that if Allen is just as smart and works just as hard as Elon Musk then they both get to be just as successful, is a sham, a lie. There are definitely many Allens out there, but the shame of failure means we barely ever hear that story. There are people dumber and lazier than you or I who have far more money than you ever will, or at least did far less work to get it. Not to be cynical about the world, but there's more to life than fame or fortune.

There is undoubtedly luck involved in business, I can agree with that. FedEx's Vegas story from their early days could easily have gone the other direction, and then they wouldn't even rate a mention in a book about the history of shipping.

However, I disagree about studious Allen. Someone who studied drivetrains for 20 years is less qualified than serial entrepreneur Mr. Musk, who had two successful businesses under his belt before coming on board at Tesla, simply due to his experience running businesses. That Mr. Musk was already wealthy from those businesses, which gave him a leg up with Tesla, seems unfair to everyone who didn't start off independently wealthy, but c'est la vie. (There have been several articles recently about the dearth of new software-company IPOs, due to how much the big 3 control the industry.)

Starting an auto manufacturing business was never going to be a bootstrapped operation the same way a two-person software startup in a garage could be, so having funding to hire a good drivetrain engineer is simply part of it. Having studied electric drivetrains for 20 years, maybe Allen ends up a very early employee at his brother's car company, and if that goes public, then Allen will be very rich - say he's still holding on to 50k shares of pre-IPO TSLA, he's doing quite well for himself, despite not being famous. Not nearly as well as Elon Musk, but he's doing well enough.

That's not to say Allen couldn't learn new skills so he could run a business, or learn about, say, the intricacies of lithium-battery manufacturing, but studying drivetrains for 20 years doesn't make a CEO. CTO for a drivetrain subcontractor, perhaps, but sales and marketing and managing people, all those soft skills that aren't the hard-core engineering of building a product, are actually vital to a company's success. (There was even a post on HN earlier today saying that very thing.)

Elon Musk's success with SpaceX comes from rejecting prevailing industry knowledge - that reusable rockets just won't work. Everyone else in the industry, especially those who had been studying rockets for 20 years, knew that as fact.


I wasn't dismissing his contributions. Just that luck does play into it. Meeting Woz, for example, was fortunate for him. Jobs likely would have been successful anyway, but maybe in a completely different field.


And yet... consider what products Jobs was associated with without Wozniak, Atkinson, or Hertzfeld (to name only 3 of the deservedly famous techies who worked with him) involved. Then consider what products these three were associated with without Jobs involved.

At some point, "luck" ceases to be a particularly viable hypothesis.


What are you saying? Did Jobs's suggestions improve things so much that they wrapped around to being terrible or something?

Because virtually every single desktop or mobile UI in the world, including Mac and iOS, does the Smalltalk thing and highlights the text, not the Jobs thing of putting a box outline around it.

This is embarrassingly unrepentant idol worship.


1. Jobs' suggestion of outlining text is not the norm today.

2. Pixel-by-pixel scrolling is, but the fact that it was done line by line on the Xerox machine could have been a deliberate performance concession on that 70s-era hardware. Redrawing large sections of the screen over and over (for every pixel of scroll) was very slow by modern standards, and not redrawing every pixel was often preferable on those systems even if you could do it.
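Some rough arithmetic on that concession, with display dimensions assumed for illustration: jumping a whole 16-pixel text line shifts the framebuffer once, while smooth scrolling shifts it 16 times, one pixel per step.

  # Back-of-the-envelope for an Alto-class 1-bit display (~606x808;
  # width rounded up to whole 16-bit words; numbers are assumptions)
  bytes_per_row = 608 // 8                  # 76 bytes
  rows, line_px = 808, 16
  one_shift = bytes_per_row * rows          # bytes moved per full-screen shift
  print("jump a whole line:", one_shift)            # one shift of 16 pixels
  print("pixel by pixel:   ", one_shift * line_px)  # 16 shifts of 1 pixel each

(Per Alan Kay's comments above, on the Alto the choice wasn't actually about speed, but this is the trade-off being speculated about.)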


Pixel-by-pixel scrolling was so hard that one of the parts of the original MS Windows drawing API was "shift up/down by one pixel" for the purpose of scrolling, and this was hardware-accelerated on systems that supported it. The API was otherwise completely low-level.
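A hedged sketch of a call in that family, ScrollWindow from classic Win32, driven here from Python via ctypes (Windows-only; the foreground-window handle is grabbed just to have something to demonstrate on):

  import ctypes

  user32 = ctypes.windll.user32
  hwnd = user32.GetForegroundWindow()  # some window to demonstrate on
  # ScrollWindow(hWnd, dx, dy, lpRect, lpClipRect): NULL rects mean the
  # whole client area; dy = -1 shifts the contents up by one pixel
  user32.ScrollWindow(hwnd, 0, -1, None, None)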


See my comments above. And if you take a look to see what the Alto could really do graphically, and even more so the Dorado, you would not worry about speed. The decisions were for other reasons.


I don't like to shit on Steve Jobs; he was a cultural father (like Gates, or other figures of that cherished era). That said, what Jobs saw was the outer layer, the shallow experience. That has hard consequences:

- it's what the user wants to feel quickly: beauty (it also drives sales)

- it's not what the user wants to have: understanding (it's... nonexistent for a commercial market)

Both are true; still, the elision of the underlying layer saddens me to no end.

Another example: the copy key. In mainstream OSes you copy some text; in the PARC OS, you copy an object, the whole graph. It's life-altering in capabilities and simplicity. It even had a dedicated key!
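A sketch of the distinction being drawn, in Python: copying a rendering of an object (text) versus copying the object graph itself. The document structure here is invented for illustration:

  import copy

  doc = {"title": "memo",
         "body": ["para 1", "para 2"],
         "style": {"font": "Cream", "size": 12}}

  as_text  = str(doc)            # what text-only copy/paste moves: a flat rendering
  as_graph = copy.deepcopy(doc)  # what an object-copy key moves: the whole graph

  as_graph["body"].append("para 3")
  print(doc["body"])  # ['para 1', 'para 2'] -- the original graph is untouched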


On the Mac desktop, copy and paste has long (always?) worked with file & folder icons, i.e. objects.


I think the parent is talking about that behavior, but extended throughout the entire OS. You can see this in some apps, one that comes to mind is Sketch and its concept of "symbols".

I think the complaint is that copy/cut/paste was weak in the beginning and generally just worked with plain text. This weakened the metaphor to the point where we now don't expect much else, but it could have been much more powerful.


I believe it was a per-application design. The file manager could copy folders and files (on Windows too), but you can't copy just anything out of the box.


The macOS Finder comes in the box. Copy paste a file icon to replicate a file. I just tried it and it works.


Can you copy an application? A window? A structured paragraph?


Yes to an app, no to a window and I don’t know what a structured paragraph is. I can’t explain how but being able to copy a window seems like it would break the metaphor.


An application (*.app), yes. Window, no. Structured paragraph, yes.

Shrug.


See my comment above, including the last paragraph.


He complains no one asked him until now, but the question is a bit like asking John Lennon "Wow what's it like to work with Ringo?" Well it's like busting ass writing most of the songs, and Ringo shows up and records the drum part and smiles, and becomes everybody's favorite Beatle.

This analogy is not guaranteed. No refunds.


Your downvotes prove my point. Popularity is a hell of a drug.


What is amazing to me is that the "invention" of the computer used the already established norms of our physical reality as its guide. From the second image description:

"once windows are created they overlap on the screen like sheets of paper"

As we turn computers into android robots, it's clear that the credit never belonged to man himself. The wonderful idea was already known by God, who gave man the sense to emulate a good thing.


But overlapping window managers are hardly the best and only solution. I prefer tiling window managers. Did god invent tiling as well?


The finder is proof that there is no god.



