On a normal desktop OS, it is routine to use multiple applications in concert to do work, using files and the clipboard as means of composition.
On an iPad (or mobile devices in general), it is indeed possible to build delightful and useful end-to-end integrated suites for doing so-called "real work".
The problem is that if you want to perform a task that requires features from two or more applications, you may well be out of luck! Applications can rarely share data, often resorting to cloud services (and thus a mandatory network connection) as a clumsy workaround, and even copying and pasting data between applications can be challenging and error-prone.
On a desktop, any user can invent new workflows. On an iPad, a user can only buy them wholesale. Users are at the mercy of the degree of forethought the app developers put into their designs.
I completely agree with this. For decades, operating systems were designed to make it as easy as possible for us to create our own ad hoc solutions to problems by combining subsets of tools in customized ways. A folder holds a project, which contains a completely file- and app-neutral hierarchy of data files. A command line allows a constellation of tools to be combined via file- and app-neutral means. Various types of scripting let you create tools that call other tools. The GUI offered additional ways to use apps in combination: copy and paste, drag and drop, "save as..." from one app and open from any other that can read the file format. Plug in any kind of drive, or mount one via wire or wireless, and all is present before you simultaneously on your workbench/desktop/project directory, ready to be worked on with your favorite tools.
The OSes were removing friction and making it easier and easier to craft our own solutions from tools in an expanding and interoperating toolbox.
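That kind of file- and app-neutral composition is trivially concrete at the command line. A minimal sketch (the sample file and the word-count task are made up for illustration; every tool in the chain knows nothing about the others):

```shell
# Generic tools that know nothing about each other, chained with pipes.
# Create a self-contained sample file, then count its most common words.
dir=$(mktemp -d) && cd "$dir"
printf 'Plan the work. Work the plan.\n' > notes.txt

cat *.txt \
  | tr -cs '[:alpha:]' '\n' \
  | tr '[:upper:]' '[:lower:]' \
  | sort | uniq -c | sort -rn | head -3
```

None of `tr`, `sort`, `uniq`, or `head` was designed with the others in mind; plain text over pipes is the neutral interchange format, which is exactly the property the comment above is describing.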
And then we got iOS and the iPad, where each app has its private data: no more command line, no more generic project folders, no file system access, not even copy and paste at first, and years of "well, I guess you could email it to yourself" or "just pay us every month for iCloud instead of plugging in a thumb drive" nonsense. Those restrictions are slowly and awkwardly being eased, but we're being assured that having self-contained, pre-designed apps for everything is the new way. Apple (which seldom makes claims about the future) is telling us explicitly that the iPad is what the future of computing is going to be. That is not appealing to me, nor is it intended to be.
While it is of course true that you can do "real work" if "there's an app for that", you can't even write your own app for that without paying Apple a yearly fee and getting their approval to put it in the App Store. I hope this won't be the only future of computing.
I wholeheartedly agree. In some ways, iOS and its spinoff iPadOS are the complete opposite of the vision that Apple pursued in the 1990s (before Steve Jobs' return) of scriptable, composable software through AppleScript and OpenDoc, which themselves were influenced by the everything-is-an-object world of Smalltalk. Sadly, AppleScript isn't supported by all Mac applications and seems to be rather deemphasized these days, and OpenDoc was killed in its infancy upon Steve Jobs' return.
The iPhone and iPad are fine platforms; I use them regularly for my work. However, they are a far cry from Alan Kay's Dynabook idea. A better base for developing on the Dynabook idea would be the Microsoft Surface line of tablets, which can run the full versions of Windows 10 Home or Pro and thus can run modern Smalltalk implementations such as Squeak or Pharo.
I wish there were more work done in the area of composable GUIs, where the core logic of applications is scriptable and where these elements can be combined in ways that could be even more expressive than Unix pipes are. Every now and then I have thoughts about taking a modern Smalltalk implementation and writing a suite of composable GUI apps as a demonstration of this idea.
This topic always takes me back to a Hacker News comment I read many years ago (https://news.ycombinator.com/item?id=13573373) about composable software versus monolithic applications and how commercial software companies are predisposed to support the latter while the free software movement should have focused on the former. I still don't think it's too late, however, as long as there are some vendors selling hardware that users can install their own software on without needing to use a certified app store.
> commercial software companies are predisposed to support the latter while the free software movement should have focused on the former.
There is a related issue at play here when it comes to how systems are developed. It takes an enormous amount of sustained effort and time to create whole computing systems (hardware, OS, design, etc.). The FOSS movement is, almost by definition, not capable of doing this, since it is dispersed and not funded at the level that today's major platforms have been. This is precisely why FOSS is so Unix-centric: Unix was the predominant free OS available at the time the whole movement really came about.
It would take a years-long, well-funded research effort to build a truly new computing environment. On the technical side, the desire and some of the knowledge is there. We simply lack funders that are willing to pony up a good amount to the right people and leave them alone. That is what ARPA, PARC, and Bell Labs were able to do in their time, and the disappearance of this kind of funding environment is exactly why we are still iterating on the accomplishments of those institutions today.
> I wish there were more work done in the area of composable GUIs, where the core logic of applications is scriptable and where these elements can be combined in ways that could be even more expressive than Unix pipes are. Every now and then I have thoughts about taking a modern Smalltalk implementation and writing a suite of composable GUI apps as a demonstration of this idea.
I agree with your assertion, but it only holds at a more advanced level, for a niche audience: us. Overall, for what they are and for the wide audience they cater to, iPads give a decent abstraction so one doesn't get bogged down by details. Yes, when it comes to the details it unfortunately sometimes falls short, and for that reason I personally have no real use for an iPad; I can get by fine with just a smartphone and a desktop/laptop for most of my uses. As standalone devices, though, iPads can be put to a lot of uses.
I think of the iPad as a computer the same way I think of Visual Basic as a programming language. It may make 95% of things easy, but that 5% is a real pain!
If iPads didn’t have a productivity problem, then Apple wouldn’t have to dance around that narrative, and our conversations would shift more to Apple's direction — that the iPad can be your first and primary device. That would be another revolution in personal computing. That iPads are so well priced and that this hasn't happened says something.
Apple has pushed the narrative of "What's a Computer", so they're just reaping what they sow.
> that the iPad can be your first and primary device
For tens of millions of people it can be.
Evidence: hundreds of millions of iPhone users who do not have computers. It's why Apple started iCloud backup and restore for the iPhone — so people without computers can upgrade without losing all of their data and settings.
I'm not implying that work done in integrated suites or "apps" is illegitimate. I'm just trying to close the philosophical gap by explaining that this sort of work is a subset of all the work one might want to do with a computer.
What if your work requires detailed onscreen notation and drawing, or requires mobility and a rear camera? These aren't subsets of work a computer can handle without bulky and expensive peripherals.
I have an older Surface with front and rear cameras and a pen. I can draw, annotate documents, and take photos just fine while using Windows 10. I can also use the whole universe of software available for Windows. Occasionally I might have to plug in a mouse or attach the keyboard/touchpad cover to use older software that isn't multitouch-friendly.
If Apple sold a device that was physically identical to an iPad but ran OS X (with some minor affordances to close the gaps between touch, pen, and mouse input, much as Microsoft has pursued in recent Windows iterations), I'd be ecstatic. I've considered trying a Modbook, but the price is a bit eye-watering.
Having access to a desktop-style OS would only add to the range of things you could do with smaller devices. Would it be harder to use than an iPad? Maybe.
You can put Linux on most Surfaces, and many of the cheaper Windows tablets. With pmOS, you may soon be able to put a desktop Linux OS on many cheap Android devices. The UX work for these use cases has been done - GNOME is now very much a mobile-ready and touch-ready environment, supporting real professional work. Plasma Mobile is not too far behind.
>What if your work requires detailed onscreen notation and drawing or requires mobility and a rear camera?
Then your work is not a typical example of real work, the same way that being a rock star is not -- even though people exist who are rock stars and get paid for it, so it's still "real work" in that sense.
It's you who want to enforce much narrower areas of work as characteristic of real work.
"Real work" in the casual sense (what most people consider work, or what most people's work is like) is what agrees with the statistically wider kinds of work people do.
This also has nothing to do with gatekeeping. People still get to be stunt drivers and rock stars and vloggers paid to eat their lunch live on YouTube, whether we consider what they do representative of "real work" or not.
It's real work in the sense that it involves real effort ("work" here used as in the phrase "hard work").
But "real work" for the purposes of this thread is work that is representative of what most people do / realistic for people to have -- and statistically speaking, being a rock star is neither common nor representative of the kind of job most people do.
To recap the argument made:
A: People can't do real work on the iPad because it lacks X.
B: Really? And what about the people that are Y, they don't need X feature.
A: Well, Y is not real work.
What person A is saying is not that (a) Y doesn't exist, (b) nobody has Y as their role, or (c) nobody makes money by doing Y (and that it is thus their work).
The charitable interpretation, which is obvious since we're talking about e.g. "fighting aliens" (which indeed is not "real work" in any sense of the term), is that A means Y is not really representative of the work most people do, and that the percentage of people doing Y is too small to count for the purposes of how useful an iPad is for real work.
Well, you might still do real work (if you get paid for doing whatever you do) but you're in a much much smaller niche of real work.
It makes sense then to define "real work" by what the majority of "real work" is, not the exceptions, even if the exceptions are still real work.
Like the phrase "Nobody uses X" means "very few use X, so few that it doesn't matter", "Y is not good for real work" means "Y is not good for 90% of real work" (the most common types), not "Y is not good for any kind of real work at all, ever, period".
I think I see more normal working people using iPads to do their 'real work' day-to-day than I see using laptops to do their 'real work'. I think it may not be as much of a niche as you think it is.
Do you have numbers to back up this statement? I would expect the group of people not requiring composability to be larger than the group that does.
It is kind of a shame that in the day and age of containers we can't just have Linux as an "app" on our iPads. Instead people are forced to write a worse version of everything and bundle it together as an app.
I kind of appreciate Apple's approach to simplicity. I use an iPad because I don't want to maintain another computer: syncing configuration, updating packages, etc. But it basically means that I use it as a "dumb terminal", even though I hear it has a surprisingly fast processor. Seems like a waste.
It takes a tremendous amount of configuration to stay in your IDE 100% of the time. I refuse to believe you never have to copy and paste from another app into the IDE.
Add Git credentials, let importing a project take care of the setup and build, and you can create new Run tasks in most IDEs for calling a different part of your build file, e.g. deploy or test.
And I occasionally copy/paste from a browser. But you can do that just fine on an iPad or iPhone.
This was true, but is no longer true in most cases. Copy/paste works between many iPad applications. Increasingly, applications can expose interfaces that allow information to be exchanged between them. Pythonista, in particular, provides a number of ways to access and process data from and between applications. The Files app has also contributed to making this simpler.
That said... it’s still less productive to move data between applications than on a PC or Mac. The gap hasn’t been closed yet.
> Copy/paste works between many iPad applications.
Meanwhile, on desktop I can't paste between Emacs and Confluence because Atlassian went out of their way to break it.
I know what people are trying to get at here, but I think there's also a lot of tunnel vision and selective memory.
Realistically, it's only easier to share data on the PC if the application developers made it easy to share data. There's no shortage of proprietary and undocumented file formats. Heck, there's a small industry built up just for converting CAD files between different formats.
> on desktop I can't paste between Emacs and Confluence because Atlassian went out of their way to break it.
Which presupposes that by default (i.e., without Atlassian messing things up), it would have worked.
On desktop applications, file-level and copy-paste interoperability are there by default.
This is the difference: how much interoperability do you get by default, i.e., without specific actions from the app developers?
The fact that you can break interoperability is irrelevant.
> On a normal desktop OS, it is routine to use multiple applications in concert to do work, using files and the clipboard as means of composition.
Applications share data on the iPad just like on the desktop - copy and paste and opening a document in multiple apps.
> often resorting to cloud services (and thus a mandatory network connection) as a clumsy workaround, and even copying and pasting data between applications can be challenging and error-prone.
You can store documents on your device that can be shared just like you can on iCloud and Dropbox.
> On a desktop, any user can invent new workflows. On an iPad, a user can only buy them wholesale. Users are at the mercy of the degree of forethought the app developers
Apple actually bought the company that made the Workflow app and integrated it more tightly into iOS to allow cross-app automation.
...and there’s Pythonista: http://omz-software.com/pythonista/ ; I have some Python shortcuts I use.
And something similar for Lua, and some variants of Jupyter notebooks.
I often use Panic’s Code editor (formerly Coda) and Working Copy for my git repos.
BTW, you might check out Pyto. I’ve completely replaced Pythonista with it (YMMV, of course), and it has the enormous advantage that it’s actively developed with new releases very frequently.
I’ve had hardware failures/loss at a mercifully low rate of one a decade. I learned long ago that storing work on your own computer is a recipe for disaster. If it’s worth doing, it’s worth uploading to the network.
I can tell not everyone feels that way when I tell others that my computer died and they offer sympathy like I just lost a relative. It shouldn’t take more than three hours to set up a machine with your work stuff, or somebody isn’t doing their job.
Having the ability to back stuff up via a network is great.
Most of the time when we say "Cloud Services", though, it means exactly one specific vendor. If an app is designed to integrate with Dropbox and GitHub, I can't necessarily substitute an alternative, or run my own replacement instance. If a cloud-based solution becomes unavailable because the company is bought out, shut down, drops old API support, or simply because I've stepped onto an airplane, I'm stuck. The cloud is both a convenience and an accident waiting to happen.
My NAS at home has never sent me an email thanking me for participating in an Incredible Journey and disappeared.
I'm not talking about backing stuff up. I'm talking about not keeping data of business value on single points of failure in the first place. Those hardware failures weren't resolved in 3-4 hours by restoring backups, they were resolved by fetching and setting up things from systems of record. Version control. Artifactory. Runbooks in a Wiki.
If this data is useful, why am I bogarting it? I should be sharing it as soon as possible. If some eyes are too sensitive to see rough drafts, we can use obscurity or permissions to control premature messaging.
It's the same argument as Bus Numbers, or for DVCS, or feature branches. I don't want days or weeks worth of work hanging out on some engineer's system, which can get wet or take a tumble down the stairs on the way to a meeting or stay home for a week while down with the flu.
Backups are for people with a bus number of 1 (i.e., mismanaged teams or personal projects).
Getting locked out of online accounts is the 21st century's version of the twentieth century's hard drive crash: "It's not a matter of if, but a matter of when."
So you should have a second repository of any data you have in any account, in a different location, provider, hard drive, or whatever. Because it's only a matter of time until you need it.
Say, logging into your account in a fashion that trips whatever heuristic they have for deciding whether your account is compromised or just 'breaking their terms of service', with no further explanation. For example, using 'suspicious' IPs like those from some cloud provider, or logging in from multiple countries in a short period of time, as these algorithms were written post-internet but pre-air travel. Who knows, honestly.
And yes this has happened to me, coincidentally after a period of travelling.
Some backed up hard drives in my house and a family member's are probably more reliable.
On the other hand, I’ve found it way easier to script complicated workflows with x-callback-url and Shortcuts on my iPad than on my laptop. Replacing some of the shortcuts I use daily on my MacBook would probably involve busting out Xcode and making a full-blown app. I could, sure, but making a shortcut (augmented with Python or JavaScript as needed) is easy and fun.
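For anyone unfamiliar with the scheme: an x-callback-url invocation is just a URL with the target app's scheme, an action path, and percent-encoded parameters, including callbacks the receiving app opens on success or failure. A minimal sketch of assembling one for Shortcuts (the shortcut name "Resize Image" and the `myapp://done` callback are made-up examples):

```shell
# Assemble a Shortcuts x-callback-url that runs a hypothetical shortcut
# named "Resize Image" and returns to a hypothetical app on success.
name="Resize%20Image"          # percent-encoded shortcut name
success="myapp%3A%2F%2Fdone"   # percent-encoded x-success callback URL
url="shortcuts://x-callback-url/run-shortcut?name=${name}&x-success=${success}"
echo "$url"
```

On-device, opening that URL hands off to Shortcuts and control returns via the `x-success` callback; the app-to-app glue is just URL construction like the above.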
This thread is an incredibly frustrating example of people talking past each other, apparently just for the fun of it.
The iPad lacks flexibility and as a result it is less generally useful than a "PC". Can we agree on that? I own a 12.9" iPad Pro, I love it, and I'm fully bought in with the pencil and the keyboard folio and all that shit. If I want to write or draw or browse the web or watch a video, I reach for my iPad first because it does all those things just as well as or far better than my laptop, and nothing would delight me more than to be able to toss my laptop in the trash (or whatever). But I can't!
Neither can my partner, who does most of her computing (largely consumption) on her iPad, but often has to interact with institutional websites that have poor compatibility. Neither can my mom, for whom I bought an iPad in 2012, which she still uses and loves, but not even the latest and greatest iPad can support her hobby genealogy research workflow, so I had to buy her an iMac, too.
The author's tone may strike some people as inflammatory or self-centered or whatever, I guess, but his essential point is right: the iPad's apps-first design and locked-down nature keep it from being a first-class general-purpose computing machine, especially for creative work. That it works for some people is great--we don't have to fight about that, nor should we. Obviously it is a crazy good form factor and fully capable for many real professional tasks and workflows. But it should not be recommended as a PC replacement for somebody who hasn't looked at all the angles, because it simply isn't a fully capable replacement, and that's what the author is talking about. If anything, I think the author's point is made more narrowly than it could be.
The frustration is not that it's different or less, it's that it could be better if apple was actually willing to try.
You don't need to sacrifice composability just because of the tablet form factor. You need to be careful with the input methods, sure, but composability on a tablet could actually be better than on a PC if designed well, but we'll never know until someone bothers to try.
I'm in the camp that thinks Apple made a mistake adding multitasking to the iPad. I don't want my iPad to be more PC-like. The changes were significant and iOS is less stable now than it used to be. Even if they get all the bugs ironed out, I'm still a little sad that they clearly want to take the device in that direction.
For similar reasons I thought Microsoft made a big mistake dumping the Windows 7 UI for Metro (or whatever it's actually called).
If I want to do PC-type stuff, I'll do it on my PC. Tablet stuff I'll do on a tablet.
I don't really like the multitasking implementation, and I definitely don't need it. Multitasking was never one of the things standing in the way of my iPad being a PC replacement for me.
I can use my PC for all those things if I really want to; it's just less convenient and less compelling a form factor. Those things are the big reason the iPad sells despite not being a fully-capable PC replacement.
But you can use your iPad for all these other things too. That’s the point of this. There are workarounds in both directions.
Nothing is stopping you from using an iPad to VNC into a Windows PC on AWS to access that website. It’s inconvenient and complex, just like using your PC on a treadmill, but it’s certainly possible.
Or I could hire an assistant to sit in front of a computer, follow my directions issued via Facetime running on my iPad, and read back the results (or not, if I only care about side-effects).
Is that interesting in the context of a frank discussion of the iPad's capabilities as a general-purpose computing device? No, it isn't.
Well people are asserting that they can use a PC - by which I mean a laptop with traditional form factor - on a treadmill, and they clearly can't - not without a bunch of hardware to help them out. And while I haven't done it, I'm pretty sure I can just spin up a desktop PC in the cloud and remote into it; it's not that hard, but it is inconvenient.
The point is that PCs have got use cases, iPads have use cases, and they both help people do "real work". In both cases there are situations where the form factor or operating system is an impediment to getting "real work" done.
The fact that one aspect of iPad - multitasking - is rough around the edges doesn't detract from the overall proposition of iPad or the ability for it to be used in "real work". The form factor itself is a huge benefit to certain types of job, the OS is built to work with that form factor, and this is something that iPad does better than anything else (at least if sales are an indication).
The PC is not remotely the best tool for every computing job, and just because you're, say, a pilot, doesn't mean you're not doing real work.
> Well people are asserting that they can use a PC - by which I mean a laptop with traditional form factor - on a treadmill, and they clearly can't - not without a bunch of hardware to help them out.
I feel like perching it on the control panel would work pretty well, or perhaps a tall side table if you did it a lot. Not sure what's so hard about any of that, but it's not really all that relevant IMO.
> And while I haven't done it, I'm pretty sure I can just spin up a desktop PC in the cloud and remote into it; it's not that hard, but it is inconvenient.
Can my mom do that? I mean, she's intellectually capable of learning how, presumably, but I wouldn't even bother with that rigamarole. I'd just buy a PC, and so would she. Both of us own PCs and iPads, but it's pretty clear which one we'd choose if we could only afford one or the other, and that's the point of all this: if you are a person who sits down at your computer and makes things, if you can only afford one or the other, without any other information (and most people will really have to dig into it to be sure, and still risk getting burned in the future), it's obvious which one you should buy.
> The point is that PCs have got use cases, iPads have use cases, and they both help people do "real work". In both cases there are situations where the form factor or operating system is an impediment to getting "real work" done.
I never said you can't do "real work" on an iPad. I do "real work" on mine all the time, and the whole point of my original post in this thread was that people are—as you are now, to me—talking past each other, using this "real work" phrase as an excuse to do so.
I'm typing this comment on an iPad. I love my iPad and I have a very good idea of what it can and can't do. All the things it can't do (for me, at least) are 100% imposed by the operating system and apps-over-files model.
> The fact that one aspect of iPad - multitasking - is rough around the edges doesn't detract from the overall proposition of iPad or the ability for it to be used in "real work". The form factor itself is a huge benefit to certain types of job, the OS is built to work with that form factor, and this is something that iPad does better than anything else (at least if sales are an indication).
Multitasking isn't the thing that stops me from totally replacing my PC with my iPad, and it's not the only thing the author of the OP article brought up (if he even brought it up at all).
See above re: "real work".
> The PC is not remotely the best tool for every computing job, and just because you're, say, a pilot, doesn't mean you're not doing real work.
See above re: real work. At no point in this thread have I suggested that you can't do "real work" on an iPad, and, again, I am personally a counterexample to that claim.
I don't think remoting in to an entirely different device really counts as a workaround, that's just using a different device because the iPad can't do it.
The "i-devices are just consumption devices and toys LOL" viewpoint, which is shockingly still common in programmer circles, seems to rely on a bizarrely narrow idea of what "doing real work with a computer" looks like. Meanwhile iPads and in some cases even iPhones are wildly better devices for lots of work, creative and otherwise, than a "real" computer.
That viewpoint is common in programmer circles because it applies most strongly to them. The work they've been doing for likely their entire career is still either painfully difficult or impossible to do on anything other than a "real" computer, so of course they're going to say that for them an iPad is a terrible replacement for one.
> You can't even compile and run code on a (mobile) Apple device. It's explicitly against the app store terms and conditions to allow users to do that.
You better inform Apple then, that the Python IDEs that are available in the App Store need to be banned!
Last time I checked, the exact rules were that all of the code had to be either typed in by the user or included with the app. So there's no real way to share code or collaborate other than copy-pasting into text windows.
I haven't used Pythonista myself, and it is true that, as I understand it, Pythonista doesn't directly provide the capability to use "pip" or pull code from GitHub. But it seems that these issues have been addressed via plugins for Pythonista. E.g.,
I think we will need some examples to support that claim.
Retail is one. Any retail company that doesn't have a legacy system to integrate with uses iPhones and iPads for their POS systems.
Logistics is another. According to an ex-girlfriend who works in LTL, the logistics industry is 90% people with iPads and iPhones. The only people who aren't using mobile devices for their "real work" are people like her who are chained to a desk doing back office stuff.
Tradesmen is another. When my water heater burst, the water heater guys, the plumber, the guys who replaced the floor, the guy from the gas company, and the guys who replaced the walls all took pictures of the situation before, during, and after to document what they did. Copies went to me, to the landlord, and to their office.
I have yet to see the last one lol. The tradespeople here still use the clipboard case that has their forms inside, sometimes with carbon paper. And they all seem to take cash or checks. XD
They're much, much better, and extremely useful, for just about any work that involves building or fixing things in the real world, to pick one large category of work. I'd personally find a laptop maybe 1/10 as useful as an iPhone (or smartphone more generally), at best, when, say, working on my house or car. Watching construction contractors work, they seem to love their phones and use them constantly, too, and for a lot more than making calls and sending messages.
It's like carrying around a whole office's worth of office equipment in your pocket. It's a document scanner, you can use it to measure stuff, to get paperwork signed, to take pictures and look at them later, it's a level or flashlight in a pinch, you can load reference materials on it, and on and on, and that's without getting into really specialized software or peripherals that let you do all kinds of cools stuff.
Pretty much every large retailer I've been to in the last 6 months has most of their employees supplied with smartphones of some sort, usually with a barcode scanner, that they can use for inventory processes. I've walked up to employees at the grocery store and a hardware store and asked for a specific niche item, and they've pulled out their phone, searched for the item, and found its location in the store, its price, and the number in stock. Seems like they're doing a lot of work with them.
How about taking a picture of something and sending it to someone? Cameras are not only used for selfies; lots of inspection work basically requires taking a picture and submitting it.
The workflow of snap picture with camera -> eject SD card -> insert SD card -> photo app for resize -> email attachment is really slow and inconvenient compared to add attachment -> snap picture -> send.
Agreed, both the iPhone and iPad are amazing for digital art.
I have a project where I try to create at least two pieces of abstract art every day on my iPhone. It’s great for getting ideas out quickly. Over 1,500 pieces later, I’m still going.
Also, on the iPad, Procreate and the Affinity apps plus the Pencil are amazing.
I think the reason for that is simply that programmers can’t do real work on these devices, while the many people who do real work on them don’t write about it much.
Indeed. Let's be clear: pretty much 100% of coffee shops, food trucks, etc. use iPads to do the REAL WORK of being a very flexible and portable cash register.
I designed a Food Safety application for iPad devices for a major retailer. Food Safety specialists / inspectors would go through all parts of a store - the produce on display, the meat locker, the freezers, the prep / cut table areas, etc.
Hauling a laptop through all those environments was a pain. They would typically need to set their laptop down to type anything, even a brief five word note. Using a laptop could be done, but an iPad with a strap on the back enabled them to hold the device with one hand and input inspection results with the other. This along with other app improvements made the entire inspection process easier and faster for everyone involved.
Think about how many people use clipboards and printed forms in their work. Ask why they need the mobility of a clipboard. If those reasons are sound, then they would probably be a good candidate for an iPad / tablet application for their workflow.
I'm not a professional photographer, but the 11in iPad Pro has been my main photo editing machine for a year now. The USB-C port and external drive support via iPadOS makes for a simple workflow using my DSLR and editing pipeline. If I ended up doing this kind of work professionally, I could see myself continuing to use this current workflow. I guess that would be real work too.
I don't know, but when it comes to editing photos out of my DSLR, I want the biggest screen possible (and matte), not some kind of tablet-size glossy screen that reflects everything around it.
Matte can be added onto an iPad with a screen protector. My iPad Pro is by far the best quality display I own. It’s definitely smaller than I’d want to be confined to all day, though.
Edit: I wish they’d release an oddly large iPad Pro. I think some interesting use cases might exist in the 24”, 30”, and higher screen sizes. Not unlike LFD displays.
external drive support - first time I've heard of this.
I have generally found it maddening that Apple has set it up so that everyone has to "ask permission" before something is allowed on an iPad/iPhone/iPod. They never allowed this before, except to import from a camera (and even then, the same camera adapter that worked on an iPad was not allowed on an iPhone).
The author's choice of words is terrible ("real work"); I'd phrase it as "making digital products". E.g., in the fitness instructor example, the product is of course the fitness training. I think most people who use this phrase "real work" are in the digital product business themselves, e.g., programmers, designers, even people writing for websites (although I think an iPad works decently well for web writing already, if you're willing to make a couple of adjustments). I think the author is only really looking at work through their own prism of being involved in making digital products. And files have been remarkably resilient in that domain (while there have been some inroads into non-file-centric workflows, e.g., No Code comes to mind). It's overwhelmingly how those products are made today, and, as the author points out, one of the iPad's biggest flaws, if you're trying to use it to make digital products, is its resistance to working with files.
I agree 100% that calling it "real work" is terrible, but I still think the author's points about the resilience of files in the process of making digital products are important.
Agreed. The vast majority of humans who are doing “real work” on computers use just one or two applications. The vast minority of people are software developers and IT workers.
Lawyers, accountants, nurses, doctors, paramedics, tree surgeons, baristas, plumbers, electricians, shop assistants, mechanics, jewellers, cleaners... most of these people - if they want to use computers at all - want them to be less complicated, they don’t generally care about interoperability between applications, and iPad in some cases fits the bill. Its simplicity for some of these use cases is an enabler.
To my mind there seem to be two independent narratives at play here:
- “is the iPad better than a general purpose computer?” for many use cases, yes it is.
- “Is it perfect?” Perhaps not yet. But it’s more perfect for those use cases than what came before.
Then why don't these people use iPads? (Maybe because iPads are actually more complicated for most common work-related tasks.)
Your claim that only a minority of people use more than two applications needs a source, because I don't know a single person who gets by with just two applications for work. Pretty much everybody uses at least an email client/office messaging app, a browser, and Word, and those are just the generic tools; there are very few people who don't also use work-related, specialized tools.
This guy used to write pretty good articles, but this one is egocentric nonsense. I don't personally use an iPad for work much either, but I edit video and photos, and make music on one - all three of those tasks I prefer to do on my iPad.
When Joel Spolsky was starting out I agreed with at least four out of five things he said and the fifth was more of a “huh?” As time went on quite a few “no, Joel” and a few “yikes” reactions began. He petered out when I still thought half of his stuff was good. I’m sure others have kept going.
s/real/office/g and then the article reads just fine.
Otherwise, yeah, it makes little sense, because, yeah, just about every other kind of work, short perhaps of literal bricklaying, has been radically changed by tablets. The last time I had to visit a hospital, I got the impression that they might have had more money invested in iPads than in CT scanners.
I don't see this as a failure. Expecting the iPad to be positioned as a device that would outright displace an existing line of Apple products that's typically sold at 2-3x the price point strikes me as an impressive failure to intuit Apple's priorities. The iPad was intended to break into - and even create - new markets, not to cannibalize its cousins. And if you look at it that way, it's been an amazing success.
Bricklayers may not use tablets for work, but I bet they use smartphones a ton, and not just as communication devices. I know I use mine constantly when doing "real world" work—my laptop's a much worse camera, flashlight, level, and measurement device than my phone, and that's just the things it's worse at without even factoring in portability (which would include almost everything else one might want to use an Internet-connected device for on a job site, which a laptop would do alright at except that it's huge compared to a phone).
[EDIT] "but why's a camera important to construction work?" want to get a usable estimate of how much tile you need for a room in about 5 seconds flat, recorded for later use? Snap a photo of the floor with the 4' by 8' plywood exposed. Done, you can walk away now. Ditto drywall if the studs are exposed and follow standard spacing, and so on.
Also, pretty much every pilot uses an iPad of some sort to do their real, and extremely critical, work. I don't get these "here's why a thing failed: because I decided I didn't like it and then manufactured some reasons why" articles.
> What is this silly ‘real work’ gate-keeping? I guess the author means their work.
This is about productivity. Real work is usually about being able to do many different things with a single machine. This typically requires interoperability between applications, a file system (to access data through a unified interface even outside of apps), and standard ways of storing data. The iPad can be very good for doing very specific tasks in one application, but everything else is lacking.
Of course, if your "work" depends only on a single application, then you will be happy with pretty much anything. If you need to work with a wealth of different applications, at the same time, on the same machine, then "real work" won't be possible on an iPad easily: probably do-able, but akin to painting with spaghetti instead of brushes.
> Because by 'real work' we mean creating stuff, not consuming stuff.
Why is creative work the only 'real work'?
Do you honestly think someone who does an exhausting 12 hour shift repairing trains, covered in grease, is not doing 'real work' because they aren't creating anything?
I didn't say that. I'm just explaining what the original article writer's phrase ('real work') means here in this context - the context of personal computer users. Think of it like this: by 'real work' he wants to say 'really using the personal computer to its complete potential as a force multiplier'.
Do you include a scanner and drawing tablet in your example of a ‘single machine’? What about a high-res mobile rear-facing camera? These are built into the iPad.
Would you be able to expand on what your setup looks/looked like? I've long wanted to get an iPad to do development work, but haven't been able to find a great setup for JS/Go development.
For me, real work means "realistically replace my laptop." The iPad cannot begin to do that. Apple could have designed IOS as a tightly integrated touch/keyboard-optional system, but they chose not to. Instead it's a bunch of app-centric silos that is decidedly hostile to most of my laptop-like keyboarding use cases.
Not irrelevant at all. The previous commenter's point was that many workflows have been made difficult or impossible by the design of the OS, and not for pro-user reasons. The iPad could support many additional use cases without compromising at all on the ones it already serves. That is the frustration that is being expressed.
Honestly, the apps for this cost like 3-10K a year, and we're a small facility and a small factory, with a single iPad. So we have this web-based software named SimplyISO. We created the templates and forms/checklists in there, managers get reports, etc., and we threw the iPad into Kiosk mode to lock it down.
> He uses files to do it as well! Opening an Excel spreadsheet.
I find it interesting that your first example of a file-centric (rather than app-centric) workflow describes the file by the application that creates/views/edits it. Can you do anything with this "Excel spreadsheet" on an iPad other than open it in Excel?
I love my iPad, but I think it's safe to say the whole is less than the sum of its parts. If Unix is about composability, iOS is the opposite. You can get work done... within an app. Your apps will never act as a multiplier to make the device as a whole more useful, though.
I guess where I diverge from the author is that I don't really think it's a problem that it's mostly a consumption device. I think desktop computers are so thoroughly better as creative devices that maybe it's fine that the iPad has its own niche.
I also don't think it's a problem, I just think it's a missed opportunity. I think the iPad could have had the potential to be a better computing device for many, if not most, people, had Apple made better decisions.
>What is this silly ‘real work’ gate-keeping? I guess the author means their work.
I used the term "real work" because that's the term people use when they write "you can actually use the iPad for real work" articles. I think it's clear from context that "real work" means the kind of work Steve Jobs introduced the iPad for, i.e. the work people traditionally do on PCs: writing letters, creating presentations, doing spreadsheets, and so on.
I do acknowledge that people do real work on iPads; I pointed out some examples in the first paragraph. The problem is that these are clearly niche tasks, as is reflected in the iPad's sales numbers.
The iPad works well for some people, who do very specific tasks. That's genuinely great for them, but it's clearly not what Apple had in mind for the iPad. And in my opinion, it's not what the iPad's true potential could have been.
I think by real work they mean the kind of work you do for 8 hours or more a day ("get shit done" type of work): using Excel on the desktop with all its bells and whistles, checking stuff in your CRM and making a report based on it, programming, etc. Sure, even sending emails can be real work (and there are many people who spend half their time at work just sending emails), and you have been able to do that on Symbian phones for more than a decade now. Putting the term "real work" into context makes sense here, because the iPad gets compared to laptops a lot, but it can't even remotely be used for the breadth of work that you can get done on a desktop in a practical sense.
He clearly means real work in the realms traditionally requiring computational power. Except for Excel, all the things you mentioned could easily be done with a piece of paper. And even then, good luck converting an xls file to csv and then using a text editor to manipulate it on an iPad.
Real computer work is making software or graphics that require either computational power (video, graphics, music) or access to the Unix shell (programming). The first category is ready to go, capability wise, on iPad, but like the author said there's no incentive for those companies to make iPad versions of their software. The second category is a non-starter because of the App-centric nature of iPadOS.
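For contrast, the spreadsheet wrangling alluded to above is a few lines of scripting on a desktop with a file system. A minimal sketch in Python's standard library, assuming the spreadsheet has already been exported to CSV; the column names and data here are invented purely for illustration:

```python
import csv
import io

# Hypothetical CSV export of a spreadsheet. On a desktop this would
# come from a real file (e.g. open("report.csv")) produced by
# "Save As... CSV" in Excel or LibreOffice; it's inlined here so the
# sketch is self-contained.
raw = "item,qty\nwidget,3\ngadget,7\nwidget,2\n"

# Sum quantities per item -- the kind of ad hoc, file-centric
# manipulation that's awkward without general file system access.
totals = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["item"]] = totals.get(row["item"], 0) + int(row["qty"])

print(totals)  # {'widget': 5, 'gadget': 7}
```

The point isn't this particular script, but that any app's exported file can feed any other tool without either app's developer having planned for it.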
I agree that "real work" has different meanings for different people. OTOH I can see where the author is coming from when saying "real work" as in using the iPad in a productive manner on the device itself, and not as an accessory to another activity or for consuming content.
I've owned 3 iPads (currently iPad Pro 12.9) and while I love them for reading and consuming media I've never been able to do any kind of work on it other than answering a quick email.
I think the OS would benefit from keeping not only apps on the screen but also the objects that are acted upon by apps. There is little reason why it cannot go both ways; desktops don't care whether you click an app or an object/file/etc.
> What is this silly ‘real work’ gate-keeping? I guess the author means their work.
So... are there any web developers here (since it's a common occupation of many HNers) who do all their work on an iPad without any kind of desktop or laptop?
The work your fitness instructor friend is doing is fitness instructing, the teacher is teaching and the train engineer is doing train engineering.
Referring to documents on an iPad while doing other work != doing work on an iPad.
Other people have posted examples of things where the primary/producing part of the work is done on the iPad, though (drawing, audio production; presumably Excel/Word/PowerPoint or equivalents are decent).
Maybe the direction of the content, then? iPads are great for generating a subset of content and consuming almost all of it. If your work is (a) content creation and (b) tends to use a keyboard, it doesn't work great.
I don't understand, you are making his point here. Real work involves lots of files and you want an operating system that does not discourage apps exchanging files or apps managing abstract files just because of a failed, aborted crusade against the whole "file system" thing.
No I just gave examples of people doing real work who don’t need lots of files.
An outdoor instructor may just need a single file - their map. A train engineer may need a single file - their reference manual. They’re still doing real work!
Yes it is real work, of course. In all your examples the iPad is being used in a secondary function to support the main task (fixing a train or whatever). iPad is great for these types of work where it can be used for simple reference or information capture tasks.
Everyone in this thread getting offended by casual use of the term "real work" is missing the point. The iPad can obviously be used for "real work" -- but the point of this argument is not to criticize people and the work that they do, it's to criticize the limitations of the iPad's operating system.
Computer power users like composability, interoperability, "unix philosophy" / "do one thing well" -- whatever you want to call it -- because it's a powerful tool.
RodgerTheGreat summed it up really well:
> On a desktop, any user can invent new workflows. On an ipad, a user can only buy them wholesale. Users are at the mercy of the degree of forethought the app developers put into their designs.
Power users are frustrated because we like the iPad hardware and want to be able to use it, but Apple seems to have latched onto the idea that good tablet design requires sacrificing composability. I and many others don't believe this is necessarily a requirement. Yes, the form factor is different, and you need to be careful with input methods, but good composability should still be possible, and could even be better than on a PC -- but Apple doesn't seem interested in actually trying. That's the criticism.
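As a concrete illustration of the composability being described, here is the classic desktop pattern of chaining small generic tools to answer an ad hoc question no single app anticipated (the input file is made up for the sketch):

```shell
# Create a throwaway input file, then compose generic tools to find
# the most frequent line. No app had to plan for this workflow.
printf 'apple\nbanana\napple\ncherry\napple\n' > fruit.txt
sort fruit.txt | uniq -c | sort -rn | head -n 1
```

The pipeline prints the most common line with its count (here `3 apple`, with leading whitespace from `uniq -c`); each tool is oblivious to the others, which is exactly the kind of improvised workflow an app-silo model forecloses.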
The criticism is that the iPad isn't better than a PC? Neither is a hammer or a saw ... but both are used professionally every minute.
The iPad is a tool. Would someone argue that a hammer is flawed because it only can be used in some ways, some professions and for some tasks? A laptop is a tool, a PC is a tool, a mainframe is a tool, a calculator is a tool and a saw is a tool. So what?
Of course, a PC might be a better tool for your workflow, your work, your results. But that might be different for someone else. Work with the tools that fits your needs best!
Why do we need to criticize one tool because it is not equal or better to another? Gruber is right with regard to the discoverability and consistency of (multitasking) gestures on iPadOS. There is room for improvement. The lack of "professional" iPad software should be directed more towards the developers than Apple - why does some iPad app need to be crippled in comparison to its Mac counterpart? Valid concerns.
But calling the iPad a failure because it doesn't fit to your work, your workflow and your processes? Because it isn't the ultimate general purpose tool? I know authors still using a typewriter. I know some that use iPads because of the "single app interface" and less distraction. I know others using laptops (and complaining about distraction). All are doing "real" work (as many other professions mentioned in this thread) and all have their preferred way of working.
If a developer/power user is happy with his Mac/Win/Linux setup, why does s/he need to criticize the iPad for NOT being the right tool? Or are these folks just unhappy with their setups and desperately hoping Apple will give them a better (-toy-) tool for less bucks? Or is it just en vogue to bash Apple?
You're making exactly the mistake I described. You're getting hung up on how the criticism was lobbed, when it wasn't meant to be taken literally. Obviously it's not actually a failure: it's made billions of dollars for Apple.
> The lack of "professional" iPad software should be directed more towards the developers than Apple - why does some iPad app need to be crippled in comparison to its Mac counterpart?
The problem isn't the software, because the goal isn't to have one or two magically complex IDEs that can do everything, the goal is to allow useful ways to shovel intermediate data around in between apps. That has to happen at the operating system level, and so the criticism needs to be directed at Apple.
> Why do we need to criticize one tool because it is not equal or better to another?
Because it fucking could be better, and it's maddening that it isn't! That is a totally valid reason to criticize something!
If you want to use a woodworking tool analogy, it's like a tablesaw with a crosscut sled welded to the table. Sometimes I want to make a rip cut. I could buy a new table with a welded-on rip fence and spend an hour retooling every time I want to switch tasks... But interchangeable fixtures are useful, and I'm allowed to be upset when other people stubbornly refuse to acknowledge this.
If the issue persists and remains unacknowledged even after a decade of criticism, you should expect that people are going to start using hyperbole out of sheer frustration. But that's "frustrated person language" -- you still shouldn't get hung up on the literal phrases.
But I think "real work" is a bad criticism. If the kind of work it can do is the problem with the iPad, then why is the iPhone so popular?
The real "problem" with the iPad is the size. Laptops have been getting smaller and thinner (gigantic 17" laptops used to be a thing). iPhones have been getting bigger (more iPad-like, or at least iPad mini-like), and the iPhone is probably close to big enough.
It turns out Steve Jobs was wrong. The iPhone is not a too small version of an iPad, the iPad is an iPhone that's too big. That's not to say there's no room for it in the product lineup. It just happens to be more like the Mac Mini than the iMac.
For the third time, "real work" is not the actual criticism. It's inexact language used by a frustrated person, and you're taking it literally when you shouldn't be.
> If the kind of work it can do is the problem with the iPad
Nobody is saying this. The kind of work you can already do on an iPad is great. We're just frustrated because, with a little bit of thought at the operating system level, the device would also be capable of doing a lot of other things well, without any sacrifice to all of the existing apps and workflows that make the device popular.
> We're just frustrated because, with a little bit of thought at the operating system level, the device would also be capable of doing a lot of other things well, without any sacrifice to all of the existing apps and workflows that make the device popular.
I wonder how true this really is. Removing the need for files or composability was a "feature" of the iPad from the beginning - to simplify things. For example, add in more native support for a real file system from the beginning, and many of the apps that were designed without a file system/composability in mind would now be designed completely differently, and might no longer work as well or as simply as current users experience them today.
I don’t pretend to have a crystal ball and know how things “might’ve been”, but I think saying that by supporting a native file system/composability, things would be “the same but better” may be a bit naive.
You're saying that the iPad could do these additional Mac-like things with a few tweaks or additions. I'm saying the sweet spot is in the other direction. You want it to do more (essentially 'real work'). The device that's doing better is the one that's just like it, but is smaller and does less.
That's why "real work" is a bad criticism. I put it in quotes because, yes, I read what you wrote and your frustration. It sucks, but I don't think the iPad is ever going to be that. It would be better for you, but it won't make the iPad a more popular product.
I really like my iPad but I find it sometimes maddening that the closest I can get to using it for software development is with a character terminal, yet it's actually faster than my MacBook.
I have found it really great for one type of serious work: writing. (With an external keyboard of course, which I think is fair.)
And I think there are some places where it really ought to be the better tool for professionals:
* Editing photos (pretty good already but not pro-level)
* Audio workflows (can't judge but seems good?)
* Video workflows (maddeningly weak at this point)
* Storyboarding (again can't judge)
I get that Apple doesn't consider this a tool for making software, and I sort of wish I could get by with just writing and editing the occasional photo, and I realize that I could of course write whatever I want in Xcode if I had time.
But as a UNIX guy, having returned to the Apple ecosystem after it became a UNIX ecosystem, the inaccessibility of the UNIX sitting under iPadOS still rubs me the wrong way.
> Why do we need to criticize one tool because it is not equal or better to another?
Probably because we want it to become better :-)
> But calling the iPad a failure because it doesn't fit to your work
I'm sorry that's the impression you took away from the article, because that wasn't what I was trying to communicate. Perhaps I shouldn't have said "failure", but I did try to explain what I meant by the word, by going back to what the iPad was originally meant to be: the car to the PC's truck, a better device than a PC for the vast majority of users.
I think it's unfortunate that it failed to achieve this.
It's great that there are authors who still use typewriters, but that's exactly my point: that's what the iPad is. It's a niche tool for people who have very specific needs. That's fine, but I don't think it's the iPad's full potential.
>> Computer power users like composability, interoperability, "unix philosophy" / "do one thing well" -- whatever you want to call it -- because it's a powerful tool.
Which equates to a fraction of computer use globally. I'd be willing to bet large amounts of money that most people who use a computer couldn't even tell you what composability is. They just don't care.
You're frustrated because the iPad is not a replacement for a laptop.
Er, yeah? So what?
That's not what it's designed for. You're complaining that a Tesla Model 3 is crap off road.
FTR, I have an iPad and I wouldn't even consider doing my day job on it. Could it be redesigned to fit my workflow? Of course, but then it would be worse for the things I actually use it for.
I agree; even among programmers, not everybody is a power user. Some will want a prebuilt IDE with a streamlined single workflow. Others will assemble git / procfs / binutils / sqlite / hexdump / you-name-it to hack through a task in wild ways. But I very rarely see this around.
I've read so many of these articles lately. People seem to approach the iPad expecting it to work like a laptop, and when it doesn't, they're off to compose a thinkpiece.
Many of these articles also state as a fact things that haven't been true of iPads for a while. If you want a file-centric experience you can have it--at least if you select apps that have been updated in the past several years. All of my files are stored in iCloud Drive, which is synced to my Macs just like Dropbox, and which works flawlessly with iPhone and iPad apps that properly implement the Files app APIs. I don't have any data silos. Yet I view the option of using an iPad purely as an app jukebox with data silos as a major win for many, many users.
I stopped using laptops years ago. No, I can't do everything on an iPad Pro. But I find iPadOS to be a superior operating system for a small, portable screen. (I'm defining small as below 20"). I use a 27" iMac for more advanced tasks (as well as a media server, etc).
Many people approach the iPad in the same way people approached the cloud a few years ago. They try to treat it like the older technology it’s meant to replace, and when it doesn’t work the same they will proclaim “it’s not the cloud it’s just someone else’s data center!”
It took a cloud native approach to really get the real benefit from the cloud. It takes an iPad native approach to benefit from the iPad.
Since 2015 or thereabouts, my opinion has been that tablets plateaued and have faltered because of what I call "Yet Another Application Platform Dilemma." [1] Many people who already have a PC and phone don't really want, let alone need, another in-between device that they have to babysit and manage.
That said, I agree with the OP's argument, which is effectively that if you have to go out of your way to say, "No really, you can do real work on an iPad," you have missed the point. Of course you can do real work on a lot of devices. Is the device particularly good at it, though? Maybe, in some cases. But for a large number of us—and apparently that number is large enough to slow tablet sales—doing work on a tablet is just not good enough. We're either in "Work mode" where a laptop is better or in "Consumption mode" where the laptop or phone is just fine.
It's funny that in this thread people are saying the same thing that the OP pointed out. In effect: "You can do real work on an iPad." Great. But the topic at hand, whether you care about it or not, is conjecture about why the devices are not selling better.
For about 20 years, I have been observing the paper-book-to-tablet ratio when I'm on the bus. Of course, back when I started it was because I was wondering if tablets would do away with paper books. Now it's just out of habit, because in fact the smartphone turned out to be an unintentional ally of paper books. I see people with both paper books and smartphones, but people don't normally want to carry two screen devices with them, and they sure aren't giving up their phones. So the tablet/e-reader/whatever-you-call-it peaked 5-10 years ago.
Of course, this discussion is about work, not how one reads books, but I think something similar is happening. Tablets are too close to other devices (laptops, smartphones) which are not going away, and therefore they appear to have hit a hard upper limit.
My 14-year old daughter used to use tablets; now she uses her chromebook, her smartphone, and her desktop, but never an iPad. Most impressively, she even uses a Wacom tablet, which she plugs into her desktop, because she liked it better than the iPad for drawing. I think she is not alone, and people her age will go into the workforce more accustomed to using phones and laptops to get things done.
I don't think it's true that users generally want things to be file-centric instead of app-centric. Sometimes yes, sometimes no. The underlying lack of data integration between apps - even simple copy and paste can't be relied on - is a bigger stumbling block. It's also odd that the screen-layout and I/O limitations don't feature more prominently. The author's larger point - that you don't have to spend so much effort telling people something is useful if it really is - rings true, but I'm not sure they make a very good case for it.
A lot of people in here arguing that the iPad isn't for lesser work than a desktop or laptop, just different work. I think this is largely true, however the frustration (at least for me) comes from the fact that this shouldn't need to be the case.
With a few tweaks to the OS, and a keyboard cover the iPad could be used for all the things that iPads are currently used for and all the things that people still need desktops/laptops for. I think the frustration is coming from people who see that they're being unnecessarily constrained.
Even being able to dual boot iOS or macOS would solve the problem.
Stephen Sinofsky, who led the Windows team at the time the iPhone and iPad were introduced, had a very different take on the iPad that he recently shared.
>In first year 2010–2011 Apple sold 20 million iPads. That same year would turn out to be an historical high water mark for PCs (365M, ~180M laptops). Analysts had forecasted more than 500M PCs were now rapidly increasing tablet forecasts to 100’s of million and dropping PC.
The iPad and iPhone were soundly existential threats to Microsoft’s core platform business.
While I agree with the author's sentiment that it hasn't really panned out for professional work, the title is not right. When the iPad came out, I dumped my Apple stock before the quarter where they would report on sales. The stock at the time was heavily forward-valued on lots of iPad sales happening. Apple had failed to gain traction with all its previous ultra-mobile computing products, all of them marketed for productivity. Why would this one be different? Tablets were available in the stores, but they weren't selling. I figured it would be another case of the early adopters getting overly jazzed about a product nobody wanted.
I was wrong. The iPad saw adoption by the everyday consumer. Give it to the kids so they could play games, watch videos and stop bothering mom. Mom liked how simple it was to check her email. Etc. They created an entire new market. THAT, is not a failure.
I use my iPad Pro 12.9" for real-as-in-paying work. Good luck trying to take my iPads away. I've got two of them. There are a few more around that aren't mine.
Checking email, general websites, Discord, videoconferencing, etc. - all on the iPad. Drawing, sketching, music generation. Excellent.
What is not so good:
No support for doing tough work in background threads. Don't suggest Spotify as a counter-example; audio streaming is a non-challenge for modern CPUs.
No support for on-device app development. Both of my iPads leave my Raspberry Pi computers in the dust in terms of RAM, processing power, and graphics. Yet each Pi is more functional as a general machine, because I can easily remote into it. I can easily build code on a Pi and run it. On the iPad I need something like Pythonista to get anywhere near the same. Workflow is useful.
I don't think my iPad Pro deserves the "pro" moniker, however. It is moving in that direction, but it is nowhere near ready.
Don't believe me? Look at the apps for email as just one example. Try setting up filters/labels in an iOS app like Gmail. For that matter, show me all the headers for a particular email. Can't. These are basic things I'd like to do, especially for spam handling. A pro device doesn't have stripped-down interfaces; it can have hidden functionality revealed through a setting. Even a few 70-year-olds have complained to me about limitations with the various email and calendar apps. They use desktops for "complex" tasks.
Why can't other browsers use a non-Safari engine? Why no extension support for browsers?
It's the dumb-it-down-for-iPad mindset that is holding the iPad back from being a true pro device.
SSH on the iPad is what keeps my iPad useful for development. That's real enough for me. Why can't I run a compiler natively on the iPad? It's got the processing power.
Apple absolutely nails the generalist and prosumer space but leaves me baffled by their posturing towards industry adoption.
As a JAMF swilling AASP, I've lost my patience at the slaverers (it's always the webdevs) and share my opinion freely that the "Pro" in MacWhatever Pro means "an accessory for someone who considers themself a professional", not "professional-grade equipment".
I consider it part of a healthy aspirational realignment. Some move on, others try to prove me wrong. Both end up closer to their enlightenment.
Although, all it would take to flip me on iPads is something like Chromebook's Crouton, and I'd lifecycle every MacBook Air with an iPad Pro 2nd Gen and offer them in place of MacBook Pros. They're almost amazing, and it just hurts that there's either no support or no feature parity with the major creative and engineering software we license.
iPads are top notch for media and productivity, but the platform's niche utility is way outclassed by its own hardware and other products more common in industry.
"Why do people keep posting articles about people doing real work on an iPad?"
Because people like the author of this blog post keep saying that people can't use the iPad to get real work done.
At this point, it's become maddening. Do I still carry a Mac laptop around with me? Yes. Are there some things I can do on a Mac I can't do on an iPad? Yes.
But it goes both ways. Are there some things I can do on an iPad I can't do on a Mac? Unequivocally yes. Does that mean, in those contexts, I should throw shade at the Mac and say people can't get "real work" done on the Mac? No, that's absurd.
A smartphone != tablet != laptop != desktop != watch != smart speaker, etc., etc. Different devices for different purposes, and the fact they all overlap in many ways is a nice bonus, not a problem to bemoan.
The laptop/desktop form factor is a physical barrier to the kinds of things that tablets are good for (portability, drawing, reading documents while working, etc.). But there's no reason, other than software, why you shouldn't be able to use the iPad for the stuff you currently need a laptop/desktop running macOS for. I should be able to chuck a keyboard cover on an iPad and use it for all the same stuff I use a desktop PC for (albeit with a smaller screen), but I can't - because Apple intentionally puts a less feature-rich operating system on the device (and has a restricted app store that explicitly disallows certain use cases).
But many can, and barring certain industry specific specialty tools, I’m unfamiliar with everyday laptop software for everyday people that doesn’t have a plausible counterpart.
I wish we lived in a world where we had hardware as nice as the iPad that anyone could write software for--at any level--with no control or veto power by any single big company (or country).
I've tried out a fair number of the Android tablets and IMHO they're all a long way off from being as nice as the iPad.
The only tablet hardware that really comes close to the iPad for me is the Surface.
I really want Android (or ChromeOS) tablets to be as good, because I'd much rather use Android apps than Windows or iPad apps, but the hardware just sucks.
I personally find the iPad (or any other tablet, for that matter) a big hit for small kids, as well as for use as an interactive screen for customers in retail shops.
But yeah, other than that, it would be hard to be productive with just a tablet I reckon.
>as an interactive screen for customers in retail shops.
This is precisely what we are using iPads for right now. Our clients and end customers are very comfortable using Apple hardware. The screen size on these devices is perfect for most interactions. If you have the device in landscape, it's about as wide as an 8.5x11" document at 1:1 scale.
Also, it is intuitive for most of our users to capture photos of documents or read barcodes using iPads. If I had to build this using USB webcams or even laptops with embedded webcams it would be far trickier from a UX perspective. Also, some of our clients have iPads with LTE connectivity so they can seamlessly go into the field and work with end customers who don't have time to come to them.
That said, the app development ecosystem around Apple is a fucking nightmare. Signing your apps, dealing with regressions after iOS major version updates, trying to do anything at all inside Xcode, attempting to automate your builds, etc. This is why we are looking at supporting a wider range of devices heading into 2021 - Surface/Win10, Android, and PWA. IMO the holy grail is PWA; Apple just needs to give it better support (I am quite aware of why they do not want to). If we could somehow get everything working via PWA, we would immediately dumpster the iOS codebase and do the whole thing exclusively using Blazor server-side.
I use an iPad Pro as my primary personal device. I have used it to write a book, and to write code for GitHub.
I would use it in my current job, however we have very strict rules about what we are and are not allowed to do with personal devices regardless of their make and model.
It's quite amazing how capable it is once you rustle up a git client, an editor, and a wireless keyboard.
Yes, but the biggest success of the iPad lies mostly in the fact that there are no alternatives left. Almost everyone else has left the market. At this point, if you're buying a tablet that's not an iPad, you really have to go out of your way to make that justification.
I feel that a lot of the shortcomings of the iPad as a platform are really a consequence of a chicken-and-egg problem. For the platform to be viable, you need a lot of good apps, and to make good apps a possibility, the platform needs to be monetizable. Currently, the iPad lacks on both fronts.
Apple isn't leading on the software front either. All the stock apps are just resized versions of their iOS counterparts with nothing extraordinary about them. Most happy users are just using a bigger web browser, or a bigger drawing app (with the same functionality as on smaller screens), or custom apps that are uploaded to an educational/enterprise portal and are a job requirement. A lot of families use it as an entertainment/learning device for their children, where its processing power is underused.
There is almost no money to be made as a developer if you spend all that effort supporting multitasking or adding nice iPadOS-only features. Customers won't pay and Apple only makes your life harder with every OS release.
How so? Android has a file system that's accessible to the user out of the box, and there are numerous file manager apps on the Play Store in addition to whatever the OEM packaged in the Android OS image. While the Play Store may be a walled garden, Android doesn't stop you from installing APKs from third parties.
In addition to that, Android has had mouse support since at least 2012. It took Apple what, 8 generations to add it into iPad OS?
However, your overall point is correct: the iPad is more successful as a "productivity" tool than Android among regular people, despite the suite of Microsoft Office apps being fairly similar in quality across both. I'm assuming that Office-style apps are what's mostly being used, and not specialist apps like Procreate or Affinity, which are iOS-only.
My guess is that people are more willing to buy Apple for tablet productivity because its support is more reliably available and it's easy to get third-party accessories. On the Android side there is no such guarantee. Samsung is the only OEM even attempting to compete with the iPad Pro, but its support doesn't have the same reputation and accessories aren't always available in all markets.
This is why I wonder about the reasoning in the article. There are platforms that are NOT walled gardens, where you CAN write your own great file manager and which have same-ballpark hardware, yet nobody rushed to write the groundbreaking file manager that would trigger platform success remotely comparable to iPadOS. Maybe this is not about walled gardens and file managers? Maybe there are best tools for certain tasks, and for the author's job a tablet is just not the right one. Since I got my first iPad 10 years ago, I almost exclusively read things on it instead of my notebook. And I read a lot, and it is part of my work. But I never expected it to replace my notebook or phone.
Android has a file system, but almost every app goes out of its way to hide that fact from you. I have almost no idea where any particular app is going to save files to, and so it's playing needle in a haystack every time I want to open a file created by one app in a different one.
This is another case where Apple didn't invent something, but made the best version of it.
For a device that didn't have a public file system until recently, the iPad's file system both works and makes sense. It's the iPod, the mobile phone, and Bluetooth headphones all over again, at the feature level.
> Of these the iPad is far and away the best option currently.
Except when it isn't. When you actually need to transfer a file to or from an iPad and another device, and you cannot access the file system and don't have any other Apple device to do so, this is the hardest device to work with. I remember being honestly puzzled as to how to do it, and looking online for numerous "solutions", all of which involved installing more applications to hopefully do the trick. Pure horror. (As far as I know, it was impossible to transfer files even through a direct USB connection without a specific application - meaning USB was completely useless by default.)
The iPad, by virtue of its form factor, sucks for typing anything longer than a short message. You can't hold it comfortably like a smartphone, or balance it on an unsteady surface like a laptop. Everything about the typing experience on an iPad feels distinctly second class.
This is a problem for a computing device because writing is a pillar of human culture and economy. A device that places writing second is a device that has excluded itself from vast swathes of what people use computers for.
iPads are good at many things that involve passive consumption, or drawing with the Pencil, or pressing buttons, etc.
I have an iPad and it complements my laptop well - I use it for reading music. The iPad is not a failure. But it's also not replacing my laptop. It is an accessory to it. I'm sure that Apple, a company that sold me both devices, is ecstatic about that.
On the contrary, the iPad Pro with the Apple Smart Keyboard is spectacular for typing all the things, anywhere you happen to be. It takes up half the depth of a laptop, or braces firmly on your lap. In fact, given the way it's balanced, it only needs about 2" of table ledge; the keyboard can hang in the air and still be firm enough to type on.
It places writing first, whether with pencil or keyboard; no laptop does both as well.
I love the form factor of iPad, and with a physical keyboard it's starting to be quite good.
But...
I can't design UIs on it (no Figma or Sketch - yes there are some vector drawing apps but they don't compare)
I can't edit high quality video on it (locked down codecs - only h264/h265, no prores, no raw codecs, no ability to install codecs beyond what the OS provides)
I can't use an iPad to write software for iPad (no xcode)
File management with iOS 13 is better, but still clunky compared to desktop OSes.
Yes, there are many things you can do with it that qualify as work for many people, but compared to a desktop it's still very much lacking in many areas, and not really progressing significantly either. I wish iOS were more extensible by users; then it could have a better chance.
I just want a MacBook where I can fold over the screen and it becomes an iPad. Then I want to be able to use iPad applications in macOS. Is that too much to ask for?
> The thing that truly hurts the iPad is the App Store.
I'm skeptical if the situation would be better without an app store. The app store provides monetization and that leads to higher quality apps. I think what we'd really want is for iPadOS to open up to more integrations, like an integration app that offers a different home screen / app switcher / task manager experience.
The comments on this thread are immensely frustrating. Just because all the work you need to do can be done on an iPad doesn't mean that all the work everyone needs to do can be done on an iPad, or that it wouldn't be easier on a desktop/laptop OS.
And yes I know there's some (creative/primary - not augmenting something else you're doing) work that does work better on iOS^. But that's an exception, not the rule.
^I mention the OS and not the tablet form factor because (as the article mentions) that is the main problem here.
And none of this really matters, it's fine to have a tablet with an OS that's tailored for certain use cases. But people keep trying to claim that an iPad can replace a regular PC, and that's just not true for a lot of people.
I admit I'm in the minority, but for my purposes my iPad fits every need that I have for a computer, which is that I can do all of my work on it when I'm away from my desk.
I do a lot of work in Salesforce, I write a lot of sql, I write a lot of JavaScript, I write a lot of Python, and I write a fair amount of Go.
I write and respond to a lot of emails.
I talk to people on Teams.
I do a lot of work in Adobe Analytics, Hue, Jira, SFMC.
The main thing that I’ve had a tough time replacing (and I’d be so thrilled if someone here has a good suggestion) is Postman. Because I write tons of little example payloads for APIs to send off to developers to get them started on projects or to test internal stuff. I’d love a native solution for that.
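Lacking a native Postman, one stopgap is to keep those little example payloads as scripts in a Python app like the Pythonista mentioned elsewhere in this thread. A minimal sketch using only the standard library (the URL and payload here are made-up placeholders, not anything from the original comment):

```python
import json
import urllib.request

def build_json_request(url, payload, method="POST"):
    """Build a JSON request, like a saved Postman example."""
    body = json.dumps(payload, indent=2).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method=method,
        headers={"Content-Type": "application/json"},
    )

# A saved example payload you might otherwise keep in Postman.
example = {"customer_id": 42, "items": [{"sku": "A-100", "qty": 2}]}
req = build_json_request("https://api.example.com/orders", example)

# Uncomment to actually send it:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read().decode())
```

It's not a full replacement for Postman's collections and environments, but a folder of scripts like this is versionable and shareable with the developers who receive the payloads.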
This article is oddly similar to the Daring Fireball article that just came out and was just posted on HN. Both talk about how, 10 years in, the iPad has failed to revolutionize computing and both complain about awkward UX.
Someone replied on that thread that the iPad doesn’t quite have the same upgrade treadmill as other devices. I made do with my iPad mini 2 when it started having a video glitch during the long gap between the 4 and the 5. Then I finally bought the new one. That was almost five years for me.
Another one said they were common with children, which is true. But if they grow out of them, then they also might not be replaced that often. Birth rate in the first world is steady or falling. High end device sales for children should tend to follow.
The original MacOS only supported one application at a time. It also supported little apps called "Desk Accessories", which could float as a separate window within the application's space, but they required a small amount of support from the application itself, which could choose not to provide it! Desk Accessories were internally coded as drivers, just like any other system-level driver.
It wasn't until Andy wrote "Switcher" that the Mac supported more than one app, and the first time it was demoed, my jaw dropped! Andy is a really smart guy. To be fair, Switcher had limitations due to memory and the lack of hardware multitasking support in the original 68000. It was more like cooperative coexistence than true multitasking.
Mind you, the original Mac came out in 1984; the standard PC operating system was MS-DOS (3.0 and 3.1 came out that year), IBM introduced the PC-AT that fall, and Windows ... Windows 1.0 was still in development hell (it saw public release in November 1985). At least Digital Research shipped the GEM window manager for DOS in early '85 ...
Multi-tasking really wasn't a thing on PCs much before 1988; I mean, you could do it, if you had a bottomless pit of money with which to buy RAM by the half-megabyte and a 286-class processor and a memory manager like DesqView or a penchant for paying DR big ticket license fees for Concurrent CP/M-86, but otherwise ...?
> To be fair, Switcher had limitations due to memory and no hardware multitasking support in the original 68000.
Ahem, the Amiga would like a word...
The 68000 hardware could support preemptive multitasking just fine; what it lacked was a hardware MMU. You couldn't run an OS with protected virtual memory on a 68000 without external hardware support (sometimes another 68000 was used for this!). Accordingly, all tasks on the Amiga had access to each other's memory space and even the kernel's. But the fact that Switcher, MultiFinder, and later versions of Mac OS could only support cooperative multitasking was more a Mac OS limitation than a hardware one.
>>The 68000 hardware could support preemptive multitasking just fine; what it lacked was a hardware MMU.
Yes, exactly. Thanks for clarifying. As you say, the lack of preemptive multitasking was mainly an OS limitation, although a hardware MMU would have really helped. I remember at the time reading about the hoops that Andy had to go through to get Switcher (which, as mentioned in other replies, became MultiFinder) to work. One of the issues was that the single app OS model used a bunch of low memory locations as system globals...on every context switch these had to be saved and restored because each running application could change them. (ouch!)
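The low-memory-globals problem described above can be illustrated with a toy model (this is illustrative Python, not actual Switcher or Mac OS code, and the global names are just examples in the spirit of the real ones): the single-app OS kept live system state at fixed locations, so the switcher had to snapshot those locations for the outgoing app and restore the incoming app's snapshot on every context switch.

```python
# Toy model of Switcher's context-switch hack (illustrative only).
# The single-app OS kept system state in fixed low-memory globals;
# a switcher must save and restore them on every app switch.

LOW_MEMORY_GLOBALS = {"CurApName": "", "ApplZone": 0x0000}

class AppContext:
    def __init__(self, name, heap_base):
        # Each app's private snapshot of the shared globals.
        self.saved_globals = {"CurApName": name, "ApplZone": heap_base}

def switch_to(current, nxt):
    """Save the live globals into `current`, restore `nxt`'s snapshot."""
    if current is not None:
        current.saved_globals = dict(LOW_MEMORY_GLOBALS)
    LOW_MEMORY_GLOBALS.update(nxt.saved_globals)

paint = AppContext("MacPaint", 0x1000)
draw = AppContext("MacDraw", 0x2000)

switch_to(None, paint)   # globals now describe MacPaint
switch_to(paint, draw)   # MacPaint's state saved, MacDraw's restored
```

The pain point the comment describes falls out of the model: every global an application might touch has to be known to the switcher, or state silently leaks between apps.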
Also, virtualization support came in the 68010 follow-on, which would have been available a couple of years before the Macintosh launch. Of course, for actual VM this would still have needed an MMU.
One of the first architectural decisions that Bud and I made for the Macintosh system software in the spring of 1981 was that we were only going to try to run one application at a time. We barely had enough RAM or screen space to do even that, and we thought that we'd benefit from the resultant simplifications. Besides, multi-tasking was supposed to be Lisa's forte, and we didn't want to usurp all of the reasons for buying a Lisa.
--Andy Hertzfeld
In the original MacOS, only one program* ran at a time. You could have multiple MacPaint windows, for example, but if you wanted to use MacDraw, you'd have to quit MacPaint.
From a process point of view, this was single-tasking.
Andy Hertzfeld wrote the first multi-process environment for Macintosh, "Switcher." With Switcher, you could have MacPaint and MacDraw (or any combination of as many as four apps) open simultaneously.
Only one would control the entire screen, so if you were using MacPaint, the MacDraw (for example) windows were hidden. When you "switched" to another app, it was like side-scrolling the entire screen, the entire desktop scrolled horizontally to the left or the right and another scrolled into view.
Later on, the Macintosh OS (pre-OS X) acquired its own cooperative multi-tasking capabilities, and when that happened, you could have multiple windows from multiple apps visible on the screen at the same time and switch by clicking on another window.
Switcher was quite the hack for its day, and had a cool UX; it was like having four Macintoshes arranged in a carousel that you could "switch" between.
---
As another comment points out, there was another thing called a "Desk Accessory" that could co-exist on the screen with a full application. But these were not full applications and had very limited capabilities.
The original Mac OS supported a single app plus "desk accessories" which relied entirely on every application calling them at exactly the right points in their own main event loop. It was a very limited form of "cooperative multi-tasking" which isn't really multi-tasking at all IMO. If any app or desk accessory got even the slightest part of the scheme wrong, which was easy to do because the "rules" were complex and unclear, it would take down the whole lot. No background threads, or really threads at all. Even contemporary DOS TSRs did better. Switcher came fairly shortly after, but (as the linked article makes clear) it was definitely a hack. Real multi-tasking didn't come to the Mac for a long time.
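The fragility described above is inherent to cooperative scheduling: tasks run only when they voluntarily hand control back, so one misbehaving task starves everything else. A toy scheduler (a sketch in Python using generators, not anything resembling the actual Mac OS event loop) makes the mechanism concrete:

```python
# Toy cooperative scheduler (illustrative). Tasks run only when they
# voluntarily yield, just as Mac applications had to call back into the
# system from their event loops. A task that never yields starves the rest.

def well_behaved(name, steps):
    for i in range(steps):
        # Do one slice of work, then yield control back to the scheduler.
        yield f"{name} step {i}"

def run(tasks, max_slices=100):
    log = []
    queue = list(tasks)
    slices = 0
    while queue and slices < max_slices:
        task = queue.pop(0)
        try:
            log.append(next(task))   # run until the task yields
            queue.append(task)       # re-queue it for another slice
        except StopIteration:
            pass                     # task finished; drop it
        slices += 1
    return log

log = run([well_behaved("MacPaint", 2), well_behaved("DeskAcc", 2)])
# The tasks interleave only because each one yields cooperatively;
# replace either generator with an infinite loop that never yields,
# and the other task never runs again.
```

Preemptive multitasking removes exactly this failure mode: the scheduler interrupts tasks on a timer instead of trusting them to yield, which is what the Mac didn't get until much later.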
Honestly, Apple is asking for this kind of scrutiny when they apply the 'pro' moniker to any and every product. My iPad Pro is incredible and I love using it, but there is still so much friction involved in doing certain things that are way simpler on my desktop machines.
I tend to agree. I've got an iPad Pro, but my MacBook Pro is still my default for activities which are often possible, but awkward on the iPad.
What Apple seems to have done though is avoid an uncontained failure. I could have seen other tech leadership teams going all in on the iPad Pro, and removing/deprecating other lines completely at this point.
They've started incremental investment in the iPad after years of low/no investment. This seems like a better strategy to realise the full capabilities of the product while retaining what works. This is too slow for some people, but I don't know of another option, especially when an ARM transition on laptops is a consideration.
I get the "real work" argument, but I think real work from real professionals is slowly permeating to the iPad. For example, Madlib (the prolific hip-hop producer) made his entire last album on an iPad, and it was seen as a pretty big deal in the professional music world. (I think.)
Choosing a tablet over a laptop or any other kind of computer is all about the tradeoffs you are willing to make.
There are some circumstances where the combination of portability and screen real estate makes it a better choice than a laptop or a phone. There are some applications that are more well suited to tablets, and some that are more well suited to a laptop, desktop, server or phone instead.
They’re all computers, there’s not many reasons you couldn’t run a web server off a tablet or phone, it’s more a case of choosing the right tool for the right job at the right time. From that perspective, iPads and tablets in general are just a different tool, you can beat a nail in with an iPad in a pinch, maybe, but why not just grab an actual hammer?
But that’s boring. It doesn’t play into the narrative that tablets, with iPads at the vanguard, were or are predestined to replace laptops entirely in exactly the same way that GUIs replaced what we used to call command-line interfaces and surely not a single soul on Earth uses those anymore.
Obviously the only thing holding this back is Apple’s own policies, and if they would just let the next Adobe rise up and take their rightful place as a software giant making tools for iPads, surely its destiny as a laptop-murderer would be fulfilled. That would be so much better than merely being absolutely top tier for digital painting, digital clipboards, digital books, bedtime movies, and at least being serviceable as a third-rate cash register.
tl;dr: laptop-murder is more exciting than another tool to choose to have in your tool belt or not.
What frustrates me about the i<thing> and smart-phone ecosystem in general is the abject lack of integration with computers and, in the case of i<things>, the lack of sensible file system access, memory card slot, f-ing cursor control, and more.
Simple example: Why can't I pop-up my iPhone X screen as a window on one of my three workstation monitors (Windows 10) and run it from my computer?
Why can't I make calls with it from my computer?
Why can't I click on a phone number on a web page in Chrome and dial it with my phone?
Why can't I type a text message from my PC keyboard and send it with the phone?
Why can't I copy and paste onto the phone?
Why can't I copy and paste an image and send it with the phone?
Why can't I do the reverse and grab things from the phone from my PC?
Why can't I access a file system transparently (other than images)?
For that matter, why can't I transfer large videos?
Why can't I use my PC headset through the PC to communicate using the phone?
I mean, the thing sits there ALL DAY, 10 to 12 hours a day, right next to my workstation. If I want to do anything with it I have to go from a full size keyboard with arrow keys and all kinds of nice things as well as a beautifully efficient trackball to typing with one finger on a small screen and using the f-ing ridiculous select-copy-paste mechanism, etc.
If I want to send someone a link from my PC I have to email it or SMS it to myself, grab the phone, disconnect it from the charge cord (because it is inconvenient to do anything with it plugged in), go to SMS, email or FB messenger, copy the link, go back to wherever I need to go to send it, paste it and click send.
It's 2020. The phone should meld into the computer with full transparent operability. Same for tablets.
BTW, this is what I was hoping Microsoft would finally bring to life with Windows Phone. I was eagerly awaiting this one killer integration. I firmly believe that would have inspired, I don't know, a billion people world-wide to adopt Windows Phone. Instead they wasted time trying to make it an iPhone killer. It didn't need to be that. All it had to do is become symbiotic with a PC in all possible ways and it would have been an absolute hit.
Here's hoping they try again. If anyone from MS is watching, call me, I know exactly what you have to build (and you guys have no clue).
An awful lot of this integration exists, between iOS devices and Macs. Windows 10 seems to be getting some better integration with Android in recent beta releases, take a look at the release notes of a recent 'Fast Ring' update.
The most obvious and best use case has been the proliferation of point-of-sale systems, especially for small businesses which couldn't make a large investment in that area before.
iPad would succeed a lot more if there was no iPhone to compete with it. As things are right now, iPad is an afterthought for iOS developers mostly, and an expensive afterthought at that, because larger screen requires different UI. This creates a vicious cycle because there isn't really that much you can do with it, so there's no good reason to buy it unless a wad of cash is burning a hole in your pocket (which is common for Apple clientele).
My son has an Xbox One, a desktop with a decent CPU/GPU, and an iPhone 7. Still, he recently bought an iPad with his own savings because he likes it best for gaming...
You'd have to say the same about Android tablets and Windows tablets too. These tick more of the boxes but they have not really connected w/ the market.
Yeah Google gave it some lip service for a little bit and then just walked away from any serious help for Android tablets.
Granted ChromeOS and Chromebooks have done a good job of covering some of the tablet use case space, obviously not entirely, but it seems like a better place to focus.
> Nobody wrote any articles about how, actually, real work on a Mac is possible.
Yes they did. Back then Apple also did something called "advertising" to try and convince enterprise users that Macs could be used in a business environment.
In more recent times they resorted to a very petty form of advertisement: by way of direct, cherry-picked comparison points, they tried to convince everybody that Macs were just as good as Windows PCs. At that time there were tons of articles commissioned to push the same point, and even whole websites.
It was never really true though and it still isn't. Macs were only any good for certain workloads, just like the iPad, and only if you could look past their wonky UI and hardware design choices.
This is what I've found to be the most true. For years people talked about Mac devices being so much better for development and creativity, but after dealing with one for a year, all I can say is that it's got a Unix terminal, which is better than Windows but falls short compared to a Linux system. Android is the same story. I tried the iPad and the iPhone because they were faster and had a better interface, but I soon realized it was just a toy painted as a device for adults, and no real speed improvement was noticeable for anything but mobile games, which I don't play anyway. I switched back to a Note and threw Manjaro on a laptop, and I've been more productive with that combo than any I've tried.
I can't even imagine what you would or could use an iPad for that a thin laptop like the XPS wouldn't run circles around it at, even in terms of portability. It just seems to me like these people bought a very expensive two-year-old computer and need to justify it to themselves and others.
What is this silly ‘real work’ gate-keeping? I guess the author means their work.
My fitness instructor uses an iPad to refer to fitness plans. He’s doing real work. He uses files to do it as well! Opening an Excel spreadsheet.
My friend is an outdoor teacher and uses an iPad as a map. He’s doing real work. Maps are files.
Another friend works as a train engineer and uses an iPad to refer to technical manuals. He’s doing real work. The manuals are PDF files.
My colleagues who do recruitment at conferences take people’s details on iPads. They’re doing real work too. Each form filled in is a file.