The “almost” device

By Niko Kitsakis, May 2019

There are quite a few tech journalists and bloggers who can’t stop arguing in favour of the iPad¹. I find that strange, and it has been bugging me for some time now. The zeal with which these people write glowing reviews of the device itself, or pieces about how to best use it “in the real world”, never sat right with me. It always seemed artificial, and it continues to do so.

I wonder what the possible motivation for writing a story like that could be. Ask yourself: if the iPad really was that good, wouldn’t that be a rather self-evident fact? Would you need an article (or several dozen actually, published over the course of nine years) that tried to convince you of the merits of such a device? Or change perspective and ask yourself another question: did you ever need convincing about how the iPhone (or any other post-iPhone smartphone) has advantages “in the real world”? Of course not. And yet, many of the people who write about the iPad can’t seem to stop creating pieces with titles like “the best uses for the iPad”, “six reasons to buy an iPad” or “why an iPad is worth it” – all while rummaging in their box of adjectives for expressions like “great”, “life-changing” and “fantastic” to describe it².

Fantastic

Something that was really fantastic – insanely great even – was the original Macintosh, introduced in 1984. Thanks to its easy-to-learn user interface, it allowed you to do things that were either not possible before at all or that you would have needed a degree in computer science for. Not to mention the fact that it was relatively affordable. And while the Mac of course evolved and became (mostly) better in the following years, it was true from day one that you had a powerful machine with a lot of potential in front of you. It was a bit like the iPhone later in that when you saw one, you knew you needed one. But just a bit: the Macintosh had to do a lot of ice-breaking of the kind that was unnecessary for the iPhone later on. What I mean is that in 1984, it wasn’t obvious that you needed a computer at all, let alone one that seemed so exotically different from what you were used to seeing in magazines or in movies like “WarGames”.

How revolutionary the Macintosh really was becomes clearer if you look at what happened in the years after it came out:

Screenshot of MacPaint

The Mac adds at least ten pounds: MacPaint from 1984 and its inspiration from 1920, “Woman combing her hair” by Goyo Hashiguchi. Look at the MacPaint toolbar and compare it to the one in Photoshop, Pixelmator or any other similar application of today. Take note also of the Memphis Milano–inspired fill patterns at the bottom; it was the 80s after all… (Funnily enough, this is the second time in a row that I post a screenshot of MacPaint.)

There are a couple more things that I could have mentioned, but I assume that this will do for the moment. Now, compare that timeline of roughly seven years with what has happened since the introduction of the original iPad in 2010 – nine years ago. Name me a single product, a single piece of software or hardware for the iPad, that had even a fraction of the impact of one of the products I mentioned for the Mac above. For the duration of those nine years, however, the aforementioned bloggers and journalists have constantly stated that the iPad is a fantastic device because you can almost do with it some of the things you could already do with your conventional computer in the early 90s.

Which brings me back to asking “why?”. Why must this notion of the iPad being fantastic be stressed so much when it should be obvious? And how come I never see the thing being used for the kind of work that the bloggers advertise it for? The only people that I ever see using iPads for their work are service technicians, express couriers, receptionists and the like. People, in other words, who have predefined forms to fill out on their little touchscreens. Forms, mind you, that have been created on real computers. The highlight of using the iPad like that is the moment when these people hand the device to their customers for them to sign. Something that is entertainingly awkward in itself – or do you usually finger-paint your signature?

The other users of iPads I see are the occasional old man on the train and, of course, toddlers. The former thinks he’s being young and hip and the latter can’t hold a real game controller in his hands yet. I have never ever seen anybody use an iPad – or any other tablet – to do what I consider real work: writing, design, calculation, science, engineering… you name it. And no, people on stage at tech conferences with proofs of concept don’t count. Nor does using it for a couple of hours a month.

Now, I have no doubt that some of my readers will approach me with their pet examples to prove me wrong. “Look here,” they will say, “here’s X who does Y on his iPad!” But that’s akin to saying that E. L. James is a brilliant author because there’s praise by some random person on the back cover of “Fifty Shades of Grey”. If you look hard enough, you will find people who will praise anything⁴. And if not, you can always pay them to do so. Richard Dawkins once said something along the lines of “anecdotes don’t impress scientists” – it’s the same principle for me.

Be aware that I’m not saying anything about the potential of the iPad as a tool for certain tasks, or about what the future may hold for it. If the situation somehow becomes much better, I’ll gladly embrace the device for work myself. I’m not dogmatic about these things, so why shouldn’t I? After all, I didn’t have to think for a second about embracing the iPod⁵, the iPhone, a mouse with a scroll-wheel or a software feature like Exposé (now called Mission Control). When I’m being critical of the iPad, I’m talking about what I see people really using it for and how that stands in rather stark contrast to how the bloggers write about it. Let me repeat: I do indeed think the iPad has its niche and that’s fine by me. It’s the gap – or chasm rather – between the propaganda and reality that bugs me.

Why bother?

I’m concerned about where Apple, or more specifically the Mac, is going. I couldn’t care less if Apple made a lot of money with the iPad. It doesn’t bother me at all and why should it? On the contrary: more money in Apple’s bank account should benefit me as a Mac user as well. Especially since I vividly remember what that company looked like in 1996, when money was a big issue. So, good for them! Right? Well, not really.

It would all be well if it weren’t for the stupid and paternalistic movement inside Apple to align their different computing platforms in terms of functionality and user experience. That basically means overcomplicating iOS to the point (already reached) where features are no longer easily discoverable and, at the same time, dumbing things down on the Mac to a point (almost reached) where they become barely usable. The main motivation for doing this seems to come from the misguided notion that the iPad is somehow the sole future of computing. Thus Apple seems to feel the need to force this simplified concept of a computer down everybody’s throat – no matter the cost. Like I said: I certainly see the iPad in the future of computing, but it is not the be-all and end-all of computing. It’s a nice device for certain limited tasks that works best in conjunction with a “big” computer. The sheer fact that access to the file system is something of an afterthought (the crippled “Files” app, released seven years after the iPad) tells you everything you need to know about the real scope of the device. And don’t even get me started on the topic of not being able to freely install applications from any source you like⁶.

Losing balance

I didn’t use the expression “paternalistic” by accident. Apple has always been rather aggressive in deciding what is best for their users, but usually, that turned out to be okay. Those were design decisions they had to make unless they wanted to sell DIY kits to their customers (as I imagine Apple’s co-founder Steve Wozniak would probably have wanted). Nowadays, however, the balance is off. There has been too much of the aforementioned dumbing-down, fuelled – at least that’s my assumption – by the success of the iPhone and its touch interface.

The problem with this line of thinking is the following: you can only simplify things to a certain extent before they become unusable for everyone. Apple’s Disk Utility, the application used for erasing and partitioning storage volumes, is a good example of this: Apple has simplified it so much that the resulting mess is helping exactly no one. Why? Because first, the users who are not proficient with computers are still going to be afraid to use Disk Utility lest they do something wrong (or they don’t even know what the application is for – in both cases, they will ask a pro user for help) and second, the pros miss the information and options they used to have (like creating a software RAID – something which is no longer possible in Disk Utility). What’s Apple’s solution to this? They simply moved as much functionality as possible from the GUI app to the Terminal version so that the former looks nice and tidy.

Different versions of Disk Utility

Top: Disk Utility before OS X 10.11. Lower left: Disk Utility from OS X 10.11. Lower right: Disk Utility running in Terminal. Note the friendly rainbow colours in the lower-left version, which are supposed to give you a hint as to how your disk space is being used. Unfortunately, the feature is so badly implemented that it doesn’t serve any purpose at all.

If you want to create a software RAID today you have to open the Terminal and type in something like “diskutil appleRAID create stripe Storage JHFS+ disk2 disk3”. At least that’s the info which was up to date circa 2016. I assume from the “JHFS+” that this has changed in the meantime, but I would have to search the web for the relevant info once more to be sure. So much for discoverability – or using a Mac at all for that matter. Thank you so much. If I had wanted to use a computer through a command-line interface, I would have used MS-DOS back in the 90s, and today I would be using Linux (or even Windows) on a box I can assemble myself. The brilliance of the Mac has always been that it was full-featured with almost no compromise for the pro, while still being extremely easy to use and self-explanatory. Offloading functionality to the Terminal and telling users to just deal with it is therefore an extremely cheap move on Apple’s part. It’s the equivalent of giving up.
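
For the curious, this is roughly what that command-line route looks like – a minimal sketch only, based on the pre-APFS syntax quoted above; the set name “Storage” and the disk identifiers disk2 and disk3 are placeholders for your own setup, and the create command erases whatever is on those disks:

    # See which disks are attached and what their identifiers are
    diskutil list

    # Create a striped (RAID 0) set called Storage, formatted as Journaled HFS+,
    # from two physical disks – this wipes both of them (disk2 and disk3 are placeholders)
    diskutil appleRAID create stripe Storage JHFS+ disk2 disk3

    # Check the status of the newly created RAID set
    diskutil appleRAID list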

In the meantime, Apple is missing out on one trend after the other. For example: can you guess what devices I actually do see used in the real world more and more often? NAS units – specifically, NAS units by Synology. A market that Apple isn’t even in (but should be, I would argue). A great number of small companies and households can’t or don’t want to use cloud services – for whatever reason. For them, having a NAS around is perfect, especially one that is easy to use and flexible. Meanwhile, Apple doesn’t even have a server OS anymore (not that the one they once had was any good).

It is in light of circumstances like these that I don’t understand what is going on at Apple, and with the bloggers who try to force the notion of the iPad as a fully fledged computer replacement. As for the device itself: if you like it, go ahead and use it. It is not, however, the harbinger of a “Post-PC era”⁷ as some journalists and bloggers would have you think. If it were, I’d be using one right now.

  1. I’m going to talk about Apple and the iPad throughout this text, but what I say about the latter basically applies to all tablets. ↑
  2. Just when I was about finished writing this piece, someone published a gazillion-word blogpost-slash-book entitled “Beyond the Tablet: Seven Years of iPad as My Main Computer”. If you’ve read my text here, the title of that piece alone should tell you almost all you need to know. The fact that the author was “supported by” a developer of an iPad app should tell you the rest. That being said, if the iPad works for him, good for him! But one thing becomes clear very fast when you read his piece: it’s a messy hell of dongles, keyboard cases and software workarounds – compromise upon compromise wherever you look. Even the author himself has to admit this. Here’s what he says about file management, for example: “Having gone through all the different ways I use Files and its derivations on iPad, it shouldn't be a surprise that I have a long list of complaints, ideas, and suggestions for how Apple could take iPad file management to the next level. Strap in because… I have thoughts.” How nice. Well, I have a thought too: Apple solved these problems in 1984 with the Finder. But if you think that it makes sense to use a combination of multiple file managers to almost, but not really, reach 1984 productivity levels, be my guest. ↑
  3. What else could Photoshop have been in 1990 but a Mac exclusive? While in 1987(!) the Macintosh II could show bitmapped graphics in 256 colours on a 640×480 display, the best you saw most PCs do at that time was to run MS-DOS in text mode. Most of the time that meant 40×25 characters and no (usable) pixels. Three years later, in 1990, when Photoshop came out for the Mac, the situation on most PCs wasn’t much better. You really had to be there to appreciate the difference between the two platforms at that point in time. For reference: Windows 3.1 – the first version of Windows that you actually saw some people use – came out only in 1992 and was considered pretty shitty, even by PC users. The time for halfway respectable graphics in the PC world – the VGA graphics standard was only then adopted widely enough – came in 1995 with Windows 95. That’s eight years after the Macintosh II was introduced. This should help you appreciate why Steve Jobs was always so grumpy when asked about Apple in the 90s. He had to watch the advantage that his old company had with the Mac being slowly eaten away by the competition. All thanks to mismanagement at the helm of Apple. And today, history is repeating itself – thanks to Tim Cook and his failure to even grasp the basics of what a technology company, let alone Apple, is all about. ↑
  4. If you look hard enough, you will also find people who will do, eat or say anything. Even people that refer to Bill Gates as some sort of a software genius, to Yuval Noah Harari as an intellectual and to McDonald’s as a restaurant… ↑
  5. I’m happy to say that I wasn’t one of the poor sods who were bitching about the iPod on MacRumors when it came out in 2001. On the contrary, I ordered one right away and loved it. The thing even got me laid, believe it or not, but that’s a different story… ↑
  6. Don’t even think about contacting me with the argument for security. That’s just bullshit with roots in half-truths, created to protect the 30% cut that Apple takes on everything sold through the App Store. And whatever happened to the concept of owning your device or your data? How much is that idea worth in a world where you can’t freely install applications or have unrestricted access to all your files on the storage medium of your choice? ↑
  7. A term which by the way – and conspicuously in support of my hypothesis – I haven’t heard in a while… ↑

After I published this piece, two of my readers drew my attention to the fact that Apple restored the software RAID functionality a while back. I stand corrected, but I also stand by the point I was making: Disk Utility is still only a shadow of its former self. The RAID feature aside, it is still exemplary of something larger going on with Mac software from Apple. You could just as well look at Pages, Keynote or macOS as a whole and find the same trend manifesting itself to different degrees.