Jef Raskin was the father of the Macintosh.
If you don’t know the history, go catch up by reading his Wikipedia entry, then come back here for the rest.
Raskin, now unfortunately deceased, did a lot of thinking about what makes a good user interface.
Since his death, his website pages have been dispersed to their constituent electrons for recycling.
I’m going to excerpt a variety of sentences from him because he seems to make part of the case for the iPad. In fact, some of these excerpts make me wonder if he should be acknowledged as the unofficial father of the iPad.
For an interface feature to be humane it must be easily learned and it also must become automatic without interfering with the learning of or habituating to other features. The present blend of hard-to-learn keyboard shortcuts and difficult-to-automatize menu choices fails on both counts.
To develop an interface that can be operated automatically by a human places constraints on the design, constraints that we learn about from cognitive psychology’s studies of habit formation. For a feature to be habituating, for example, it must be usable without requiring that the user make any decisions. It is better to provide only one way of accomplishing a task when the time lost in deciding which method to use is greater than the time lost by choosing the slower of the methods.
And this one really screams “iPad”:
Interfaces must be designed to accommodate our ability to pay conscious attention to only one object or situation, called our locus of attention, at a time.
And so does this:
Another common feature of present systems, file names, causes difficulties in that it is vexing to have to come up with unique file names (within a limited number of characters) whenever you wish to save your work; it is even more difficult to try to remember file names at some later date. It is possible to eliminate file names altogether. In addition, a user should never have to explicitly save or store work. The system should treat all produced or acquired data as sacred and make sure that it does not get lost, without user intervention.
In present systems, work gets done in applications (which are sets of commands that apply to certain kinds of objects). Tasks are not accomplished at the desktop, and desktops (or launching areas in general) should disappear as interfaces improve. The idea of an application is an artificial one, convenient to the programmer but not to the user. From a user’s point of view there is content (a set of objects created or obtained by the user) and there are commands that can operate on objects.
And this, which really presages Pinch In/Out:
The twin problems of navigation and limited display size can both be ameliorated by using a video camera paradigm, where the user can zoom in and out and pan horizontally and vertically over a universe of objects.
There used to be an actual demo of this interface online several years ago, well before the introduction of the iPhone.
Imagine a sort of iPhoto album taken to the extreme, where everything is in that album, zoomed out to the point where a large collection of files is represented by images smaller than a thumbnail. You zoomed into items, and once an object filled the screen it became active and could be worked on.
In other words, no program was ever specifically invoked or launched. Everything was file-centric.
That interface didn’t make sense with a mouse. It would have made more sense with a touchscreen and Pinch In/Pinch Out. But I’m not keen on every kind of file — text, PDF, video, audio — being spread out in an enormous collection of very small thumbnails.
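At the heart of the zoom-and-pan paradigm Raskin describes (and of Pinch In/Out) is one small piece of math: when the zoom level changes, the pan offset has to be adjusted so the point under your fingers stays put on screen. Here is a minimal sketch of that adjustment; the coordinate convention and the function name are my own, not anything from Raskin’s demo:

```python
def zoom_about(focal_x, focal_y, pan_x, pan_y, zoom, new_zoom):
    """Return a new (pan_x, pan_y) so that the world point under the
    screen focal point stays fixed when the zoom changes to new_zoom.

    Assumed screen-to-world mapping: world = (screen - pan) / zoom.
    """
    # World coordinates currently under the focal point.
    world_x = (focal_x - pan_x) / zoom
    world_y = (focal_y - pan_y) / zoom
    # Choose the pan so that same world point maps back to the focal point.
    new_pan_x = focal_x - world_x * new_zoom
    new_pan_y = focal_y - world_y * new_zoom
    return new_pan_x, new_pan_y


# Example: doubling the zoom about screen point (100, 100) from a
# neutral view shifts the pan to keep that point stationary.
print(zoom_about(100, 100, 0, 0, 1.0, 2.0))  # → (-100.0, -100.0)
```

Every zoomable canvas, from a mapping app to a pinch-driven file universe like the one in Raskin’s demo, does some version of this on every zoom step.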
Another thing Jef Raskin worked on was a word processor brought out by Canon, called the CAT.
I had one of those, so I can speak from personal experience.
I did not find it to be a gratifying experience at all.
Here’s one page of its Reference Card.
I never really got used to LEAPing. And I felt very frustrated by the limitations of the system itself.
Maybe I was too used to other computing paradigms (having used both an Apple II and a Macintosh, and having owned a Commodore-64 and an Atari ST!!), but the CAT felt restrictive to me. In its defense, it was meant primarily for word processing, yet even at that it seemed constricting compared to the Wang word processors I had also briefly used.
These days, I think it’s personal computing that has reached an endpoint. Windows and Mac OS X have both become very time-consuming to use. I don’t think I’m the only one who winds up wasting time hunting for “just one more tool” to squeeze in a new function or eke some kind of overall system improvement out of them. All of this distracts me from getting work done.
That’s partly why I think the time is right for the iPad.
Despite the limitations the iPad will have, I think we’ll quickly come to the point where people will pick up an iPad first to get things done, instead of turning on a desktop computer.
I don’t see desktop computers going away. The current monsters will still be around, but increasingly they will revert to what they were originally: basically customized hot rods. Specialist and hobbyist devices.
If you want some perspective on how much personal computing has changed, read about the state of the industry in Raskin’s paper: Computers by the Millions, An Apple Document from 1979.
Dig this bit:
It will be very easy for a programmer (or almost anybody else) to make an error that costs the company a million dollars, even without anybody generating a lawsuit. All the error must do is force the company to make an update to a piece of software that went out with each machine for the last few months.
If that puzzles you (and I chose that excerpt intending it to), go read the paper to see why.
And even back then in 1979, Raskin saw very far ahead:
The third generation personal computers will be self-contained, complete, and essentially un-expandable. As we shall see, this strategy not only makes it possible to write complete software, but makes the hardware much cheaper and produceable. The kinds of options that do not give programmers nightmares are things like case color, kind of screen (so long as size, aspect ratio and resolution are unaffected), power supply and the model name.
So, is Jef Raskin the unofficial father of the iPad? Or maybe its uncle?
Apple Museum: Biography: Jef Raskin – by Ruth Bonnet
Recollections of the Macintosh project
Jef Raskin: A Life of Design
ACM: A Conversation with Jef Raskin
Intuitive Equals Familiar
Wired: Down With GUIs!