Saturday, September 8, 2012

Thoughts on the progression of UI designs...

Following on from the Traces video, this week I've ended up seeing a number of videos and screenshots demonstrating the progression of computer interfaces over time. Having spent quite some time over the past few months thinking about UI design and how we could/should redesign interfaces so that they are easier and faster to use while encouraging discoverability, some of the things I've come across recently have been a bit eye-opening.

While reading research papers as part of my honours thesis/project work (but also through videos and articles like these), I've also found that many approaches that came to mind have already been tried - some successfully, some less so - along with others I wouldn't have thought of, which end up becoming "all roads lead to Rome" designs.

However, at the same time, there have been times when I've been struck with the immediate thought: "Wow! They had that back then? But gee... have our designs regressed?"


Hotkeys
For example, take the following demonstration of the Xerox Star, which inspired the WIMP GUI paradigm used for the better part of the last three decades:


For most people, the most significant features of this video are that it demonstrates the use of a mouse; the desktop, windows, icons, and files metaphors; and the fact that the thing had a graphical display (as opposed to being purely text-based).

However, what was more striking for me was the fact that it had dedicated keys on the keyboard for performing common actions, which worked in and were available across all applications. For example, there were dedicated "Copy" and "Paste" buttons (instead of Ctrl+C, Ctrl+V) that worked in all applications and were clearly labelled on the keyboard as performing that very purpose.
  • The big red "STOP" button was also quite amusing, though I'd much rather have seen a big green "GO" button in its place (where the Enter key on the NumPad usually sits these days) ;)
  • The "Move" button was a bit odd in some ways. But, at the same time I couldn't help but think of Blender - especially if there were dedicated "rotate" and "scale" buttons beside it at the same time. Hehe...
Now, one of the things we've learned is that command use is really repetitive and limited to a small subset of operations (as predicted via Zipf's Law). It also happens that most of these common/standard operations tend to be ones that work across a number of different applications - things like copy, paste, open, save, etc. So, to see that way back then they had the idea of assigning dedicated (and well labelled) buttons for these operations, and to have placed these in a convenient location near the user's Left Hand (assuming Right Hand on Mouse), was quite interesting. Without having actually been able to try such a setup (i.e. going purely on imagination and general principles), this seems like quite an efficient arrangement.
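
To make the Zipf's Law point a bit more concrete, here's a minimal sketch of the kind of distribution it predicts for command use. The command names, and the assumption of a pure 1/rank curve, are purely illustrative - they're not taken from any real usage logs.

```python
# Rough sketch of the command-frequency distribution Zipf's Law predicts:
# the k-th most popular command is used roughly 1/k as often as the most popular one.
# The command list below is invented for illustration only.

commands = ["copy", "paste", "save", "open", "undo", "delete", "find", "print"]

total = sum(1.0 / rank for rank in range(1, len(commands) + 1))

print("Predicted share of all command invocations:")
cumulative = 0.0
for rank, name in enumerate(commands, start=1):
    share = (1.0 / rank) / total
    cumulative += share
    print(f"{rank:>2}. {name:<8} {share:6.1%}  (cumulative: {cumulative:6.1%})")

# Even with only 8 commands, the top 3 already account for roughly two-thirds of
# predicted use - which is why giving that handful of frequent commands their own
# dedicated, labelled keys can pay off so handsomely.
```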

For now, we can only really speculate on what could have been. After viewing that video, I could only wonder why these buttons ended up disappearing. Was this yet another one of the Jobsian quirks (i.e. like getting arrow keys removed, and wanting a single-button mouse) that somehow managed to catch on? Was it just one of those economics-driven decisions (e.g. IBM didn't want to spend money building such keyboards), and/or did most software developers at the time not want the extra work of hooking up their code to such a standardised hardware solution? Was it that a bunch of left-handers complained that they couldn't operate the mouse AND use these keys efficiently without reaching over their keyboards? Or was there just a paradigm shift, where suddenly physical buttons on the keyboard became seriously uncool, and everything had to be point-n-click on the screen?

If it was primarily the last of these (and it probably was), then the recent shifts towards touchscreens and dynamic keyboards provide us with some interesting new opportunities. While the ability to dynamically assign labels to keys is interesting, far more interesting is the ability to create different layouts entirely. Perhaps all that's still missing are those haptic touchscreens (there was some news about this a while back) that provide the sensation of actually pressing a button, and/or a material which doesn't collect so much grease from fingers! Imagine having this working for sculpting on a tablet while on the go...

Hotkeys, Modes, Multiplexing, and Spatial Memory
But, let's take a step back and think about why having a set of dedicated buttons for common operations located on the LHS of the keyboard (right under, or quite near, the Left Hand in rest position) may be quite an optimal setup. For this, we need to consider a few concepts which, I've been coming to find, are actually quite closely related:
  • Modal vs Non-Modal,
  • Space-Multiplexed (many devices/objects, each with a single dedicated function) vs Time-Multiplexed (one device, many functions),
  • and Spatial Memory
As many who have followed Blender's UI debates for more than a few pages will know, the issue of modality vs non-modality always comes up. Modes are generally considered evil by most (IIRC, Nielsen's Usability Heuristics mentions them as something to be avoided). In particular, this is because of the well-documented "Mode Error" phenomenon, where people end up making mistakes because they lose track of the mode they're currently in. This causes problems because the same keys may have different behaviours in different modes: if you forget which mode you're in - or think you're already in a certain mode because you're more focussed on the end result of the task at hand and forgot to change mode first - then, all of a sudden, the computer has done something which you did not intend it to. In some cases, the error may not even be noticeable, while in others, it causes the momentary "drat!" response.
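
To pin the "Mode Error" idea down, here's a tiny sketch of a modal keymap. The mode, key, and action names are loosely Blender-flavoured but invented for illustration; the point is just that one physical key carries a different meaning in each mode, so the outcome depends on state the user has to keep in their head.

```python
# Minimal sketch of a modal keymap (mode/key/action names invented for illustration).
# The same physical key is overloaded with a different meaning in each mode.

KEYMAP = {
    "edit_mode":   {"G": "grab_vertices", "X": "delete_vertices"},
    "object_mode": {"G": "grab_object",   "X": "delete_object"},
}

def press(key, active_mode):
    """Resolve a keypress using whatever mode the application is actually in."""
    return KEYMAP[active_mode][key]

# The user *thinks* they're in edit mode, but they never actually switched:
intended = press("X", "edit_mode")    # what they wanted: delete a few vertices
actual   = press("X", "object_mode")  # what the computer does: delete the whole object
print(intended, "vs", actual)         # -> the classic "drat!" moment

# A dedicated, single-purpose key (like the Star's "Copy" button) side-steps this:
DEDICATED = {"COPY_KEY": "copy_selection"}   # one key, one meaning, in every mode
```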

Anyone who's used the Microsoft Ribbon interface will know pretty well what I'm talking about (more on this later). For example, when you're on the "Home" tab and you want to insert a table, you might start moving your mouse towards the ribbon, only to realise when you get there that you're on the wrong tab ("drat"). One of the PhD students here was studying this, and came up with "CommandMaps": a spatially consistent version of the ribbon that basically splays out every tab of the ribbon on a separate row. Each command is now in a spatially consistent location, which was shown to allow users to quickly click on a target without worrying about mode switching (i.e. we've made a modal -> non-modal conversion). This works because we get rid of the spatial overloading. In other terms, we're going from a time-multiplexed interface (where the same spatial region can serve multiple functions) to a space-multiplexed interface (where each spatial location corresponds to a single function).
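
As a rough sketch of that time-multiplexed vs space-multiplexed distinction (the tab/command names and coordinates below are made up, not taken from the actual CommandMaps study):

```python
# Time-multiplexed: one strip of screen real estate, whose meaning depends on the
# active tab. (Tab/command names invented for illustration.)
RIBBON = {
    "Home":   ["Paste", "Font", "Styles"],
    "Insert": ["Table", "Picture", "Chart"],
    "Review": ["Spelling", "Comment", "Track"],
}

def ribbon_steps(command, active_tab):
    """Finding a command may cost an extra step: switch tab first, then click."""
    for tab, commands in RIBBON.items():
        if command in commands:
            return 1 if tab == active_tab else 2   # 2 = tab switch + click
    raise KeyError(command)

# Space-multiplexed (CommandMap-style): every tab is splayed out at once, so each
# command always lives at one fixed (row, column) position, whatever the "active tab".
COMMAND_MAP = {
    cmd: (row, col)
    for row, (tab, commands) in enumerate(RIBBON.items())
    for col, cmd in enumerate(commands)
}

print(ribbon_steps("Table", active_tab="Home"))  # 2 steps: wrong tab -> "drat"
print(COMMAND_MAP["Table"])                      # (1, 0): always the same spot
```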

Command Dashboards
In many ways, Blender's UI was in fact (many) years ahead of its time. IMO, one of the most notable examples of this is the old Buttons Panels that used to sit across the bottom of the screen. However, I'm not talking about the Horizontal Buttons Panels that were introduced in 2.3 and stayed through until the end of the 2.4 series. Rather, I'm talking about the Buttons Panels as they existed in "ancient" versions of Blender, such as 1.8 (AFAIK one of the first publicly available versions, or at least the earliest version available for download).

Horizontal buttons panel in Blender 1.8
In this original design, we had a fixed number of items so that they could always all show on the screen at once. This meant that it was possible to display a spatially consistent representation of the button layout as a panel that stretched right across the screen (in Blender 1.8's case, across the bottom of the screen), with pseudo-tabs (in the form of a row of toggle buttons at the top) to switch between different categories of items. To activate a command or change some setting, you just flung your mouse down to the bar across the bottom, and changed the setting (perhaps changing tab first). No need to scroll around to find the items, as they were always all visible.

Does any of this begin to sound familiar now? Why, this is almost exactly the Microsoft Ribbon, albeit nearly completely text-based, with a lot more buttons (many of which were smaller than some of the big "feature" icon-buttons), and running horizontally across the bottom of the screen instead of the top (*). Oh, and this was more than a decade ago. Duh duh duh duh!
(* ASIDE: I'm still a bit mixed on whether putting commands/controls in a big fat panel across the bottom of the screen or across the top is superior - in terms of facilitating rapid access, but also of being the most comfortable for motor movements and the easiest to spot.)
It's quite surprising that this very idea, which others are only now starting to adopt, is something we used to have for such a long time and have now abandoned. Sure, there were limitations with arbitrary-length lists. But, then again, perhaps we should now be reconsidering whether those lists were well placed alongside everything else in the first place! Could we have done it differently?

Anyways, back on topic: as I've mentioned in the past, I think horizontal panels can actually work quite well as an interface component for providing quick access to a mixture of commands and options. However, for this to work, it seems that it really depends on whether or not you can fit everything in (i.e. no scrolling), at a reasonable size (i.e. easy-to-hit targets - Fitts's Law), while maintaining enough prominent landmarks to facilitate quick orientation and recognition.
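
For reference, the usual Shannon formulation of Fitts's Law predicts movement time as MT = a + b * log2(D/W + 1), so shrinking buttons to cram more of them into the panel drives the index of difficulty (and acquisition time) up. Here's a tiny sketch of that trade-off - the distances, widths, and a/b constants are illustrative guesses, not measured values:

```python
import math

def movement_time(distance, width, a=0.1, b=0.15):
    """Fitts's Law (Shannon formulation): MT = a + b * log2(D/W + 1).
    The a/b constants here are placeholders, not values fitted from real data."""
    index_of_difficulty = math.log2(distance / width + 1)   # in bits
    return a + b * index_of_difficulty, index_of_difficulty

# Same travel distance (say, flinging the mouse ~600 px down to a bottom panel),
# but the target buttons shrink as we try to squeeze more of them in:
for width in (64, 32, 16, 8):
    mt, idx = movement_time(600, width)
    print(f"width {width:>2}px -> ID {idx:4.1f} bits, predicted MT {mt*1000:4.0f} ms")
```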

For vertical panels, a few months ago I did some experiments evaluating a few of the different ways of implementing them, trying to determine the dataset densities for which the various designs are suitable. If/when I get some time, I should really polish up the paper I wrote for it at the time (it was still a bit too light for actual submission to conferences/publication). In any case, hopefully there'll be more I can reveal about this in coming months.
