The "crapification" you describe is the creep away from the scientific
principles that once underpinned this field. Before UX we had HCI
(Human Computer Interaction) which was in turn a development of CE
(Cognitive Ergonomics) and other "human factors" sciences [1].
These sciences were rooted in very rigorous but time-consuming tests,
observation, psychology and physiology.
from TFA: "Designers want a "clean" interior with minimal switchgear"
This is where the wheels fall off the wagon. Should "what designers
want" be high amongst the priorities for safety critical products?
> from TFA: "Designers want a "clean" interior with minimal switchgear"
Speaking as someone with an Interaction Design (IxD) degree: no we fucking don't. Tactile buttons being superior has been known for ages. For example, Bret Victor wrote "A Brief Rant On The Future Of Interaction Design" in 2011, so over a decade ago[0]. Not that anyone with the power to change things listened, because these decisions aren't made by the designers.
This is mostly a consequence of people higher up trying to save costs by using touchscreens, which is cheaper to buy and cheaper to develop for. HCI and IxD have always had this issue that we're asked to fix things up after everything else has already been decided. Basically, we're mistaken for graphic designers who decide on what the final product will look like. So we're given a touchscreen to develop an interface for, not a blank-slate car interior (or whatever) for which we get to decide the button layout.
At the risk of pulling a "no true Scotsman", this is a consequence of cost-cutting first and foremost. Don't blame the people who actually have a background in HCI or Interaction Design. We all knew this was coming, and we hated it. If we're told to make do with the touchscreens we are given, with the alternatives of actual physical buttons being ignored before we even get to make decisions, then don't blame us for the lack of those buttons.
The Bret Victor article is very good. Thanks for sharing.
Some remarks stood out for me:
> talk about technology. That's the easy part, in a sense, because we
> *control* it. (my emphasis)
Yes, I agree with him strongly. But - there's been a dreadful
anti-intellectual tide this past decade - a descent into
"technological determinism", or the idea that technology is its own
process to which humans must bend. It's the idea that we don't
control it. It comes along with the overuse of words like
"inevitable", "ubiquitous", "unavoidable" and endless talk of cats
escaping from bags and genies refusing to go back into bottles. It's
a defeatist and lazy creed that seeks to excuse a race to the bottom
of cheapness, as you describe, with a narrative about how we "have no
choice".
> if a tool isn't designed to be used by a person, it can't be a very
> good tool, right?
Increasingly, tools are designed to be used by other tools. Humans are
being sidelined amidst the interplay between machines. For example,
the demise of the Web is largely due to bots and the arms race to
create other gatekeeping bots to defeat them.
> Hands
Bravo! Not "a finger" or "your thumbs". That's why I use a keyboard,
interact through text-based technology, and cannot fathom
thumb-twitching smartphone users. I totally get what he's saying,
having worked in sonic interaction design with musical instruments
(NIME) stuff like the ROLI Seaboard (or whatever they changed the name
to)... hands and touch, with mechanical haptic feedback, are the way
to go.
I wish more people paid attention to this understanding of our
relation to technology as embodied beings, instead of chasing a
"clean" disembodied dream - which I think conceals a sublimated
Orthodox Dualism in the tech community - but that's another story.
Most executives don't have the vision or creativity to come up with these trends; they have to pick them up from somewhere. I think there's plenty of blame to go around though.
The trends adapt to the requirements of the customer. See also: the appearance of desktop interfaces that mimic phone interfaces, which don't fit desktop affordances at all.
Note that "customer" can be a manager or similar higher up in the hierarchy.
The cost of fixing hardware failure in a final product is not the same thing as the cost of developing and mass-producing the product.
For example, we're not talking about one button, we're talking about a lot of buttons, usually custom-made for the car in question. The whole dashboard physically has to be designed around them. Meanwhile Tesla just slaps a screen on a mount in the middle of the car and calls it a day. It's basically "we have to get everything right the first time" vs "fuck it, we can always fix things in a later software update". Which is a way to save costs by cutting corners.
The buttons all have their own complicated logic too, although I suppose that even with physical buttons one can handle almost all of that purely through software these days, so that's not really as much of an issue any more as it used to be (it does make me terrified that cars can be hacked and bricked, but I digress).
Speaking of a lot of buttons, that's the other thing: if all your buttons are virtual, you can have infinite buttons! The only thing we have to do is introduce a ton of mode switches! Which is absolutely terrible when you're driving, but nobody seems to care! So we can cram a ton of features into a screen that would otherwise require a million buttons, and use that in marketing. Even though we'd probably be better off if some time was spent to whittle things down to the essentials and design the interface around those cleanly.
Many buttons also means many more pieces to physically install, and many many more wires. And each one (or small cluster) is often accompanied by even more independently-wired small information displays (small LCDs and LEDs for showing the state / temperature / etc) which are yet more wires.
A touchscreen is largely just a single fused physical unit with ~two cables: a data ribbon and power. Utterly trivial to install and wire up in comparison.
I have a bachelor's and master's in Industrial Design. When I first entered the software industry after grad school in 2000, a master's was the floor for work in UI Design or Information Architecture (Ux wasn't a job title at this time). Many of the people I worked with in these early days were CogPsy PhDs. Design was slow and methodical. This seemed to hold true for the next decade or so. As design as a competitive advantage (or necessity) started to take hold, more and more people flocked to Ux. Many in the field today are self-taught, attended bootcamps, or pivoted away from graphic design (thanks Dribbble) to Ux. Did we lose something when many Ux practitioners no longer have roots in HCI, library science, industrial design, or human factors? I'm not going to judge. Myself, I transitioned from Ux to programming.
No it's not their job, and I'll try to explain why I think that.
Apart from the remit being just too broad, designers are in any case
part of a complex team that deals with a multitude of functional,
non-functional, regulatory and financial requirements.
Now, we have many different definitions of "designer", which I am very
aware of, but I believe that, in some circles "designer" has become
romanticised and extended to include a set of perceived "magical"
powers to "deliver what a boss wants". That is a distortion of the
role to something grotesque.
Speaking from a domain in which I have expertise; in sound design a
great battle ensued between designers, users (audiences) and the
'bosses' (studios and publishers) as to how music and films should
sound. You probably know this as the "Loudness Wars". I think it
remains a textbook example of misalignment between technical, artistic
and financial factors. It also remains an example of why I think
"Markets are a myth" [2].
Despite listeners saying over and over that they "don't want it", the
producers, through a mess of internal motives (mainly financial),
repeatedly foisted their values onto them, obsessed with what they
think users want in the face of flat-out contradiction that would be
evident in even the most cursory market research.
The job of a designer is to balance factors, and in a sense act as an
advocate (stand-in) for the user by mentalising their actual
needs. It's a very demanding and complex skill. Doing "what your boss
says" is absolutely not it and reduces a designer to a tool.
On the other hand, a job of the designer is also to listen to expert
technical advice outside of their skill-set, and so must not get
carried away with any grand "aesthetic vision", wanting to be Steve
Jobs.
A hard line to tread, and one requiring strong will and ethics as well
as judgement.
Related: the challenges sound designers face in making movie dialogue audible. What seem like simple problems (make car climate control buttons easy to use, make the speech in a movie easy to understand) turn out to be incredibly complex.
Genuinely curious, what do you like about it compared to Win10?
I've only tried it in a VM for a few minutes so far, but was unnerved by the general feeling of 'pretty, but impractical', mainly thanks to the taskbar and the right-click 'hide everything by default' context menu.
Evidently not, if the work they're producing is reportedly outperformed by old-school physical controls from more than a decade ago. And in most of the vehicles tested, it wasn't even close.
I worked briefly as a freelance experience designer hired by an appliance manufacturer. I asked if they could send me physical prototypes of controls so they could be tested. They refused and said it would be too expensive. They expected the controls to be designed, spec'd, and sent to the factory without any usability testing.
Designers can do all those things, but often they're not given the space to.
The best products are typically produced in an environment where the people running the company care about the design. This is a rare environment.
Often they recruit kids with graphic arts backgrounds, hand them some fancy post-it notes and a YouTube video of how Zipcar did a journey map, and set them loose.
UX usually focuses on the critical path for the top-5 tasks. So turning on the car radio makes sense, but changing the radio station didn’t make the cut, so some rando engineer guy stuffed it in a menu.
When it’s done well with a great team and time it’s magic. It’s easiest to see when Apple gets software right, like Keynote - the functions of making a presentation are immediately obvious to an elementary school student. But even then, once you leave the happy path, woe to you - modifying a template is a dark art to most people.
Or you could use Apple's iTunes as an example of how to build one of the world's worst and most user-hostile interfaces, but one that every iPhone user must deal with unless they let Apple have complete access to all their information via iCloud.
I'm convinced most people really don't like iCloud, but since the alternative is iTunes, they basically have no choice...
[1] https://iea.cc/what-is-ergonomics/