Since ~2013, Apple designers have been throwing overboard lots of conventions the company itself had been establishing for decades.
I remember a user interface design class at my university ca. 2005 where 20 out of the 30 best-practice interaction design patterns originated at Apple!
Steve Jobs, for the most part, really cared, and you could feel those priorities clearly: "it's how it works, not how it looks!"
Aside from some natural missteps, the "form over function" critique at the time was predominantly false. Apple is slowly getting there though, joining the "ignorant web" as correctly called out here by Nikita.
The thing is, none of this is a joke, nor should it be taken lightly. It's 2024, and by now we've fully realized the "Software is Eating the World" prophecy; we live in a digitally permeated world.
Bad design is a moral issue: in worst-case scenarios it has killed people before, and it will increasingly kill or harm even more people going forward. It always starts with the little things, especially so in design / engineering.
I desperately hope that Zoomers at least will start to realize that Millennials really fucked it up in that regard. I know, I know, it was also the bosses pushing for this, but we clearly should have said "no" much more often as the professionals (?) implementing this stuff.
There is much satisfaction waiting in learning a full-grown craft with deep history.
Zoomers: Alan Cooper's "About Face" is a great start, probably super cheap these days as seemingly no one cares anymore.
For a few years in the last decade, it seemed that UX design was getting recognized as a serious discipline rooted in user research. Then, somehow, it devolved into fashion. When/why did that happen?
I'm sure there are many exceptions, but a decent amount of what I've seen is portfolio-driven design, where the goal is more to have something eye-catching, unique, and interesting that will look good in a portfolio and impress other designers than to build something well considered and reliable/predictable. There can be a sense, especially amongst more junior designers, that the job of design is to add some style, and that design should be fun and much more like creating abstract art than making sure door handles are in a reasonable place and turn in the expected direction. The end result is, as you say, more fashion than function.
Does anyone have a design inspiration site that isn't this useless portfolio garbage?
It's incredibly annoying, and I say that with an interest in art, the abstract, cool concepts, etc., but I want to see sites and interfaces with "actual messy real-life content", not just one big image or whatever idiotic whitespace hell everyone's doing with way too much scrolling on Awwwards, Behance, Httpster, Gsap, etc.
It's relatively easy to make a big font, a big picture and a 3D effect look cool; it's much harder to present 15+ items on one page and create a cool visual narrative around them that both grabs attention and lets the user go solo if he wants to, without scrolling two miles.
I feel like a few newspapers were okay examples of this 5-10 years ago, but now they've also gone whitespace crazy.
We need a site like "Real life UX" or "Actual usable design" inspiration.
If you find out the answer, I've also been looking for that everywhere to no avail. Sometimes I go to websites like https://dribbble.com/ and search "Rich UI" but results are really hit or miss (mostly miss)
Honestly? Install classic software and use it! I needed exactly that sort of reference to design a project and, knowing I wouldn't find it on the web, I booted a Win95 VM and studied it. My designs improved dramatically.
There's no money in making a thing that works well, only in making things that look good. Effectively 100% of people who are using software are using it for things that fundamentally don't matter, so why should they care if the functionality is shit? Personal computers and phones are fashion statements, not useful devices. Business computers and phones exist to facilitate the bullshit jobs that employ the majority of the white-collar population. Follow the money; nobody with money cares.
Maybe a bit cynical and hyperbolic, but the point is good.
I'd boil it down further and say it's a focus on short term gains over long term gains. If the pan flashes, that's a win, full stop. When the pan stops flashing and people don't want to use your software because it's confusing, that doesn't matter because they can just flash another pan.
> There's no money in making a thing that works well, only in making things that look good.
I work at a company building vending machines and such, and it's the other way around here. Most products are shipped with a pretty mediocre UI because it just isn't valued. The software has to run the machine, vend products to people, and not eat their money.
> There's no money in making a thing that works well, only in making things that look good
Amazon, AWS, Salesforce and anything Oracle entered the chat.
More seriously, I think it really depends. People will use and pay large amounts of cash for stuff that solves their problem and does not have a fancy-looking UI.
1. 99.99999% of management have zero understanding of UX. So their view of UX is basically some designer making it "pretty".
2. Most UX design probably isn't taught properly, especially software user interface design.
3. A lot of design in that era came from the web. And if we read this article, we already know or can guess what web design was like.
4. It is my observation that tech, or Silicon Valley historically speaking, learns very little about the history of its own industry. Unlike many other disciplines, things come and go as in the fashion industry. Combine that with the hype machine and VC money: apart from politics or finance, there is no other industry that contains as much noise as tech.
5. Conservatism (not politics) is generally not well accepted. Finished software is not appreciated. And if you can't improve or remake something, there is no way you can move up the ladder. The fundamental stance of not chasing hype or making large changes goes against the Resume Driven Development model.
Electron-based applications seem like a huge factor. When native applications were abandoned in favor of Electron, designers stopped trying to match their designs to established operating system standards and began designing from scratch, with much poorer results, because they didn't have the resources and experience of the teams that had worked on major OSes.
Prior to that, deviations from established standards (layouts, colors, logic) were seen as unprofessional and tasteless. Things like buttons with unusual colors made software look like a shareware hobby project downloaded from Tucows, and nobody wanted their product to trigger these associations. Premium software made for Windows wanted to have the look and feel of Word and Excel.
I'll give my perspective as a designer turned developer. People have always conflated design with "desenho" (drawing). But design is supposed to be more about information architecture. It just so happens that what's usually trusted to be architected by designers is materialized graphically. But when the whole ecosystem of training and employment robs designers of their impact by not integrating them into both higher- and lower-level industrial processes, they're hopelessly left in their corner, with a lot of energy to spend on the one thing they actually have a stake in: visuals.
> Then, somehow, it devolved into fashion. When/why did that happen?
I'll tell you what happened.
Apple.
I've been saying for years now that Apple is not a tech company, they're a fashion company. Alternatively, they make tech fashionable, which means abandoning function in favor of form.
Actual UX is unimportant. It just has to look nice.
What infuriates me is when other companies then start to copy them. I'm looking at you, Microsoft. With Windows 11, it seems you have forgotten that many of your users stick with you because you're not macOS. So why would you want to start imitating macOS?
> Since ~2013, Apple designers have been throwing overboard lots of conventions the company itself had been establishing for decades.
For decades, all the conventions had developed on desktop. By ~2013 it was clear that mobile required different conventions, and that it was important to also unify conventions to some degree across mobile and desktop.
Also, traditional desktop apps had largely limited themselves to the UX vocabulary provided by the OS's graphical widgets. But with the rise of high quality CSS and JS, websites and apps became more free to develop their own conventions, separate from anything coming out of Apple or Microsoft. Hamburger menus and pills and what have you.
So it makes perfect sense that Apple started to evolve more rapidly around that time. And good for them -- none of these rules can or should be set in stone.
(And please don't make this about generations, that's just silly. Trying to assign blame to entire generations is utterly meaningless. Generations are made of individuals who disagree with each other.)
I believe these rules should be changed rarely, consciously and very carefully. Perhaps it would be worth it for them to explain their thought process behind revising them in some developer session.
Meanwhile, every person working on design systems should think about decisions like these deeply. Designers, engineers, accessibility specialists should all talk together and come to some common ground before doing something like this.
I am quite positive that (despite this and countless other fun examples to the contrary) the average UX floor has risen by a lot.
Sure, things regress and move in waves, but on the whole, user-centered design has been established as a primary concern of software development, and that really was not the case back then.
Take something like error handling in a form. In a lot of average software, it was not at all uncommon for a form to just say "Error" when something went wrong (or just not submit). Or lose all form input after unsuccessful submission. Programmers were unironically confused about why people would not just enter correct information. People then wrote books about how to design form errors. Now, basically every web framework includes at least some form of validation and error handling by default(-ish), and most people would be seriously confused if they saw something like the above.
If you find it easy to poke holes in this one, please consider the average across all the little things that go into not fucking up a form, which is still hard to get really right, but again, I am describing something of an average expectation here.
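To make the contrast concrete, here's a minimal sketch of that kind of per-field validation in plain TypeScript. This isn't any particular framework's API; the field names and rules are made up for illustration. The point is just that every field gets its own specific message, and the submitted values are echoed back so the form can be re-rendered pre-filled instead of showing a bare "Error" and wiping the user's input.

    // Hypothetical signup form; the fields below are invented for illustration.
    interface SignupForm {
      email: string;
      age: string; // raw text input, validated below
    }

    interface ValidationResult {
      values: SignupForm;                                // echoed back so the form can be re-rendered pre-filled
      errors: Partial<Record<keyof SignupForm, string>>; // one specific message per failing field
      ok: boolean;
    }

    function validateSignup(values: SignupForm): ValidationResult {
      const errors: ValidationResult["errors"] = {};

      if (!/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(values.email)) {
        errors.email = "Please enter a valid email address.";
      }

      const age = Number(values.age);
      if (!Number.isInteger(age) || age < 13) {
        errors.age = "Age must be a whole number of at least 13.";
      }

      return { values, errors, ok: Object.keys(errors).length === 0 };
    }

    // Usage: render each message next to its input and re-populate the fields
    // from result.values rather than losing them on a failed submission.
    const result = validateSignup({ email: "not-an-email", age: "twelve" });
    console.log(result.ok, result.errors);

Today most web frameworks ship something equivalent by default; the interesting part is how recently that became the baseline expectation.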
I would pin this to two major developments:
1. Designers are increasingly everywhere. If you think "duh?", this is entirely not how software was made. Programmers, commanded by business people, made software.
2. Most programmers today are also designers, and I don't mean in the sense that they always were (designing the software), but as in "thinking about people using the product".
Again, this might feel like a comical thing to even say, but in most places programmers were just not expected to do anything to make the user's life simple unless explicitly told to. That was the designer's job. In fact, a lot of programmers considered it a holy duty to fight any feature that was merely a convenience, and were quite adamant that, surely, the user could simply be expected to suffer a little and work a bit harder to get things done, if that meant keeping the code base pristine.
I think your point 2 is absolutely on the nose here. It fits in with broader industry trends in testing and operations.
And perhaps that's where the OP's question originated from?
As we've watched the despecialization of our field in testing and ops, we've seen things improve as ideas are introduced more widely, while also seeing them get mimicked and cargo-culted as those ideas diffuse.
Maybe the coders who were fighting against testing mandates or devops or design thinking were just insecurely admitting to their own ignorance on these topics and asking for assistance in being able to perform their new duties effectively?
One value of specialists is that the freedom that comes with specialization enables them to do their job more completely. Fred Brooks's surgical team could not be more relevant.
Mind you, I know this has probably never been the case at, for example, Apple, Google, or other shops that worked in a similar spirit. But as a mainstream phenomenon, you need look no further than the late 90s or early 2000s to find that average programmers in mid-tier companies harboured a mix of non-empathy and non-sympathy; user frustration over a complicated interface, and a designer's call to do something about it, were regularly met with arrogance, a sigh, or a frown.
Of course, this can also be credited to the fact that UI design for software was in a much different place in general.
> Since ~2013, Apple designers have been throwing overboard lots of conventions the company itself had been establishing for decades.
2013 was when we first witnessed it in effect. It started a little earlier inside Apple. When Scott Forstall was forced out, the whole software user interface fell to Jony Ive, and he basically ripped everything out and redesigned it with iOS 7. There is a huge difference, or dare I say they are 95% completely unrelated fields, between software UX and hardware UX. Apple then spent the next 3-4 years walking back the design changes made in iOS 7.
Unfortunately, a lot of UX learning was lost during that period, including the senior Human Interface people retiring during 2015-2020. The group has also grown rapidly in numbers under Tim Cook. A lot of the Steve Jobs-era design requirements and the "why" behind them were diluted as more new members joined.
The design from Apple today may still look beautiful, but it is no longer as functional as it once was.
I blame the human spirit. User interfaces could have been the one thing we collectively agreed to "stop innovating" on and delivered better experiences for everyone. People are unable to stop innovating. People paid to look at design can't just say: it's done, we did it. And now I fully expect that for the rest of my life I will need to navigate an increasingly complex labyrinth of user interfaces which I will one day be unable to figure out.
I'm not sure if I read all the way through "About Face" when I first got my copy, but it -is- very good and I have it stashed for the next time I end up doing UI design.