You feel alienated from your computer because there has been a conscious decision to take away options and user control in modern software. And I get why that decision has been made, even if I hate it as much as you and every other computer enthusiast.
90% of people have always "felt alienated" from their computers. They didn't understand what was happening or why things changed either, and it was easy to get yourself into trouble when you didn't know what you were doing and tried to fix something.
So companies decided to give their software fewer options and do more things automatically, without asking the user to make a choice. They don't give users options to customize, so they don't have to worry about those customizations causing breakage.
For advanced users this is crippling, but there are a lot more of them than there are of us, so they are going to be catered to.
> 90% of people have always "felt alienated" from their computers. They didn't understand what was happening or why things changed either, and it was easy to get yourself into trouble when you didn't know what you were doing and tried to fix something.
I have stubbornly resisted it, but I think I will go the way of all of my friends and just accept it soon.
I now spend more time doing personal system administration than at any other time in my life as a computer user. If you want control of your computing devices, you need to spend more time on it now than you did in the age of 5¼-inch floppy disks.
Most updates are a one-way trip now, and vendors aren't keen on publishing exactly which features they have removed, so a lot of time is spent disabling updates, firewalling, researching, jailbreaking, imaging, and backing up.
My biggest liability now is not malware, but updates! I have to put all of my development toolchains in virtual machines, because they will break and I cannot rely on being able to re-create them. Re-creating my modest workflow is a biannual affair, when it really shouldn't be.
And there has been a cultural change in software development as well. Software like Firefox will clobber your data during an update, and when you file a bug report, it will be closed WONTFIX, and they will say it is your fault for not using Time Machine to roll back their changes. They did this a while back with bookmarks, and they certainly do it with extensions. I had to spend an entire afternoon recovering annotations and citations that were destroyed by a Firefox update, and I was told it was essentially my fault for trusting Firefox and not having hourly backups.
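These days I script those hourly backups myself. For what it's worth, here's roughly the sort of thing I run; a minimal sketch in Python, assuming a Linux-style profile location (the paths are placeholders for your own setup, and Firefox should be closed so the SQLite databases aren't copied mid-write):

    #!/usr/bin/env python3
    # Timestamped snapshot of a Firefox profile before letting an update run.
    # The PROFILE path is an assumption: ~/.mozilla/firefox on Linux; macOS
    # keeps profiles under ~/Library/Application Support/Firefox/Profiles.
    import shutil
    import time
    from pathlib import Path

    PROFILE = Path.home() / ".mozilla" / "firefox"   # adjust for your OS
    DEST = Path.home() / "firefox-backups"

    def backup_profile() -> Path:
        stamp = time.strftime("%Y%m%d-%H%M%S")
        target = DEST / f"profile-{stamp}"
        # Copy the whole profile tree, including places.sqlite
        # (bookmarks/history) and the extensions directory.
        shutil.copytree(PROFILE, target)
        return target

    if __name__ == "__main__":
        print(f"Backed up to {backup_profile()}")

Run it hourly from cron and a bad update costs you at most an hour of annotations instead of an afternoon.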
I hate to sound like a broken record, but there was a time when you could reasonably assume that if an update was making major changes, it would give you the option to go back, or at least to export your data if rollback wasn't supported. I really wish the open source community would step up and be different, instead of embracing this.
I think these "lower the ceiling rather than raise the floor" rationales are cop-outs.
Yes, I get it: this reasoning does technically provide a justification for the design decisions, one that will shut some people up. But that doesn't make the product or those decisions anything to be proud of.
And, to take it a step further, I don't think UX is putting in an honest effort to improve things for unskilled users. All the low-barrier-to-entry stuff is superficially pleasant, but the number of common UI paradigms end users have to intuit their way around has exploded, while affordances and discoverability have plummeted. People who've spent their day jobs in front of computers for years - if not decades - are assumed to categorically have bad, less-than-worthless ideas about what might make interacting with those devices easier. All because "shut up, nerds. Nobody would ever like the things you like" was easier than listening and figuring out how to separate the wheat from the chaff.
Ironically, despite there being fewer choices in an effort to make things easier for that "90%", the amount of tech support my friends and family request from me has only increased in recent years.
Personally, I don't buy the whole "removing choices to stop users from hurting themselves" excuse. To me it seems like overzealous designers trimming far more than necessary to make things look nicer, at the cost of usability. But what do I know?
People are using tech to do a lot more. 20 years ago tech was a thing you sat down at, turned on, and used to check your email, update a spreadsheet, or type up a document.
Now we're using computers and software every waking hour and using software with millions of distributed users that we expect to always show us the latest updates.
We're using phones for banking and payments, to turn our lights on and off, to keep a lifetime of family photos backed up and synced across multiple devices.
It has made things more complex, but I don't think a world of 1997 style configure-it-yourself software would necessarily help with that.
Regarding the increase, I would argue that it is less about increased device complexity and more about the increase in the number of people who use these devices daily. The more they use them, the more problems they encounter; I think the intersection of complexity and usage is driven more by the latter.
I doubt the number of people using smartphones and computers has increased significantly in developed/first-world countries, especially when it comes to one’s family and friends.
As anecdotal as it may be, my friends and family have used computers and smartphones for years, but I’ve experienced the same increase in requests for tech support as the parent comment.
Further, no one said there’s increased complexity. The argument is that the oversimplification, the removal of features, and overzealous design assumptions have made UX go in the wrong direction. It’s also an argument I agree with.
A lot of UX design today fails to recognize the spectrum of “tech literacy”, when it should, ideally, accommodate everyone within that spectrum rather than pander to the least “tech literate” end. That isn’t always possible, but it should be the goal. Instead, we have UX trending towards attempting to be so “intuitive” that it becomes counterproductive.
I had an interesting incident recently, where I was with some relatives and we were trying to plan the next leg of our trip: what restaurants to go to, what directions to take, etc.
Anyway, we had to start using a paper notepad and pens to keep track of the information!
Copying text between apps on a modern phone is that painful, even for people who just want to paste an address from a text message to look up in maps, and especially if you want to do anything with the calendar.
I just remember 15 years ago on my Treo 650 never needing to do that, and having no problem copying text between different apps seamlessly, between calendar, email, text, maps, and other apps. Same with Blackberry. Using modern Android is as awkward as driving a car with a mouse.
But I think there was an intentional push to minimize options for users, to leave fewer pathways for things to go wrong. Forcing people to use pen and paper when they have a smartphone next to them is a UX success for the vendors, because it means they don't have to improve how their software handles text.
I'm afraid that's Agile development for you; teams contemplating the emptiness of their backlog and rushing to invent an endless stream of "small" t-shirt-sized tickets to fill it and keep the velocity within the desired KPIs.
> You feel alienated from your computer because there has been a conscious decision to take away options and user control in modern software. And I get why that decision has been made, even if I hate it as much as you and every other computer enthusiast.
This is a weird claim to make here, because this issue only appears if you made a conscious decision to disable a critical security feature of the operating system, something only possible because you have "options and user control in modern software".
That this conversation arose within this context is somewhat ironic, but not particularly weird.
It’s great that Apple lets you disable SIP in macOS. It’s not great that Google frequently takes away user control. Different companies making different decisions in different situations is not contradictory, and outliers do not discount an overall trend.