IMHO programming comes down to understanding logic: breaking many-step problems down into their smallest elements and being able to express a process as a flow or state chart. There's an old saying, "behave yourself or I'll replace you with a very small script", which I assume comes from the Unix world. ZSH and shells in general are surprisingly powerful in this respect. Perhaps consider automation technologies like Zapier, or Automator on the Mac? I suppose it's all about making one's (and others') lives easier with the minimum of effort.
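To make the "very small script" joke concrete, here's a hedged sketch in Python (the folder layout and invoice naming scheme are invented for illustration): a dozen lines that could replace a weekly filing chore.

```python
from pathlib import Path
import shutil

def archive_invoices(downloads: Path, archive: Path) -> int:
    """Move invoice PDFs into per-year archive folders; return how many moved."""
    moved = 0
    for pdf in sorted(downloads.glob("invoice-*.pdf")):
        # e.g. invoice-2024-03.pdf -> archive/2024/invoice-2024-03.pdf
        year = pdf.stem.split("-")[1]
        dest = archive / year
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(pdf), str(dest / pdf.name))
        moved += 1
    return moved
```

Nothing clever going on, which is the point: the "very small script" wins because the task is small once it's broken down.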
Here's a little anecdote as an example. I was consulting on a job to increase efficiency, and I noticed changes being made by hand to PDF invoices generated by a system that could not be changed. I can't recall the exact changes, but it was trivial to chain a PDF editing service together with Zapier to automate bulk changes to invoices (the business was growing, and the number of invoices was increasing significantly each week). The whole thing took less than an hour and saved the admin staff hours' worth of work each week. Hence the quote in the previous paragraph: a small script or automation did in fact replace a manual task for an admin person (though in this case I didn't automate anyone's job away, thankfully ;-).
"How to get better at technology" is a very general question, but I'm assuming you mean technology as it applies to business, which apparently is supposed to be all about increasing efficiency. Technology for entertainment is a completely different game, etc. etc.
+1 here for "breaking down many-step problems into their smallest element".
If you can do this, then the technology to solve each element is much easier to determine, and it'll also be easier to ask (Google, ChatGPT, coworkers) more specific questions that provide more specific answers.
According to Gall's law, breaking things down to simple elements is not only the best way, but the only way.
"A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system."
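As a toy illustration of "smallest elements" (the task and function names are made up): a "clean up a mailing list" job decomposed into steps so small that each one is trivially understandable, testable, and googleable on its own.

```python
# Hypothetical example: decompose "clean up a mailing list" into tiny steps.
def strip_whitespace(emails):
    return [e.strip() for e in emails]

def lowercase(emails):
    return [e.lower() for e in emails]

def drop_invalid(emails):
    # Deliberately crude check; good enough for the illustration.
    return [e for e in emails if "@" in e]

def dedupe(emails):
    seen, out = set(), []
    for e in emails:
        if e not in seen:
            seen.add(e)
            out.append(e)
    return out

def clean_mailing_list(emails):
    # Each step is simple enough to reason about (and ask about) in isolation.
    return dedupe(drop_invalid(lowercase(strip_whitespace(emails))))
```

This is Gall's law in miniature: each piece works on its own before the composed whole exists.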
Excel is a functional programming environment that works at scale. Yes, we hate it. Yes, date handling is a mess. But if you learn how to drive Excel you can do almost anything.
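The point about Excel being functional can be sketched in a few lines of Python: a toy model (nothing like Excel's actual internals) where each cell is either a plain value or a pure function of other cells, recomputed on demand.

```python
# Toy spreadsheet: a cell is a value, or a pure function of the sheet.
def get(sheet, ref):
    cell = sheet[ref]
    return cell(sheet) if callable(cell) else cell

sheet = {
    "A1": 100,
    "A2": 250,
    "B1": lambda s: get(s, "A1") + get(s, "A2"),  # like =A1+A2
    "B2": lambda s: get(s, "B1") * 2,             # like =B1*2
}
```

Change A1 and every dependent "formula" picks it up on the next read, which is the mental model that makes Excel usable for non-programmers.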
>I feel the major issue with excel (and other stuff such as CSS) is that one learns by cobbling things together and never through a formal process.
This is literally all programming, and it's not necessarily a bad thing, as long as you keep learning and don't just keep doing the same cobbled-together mess for years and years.
I always thought of it as a datastore. What else is it? A document with one or many tables, each containing rows of highly unstructured records (that can cross reference each other), but essentially it is a data store. Or is this view completely wrong?
There have been multiple points in my life where I've been tasked with creating a plugin for Excel for essentially this purpose.
It is a database though, one that is very accessible to most savvy people (aka, your domain experts) and you should lean on that instead of pushing them out. At least at smaller scales.
One day I discovered "Record Macro". That was handy to automate some simple tasks.
Then I discovered I could make a clickable button, that would activate one of my macros. That was great.
Then I discovered I could double-click (iirc) the button and there, in front of my eyes, was the code. A whole world opened up to me.
I made gigantic programs. Thousands of lines of code. Horrific code. I didn't understand variables or arrays but I had cells and columns! Imagine that.
I became the automation guru at my job. God I hope those Excel files no longer exist. I would die of embarrassment. I built automation tools in my own time. In Excel, of course. Before long I was making enough to quit my day job. Thanks, Excel.
Mine is: Don't waste time finding the "best" program for a particular task. Rather, find one that exposes keybindings for every action in the program, and then commit those bindings to muscle memory. This applies to:
- design tools (Photoshop, Figma)
- media-editing programs (GarageBand, whatever is used for video-editing these days)
- unexpected places (e.g. Trello)
The mouse has only 3-6 degrees of freedom to interact with, and only two digits on one hand operate it. A keyboard has 50+ degrees of freedom, with all ten fingers available for use.
Anecdotally, companies that design software that can be exclusively used by a keyboard tend to think more highly of their customers.
As long as you're optimizing, consider a mouse with a lot of buttons. Lets you use a mouse while at the same time having access to a bunch of keybindings.
Life isn't a support-system for technology. It's the other way around. If there's no joy in it, it's just no good.
And my own thoughts:
"Get better at using technology" is a vast possibility space. You'll never be able to learn, use, or even appreciate all modern technology like a specialist. Be intentional and figure out what you want to get out of your technology use, and learn what you need (maybe from one or some of those specialists) to best achieve it.
For those new to the advice: it's a quote from "On Writing". I first saw it in a Zen Pencils mini comic, and it moved me deeply; it seems like very good advice.
You just have to convince one that your thing would be more fulfilling than writing their own malbolge compiler or whatever they currently do in their spare time.
Most tech companies today work against your interests, to one degree or another -- even the companies that run expensive warm-fuzzy PR campaigns -- so try to minimize how much you have to trust any of them.
I can agree that the interests of companies and individuals are not necessarily aligned, and it's good advice to take this into consideration. I'm not sure it's necessary to add actual bad intentions on the part of all tech companies. In fact, I think it spoils most of what's good about the advice.
Truth be told I think non-programmers are the experts on using technology. When I have a problem I reach for code, the real power users are the ones who don't have that option.
The interesting thing is they are not building things up from first principles, or making new tools, they're using exactly what they have, as is, and developing a process around it that they can teach and share.
Programmers rarely do that. A programmer sharing something means developing it enough to put in the App Store, or convincing you to run their scary shell script.
A non-programmer takes 3 existing things, uses 10% of the functionality from each one, sometimes using one step to undo unwanted stuff from the last step, and does the task reasonably fast without a single hard to explain step or unusual tool.
And these are real world problems, not just stuff coders made up as an excuse to code more.
It's the opposite of what programmers talk about all day, it's not first principles thinking, their basic building blocks are insanely high level million line apps, and yet it all works.
I do think more of them would work from first principles if more (non-programmer focused) programs exposed a set of principles and allowed users to operate on them using their own logic and imagination.
Excel and some image editors do that, and people make pretty cool stuff with them.
2) recognize how this state of things is no different in software from the rest of what people create
3) deal with it the way you deal with it otherwise and apply your existing knowledge accordingly. software really isn't that brilliant most of the time.
You can Google just about anything these days. Or ask ChatGPT, if you're into that. A lot of the time, as the person my friends know to be good at computer stuff, I have no idea and just Google their questions, and seem smart.
Turning something off and back on usually fixes any problem, which extends to apps. This means it's actually hard to break stuff, so go ahead and try things. Play with your apps. Follow tutorials. Try to use features you haven't been 'trained' on. You'll find you can do more than you think, and often you'll improve on the processes you've been told to follow.
Corollary: If your coworkers or employer say this is bad advice that's a massive red flag and you need to find a new employer, unless you work in medicine or nuclear power, in which case maybe it's ok.
Read the error messages, and if they don’t make sense, you can probably google part of them to find other people who have the same problem. In particular, if it produces something like an error code, google that (along with the program name)
Analyze your workflow regularly and see if you find any repetition to your work. It's not worth automating every tedious task you might do but if one is coming up regularly it is very well worth automating.
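As a hedged example of the kind of recurring chore worth scripting (the filenames and CSV format here are hypothetical): merging the weekly reports you'd otherwise copy-paste together by hand.

```python
import csv
from pathlib import Path

def merge_reports(report_dir: Path, out_path: Path) -> int:
    """Concatenate weekly CSVs (assumed to share a header) into one file;
    return the number of data rows written."""
    rows, header = [], None
    for path in sorted(report_dir.glob("week-*.csv")):
        with path.open(newline="") as f:
            reader = csv.reader(f)
            file_header = next(reader)  # keep only the first header
            if header is None:
                header = file_header
            rows.extend(reader)
    with out_path.open("w", newline="") as f:
        writer = csv.writer(f)
        if header:
            writer.writerow(header)
        writer.writerows(rows)
    return len(rows)
```

If the chore happens weekly, even a clunky version of this pays for itself within a month.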
Also reading others and understanding what they wrote - requires effort.
Accept that not everything can be communicated in a two-sentence summary; you have to put effort into reading and understanding.
I am an avid reader and, I hope, not the worst writer, but people either don't read, or read but don't understand and jump to conclusions. That used to cause me a lot of frustration, because I assumed everyone could read at my level, and a lot of people can't: they may be dyslexic, or dealing with a bunch of other issues, even just having a bad day or not having slept well the night before.
Depends on their goal and inclination. But perhaps this: Learning the basics of the Unix shell can be very useful when wrangling text and data, or for operating on lots of files in a batch.
I think learning Excel, plus regex in VS Code or even some random browser-based regex-replace tool, might be more practical.
It's easier to remember when you only need to process text and data a few times a month... and average people will need to do that kind of thing even less.
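For the regex route, here's a minimal sketch of the kind of bulk find-and-replace a browser tool or VS Code would do, written in Python with an invented example (reformatting US-style dates to ISO):

```python
import re

def iso_dates(text):
    # Rewrite MM/DD/YYYY as YYYY-MM-DD; \1-\3 are the captured groups.
    return re.sub(r"\b(\d{2})/(\d{2})/(\d{4})\b", r"\3-\1-\2", text)
```

The same pattern and replacement string work nearly unchanged in VS Code's search box ($1-$3 instead of \1-\3), which is why regex knowledge transfers so well between tools.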
Start thinking about how data moves. This is the simple first-principles trick that lets me understand things. What data is moving where? How is it moving?
If you want to get better at using technology, not just understanding it, I can give you a tip as a UI dev. Just ask yourself: "what would an overpaid designer named Ramona assume about the average user of this app?" Unfortunately, most things have been enshittified and no longer have manuals.
Ramona seems to do a better job than the vast majority of programmers... modern tech tends to be a joy to use unless you're expecting a blank canvas to do things your specific way, which is not what most people want from an app.
Any enshittification I see is strictly in non-UI stuff, like vendor lock-in and subscription models.
A lot of it is sales-side, but so many good tools could literally be a single page with a form, not multi-minute flows. Everything is a sales pitch, not useful.