I think if you have a very specific role where your workload is constant, it makes sense. I'm an independent contractor and work across a lot of different projects. Some of my client projects require running a full Rails/Worker/DB/Elasticsearch/Redis stack. Then add in my dev tools, browser windows, music, Slack, etc., and it adds up. If I want to run a migration for one client in a stack like that and then switch gears to make progress on a different project, I can do that without shutting anything down. Running a VM, for instance: I can boot one with a dedicated 8GB of RAM without compromising the rest of my experience.
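To put rough numbers on that (every per-process figure below is my own assumption for illustration, not a measurement), here's a quick Python back-of-the-envelope showing how fast 16GB disappears:

    # Rough memory budget for one client's full stack plus everyday tools.
    # All figures in GB are illustrative assumptions, not measurements.
    workload_gb = {
        "Rails app + background workers": 2.0,
        "PostgreSQL": 1.0,
        "Elasticsearch (JVM heap + overhead)": 2.5,
        "Redis": 0.5,
        "Editor/IDE + language servers": 2.0,
        "Browser windows": 2.0,
        "Slack + music + misc": 1.5,
        "Dedicated VM for a second project": 8.0,
    }

    total = sum(workload_gb.values())
    print(f"Estimated working set: {total:.1f} GB")  # ~19.5 GB, already past 16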
That is why I think 16GB is table stakes. It is the absolute minimum anyone in this field should demand in their systems.
Honestly, the cost of more RAM is pretty much negligible. If I'm buying laptops for a handful of my engineers, I'm surely going to spend $200 x 5, or whatever the cost is, once to give them all an immediate boost. The cost/benefit is strong here.
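As a rough sketch of that cost/benefit (every number here is an assumption, not a quote):

    # One-time RAM upgrade cost vs. a rough value of recovered time.
    # All figures are illustrative assumptions.
    engineers = 5
    upgrade_cost_per_machine = 200   # USD, one-time
    hours_saved_per_week = 0.5       # less swapping and app juggling
    loaded_hourly_rate = 75          # USD per engineer-hour
    weeks_per_year = 48

    upfront = engineers * upgrade_cost_per_machine
    yearly_value = engineers * hours_saved_per_week * loaded_hourly_rate * weeks_per_year

    print(f"Upfront cost:       ${upfront}")              # $1,000
    print(f"Rough yearly value: ${yearly_value:.0f}")     # $9,000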
All of this is doable in 16GB; I do it every day with a 3.5GB Windows 10 VM running and swap disabled. There are other options as well, such as closing apps or running things in the cloud.
Update: re-reading your comment above, I realized I misread your post and thought you were suggesting 32GB was table stakes... which isn't quite right. Much of what follows is based on that original misreading.
I'm not convinced that going from 16GB to 32GB is going to be a huge instant performance boost for a lot of developers. If I were given the choice right now between getting one of these machines with 16GB and getting an Intel with 32GB, I'd probably go with the M1 with 16GB. Everything I've seen about them suggests the trade-offs are worth it.
Obviously we have more choices than that though. For most of us, the best choice is just waiting 6-12 months to get the 32GB version of the M? series CPU.
I've seen others suggest that 32GB is table-stakes in their rush to pooh-pooh the M1.
Personally, I'm a developer who went from 16GB to 32GB just this past summer and have seen no noticeable performance gains, just a bit less worry about shutting down my dev work in the evening when I want to spin up a more resource-intensive game.
I agree with this. I don't think I could argue it's table stakes, but having 32GB and being able to run 3 external monitors, Docker, Slack, Factorio, Xcode + Simulator, Photoshop, and everything else I want without -ever- thinking about resource management is really nice. Everything is ALWAYS available and ready for me to switch to.
People have been saying this kind of thing for years, but so far it doesn't really math out.
Having a CPU "in the cloud" is usually more expensive and slower than just using cycles on the CPU which is on your lap. The economics of this hasn't changed much over the past 10 years and I doubt it's going to change any time soon. Ultimately local computers will always have excess capacity because of the normal bursty nature of general purpose computing. It makes more sense to just upscale that local CPU than to rent a secondary CPU which imposes a bunch of network overhead.
There are definitely exceptions for workloads with particularly large CPU/GPU demands or particularly long jobs, but most developers will be running locally for a long time to come. CPUs like this just make it even harder for cloud compute to make economic sense.
As someone who is using a CPU in the cloud for leisure activities, this is spot on. Unless you rent what basically amounts to a desktop, you're not going to get a GPU and high-performance cores from most cloud providers. They will instead give you bread-and-butter, efficient, medium-performance cores with a decent amount of RAM and a lot of network throughput, but inevitable latency. The price tag is pretty hefty. After a few months you could just buy a desktop/laptop that fits your needs much better.
In the mid-1990s, Larry Ellison proposed a thin client: basically a dumb computer with a monitor and a NIC that connected to a powerful server.
For a while we had the web browser, which was kind of like a dumb client connected to a powerful server. Big tech figured out they could save money by pushing processing back to the client with JavaScript frameworks. Maybe if ARM brings down data-center costs by reducing power consumption, we will go back to the server.