One of the most important lessons about mass-use computer tools is that overwhelmingly people use defaults.
I first realised this back in the 1990s, when shitty default monitor resolution and refresh rate settings would literally inflict pain (headaches and eyestrain).
MS Windows 95 / 98 shipped with a default of 800x600@60Hz. This was on CRTs, many viewed under office fluorescent lighting, which gave rise to a bloody awful flicker that people mistakenly attempted to correct with "glare screens" and other useless crud.
The solution was simply to crank up the refresh rate slightly. Bumping up the resolution also gave more useful screen area and clearer, easier-to-read fonts. (This was also before ClearType / sub-pixel rendering.)
And yet something like 90-95% of systems monitored ... somehow (possibly via an early Web tool) ... reported 800x600, and I believe a 60 Hz refresh rate. This on systems which could easily have been bumped to 1024x768 or better.
And this was literally a click away, right off the desktop.
Lesson Younger Me learned: users won't change any default.
So:
- Pick really good defaults. Autoconfigure for best experience if possible.
- If you want to cram things down users' throats, you can just change the default experience, especially on a SaaS app.
I agree with your general point, but the specific example isn't as egregious as you make it out to be. In the mid-90s, protocols for letting the PC determine a monitor's supported resolutions were just coming out. The first widely adopted protocol was DDC2, which wasn't standardized until mid-1996. Had Windows 95 or 98 defaulted to 1024x768@72Hz, a significant fraction of users would have upgraded and found that their displays didn't work. To get a usable system, they'd have had to figure out how to boot into safe mode and manually tweak the display settings.
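To make the trade-off concrete, here's a minimal sketch (purely illustrative Python; the mode list and the `pick_default_mode` helper are made up, not any real Windows or VESA API) of the autoconfigure-with-fallback logic that DDC/EDID eventually made possible: use the monitor's advertised modes when it reports them, and stay on a conservative mode when it can't.

```python
# Purely illustrative sketch, not any real Windows or VESA API.
SAFE_FALLBACK = (640, 480, 60)  # a mode virtually any CRT of the era could display

def pick_default_mode(edid_modes):
    """Pick the best mode the monitor advertises, or a conservative fallback.

    edid_modes: list of (width, height, refresh_hz) tuples reported via
    DDC/EDID, or None/empty if the monitor can't report anything (pre-DDC2).
    """
    if not edid_modes:
        # No capability data: a too-ambitious default could leave the user
        # staring at a blank screen, so stay safe.
        return SAFE_FALLBACK
    # Prefer more pixels first, then the higher refresh rate (less flicker).
    return max(edid_modes, key=lambda m: (m[0] * m[1], m[2]))

# A mid-90s CRT that does report its modes:
print(pick_default_mode([(640, 480, 60), (800, 600, 72), (1024, 768, 75)]))
# -> (1024, 768, 75)

# An older monitor with no DDC support:
print(pick_default_mode(None))
# -> (640, 480, 60)
```

Without something like that capability query, the only safe default was the lowest common denominator, which is exactly what shipped.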
I'm pretty sure this was 1999--2000 or thereabouts.
It's possible that standards were still too rough, but the statistic left a strong impression on me. I believe the general concept is validated elsewhere as well; I'm pretty sure Jakob Nielsen has a similar finding.