The problem is that Ubuntu is not focusing on the fundamentals and is not putting enough money into testing.
I know full well that Ubuntu was conceived because Shuttleworth felt that Linux was ready for the big time, and that Red Hat et al. had too much of a penchant for devoting resources to low-level bickering that ultimately had relatively little effect on the average end user, instead of focusing on improving the user experience. But now that Ubuntu has moved the user experience so far forward, they should reconsider that mission. The places where Linux is most lacking are low-level compatibility for things like fast 3D acceleration and power management.
Canonical should use some of its funds (aka "Mark Shuttleworth's money") to buy the 100 best-selling laptop models each year, set rigorous testing standards, spend six months developing a new release, and then take however long is necessary to make sure everything passes on the last three years' best-selling laptops. In this process, they should not be shy about contributing to X, the kernel, etc., and should distribute patched versions of these if necessary to get compatibility.
That, combined with Ubuntu's user experience work, is what will really make Linux a completely viable desktop computing platform. Far too often things break between releases and/or upgrades.
"The problem is that Ubuntu is not focusing on the fundamentals and is not putting enough money into testing."
I think you made my point more succinctly than I did.
I think the UX was just fine in last year's Ubuntu. You could argue about whether Gnome 2.x was as slick as Windows 7 or OS X (I think it was at least better than Windows), but I don't think there's much doubt the system was usable by non-geeks. The first thing to break for me was suspend/resume, with one of 10.10's kernel updates. I reverted to an older kernel and hoped 11.04 would fix the problem. It didn't, and it precluded running the older, working kernel. Performance also got worse, and I can't think of any noteworthy improvements as I didn't consider Unity ready for prime time.
So then 11.10 came out. Reviews said Unity was great now and everything ran smoothly, so I pulled the trigger on the upgrade. Unity did, in fact, mostly work, though it was slow and glitchy. Oh well, back to Gnome Classic. Of course, it's Gnome 3 now and I can't even move the clock. That won't do, but I've heard the new gnome-shell is awesome, and I have a video card that can handle it. It loads, slowly, but UI components sometimes vanish when I try to interact with them. Eventually, X crashes. Oh well, that gives me an opportunity to see what progress KDE 4 has made. I can report that the error messages for Plasma crashing look like they've had some attention from a designer since the last time I saw them. Good work.
I'm running Linux Mint Debian Edition with Xfce now. Still no suspend/resume, but everything else works. The Linux desktop experience is almost back to where it was two years ago. Yay leadership!
So the good news is you can move the clock in Gnome 3 Classic. All you have to do is hold Alt, right-click the widget, and move it. I mean... that's completely obvious, right?
This has to be one of the worst UI decisions I've ever encountered...
"Far too often things break between releases and/or upgrades."
I recently tried to upgrade my older 9.10 install to 10.04 LTS. I got a bunch of errors about X.org during the upgrade (I think because I had downloaded and installed Nvidia's Linux driver by hand a while ago), and, yep, you guessed it: a hosed system upon reboot.
This sort of showstopper problem should not be occurring - not in the year 2011. Totally inexcusable as far as I'm concerned.
I'm a Linux user, full stop, and I generally do some eye-rolling in these threads, where someone comes along and says something like "after 10 years of Windows, I tried Ubuntu for a day and it sucks", but the hardware regressions are indeed maddening. If something worked once, it should not stop working.
"The problem is that Ubuntu is not focusing on the fundamentals and is not putting enough money into testing."
Agreed.
As an example, if Ubuntu had instead teamed up with Adobe, Mozilla (for Firefox), and Google (for Chrome) and shipped a fix for the Flash problem that's been around for years (referring to Flash's instability in Firefox, Chrome, etc.), I think they would've pleased far more users with a much more subtle change than the massive UI revamp that is Unity.
Total agreement. I've often thought the same thing about hardware. Ubuntu doesn't need to run on every laptop out there. They need to pick their fights. Not even 100, say 20.
Or they could make an unholy alliance with one or two manufacturers to ensure (NOT excluding other manufacturers, mind you) top-notch compatibility on a high-end, a mid-range, and a netbook-class unit.
That's it. They would do so much better like that.