Burn My Windows (github.com/schneegans)
764 points by marcodiego on Jan 4, 2022 | 275 comments



Compiz was probably the single most impactful event for desktop Linux of the mid-2000s. Not that it itself made much of a difference, but its ripples affect many areas we take for granted today. A few reasons:

  - it put Linux ahead of Windows and Mac in terms of appearance;

  - it brought many new users, most of them a good mix of technical users and enthusiasts;

  - it showed the advantages of modular software;

  - many plugins were useful and these useful plugins influenced desktops to this day;

  - it was fast, stable and cool enough;

  - it brought many new developers;

  - it was an incentive for vendors to improve 3d linux drivers;

  - it made X.Org developers improve redirection;

  - it came by default on the most popular distro from 2006 to 2012.
Yes, most of the effects were useless, but even those helped developers and designers decide what not to include or do in the future. It pioneered useful things like selecting an area of the screen and saving it directly to a file, useful zoom, and quick visualization of non-visible windows. It also showed how important compositing was on the desktop. Although probably not a direct influence, there is a reason Android, Wayland and whatever comes with ChromeOS all have compositing features.

At the time, there were some interesting developments and experimentation: Métisse, Sun's Looking Glass, BumpTop, DeskGallery... none of them was as successful as Compiz. I'm proud that I myself was part of it (https://www.youtube.com/watch?v=-X9bcrJ3TjY) and have my name written in some of its source files to this day, even if almost nobody uses it anymore.


It also taught me the value of enthusiast-built software. A nearly blind friend of mine was able to use Linux for music recording in Ardour (thanks again Ardour/Paul, you rock) with the ezoom function in Compiz! The only problem was that it had a limit of something reasonable like 8x, but my friend often required a bit more than that. I emailed the maintainer and he added it within an hour! We were both so used to dealing with the exploitative zoom-software vendors on Windows, who charged an arm and a leg for support and only added new features in expensive future upgrades, that our minds were totally blown. Thanks again, Kristian! If you ever read this, you really made our year.


The physicality that Compiz and wobbly windows brought to the desktop was a huge boon to my productivity. Everything was low latency, high framerate, and just felt real. I could rotate my virtual desktops around and they felt like actual spatial locations for organization. I could drag windows and they felt like quasi-tangible objects, not just abstract rigid platonic rectangles.

It was far more than just a gimmick, and I really miss the effects today.


I have to say the same. It makes the computer "feel" better.

It reminds me of discussions around Mac vs PC in the early days. The experience of using a Mac was more "fluid" than Windows. The Mac would draw windows faster, move the mouse faster (as in more refreshes of the cursor position per second) and that made it more comfortable to use. At the same time I also used Sun and SGI boxes regularly and the stark difference between the jerky mouse movement of the Sun and the fluid, Mac-like, movement of the SGI made the former an inferior experience (even though I liked OpenWindows over SGI's window manager whose name I forgot).

I'd love to have wobbly windows back.


Your comparison of Mac vs PC user experience reminds me of a criticism I read of Java's cross-platform GUI implementation. The critic compared Java GUIs to an episode of Star Trek where aliens had created a facsimile of a room and table of food, but when the Enterprise crew tried to eat the food they gagged on it. It turned out the aliens, working off of visual TV broadcasts, hadn't realized the food was supposed to taste like anything.


Are you sure you're not talking about the feast scene from Galaxy Quest?


>...SGI's window manager whose name I forgot

4Sight!

https://wiki.preterhuman.net/4Sight_Window_System

http://www.vintagecomputers.info/pitechrep.html

http://www.bitsavers.org/pdf/sgi/iris4d/007-2001-030_4Sight_...

4Sight Programmer's Guide: GL/DGL Interfaces. NeWS. Window Manager.


If you use KDE, wobbly windows are built into its KWin compositor. :) I forget the exact details of how to get to it, since I don't use KDE anymore (long-ish story), but even most of the silly things Compiz did are a configuration checkbox away in KDE.


Yeah, but it just doesn't quite manage the Compiz/Beryl feel. It is difficult to put a finger on, but whoever did Compiz wobbly windows had an eye for it that the KDE devs don't.

Using both, I can't imagine getting excited by KDE wobbly windows in the same way that Compiz could get a "wow!" by demoing wobbly windows -> desktop cube.

It was a short, beautiful peak for Linux on the desktop: a high was reached, and then desktops just fell away from it.


It's under Desktop Effects.


It is incredibly ironic that Mac support for high refresh rates is worse than Windows or Linux today. It isn't possible to run an Intel Mac at 4k 144Hz any more, and this is a software issue not a hardware one.

https://discussions.apple.com/thread/253168625


https://wayfire.org — you're welcome :)


Yes, it's a strange effect. I initially dismissed wobbly windows and the desktop cube as very gimmicky bling, but it really did make a huge difference to the feeling of the desktop.


It’s funny how many things go like that. I agree on wobbly windows for floating window managers, though I use a tiling window manager now. In a completely different space, I found similar with expression-orientation in Rust: coming from languages like Python, I initially presumed it to be a gimmick that just let you omit the `return` keyword, but quickly found it was bound up in a better way of thinking about data flow and grasping what’s going on (much like wobbly windows), and now statement-oriented languages are just galling (but I still use JavaScript regularly).


Wobbly windows are still available in KDE Plasma Desktop Environments (I use Kubuntu, but there are many others).

Settings > Workspace Behavior > Desktop Effects > Enable Wobbly Windows
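
For the terminally inclined, the same toggle can probably be flipped from a shell as well. A minimal sketch, assuming the kwinrc group and key names below (kwriteconfig5 and KWin's D-Bus reconfigure call are standard Plasma tooling, but the exact key name may differ between Plasma versions):

  $ kwriteconfig5 --file kwinrc --group Plugins --key wobblywindowsEnabled true
  $ qdbus org.kde.KWin /KWin reconfigure  # ask KWin to reload its configuration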

Enjoy :)


I just tried (on the most recent versions, with Fedora 35). It did not work yet :( Maybe it will after a restart.


Interesting. On my KDE 5.18 (Kubuntu 20.04 LTS) Wobbly Windows worked as soon as I clicked Apply.


You can still use compiz today if you use a modular desktop like MATE or XFCE.


I remember when Compiz was released. It was mind-blowing; its sheer awesomeness was something out of this world. "OK, Windows is done," my friends and I thought, "prepare for Linux dominating the desktops."

Yet ~15 years later here we are.


I remember Compiz being cool looking, but resulting in mostly crashing my computer or freezing the GPU driver.


I never had any issues at all with it. I believe it was all tied in with how stable your graphics card driver was. I grew tired of it after a few months and went back to boring old xfce flat, square windows, but it was fun while it lasted and people were always impressed by it.


I miss that feeling. Desktop moves at a much slower pace today.


Me too.

Desktop "progress" is now sadly now mostly done by Microsoft (geriatric at best) and Apple (mostly just implementing the new graphical design whims every year).


My god I wish companies would stop iterating on desktop “default” design. The only features on a Mac I use are Cmd+space and Cmd+tab. Beyond that it’s just a host. Every time they add crap I have to find and disable it all.

By all means make cool stuff but also make it opt in.


So much this.

Things I need my OS to do are: run my apps (office/browser/games), connect my devices (printers/game controllers/displays), basic file operations (copy/paste/delete)

And while doing this be: secure, reliable, out of my way.

All these added people/chat/weather widget "innovations" (looking at you, msft) make me throw my hands up in the air and ask "why". I wish they would spend that time and energy on security and reliability instead.


Ahh, that weather widget was so bad. I had to use Windows for one project, and one day that abomination just appeared there. It rendered very badly (the text didn't look at all like other taskbar elements), it was in the wrong language (???) and it was unnecessarily hard to remove. It felt super unprofessional, something I'd expect from browser malware.


I feel the same about the control panels. Windows 11 has like three or so? I can't keep up; thankfully they added a search function some time back, so I only have to remember the keywords.


I don't miss desktop progress the way it was done on Linux between 2000 and 2015 (ish). It was exciting, but now I'm happy with what I have with Plasma 5. It is a fast, stable, solid desktop environment.

Plasma 5 is great and ages like good wine. The interface does not change much, but receives big improvements each version that don't break habits. Focus is on fixing usability bugs. See the weekly posts at [1] for instance.

It gets refined instead of changed.

[1] https://pointieststick.com/


That's not right. Gnome Shell overthrew the whole concept of a desktop in the past years, KDE never stopped inventing, and there are hundreds of small WMs for every taste, with hugely different concepts.

Microsoft and Apple, by comparison, are both stuck running the same Windows 95 approach for all their desktops ever since.


You might well be right. I was at university during that time, and people would legitimately see that someone else had wobbly windows and cool effects and end up getting Linux because of it.

Just looking at all the options in the Compiz window was exciting. I can have fish inside my desktop cube?


I instinctively expected to hear the notes of "Here Comes the Hotstepper" when I clicked on that video. I was expecting https://www.youtube.com/watch?v=xC5uEe5OzNQ, which was somewhat popular to send around to show how Compiz was so far ahead that it could not just replicate Vista's interface, but could also better it in some aspects, like the famous 3D cube.

Thanks to you and others for working on it! Looking at the video almost 15 years later, I feel wistful for the joy that accompanied some of these desktop effects, and wonder where it has gone today.


Shocking to see what passed as acceptable video quality back in 2007.


  $ yt-dlp -F 'https://www.youtube.com/watch?v=xC5uEe5OzNQ'
  [youtube] xC5uEe5OzNQ: Downloading webpage
  [youtube] xC5uEe5OzNQ: Downloading android player API JSON
  [info] Available formats for xC5uEe5OzNQ:
  ID  EXT   RESOLUTION FPS │    FILESIZE  TBR PROTO │ VCODEC       VBR ACODEC      ABR     ASR MORE INFO
  ───────────────────────────────────────────────────────────────────────────────────────────────────────────────
  sb2 mhtml 48x27          │                  mhtml │ images                                   storyboard
  sb1 mhtml 60x45          │                  mhtml │ images                                   storyboard
  sb0 mhtml 120x90         │                  mhtml │ images                                   storyboard
  139 m4a                  │     1.55MiB  47k https │ audio only       mp4a.40.5   47k 22050Hz low, m4a_dash
  140 m4a                  │     4.13MiB 127k https │ audio only       mp4a.40.2  127k 44100Hz medium, m4a_dash
  251 webm                 │     4.11MiB 126k https │ audio only       opus       126k 48000Hz medium, webm_dash
  17  3gp   176x144      8 │     2.67MiB  82k https │ mp4v.20.3    82k mp4a.40.2    0k 22050Hz 144p
  160 mp4   192x144     25 │  1006.62KiB  30k https │ avc1.4d400b  30k video only              144p, mp4_dash
  278 webm  192x144     25 │     1.70MiB  52k https │ vp9          52k video only              144p, webm_dash
  133 mp4   320x240     25 │     2.01MiB  61k https │ avc1.4d400d  61k video only              240p, mp4_dash
  18  mp4   320x240     25 │     9.42MiB 290k https │ avc1.42001E 290k mp4a.40.2    0k 44100Hz 240p
  242 webm  320x240     25 │     2.81MiB  86k https │ vp9          86k video only              240p, webm_dash
None of these will be the format it was originally uploaded in, but I’d guess format 133 (~61 kbps) to be the closest for the video part, at 2MB. Judging by the sound of it, the original audio bit rate was probably higher than the video bit rate!
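
If anyone wants to grab that guess directly, yt-dlp's -f flag selects format IDs and can merge a separate video and audio stream (ffmpeg needs to be installed for the merge):

  $ yt-dlp -f 133+140 'https://www.youtube.com/watch?v=xC5uEe5OzNQ'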


I've been looking for a youtube-dl with saner, parsable -F output. Thanks for the suggestion!


Without checking, I would guess it was more related to limited upload speeds: video was probably recorded at a much higher bitrate, but then uploading that...


It was just plain fun! Nowadays, the only thing to look forward to is what functionality the Gnome devs ripped out this time...


But they did that back then as well. The Quartz compositor and Quartz Extreme are older.

It has always been like this.


The way I remember it, I never managed to get the original Compiz to work right, the Beryl fork worked out-of-the-box with no hacking around, and then Compiz Fusion (when Beryl was merged back into Compiz) lost like 95%+ of what Beryl could do.


+1 to this. Back in the mid-to-late 2000s, I used to have a modded Compiz theme which was heavily inspired by The Matrix and looked absolutely spectacular. It included things like windows being "hidden" unless you used a key combo to "see the Matrix", a bunch of animation effects, some sound effects from the movie, etc.

I can bet that more folks in school wanted to try out Linux (this was around the time when Canonical was sending out free installation CDs to whoever wanted to try out Ubuntu) because they thought it was "cool" rather than because of any of the other philosophical or technical arguments for Linux.


I thought Compiz was interesting, and it did have an impact.

That said, in the mid-2000s Linux was better but still had issues getting video etc. configured properly (depending on what you wanted to do and your hardware). I specifically moved to OS X because video just worked.


Playing videos with Xv worked well for me on Linux in the early 2000s. I can never get proper hw-accelerated video to play without bogging the CPU down on my Linux laptops anymore (Thinkpads with Intel low-voltage CPUs).


I was more of a FreeBSD person at the time, promiscuous enough to try various things, but 2000s ThinkPads and FreeBSD just worked at some point, and I never changed that system. I used OS X otherwise. Printers were the other Linux/OS X usage gap.


https://github.com/Schneegans/Burn-My-Windows/commit/b5f9118...

    // This effect is a homage to the good old Compiz days. However, it is implemented      //
    // quite differently. While Compiz used a particle system, this effect uses a noise     //
    // shader. The noise is moved  vertically over time and mapped to a configurable color  //
    // gradient. It is faded to transparency towards the edges of the window. In addition,  //
    // there are a couple of moving gradients which fade-in or fade-out the fire effect.    //
https://github.com/Schneegans/Burn-My-Windows/blob/main/src/...
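
The comment is nearly a recipe by itself. Purely as an illustration of the technique it describes (this is not the project's actual shader; the uniform names and the noise() helper are made up), a fragment shader along these lines could look like:

    // Illustrative sketch only: uniform names and noise() are hypothetical.
    uniform float     uTime;     // seconds since the effect started
    uniform float     uProgress; // 0..1 progress of the close animation
    uniform sampler2D uGradient; // 1D texture holding the configurable color gradient
    varying vec2      vUv;       // window texture coordinates

    float noise(vec2 p);         // any 2D noise implementation, defined elsewhere

    void main() {
      // Move the noise field vertically over time so the flames appear to rise.
      float n = noise(vec2(vUv.x, vUv.y + uTime) * 10.0);

      // Map the noise value to a color along the configurable gradient.
      vec4 fire = texture2D(uGradient, vec2(n, 0.5));

      // Fade to transparency towards the left and right edges of the window.
      fire.a *= smoothstep(0.0, 0.1, vUv.x) * (1.0 - smoothstep(0.9, 1.0, vUv.x));

      // A moving gradient fades the fire in and out as the animation progresses.
      fire.a *= 1.0 - smoothstep(uProgress, uProgress + 0.2, vUv.y);

      gl_FragColor = fire;
    }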


I got into Linux largely because of how cool compiz was. Wobbly windows legit created my entire career.


Yes, Compiz: yet another point where Linux could have grabbed massive mindshare in the desktop market, when there was massive ... what do I say ... confusion? incompetence? exhibited by Windows especially.

And... it went nowhere. A shot at a truly amazing desktop that would put it even above OSX.

Squandered.

The desktop market is STILL, ten years after windows 8, STILL ripe for the taking.


Sorry, no one gives a shit about these. The majority of users hate visual effects: they add useless delay to the task they are doing, tasks that in some cases are already repetitive and tedious without the delay. Same for modern web design, just egotistical style over function.

Windows won because of the software written on it, OS X because of the exceptional hardware that comes with the OS. Things like Compiz are for nerds and inconsequential in the grand scheme of things.


Linux is still far superior when it comes to window managers.

If you like the Windows 95 approach both Microsoft and Apple are using, you have KDE/Plasma: mature, stable and full of helpful gimmicks.

If you like the out-of-your-way approach and still want all the luxury of a modern desktop (a bit like Windows 8 wanted to be), there is Gnome Shell.

If you don't want to waste resources, just try something like Openbox, and if you hate desktops altogether, try something tiled like awesomeWM.

If desktops were any argument for the average user, nobody would be using Windows today.


Also - it was just fun... I don't know why that can't be more of a consideration in modern UX.


I was already dabbling in Linux with shared hosting for my hobbyist website, but Compiz made me lean into Linux and development full tilt.

Showing friends the crazy 3D desktop, the wobbling, burning windows, and all of the other crazy customization and effects it provided gave me a kind of unique confidence and excitement in my explorations. It was like jet fuel for learning. Bash, vim, the Unix philosophy, Python - all things I got sucked into because I liked the aesthetics and promise of Linux. Lessons that outlived the window manager and paved the way for my career.

Compiz couldn't have done a better job.


I really miss this Linux era. Yeah, probably all these effects were not useful, but at least we had something innovative and fun going on in the Linux community; now the best we can hope for is to copy Apple interfaces...

Don't get me wrong, I appreciate the quality of the Linux desktop of today, but I don't see anything really different from Windows/Mac apart from getting subpar support from many software/hardware vendors.

Maybe it's just nostalgia, or maybe I'm growing old and grumpy...


I don't know what Compiz is (not a Linux user), but I seem to remember something like this mod existed for Windows in the 90s/00s.


WindowBlinds had some similar features, but I’m pretty sure Compiz almost immediately did far more. The state of the art third party software on Windows in the 2000s tended to try to emulate Vista Aero on XP, or skin Vista/7 differently but with similar functionality. The Molten theme from WindowBlinds 6 is somewhat reminiscent of burning windows, but I don’t think it actually did that. Maybe at some point you could burn the start menu down, I can’t remember.

Some stuff, like the desktop cube, could be emulated, but not too well. On Compiz, everything was still being drawn during the animation, showing off the modular compositing that it enabled, whereas most desktop cube toys didn't update the screen while rotating, making them less impressive.


Stardock's WindowFX is the Compiz equivalent for Windows. It still seems to exist: https://www.stardock.com/products/windowfx/

I remember trying it in the XP/Vista era alongside Compiz/Beryl and it was quite unstable. As a result it never reached the popularity of the (mostly usable) WindowBlinds.


Oh man, I forgot all about Stardock, and about how much I played Sins of a Solar Empire at one point. Talk about a game that should have been waaay more popular. I was (and still am) impressed by how Stardock made Windows apps and games - until I saw that, I had never realized that a mostly programming-utility house could also do games, and do them extremely well.


WindowBlinds was OK, but the performance was pretty bad, and I don't think it could live-update the windows in the Exposé-like view. It just displayed screenshots. Compiz, on the other hand, was silky smooth.


Yeah, Compiz could literally play a movie across two or three sides of the cube, with the video player "bent" at 90°. It felt quite impressive that the graphics card could even handle that.


Yes I think that was it! I never used it for Windows XP but I definitely used it for earlier versions of Windows.


Are you remembering Desktop Destroyer[0] perhaps?

0: http://stressreliefpig.com/games/downloadable-desktop/deskto...


Random, low-confidence, and trollish theory: Compiz et al. had the opposite effect. The number of different compositor/effect projects increased the already substantial fragmentation of the Linux desktop world, and put a large amount of developer energy into things that didn't end up having influence over the long haul.


I clicked on the YouTube link expecting to see Compiz examples, and was extremely pleasantly surprised to hear the music of Top Racer on SNES (I believe the game might've been called Top Gear 1 in America; in Asia, Top Gear was the sequel to the first Top Racer game). One of my favorite SNES soundtracks of all time.


I loved it; then, as a result of the constant slowdowns, uselessness and desktop crashes that came with Compiz, I grew to love low-effect, dark-style, simple UIs even more. But god was it cool to brag.


I think it also pioneered GPU acceleration in user interfaces.


Even though most of it was useless, I'm still waiting to see native zooming come back. For the visually impaired, with today's high resolutions this is more relevant than ever.


What does “mid 2000's” mean? I don’t mean to be dense, I just can’t figure it out.


Mid '00s if you prefer.

Referencing decades in two-digit form fell out of favour with the Y2K issue. Perhaps it should be resurrected.

At the turn of the 20th century, fashion was to refer to "oh-eight" and such, IIRC. I don't know that the decade had a common nomenclature. I suspect there's a Wikipedia article on that somewhere....

Hrm ... not really, or at least not readily.

https://en.wikipedia.org/wiki/Decade


Lol, I was thinking about this yesterday when I was explaining school years to my grade schoolers. You know, 2nd grade was 2020-2021 and third is 2021-2022.

Halfway through I was thinking, we could probably go back to 2-digit years now...


We're just starting the 20s. I don't think I ever heard of the decade from 1910 to 1919 being referred to as the 10s, but I definitely heard of the 20s. Maybe the first 20 years of a century are kind of hard to name, so, yeah it's about time.


I'm pretty certain "teens" was commonly used to refer to the years 1910--1919.


Can we now go back to storing year as 2 digits? /s


Show HN: How we saved $2MM in AWS transfer fees by storing years as 2 digits (2051)


By that time $2MM will buy you a Pepsi and a broken record player.


Many would pay decent money for a player that can revive broken records!


There's also the linguistic thing.

In English, verbal reference to "1999" was "nineteen ninety-nine". But "2000" was "two thousand" rather than "twenty-aught" or "twenty-oh-oh".

There's an interesting aside here: the epochal 1968 science fiction film classic, the Stanley Kubrick / Arthur C. Clarke collaboration 2001: A Space Odyssey, was conceived as "twenty-oh-one" as I recall,[1] but came to be referred to largely as "two thousand and one". Whether this was simply due to the same psychological / verbal awkwardness of "twenty oh-one", or itself shaped subsequent usage, I'm not sure.

I do note that the years in the span 2000--2009 are typically referred to in my experience as "two thousand one ... two thousand nine", but from 2010 onwards the pattern is far more typically "twenty ten", "twenty eleven", ... "twenty twenty-two", etc. There seems to be a similar pattern for the span between the years 999 and 1010, at least as far as I'm aware: "nine ninety-nine", "one thousand", "one thousand one", ... "one thousand nine", "ten ten", "ten eleven", ...

If I can nerd-snipe anyone into tracing what usage, patterns, history, and psychology of millennial-decade-span verbal year references are and why ... well, I'm sorry, I guess?[2]

________________________________

Notes:

1. I'm pretty certain Clarke references this in his many essays. Probably in The Lost Worlds of 2001

2. My first nerd-sniping target is of course myself... Some easily-found results:

- "How to write dates" notes the "two thousand" / "twenty ten" distinction: https://blog.harwardcommunications.com/2017/09/28/how-to-wri...

- NPR dedicated a story to the pronunciation of "2010" ... and reaches no conclusion, though it comes up with another cultural reference, "In the Year Twenty-Five Twenty-Five", in addition to my suggestion of 2001: A Space Odyssey. And an ominous reference to "the year we'll all be talking about: 2020"...: https://www.npr.org/templates/story/story.php?storyId=120470...

- This Reddit thread discusses rationales, precedents, and a few further references: https://old.reddit.com/r/linguistics/comments/rrzzcd/why_can...

- NY Times: "Naming the '00s" (2009): https://www.nytimes.com/2009/11/15/weekinreview/15segal.html

- Wikipedia seems to suggest that the 0--9 years of millennial centuries (1000, 2000, 3000, etc.) are pronounced as "n thousand", with the "twenty", "thirty", "forty"... pattern following for years after the year 10 of those millennial centuries, though without citation: https://en.m.wikipedia.org/wiki/English_numerals#Dates

I'm not sure what AP, Chicago, or MLA style guides suggest.


I've heard it referred to as the aughts (or, in the UK, the naughties). A useful way of distinguishing it from the other decades of the 2000s.


Also noughts or "noughties".


The English-speaking world still hasn't really agreed upon a term for that decade, especially in America, where "aught" and "nought" are rarely used words. (We tend to use "zero" instead, and "the zeroes" doesn't exactly roll off the tongue.)

https://en.wikipedia.org/wiki/Aughts


Yeah, at best we have the-turn-of-the-century (millennium?). Though mid-turn doesn't seem quite right either.


Years 2003(ish) to 2008(ish).


Thanks. I was being dense. The first thing I thought was "around 2500?", which is silly.


Bunch of rose-tinted hogwash, this.

> It pioneered useful things like selecting an area of the screen and saving it directly to a file

Pioneered? This was a standard feature on Macs since January 1997.

> It also showed how important compositing was on the desktop.

No, that would be the Quartz compositor and Quartz Extreme, released with Mac OS X 10.0-10.2 years prior.

Compiz in the mid-2000s was a mixture of catching up to established ideas and a bunch of cute but useless visual wastes of time. It didn't pioneer anything except novelty display plugins, and was quickly made obsolete when people realized that wobbly transition effects got very old very fast.


"When disagreeing, please reply to the argument instead of calling names. 'That is idiotic; 1 + 1 is 2, not 3' can be shortened to '1 + 1 is 2, not 3."

https://news.ycombinator.com/newsguidelines.html


I'll have to agree with this. Compiz was great at drawing curious people in but I'd argue that Compiz helped many people write off Linux. It definitely made Windows XP look outdated, but after you scratch the surface and adopt Linux, you discover a world of hurt, whether it be bad drivers that break your desktop and throw you to a different run level, poor interoperability between the different components, or just the general loss of trust when something catastrophic happens, like the stupid USB stack corrupting your drive when you copy a simple file (yes, this happened to me a few times over the years).

Compiz resulted in tons of YouTube videos showing how cool it was, but it was a gimmick. An OS is much more than cool looking visualizations, and at the time it was introduced Linux was less stable, so people came for the looks and then left because it's Linux.

I still want to believe that Linux will become the king because we have lost so many freedoms over the years. As a result, every year I install a clean copy of Ubuntu on to my PC, start using it and then stop when I discover some serious bug. After that I put it back on the shelf and wait until next year. Maybe next year it will be better. This whole journey began during the Compiz era.


> An OS is much more than cool looking visualizations [...]

At the time, it was easy to hear people complaining about how ugly Linux was. Compiz helped a lot on that front. We were used to people saying "you can't do that on Linux", and then things quickly changed to us saying "you can't do that on your OS".

Let's not fake it: Linux is still far from being a diamond of UI design or consistency, but, well, competition has its own problems too. The point is: things improved a lot, and that event at that point in history made things improve a bit faster. To the point that almost two decades later, something like this gets to the front page of Hacker News and is filled with comments from people with fond memories of the time.

> This whole journey began during the Compiz era.

More evidence of the impact it had.


> Linux is still far from being a diamond of UI design or consistency

While I'll agree those are important for usability, I'm not sure they're necessary for adoption. Windows 10 uses a mix of UIs ranging from Win32 Windows 95 legacy to MAUI, most popular programs implement their own UI frameworks, and it's doing OK.


A lot of people don't care about the experience of using a computer. If they did, there would be revolts and walk-outs against Outlook and Exchange, against SharePoint, and against every single version of Windows. I am an amateur font designer, and even I find the font rendering on Windows (especially if you have mixed-density screens) horrendous. It's like it has a dozen incompatible libraries using different font rendering methods that are inconsistent between screens.

It's an old joke that one of the best ways to make someone perpetually unhappy is to teach them proper font kerning.


In Outlook, on the vendor's own O/S, one can click on the "trash" icon of a message, and then watch in horror as a new message arrives, every message drops down a slot, and then the program recognizes the click on the trash icon of what used to be the message above, which is then deleted. I mean, come on.


I have used Linux since 2004, and the fonts were so horrid that I read up on configuring FreeType to try to make them less jarring.


At this point Windows is largely running on inertia and the fact that, despite all its flaws, its competitors still somehow manage to have worse issues for most people. But at one time, it was actually a pretty damned consistent and user-focused OS.


> Linux is still far from being a diamond of UI design or consistency

YMMV. I myself am very happy with Gnome and would say it's about as nice to use as a Mac. You can, of course, install ugly applications with horrendous UIs, that use Athena or Motif widgets, limited only to X bitmapped fonts or an ncurses UI that would work on a VT-52, and so on - but that's kind of a feature of Linux - it's Unix and it runs a lot of things originally built in ages long past. It can be consistent if you want, and it can embrace the past in ways no other OS can dream of.

Except, maybe, IBM's z/OS, but that's a completely different beast.


I think Gnome apps in Gnome are actually pretty alright.

Not really jarring anymore with ok functionality between apps. I wish gnome-shell was more stable and shipped with a dock…


> I wish gnome-shell was more stable and shipped with a dock…

It does. It's just that it's hidden now by default.

What I really would like is a simple e-mail/messaging program to consolidate my mail, Slack and Teams inflows on a single threaded timeline.


I guess I can sort of agree with you, although that time was the height of GNOME 2, and even today I find myself leaning towards GNOME 2/MATE because it feels so much more stable than anything else (despite me always giving the main Ubuntu distro a chance every year as well, because I feel it is the most looked-at distro).

>More evidence of the impact it had.

Well, for me it wasn't Compiz that brought me into Linux, it was this idea of something different from Windows, but it may have had this impact for others. Compiz was a gimmick to me, and after trying it once I put it aside to just try to keep my regular Linux installs stable.


I switched to Linux full time before Compiz took off, and did so for exactly the reasons you cited as making the other platforms superior: Linux was more stable, easier to reason about (as it was doing less magic behind the scenes), and components worked better with each other since POSIX is designed for interoperability.

Driver support was patchy at times, but then it wasn’t exactly easy on OSX (Apple: “if we don’t support you then you’re shit out of luck”) nor Windows (Microsoft: “we support everything, although you’d have to manually find those drivers yourself, so if your system doesn’t boot or network access fails then you’re shit out of luck”) either. At least Linux shipped 99% of what you needed on the install CD.

For that reason, I’d almost always switch to Linux if ever I needed to debug a hardware problem in Windows or OSX. Though that’s less of an issue these days because I haven’t run Windows in ~15yrs and if you have a hardware problem on a modern MBP then you’re shit out of luck so there’s little point trying to debug it yourself.


>I switched to Linux full time before Compiz took off, and did so for exactly the reasons you cited as making the other platforms superior: Linux was more stable, easier to reason about (as it was doing less magic behind the scenes), and components worked better with each other since POSIX is designed for interoperability.

Good for you that you had this experience, but this is the standard talking point I have been hearing for 15 years. Yes, if you are willing to put in the work, Linux is more powerful than a closed-source OS. However, you forget that the primary job of the OS is to provide a stable platform that enables you to run applications. Instead, you are ignoring this and praising other aspects of the OS that do not directly correlate to improvements for regular non-IT end users. If I do not want to spend time fixing a broken config caused by a bug, I am out of luck. If I do not want to deal with poorly made system utilities that do not correlate to what the config files do, then I am out of luck. If I want different components of the OS to have a unified design language so they work together, I am out of luck (e.g. even today GNOME bundles a bunch of old garbage tools and expects them to be equivalent to their Windows/Mac counterparts; no thought is put into the usability and uniformity of these tools).


> Good for you that you had this experience, but this is the standard talking point I have been hearing for 15 years.

Which is entirely anecdotal.

I’ve done a considerable amount of research on this topic over the last 20 years, and for at least 10 of those years the actual main reason Windows users don’t like Linux is simply that it’s not like Windows. It doesn’t matter how much better Linux might be or how crappy Windows might get; if people are comfortable in one thing then they generally don’t like switching to another thing that behaves differently. And Linux behaves very differently.

This is the reason Microsoft practically gives Microsoft products away at schools. Get them comfortable at a young age and most of them will stick with you for life.

Just look at how successful Android, ChromeOS and Linux netbooks (before Microsoft subsidised XP on them) are/were. If a compelling platform comes with Linux pre-installed, people manage just fine. But if you ask them to take a Windows machine, wipe it and install something alien, then of course a lot of people will struggle. It’s no different to how few people install 3rd-party firmware on smart TVs, routers or other consumer devices.

But I’m fine with that. I used to get wound up by tactics like MS subsidies 20 years ago, but these days I’m very much more live and let live. As long as people don’t impose their preferences on me, I won’t be an arse about my preferences to them. Just don’t try to fob me off with pseudo-technical rubbish when it’s clearly just a subjective bias.

> Yes, if you are willing to put in the work, Linux is more powerful than a closed-source OS.

Open source is only part of the equation. It’s that the whole OS is modular and easy to interface with, whether it’s CLI components, common APIs or even just hot-swappable services like desktop managers.

Windows has elements of this too, but frankly Linux just does it better. And I say this as someone who used to author a competitor to WindowBlinds. I’ve done my fair share of low-level hacking on Windows; I’d even go as far as to say that Win32 APIs are fun. But Linux is just easier to mould into whatever vision you have. That’s not a criticism of Windows, though: Windows caters for a different audience.

And it honestly doesn’t take any more effort to learn Linux than Windows. People just get a head start with Windows given that’s what they grow up with. However, having taught computer literacy to old people, I can tell you that Windows can be just as alien if you haven’t already had that head start. Equally, my wife has bought Linux laptops before (because they were cheap) and had zero issues with them. So the stories of Linux being anti-user are far overblown.

> However, you forget that the primary job of the OS is to provide a stable platform that enables you to run applications.

I haven’t forgotten that. You just wrongly assume that only Windows can do that.

> Instead, you are ignoring this and praising other aspects of the OS that do not directly correlate to improvements for regular non-IT end users.

I did actually give examples. :) eg Linux being easier to install because there’s no googling around to find the correct drivers. They just get picked up by default from your install media.

Admittedly Windows has improved vastly in that area too but I think Microsoft had to borrow a lot of ideas from Apple and Linux to get there.

> If I do not want to spend time fixing a broken config caused by a bug. If I do not want to deal with poorly made system utilities that do not correlate to what the config files do, then I am out of luck.

That’s just as big a problem on Windows and macOS as it is on any other operating system, Linux included. Software breaks on any platform. Heck, I’ve had far more instances of Windows Server failing after a broken update than I have on Linux, despite running 2 orders of magnitude more Linux servers. And we are talking servers! Never mind all the junk that slows desktop Windows down, from a thousand different independent update managers to printer bloatware, that isn’t an issue on Linux. And Windows itself isn’t exactly bug-free either.

> If I want different components of the OS to have a unified design language so they work together, I am out of luck (e.g. even today GNOME bundles a bunch of old garbage tools and expects them to be equivalent to their Windows/Mac counterparts; no thought is put into the usability and uniformity of these tools)

That’s not really a fair comment when Windows has multiple different control panels (has the Font applet even been updated since Win 3.x?) that were designed for entirely different desktop paradigms, each with slightly different functionality; thus finding the right option usually requires clicking around a dozen hyperlinks in different applications for 10 minutes until, by chance, you happen upon the right applet.

Honestly mate, I’ve got nothing against other people’s preferences. Maybe you should relax your outlook on others too. Or at least stop pretending your preferences are technical in nature, because for the vast majority of people that’s really not the case. For most people, it’s far more down to familiarity than to which platform is objectively better (not that a vague term like “better” can ever be an objective metric anyway).


Here we go...down the same rabbit hole that these Linux vs whatever else conversations always go down.

Just to reiterate: Every OS has problems but in MY experience Linux has broken on me in fundamental ways. MY experience is that Linux cannot be trusted for day to day usage even though I have been giving it chances for 15 years now. I'm glad that you have the fortune of having a better experience but I am not going to ignore what I have experienced with the OS just because you said it was good.

I'm not going to waste my time with this anymore so I bid you good day.


> Here we go...down the same rabbit hole that these Linux vs whatever else conversations always go down.

At risk of sounding like a school child: you literally started it.

My point was initially just to say that other people get on fine with Linux. Then you took us down the rabbit hole conflating preference with technical fact.

> Just to reiterate: Every OS has problems but…

Exactly my point. You try to sound impartial but then drift into anecdote and bias. Like what you like, I’m really not here to argue you into using another operating system.

> I am not going to ignore what I have experienced with the OS just because you said it was good.

I feel like I’ve said this a dozen times already…but: I’m all for people having preferences and I’d never dare try to change someone’s opinion. But you’re conflating preference with technical fact. Maybe you should relax a little and appreciate other people’s preferences too instead of assuming you’re right :)

If you read back what I’ve posted, you’ll see I’m not here to argue that your experiences don’t matter to you. I’m just saying it’s all subjective.

Having done as much research as I have on this topic over the years (I had to, for work), it’s funny how much of what we believe is fact is actually just down to preferences, and those preferences are usually just down to comfort (like an old friend) rather than technical capabilities.

But I’ll happily end the topic here if that’s your desire. :)


> At risk of sounding like a school child: you literally started it.

No. When someone relates their negative experience with an OS you happen to use, that is not a personal attack or invitation to expound upon your own contradictory experience. This happens every single time anyone ever says anything even remotely negative about the Linux Desktop. Can you honestly say the same happens with anywhere near the same frequency when discussing Windows or MacOS problems?


> No. When someone relates their negative experience with an OS you happen to use, that is not a personal attack

What personal attacks are these? All I’ve seen thus far are adults having a mature conversation.

> or invitation to expound upon your own contradictory experience

That’s literally the point of social platforms. You cannot post an opinion on a public forum and then declare that other people are forbidden to reply. If that’s your bag then you’re better off writing your thoughts down and then popping them in a glass bottle and casting that out to sea :P

> Can you honestly say the same happens with anywhere near the same frequency when discussing Windows or MacOS problems?

Yes. Happens all the time and on any topic. This is a message board, opinions will differ and people will want to discuss them. I don’t see what the issue is there (as long as it’s civil).

E.g. this started out as a positive thread talking about Linux compositing managers, and there wasn’t any need for anyone to start arguing about how much better Windows was, but that happened. And I’m fine with that. Weird that you should think I’m not allowed to reply when that does happen, though.

Anyway, this has gotten meta and in my experience that’s usually the point when the quality of conversations deteriorate so I’ll duck out of the chat now :)


> this started out as a positive thread talking about Linux compositing managers, and there wasn’t any need for anyone to start arguing about how much better Windows was, but that happened.

That isn't what happened; this is my point about taking things personally. Let's look at what was actually said:

> Compiz was great at drawing curious people in but I'd argue that Compiz helped many people write off Linux.

After which that opinion was elaborated to say that despite the attraction of the window effects, Linux's issues during the time period in question drove people away and left them with the impression that Linux was all gimmick that couldn't hold up to serious use.

Your response to this was:

> I switched to Linux full time before Compiz took off, and did so for exactly the reasons you cited as making the other platforms superior: Linux was more stable, easier to reason about (as it was doing less magic behind the scenes), and components worked better with each other since POSIX is designed for interoperability.

Which, at least to me and apparently nebula, sounded a lot like the "no, Linux is great actually" defensive anecdote you see every time anyone says anything negative about Linux. Worse, you didn't even engage with what was being said, only saying that you had a great experience. What does that have to do with the argument presented?

> That’s literally the point of social platforms. You cannot post an opinion on a public forum and then declare that other people are forbidden to reply.

There are useful replies and then there's tired old defensive performative argumentation. I'm arguing that 'I had a good time with Linux' in the context of what was posted is much more the latter than the former. I might be a touch too sensitive to this sort of thing after hearing it out of Linux Desktop evangelists for 20 goddamned years, though.

> Yes. Happens all the time and on any topic. [...] E.g. this started out as a positive thread talking about Linux compositing managers, and there wasn’t any need for anyone to start arguing about how much better Windows was, but that happened.

Take a look back at the original nebula post and you will see that he only ever mentioned Windows negatively.


I think you are missing the most important point laumars made that is somewhat relevant:

> I’ve done a considerable amount of research on this topic over the last 20 years, and for at least 10 of those years the actual main reason Windows users don’t like Linux is simply that it’s not like Windows. It doesn’t matter how much better Linux might be or how crappy Windows might get; if people are comfortable in one thing then they generally don’t like switching to another thing that behaves differently. And Linux behaves very differently.

Now, I am curious: is that research public anywhere? That's been my gut feeling for the last 20 years too, but I've never had any data to prove it either way (provided the methodology for that research stands up to scrutiny).

Now back to the point: I agree that Compiz contributed both to attracting people to Linux and to pushing them away. It exacerbated any stability issues in the graphics drivers, but at the same time, there was a huge presence of all the Compiz stuff around. I similarly already used Linux (since '98) for stability and hackability on the desktop, and I had no interest in Compiz because of the fragility.

Still, it's obvious from the discussion here that there were people who were both pulled in and pushed out due to the Compiz situation. As nebula says themselves, it hasn't really pushed them out; instead, it's the continuing troubles they experience with every yearly attempt to switch to Linux (and sure, their choice of hardware may contribute to it: I've had a better experience with ThinkPads than even with a Dell XPS 13 preloaded with Ubuntu). Basically, just like you get a very specific model if you want to run a Hackintosh, you should look for computer models with decent support to run GNU/Linux on (it's just that there are many more models to choose from that would run great OOB); and don't trust the certification, because that doesn't cover all the things a consumer might care about.


I know I said I wouldn’t reply (and frankly this meta debate is every bit as worthless as I predicted it would be), but I feel I should point out that you haven’t traced the thread back up to the OP when conducting your analysis. You need to hit ‘parent’ on the post that you thought was the top; you might need to do that a couple of times, in fact. That should add the context I was describing which you couldn’t see. :)


The flagged parent didn't mention Windows at all. In fact, the only comparison made was to MacOS.


That’s still not the OP, and you’re misreading my posts again, because I didn’t say the OP mentioned Windows; I said they discussed Linux and had replies mentioning Windows (that OSX comment just being one of many replies).

But don’t bother looking it up. I’m done chatting to you after the last barrage of personal attacks you’ve made in the other thread. There’s only so many times someone can post offensive remarks and miscomprehend the conversation while others are being polite and patient before their patience eventually wears out.


You gave no examples supporting your claims, though. What was your experience? What has broken on you in fundamental ways? When was that? What distribution were you using? Anecdotally, there is no shortage of examples of people reformatting their Windows installation, which would corroborate the idea that Windows is not safe from fundamentally breaking on you.


I have given examples in another thread here [1]: https://news.ycombinator.com/item?id=29798725


> I’ve done a considerable amount of research on this topic over the last 20 years, and for at least 10 of those years the actual main reason Windows users don’t like Linux is simply that it’s not like Windows. It doesn’t matter how much better Linux might be or how crappy Windows might get; if people are comfortable in one thing then they generally don’t like switching to another thing that behaves differently. And Linux behaves very differently.

There is a difference between just liking the way things behave because you're comfortable with it and preferring the way it behaves because it is better.

To this day there is a good chance that if I want to run the latest version of any piece of Linux software I will have to compile it from source like it's the 1970s in order to do so. That is a problem that Windows and MacOS have never had, and the Linux Desktop community has been very slow and reluctant to do anything about.

Hell, even today as Flatpak begins to emerge as the dominant cross-distro application packaging format, it is still lacking basic features of 1980s Desktop software management and gets a lot of flak from the community for existing at all.


> There is a difference between just liking the way things behave because you're comfortable with it and preferring the way it behaves because it is better.

indeed there is. However the vast majority of people fall into the former category while assuming they’re the latter category.

Or to put it another way, everyone cannot be right that their preference is technically superior. Ergo our preferences must be subjective.

> To this day there is a good chance that if I want to run the latest version of any piece of Linux software I will have to compile it from source like it's the 1970s in order to do so. That is a problem that Windows and MacOS have never had, and the Linux Desktop community has been very slow and reluctant to do anything about.

That’s a huuuuge generalisation there. The truth is it depends on the Linux distribution (Arch and Fedora are bleeding edge, Debian and CentOS are not) what repos you have enabled (stable, testing, etc) and even what software you’re running. Eg some niche cross platform thing on GitHub might require compiling for all OSs never mind just Linux.

Linux will see more regular platform updates than Windows and macOS where you’re limited to service packs and new OS releases. You also don’t have to wait until “Patch Tuesday” for patches on Linux. They get released as soon as they’ve passed build and test pipelines.

So there are definitely plenty of examples where the generalisation is way off. But for the sake of impartiality, I do agree that some niche software and some distros will make you compile from source. However it’s definitely not the norm for common software and hasn’t been for 20 years.

> Hell, even today as Flatpak begins to emerge as the dominant cross-distro application packaging format, it is still lacking basic features of 1980s Desktop software distribution and gets a lot of flak from the community for existing at all.

Yeah, cross-platform package management is broken in Linux. Snap, Flatpak, etc. all have problems. Personally I think the real issue is that Linux is trying to emulate Windows and Mac with portable installers. If you want a platform where the responsibility is on the user to download and install applications manually, then there are already mature options available for that (Windows and macOS). So there's no point trying to compete there. Where Linux excels is with its package management taking the risk of application installation away from the operator.

This won’t be to everyone’s preference but that’s fine because not every platform should behave the same. Just because a specific paradigm makes sense for one platform doesn’t mean it makes sense for every platform.

Just look at how fundamentally different remote management on Windows vs Linux is. Windows is based around RPCs while Linux is based around scripting. Neither is wrong or right. Both work effectively despite being completely different approaches.

And here lies the problem with people who say one is better than another: they look at the differences and say “I don’t like it” but think it’s a technical decision when in fact it’s just an emotive response based on what their comfort zone is.


> indeed there is. However the vast majority of people fall into the former category while assuming they’re the latter category.

That sounds arrogant and condescending. In order for that to be true you would have to presume that any given behavior was objectively better, and therefore anyone who prefers different behavior only does so because they are comfortable with the 'wrong' behavior.

> That’s a huuuuge generalisation there. The truth is it depends on the Linux distribution

No, it doesn't, and I'm really tired of hearing "you chose the wrong distro, bro" for literally every distro because none of them actually do what I, and many others, want. Why is it considered so insane to want a stable base system and be able to install applications of any level of bleeding-edgeness on top of it without jumping through a bunch of hoops?

> Linux will see more regular platform updates than Windows and macOS where you’re limited to service packs and new OS releases.

You clearly haven't kept in touch with Windows through the Windows 10 era.

> Yeah, cross-platform package management is broken in Linux. Snap, Flatpak, etc. all have problems. Personally I think the real issue is that Linux is trying to emulate Windows and Mac with portable installers. If you want a platform where the responsibility is on the user to download and install applications manually, then there are already mature options available for that (Windows and macOS). So there's no point trying to compete there. Where Linux excels is with its package management taking the risk of application installation away from the operator.

I disagree with literally every statement in here except the first two sentences. There is no reason one cannot have "portable" applications that are also managed with a store or package manager! It just means that you don't actually have to, and you don't need an army of unpaid volunteers to repackage shit constantly. And Linux, in my experience, doesn't excel at taking the risk away, as I have seen updates break other packages' dependencies plenty of times.

We're getting off in the weeds. My original point, I think, stands: Linux doesn't behave the way I think is best, therefore it isn't 'comfort' that makes me not like the way it behaves, it's my preference for a different --better in my reasoned opinion-- behavior.

> This won’t be to everyone’s preference but that’s fine because not every platform should behave the same.

If this is truly your opinion then you have phrased your initial assertion in a very poor and condescending sounding way. You very much make it sound like "people who don't like the way Linux behaves are just idiots who can't learn new things".

> And here lies the problem with people who say one is better than another: they look at the differences and say “I don’t like it” but think it’s a technical decision when in fact it’s just an emotive response based on what their comfort zone is.

Oh, I see, it's because you actually did think they're just idiots.


> That sounds arrogant and condescending. In order for that to be true you would have to presume that any given behavior was objectively better, and therefore anyone who prefers different behavior only does so because they are comfortable with the 'wrong' behavior.

That’s the literal opposite of what I said. I was saying what’s “right” is often subjective and people (ie not just technical users but non-technical users too) often (ie not always but in a great many cases) choose what’s right based on what is familiar or comfortable.

That’s basic human psychology. There’s a number of published papers that demonstrate this exact phenomenon.

> > That’s a huuuuge generalisation there. The truth is it depends on the Linux distribution

> No, it doesn't

Of course it does. A bleeding edge distro is going to have packages mainlined sooner than a conservative enterprise distro. This isn’t some shocking news, it’s Linux 101.

The whole point of distributions is that they are different distributions of packages on Linux. So it should be pretty obvious that some will ship different packages to others.

> and I'm really tired of hearing "you chose the wrong distro, bro" for literally every distro because none of them actually do what I, and many others, want

I didn’t actually say you chose the wrong distro though. I said the age of packages depends on a number of factors including what distro you use (but not limited to).

The second part of your statement reaffirms my point about how there’s no right or wrong with preference. Linux doesn’t do what you want it to, so your preferences lie elsewhere. You’re even admitting there that others have told you that you’re trying to bend Linux into behaving in a way a distro wasn’t designed for. Thus that’s a preference you’re demonstrating and not a statement of something being technically superior nor inferior.

> Why is it considered so insane to want a stable base system and be able to install applications of any level of bleeding-edgeness on top of it without jumping through a bunch of hoops?

It’s not insane and there are platforms out there that manage that better than others. However it’s pointless me listing them because there are a myriad of other reasons you don’t like Linux too.

> And Linux, in my experience, doesn't excel at taking the risk away as I have seen updates break other package's dependencies plenty of times

No software is infallible but package managers are robust and have been for at least 15 years. The only times I’ve seen a package manager go tits up is when I’ve intentionally overridden its default behaviour and then forced it to proceed regardless. Basically knowingly forced it to break. I’ve had Linux (and FreeBSD too) running for years, constantly upgrading from the package manager without a single issue. So I don’t doubt you’ve run into issues with package management on Linux (after all, you seem to find all the problems on Linux), but generally it does work well for most people.

The same is true for Windows. If you’re competent then you can run a stable Windows install for years without any issues. And equally on Windows people might bork the system installing the wrong software. Such as accidentally installing a dodgy copy of Firefox from a spoofing website.

This doesn’t mean manually installing software is worse - I’m just giving examples of where package management tools do help. Microsoft would agree too since they provide a number of tools for pushing software out in an automated way.

> You clearly haven't kept in touch with Windows through the Windows 10 era

I don’t see why you make that statement when Windows 10 still proves my point: updates happening slower and less frequently than on Linux.

That’s not a bad thing though. It’s just a different operating model.

> We're getting off in the weeds. My original point, I think, stands: Linux doesn't behave the way I think is best, therefore it isn't 'comfort' that makes me not like the way it behaves, it's my preference for a different --better in my reasoned opinion-- behavior.

“Better” in your context is subjective to your preference though. Which is the point you keep missing. And if the term is subjective then it cannot be argued as technically superior.

Then there’s the question of why do you subjectively prefer the platform that you’re already familiar and comfortable with rather than this other platform that feels alien to you. Could it perhaps be because it’s familiar?

> If this is truly your opinion then you have phrased your initial assertion in a very poor and condescending sounding way.

People in glass houses shouldn’t throw stones:

> You very much make it sound like "people who don't like the way Linux behaves are just idiots who can't learn new things".

> Oh, I see, it's because you actually did think they're just idiots.

We don’t have to agree on everything but the ridiculously hostile way you’re twisting my post makes it hard to have a sensible conversation with you.


> And here lies the problem with people who say one is better than another: they look at the differences and say “I don’t like it” but think it’s a technical decision when in fact it’s just an emotive response based on what their comfort zone is.

Give me a way to interpret this other than condescension. You're saying that people who espouse a preference are not doing so based on valid reasoning, and the only evidence you have offered for this is that you have a different preference.

I agree that 'better' is subjective, I don't think I ever said otherwise.

You don't give people the benefit of the doubt that when they say they prefer some behavior over other behavior, they have legitimate reasoning behind it. That's arrogant nonsense entirely deserving of the stereotype that IT people are arrogant, and I for one am sick of people acting like this in our industry.


You’re literally the only one who’s been name calling and insulting people. You keep making wildly inaccurate technical claims that even a beginner in Linux would know better than, then admit that others have pointed out to you how deeply ignorant you are on this subject. And at no point have you agreed your opinions were subjective, repeatedly stating they were based on technical merit and even going as far as accusing me of thinking people were “idiots” for being subjective.

…and now that you finally grasp enough comprehension skills that you’re not making wild presumptions and insulting interpretations of my posts, now you accuse me of being arrogant and the “stereotypical problem with people in IT” in your very same post that’s agreeing with the entire point that sparked all the initial venom? Even now you cannot bring yourself to be polite with other people and instead insult them while simultaneously agreeing with them.

We’ve chatted a few times before and each and every time you’ve failed miserably at even the most basic comprehension while repeatedly making aggressive and insulting remarks. On one occasion you even berated me with the same point I’d literally made the post before. And I’ve seen you do this to other people too. So I’ve been more patient in this conversation than you deserve. There’s one tangent where you can’t even follow the thread and keep referring back to the wrong parent while making a big hoohar about how everyone else has misunderstood the conversation.

Maybe the reason everyone in IT always disagrees with you and comes across a certain way is because you are the one being an illiterate ass to them? Communication is a two way street and I don’t have this issue with literally anyone else on HN apart from yourself. And after this experience of insanity I’ll certainly make sure I don’t bother to engage with you again too.


You know what? You're right, I have been a little more insulting here than need be and I apologize for that. Twenty years of the same arguments from Linux Desktop's die-hard fanboys has made me a bit unkind to people I interpret to be Linux Desktop evangelists.

However, I submit that your language is not as clear as you might think it is. I am still not sure if you're trying to say that most people prefer certain behaviors purely out of comfort, or if you're agreeing with me that there is a good chance they prefer behaviors for legitimately technical reasons that just happen to be subjective. Your initial post on the subject very much strikes me as the former, and frankly so do nearly all your posts after that, yet you seem to insist that I am misunderstanding.


Nah. I am not the only one who used both and ended up using only Windows.

Because it was less work to use Windows. I like programming, I even like configuring, but I want to do them when I want to, not because I need to do some unrelated third thing and the computer is failing.


Calm down... Let's not turn this thread into an ugly flamewar.


I am calm. What in my post suggested otherwise?


It's a real shame, but unfortunately it seems endemic to the FOSS development ecosystem: people will work on things they think are cool; and stable, consistent, functional software is hard work and not very cool. Consequently we get a lot of opinionated little fiefdoms ruling over collections of frankensteined software, and then the evangelicals wonder why it isn't The Year of the Linux Desktop yet.


Yes, I think so too. And I have used Linux since 2004, macos since 2013. And have barely touched windows since 2009, but I have tried w10 and w11 briefly.


> I install a clean copy of Ubuntu on to my PC, start using it and then stop when I discover some serious bug

What do you do when you encounter a serious bug in your OS of choice?


>What do you do when you encounter a serious bug in your OS of choice?

When I say serious bug I typically mean serious OS breaking bugs.

Some examples from these past years (these all happened in different years):

1) After clean install, desktop crashes after first reboot and I am thrown into terminal. Result: Stop usage and move on.

2) After clean install, I wish to copy some files to a FAT32 USB drive (a SanDisk purchased directly from them). I get some error while the file is being copied, the drive is unmounted, and then when I go to another system running Windows to check if my file was copied, the drive is corrupted, causing all my files to be lost. Result: Stop usage and move on.

3) After clean install, I go ahead and connect a second monitor. Now my desktop becomes a garbled mess on both screens. I disconnect the screen and the desktop remains a garbled mess on the main screen. I force reboot and upon reboot I have now been dropped to the terminal. Result: Stop usage and move on.

These issues don't happen on Windows and Mac in my experience. Don't get me wrong, Windows is degrading in usability and Mac is as well (at a much slower pace) but they are not falling apart in these fundamental ways. The very foundations of Linux seem to be built on sand and that does not convey trust when you expect your system to be more than just a toy. This is a machine to get work done on and I depend on it. I cannot be dealing with silly issues like this.


I completely believe you've experienced these. TL;DR: Linux, like all other systems, mostly has issues with hardware. For a proper experience, get a system that's known good under Linux (and you need to use a non-ideological distribution like Ubuntu which will ship proprietary drivers). Linux-focused or business models of major manufacturers are your best bet.

1) sounds like a hardware issue or hardware compatibility issue. Unfortunately, while Linux does come with great and extensive hardware support out-of-the-box, you'd still benefit from choosing hardware that works well with Linux. Thinkpad T series (I usually use X1 Carbon, but as the highest end ultraportable it can introduce some Windows-specific firmware issues that they only resolve in a few months after release — they did that with Gen6 when they switched to new always-on-standby, but they introduced a BIOS option for Linux ~6 months later) is usually the best bet out of big manufacturers, and Linux-focused ones like System76 are a good choice too. So if you want to try out Linux next time, get Linux hardware that will work well with it first (or be ready to debug issues like the ones you mention).

These sort of issues happen even more on Windows if you clean install it on a wide variety of hardware (MacOS can't even be installed on your non-Apple system, I am guessing, so I'd say that it's even worse :). The benefit Windows has is that it comes pre-installed on those systems.

I've seen 2) frequently ~5 years ago myself: you had to really ensure that you unmounted FAT32 devices cleanly and waited for the system to announce that the drive was safe to remove: skip that a few times, and the filesystem will be messed up enough for the Linux FAT32 driver to be unable to cope (Microsoft, having developed FAT32 themselves, has an obvious advantage there). They don't seem to cause as much corruption anymore (only the file you were transferring might be corrupted), or maybe I learned to wait properly. Also, flash drives can degrade and cause all sorts of issues where a drive seems to work properly on other systems but causes silent corruption. If this was an old flash drive, that's the most likely issue: I've learned to go with only SanDisk Extreme flash drives because they have the best write endurance, but they are multiple times more expensive for the same capacity compared to other flash drives (including SanDisk Ultra stuff).

3) is really weird; external display connectivity was a huge issue ~10 years ago, but I haven't seen any issues since (including in presentations on random projectors, multiple screen types and such): the only thing that still annoys me is the default of setting up new screens as extended, chromeless display areas (I'd definitely default to "mirror screens"), but once a screen is set up, it saves the settings forever. Still, this is most obviously a driver issue (integrated Intel graphics is usually the most stable, Nvidia the most performant, but AMD should be becoming the best of both worlds in the last few years, though it's hard to get their GPUs).

And since you say these don't happen with Windows, just 10 days ago I had to reboot my X1 Carbon into Windows (that it came preinstalled with) to file some tax paperwork and I connected it to my Dell 4K TB3 display (which I dock to and undock from repeatedly on Linux without issue: all of this is pretty high end Windows equipment). On reboot, TB3 light remained on and it stopped charging and passing DisplayPort through (still worked as a USB hub, and rebooting back into Windows did not fix it). I was thinking I'd have to go to the service shop since my laptop is still under warranty, but as I was pulling my SSD out to prepare for that, I realised I could just disconnect the battery and see if that helps, and luckily, it did.

In short, dealing with hardware is hard (which all your issues seem to be), and all of the operating systems have different quirks. As above, I had a pretty terrible issue with an expensive Windows laptop and Windows-supporting screen ($3000 + $1200) which worked without issue under Linux for more than a year. Apple has a huge advantage because they control the hardware they ship with, Windows is generally worse on arbitrary hardware but has the benefit of being pre-installed and "pre-debugged", and Linux is better than Windows but has to be clean installed (I am pretty confident that if you took random 20 laptops off the market today and attempted to clean install both Windows and Linux on them, you'd get a fully working Linux system more easily; MacOS easily loses out on such a benchmark).


I strongly disagree with your assessment. First of all, I don't know what kind of frankenstein Windows environment you are running to get those sorts of issues, so let's put that aside.

With Linux the issues are fundamental to how the OS is designed and maintained. Several of these issues were fixed only to come back one or two releases in the future. These sort of terrible regressions don't inspire confidence.

You have listed excuse after excuse to justify these failures when in reality both other OSes have had these functions worked out over 10+ years ago.

1) I have heard this excuse before, that I should stick to official Linux hardware. I have had issues with Thinkpads as well, and some of these issues happened on a Thinkpad. I have also been hesitant to jump onboard with System76 or Purism because their systems just look like junk; in 2021, when we have stunning Mac laptops with beautiful trackpads and retina displays, and with me entering my 30s, I just don't have the stomach to spend hundreds of dollars on laptops that look like they came out of the mid-to-late 2000s. In the case of System76 they are rebadged Clevo systems from what I understand (could be wrong about this). So in reality I am paying for QA for this year's distro with no specific guarantee that I won't start hitting the same issues a few years down the road when the regressions start to come. There is no guarantee that System76/Purism will fix some small regression in a multi-year-old system. They suffer from the fact that the community will move on without them and so they have to maintain support. Mac systems have been supported for 7+ years. I have a 2012 Macbook Air that still runs pretty decent with no regressions on the latest MacOS (well, second latest, since they finally discontinued it, giving it ~9 years of support).

2) I have never had any doubt as to Windows's ability to correctly write data to a USB drive ever since I started using this functionality back in Windows XP!

No, I specifically mentioned that I bought SanDisk, and direct from them, to remove this commonly used excuse where Linux people blame everything but the OS itself. The flash drive is fine. Windows has written to it successfully without ever throwing an issue. MacOS has written to it multiple times without issue. The flash drive is not at fault!

I want you to stop for a moment. Picture yourself as the user. You have come to expect over the years that when you copy a file to a USB mass storage device, the file is correctly written. You don't do file integrity checks or other silly stuff. You just copy your files to the drive, unmount it and then move on. Now imagine you are trying Linux out and you follow the same mindset. You copy files over, then unmount and move on. The horror and shock of losing precious data will definitely put permanent anxiety and distrust into the user's mind that will be very difficult to undo. If you cannot trust copy and paste then what can you trust? The OS is fundamentally flawed because these copy and paste operations in the GUI are basically abstracting command line operations that may or may not fail silently. This is garbage design compared to MacOS and Windows. When these other OSes fail, sure, they don't show you the error most of the time, but the GUI is strongly tied to the operations that are occurring, so when they fail, you know they have failed. You're not left in limbo looking at a GUI that is not correctly tracking what the underlying command line application is doing.

3) All I know is that I have never had display issues in Windows, and it is not even a thought on MacOS. In fact MacOS handles multi-display so beautifully because it scales things perfectly. This is better than even Windows because while Windows always gets multi-monitor working without fuss, things like font rendering leave a lot to be desired. MacOS gives you everything: stable multi-monitor support, beautiful scaling and excellent font rendering across displays with different resolutions. Linux is not even in the same ballpark because you don't really know if the display you hook up will even start up properly, and then when you need to make config changes you are stuck doing stupid stuff like installing xrandr. You cannot expect to plug in the damn display and move on with your life. I recently hooked up a 640x480 CRT along with a 4K display to a Mac and, to my surprise, it handled scaling across these wildly different displays perfectly. I couldn't believe it! I have gotten so confident in the Mac ecosystem that I don't even doubt it anymore. It just works, and that freedom takes that cognitive load off of my mind.


> I strongly disagree with your assessment. First of all I don't know what kind of frankenstein Windows environment you are running to get those sorts of issues so lets put that aside.

I am happy to see you strongly disagree while making such a wild assumption even after I said it was a factory pre-installed system.

I'm not using any "frankenstein Windows environment": this is factory installed and up-to-date Windows installation on a recent high end Windows laptop.

You are dismissing my experience in a worse way than you are saying your experiences are being dismissed. It's also not hard to find evidence of Windows messing up flash drives online too.

If all you can do is discuss in bad faith, there is no reason to discuss further.


Also the composited desktop Aero shipped in Windows Vista in late 2006, the same year as the initial release of Compiz. Aero was originally demoed at WinHEC 2003, for whatever that is worth.

I don’t know how much these different compositing window managers inspired each other. To me it seems like there is some convergent evolution. Compositing window managers are obviously superior (no redrawing when moving windows). In the mid 2000s memory and graphics cards became cheap and powerful enough to make compositing viable.


> Also the composited desktop Aero shipped in Windows Vista in late 2006, the same year as the initial release of Compiz.

Opposite end of the year though since Compiz was released at the start of 2006. Compiz had seen significant development over that year (unlike Windows that only ships big graphical updates in new OS releases). So much so that by the time Vista was out it had already forked a mature competitor: Beryl.

Plus Compiz wasn’t the first compositing window manager on Linux either. Just arguably the best in that era.

> I don’t know how much these different compositing window managers inspired each other. To me it seems like there is some convergent evolution.

Technology almost always works that way. But it’s fair to note that Compiz did feel miles ahead of the competition at the time. Which I think is entirely down to its module system. Meaning anyone could build their own effects and not just wait for their OS developers to release a new service pack.


I always thought it was neat that the three main competitors in use then were Beryl, Aqua, and Aero - solid, liquid, and gas. Not sure if that was intentional or not.


Maybe for the people that could afford to own a Mac. Compiz was something I, a student, could use though. And for free.

Maybe it’s more accurate to say it brought it into the reach of everyone with a bit of willingness to learn how to install an obscure OS (as opposed to having a ton of money)?


Same. Apple products were so obnoxiously expensive.


Quartz looked nothing like Compiz. Comparing the two and saying Quartz was better massively misses the point of what Compiz was and what the OP discussed.

If you want to argue that most of Compiz effects were overused and tacky then that’s a different issue; also an entirely subjective one.


Being first isn't everything, for instance Mac's "System" was not the first desktop GUI, but it was the most significant first for most users.

Similarly, compiz was important for the world beyond MacOS: while being utterly useless, it attracted the attention of lots of kids like myself while also producing useful side-effects in the Linux ecosystem, and no doubt pushing desktops outside of Linux forward.

Compiz was indeed a pioneer, and also explored far more effects compared to MacOS X for better and worse - In fact I'm pretty sure Apple copied the 3D rotating desktop from compiz for a short time... not that it's a particularly imaginative effect, compiz just stumbled upon it first.

... so how about we stop being petty.


Compiz was one of those things I could show off to friends to prove that linux was actually way cooler than any of _their_ operating systems, but since then seems to have been completely forgotten about (at least by me). This was a nice blast from the past.


Almost every foray into Linux on the desktop (when I was younger) started for me with seeing a cool video online with window effects (Compiz being the one I remember), installing Linux on a new partition, and spending the day getting most of my hardware working and playing with Compiz and other cool visualization utils (I can't remember the name of a tool that would add computer stats and whatnot to your desktop background, "nerd"/"geek"-something maybe?). Then, after I had spent a day getting it all working, I'd be staring at my computer and it wouldn't take more than an hour or two to think "Ok, that's cool, but I want to play a game" or something else that I couldn't do in Linux.


You are likely thinking of conky. It was included on some distros with a basic layout, but you could spend hours just adding other stats to it and changing colors.


That sounds familiar, maybe the name I'm thinking of was the windows version/copy/port or something. All of those were neat and I'd spend countless hours (this was back in HS so I had tons of free time) configuring it and looking at screenshots that people posted to see what parts I want to recreate and then in the end I'd realize I never see my desktop background, like ever lol. Even now with 4 monitors you can't see my desktop background anywhere, I'm sure it's still the default macOS desktop because I never see it.


I still remember the hundred page thread started by the author of conky on ubuntuforums.org. Back when I'd volunteer time on that site to help new ubuntu users. Blast from the past.


> Then after I spent a day getting it all working I'd be staring at my computer and it wouldn't take more than an hour or two to think "Ok, that's cool but I want to play a game" or something else that I couldn't do in linux.

Hah, for me, this was when I started getting deep into WINE and also some of the games available for Linux (SuperTux, that one game where you shoot a ball and it sticks to other balls and if enough of them are the same color they disappear, and some DOOM port).


> that one game where you shoot a ball and it sticks to other balls and if enough of them are the same color they disappear

Frozen bubble.


I remember making a Rox AppDir of that. It was written in Python[0] IIRC and I had to modify a few lines to account for relative paths. I miss Rox...

[0] I misremembered, it was actually Perl.


Thank you! I knew the name had something to do with ice, but the name eluded me.


I did the same for sure, played every native Linux game there was, but at the time most of the games I played were rough under Wine. CS: Source, TF2, L4D, and WoW were all pretty hard to get running reliably, especially compared to their Windows performance (note, this was the 2007-2009 range). I still remember a YouTube video showing WoW running on Wine, and they had Compiz so you could see WoW running and then they switched (using the rotating cube transition) to another desktop. The video claimed it was getting higher FPS on Linux+Wine vs Windows, so I of course dropped everything to try it.... I did not have similar results.


Heh, my time with Linux was before then, I think - but only by a couple of years. I do recall having some fun experimenting with StarCraft and NFS: Hot Pursuit (the og 1998 version, not the 2010 remake) under Wine, though.


Bragging about wobbly windows was the best thing ever. I'm glad I'm not alone!


Multiple desktops on a rotating cube tho. Shit was straight fire


It wasn't just a gimmick either. Mapping workspaces on to a physical cube makes navigating between workspaces more intuitive and natural. It provides a useful spatial metaphor to latch onto.


Then there was this one screensaver that made the cube slowly rotate while all your windows from all the faces blew around like leaves in a gentle whirlwind in the middle.

I really really want to see this come back. Even back then it was never released to stable and I got it from a script that grabbed and compiled all the bleeding edge stuff. It worked for a few weeks and then an update somewhere broke it and I never saw it work again :(


the cube was straight up useful as a visual cue since you can animate it faster and still know what's going on - I find the slide more confusing at speed.


For the pro level you had to make the cube transparent so you could see it all the time.


I don't see why Apple couldn't introduce this to its desktop switching routine. The cube animation already exists for switching users. It would be nice to have the option when switching desktops.


Apple hates options of any kind. Their core ideology is "opinionated software". Meaning the software does things one single way, the way they intended, and it does that really well.

It sucks though if you really want things another way. Then you have to mess around with third-party addons that break every time there's a major upgrade. It's the main reason I moved back to KDE (that, and the OS being more and more closed off).

I would never choose to use Gnome for this reason because it does the same thing. But at least on FOSS we have many options available, to each their own!


That's just it. Apple would rather provide One True Way to do it. On Mac, cube = switch user, and slide = switch desktop.


Apple is about the minimum amount of features. Its approach is minimalistic to the extreme and that's also good.

It's more or less the same reason why I like Gnome's minimalistic approach.

I have ADHD and the one thing I DON'T need is an OS that distracts me. FFS, I'd work from a VT-100 (even though I'd prefer a 3278-2 or 3279) if that was possible.


I still use wobbly windows on KDE and it fills me with warm nostalgia.


I'm almost ashamed to admit how large of a reason wobbly windows working out of the box is for my continued preference for KDE in most cases. Does anyone know what the status of '00s desktop effects is on other common DEs? I'd guess it'd be easier to achieve on MATE than Cinnamon for example, though I've always liked Cinnamon.


There's a comment above saying that wobbly-windows is available as a gnome-shell extension.


Just found this! This thread has really let me compiz out my Gnome, between the flames and wobbly windows!

https://extensions.gnome.org/extension/3210/compiz-windows-e...


And that multiple-desktop cube thing!


What about the fire effect?


Wow did this ever take me back. Arguably my first "public code release" was a plugin ("mod" back then I think) for the WWIV BBS system -- a screensaver called "Bubbles" that would draw random circles on the idle screen instead of the dead blinky cursor at top left. BBS owners would basically need to code it as a diff in their own system and recompile the thing.

I was maybe 9 or 10 years old. It was probably 50ish lines of C code, and I made some serious assumptions about what video card and modes were present.

I really really loved computers and coding back then.

Reading the comment below about being elderly and soulless also resonates for me at the moment.

I miss the romance of it all. :) I've been married to computers for nearly 40 years now, and all of the spice is gone. It's just comfortable and regular and routine.

Oh well, on to another 2-hour interminable sprint planning sesh (sigh) -- "yes dear, I'll be right there"


I am interested in how you learned programming and even created a functional program when you were that young. I was 14 years old when I wrote my first "hello world" program, and after that, for the next 5 years, I basically just fiddled with the Visual Basic 6 UI builder and programming. Most of my scripts came from books or the internet, and I didn't understand the complicated stuff at that time, especially where the program interacted with the Windows API. Maybe the lack of a teacher and access to materials also played a role in my lack of understanding back then. So, how did you learn programming at such a young age?


My dad was a DBA and did a lot of C code for his work. He got me a PC clone for my 7th birthday and a copy of Borland Turbo C. I made all sorts of strange code widgets and ascii animations, and also ran a BBS which put me in a lot of nerdy circles. Learning coding by osmosis I suppose.

I did dad's homework up into my mid teens, so it was a good payoff for him! :)

I never asked him if he liked the book Tom Sawyer. I bet he did.


I wonder if a PC clone and a copy of Borland Turbo C is a good present for a 7 year old kid (:D), just kidding. Nevertheless, you have good support from your dad there.


I actually kind of want to use GNOME just for this now.

Call me crazy but little novelties like this are part of what make computers fun.


That's really it-- it's a waste of resources, but it's such a tiny amount that all these effects are almost free. There's no point in not having a little fluff that makes the experience a bit more fun.


> it's a waste of resources

I always laugh at the people that take this notion way too seriously. If their CPU is only 98% idle, it's a travesty. I imagine the same people driving around in cars stripped of all paneling and upholstery, because every little bit of unnecessary weight hurts performance!

> There's no point in not having a little fluff that makes the experience a bit more fun.

Well put. Considering how much time we spend staring at these stupid little number boxes, things that makes the experience a little more enjoyable are worthwhile, even if they're dumb and frivolous.


I don't consider it a waste. These features provide enjoyment, which has value. As far as software goes, they are no more wasteful than video games or media players.

Units of energy expended per unit of "enjoyment" is certainly a factor to consider, but in this case the extra energy consumed is very minimal.


Not really, these effects are nothing to modern graphics cards.


On that note, I’ve kinda wondered what kind of wonderful-looking modernized replacement for XFishtank we could have on modern GPUs.


KDE has crazy effects too. They're buried deep in the settings, but all the 2000s effects are still there.


FWIW, KWin is easily extensible with effects like that and some of them are even available by default.


I am so glad I'm on GNOME right now. Clicked the link, slid the slider, and woo, here we are!


Feels like GNOME Plus!


I was expecting this to be some tool that deleted C:\Windows\System32 or something, but I came away pleasantly surprised, with a hint of nostalgia.


Or moving to a warmer climate - and burning my windows


VR desktops apps nowadays let you place windows around your head in VR, so that you feel fully immersed in whatever you're doing.

Back in the Compiz days, my virtual desktop switcher was a 3D cylinder. Holding the middle mouse button would zoom out my current desktop, placing me in the center of a giant 3D cylinder which I could rotate by moving the mouse to switch to a different desktop. And it worked with my dual monitor setup!

That was immersive as hell, and I felt so freaking productive having that spatial awareness of my other desktops. Back then I was doing Android development with Eclipse, and I would have one desktop for code, another desktop for logcat and an ADB terminal, and another desktop for documentation/music/etc.

And of course, all of my windows were wobbly.

Today I don't use anything fancy like that anymore, and I barely ever use virtual desktops for anything, even though switching between them with a keybinding is much easier/faster than that old setup I had. ALT+TAB takes about as much effort as CTRL+ALT+ARROW, but one is muscle memory and the other is not. If I ain't getting a fancy 3D cylinder, why bother?


And here I am today, with Compiz Alike windows and magic lamp effect, Burn My Windows, Blur My Shell, Desktop Cube, and of course, Useless Gaps on GNOME. I love it.


How bad is the "screendoor" effect with top-of-the-line VR headsets nowadays?


I use Mate with all the animations turned off and this kind of stuff makes me realize I am an elderly, soulless fuck


Same, but KDE fwiw. Yuck. I don't need it, I don't want it, get it away from me. But I'll be the last to yuck someone's yum. Have at it y'all. It's just not for me.


Yeah KDE's configurability is exactly what I want. 5.23 was again a great release.

This is really the power of Linux... You can make it what you want it to be.


I recall working on a high school assignment in MathCad on my Windows XP VirtualBox VM, and having my work fall apart in front of my eyes...

It was the VM crashing and the window destruction effect was quite appropriate as my work wasn't saved :/

Maybe we should only burn windows when the application exits non-zero :D


MathCad in VirtualBox.. what a time.


Some of these effects might be useful in a presentation. For instance, if your windows break apart into small shards of shiny glass (making a slight noise when doing so) and then disappear, it might be engaging. This sort of thing is common in films like Minority Report.


Ahh, you've brought back fond memories of grade school powerpoint presentations :)


I remember I had to do a presentation on a country I picked for my 9th/10th grade history class. My first slide had that blue flaming text as a title, which I generated from a flaming text generator website back then.


I was thinking more like a Youtube walkthrough of something that involves clicking around a desktop.

But even in a PowerPoint presentation, if it's done in good taste, it can be quite stylish.


Me: “There are people who had PowerPoint in grade school!”

*googles PowerPoint release date. Answer: 2004.

I guess that checks out. I’m old now.


I'm pretty sure PowerPoint is way older than that. It was already there in basically its current form, with all the terrible animations, in Office 97, but it goes back to the late 80s.


It's from the 80s, actually. But yes, I was referring to the early 2000s :)


This is really cool!

Typed from my KDE desktop, with wobbly windows and desktop cube effects. Desktop computing should be fun.


Love the flashback to the old Compiz days! The only thing missing now is wobbly windows.



The GNOME Foundation needs to adopt this.


I miss the fad of Compiz effects. Sure they were silly, but they added a little fun to the desktop. I never really got into desktop themes, preferring something plainer and smaller (because screen real estate was still at a premium back then), but wobbly windows and closing effects largely didn't take much away from usability while still adding a little personality.


You can still use compiz today with MATE or XFCE


Thanks, I wasn't aware of that.


it is not missing: https://extensions.gnome.org/extension/3210/compiz-windows-e.... My eldest son has it running on his account, and I predict that later today those wobbly windows will also burn... EDIT: correct extension


wanted to make the same comment :). they were so good


haha, ditto, wobbly windows were awesome, and the geared cube


I love Linux because it creates a space for stuff like this to take place. That said, i3 is enough glitz for me (and it's pretty much none).


We all go through phases. I was on Cinnamon for the past six years and see myself returning to GNOME. I find myself wanting less cruft out of a DE as I get older. And more keyboard friendly too.


Also on Gnome for the minimalistic experience, but wobbly windows have a physicality that just clicks.


Somewhat ironically, now hosted on a Microsoft-owned closed platform.


Fortunately, Git itself is "open" and the source code can be migrated to another host without much difficulty. Migrating bug/issue tracking, PR management, and CI will be more difficult, but not impossible.

That said, I don't quite understand why no viable alternative has arisen.

Gitlab was a good attempt, but its interface turned out to be kind of clunky and more "team-oriented" than makes sense for general open source projects. I strongly believe that if it had a "slick" interface like Github, it'd be more popular.

Sourcehut is fantastic, but lacks the same "issues" and "pull requests" system.

Mailing lists honestly kind of suck, if only because there's zero semantic markup in email (excluding HTML-in-email which is a clusterfuck that nobody should use), making it difficult to track comment replies, embed code blocks, etc. And submitting patches over email is a chore compared to making a PR, viewing diffs, etc. on a platform like Github.

Also the social networking features of Github are unobtrusive and fun. Following other users has introduced me to a variety of interesting projects, starring projects is a fun way to show support, and the ability to watch a repo for releases is useful (although I wish it were an RSS feed instead).


The emoji-commit-message spec absolutely drives me batty. Why. Just why.


Youth maybe?

Worse though is the dependence on the plain ASCII codes, limiting their utility (if you even call them that) to GitHub's display. On a terminal (at least where I'm likely to use "git log"), it's just a bunch of dumb ASCII codes taking up line space.

I don't really get it. Why not just use the real emoji? At least it'll display properly outside of GitHub.


I liked the idea, but I agree with you that they should just have used emojis.


I wonder if it's possible to do things like this with Windows or MacOS. I love the idea of cool effects like this (even if the first thing I did after OSX introduced the genie effect was to turn it off—nowadays the hardware is fast enough for it to not be annoying and the split second of the window shrinking away is a nice visual cue as to what's happened especially if one accidentally hides the window via cmd-H).


Windows Terminal has a neat shader thing that you can use to add things like noise and scanlines to your terminals.

I wish all desktop windows could have shaders applied.

I'm trying to convince myself to write one for a curved CRT look and one for phosphor persistence.

https://github.com/microsoft/terminal/blob/main/samples/Pixe...

The Windows Terminal team has a lot more fun than the others, it seems.


> The Windows Terminal team has a lot more fun than the others, it seems.

Instead of having fun, they should be performing doctoral research to improve performance. (https://github.com/microsoft/terminal/issues/10362#issuecomm...)


If the person thinks it's that simple, they could offer a patch. I don't think it's simple and I won't.


After that post they did implement a full reference implementation:

https://github.com/cmuratori/refterm/commits/main

And there is movement in getting changes into the terminal itself:

https://github.com/microsoft/terminal/issues/10461


A while back I made a goofy app called Appstagram that applied Instagram-like filters to the windows of your desktop applications (https://github.com/aleffert/appstagram), but Apple continually made it more difficult to inject code into every process (even after having been granted permissions by the user) and I eventually gave up.


macOS has gotten a lot less fun, starting with X. I remember no end of UI-customization utilities for pre-X Macs, some of which were really powerful, like Kaleidoscope. I loved being able to make the system look like an NeXT box (there was even an Irix theme) or design my own UI entirely. Even Apple briefly considered the idea of building theme support right into the OS.

But they didn’t, and the few quirky things that OS X did, like the puff-of-smoke effect, have been quietly removed. I hate how sterile Apple’s products have gotten. Sure, they’re beautiful, but they don’t have the kind of character the old ones did.

I wish something like Kaleidoscope (or Burn My Windows) existed for Macs.


I remember one, "Out of Context Menus" that allowed things like adjusting the vertical and horizontal settings of windows, as well as applying a gaussian blur to them.

You can read more about Eric Trout's extension here:

https://tidbits.com/1999/07/12/the-machack-hack-contest-1999...


On the flip side, I remember in the late 80s that pretty much every Mac disk had a virus on it thanks to fairly loose OS security in those days. I'm guessing that no more talking moose is the price we pay for not having to worry about getting a virus anytime something is put in the computer.


Heh, I remember reading about a few of the infamous ones. Most of them seemed generally harmless; I never got one, although if I’d gotten more software from MUGs and mixtape-style floppies, I might have.

You are definitely correct about loose OS security. It was possible to do a lot of damage with just a HyperCard stack, and those were pretty widely distributed.


Stardock's WindowFX is probably the closest you're going to get. It's old, was never all that stable and never got super popular (compared to WindowBlinds), but it should work.

https://www.stardock.com/products/windowfx/


Went down the comments looking for a Windows version. No luck. :(


I still remember the feeling when I managed to run the cube desktop with compiz on ubuntu 06.


Excellent. Longtime Gnome desktop user, former Compiz user...I had no idea this even existed. You better believe I now have burning windows :), despite how childish it may seem.


compiz effects are probably one of the main reasons why Linux interested me so much growing up, and why I now work in tech!


Boy do I miss how magical graphics used to seem back in the day. I remember one GPU I bought came with a 3D "computer house" simulation where all your files were in drawers, the control panel items were in the utility closet, etc. It was the most incredible thing I'd ever seen.


Hmm nice work but I always hated the way Compiz had so many effects just for the sake of it :) The wobbly windows, the fire.. It was cool for 2 minutes and then annoying. At least to me. I'm surprised so many people thought the wobbly windows added a real feel to the desktop. I never really had that experience. But it's good that it's an option.

I preferred Apple's animations which like the 'genie' one have a functional purpose too: they show where a minimised window is going.

For me, the perfect animation is extremely quick so it doesn't make the desktop feel slower, but still just noticeable enough to make it feel sophisticated. And it should have a function, not just for show.


Really cool.

SimulaVR (simulavr.com) is interested in potentially adding compiz effects like this into its VR window manager. There are some really cool ones you can do in VR, but I won't say more. =]


>You should also start your commit message with one applicable emoji

I don't think many people will contribute with a rule like that. Project is neat though.


I was thinking the exact opposite, actually. I like how easily you can get a feel for the latest commits just by seeing the emoji, which is excellent! More people should enforce rules like this.


Simon's a top notch interactive graphical user interface designer and programmer. Not just static pictures, not just fixed animations, not just functional code, but rich interactive animated feedback that's actually useful and helps you complete your task while it's also beautiful. When you can design and program stuff like this all on your own and give it away for free, then you can make up any rules you want about commit messages. Look what else he can do with icons and emojis:

https://schneegans.github.io/news/2021/12/02/flypie10

>More Fly-Pie Updates!

>In the last couple of months several new versions of Fly-Pie have been released. In this post, I want to highlight the major new feature.

>New features were added in version 8 and version 10. The versions 9 and 11 were released as well, but they contain bug fixes only. Here are two trailers to celebrate the respective releases:

Fly-Pie 8: New default dark theme and support for GNOME 3.36, 3.38, 40, and 41!

https://www.youtube.com/watch?v=j9t7hfkE_5w

Fly-Pie 10: A new Clipboard Menu, proper touch support & much more!

https://www.youtube.com/watch?v=BGXtckqhEIk


Gitmoji - Yay or Nay? 2019, 220 comments, https://news.ycombinator.com/item?id=21760021

If you'd like more thoughts on the matter.

(and the original article now lives here: https://www.bekk.christmas/post/2019/11/gitmoji-yay-or-nay )


I may start doing that for my own projects.


Interestingly, this diverges from the more common gitmoji rules: https://gitmoji.dev/


>I don't think many people will contribute with a rule like that

Why do you think that is?


It increases the barrier to entry with something that is really arbitrary and not easy to remember unless you have a cheat sheet in front of you.

I have lots of commit rules in all my projects but they're simple, straightforward, and easy to remember because they're useful and commonplace. eg "Short one-line commit message, more details in the paragraph beneath it, atomic commits with single change per commit, no individual commit breaks the tests".


If you can't remember that the "silly" project has a "silly rule" in place, and you won't remember any time you look at "git log" results, then I really don't know what to tell you.


That's simple enough: have a text file with the emoji saved somewhere handy, or use a text macro expander to replace :colon_style_markup: with real emoji. If you don't already have an emoji input widget on hand, that is.


Because emoji are tacky. You want to use them in your commit message? Fine. But if I were ever to contribute to a project that enforces such a rule I would start every commit with the middle finger emoji.


> But if I were ever to contribute to a project that enforces such a rule I would start every commit with the middle finger emoji.

Ah yes, the Kid Rock aesthetic. Much less tacky.


It's his project. He can be tacky, if he wants to. If you don't like it, fork it.

I don't like emojis, but I have even less regard for how boring the internet has become.


I don't think it's out of place on a project about cool window effects which many would also consider tacky.


Isn't tackiness heavily context, culture, and timeframe dependent? Besides, computing is far too serious these days. I see no reason for my computing not to be a little bit whimsical (especially if it's a hobby project), provided it's also self-consistent.


> Isn't tackiness heavily context, culture, and timeframe dependent?

All the more reason to not require them?


I believe the opposite: that there is sometimes the right place, the right project/people, and the right time to be whimsical. So not always, but also not never (as an abolitionist stance would see it).


Well I do have to admit that in this particular instance it is just as silly as the project itself, so it does fit.


I don't even know how to type an emoji on my computer


They're not actually emoji; they're shortcodes that get rendered out to emoji in the Github interface. (Which, IMO, is worse than actually using emoji, but easier to type, I suppose.)

    935e922 (HEAD -> main, origin/main, origin/HEAD) :tada: Bump version number
    aeec220 (tag: v7) Merge branch 'feature/3d-noise'
    17fae26 :lipstick: Tweak labels
    a331736 :twisted_rightwards_arrows: Merge pull request #21 from Schneegans/feature/3d-noise


On Windows, just press Win+; to open the emoji picker.


there's lessons on coursera and edx


Compiz has been my daily driver for the past 10 years, I highly recommend it to everyone (make sure you use CompizConfig Settings Manager to configure it).

If you do try it out, please _please_ ignore all the gimmicky effects (yes, including burning windows)- they're a distraction from the pure utilitarianism that compiz offers.


Anyone happen to know what gnome shell version added the ability for extensions to run arbitrary glsl shaders like this?


I love the things you can do with Linux! GNOME is an exceptional desktop environment; I don't see any other DE letting you plug directly into its environment to change every aspect of it. I just hope they are not going to make it more limited with a bunch of restricted portal access (like Firefox).


This would be great when doing live demos during a talk at some conference. People would be entertained by these effects, even if they are just for show and 'purely for aesthetic purposes'.


Wow, that's really cool. I think I am going to use the Matrix shader on my private workstation. Will do a code check before I install it on my work laptop tho.

TIL: Gnome Extensions can be written in JavaScript.
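For anyone else curious, the skeleton is tiny. Here's a minimal sketch, assuming the pre-GNOME-45 conventions this era of Shell uses (an extension is just a directory under ~/.local/share/gnome-shell/extensions/ containing a metadata.json plus an extension.js defining three functions):

    // extension.js -- GNOME Shell loads this from the extension's directory;
    // a metadata.json declaring the uuid and supported shell versions sits beside it.

    function init() {
        // Runs once when the extension is first loaded; do no real work here.
    }

    function enable() {
        // Runs when the user switches the extension on:
        // patch the shell, add actors, connect signals, etc.
    }

    function disable() {
        // Runs when it's switched off; undo everything done in enable().
    }

Everything interesting lives in enable()/disable(), which is also why a misbehaving extension can be neutralized just by toggling it off.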


The Gnome desktop shell is written in javascript. (or at least it was in 3.0, although I don't expect it to have changed)


Only small parts of it.


I need this for MacOS.


I was thinking this would be nice on win10 too


Is there anything like this that works in KDE Plasma?


So. Many. Emoji's. On. The. GitHub.


The matrix one is nice; maybe a star trek teleporter effect would be a nice addition?


they've got emojiquette!


dumb question: is there any way to get similar effects on a Mac? are there third-party tools that can change macOS appearance so dramatically?


So many neat things going on with Linux desktops. It’s unfortunate high DPI/mixed DPI support lags so far behind Windows and Mac. This has essentially killed Linux as a daily driver for me.


Made in JavaScript HAHAHAHAH


Very neat. Works perfectly so far.


would like to try it but I never close windows :/


have to add this to my wife's pc ...


I love Simon's work and artistic sensibility! I posted this earlier about his amazing work with pie menus for Gnome.

He's done even more amazing pie menu stuff since then, including Fly-Pie -- why don't all web browsers and window managers support this yet??? This stuff is extremely useful, practical, easy to use, and deeply customizable, not just beautiful window dressing, eye candy, and fancy effects.

More Fly-Pie Updates!

https://schneegans.github.io/news/2021/12/02/flypie10

Fly-Pie 8: New default dark theme and support for GNOME 3.36, 3.38, 40, and 41!

https://www.youtube.com/watch?v=j9t7hfkE_5w

Fly-Pie 10: A new Clipboard Menu, proper touch support & much more!

https://www.youtube.com/watch?v=BGXtckqhEIk

Pie Menus: A 30 Year Retrospective

https://donhopkins.medium.com/pie-menus-936fed383ff1#ed08

>Spectacular Example: Simon Schneegans’ Gnome-Pie, the slick application launcher for Linux

>I can’t understate how much I like Simon Schneegans’ Gnome-Pie, as well as his bachelor thesis work on the Coral-Menu and the Trace-Menu. Not only is it all slick, beautiful, and elegantly animated, but it’s properly well designed in all the important ways that make it Fitts’s Law Friendly and easy to use, and totally deeply customizable by normal users! It’s a spectacularly useful tour-de-force that Linux desktop users can personalize to their heart’s content.

https://news.ycombinator.com/item?id=17098179

Pie Menus: A 30-Year Retrospective: Take a Look and Feel Free (medium.com/donhopkins)

https://news.ycombinator.com/item?id=17106453

DonHopkins on May 19, 2018 | parent | context | favorite | on: Pie Menus: A 30-Year Retrospective: Take a Look an...

I'm very impressed by Simon Schneegans' work on Gnome-Pie:

http://simmesimme.github.io/gnome-pie.html

And especially his delightful thesis work:

Trace-Menu:

https://vimeo.com/51073078

I really love how the little nubs preview the structure of the sub-menus, and how you can roll back to the parent menu because it reserves a slice in the sub-menu to go back, so you don't need to use another mouse button or shift key to browse the menus.

Coral-Menu:

https://vimeo.com/51072812

That looks like a nice visual representation with a way to easily browse all around the tree, into and out of the submenus without clicking! I can't tell from the video if it's based on a click or a timeout. But it looks like it supports browsing and reselection and correcting errors pretty well! (That would be something interesting to measure!)

There's another useful law related to Fitts's law that applies to situations like this, called Steering Law:

https://en.wikipedia.org/wiki/Steering_law

The steering law in human–computer interaction and ergonomics is a predictive model of human movement that describes the time required to navigate, or steer, through a 2-dimensional tunnel. The tunnel can be thought of as a path or trajectory on a plane that has an associated thickness or width, where the width can vary along the tunnel. The goal of a steering task is to navigate from one end of the tunnel to the other as quickly as possible, without touching the boundaries of the tunnel. A real-world example that approximates this task is driving a car down a road that may have twists and turns, where the car must navigate the road as quickly as possible without touching the sides of the road. The steering law predicts both the instantaneous speed at which we may navigate the tunnel, and the total time required to navigate the entire tunnel.

The steering law has been independently discovered and studied three times (Rashevsky, 1959; Drury, 1971; Accot and Zhai, 1997). Its most recent discovery has been within the human–computer interaction community, which has resulted in the most general mathematical formulation of the law.
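For reference, the general Accot–Zhai formulation is T = a + b * ∫ ds / W(s), where a and b are empirically fitted constants. Here's a minimal, hypothetical sketch in JS of evaluating it numerically -- the constants and sample data below are made up for illustration, not fitted values:

  // Steering law (Accot-Zhai general form): T = a + b * integral(ds / W(s))
  // The constants a and b are made-up placeholders, not fitted values.
  function steeringTime(samples, a = 0.1, b = 0.2) {
    // samples: [{ s, w }, ...] along the tunnel's center line, where s is
    // arc length and w is the tunnel width at that point.
    let integral = 0;
    for (let i = 1; i < samples.length; i++) {
      const ds = samples[i].s - samples[i - 1].s;
      const w = (samples[i].w + samples[i - 1].w) / 2; // midpoint width
      integral += ds / w;
    }
    return a + b * integral; // predicted steering time
  }

  // A straight tunnel of length 400 and constant width 20 reduces to the
  // classic T = a + b * (A / W):
  console.log(steeringTime([{ s: 0, w: 20 }, { s: 400, w: 20 }])); // 4.1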

Also here's some interesting stuff about incompatibility with Wayland, and rewriting Gnome-Pie as an extension to the Gnome shell:

http://simmesimme.github.io/news/2017/07/09/gnome-pie-071


This is cool, but the fact that it's written in JS tells me just about everything I need to know about gnome...

JS has its place; using it for systems programming isn't one of them, IMHO, since I prefer to keep the core of my computation stack slim and fast. I can almost forgive the Electron apps their piggishness, given the desire to build cross-platform, but Gnome? Yah, no thanks.


I don't care what language is used; my problem with Gnome extensions is that after I install them, one of these things will happen:

a) after a minor apt-get update the extension will stop working

b) the extension will leak memory and after a few days of uptime my desktop will be unusable

That's why I'm still running Cinnamon. Gnome extensions are a thing created to deflect the biggest criticisms of Gnome's questionable direction, yet they are second-class citizens and never really work well enough to be acceptable.


I find it interesting, since JavaScript in this case is only the "glue" language. The actual effects are GPU shaders: https://github.com/Schneegans/Burn-My-Windows/blob/main/src/... .. I had no idea this was possible.
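To give a flavor of that split: the JS side mostly just animates a uniform (say, a progress value going from 0 to 1) each frame, while the fragment shader decides every pixel's fate on the GPU. A rough CPU-side sketch in plain JS of what a burn-style shader conceptually computes per pixel -- hypothetical, not the extension's actual code:

  // Hypothetical per-pixel logic of a "burn" effect, sketched on the CPU.
  // In the real extension this kind of logic runs as a GLSL fragment shader
  // on the GPU; `progress` would be a uniform animated by the JS glue code.
  function burnPixel(x, y, progress) {
    // Cheap deterministic pseudo-noise standing in for a noise texture
    // (the fract(sin(...) * 43758.5453) trick common in GLSL demos).
    const noise = Math.abs(Math.sin(x * 12.9898 + y * 78.233) * 43758.5453) % 1;
    if (noise < progress) return { r: 0, g: 0, b: 0, a: 0 };          // burned away
    if (noise < progress + 0.05) return { r: 1, g: 0.4, b: 0, a: 1 }; // glowing edge
    return null; // keep the window's original pixel
  }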


IMHO, it's less about where the work is done (and yes, I'm aware that gjs tends to be used mostly as glue, same with KDE) and more about the fact that I don't want a big heavyweight garbage-collected language deciding to garbage collect and glitch some part of the system, or a JIT pass recompiling a bunch of code the first time I click something. I despise latency in human-computer interactions, and everyone whines about how it's worse on pretty much every common PC vs. older devices, yet they go and install hooks written in JITed/garbage-collected languages all over the system.

Having those hooks written in compiled languages is bad enough -- I found myself regularly cleaning loads of cruft out of the runas & Windows Explorer context menus because the click latency was noticeable -- and now not only can one plug in a ton of stuff, but it needs to thunk through to JS to do it (and I'm not picking particularly on JS; it would be just as bad in Java or Python or whatever other scripting language one chooses).

It's just a waste of cycles, and for the projects I work on, engineering time is "cheap". That applies to most systems programming, if one spends half a second considering that the code forms the foundation for hundreds of millions of devices, all burning energy and the time of their users.


Ah ha!

Thank you for the detail. The philosophy is that Gnome is NOT configurable, really -- it just does a VERY limited and consistent desktop thing. It doesn't even have icons on the desktop (by default).

I find that it is "easy" for most users -- there is really nothing there! If you want icons on the desktop, add an extension for that. And the idea of extensions is that they are small programs that are easy to manage. It is possible to turn them all off with a click (if they are getting in the way)!

I just counted -- I have 36 extensions on my Gnome 41. Note that icons on the desktop = extension, start menu for programs = extension. You can certainly start programs without a start menu -- that is the default "Gnome Way".

On the other hand -- being able to consistently customize is very nice. (I particularly like the "argos" extension, which makes it delightfully easy to add buttons, gather and display information, and more -- and as a bonus, it is fully compatible with the macOS BitBar plugin format.)
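For anyone curious, an Argos plugin is just an executable script whose stdout becomes the panel button and its dropdown, following the BitBar line format. A minimal hypothetical example, written in Node to match the theme of the thread:

  #!/usr/bin/env node
  // Hypothetical Argos plugin, saved as ~/.config/argos/load.1m.js and made
  // executable; the "1m" in the filename asks Argos to re-run it every minute.
  const load = require('os').loadavg()[0].toFixed(2);
  console.log(`load ${load}`);  // first line: text shown on the panel button
  console.log('---');           // separator: everything below is the dropdown
  console.log('Open Hacker News | href=https://news.ycombinator.com');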

Yes, I use a lot of extensions, but I do have 4 or 8 GB of RAM in my laptops, and i3 or better processors, so this is a reasonable fit for me.


JavaScript for desktop extensions makes sense -- it doesn't matter what processor you are running! Sure, it depends on the version of the desktop environment, and each extension declares which version(s) it works with. The major problem is that I may be running (just for example) GNOME 40 on one machine and GNOME 3.x on another, and then I really can't share the same home directory! It would be lovely if that could be worked out. It would also slightly simplify my backup strategies.

I don't see desktop extensions as the "core of the computation stack". Can you expand on that idea, please?


The JS usage in this case isn't any different from Python or Ruby or Perl or Tcl.


Pretty, but just more cycle-stealing inefficiency.

Using something like this boils down to whether you don't mind wasting cycles and RAM on pretty effects, or whether you just want consistently good performance.

For most of us, that depends on whether we have had the experience of struggling to maintain good GUI performance. Younger users will have no qualms about using something like this; they've always had plenty of GUI performance.

For those of us who remember being forced to use plain-color wallpapers because an image wallpaper took too long to repaint after moving a window on the desktop, it will remain one of those 'interesting to see, but not for me' effects.


This will use less than a couple percent of your GPU when it's "in action". My computer is so overpowered for what I do most of the time that this sort of thing is completely harmless. I'm pretty old myself, but this stuff is fun to play with. I usually don't mess with it for very long, as it gets boring after a few weeks, but it's not really wasting much energy or CPU.



