I’ve long suspected that a Linux desktop environment designed to closely mimic Windows 7 (with light modernization where it makes sense) would prove popular, and the existence of all this reinforces that idea. A rough facsimile can be built using KDE, Cinnamon, or XFCE, but many details will still be wrong (and aren’t practically fixable without forking), and I think that’s enough to prevent many users from considering Linux a viable alternative.
Windows Vista/7 screwed up file associations badly; you can do far less with them than you could under Windows 95. Under Windows 95, you could customize the right-click menu for any file type and add any of your favorite programs to it as extra entries. That's been gone since Vista, which added the buggy "Open With" submenu instead.
The worst part is that today, if you associate icon files with an icon editor, icons suddenly lose the ability to display themselves and instead turn into pictures of the associated application!
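The underlying mechanism is actually still in the registry: a file type's ProgID can carry shell\<verb>\command subkeys, which is exactly what the Windows 95 UI edited for you. A rough sketch in C, assuming the stock "txtfile" ProgID for .txt files and a made-up editor path:

    #include <windows.h>
    #include <stdio.h>
    #include <string.h>

    /* Adds a custom right-click verb for .txt files by writing
       HKEY_CLASSES_ROOT\txtfile\shell\Open in MyEditor\command.
       "txtfile" is the stock ProgID for .txt; the editor path is
       hypothetical. Run elevated, since writing HKCR here is
       machine-wide. */
    int main(void)
    {
        HKEY key;
        const char *cmd = "\"C:\\Tools\\MyEditor.exe\" \"%1\"";

        if (RegCreateKeyExA(HKEY_CLASSES_ROOT,
                            "txtfile\\shell\\Open in MyEditor\\command",
                            0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL)
            != ERROR_SUCCESS) {
            fprintf(stderr, "could not create key\n");
            return 1;
        }
        /* The default value of ...\command is the command line to run;
           %1 is replaced by the file that was right-clicked. */
        RegSetValueExA(key, NULL, 0, REG_SZ,
                       (const BYTE *)cmd, (DWORD)(strlen(cmd) + 1));
        RegCloseKey(key);
        return 0;
    }

The icon quirk has the same root: icon files display themselves because their type's DefaultIcon value is "%1", i.e. "use the file itself"; re-associating them swaps in a ProgID whose DefaultIcon points at the editor's executable instead.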
I am not claiming Compiz was first. It's just the one I still use, so it's the one that springs to mind. (My own laptops run Ubuntu with the Unity desktop.)
As far as I know, Apple invented the concept of a display compositor using 3D hardware. If anyone has prior art from before 2002, I'd love to know.
I am not sure I personally consider that closely enough related to count.
It often seems to me that even today Amiga fans are so passionate about the machine that they make rather exaggerated claims that do not really stand up.
For instance, in many places I have read the claim that AmigaOS was a microkernel OS, or closely related claims such as that it was the first widespread microkernel, the first GUI microkernel, and so on.
The point of a microkernel is that only the microkernel itself runs in kernel space (in x86 terms, ring 0), while the rest of the OS is divided into multiple "servers" which run in user space (again in x86 terms, ring 3). This in turn brings a problem: how to make communication between the microkernel and the servers fast. IPC is the big problem that microkernel OSes struggle with, and it has shaped the design of Mach, XNU, L4, seL4, etc.
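To make that cost concrete, here is a toy model in C: two processes that can only talk through the kernel, using a POSIX pipe as a stand-in for microkernel IPC. A pipe is not how real microkernels do IPC, but the shape of the overhead is the same: every byte is copied into kernel space and back out, plus a context switch each way, which is exactly what the L4 family was designed to minimize.

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/wait.h>

    /* Stand-in for microkernel IPC: a "client" process sends a request
       to a "server" process through a pipe. The payload is copied from
       the client's address space into the kernel and out again into
       the server's, because neither process can touch the other's
       memory. */
    int main(void)
    {
        int fd[2];
        if (pipe(fd) != 0)
            return 1;

        if (fork() == 0) {                    /* the "server" */
            char buf[64];
            ssize_t n = read(fd[0], buf, sizeof buf - 1);
            if (n > 0) {
                buf[n] = '\0';
                printf("server got: %s\n", buf);
            }
            _exit(0);
        }

        const char *msg = "open /etc/motd";   /* the "request" */
        write(fd[1], msg, strlen(msg));
        close(fd[1]);
        wait(NULL);
        return 0;
    }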
AmigaOS is small, but everything runs in ring 0. There is no such division, so there is no need for tricky performance-critical IPC, and all processes can read and write each other's RAM. That makes it (1) easy, (2) fast, and (3) not a microkernel.
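For contrast, this is what Exec's native message passing looks like: a message is just a struct in shared RAM, and PutMsg() links a pointer onto the receiving task's port. Nothing is copied and no protection boundary is crossed. A sketch against the classic exec.library calls (the port name "MYSERVICE" is made up for illustration):

    #include <exec/types.h>
    #include <exec/ports.h>
    #include <proto/exec.h>

    /* A message is an ordinary struct in shared RAM; the receiver
       reads it in place. Only the pointer moves between tasks. */
    struct MyMsg {
        struct Message msg;   /* required Exec message header */
        LONG payload;
    };

    int main(void)
    {
        struct MsgPort *reply = CreateMsgPort();
        struct MyMsg m = {{{0}}, 42};

        if (reply == NULL)
            return 20;

        m.msg.mn_ReplyPort = reply;
        m.msg.mn_Length = sizeof(m);

        Forbid();                          /* port lookup must be atomic */
        struct MsgPort *port = FindPort("MYSERVICE");  /* hypothetical */
        if (port != NULL)
            PutMsg(port, &m.msg);          /* queues the POINTER, no copy */
        Permit();

        if (port != NULL) {
            WaitPort(reply);               /* sleep until server replies */
            GetMsg(reply);
        }
        DeleteMsgPort(reply);
        return 0;
    }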
I would regard direct blitting into 2D windows as not unique, not the first such implementation, and not the same as 3D compositing.
In AmigaOS, everything runs in unprivileged mode, except for some specific critical code within exec.library which runs in supervisor mode or interrupt mode.
What's true is that exec.library does offer a call to run code as supervisor[0], and that there's no memory protection.
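The call in question is Supervisor(): you pass it a routine, which Exec runs in supervisor mode; the routine must return with an RTE rather than an RTS, so in practice it is a small assembly stub. A rough, not-quite-self-contained sketch (ReadVBR is the assembly stub sketched in the comment, reading the 68010+ vector base register, a privileged instruction):

    #include <exec/types.h>
    #include <proto/exec.h>

    /* Supervisor-mode routine: must be assembly because it ends in
       RTE. Roughly:
           _ReadVBR:  movec vbr,d0    ; privileged on 68010+
                      rte
    */
    extern ULONG ReadVBR(void);

    int main(void)
    {
        /* Supervisor() switches to supervisor mode and jumps to the
           routine; its RTE drops back to user mode with the result
           in D0. */
        ULONG vbr = Supervisor((ULONG (*)())ReadVBR);
        (void)vbr;
        return 0;
    }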
FWIW I am also linking to AmigaOS 4, MorphOS and AROS...
I wish "Notch" Persson or some other billionaire nerd would just buy the things and declare the whole lot freeware. There surely cannot be much residual value to extract any more.
Maybe, but there’s no reason why a thoughtfully engineered Win7-clone DE on a lightweight Linux couldn’t run just as well as, or better than, the real thing on the same hardware.
Your opinion. Win 7 was the best Windows UI, in both looks (Aero) and usability. It was abandoned primarily because Microsoft was chasing a hybrid desktop/tablet/mobile UI that ended up working poorly on every form factor. Win 11+ (or whatever the next version is called) would do well to return to an Aero-like UI.