>If you or others say that it's fine, then you're suffering from Stockholm syndrome, having been taken in to the cult of Atom.
Can we please stop accusing each other of malicious intent? I can't believe there are people on both sides of this argument that can't see that the other could have a different experience.
I'm just starting to use it today on a large code base, and it's been fine. I only have the vim and go-plus plugins installed. I would guess plugins are what's slowing it down for most people.
Seems to me like you're suffering from "I can't replicate it therefore it does not exist and everybody else is either lying to me or a complete idiot."
You haven't actually called anyone out as a liar or an idiot, but your lack of belief in what they are communicating to you, and the tone (from what I can gather) of your responses, speak for themselves (whether or not that was your actual intention).
Linux is actually not half bad nowadays. It's not 2001 anymore.
I've migrated my main laptop from Windows 8 to Elementary OS, and the only complaint from my very non-technical SO is that things "are different" (e.g. the close button is in the upper left corner of the window).
Apart from that, everything just works. In fact, it all worked out of the box, with the notable exception of Skype.
Linux is turning into a pretty viable option. Really.
Of course there are. It is the forever-configurable Linux, after all. But since the discussion was about Apple products, having the close button on the left is a benefit ;-)
But really, these things are frustrating at first; you will learn the differences and get over them very quickly. Just ask anyone who switched from Windows to OS X.
I used it full-time, circa 2006-2009. I guess that counts as "the past decade".
1. There was no out-of-the-box support for my particular NVidia video card, so I had to build the driver from source and fiddle with my system to install it.
2. I had userspace processes freeze on network activity and become impossible to kill because of a kernel deadlock in my wireless driver.
3. Proper smooth fonts required work at the time (freetype did not ship with good subpixel hinting).
4. Proprietary media codecs did not work out of the box and required third-party repositories.
5. The whole KDE four-point-not-really-zero debacle.
A minimalistic XMonad+Vim+terminal setup was decent and worked for me, but eventually I got sick of desktop Linux and moved to OSX.
I realise that many things have been fixed in the meantime, but it was not a flawless experience back then. If I were to give desktop Linux another shot today, I'd be cautiously optimistic, but not expect a miracle.
As you said, many things have happened in the meantime, so you shouldn't judge it by your 2009 impressions. Also, please mention the distribution you used; there is no single "Linux experience", just as you wouldn't treat Windows 8 and Windows XP as a single brand of "Windows". On your particular comments:
1. Not an issue anymore in most recent distributions, and Nvidia is now supporting even the latest graphics cards on Linux.
2. and 3. I can't comment on those.
4. On Ubuntu and many other distros they now work out of the box.
Chinese-language fonts (especially when mixed with English fonts) and input methods. And I don't just mean the default fonts the OS comes with (though those are better on OS X by a wide margin), but the way font selection and fallback work for GUI elements that contain a mix of Chinese and English.
I've run both Linux and OS X as my default setups (currently Linux), and I have to say, OS X does a lot of things better. I like Linux better overall, but that's just too strong a statement. Here are a few things OS X does better:
-- Font anti-aliasing.
-- Package management - I know people using Linux are expected to be able to compile from source, but at least most packages released for Mac OS X are distributed as binaries.
-- Graphics card drivers, specifically Nvidia (which Macs ship with).
I've been using Linux and BSD for thirteen years and I can't even remember the last time I had to manually build and install an application from source (not counting random, early-stage hobby projects from github and things).
Open source Unix systems are the kings of package management. OS X pretty much does the exact opposite of 'package management', and any third party solution is based on existing Linux and BSD implementations.
Replying to this because I can't reply to the child comment.
This is something I keep hearing, and I keep linking to the same thing. "Download and One-Click Install" can easily be done on Linux. In fact, that's even what it is called: "One Click Install", at least as far as openSUSE is concerned.
Bonus: it still adds a repository, so you have a working update path that plays nicely with centralised package management and lets you manage every single package in one place, instead of half a gazillion individual updaters, bundled libs, etc.
"Package management" is the wrong term. OS X just makes installing software less terrifying.
Let's say you wanted to install Google Chrome. On OS X, you go to chrome.com and click the single "Download now" button, and it downloads.
On Linux, you get the same button, except clicking it takes you to another screen where you have to know whether you are a "Debian/Ubuntu" or a "Fedora/openSUSE" user or need a "community supported" version, and whether you want the 32 bit or 64 bit version, and whether you want to "add the Google repository to your system", and here's a thing to run on the command line if you don't want that.
The "sudo apt-get install yadayada" stuff is legitimately great for power users. But for most consumers, the OS X approach is definitely to be preferred.
OSX works out of the box because the hardware is known in advance.
If the hardware is supported by Linux, today's distros work out of the box too, and Linux supports a lot of different hardware components, especially older ones.
If you buy a new machine which comes with Linux pre-installed, like OSX is preinstalled on Macs, you can be pretty sure that it works just as "out of the box" as OSX does.
My experience is that when a UI feature is present in OS X and Linux, the OS X version will typically (though not always) be less buggy and more polished.
I've got Ubuntu 12 open in VMWare, running Firefox, so I can compare it to my Mac running Mountain Lion. I'll pick on window resizing. Here are ways that Ubuntu's window resizing is objectively worse than OS X's:
1. The resize edges on the right, left, and bottom are exactly one pixel in extent. They're nearly impossible to hit.
2. When resizing the Firefox window larger on the left/top sides, there's ugly flicker and transient drawing artifacts on the right/bottom side.
3. The resize cursors are misleading. For example, if I grab the top and resize the window as tall as it will go, the cursor still implies it can grow taller. On OS X the cursor changes when the window reaches a limit.
4. An Ubuntu Firefox window can be sized down to three pixels wide. It's hard to click on this window to make it bigger again.
5. Ubuntu allows me to position the window under the launcher and then resize it smaller. Such a window then gets "stuck" behind the launcher. I was also able to get an OS X window stuck behind the Dock, but it was harder: I had to make the Dock smaller, reposition the window, then make the Dock larger again.
6. If you position a window partially offscreen and then try to resize it smaller, it does this weird jitter thing.
7. Ubuntu does not appear to support background, fixed aspect ratio, or centered resizing, which are power-user features in OS X that you can access with modifier keys. If I hold down the shift key, then the Ubuntu window does some sort of snapping thing where it alternates between being as wide as it can, and being three pixels wide (??)
That's from a few minutes of poking around with a single UI interaction.
This isn't a fanboi post. While I like OS X, I noticed things that Ubuntu does better. But in turn, an objective Linux user cannot miss the many things that OS X does better.
I don't know the current state of high density display support in X11 (and I may just be an ignorant user and plain old wrong about this), but it was a totally broken deal breaker for me a year ago.
Cater to naive users. I say this as someone who particularly dislikes Apple's way of doing things, but who still recommends OSX to naive users.
Something goes wrong with linux? Better hope there's a congenial neckbeard around, and if it's a hardware problem, good luck to you. In OSX? A larger online presence, and then of course you can always take it into a "genius" bar... and they replace the hardware gratis often enough.
IME, Linux has a long way to go for anything beyond dual-head setups. I have four monitors with different geometries on two graphics cards on a Hackintosh and OS X handles it seamlessly. Ubuntu pukes hard and dies (or, almost as bad, just ignores monitors) when it can't figure out how it should handle them all in a single X session.
A single X session cannot span two graphics cards. X just doesn't support it. You're waiting on Mir or Wayland for that one.
If you manage to get all four monitors on one card somehow, X can handle the multiple geometries just fine. Though it will generally do a terrible job of automatically detecting the right resolution.
In my experience the state of graphics drivers is still a bit of a sad affair on Linux in general. I never got my two screen setup working properly, not out of the box, and not after fiddling with xorg.conf etc. for more than two days.
YAGNI is not about being lazy; it's about being smart about the code being written, about the craft.
Over-engineering and architecture astronauts hurt a system a lot more than someone following YAGNI does; making systems that are a big mess of hard-to-understand, hard-to-modify code is even worse. I've seen this happen again and again.
Architects sitting in ivory towers pump out Word documents which result in overly complicated systems: large, slow, and a waste of people's time and money. The whole idea of "oh, in 10 years we may want to handle this and that" adds unnecessary code which then needs to be tested and creates many more points of failure. And did I mention slow? These hairy-balled monstrosities are terribly slow.
If over-engineers got paid by the line of code, they'd be millionaires. If they got paid for writing elegant, short and easy to maintain code, then they wouldn't have two pennies to rub together -- the YAGNI developer would be sitting on his yacht.
I'm not so convinced that YAGNI generally leads so directly to "elegant, short, and easy to maintain code" as you say. Certainly, with a very keen sense of when to make the transition from Ain't to Are, it can lead to time-savings and simpler and more focused code, but with less discipline or tight deadlines or (typically) both, it can also lead to repetition and the complexity of maintaining many one-off features, even if each is relatively simple in itself. Something very similar can be said for a more architectural philosophy - with a very keen sense of when a design is liberating and when it is burdensome, it can lead to less repetition and targeted maintenance, but with less discipline it can lead to analysis-paralysis and over-engineering. Like most things, the best answer is somewhere in the middle.
My preference is YAGNI while keeping a sharp eye on things that are relatively hard to make extensible later and relatively easy to make extensible now. To tack on another not-great example to the not-great Dog example - if I find myself writing 'dog' a bunch of times in a bunch of ways, I may well pull that out into some sort of `type` variable even if I-ain't-gonna-need-it.
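To make that slightly more concrete, here's a minimal sketch (in Java, with made-up names like Kennel and DOG; the Dog example being riffed on could be in any language) of what pulling the literal out might look like while still stopping short of the hierarchy you ain't gonna need:

    // Sketch only: the repeated literal "dog" gets a single definition,
    // but no Animal class hierarchy is built "just in case".
    public class Kennel {
        // One place to change if "dog" ever needs to become an enum or a class.
        private static final String DOG = "dog";

        public static boolean isDog(String kind) {
            return DOG.equals(kind);
        }

        public static void main(String[] args) {
            System.out.println(isDog("dog")); // true
            System.out.println(isDog("cat")); // false
        }
    }

If a second kind of animal ever actually shows up, that one constant is an easy seam to grow into an enum or a real type; until then it stays boring.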
You're correct that YAGNI doesn't necessarily lead to "elegant, short, and easy to maintain code". That is, in the end, up to the developer's skill.
Well, I agree with you because you're agreeing with me, but I'm not sure your comment that I replied to agrees with you. From my reading, it definitely implies that YAGNI leads to good code and architecture astronautics doesn't, entirely ignoring developer skill. Perhaps you just oversold your point.
It would be much easier to discuss issues of design patterns and when to apply them if we weren't constantly being bombarded with straw men. Ivory towers and word documents? Come on now, this adds nothing to the discussion and yet this is basically the entirety of the argument against anticipatory structural code.
If you hang out on github, and do your social coding over there (or simply prefer git over Mercurial), here is a github fork of the repository: https://github.com/dbarros/dapper-dot-net
A Java developer complaining about Ruby's supposed verbosity? What's the world coming to?! When one looks at Java code, 90% of it is just syntactic sugar, which doesn't add much to the actual working code. One has to filter through large amounts of white noise to get to the crux of the code. Not so with Ruby (and many other languages).
This is exactly what the original article was saying. ;) The languages take very different mindsets though, so if someone is actually happy with Java, then it's very possible that they don't have (or want) the right mindset for using Ruby.
"public static final" is not particularly verbose. It's a one time upfront cost which defines a few flags. It's error free, since a typo in any of those keywords will be picked up.
It can also be automated easily etc.
But Ruby using 'end' etc. instead of curly braces? That's going to add up to a lot of verbosity.
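For the record, a hedged sketch of the kind of declaration in question (the flag names here are invented); the point being that mistyping any of the three keywords simply won't compile:

    // Invented flags purely for illustration. A typo in the modifiers
    // (e.g. "pubic static final") is a compile-time error, not a silent bug.
    public class FeatureFlags {
        public static final boolean VERBOSE_LOGGING = false;
        public static final int MAX_RETRIES = 3;

        public static void main(String[] args) {
            System.out.println(VERBOSE_LOGGING + " / " + MAX_RETRIES); // prints "false / 3"
        }
    }

Any decent IDE will also generate the whole line for you, which is presumably what "can be automated easily" is getting at.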