Hacker News | hyperbovine's comments

Uhm, plenty of phrases like this existed before 12 months ago.

Apparently, before ChatGPT, the English language was devoid of any occurrence of "It's not X, it's Y"!

printf() debugging is still considered a best practice in the eyes of many. I still remember being really surprised when I heard my famous (Turing award-winning) CS professor tell the class this for the first time.

https://tedspence.com/the-art-of-printf-debugging-7d5274d6af...


The thing about printf debugging is that it works universally. All languages, all platforms, all stacks. Even down to the lowest levels of most software, there will always be some sort of log available.

While some tools/frameworks might have more robust debugging tools, if you have a dynamic role within an organization and your target platform is constantly changing, you may not find it worth the effort to set them up.

One real-world example of this from my own work in PHP: there is a tool/lang-extension called Xdebug that is great and provides step-through debugging, but historically it has been a pain to configure. It's gotten better, but when I can just as easily add a few `dump()` statements that expose the same data, it's overkill. Very rarely do I need to actually pause the running request to debug something; 99% of the time I just want to know the state of that object within a specific `if()` block, and a debug log gets me that same information.
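For what it's worth, here's a minimal sketch of the pattern (the names and values are made up, and I'm using plain error_log()/print_r() instead of a framework `dump()` helper so it runs anywhere):

    <?php
    // Sketch of "dump inside the specific if() block" debugging.
    function applyDiscount(array $order): array
    {
        if ($order['total'] > 1000) {
            // The entire debugging setup: log the state right where I care about it.
            error_log('discount branch hit: ' . print_r($order, true));
            $order['total'] *= 0.9;
        }
        return $order;
    }

    applyDiscount(['id' => 42, 'total' => 1500.0]);

Delete the one log line when you're done. No debugger config, no IDE integration, and it behaves the same on whatever box the request actually runs on.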


Hakeem Olajuwon famously started playing basketball in college. He had some … other gifts tho.

I don't think that's true.

That’s not irony. Interesting, perhaps, but not ironic.

https://thereader.mitpress.mit.edu/what-irony-is-not/


Isn't it dramatic irony when we, the audience, know that the first sentence is counterproductive to the point being made by the author while the author isn't aware? Maybe it depends on how meta you want to be about considering the author of the article a character.

It's certainly ironic if an article about slop leads with a tired old glob of pseudoscience slop and the author doesn't realize.

I can't tell if your comment is being ironic or not.

Ironically enough, the comment is pretty straightforward to interpret.

Well played... 4k words

Sorry did that scroll past your little context window?

On the flip side, if the rumored AI crash-depression-apocalypse does in fact materialize, those things will become super cheap.


But no one has money anymore to buy them. :(


I don't understand why this comment is downvoted, it is undeniably true.


GNOME and KDE have stepped up with their design and user experience. I recommend you give them another try.


Often, when we don't understand something, asking questions helps us learn. Happy to answer any you might have, to help you understand.


Whereas on my laptop and my distro it works. And a lot of other people probably feel the same way. I use Linux at work and have never had issues with it in the last 6 years. Prior to that, yes.


Because for most of us, it's simply not true. It's at least as stable as MacOS, if not more so.


The word "stable" literally does not appear in the comment to which I was responding.

Maybe I'm just scarred from laboring much too hard in the 90s and aughts to get desktop and laptop Linux working, but here is my current take:

- Yes, there is fragmentation. Perhaps there are not hundreds of Linux distros but, off the top of my head: Debian, Ubuntu, Mint, Fedora, RHEL, CentOS, Rocky, Alma, Arch, Manjaro, openSUSE, Kali, PopOS, elementary OS, Zorin, Gentoo, Alpine, and NixOS are all viable options. Next, pick a desktop: GNOME, KDE Plasma, Xfce, LXQt, Cinnamon, MATE, Budgie, Pantheon, Deepin, Enlightenment. Each has different UX conventions, configuration systems, and integration quality. There is no single Linux desktop and it's bewildering.

- Power management now "works" in the sense that, when you close your laptop lid and re-open it, yay! the machine (mostly) comes back to life instead of just crashing. It took us at least 15 years to get to that point. However, PM does not work in the sense that battery life on my M4 MacBook Air is literally 2x what I would get from a comparably priced Linux laptop. Part of that is better hardware, but _a lot_ of that is better power management.

- Audio now mostly works without glitching, just like it did in OS X circa 2002. But God help you if you're not using a well-supported setup and find yourself manually having to dick around with kernel drivers, ALSA, and PulseAudio. (Just typing these words gives me PTSD.) Here is a typical "solution" from *within the past year* for audio troubles in Linux: https://www.linux.org/threads/troubleshooting-audio-problems.... There are thousands more threads like this to be found online. For typical, 99%-of-the-time use cases, experiences of this sort are rarely if ever encountered on a Mac.

- Printing is arguably the closest because, as previously noted, they are both using the same underlying system. But printing, thanks to AirPrint, is still smoother and more pain-free on Mac than on Linux.

- Don't even get me started on Bluetooth.

It's not that I'm anti-Linux; I wanted sooo bad for Linux on the desktop and laptop to succeed, for a variety of reasons. But Steve J came along 25-30 years ago and completely pulled that rug out from under us.


But hey, he also owns See's Candies.


I still don't understand why certain performance aspects of the CUDA platform are so poorly documented. Why is successfully pushing the hw to its performance envelope considered a novel research result? Shouldn't I be able to look this stuff up on the Nvidia website?


One reason is clearly the fast pace at which Nvidia is evolving the hardware. I would consider CUDA a very well documented platform in general. What they lack is low-level tutorials, but this is where posts like this one can be a good resource.


The bar is low.


> 1. Gaming cards are their R&D pipeline for data center cards. Lots of innovation came from gaming cards.

No way that is true any more. Five years ago, maybe.

https://www.reddit.com/r/pcmasterrace/comments/1izlt9w/nvidi...


On the contrary, this is the place they can try out new tech, new cores, new drivers, new everything with very little risk. Driver crash? The gamer will just restart their game. The AI workload will stall and cost a lot of money.

Basically, the gaming segment is the beta-test ground for the datacenter segment. And you have beta testers eager to pay high prices!

We see the same in CPUs, by the way, where the datacenter lineups of both Intel and AMD lag behind the consumer lineup. It gives them time to iron out BIOS, microcode, and optimizations.

