
Please elaborate.

Well, if paying by card nets out to $98 for $100 worth of X (after the $2 cash back), but you can buy the same thing with cash for $97, then that $2 reward cost you $3 up front. Aren't you actually funding 150% of the "cash back" with your own money? ¯\_(ಠ_ಠ)_/¯
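
A minimal sketch of that arithmetic, using the hypothetical numbers above (the 2% cash-back rate is my assumption; only the $98/$97 figures come from the comment):

    # Hypothetical purchase: $100 charged to a card earning 2% back,
    # versus a $97 cash price for the same thing.
    sticker_price = 100.00
    cash_back = 0.02 * sticker_price            # $2 returned by the card issuer
    cash_price = 97.00

    net_card_cost = sticker_price - cash_back   # $98 net on the card
    extra_paid = sticker_price - cash_price     # $3 more left your pocket than paying cash

    # The $2 "reward" cost $3 up front: you funded 150% of it yourself.
    print(extra_paid / cash_back)               # 1.5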

> but you can buy the same thing with cash for $97

Merchants rarely offer cash discounts in the US.


Except at nearly every restaurant I frequent. Example menu below with cash and credit prices; the credit prices are 4% higher. I see this commonly.

https://misslucyskitchen.com/menu


Even as somebody who really dislikes the current interchange fees in the US, I find a 4% surcharge to be a money grab on the merchant's side that is hard to empathize with.

Even if the merchant pays the sticker price for card acceptance, it's usually just below 3%, unless international cards are involved. Add to that the fact that cash transactions in restaurants are often accounted for in "more tax efficient ways", and it feels even more icky.


This is interesting to me, because I see this kind of comment on almost every Zed post.

I haven't used a low-DPI monitor in... I'm not sure, more than a decade probably. So for me the weird blocker with Zed is the "OMG YOU HAVE NO GPU!!!! THIS WILL NOT END WELL!" warning (I run a lot of Incus containers via RDP, and they mostly have no GPU available).

But what kind of monitors are you low-DPI people using? Some kind of classic Sony Trinitron CRTs, or what? I'm actually curious. Or is it not the display itself, but some kind of OS thing?


Depending on the definition, I'm not a low-DPI user myself, but in my friend group I seem to be the only person who cares about >160 dpi; lots of people are using 1440p displays or >34" 4K displays. In Apple's mind, high DPI (e.g. Retina) is >218 dpi, so my lowly 34" 5120x2160 doesn't count for them. But it is >160, which is my personal threshold for high DPI.

There aren't all that many >20" displays on the market that meet Apple's definition of high dpi, and not a ton more that meet my much looser definition.


I was imagining 72dpi like old CRT monitors, but yeah, I'd agree that 160 feels like a fair threshold.

I have a 4-5 year old ultra wide monitor which is a lot of pixels but low dpi. I really like the single monitor containing two screens worth of pixels, but I wish it was high dpi. At the time there weren’t really high dpi ultra wides available, and they’re still expensive enough that upgrading isn’t a high priority for me… but I’m sure I will at some point.

Mine is 2560x1440, which is a pretty nice "sweet spot" size. A comparable 5K to 6K display still commands a substantial price, and, given that I work at two locations, would mean buying two of them. The screen I currently use (a 3:2 BenQ) also has some amount of subsampling going on: running it at 2x ("Retina native HiDPI"), all the UI controls are too damn big and there isn't enough space, while running it at 1x (everything teeny-tiny) is just not good for my eyesight and not very workable. And, again, with Zed it bumps into the same broken antialiasing rasterizer they have.

And it is not an OS thing. The OS renders subpixel antialiased fonts just fine. But Zed uses its own font rasterizer, and it completely falters when faced with a "standard passable resolution" screen - the letters become mushy, as if they have been blurred - and rather sloppily at that.


It depends on your OS.

Linux and Windows are significantly better for both 1440p and 4K monitors. Both have subpixel rendering and configurable font hinting for 1440p, and both have fractional scaling UIs for 4K. macOS, on the other hand, only really looks acceptable on a 5K monitor.


When people say things like "mine is 2560x1440" on HN, are they talking about the Mac scaled resolution? I feel like some context is always missing from resolution discussions, and it's a topic non-technical people can weigh in on as well.

2560x1440 is QHD, which is kind of a happy medium: high-resolution enough to look really sharp, but not so high that you have to scale it up like Macs do on Retina displays. Having had Retina Macs (and been very happy with them) since they came out, I've been using 16" and 17" QHD panels on my Linux laptops for about five years... and they are actually just fine.

I actually don't understand what I'm missing. I'm using two old monitors, a 27" at 2560x1440 and a 23.5" at 1920x1080 (in addition to my high DPI Framework 13 screen). How else can I get at least 4480 across (after scaling to a font size I can read - I'm 49) and still cover that many inches? My DPI right now is about 100, so to double that, wouldn't I need 8960 across 44 inches? I don't really want to pay $1500 for resolution my eyes are probably too old to notice.
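
For what it's worth, PPI falls straight out of the resolution and the diagonal; a quick sketch using the panel sizes from this thread (the 5K 27" line is just for comparison):

    import math

    def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
        """Pixels per inch given a panel's resolution and diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(2560, 1440, 27.0)))   # ~109 (the 27" QHD above)
    print(round(ppi(1920, 1080, 23.5)))   # ~94  (the 23.5" 1080p above)
    print(round(ppi(5120, 2880, 27.0)))   # ~218 (a 5K 27", for comparison)

    # Doubling ~100 ppi over the same ~44" of combined width does indeed
    # take roughly twice the horizontal pixels, i.e. on the order of 8960 px.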

It's okay; eyes are just different. I personally enjoy 220 DPI, but 60 Hz looks absolutely fine to me. However, at the workplace enough people complain about 60 Hz that all the monitors at work are 120 Hz. I don't notice any additional smoothness at all, so it's all wasted on me.

Regular HD LCD monitors are typical office fare.

> But what kind of monitors are you low-DPI people using?

They’re still pretty common in enterprise. So cheap. At this point most desks probably cost more than the PCs on top of them.

TBF, enterprise probably still has to deal with ancient apps that can’t handle higher resolution well.


Typical DPIs are still all over the place depending on the demographic. Macs have been ~200dpi forever, while cheap PCs are still mostly ~100dpi, and decent PC setups tend to land somewhere in the middle with ~150dpi displays which are pretty dense but not up to Mac Retina standards. Gamers also strongly favor that middle-ground because the ultra-dense Mac-style panels tend to be limited to 60hz.

Zed started out as a Mac-only app, and that's reflected in the way their font rendering works.


I guess that makes sense. I'm a 280ppi convert, so I judge Mac users with pity — Linux and Windows work perfectly with my 31.5" 8K display (from fuckin' 2017 btw...) but Macs can only drive it at 6K, which adds a fuzz factor.

Unless you use it at 4K, but macOS isn't really usable that way (everything way too small).

But yeah, it's 60Hz. Which has sucked ever since I accidentally got a 120Hz display, so now 60 Hz looks like 30Hz used to...

    Monitor                              Resolution       PPI
    ─────────────────────────────────────────────────────────
    31.5" 8K                             7680 × 4320      280
    27" 5K                               5120 × 2880      218
    31.5" 6K                             5760 × 3240      210
    23" 4K                               3840 × 2160      192
    27" 4K                               3840 × 2160      163
    34" 5K ultrawide                     5120 × 2160      163
    31.5" 4K                             3840 × 2160      140
    39.7" 5K ultrawide                   5120 × 2160      140
    44.5" 5K ultrawide (LG 45GX950A-B)   5120 × 2160      125
P.S.

I had a chance to try that LG 45GX950A-B at Yodobashi Camera in Akihabara the other day, and... that measly 125ppi might overperform at the distance you have to put it at. But then again my 50-year-old eyeballs are starting to be like "anyway you need your glasses bro" so... YMMV


There are actually 218ppi 180hz panels coming soon, although they will no doubt cost a kidney for the first few years.

https://rog.asus.com/monitors/27-to-31-5-inches/rog-strix-5k...


Looks good overall, but:

> USB-C with 15W power delivery for maximum compatibility

I am hoping that is a typo.


Yeah, that's... not great if correct. LG's version has 90W PD.

https://tftcentral.co.uk/news/lg-27gm950b-5k-monitor-announc...


What does that mean? If the monitor only requires 15W to operate, that's a good thing, right? Unless monitors are expected to use less than that? I'm not familiar with reading monitor spec sheets.

To add on to what jsheard said, for this feature to be usable (i.e., charging your laptop just by plugging in the monitor), you need this number to be roughly what your laptop's charger supplies. At 15W, even a MacBook Air would slowly run out of power while plugged into this monitor, assuming you don't plug a second cable into your laptop. 65W or 90W is a much more normal value for a feature like this.
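
A rough back-of-the-envelope for this, with the laptop draw being an illustrative assumption rather than a measured figure:

    # USB-C power delivery budget: what the monitor supplies minus what the laptop draws.
    monitor_pd_watts = 15    # the figure from the spec sheet above
    laptop_draw_watts = 30   # assumed light-use average for a thin-and-light laptop

    net_watts = monitor_pd_watts - laptop_draw_watts
    print(net_watts)         # -15: the battery slowly drains even while "charging"

    # With a 65-90 W supply (common for USB-C monitors and docks), the same
    # laptop would charge with headroom to spare.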

That all makes sense. The only thing I was missing was that this refers to power output. It seems like kind of a niche and tenuous value-add for a monitor. Why would I want to get power from my monitor?

> Why would I want to get power from my monitor?

Both at work and at home, I can connect my laptop to my monitor with a single cable. That single cable charges my laptop, carries the display signal, and passes through a USB hub built into the monitor that connects my keyboard and webcam. It's _incredibly_ convenient. It's also just a lot less cabling. You can think of it like a dock built into the monitor for free.

> It seems like kind of a niche

Different workflows/circles. It's not something you're likely to use with a desktop, mainly with a laptop. It also really only works well if you use thunderbolt. It's reasonably common but probably not a majority where I work, where 90% of dev machines are macs.


Charging and video on the same cable is super cool. I take it all back.

That's output power. USB-C monitors can charge your laptop while it's connected through a single cable.

My 42.5" 4k LG monitor is 104 DPI being Low by macOS conventions.

There it is.


effect change


LOL, that's crazy, but I would also recommend that any user of these devices who doesn't have an actual medical issue requiring the device to work (for whatever value of "work" it is capable of) maybe consider doing that. Just like I put my modern TV in VLAN purgatory.

But if you actually have any form of diabetes... definitely do not do that. Unless you are also rocking some other brand. ¯\_(ಠ_ಠ)_/¯


I tried so hard to make Codex work, after the glowing reviews (not just from Internet randos/potential-shills, though; people I know well, also).

It's objectively worse for me on every possible axis than Claude Code. I even wondered if maybe I was on some kind of shadow-ban nerf-list for making fun of Sam Altman's WWDC outfit in a tweet 20 years ago. (^_^)

I don't love Claude's over-exuberant personality, and prefer Codex's terse (arguably sullen) responses.

But they both fuck up often (as they all do), and unlike Claude Code (Opus, always), Codex has been net-negative for me. I'm not speed-sensitive (I round-robin among a bunch of sessions), so I use the max thinking option at all times, but Codex 5.1 and 5.2 for me just produce worse code, and worse than that, are worse at code review, to the point that it negated whatever gains I had gotten from it.

While all of them miss a ton of stuff (of course), and LLM code review just really isn't good unless the PR is tiny — Claude just misses stuff (fine; expected), while Codex comes up with plausible edge-case database query concurrency bugs that I have to look at, and squint at, and then think hmm fuck and manually google with kagi.com for 30 minutes (LIKE AN ANIMAL) only to conclude yeah, not true, you're hallucinating bud, to which Codex is just like, "Noted; you are correct. If you want, I can add a comment to that effect, to avoid confusion in future."

So for me, head-to-head, Claude murders Codex — and yet I know that isn't true for everybody, so it's weird.

What I do like Codex for is reviewing Claude's work (and of course I have all of them review my own work, why not?). Even there, though, Codex sometimes flags nonexistent bugs in Claude's code — less annoying, though, since I just let them duke it out, writing tests that prove it one way or the other, and don't have to manually get involved.


LOL this is beyond idiotic. Banning AI-generated assets from being used in the game is a red line we could at least debate.

But banning using AI at all while developing the game is... obviously insane on its face. It's literally equivalent to saying "you may not use Photoshop while developing your game" or "you may not use VS Code or Zed or Cursor or Windsurf or Jetbrains while developing your game" or "you may not have a smartphone while developing your game".


> you may not use VS Code or Zed or Cursor or Windsurf or Jetbrains while developing your game

Just to nitpick, AAA game developers probably don't use the editors you mentioned since they do native applications.


What do you think they use?

I work in mobile games and practically everyone is using either VSCode or some Jetbrains IDE. A few use Visual Studio but it has AI autocomplete too.


Oh, mobile. I was thinking of the AAA desktop titles. I'm too poor to play free-to-play games, so mobile titles are kinda off my radar.

Of course you'd use Jetbrains for Android...


Unity is by far the biggest platform for mobile games, and it's C# all the way, both for Android and iOS.

You need to crack open Xcode only for very specific debugging tasks.


Interesting. I don't do games, so I may be wrong, but I thought a lot of Unreal Engine devs used Jetbrains. So what editors do they use? Are there current IDEs or code editors shipping in 2025 that don't have any LLM-based coding features?


I could be wrong, but I'm not aware of any native debugging support except in Visual Studio... the native one.


For C#, Rider has had superior debugging tooling for years now.


OK, maybe my point got lost because I didn't know that, but I should have just added Visual Studio to my list — it too has LLM and agentic features, which was my point.

If you can't use LLMs to generate placeholder graphics that don't ship in the actual game, then why can you use coding editors that let you use LLMs to generate code?


If LLMs were simply a niche but somewhat useful technology people could choose to use or avoid, then sure, such an absolutist stance seems excessive. But this technology is being aggressively pushed into every aspect of our lives and integrated into society so deeply that it can't be avoided, and companies are pushing AI-first and AI-only strategies with the express goal of undermining and replacing artists (and eventually programmers) with low quality generic imitations of their work with models trained on stolen data.

To give even an inch under these circumstances seems like suicide. Every use of LLMs, however minor, is a concession to our destruction. It gives them money, it gives them power, it normalizes them and their influence.

I find the technology fascinating. I can think of numerous use cases I'd like to explore. It is useful and it can provide value. Unfortunately it's been deployed and weaponized against us in a way that makes it unacceptable under any circumstances. The tech bros and oligarchs have poisoned the well.


I mean, I share some of the concerns you expressed, but at the same time there is no chance at all that working programmers and artists won't be using LLMs (and whatever "AI" comes next).

I'm a programmer, and I enjoyed the sort of "craftsman" aspect of writing code, from the 1990s until... maybe last year. But it's over. Writing code manually is already the exception, not the rule. I am not an artist, and I also really do understand that artists have a more legitimate grievance (about stealing prior art) than we programmers do.

As a practical matter, though, that's irrelevant. I suspect being an "artist" working in games, movies, ads, etc will become much like coding already is: you produce some great work manually, as an example, and then tell the bots "Now do it like this ... all 100 of you."


It’s like banning any and all uses of chainsaws for any kind of work ever just because some bros juggle with them and have chainsaw juggling conventions.

It’s just a tool, but like any tool it can be used the right way or wrong way. We as a society are still learning which is which.


More like banning automated forest clearing power chainsaw robots because they mowed down all the forests killing most life on earth. That's a terrible analogy, but better than yours.


> To give even an inch under these circumstances seems like suicide.

Seems like a histrionic take.


I've been waiting so long for this DE. I'm a longtime KDE user who really doesn't like GNOME, but GNOME is literally the only DE on Linux that can do Windows/Mac-equivalent remote desktop on Wayland.

COSMIC isn't yet close to a solution for remote desktop, so I expect to have to wait a few more years. (T_T)


"npm and javascript in general shouldn't be used" — stipulated

