
It's pretty playable on GeForce Now, for what it's worth. Still a bit laggy, but I was able to play for many hours without major issues... just the occasional annoying-but-livable stutter.



GeForce Now has been amazing for me as a Mac-only user.


Same.

I have an M2 Max, and GFN is much, much easier than trying to set something up with GPT (Game Porting Toolkit) and Whisky, and much faster & quieter too. An RTX 4080 running in their data center means no local heat and noise.


Yes, because you have no other options.


There are lots of options? GPT, Wine/CrossOver, Luna, Boosteroid, Shadow.tech... none of them run as well as GeForce Now, though. Or a dedicated gaming PC.


Does it support Steam mods yet? That's what kept me from using it to play CS1.


Doesn't really count, as it is not rendering on your machine... of course it's good there.


So? That's even better. Doesn't use my battery life or create noise & heat. Netflix isn't run on my machine either.


Sure, but then it does not have any relevance to the article.


It wasn't claimed to be relevant to the article; it was a reply to "it's a big bummer that I can't give this one a go" with a suggestion of how they could play it.


Wasn't supposed to challenge anything in the article. Just an option for those who are struggling to play it on their current hardware.

Even with GFN it's laggy and stutters. Totally agree with the article.


It just uses natural resources to outfit and power data center stuff to create heat and noise somewhere further away. Netflix... is fairly efficient, though being on-demand, perhaps much less so than broadcast TV.


Yes, but better that, in a shared environment with center-wide cooling and such, than each individual household needing to do that on their own.

Also way fewer cards needed this way, with users being able to share cards through the day instead of each needing their own.

Basically mainframes all over again :)


I should have been clearer, in retrospect. There are definitely efficiency advantages to be had from moving things to the cloud, and I'm not against people moving workloads to data centers in general. I meant to point out that moving this particular workload to a data center sort of sweeps under the rug that Cities: Skylines 2 itself is unacceptably resource-hungry for what it is. It should be neither the home user's problem nor Nvidia's problem that C:S2 was so poorly optimized.


Oh, yes, absolutely agreed. CS:2 is horribly optimized... no argument there. I found the article illuminating and don't disagree with any of its findings.

Was just trying to offer the GP a way to play the game that actually works :) (Yes, only by inefficiently throwing RTX 4080 power at it, and even then still struggling.)


> Yes, but better that, in a shared environment with center-wide cooling and such, than each individual household needing to do that on their own.

I don't think this actually tracks, unless the heat is actually being put to use. You don't need HVACs when you have the machines distributed.


Why don't you need distributed HVAC? A gaming desktop can use a kilowatt of power, which is what a small space heater might put out. It often will make the room uncomfortably hot. (Anecdotally, this is part of why I switched to Geforce Now, myself. My apartment got incredibly uncomfortable from the heat. It's an older unit with no air conditioning.)

In data centers, sometimes (not always, and perhaps not often?) the heat can be more efficiently handled through central heat exchangers, more efficient commercial HVACs, etc.


The vast majority of machines are not using a kilowatt and the vast majority of users are not running anything to deal with the heat. You're an outlier here.



