Mechanical sympathy. Rather than designing a game on a PC to take arbitrary advantage of modern tech and then trying to cram it down onto a more limited console platform, Nintendo asks, at design time, what the most interesting things are that would work perfectly within the constraints of the platform — and then does exactly that.
(And Nintendo engineers can have perfect knowledge of "the constraints of the platform", because 1. they built the platform; 2. it's the only platform they ever code for, never porting to anything else; and 3. for late-in-generation titles, they have been developing for it for years already, while also doing platform-SDK support for every third-party development studio.)
Oh, and besides that, because they design each platform from the start specifically to work well for the types of games they want to make. (This goes all the way back to the Famicom, which has hardware PPU registers that were clearly implemented specifically to make the launch-title port of Donkey Kong extremely easy to code.)
As someone who hates cutscenes, that was fine by me. Cutscenes rip you out of the game to present the story in a different mode, when often the same thing could be done using the game mechanics directly. Not to mention that, until quite recently, they also came with long loading times.
Even if you take cutscenes out of it, a cartridge simply offers almost 90% less space for a game - and cartridges were more expensive for the publisher to boot.
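For reference, the largest N64 cartridges topped out at 64 MB, against roughly 650 MB for a CD-ROM, which is where the ~90% figure comes from. A quick sanity check:

```python
# Largest N64 cartridge vs. a standard CD-ROM (approximate capacities).
cart_mb = 64   # biggest retail N64 carts (e.g. Resident Evil 2)
cd_mb = 650    # standard CD-ROM
print(f"{1 - cart_mb / cd_mb:.0%} less space")  # -> 90% less space
```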
There's no denying that there are great N64 games, but Nintendo crippled the console outside of first- and second-party (Rare) development. The same trend continued with the GameCube (despite it being more powerful than the PS2, the GameCube disc held about 25% of the capacity of a PS2 DVD).
If their portables hadn't dominated the market, the mid 90s to mid 00s would've been a terrible time for the company.
Of course Nintendo is doing great now, so you could say it's all a moot point.
I know that there’s been activity in the emulation scene around “extending” SNES emulators with features the SNES never had during its lifetime, like CD-quality audio à la the PSX, hi-res texture packs, or faster, higher-resolution poly rendering for games using the SuperFX chip; and there have been ROM hacks and homebrew that take advantage of these extensions.
Has anyone tried doing the same for the N64? Seeing what would be possible for an N64 game given an emulator tweaked to allow an unlimited ROM and VRAM size budget and (effectively) zero DSP DMA delay?
1 - In the product line of Nintendo consoles ever produced, the N64, by being an SGI Onyx in a box, was the exception in regard to IP before technical specifications.
2 - In 1996, when the N64 was released into the market, ALL game consoles had hardware limitations of some sort. This includes the Sega Saturn, Atari Jaguar, 3DO, and PlayStation 1.
> 2 - In 1996, when the N64 was released into the market, ALL game consoles had hardware limitations of some sort. This includes the Sega Saturn, Atari Jaguar, 3DO, and PlayStation 1.
So the N64 wasn't an exception, since it's a console with hardware limitations?
In the product line of Nintendo consoles ever produced, the N64, by being an SGI Onyx in a box, was the EXCEPTION in regard to IP before technical specifications.
Oh, so it's about IP, not technical specifications or hardware limitations? edit: Quite frankly, comments like "With exception of N64" throw me off in the long run.
I dunno; on paper maybe, but in practice the N64 looked no better or worse than the PS1. Textures were very low quality (I think it was a texture-cache-size problem). Games like MGS or Vagrant Story looked better to me than anything on the N64.
I think it's great that it maintains a solid 30 and seems to have very few issues.
Another interesting perspective is that it—a game made at the end of the Switch's life (we hope)—is only marginally prettier and more polished than BOTW—a launch title. I would still hold BOTW up as one of the prettiest Switch titles, third parties included (I realise this is subjective). I can't think of another console where six years of its existence brought no graphical progress. I don't know why this is, or even whether it's a failing; I just think it's interesting.
I think sometimes we conflate "graphical fidelity" with "beauty". I agree that BotW is one of the most beautiful games I've played, and that's an artistic achievement not a technical one. They do of course go hand in hand to some extent - sometimes you need technical tools for artistic vision, but you don't need high-tech for beauty.
> I would still hold BOTW as one of the prettiest Switch titles
+1; I don't usually care much about graphics (I play ASCII roguelikes for crying out loud), but there have been several moments in BotW where I found myself soaking in the scenery because it was gorgeous.
Priorities. For Nintendo, size matters.
Time also matters. In an old interview, someone from Nintendo said that the time from putting the disc in the Wii to being able to play Zelda needed to be under 30 seconds.
Nintendo save: bit-optimised tightly packed binary
NBA savegame: JSON encoded in XML, run through Base64 because < in team names was breaking saves, all packed in YAML because the new summer intern couldn’t find a parser library for anything else
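To put a (tongue-in-cheek) number on the contrast, here is a minimal sketch in Python; the fields and sizes are invented for illustration and have nothing to do with any actual Nintendo save format:

```python
import json
import struct

# Hypothetical save state; field names are made up for illustration.
save = {"hearts": 12, "rupees": 450, "chest_flags": 0b10110110}

# Bit-optimised, tightly packed binary: one unsigned byte for hearts,
# a 16-bit unsigned int for rupees, one byte of chest-opened bitflags.
packed = struct.pack("<BHB", save["hearts"], save["rupees"], save["chest_flags"])
print(len(packed))  # -> 4 bytes

# Even the innermost layer of the joke stack (plain JSON, before the
# XML/Base64/YAML wrapping) is already an order of magnitude larger.
print(len(json.dumps(save).encode()))  # -> ~50 bytes
```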
No offense to Nintendo and fans, but fidelity-wise the game looks like an early PS3 game, runs at 30 FPS with dips in certain areas, and dips very regularly when using certain game mechanics.
I can visibly count the triangles on a lot of the geometry, and the textures are so blurred it looks like you took a modern PC game and rendered only the lowest available level of detail for all the textures. I can count the pixels in the shadows on the floor, and lighting-wise the game is extremely basic (it doesn't need to be more, because of the art style). Most effects you see in the game are literally blurry billboarded (but, granted, alpha-blended) sprites, including the clouds that are so important to this game's visuals.
And to top it off, it's 2023, half the audience or more is on 4K screens, and the game doesn't even manage native 900p most of the time.
The game's art is simply designed well around those constraints. Very well. But the devs likely did nothing super special to make it run well.
Have you seen modern video games? Take a look at The Last of Us Part II, Call of Duty: Modern Warfare 2, Red Dead Redemption 2, Hogwarts Legacy, Horizon Forbidden West, etc. The difference in fidelity, animations, audio, and just overall depth and immersion is astronomical.
Zelda would make for a great game on my phone though.
I wouldn't say seamlessly, but it runs okay, yeah. There's already a Digital Foundry video on the technical side of things; apparently it's using AMD FSR 1.
Note that all Switch games have to be small, out of technical necessity. The Switch only comes with 32 GB of internal storage, and that also gets used for your game saves, your screenshots/videos, and the OS itself. If Nintendo wants to offer digital downloads, and if it doesn't want to require users to go out and buy an SD card to expand their storage (which, to be clear, you should anyway for convenience), then they have to keep game sizes small.
It's using proven technology and they didn't push any boundaries, graphics-wise; Breath of the Wild also came out for the Wii U, which is nearly 11 years old now and even at the time didn't really try to push any performance boundaries.
Pokémon games are not made by Nintendo itself. Nintendo owns a share of the franchise, but it seems they are overall not directly involved in the game development, unlike with Zelda. And Pokémon in general has different problems regarding quality: the teams are more time-constrained and stressed, and seem to have had some internal struggles in recent years, while the Zelda team seems to have had the liberty to develop peacefully for years on its own.
Not even close. It's a BOTW expansion pack writ large, there's little performance or graphical difference between TOTK and BOTW, a game that released six years ago.
This is apparently incorrect. I'm avoiding detailed tech reviews right now so as to not spoil myself, but reports are that the original BOTW held itself back in order to accommodate the Wii U. Draw distances appear to be higher, objects are more detailed and there are more of them, there are more LODs, and the framerate is now reliably 30 FPS in all but the most demanding scenes (which sounds like faint praise, but if you've ever played the original, is a definite improvement).
It helps that the original game was utterly gorgeous, thanks to inspired art direction.
Same engine, slight retune, not six-years'-worth-of-dev new. Much more of an iterative upgrade. Compare, for instance, Ocarina of Time (1998) and Wind Waker (2003) vs. Ocarina (1998) and Majora's Mask (2000), and you'll see that BOTW/TOTK is much closer to the latter than the former.
I think the commenter means that mechanical watches don't keep time as well as digital watches. So, I guess they are optimized for mechanical beauty instead of timekeeping.
The better question you should be asking is: why does everyone else need 300 GB of disk space that transfers at over 7 GB/s, 32 CPU cores at 6 GHz, a $3000 video card, and RAM measured in three-digit gigabytes, just for a fucking game?
At some point I have to wonder if the reason we have so much computing power is so we can use that computing power.
If you are serious (I do game development for a living and work on graphical assets daily, so this seems evident to me, but I totally understand it can be arcane stuff): it's simply that they chose a stylized graphical style that avoids a lot of the costly detail you generally find in high-end games.
They use low-poly models; as far as I know there are no baked lightmaps (these are pretty expensive, but mandatory in a lot of engines if you want realistic shadows on a highly detailed environment); and their shader materials probably use very simple, low-resolution maps.
All these things decrease the asset footprint by orders of magnitude (see the rough calculation after the examples below).
If you want to look at this in more detail, you can compare a similar rendering style in Unity. Taking two Unity examples, you can compare:
- 'Chop Chop', a game using a similar rendering style: https://www.youtube.com/watch?v=GGTTHOpUQDE. If you take the pig and its environment shown in the video and go to the GitHub repository, you can see they only use one texture map: an albedo map. All the assets (pig + environment) weigh about 6 MB of textures and 350 kB of models, and that is sufficient for the full main character and an environment.
- a 'realistic PBR workflow' gun asset from the Asset Store (chosen randomly, but it seems nice and realistic, and contains only the gun, so we can see the download size): https://assetstore.unity.com/packages/3d/props/guns/free-fps.... The workflow needs 6 maps (there are 7 here, but you generally use either a normal map or a heightmap, not both). The pack weighs 35 MB, and that's only the gun; you still lack a full character holding it and the environment.
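To put rough numbers on why the map count and resolution dominate, here is a back-of-the-envelope sketch; the resolutions are assumptions for illustration, not measurements from either asset:

```python
# Uncompressed RGBA8 texture footprint: 4 bytes per texel per map.
def footprint_mb(resolution: int, num_maps: int) -> float:
    return resolution * resolution * 4 * num_maps / (1024 * 1024)

# Stylized workflow: a single 1024x1024 albedo map.
print(footprint_mb(1024, 1))  # -> 4.0 MB

# PBR workflow: six 2048x2048 maps (albedo, normal, metallic,
# roughness, ambient occlusion, emissive).
print(footprint_mb(2048, 6))  # -> 96.0 MB
```

Compression narrows the gap in practice, but the multiplier from extra maps and higher resolutions is exactly why the stylized asset ends up an order of magnitude smaller.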
While I really like Zelda, even with stylized graphics the game looks a bit outdated to me. The cel-shaded characters are fluid and pretty, but the low-resolution textures and low-poly models bother me a bit, especially on environments. The artistic direction is really good, but technically I can only think they are held back by the hardware.
As a game developer, I totally want to use all the resources I know I can find on the target hardware. Trust me, even today there are lots of features game designers dream of putting in games and can't, because computing resources are still limited ^^. Do games NEED them to be fun? Of course not, but COULD they be fun experiences? I think yes :)
I absolutely am serious; a lot of games and software in general today demand far more system resources than they have any reasonable right to.
Don't give me "but the textures!" and the like either; optimize that stuff better instead. Whether it's Windows 10/11 or Call of Duty or Elite: Dangerous or Chrome or whatever strikes your fancy, software today has no business demanding the resources it does.
Lest we forget, the hardware we can buy today would have been considered supercomputers just a few years ago. You want to tell me that will choke and croak just doing mundane stuff like playing games or browsing the internet?
Well, they were given that right by users who spend a lot of money on these system resources and ask for games to be as beautiful and complex as we can make them (not all users; I'm not the last to spend time in old-school games, but a significant and heavy-spending portion of them).
Business is exactly why most games don't spend an enormous budget on optimization today. It's not a requirement for the great majority of customers, and it quickly becomes time- and cost-heavy, so the return on investment is pretty low.
Yes, I think that even with an infinite optimization budget, today's triple-A realistic rendering simply could not run in real time on too old a computer.
I also think that while it would really add value if background applications like Teams/Slack/Discord were less resource-heavy (since they are open but not the main focus), when you play a high-end video game it makes sense to consider it your main reason for using your computer at that time :)
If simulating and rendering a complete, complex, interactive, realistic-but-imaginary world at today's achievable level of detail seems mundane to you, it's far from seeming that way to me :)
No opinion about browsers and OSes; today's games do a lot more stuff that's valuable to most users than those of yesterday did. I don't know enough about the modern value of OSes and browsers, except that empirically they do seem to crash a lot, lot less than 20 years ago, but they also spy a lot more on me :)
The priority of an AAA game developer is to provide as much graphic fidelity for a specific compute budget, not to consume the least compute for a specific graphic fidelity. If they "optimize that stuff better", the outcome wouldn't (and shouldn't!) be a lower usage of system resources but rather fitting in even more graphic details while still capping out all resources.
They do obviously have the reasonable right to demand all the system resources that are available, because a game is usually an immersive experience that is the only important thing running on the system at that time, and the only purpose of those greatly increased system resources is to be used for gains in visual quality - there's no reason to not try and use all of that compute power of what would have been considered supercomputers just a few years ago.
> You want to tell me that will choke and croak just doing mundane stuff like playing games or browsing the internet?
The fact that you're comparing browsing the internet with playing AAA games speaks volumes. Browsers are capable of making insane amounts of optimizations because the "geometry" of a website is (mostly) completely static, there's no physics, there's no sounds, there's no AI running client side, there's no game logic, etc. This means they get to cache 90% of the view and only update the changed portions of the screen.
Contrast that with a game, which has the entire view of the 3D world changing every 16ms when the user moves their mouse, has thousands of physical interactions happening (most likely at a higher framerate), is buffering and mixing sounds in a 3D world, is animating and loading large 3D assets in real-time, is creating photo realistic lighting in real-time, is handling all game logic and AI client side, etc. It becomes clear that the two fields, while both difficult in their own ways, don't overlap very much. Of course AAA games take a super computer to run. It's doing all that in 16ms, sometimes 7ms!
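For a sense of scale, the "16ms, sometimes 7ms" figures fall straight out of the target refresh rate; a trivial check:

```python
# Per-frame time budget at common refresh rates: input, physics, AI,
# audio, and rendering all have to fit inside this window, every frame.
for hz in (30, 60, 144):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
# 30 Hz -> 33.3 ms, 60 Hz -> 16.7 ms, 144 Hz -> 6.9 ms
```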
Plus, if you don't care about all the visual fidelity and stuff, most games let you turn a ton of that off. Games have never been mundane; whether we're talking about the original Tetris or the remastered version of The Last of Us, they push the hardware they run on to the limit to achieve incredible immersive experiences.
Not only that! They have also increasingly helped improve state-of-the-art rendering in offline renderers! We're seeing the techniques games developed to achieve real-time photorealistic rendering slowly make their way to large Hollywood studios. This lets the movies we watch have higher-fidelity CG, because the artists get quicker iteration times. And it reduces the compute load required for these massive CG scenes, since they use more optimized rendering techniques. Saving money, and our environment.
Lest we forget, these "mundane" games have led to huge breakthroughs in all sorts of fields because of their willingness to push the boundaries of our machines to see what's truly possible. As opposed to 90% of the software created today which runs orders of magnitude slower than it needs to because people can't or don't know how to write efficient software.
Because this hardware is for a different experience. Zelda is nice, but the style has its limits. It's kind of like asking why Disney invests hundreds of millions of dollars into Marvel and Star Wars movies when you can also make a polished animated movie for a fraction of that price. It's simply not the same.
We need this power because companies need to keep selling us new stuff, and also because developers nowadays can't optimize their games much, since their managers make them do ten times the work, in half the time, for one third of the pay they had in the 1990s.
Gaming has been a massive driver of hardware for a very long time, in a way that can always be framed as unneeded. We passed the point of compute for the sake of compute a while ago. The neat thing is that there are still new things to do with it. Real-time path tracing will be the next thing, along with moving more compute over to the GPU. We don’t need it, but it will open new possibilities. And it seems less wasteful than running ten copies of Chrome to support a few desktop apps.
The new Zelda demonstrates something that’s been true of every console generation. They are a fixed platform and the later games are always considerably better at utilising the hardware.
Yeah, just look at FromSoft's output for the PS4 for example. Bloodborne came out very early, and while it looks great, it doesn't hold a candle to Elden Ring.
Though gameplay-wise I have to say I prefer it. Elden Ring has too much stuff in it (crafting, gathering, tons of cookie-cutter dungeons, and too many easy boss fights) for me. Bloodborne is very stripped down and devoid of fluff. I can sort of keep the whole game in my head, and I love that about a game.
I doubt anyone is asking that question, since none of what you said is true. This just seems like the rant of someone who wants to play recent games without upgrading their computer. If that's the case, just say that instead of whining about a trend that's been going on for 40 years.
Furthermore, why does a game released in 2023, in development since 2018, run like utter garbage on said spec hardware that is orders of magnitude more performant than last year's specs ...
How does Nintendo pull it off?