But that's the problem with this. The vast majority of the spend is going to be on the Nvidia chips which have a shelf life of 3-5 years. They are not making any significant long term investments.
During the dot-com bubble, telecom companies spent tens of billions of dollars laying down cables and building out the modern public internet infrastructure that we are still using today. Even though a lot of companies failed, we still greatly benefited from some of the investments they made.
For this bubble, the only long-term investment benefits seem to be the electricity build-out and a renewed interest and investment in nuclear.
Most (if not all) of Oracle's investments are in chips and data centers.
I also used to have a bare-phone policy, but I had to change it after everybody decided to start making the damn things out of fragile glass. Yeah, plastic screens are uglier, but they don't crack.
Whoa there pardner, my first 'smartphone' victim was a Nokia NGage - hey, it ran Symbian and I got it for not that much - which I had in my front coat pocket while working in the forest. One relatively gentle collision with a branch sticking out from a tree and the plastic screen was cracked. As was the LCD underneath it. It was then I switched to the next big thing, a Qtek S200 (better known as HTC Prophet). It was cheap 'cause it was used in some experiment by the Swedish railways which seems to have failed. The thing was new, more or less, for 1/10th of the price. It had a plastic touch screen cover which I replaced twice 'cause it started to resemble frosted glass from use.
"what it thinks you want" is doing a lot of work here. why would it "think" that you want to be pushed into an echo chamber divorced from reality instead of something else? why would it give you exactly what you "want" instead of something aligned with some other value?
Given the number of people that description fits, it's pretty clear that people do want that. It's not exactly new or surprising that people want things that are bad for them.
yeah but writing an essay over the course of a week and over the course of two hours are entirely different experiences -- and the first one is the one that's usually useful in post-graduate life
so nobody understands why we use "story points" instead of time estimates? I feel like some people do appreciate that it's not about the number of points but the relative difference between the items up for work.
In my experience the vast majority has completely abandoned the idea of story points or team velocity; they are just an (unnecessary) proxy for time estimates. And god forbid you suggest something like planning poker to make the estimates somewhat accurate.
Why do you see C# adoption in gaming as desirable? Honest question. I think it's a good enough language, but why would it be a good fit specifically for building games?
I'm not the parent poster, but to me it's not about C# in particular, but that there is basically no other well-supported gamedev framework in a language that's 1) statically typed, 2) garbage-collected for safety and productivity, 3) has decent performance, and 4) has good tooling/IDEs. JavaScript, C++, and Python/GDScript are all missing at least one of those. Java could possibly fill the gap, but it's still clunkier to program in than C#, and no Java game framework has much traction.
I'd be ecstatic if there were something checking those boxes that wasn't owned by Microsoft.
However I would point out that before Unity took off, jMonkeyEngine had a good following.
And since reference counting is also garbage collection from a CS point of view, I would add Swift with its game development kits; however, that is pretty much Apple-only, so not really in the same league.
Unfortunately, at Microsoft the whole DirectX team has been anti-.NET since XNA; they didn't even provide Windows Runtime Components for .NET Native, even though it would have been relatively easy to do so, as it is based on COM (plus some extras).
The moment key people responsible for XNA left the building, it was done.
jMonkeyEngine is on my list of things I want to do a project in. I hope it continues to get support. It might be better for 3D than Godot (based on hearsay anyway)
Because I am a firm believer in managed languages, and while I am also big into C++, I would still like to see the Xerox computing vision become reality in my lifetime.
Game developers usually are the last ones to move, and tend to do so due to external factors, mostly pressed by platform owners.
It was like that when we moved from Assembly on 8- and 16-bit platforms, slowly into Object Pascal (Mac/PC) and C.
Then it took Watcom C/C++ on the PC, and the PlayStation 2 SDK, for C++ to finally be taken seriously by game devs; the Xbox did the rest in the console space.
C# had its first victory in 2004 with Arena Wars, built on OpenGL. Managed DirectX was already around, and then XNA on Xbox Live Arcade provided the needed push.
When Microsoft killed XNA in the name of C++ and replaced it with DirectX TK, with their usual territorial attitude on Windows, the Mono project came to the rescue with MonoGame.
This is what enabled them to eventually join efforts with Unity, when Unity was rewriting its engine to be cross-platform and go beyond OS X.
Eventually Unity, alongside C#, came to have a certain Flash-like vibe, and for many it was the entry point into the .NET ecosystem.
So it is kind of sad to see this go away.
I don't believe in the one true language for everything; apart from my Timex BASIC days, from the moment I learned Z80 I have been a polyglot.
A well-designed game, even in PyGame, probably has more entertainment value than yet another remake in Unreal running on a PlayStation 5 Pro.