It will. I ran a test when NanoBanana first went GA using a photo of a one-legged man, asking the model to change his clothing into pants. It added the pants as requested and then proceeded to "heal" his missing leg in the process.
It's very difficult even for SOTA models to go against training data that is as well-represented as bipedal humanoids.
I've used this, but it made my laptop spontaneously reboot a few times (or was it a hard lockup that necessitated a reboot?). I liked it but ended up putting it away.
Dual GPU setup, discrete Nvidia plus integrated Intel, running the latest Ubuntu LTS at the time with all updates.
And that's why it is best to avoid GPU-acceleration unless you truly need it.
GPU drivers have always been full of bugs and always will be, since modern GPUs are ridiculously complex and designed for speed rather than reliability.
It's completely normal for Linux, and in fact most other operating systems (Windows and macOS included, under certain loads), to choke and fail under resource exhaustion. I've experienced it the most with Linux, but I've done it to Windows and Mac too.
I don't think it's really practical to do so. WASM can't directly access the canvas or WebGPU contexts; it has to route data through JS anyway, so you will always have JS code in the loop here.
You literally, and I mean literally, read my mind, dear random stranger, down to the wording. Micro is definitely underrated.
Micro is truly goated software. I mean, it can genuinely replace VS Code for small-scale editing in the Shopify context that the parent comment was referring to.
It also helped me in physics when I had to remember unit prefixes like 10^-6 being micro, 10^-9 being nano, etc. The funny thing is that I originally remembered it from a comment, I'm not sure if it was on micro's GitHub or somewhere else, about how micro has more features than nano, hence its name.
So for a while I would think of the micro editor, then of nano, and run through that feature comparison to recall that micro is larger than nano.
Might seem kinda niche but I ABSOLUTELY LOVE MICRO. It's the one piece of software that I install everywhere, even on my Android phone by using UserLand[1] with Alpine Linux.
I tried writing Python code on my phone and it was definitely pleasant thanks to micro.
A suggestion: in the comparison table under the "AES and Port-Parallelism Recipe" it would be great to include "streaming support" and "stable output" (across OS/arch) as columns.
Also, something to beware of: some hash libraries claim to support streaming via the Hasher interface but actually return different results in streaming and one-shot mode (and have different performance profiles). I'm on mobile so I can't check atm, but I'm about 80% sure gxhash has at least one of these problems, which prevented me from using it before.
Thanks! You are likely right! It took a lot of time to make sure that all 6 ISA-specific versions of StringZilla (https://github.com/ashvardanian/StringZilla/blob/main/includ...) return the same output for both one-shot and incremental construction, and I'm not sure it was a priority for other projects :)
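For anyone who wants to check this property themselves, here's a minimal sketch against Rust's standard Hasher trait; DefaultHasher is only a stand-in, so swap in whatever hasher you're actually evaluating (the libraries discussed above may expose a different API):

```rust
use std::collections::hash_map::DefaultHasher;
use std::hash::Hasher;

// One-shot: feed all bytes in a single write() call.
fn one_shot<H: Hasher + Default>(data: &[u8]) -> u64 {
    let mut h = H::default();
    h.write(data);
    h.finish()
}

// Streaming: feed the same bytes split into fixed-size chunks.
fn streamed<H: Hasher + Default>(data: &[u8], chunk: usize) -> u64 {
    let mut h = H::default();
    for part in data.chunks(chunk) {
        h.write(part);
    }
    h.finish()
}

fn main() {
    // Awkward length on purpose, so chunk boundaries rarely line up with internal block sizes.
    let data: Vec<u8> = (0..1_000_003u32).map(|i| (i % 251) as u8).collect();
    let reference = one_shot::<DefaultHasher>(&data);
    for chunk in [1usize, 7, 64, 4096] {
        let got = streamed::<DefaultHasher>(&data, chunk);
        let verdict = if got == reference { "matches one-shot" } else { "DIVERGES from one-shot" };
        println!("chunk size {:>5}: {}", chunk, verdict);
    }
}
```

Note that the Hasher trait itself doesn't guarantee that split writes equal one big write, which is exactly why each implementation is worth testing.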
Framing it in gigawatts is very interesting given the controversy about skyrocketing electric prices for residential and small-business users as a result of datacenters over the past three years, primarily driven by AI growth. If, as another commenter notes, this 10GW is how much Chicago and NYC use combined, then we need a serious discussion about where this power is going to come from, given the dismal state of the USA's power grid and related infrastructure. We also need to talk about the already exploding costs that have been shifted onto residential users in order to guarantee electric supply to the biggest datacenters, so they can keep paying peanuts for electricity while avoiding any of the infrastructural burden of maintaining or improving the grid and plants required to meet their massive power needs.
I'm not even anti-datacenter (wouldn't be here if I were), I just think there needs to be a serious rebalancing of these costs, because this increase in US residential electric prices in just five years (from 13¢ to 19¢, a ridiculous 46% increase) is neither fair nor sustainable.
So where is this 10GW electric supply going to come from and who is going to pay for it?
To everyone arguing this is how DCs are normally sized: yes, but normally it's not the company providing the compute that is giving these numbers. Nvidia doesn't sell empty datacenters with power distribution networks, cooling, and little else; Nvidia sells the GPUs that will stock that DC. This isn't a typical PR newswire bulletin saying "OpenAI announces new 10GW datacenter"; this is "Nvidia is providing xx compute for OpenAI". Anyway, all this is a detour from the questions of power supply, consumption, grid expansion/stability, and who is paying for all of it.
I work in the datacenter space. The power consumption of a data center is the "canonical" way to describe their size.
Almost every component in a datacenter is upgradeable—in fact, the compute itself only has a lifespan of ~5 years—but the power requirements are basically locked-in. A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
The fact that we use this unit really underscores that AI is basically refining energy.
> A 200MW data center will always be a 200MW data center, even though the flops it computes will increase.
This underscores how important TSMC's upcoming N2 node is. It only increases chip density by ~1.15x (very small relative to previous node advancements), but it uses 36% less energy at the same speed as N3, or runs 18% faster than N3 at the same energy. It's coming at the right time for AI chips used by consumers and energy-starved data centers.
N2 is shaping up to be TSMC's most important node since N7.
I think it is really just the difference between chemically refining something and electrically refining something.
Raw AC comes in, then gets stepped down, filtered, converted into DC rails, gated, timed, and pulsed. That’s already an industrial refinement process. The "crude" incoming power is shaped into the precise, stable forms that CPUs, GPUs, RAM, storage, and networking can actually use.
Then those stable voltages get flipped billions of times per second into ordered states, which become instructions, models, inferences, and other high-value "product."
It sure seems like a series of processes for refining something.
It is the opposite of refining energy. Electrical energy is steak, what leaves the datacenter is heat, the lowest form of energy that we might still have a use for in that concentration (but most likely we are just dumping it in the atmosphere).
Refining is taking a lower quality energy source and turning it into a higher quality one.
What you could argue is that it adds value to bits. But the bits themselves, their state is what matters, not the energy that transports them.
I think you're pushing the metaphor a bit far, but the parallel was to something like ore.
A power plant "mines" electrons, which the data center then refines into words, or whatever. The point is that energy is the raw material that flows into data centers.
Where do the cards go after 5 years? I don't see a large surplus of mid-sized cloud providers coming to buy them (because AI isn't profitable). Maybe other countries (possibly illegally)? Flood the consumer market with cards they can't use? TSMC has more than doubled its packaging capacity and is planning to double it again.
Yeah, it was the company's pilot site, and everything about it is tiny.
But it very quickly became the best place in town for carrier interconnection. So every carrier wanted in.
Even when bigger local DCs went in, a lot of what they were doing was just landing virtual cross-connects to the tiny one, because that's where everyone was.
Basically, yes. When you stand up something that big, you need to work with the local utilities to ensure they have the capacity for what you're doing. While you can ask for more power later on, if the utilities can't supply it or the grid can't transport it, you're SOL.
You could in theory supplement it with rooftop solar and batteries, especially if you can get customers who can curtail their energy use easily. Datacentres have a lot of roof space, so they could at least reduce their daytime energy costs a bit. I wonder why you don't see many doing solar; do the economics not work out yet?
I'd have to do the math, but I doubt that makes sense given the amount of power these things are drawing. I've heard of DCs having on-site power generation, but it's usually in the form of diesel generators used for supplemental or emergency power. In one weird case, I heard about a DC that used on-site diesel as primary power and used the grid as backup.
Compared to their power draw they absolutely do not: peak solar irradiance is only about 1 kW/m^2, and panels convert roughly 20% of that. Some quick googling suggests a typical DC workload would be about 50 kW/m^2, rising to 100 for AI workloads.
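As a rough sanity check, assuming ~20% panel efficiency on top of that peak irradiance and taking the quoted load densities at face value:

\[
\frac{1\,\mathrm{kW/m^2} \times 0.2}{50\text{--}100\,\mathrm{kW/m^2}} \approx 0.2\%\text{--}0.4\%
\]

So even covering the whole roof of a single-storey hall with panels would offset well under 1% of the facility's draw at noon.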
That's pretty interesting. Is it just because the power channels are the most fundamental aspect of the building? I'm sorta surprised you can't rip out old cables and drop in new ones, or something to that effect, but I also know NOTHING about electricity.
Not an expert, but it’s probably related to cooling. Every joule of that electricity that goes in must also leave the datacenter as heat. And the whole design of a datacenter is centered around cooling requirements.
Exactly. To add to that, I'd like to point out that when this person says every joule, he is not exaggerating (only a teeny tiny bit). The actual computation itself barely uses any energy at all.
>I'm sure the richest country in the world will do fine.
You underestimate how addicted the US is to cheap energy and how wasteful it is at the same time.
Remember how your lifestyle always expands to fill the available resources, no matter how good you have it? Well, if tomorrow they had to pay EU prices, there would be a war.
When you've lived your entire life not caring about the energy bill or about saving energy, it's crippling to suddenly have to scale back and be frugal, even if that price would still be less than what other countries pay.
It's hard to appreciate the difference in 'abundance mentality' between the median US and EU person. It always struck me as an interesting culture difference. While both EU and US grew in prosperity post WWII, I feel the US narrative was quite on another level.
And the substantial increase in profits for all providers, which isn't comparable to that of our neighbours. Our disposable income in Belgium really exists to subsidise energy companies, supermarkets, and a pathetic housing market.
> So where is this 10GW electric supply going to come from and who is going to pay for it?
I would also like to know. It's a LOT of power to supply. Nvidia does have a ~3% stake in Applied Digital, a bitcoin miner that pivoted to AI (also a "Preferred NVIDIA Cloud Partner") with facilities in North Dakota. So they might be involved for a fraction of those 10GW, but it seems like it will be a small fraction even with all the planned expansions.
> Framing it in gigawatts is very interesting given the controversy
Exactly. When I saw the headline I assumed it would contain some sort of ambitious green energy build-out, or at least a commitment to acquire X% of the energy from renewable sources. That's the only reason I can think of to brag about energy consumption.
Or this brings power and prestige to the country that hosts it, and it gives clout precisely because it is seemingly wasteful. Finding the energy is a problem for the civilian government, which either goes "drill baby drill" or throws wind/solar/nuclear at the problem.
Datacenters need to provide their own power/storage and connect to the grid just to trade excess energy or provide grid stability. Given the 5-7 year backlog of photovoltaic projects waiting for interconnect, the grid is kind of a dinosaur that needs to be routed around.
Watts are the hottest new currency in big tech. Want to launch something big? You don't have to ask for dollars or headcount or servers or whatever else used to be the bottleneck in the past; there's plenty of all that to go around (and if not, it can easily be bought). Success or failure now depends on whether you can beg and plead your way to a large enough kilowatt or megawatt allocation over every other team that's fighting for it. Everything is measured this way.
I had my highest power bill last month in 4 years, in a month that was unseasonably cool so no AC for most of the month. Why are we as citizens without equity in these businesses subsidizing the capital class?
To me, the question is less about “how do we make more energy” and more about “how do we make LLMs 100x more energy efficient.” Not saying this is an easy problem to solve, but it all seems like a stinky code smell.
I'm pretty confident that if LLMs were made 100x more energy efficient, we would just build bigger LLMs or run more parallel inference. OpenAI's GPT-5 Pro could become the baseline, and their crazy expensive IMO model could become the Pro offering. Especially if that energy efficiency came with speedups as well (I would be surprised if it didn't). The demand for smarter models seems very strong.
This feels like a return to the moment just before Deepseek when the market was feeling all fat and confident that "more GPUs == MOAR AI". They don't understand the science, so they really want a simple figure to point to that means "this is the winner".
Framing it in GW is just giving them what they want, even if it makes no sense.
An 8% increase y/o/y is quite substantial; however, keep in mind that globally we experienced the 2022 fuel shock. In Australia, for example, we saw energy prices double that year.
Although wholesale electricity prices show double-digit average year-on-year swings, their true long-run growth is closer to ~6% per year, slightly above wages at ~4% during the same period.
So power has become somewhat less affordable, but wage growth has absorbed much of the real impact, and power prices still remain a small share of household income.
You can make it sound shocking with statements like “In 1999, a household’s wholesale power cost was about $150 a year, in 2022, that same household would be charged more than $1,000, even as wages only grew 2.5x”, but the real impact (on average, obviously there are outliers and low income households are disproportionately impacted in areas where gov doesn’t subsidise) isn’t major.
I wouldn't call a $100-270 electric bill a "fraction" when it's about 5% of post-tax income. I use a single light on a timer and have a small apartment.
Especially since these sorts of corporations can get tax breaks or have means of getting regulators to let them spread the cost. Residential customers shouldn't see any increase due to data centers, but they do, and will, subsidize them while seeing minimal changes to infrastructure.
When people are being told to minimize air conditioning while these big datacenters get built and are never told to "reduce your consumption," then it doesn't matter how big or small the electric bill is; it's subsidizing a multi-billion-dollar corporation's toy.
And in Germany the price includes transmission and taxes; it's the consumer end price. You have to remember that some countries report electricity prices without transmission or taxes, also in a consumer context, so you need to be careful with comparisons.
Utilities always need to justify rate increases with the regulator.
The bulk of cost increases come from the transition to renewable energy. You can check your local utility and see.
It’s very easy to make a huge customer like a data center directly pay the cost needed to serve them from the grid.
Generation of electricity is more complicated; the data centers pulling cheap power from Columbia River hydro are starting to compete with residential users.
Generation is a tiny fraction of electricity charges though.
Prices of _everything_ went up over the past five years. Datacenter expansion was far from the main driver. Dollars and cents aren't worth what they used to be.
Elsewhere it was mentioned that DCs pay less for electricity per Wh than residential customers. If that is the case, then it's not just about inflation, but also unfair pricing putting more of the infrastructure costs on residential customers whereas the demand increase is coming from commercial ones.
Industrial electricity consumers pay lower unit rates per kWh, but they also pay for any reactive power that they consume and then return -- residential consumers do not. As in, what industrial consumers actually pay is a unit cost per kVAh, not kWh.
This means loads with pretty abysmal power factors (like induction motors) actually end up costing the business more money than if they ran them at home (assuming the home had a sufficient supply of power).
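A worked example, assuming a kVAh tariff and an illustrative 0.7 power factor on a 100 kW load:

\[
S = \frac{P}{\mathrm{pf}} = \frac{100\,\mathrm{kW}}{0.7} \approx 143\,\mathrm{kVA}
\]

so over an hour the business is billed for ~143 kVAh while only 100 kWh of real energy was consumed, roughly 43% more billable units than a residential meter would register.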
Further, they get these lower rates in exchange for being deprioritised -- in grid instability (e.g. an ongoing frequency decline because demand outstrips available supply), they will be the first consumers to be disconnected from the grid. Rolling blackouts affecting residential consumers are the last resort.
There are two sides to this coin.
Note that I am in no way siding with this whole AI electricity consumption disaster. I can't wait for this bubble to pop so we can get back to normality. 10GW is a third of the entire daily peak demand of my country (the United Kingdom). It's ridiculous.
Total gigawatts is the maximum power that can be supplied from the generating station and consumed at the DC through the infrastructure and hardware as built. Whether they use all those gigawatts, and what they use them for, is optional and will vary over time.
> So where is this 10GW electric supply going to come from
If the US petro-regime wasn't fighting against cheap energy sources this would be a rounding error in the country's solar deployment.
China deployed 277GW of solar in 2024 and is accelerating, having deployed 212GW in the first half of 2025. 10 GW could be a pebble in the road, but instead it will be a boulder.
Voters should be livid that their power bills are going up instead of plummeting.
FYI, announced capacity is very far from real capacity when dealing with renewables. It's like saying that because you bought a Ferrari, you can now drive at 300 km/h on the road all of the time.
In mid latitudes, 1 GW of solar power produces around 5.5 GWh/day. So the "real" equivalent is a 0.23 GW gas or nuclear plant (even lower when accounting for storage losses).
But "China installed 63 GW-equivalent" of solar power is a bit less interesting, so we go for the fake figures ;-)
I was commenting on the initial capacity announcement. And storage at this scale doesn't exist right now. The most common option, pumped-hydro reservoirs, requires hard-to-find sites that are typically in the Himalayas, far from where the power is produced. And the environmental cost isn't pretty either.
I'm living in one of the most expensive electricity markets in the US. It has a lot more to do with the state shutting down cheap petro energy (natural gas) and nuclear, then replacing it with... TBD.
Theoretically couldn't you use all the waste heat from the data center to generate electricity again, making the "actual" consumption of the data center much lower?
Given that steam turbine efficiency depends on the temperature delta between steam input and condenser, unlikely unless you're somehow going to adapt Nvidia GPUs to run with cooling loop water at 250C+.
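For a sense of scale, the Carnot limit with an illustrative 60 °C coolant loop rejecting heat to 25 °C ambient is already tiny:

\[
\eta_{\max} = 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}} = 1 - \frac{298\,\mathrm{K}}{333\,\mathrm{K}} \approx 10.5\%
\]

and real machinery at that delta recovers only a few percent at best, which is why the waste heat tends to get reused directly (e.g. district heating) rather than converted back to electricity.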
Thermodynamics says no. In fact you have to spend energy to remove that heat from the cores.
(Things might be different if you had some sort of SiC process that let you run a GPU at 500C core temperatures, then you could start thinking of meaningful uses for that, but you'd still need a river or sea for the cool side just as you do for nuclear plants)
In the Nordics the waste heat is used for district heating. This practical heat sink really favors northern countries for datacenter builds. In addition you usually get abundant water and lower population density (meaning easier to build renewables that have excess capacity).
Any time I look at the RO data directly, it triggers a need to edit it. E.g., you notice a typo or a mistake, or you get an idea for a follow-up or clarification, etc.
Different use cases mean different materials. The Air needs to be strong against bending forces because of the thin body, the Pros needed aluminum for thermal transfer for better sustained performance.
I am in a unique position to confirm that they are a load of bunk. I have solar urticaria and develop hives in response to UV exposure, directly proportional to how much UV is getting through. I’ve developed hives in minutes while the UV index was supposedly only 4 and gone for relatively too long without erupting in hives the next day even when the UV index was supposedly 10.
I hate to be that person who quotes ChatGPT, but this seems VERY relevant to your complaint:
“Solar urticaria is a rare condition where the skin reacts to specific wavelengths of light rather than the overall UV intensity. The UV index is a general measure of the total amount of erythema-causing UV radiation (mainly UVB) that can cause sunburn in the average person.
But in solar urticaria, the trigger might be UVA, visible light, or even a narrow band of wavelengths — and the UV index doesn’t capture that nuance.
So it’s not that the forecast is wrong — just that the UV index isn’t designed to reflect the sensitivity profile of solar urticaria.”
In other words, you’re (literally) a special case. :)
I don't have a LinkedIn, and it has impaired my job hunts in the past, but I always worry that creating one now (without the references of colleagues from decades of past work) would look worse than not having one.
Nah, that's not a thing. Get involved: spend an afternoon setting it up, and then it will suggest a bunch of people you've probably worked with in the past. They'll be happy to connect, and then it's a good moment to catch up and drop the "I'm in the market".
If anybody used to enjoy working with you and they know of something, it should be easy enough from then on.
Do people still do endorsements on LinkedIn? There was an initial flurry when that "feature" launched but I haven't been endorsed for anything for I think the past decade. Really the only things I do on LinkedIn are update my job history and accept connections from coworkers.
Imho, anything past where you've worked on LinkedIn is a waste of time.
And arguably even a negative signal. Productive people have jobs to do instead of grinding Monopoly karma. Yes, this absolutely includes LinkedIn thought leadership.
I know MS and recruiters love to push the 'it matters' line, but I'd ask the reader -- who would you rather hire: someone who wow'd in an interview or someone with LinkedIn flair?
> who would you rather hire: someone who wow'd in an interview or someone with LinkedIn flair?
Who would you rather interview: someone who has a great resume, and a strong LinkedIn profile, and connections to a strong peer community who can endorse them, or a faceless rando that shows up in your inbox with a PDF, amongst thousands of others, with zero referrals?
I'm not endorsing LI grind -- I too hate it, but ignore at your own peril. OP seems to be in a rather precarious situation, so maybe it would help being a bit less dogmatic.
> who would you rather hire: someone who wow'd in an interview or someone with LinkedIn flair?
Wrong question. This is not about the hiring stage.
Who would I rather move on to a phone screen: someone with an empty or nonexistent linkedin profile, or someone with a profile which matches their resume and has many connections to other people who worked at the same companies?
While I hate to have to say it is the latter, that's where we are today with AI-generated fake resumes.
I have 344 resumes left to review tonight. Those that don't match their linkedin profile history have no chance (unless they are a direct colleague referral).
As I take a break on Friday night from reading through an endless pile of resumes for a role I'm hiring for...
I would suggest creating the linkedin profile but be sure to fully populate the job descriptions for each job (or as far back as you care to go) and spend some time looking up past colleagues from each one and send them invites to connect.
I'm finding that a completely blank linkedin profile (listing only companies but zero detail) is a bigger red flag than not having a linkedin profile.
But having a profile with job description info and a network of connections from each job adds credibility. When a resume looks borderline suspicious, I dig through the person's connections on LinkedIn to see if it looks like they really worked at each of those places. Even better if I find any shared connections, which is a stronger signal that I'm looking at a real person and not an AI bot.
Also, building that network of connections can be a source of job leads on its own.
Man, for 15 years I've been working on projects that are not LinkedIn-friendly. For example, online casinos where my coworkers all have pseudonyms. Or taking 1-2 years to work on a personal project that fizzles out. Not to mention surfing for 2 years.
I'm in a terrible position for when I need to find a normal job, and comments like this don't let me forget it!
not a recruiter: I have never felt that recruiters pay attention to linkedin references specifically.
You can also make one, add people, and then ask for a few references. "I just finally made a linkedin in 2025 on a lark" is a perfectly cromulent icebreaker/reason to ask.
It is better to have one than not. I highly recommend you set it up now. Put up a real picture. There's too much noise these days, and without a LinkedIn profile, a lot of employers are not even going to look at you. Just stating facts.