The efficiency of solar panels is practically irrelevant. For scale: ~$20,000 buys about 60 acres of land in Arizona, which receives ~240,000 kW of solar irradiance in full sun. At 2c/kWh and 10% efficiency you pay for that land in ~42 hours of full sun. Land costs are rarely more than 0.1% of total costs.
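A quick sanity check of that payback arithmetic, as a sketch; the ~$20,000 for 60 acres and the ~1 kW/m^2 full-sun irradiance are the assumptions doing the work here:

    # Back-of-envelope check of the land payback claim
    acres = 60
    land_cost_usd = 20_000
    area_m2 = acres * 4_047              # 1 acre ~ 4,047 m^2 -> ~243,000 m^2
    irradiance_kw = area_m2 * 1.0        # assume ~1 kW/m^2 in full sun -> ~240,000 kW
    output_kw = irradiance_kw * 0.10     # 10% panel efficiency
    usd_per_hour = output_kw * 0.02      # 2c/kWh
    print(land_cost_usd / usd_per_hour)  # ~41 hours of full sun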
Anyway, nobody is going to run a major grid on 100% solar power because hydro and wind are cheaper than batteries. But let's consider a pessimistic worst case. You want 50% oversupply of solar so you never have to worry about long-term storage (meaning 1/3 of all solar output is wasted).
That's 3c/kWh. Batteries aren't 100% efficient and degrade with use, so let's say 55% of power is provided directly via solar (demand is much higher during the day) and 45% is provided from solar-fed batteries. If 1 kWh from a battery adds 10c, the equation is roughly 3c + 10c * 45% = 7.5c/kWh.
Which is very close to what baseload coal costs, except because we're using batteries this also covers peaking power plants. Frankly, coal is only surviving in China because of how long it's taking to build cheaper alternatives.
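Putting that 7.5c/kWh figure into a small script, under the same assumptions as above (50% oversupply, 55/45 direct-vs-stored split, 10c/kWh battery adder):

    # Blended cost of a pessimistic solar + battery grid
    solar_cost = 0.02 * 1.5                  # 2c/kWh with 50% oversupply -> 3c
    battery_adder = 0.10                     # assumed cost added per battery-cycled kWh
    direct_share, stored_share = 0.55, 0.45  # daytime vs. battery-fed demand
    blended = solar_cost + battery_adder * stored_share
    print(blended)                           # 0.075 -> 7.5c/kWh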
That's wildly inefficient at scale (and that's assuming it can be done reliably). Check my numbers here (they should be correct, but I'd appreciate pushback) estimating 50% of annual U.S. usage in panels vs nuclear plants: https://imgur.com/a/zeHo21R.
Even if we want 50% of our power nationwide to come from solar, based on your 60-acre model that's still 6,429 of these installations, or 385,753 acres (and that's with a generous output estimate of 2.5 kWh per panel per day).
A single 582 MW nuclear plant can generate 13,968,000 kWh per day, and each one takes up about a square mile (640 acres).
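For scale, here's the nuclear side of that comparison in code. The 582 MW and one-square-mile figures come from above; the total-US-usage number (~4 trillion kWh/year) is my own ballpark assumption, so treat the plant count as illustrative:

    # How many 582 MW plants would the same 50%-of-US-usage target take?
    us_kwh_per_year = 4e12                             # assumed ballpark for total US usage
    target_kwh_per_day = 0.5 * us_kwh_per_year / 365   # ~5.5 billion kWh/day
    plant_kwh_per_day = 582_000 * 24                   # 13,968,000 kWh/day at full uptime
    plants = target_kwh_per_day / plant_kwh_per_day
    print(plants, plants * 640)                        # ~392 plants, ~251,000 acres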
Not to mention, 60 acres of panels takes a ton of manpower to maintain and keep functional. People, trucks, maintenance equipment, etc.
Keep in mind, too, these numbers are before we introduce everyone driving electric cars (assuming that's an ideal).
First, 385,753 acres is trivial in the US. The federal government, for example, manages 640,000,000 acres in the US; it's a big freaking country. We were doing above-ground nuclear testing because we just had that much unused space.
As to output per day, real-world capacity factors for grid solar vary by location but tend toward ~30%; here's one at 29.7%: https://en.wikipedia.org/wiki/Mount_Signal_Solar. So if a 400W panel averages ~2.9 kWh per day, 2.5 kWh is a perfectly reasonable ballpark estimate. Panel density varies, but over 2,000 per acre isn't unreasonable since they're tilted. Still, let's use 2,000 to account for access roads etc.
As to nuclear, its capacity factor in the US is ~90% because it doesn't need to follow the grid. Unfortunately that drops if you want to increase the number of nuclear power plants. France leaned into nuclear, and even while exporting a lot of subsidized nuclear power it still had capacity factors of ~70%.
By comparison, 2.9 kWh per day per panel * 2,000 panels per acre * 640 acres = 3,712,000 kWh per day from solar. Sure, it's less dense, but not by enough to be particularly meaningful, especially as you don't need access to water. The real difference people care about is that solar is vastly cheaper and can actually scale. Try to hit 100% with nuclear and you're stuck with sub-50% capacity factors, or you need a lot of grid storage, both of which dramatically increase the price. To actually be competitive long term, nuclear needs about a 60-80% cost reduction, which is frankly never going to happen.
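The same per-square-mile arithmetic written out; panel wattage, density, and capacity factor are the figures from the two paragraphs above:

    # Solar output from one square mile (640 acres) of panels
    panel_kw = 0.400
    capacity_factor = 0.297                    # Mount Signal Solar's reported figure
    kwh_per_panel_day = panel_kw * 24 * capacity_factor     # ~2.85, call it 2.9
    print(round(kwh_per_panel_day, 1) * 2_000 * 640)        # ~3,712,000 kWh/day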
As to EVs, you need more space to park them than to install the solar panels to power them. You also need more batteries for the EVs themselves than you need to back up a solar-powered electric grid.
> By comparison, 2.9 kWh per day per panel * 2,000 panels per acre * 640 acres = 3,712,000 kWh per day from solar.
How is being less efficient by ~10,000,000 kWh per day preferable? That's a significant difference in output. You seem fixated on price and are ignoring the reality that solar panels generate that amount of energy inconsistently, whereas nuclear is more stable. If climate change is literally going to destroy the world, then cost should be of less concern than "can it give us the power we need while reducing emissions?"
> Try to hit 100% with nuclear and you're stuck with sub-50% capacity factors, or you need a lot of grid storage, both of which dramatically increase the price.
Can you share what led you to this 50% capacity number?
First, intermittency is just a battery-cost question, because cost of one type or another is the only meaningful metric. Hell, if land use bothers you we can put panels on rooftops, parking lots, between highways, etc. It's wasteful, but plenty of people are doing it.
Also, 13,968,000 kWh per day * 70% capacity factor = 9,777,600 kWh, so solar only needs roughly 3x to 4x the land. Except nuclear reactors need land near large bodies of water, making that land far more valuable. Because you can't put a nuclear reactor on the ultra-cheap land viable for solar panels, nuclear actually ends up with higher land costs.
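Running the land-use ratio at both capacity factors (90% US-style, 70% France-style) shows how that multiple moves; a sketch using only the numbers already in this thread:

    # Land-use ratio: one nuclear plant vs. one square mile of solar
    solar_kwh_per_sq_mile = 3_712_000      # from the earlier solar math
    nuclear_nameplate_kwh = 13_968_000     # 582 MW * 24 h
    for cf in (0.90, 0.70):
        ratio = nuclear_nameplate_kwh * cf / solar_kwh_per_sq_mile
        print(cf, round(ratio, 1))         # ~3.4x at 90%, ~2.6x at 70%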
It's not a matter of technical feasibility (i.e., you could do it, but is it the best, most stable way); it's what the goal is. All of those batteries require fossil fuels to produce, as do the solar panels. Favoring the model where you have to produce more (meaning more fossil fuels) is the antithesis of the greater point.
The more efficient option, even if more expensive, should be preferred. You may end up getting similar results, but the overall impact of making solar panels play catch-up is more destructive in the long run. And they don't even catch up: by your own math, even at 70% capacity a reactor is roughly 3x as powerful.
And re: the water requirements, those would also be a factor in hydroelectric so it's a moot point (assuming you see hydroelectric as a supplemental form of power to solar).
I've yet to research the manufacturing inputs of a plant vs. solar/wind/etc., so I can't give specifics, but in the abstract what you've described sounds more wasteful (and that's if it can be done at all; there is far less ambiguity with the nuclear option).
The only necessary CO2 release from batteries is a function of the current state of our global economy. Reducing the amount of fossil fuels used in the grid and in mining equipment directly reduces the fossil fuels used to manufacture batteries and solar panels, all the way down to zero.
As part of being massively more expensive, nuclear actually involves significantly more indirect fossil fuel usage due to those costs. For example, all nuclear power plants require onsite fossil fuel generators which need to be run regularly: http://aa-powersystems.com/wp-content/uploads/3061871_OE_Bro... Yet those generators are left off of assessments of CO2 releases associated with nuclear power plants. Similarly, the centrifuges required to manufacture nuclear fuel require vast amounts of electricity to manufacture and run. The nameplate generation from nuclear further ignores those ongoing energy costs.