According to table 1, losses for 800 kilovolt HVDC systems can be as low as 3.5% per 1000 km. That comes to 52.5% loss over 15,000 km. Losses should go lower at 1100 kilovolts (the highest voltage HVDC systems running today) but I can't find detailed loss numbers on those systems.
At 52.5% energy lost in transmission (47.5% retained), this is still considerably more efficient than making hydrogen via electrolysis, liquefying it for shipping, and then burning the hydrogen in the receiving country to generate electricity.
This particular plan is breathtakingly ambitious and would no doubt cost a fortune. It may appear more attainable when (if?) the similar but smaller Australia-Singapore Sun Cable project gets built:
Total losses for an 1100 kV line can be estimated as follows. As correctly commented below, a 3.5% loss per 1000 km compounds to 58.6% retained power over 15,000 km (0.965^15 ≈ 0.586), so I will use a 41.4% loss over that distance for my napkin calculation.
Voltage drop in the 800 kV line: V_in - V_out = 314 kV = I * R (losses in DC lines are mostly resistive heating, unlike AC lines, which also have reactive and skin-effect losses).
Assuming the same wires are used for the 1100 kV DC line (i.e. the resistance is the same), we can outline two cases:
A) you want to transfer the same inflow power on the Chilean side; then the voltage drop would be an 800/1100 fraction of the 800 kV case, i.e. V_in - V_out (1100 kV case) = 314 * 800 / 1100 = 228 kV. So the total loss is 20.7% (1.54% per 1000 km).
B) you want to transfer the same outflow power on the Asian side (i.e. a smaller current than in case A); then the voltage drop works out to 162 kV (a simple quadratic equation appears if, instead of equating inflow powers, one equates outflows, which depend on V_out), so the total loss is 14.8% (1.06% per 1000 km).
Summary: same inflow power (as in the 800 kV case) on the Chilean side: 20.7% loss; same outflow power on the Asian side: 14.8% loss.
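The napkin math above can be reproduced in a few lines of Python. The 314 kV drop and the 800/1100 kV figures come from the comment itself; everything else follows from ΔP/P = ΔV/V_in for a DC line at fixed current:

```python
import math

# Compounded transmission loss: 3.5% per 1000 km over 15,000 km
retained_800 = 0.965 ** 15                 # ~0.586, i.e. ~41.4% loss

# Figures taken from the comment above (kV)
V_IN_800, DV_800, V_IN_1100 = 800.0, 314.0, 1100.0

# Case A: same inflow power, so current (and the I*R drop) scales by 800/1100
dv_a = DV_800 * V_IN_800 / V_IN_1100       # ~228 kV
loss_a = dv_a / V_IN_1100                  # ~20.8%

# Case B: same outflow power. With x = I_1100 / I_800 and R held fixed,
# x * (1100 - 314 x) = 800 - 314, a quadratic in x (take the smaller root).
a, b, c = DV_800, -V_IN_1100, V_IN_800 - DV_800
x = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)
dv_b = DV_800 * x                          # ~163 kV
loss_b = dv_b / V_IN_1100                  # ~14.8%

def per_1000km(total_loss, km=15000):
    """Convert a total loss over `km` into an equivalent per-1000-km loss."""
    return 1 - (1 - total_loss) ** (1000 / km)

print(f"800 kV retained over 15,000 km: {retained_800:.1%}")
print(f"Case A loss: {loss_a:.1%} ({per_1000km(loss_a):.2%} per 1000 km)")
print(f"Case B loss: {loss_b:.1%} ({per_1000km(loss_b):.2%} per 1000 km)")
```

This is only a sketch under the comment's own simplifications (identical conductors, purely resistive losses); real converter-station losses would come on top.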
Is ~50% retained more efficient than producing the solar power in Asia itself?
Wouldn't it be better to generate this power in the deserts of the Middle East and the wide open plains of Central Asia, and transmit it over shorter distances, despite producing less power per solar panel?
The whole point is that you can generate the power at different times. The deserts of the Middle East get sunlight at a different time than South America.
If we had a power grid spanning the whole world, we wouldn't need a grid storage solution at all.
That's a good point; I can see how this could be better than battery-based storage for nighttime. But I wonder if wind power coupled with some limited battery storage, geothermal, and newer-generation nuclear peaker plants could solve this problem instead of laying a mega-cable under the ocean.
That's fair, but I still think that a 15,000 km undersea cable adds a lot more complexity and cost compared to diversifying renewable low-carbon power generation within Asia.
Also, the cost of laying that cable is going to be baked into the cost of power coming from Chile, and this might make it far less economically viable than adding redundant generation capacity in Asia.
_Microft linked to a great paper showing that 99.99% reliability can be achieved in 42 different countries using a mix of overprovisioned solar, wind, and three hours' worth of batteries.
A mix of power sources and some storage can of course solve this problem. The question is whether it is cheaper (or more desirable for other reasons). An infrastructure of a couple of HVDC lines that are vital for countries on different continents has very interesting geopolitical consequences for example. Imagine the bargaining power a country has if it can literally turn off the lights during the night in another country.
With fossil fuels it is at least easy to store months worth of energy, but still OPEC is quite powerful.
Are the losses thermal? How significantly could this change the ocean temperature near the cable? Is 50% of 600 GW enough to cause a 10 °C increase in the temperature of 10 million liters of seawater every second?
The short answer is that if 10,000,000 L/s of well-mixed water passed over the wire and absorbed all 300 GW of losses, its temperature would increase by roughly 7 °C (ΔT = P / (ṁ·c) ≈ 3×10¹¹ W / (10⁷ kg/s × 4186 J/kg·K)).
The long answer is that this is not quite the right way to view the question. There will be a temperature gradient around the cable, a zone where the water is heated. The question is how hot this zone is and how large it is. The cable will be buried, which makes the zone larger but less hot. However, this all depends on the type of soil on the ocean floor.
One thermal study estimated that, at typical operating temperatures and burial depths, the sea floor (the soil, not the water) would heat by 10 to 18 degrees Celsius, which the study's authors say is enough to potentially interfere with ocean life. Because cable operating temperatures are limited, a project like this would create a larger such zone, but the maximum temperature would stay about the same.
It’s important to note that we already mess up aquatic life by discharging thermal energy into bodies of water. High temperature water discharge from power plants can make rivers more friendly towards invasive species and shift the balance of aquatic ecosystems. This project might be a net neutral for aquatic life by reducing high temperature discharge into rivers while creating hot spots in the ocean.
What about the magnetic field? I think distance is required between the two conductors of the cable to prevent arcing, which also means there will be a magnetic field.
Heat becomes an issue before magnetic fields do. Field strength falls off by an inverse power law, while conductive heat transfer produces roughly linear temperature gradients. The spacing required for cooling is greater than the spacing required for electromagnetic insulation, except at very low power flows.
hmm, 500 gigawatts dissipated over the ocean's ~360 million square kilometers of surface gives a forcing effect of about 1.4 milliwatts per square meter. Global warming right now is forcing about 1 W/m², so it's not a problem at this scale. But it's not a crazy question either.
It’s not evenly spread, though. It’s concentrated along the cable. It won’t heat up the ocean much as a whole, but it will create “hot spots”, which may change the balance of ecosystems in the regions of the ocean the cable passes through.
Worth it to stop carbon emissions? Absolutely, in my opinion.
Something that should be considered and minimized if possible? Also yes.
That's a good point. 300 gigawatts of losses spread over 15 megameters gives 20 kW/m of local heat to deal with, which sounds kind of yikes... but then again a pool heater draws 5 kW. Seems clear the biggest effect will be on the seabed, but that's already true because you gotta drop the cable. I reckon the cable itself, and its underwater infrastructure, might have more impact than the heat it generates. Another consideration is that there have always been heat sources in the deep ocean.
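A quick sanity check on the round numbers in this sub-thread. All inputs are assumptions taken from the comments above (300 GW of losses, i.e. 50% of 600 GW; a 15 Mm cable; the ocean's roughly 3.6×10⁸ km² of surface; the 10⁷ L/s flow rate from the earlier question):

```python
# Round numbers from the thread (assumptions, not measured values)
P_LOSS = 300e9            # W, ~50% of 600 GW lost as heat
CABLE_LEN = 15e6          # m, ~15,000 km of cable
OCEAN_AREA = 3.6e14       # m^2, ~3.6e8 km^2 of ocean surface
FLOW = 1e7                # kg/s, 10 million L/s of seawater
C_WATER = 4186.0          # J/(kg*K), specific heat of water

linear_heat = P_LOSS / CABLE_LEN       # W per meter of cable
forcing = P_LOSS / OCEAN_AREA          # W/m^2 averaged over the whole ocean
delta_t = P_LOSS / (FLOW * C_WATER)    # K, bulk temperature rise of the flow

print(f"local heat: {linear_heat / 1e3:.0f} kW/m")
print(f"forcing:    {forcing * 1e3:.2f} mW/m^2")
print(f"bulk dT:    {delta_t:.1f} K at 1e7 kg/s")
```

Whether you use 300 or 500 GW, the globally averaged forcing stays around a milliwatt per square meter; the real issue is the concentrated ~20 kW/m along the route.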
I'd say synthetic gas would be a more versatile option. They could also, like Iceland, export the energy as aluminum or similar products that take a lot of energy to produce. (And, dreaming a bit: I think the Andes is the place to build a large railgun-style space launcher, and not just for one launch per week; I mean a large spaceport launching stuff non-stop, where besides launch energy you'd also need energy to produce fuel for the upper stages.)
The viability of synthetic gas really depends on the price of your carbon source. Direct air capture of CO2 is pretty expensive but subsidies can make it work.
Ammonia is the conventional option for long distance shipping of hydrogen. The ammonia can be used directly as the fuel or cracked at the destination for its hydrogen.
The loss on a microwave relay is just incredibly high (think >90% power loss), because the beam spreads as distance^2.
Also, there are physical limits on how tight the resulting beam of energy can be: it's not a laser beam, so it would be irradiating hundreds of m^2, if not km^2, at whatever its peak W/m^2 is. It also can't be anywhere near a useful radio band, because of interference.
The distance^2 law applies, but it's most pronounced for an isotropic source (i.e. one emitting across a sphere). If you create a tight beam (think collimated laser), you vastly reduce the overall system loss. The bigger problem for microwave links is converting electricity to RF and back: you get big losses in the amplifiers, radiators, and rectifiers.
I'd be interested in seeing the math. Would multiple hops and satellites be required? What's the power loss for each hop? What's the cost and capacity for each satellite?
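Part of that math can be sketched from the diffraction limit alone. The numbers here are illustrative assumptions, not from the thread: 5.8 GHz (a band often discussed in power-beaming studies), a 100 m transmit aperture, and one hop via geostationary altitude. The diffraction-limited spot diameter at the receiver scales as ~2.44·λ·L/D:

```python
C = 3e8                    # speed of light, m/s
FREQ = 5.8e9               # Hz; assumed power-beaming frequency
WAVELEN = C / FREQ         # ~5.2 cm wavelength

DIST = 3.6e7               # m; geostationary altitude, one satellite hop
APERTURE = 100.0           # m; assumed transmit antenna diameter

# Diffraction-limited (Airy) main-lobe diameter at the receiver
spot = 2.44 * WAVELEN * DIST / APERTURE   # meters
print(f"spot diameter: {spot / 1e3:.0f} km")
```

That works out to a main lobe tens of kilometers across, so a ground rectenna of any practical size captures only part of the beam, and the RF conversion losses at each end come on top of that spillover, for every hop.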
Batteries are not nearly energy dense enough to ship. A ship would need more energy than is even stored in the battery just to make it across the ocean.
You said that already, but why would that be a better approach than using a higher voltage from the start, or using AC? This isn't exactly an unsolved problem.
You will have less loss if you correct for some of the voltage drop midway.
You lose more electricity if your line drops from 1000 kV to 600 kV in one run than if it drops from 1000 kV to 800 kV twice with re-boosting in between, even if you count the up-conversion loss.
Megavolt-range up-conversion is rather inefficient (that's why an HVDC link's real end-to-end efficiency is notably lower than its transmission-line efficiency), but it would still be more economical.
https://publications.jrc.ec.europa.eu/repository/bitstream/J...
https://suncable.sg/