I maintain that the power of an intermittent generator should be quoted in watt-hours per year. Dimensionally that is of course just watts, but collapsing the time terms obscures that it's a mean value, which the Wh/yr notation refuses to do.
Perhaps we could go full acre-foot here and use the kilowatt-day per year. So a 100,000 kW·day/yr installation can provide 2 kW to 500 homes for a hundred days.
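The arithmetic above checks out, and so does the mean-power conversion from the previous paragraph. A minimal Python sketch (all variable names are mine, purely illustrative):

```python
# A "kilowatt-day" is just an energy unit: 1 kW·day = 24 kWh.
KW_DAY_IN_KWH = 24

annual_output_kw_day = 100_000   # the 100,000 kW·day/yr installation
homes, load_kw = 500, 2          # 500 homes drawing 2 kW each

# How many days can the annual output cover that load?
days_supplied = annual_output_kw_day / (homes * load_kw)
print(days_supplied)             # 100.0 days

# Collapsing the time terms gives the mean power the unit deliberately
# keeps visible: divide annual energy by hours per year (~8766).
mean_power_kw = annual_output_kw_day * KW_DAY_IN_KWH / 8766
print(round(mean_power_kw, 1))   # ~273.8 kW average
```

The same installation reads either as "273.8 kW" or as "100,000 kW·day/yr"; only the second form tells you it's an annual mean rather than a continuous rating.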
Those of you who are aesthetically repelled by batteries being rated in amp-hours instead of joules will really hate this one. But it gets at the difference between maximum continuous power and total energy delivered per installation.
So, big 'it depends'. Anywhere the wind reaches its working speed more often than the sun delivers its working irradiance, wind will be cheaper in relative terms, and vice versa for solar.