For somewhat counterintuitive reasons (tidal heating, mostly), it's likely that three of the four Galilean moons of Jupiter, as well as Titan (Saturn's largest moon), all sport significant subsurface oceans of liquid water.
For example, despite being much smaller than the Earth, Ganymede is projected to harbor more liquid water than all of Earth's oceans combined.
(Whether this adds up to 30% depends heavily on what you start counting as a "body")
The issue isn't one of fixed vs. rotating; it's that radar fundamentally can't achieve the resolution necessary to distinguish important features in the environment. It's easily fooled by oddly-shaped objects, especially concave features like corners, and so while it's great for answering the question "am I close to something," it's not reliable for telling you what that something is, especially at longer ranges.
Note that the only proponent of the man-made theory mentioned is Graham Hancock, a somewhat infamous Atlantis-chasing pseudoscientist and overall purveyor of extremely flimsy "ancient civilization" theories.
That doesn't mean that there aren't also legitimate archeologists who might be excited about these prospects, but the lack of their mention is...suspicious.
To me it's quite telling that even John Anthony West is among those who say this structure is natural. J. A. West was famous for his theory about the Sphinx showing signs of water erosion and hence possibly being 12,000 years old.
> A better approach is to optimistically merge most changes as soon as not-rocket-science allows it, and then later review the code in situ, in the main branch. And instead of adding comments in the web UI, just changing the code in-place, sending a new PR CCing the original author.
This may work for small projects or open source projects that generally receive high-quality PRs, but this sounds truly infeasible for large teams or organizations.
There are a couple reasons why the code review process occurs before merging. First, it helps keep the canonical version of the codebase in a "correct" state. This is, ironically, an extension of the "not rocket science" principle that the article mentions. Without this invariant, any checkout of the codebase might contain what is essentially "WIP" code that will waste other engineers' time if they have to interact with it. Second, the social pressure of blocking someone else's work is an important and necessary force for motivating code review. The idea that folks will go back and review code that's already been merged is akin to "will add tests in a followup PR". It's a pleasant lie we tell ourselves, but it rarely comes true.
Relatedly, the article also seems to suggest that the code REVIEWER should be providing the changes necessary after code review. This is problematic for so many reasons: first, it places an even higher burden on code reviewers' time which no one wants; second, it encourages poor code hygiene if someone else is just going to fix up your crappy work later on; third, it robs junior engineers of what is perhaps their most valuable learning experience: getting feedback from more experienced engineers and acting upon it.
(there are other issues here; the general thesis of the article leans heavily towards making code easier to _write_ rather than _read_, which again I think is just not appropriate for a codebase with a significant lifetime or number of contributors)
I have no insider knowledge here, but Google tried to take the high road of working with carriers for years before giving up in the face of their intransigence.
I suspect that Google's RCS is proprietary as a blunt instrument to prevent carriers from trying to either (a) undermine e2ee in some weasely way or (b) have the ability to pick and choose the pieces of the implementation they want to support. You either get the whole thing, with e2ee that you don't control, or nothing.
Sadly the lesson from Google, Apple, and Whatsapp here appears to be "cooperating with telecom carriers is a fool's errand".
Google had the opportunity to own this space a decade ago when they made Hangouts the default SMS client on Android. It's exactly what Apple did with iMessage, but Hangouts was cross-platform.
It's absolutely bizarre to me they didn't iterate on that. I'm kind of glad they didn't.
On the flipside, Hangouts being sunset is the main reason I eventually left the Android ecosystem. Hangouts on a Pixel phone with Google Fi service was excellent as an SMS app. Feeling snubbed by the life getting choked out of Hangouts, I'm no longer a user of any of the three.
Yeah, it really is the poster child for Google not being able to innovate in its modern form.
What's really changed about their core products in the last 10 years (Maps, Mail, Ads, YouTube, Docs/GSuite)? Some of them have gotten some nice QoL improvements, but nothing has really been added to that list because they keep killing products off.
I'd much rather iMessage only open up interoperability with E2EE platforms like Signal or even WhatsApp (because Facebook is somehow the lesser evil in this corner of the privacy world).
In theory, E2EE is good, until someone you are messaging turns on iCloud backup of the messages you sent; now law enforcement can force Apple to hand over that iCloud backup, iMessages included.
There’s always a risk that someone you’re sending a message to has been compromised but most of us are never at risk from that, as opposed to things like dragnet data collection or server breaches. E2EE is solving the problems it’s designed to solve, so it’s not a problem that things out of scope are more complicated.
They are encrypted, but (by default) the key is escrowed for recovery by Apple support, which LE can request just as well as the account owner (or other parties with a judge's decree, such as surviving relatives).
And this is, honestly, a pretty reasonable default. For the average person, the failure mode is "I lost my phone, and I can't remember my iCloud password", not "I really need the cops to not be able to get into my backup", and they'd be super pissed off if Apple couldn't get them their data back. Having good security be available, but not the default, and requiring you to acknowledge the risks is a sensible trade-off for the customer service problems it might cause.
I kinda agree with you, but I think there's also a reasonable argument to be made around the idea that a user might be super pissed off that Apple made the default be not secure against state actors.
Also, how many people actually care all that much about their message history? I know I do (and I have 1GB of SMS/MMS/RCS message history dating back to 2010 that I back up to GDrive nightly), but it seems to me that most people don't care about their message history that much?
The nice thing is that there is now an advertised set of features to protect against state actors in the form of Advanced Data Protection, Lockdown mode and (soon) iMessage Contact Key Verification.
These all have significant usability impacts; I think Apple still has the correct defaults.
Finally, my understanding is that recovery keys are escrowed in an HSM separate from the cloud hosting, and releasing an escrowed key is an audited event. My concern is mostly about actors accessing my data or surveilling me without transparency, as that gives no chance for accountability.
I'll grant that what people really care about is their backed up photos, and there's nothing stopping Apple from having separate security strategies there.
That said, I suspect that there's more people out there who're going to lose their text history with their dead parent and be distraught over that, than who're going to be actively upset that the state can subpoena their messages.
As opposed to the same someone just going to the police and showing them your messages? Or getting caught and forced to open it? Or being an idiot and sending a screenshot of it to Facebook?
The issue you describe is just not an attack vector that is in any way relevant; if you can't trust the other side, all hope is already lost.
This is just me but I’m less bothered by Big Brother than I am by little brother.
I don’t worry (very much) that law enforcement will read my messages but I do worry that advertisers, insurance cartels, spam marketeers, bookmakers or price gougers will.
Sure, but in practice, everyone's RCS is currently E2EE since everyone uses Google's client and Google's server.
This should change, certainly! Hopefully Apple will force Google to open up their implementation and protocol for E2EE so they can build a compatible implementation.
Maybe. The challenge with E2EE is how to resolve an email address or phone number to the authoritative public key and networking route, securely. If we wind up with multiple authoritative sources of that mapping, each one has the potential to lie and become an avenue for surveillance. That's ignoring for the moment lesser issues, such as privacy issues with leaked metadata in querying these sources.
Things like Key Transparency in the IETF are tackling some of this, in the sense that they'll provide public evidence of tampering.
I don't suspect what Google has implemented for their own client/server setup gets us close to a multi-party solution within RCS Universal profile.
I can replicate this behavior fairly easily in a browser.
1. Open incognito window in Chrome
2. Visit https://t.co/4fs609qwWt -> 5s delay
3. Open a second tab in the same window -> no delay
4. Close window, start a new incognito session
5. Visit https://t.co/4fs609qwWt -> 5s delay returns
Your humble anonymous tipster notes to their skeptical audience that browsers are capable of caching all sorts of things, even something as peculiar as an HTML page.
Here's a simpler test that I think replicates what I was indicating in the GP comment, with regard to cookie handling:
Not passing a cookie to the next stage; pure GET request:
$ time curl -s -A "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0" -e ";auto" -L https://t.co/4fs609qwWt > nocookie.html
real 0m4.916s
user 0m0.016s
sys 0m0.018s
Using `-b` to pass the cookies _(same command as above, just adding `-b`)_
$ time curl -s -b -A "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0" -e ";auto" -L https://t.co/4fs609qwWt > withcookie.html
real 0m1.995s
user 0m0.083s
sys 0m0.026s
Look at the differences in the resulting files for 'with' and 'no' cookie. One redirect works in a timely manner. The other takes the ~4-5 seconds to redirect.
You're completely missing the point, which is that the 5 second delay doesn't exist at all for most t.co links, even without cookies. The delay only exists for a few Musk-hated domains.
In your second example you are passing the cookie file named ./-A then trying to GET the URL "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0" followed by https://t.co/4fs609qwWt
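For anyone rerunning the test as presumably intended, `-b` needs an argument. Something like the following is a sketch of the fix (the cookie-jar filename is just a placeholder): first save the cookies from an initial visit, then replay the request with them.

$ curl -s -c jar.txt -A "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0" -e ";auto" -L https://t.co/4fs609qwWt > /dev/null

$ time curl -s -b jar.txt -A "Mozilla/5.0 (X11; Linux x86_64; rv:102.0) Gecko/20100101 Firefox/102.0" -e ";auto" -L https://t.co/4fs609qwWt > withcookie.html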
The F-150 is 98 kWh standard range, 131 kWh extended. But the important point is this sentence:
> Toyota explained that the system supports supplying power from hybrid electric vehicles
Presumably, this means you have a short, medium, and long option for emergency power:
- short: the battery
- medium: the battery + your car's battery
- long: use your ($25K) Prius as a gas generator to power your home for as long as necessary.
This seems like a more versatile setup than the eF150, at least for my use cases (rural WV -- power might be out for long periods, but I can always get gas). It'll be interesting to see the price range, of course, but this could be a good "mostly battery + gas if needed" backup option to compete with the diesel generator situation now. And of course the eF150 isn't really a good backup power (or transportation!) option in my case.
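Rough numbers for a sense of scale of those three tiers: the sketch below assumes a ~9 kWh home unit (the figure mentioned elsewhere in this thread), a plug-in Prius pack of roughly similar size, and a 500 W critical load. None of these figures come from the article, so treat them as placeholders.

# Back-of-envelope runtime for the three tiers above.
# All capacities and the load figure are illustrative assumptions, not numbers from the article.

def runtime_hours(capacity_kwh, load_kw, usable_fraction=0.9):
    """Hours a battery of the given capacity can carry a constant load."""
    return capacity_kwh * usable_fraction / load_kw

critical_load_kw = 0.5   # fridge, freezer, furnace blower, a few lights (assumed)

tiers = [
    ("home battery alone", 9),                      # "around 9 kWh", per other comments in this thread
    ("home battery + plug-in Prius pack", 9 + 9),   # plug-in Prius packs are roughly 9-14 kWh depending on year
    ("F-150 Lightning standard-range pack", 98),    # from the parent comment
]

for name, kwh in tiers:
    print(f"{name}: ~{runtime_hours(kwh, critical_load_kw):.0f} h at {critical_load_kw} kW")

# Roughly 16 h / 32 h / 176 h -- and the hybrid's gas engine extends the middle tier
# for as long as you can keep refueling it.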
The eF150 generator use case always seemed like suburban prepper fantasy bullshit. The actual use case is for running power tools on site.
> The eF150 generator use case always seemed like suburban prepper fantasy bullshit.
Perhaps you didn't hear about the millions of people who were miserable (and several who died) because their power grid is run by morons[1].
Losing power for three days may have been unusual for a long time, but with the combination of radical/unaccountable government, climate change, and aging energy infrastructure, it's easy to imagine that a lot of the warmer parts of the US are at some risk.
I live about an hour from San Francisco and my power goes out about once a quarter for several hours. It's not the end of the world, but twice in the last year it happened while I was making dinner. A weird feeling to realize I can't even make food without the government's help.
I’m planning for solar cells and looked into the possibility of running my house as a micro grid (ie disconnect it from the main grid) in case of a prolonged power outage. Turns out that unless you redneck engineer it, running your house without a main grid to synchronize to is very costly - among other things you need to supply your own grid grounding and that could easily run into the high €x000.
In Europe (EU) every house needs its own grounding. You are not allowed to ground on the main grid's ground. We have three cables: 1) power, 2) zero (= the main grid's grounding), 3) own grounding. Only old installations are allowed to connect grounding to zero, which is called something like "zeroing".
In the UK, which was until very recently in the EU, most modern (last 20-30 years) houses have what’s called Protective Multiple Earthing where the house earth is just connected to the incoming neutral and there’s no earth spike at the house. Then the power company earth-bonds at the substation and several points between the substation and houses.
I believe metal pipe bonding is a building control requirement; however, many new builds use predominantly plastic pipework. The requirement is to protect the occupant against touching live metal pipework, not to earth the house.
Here in the US, I have three wires from the utility: phase 1, phase 2, and neutral. I also have two copper ground rods that are 3-4 meters in length, which are connected to the utility neutral.
In many cases the grid's neutral is connected to the house grounding at the first electrical panel; this is done as standard in new houses in my country.
Stake in the ground, as the sister post said, non-corroding, in the simplest case.
But there is more to it: the stake has to have permanent contact with some electrically conductive layer in the ground, so you need to take geology and local climate into account. In Central Europe, with its generally wet climate, you just need to reach the year-round stable, frost-free local water table at a depth of (usually) between 1 m and 10 m. If you cannot reach sufficient depth, or don't know the required depth, a simple stake isn't going to cut it, because in case of an electrical fault the grounding has to withstand and dissipate on the order of a few hundred amperes. To achieve that you then shallowly bury lines of non-corroding material in a grid, or bury a grounding net something like 1 to 2 m deep over an area of 100 m^2 to 10000 m^2.
If you are on sandy or rocky ground, permafrost, or an arid climate, and no handy body of water is nearby for grounding, you need a far larger grounding net or conductivity-enhancing methods like permanent watering, adding salts or carbon to the soil, or replacing it outright with something more conductive. In all, very expensive.
And as for large installations, you just measure the soil conductivity, calculate the necessary grounding current and scale up the aforementioned methods.
This must be a European design. In the USA, governed by NFPA 70 (the National Electrical Code), the ground rod's purpose is to establish the ground voltage reference for a building's electrical systems and to dissipate any charge buildup on the circuits.
The ground rod is not there to carry current to interrupt a fault to "protective earth" (the green or green/yellow wire). The circuit breaker interrupts a short because the protective earth wire is brought back to the building's main disconnect, where it bonds to the neutral wire.
If a fault occurs, an unregulated amount of current flows on the protective earth wire to the breaker panel and a circuit breaker interrupts the circuit.
I'm no expert, but here in Norway we predominantly have IT systems[1], though new installations are mainly TN.
In the IT systems, the protective earth is not bonded to the neutral. Thus in case of a fault, the protective earth should be low impedance to ground so that the circuit breakers trip. At least that's my understanding.
Technical nitpick, if it detects earth leakage and trips based on that, it's an RCD, not just a circuit breaker. Otherwise yeah, there's different approaches to earthing - in Australia for domestic stuff we have mandatory RCDs which work as you describe above, but then also in industrial/mining settings we have the big green/yellow cables which will directly sink current (potentially hundreds of amps) to ground, hopefully stopping you from getting bitten.
A thermal breaker won't trip on leakage current or arc faults either. I was trying to describe the classic "dead short" that will trip a thermal circuit breaker or fuse without the circuitry for arc or leakage detection.
Ground water isn't really receding, it's just pumped to a low level to benefit agriculture in early spring, which then fucks everyone up if the spring and/or summer is dry.
Usually you can just tie it to an outside copper plumbing pipe; it's metal and makes good contact with your local ground.
Disclaimer: Not intended in any way as professional electrical advice yada yada. Just what I've read.
(Also all sorts of weirdness can take place around grid earth vs. local earth, eg. during thunderstorms. Earthing is its own entire engineering discipline. :S )
You need to put in as many stakes as needed to achieve 4 ohms to ground or less. This can be up to ~10 stakes at a couple of meters' spacing; it depends a lot on the soil type.
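For a feel for why it can take that many stakes, here's a sketch using the standard single-rod approximation (Dwight's formula); the soil resistivity, rod dimensions, and the parallel-rod derating factor are all illustrative assumptions on my part:

import math

def rod_resistance(rho_ohm_m, length_m, diameter_m):
    """Approximate earth resistance of one driven rod (Dwight's formula)."""
    return rho_ohm_m / (2 * math.pi * length_m) * (math.log(8 * length_m / diameter_m) - 1)

rho = 100  # ohm-metres; a middling loam. Sandy or rocky soil can be 10-100x higher (assumed value).
single = rod_resistance(rho, length_m=2.4, diameter_m=0.016)
print(f"one 2.4 m rod: ~{single:.0f} ohm")  # roughly 40 ohm in this soil

# Even well-spaced rods in parallel shield each other somewhat, so assume only ~70% effectiveness.
for n in (2, 5, 10):
    print(f"{n} rods: ~{single / (n * 0.7):.0f} ohm")

# With these inputs, even 10 rods only get down to ~6 ohm; wetter, more conductive soil reaches 4 ohm much sooner.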
You also need a transfer switch to make sure you're disconnected from the grid otherwise you're going to be feeding power back into lines that are supposed to be dead.
I already have a 2000W inverter wired to my Prius' 12v battery. Pop the car in ready mode and I can run things off it for ages, and then unplug them and go refuel the car if needed.
Having the extra storage battery mounted at my house would be cool and all I guess, but you don't need this to back up your house's power supply with a Prius.
(I live in a small, simple house and only run the blower fan for my propane heating system, my refrigerator, my freezer, and a lamp off the inverter. I suppose if you had much more complex power needs, the battery would be a larger advantage, but for emergency power outages, it keeps me from freezing or losing all my food.)
Honestly I kinda already did. Connect 2000W pure sine inverter to terminals on 12v battery in trunk. Plug in extension cord.
It's not terribly sophisticated, but it keeps my pipes from freezing.
Refueling a car is as easy as it gets, though. To refuel anything else, you'd have to put it in a car and take it to the fueling station anyway. This way you just refuel the car the normal way. You can also store whatever gasoline you'd have used in a generator at home too, but the Prius' 10.9 gallon tank holds a lot more than your average portable generator and runs a good, long while.
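One practical note on the 12 V inverter approach, with assumed numbers (the efficiency and voltage below are placeholders): at full output the low-voltage side carries a lot of current, which is why the inverter needs short, very heavy cable straight to the battery, and why the car has to be in ready mode so its DC-DC converter keeps the 12 V battery fed.

# Battery-side current for a 12 V inverter at various AC loads (illustrative figures).
inverter_efficiency = 0.85   # assumed
battery_voltage = 12.5       # nominal; sags under load

for load_w in (300, 1000, 2000):
    amps = load_w / (inverter_efficiency * battery_voltage)
    print(f"{load_w} W load -> ~{amps:.0f} A on the 12 V side")

# A full 2000 W works out to nearly 190 A; a fridge/freezer/blower load of a few hundred
# watts stays in a range a stock charging system can sustain.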
Running your home off your car's combustion engine sounds like exactly the wrong way around. I want a battery that allows me to save days worth of power from solar panels to use during cloudy days.
Well, I suppose if you have enough panels to generate power even on short, cloudy winter days, then that doesn't matter anymore; then you just need enough storage for the night. But then what are you going to do with all the surplus power on sunny summer days?
Some sort of cheap long term storage would really help a lot.
At least for our install, we spent about half the money on batteries and half on solar. It fairly reliably gets through the night unless we run the AC (but those are sunny days) or charge the car (car batteries are about the same size as the house batteries).
On a cloudy day, the panels provide 90% of our normal power usage. Anyway, to scale it up, we'd want to increase panels and also batteries. Increasing only one would leave us with no power at dawn or with a large battery that would never reach 100% in winter. One night of batteries with panels that reliably provide enough electricity to get the batteries to 100% is a good tradeoff for sunny climates. As it gets cloudier, batteries might have more incremental benefit, but multi-day storage probably doesn't make sense.
Also, you can tie a gas/propane generator to the battery to handle the "a few times a year" cases. That's probably less carbon intensive than 5x-ing the system for 1% of the days.
(Since the 1% days for us are in winter, we have a wood stove.)
> But then what are you going to do with all the surplus power on sunny summer days?
You don't have to do anything with it. Panels are dirt cheap these days, and if you want reliable off grid storage then you have to size them to keep up with baseload power under your target range of conditions anyway. Figure out how many days a year you're happy to run a generator or turn your fridge off, find stats on your local daily kWh/m^2 solar energy, size panels to cover baseload with that incoming energy.
I have 2kW of second hand panels hooked up to a 200AH 24V battery pack (again second-hand) to power a server rack, it uses about 4.5kWh of solar power per day without running the batteries down too far. The panels can generate 8kWh/day in summer, the rest is headroom for cloudy winter days. The last time the server saw mains power was... December, I think?
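A minimal version of that sizing exercise, in case it helps anyone; the insolation and derating figures below are placeholders you would swap for your local statistics (PVGIS, NREL, etc.):

# Size an off-grid array from daily load and worst-acceptable-month insolation.
daily_load_kwh = 4.5           # e.g. the server rack above
worst_month_sun_hours = 1.5    # average daily peak-sun-hours in the darkest month you want to cover (assumed)
derate = 0.75                  # wiring, charge controller, battery round trip, soiling (assumed)

required_kwp = daily_load_kwh / (worst_month_sun_hours * derate)
print(f"~{required_kwp:.1f} kWp of panels to cover {daily_load_kwh} kWh/day in the worst month")

# ~4 kWp with these inputs; days worse than the monthly average are what the battery
# (or the "run a generator / switch the fridge off" allowance above) has to absorb.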
> Some sort of cheap long term storage would really help a lot.
Generating some sort of fuel is probably the best idea indeed. Hydrogen is often suggested because it's easiest to create out of water, but it's also hard to store. Ethanol is definitely easier to store, but the amount of biomass required is a bit of an obstacle.
Genuine question - how many engines can run on straight ethanol? Are there any negative consequences for running straight ethanol in an engine expecting gasoline?
Yes, I get it, it's indeed a lot of surplus in the summer.
Long term storage would be the better solution, but batteries don't seem to be able to store a large amount of energy anyway. So in my opinion, it's really a tradeoff.
I looked into this. Even assuming PG&E's buy back rate is 50% of current numbers, the ASICs have an expected profitability horizon of over one year of uptime. I'm not convinced they'll be profitable much longer after that, since hash/kWh keeps improving.
Also, I bought the panels to reduce carbon emissions, and would rather sell the power back.
> The eF150 generator use case always seemed like suburban prepper fantasy bullshit. The actual use case is for running power tools on site.
I think the Hybrid F150 / generator case is pretty decent; not so sure about the EV only generator one, but running tools could be useful. Rolling a truck over to my well when the power goes out will be a lot nicer than rolling out a portable generator by hand. Could be maybe useful for cell towers that rely on generators driven to the site during outages as well; although that depends on if they usually drop off a generator on a trailer and let it sit without local supervision or if they stay with the generator. My well servicing company has a box truck with a generator in the back, that they use to confirm that the problem isn't related to utility electric service; not sure if a built up f-150 would be sufficient for their storage/transport needs though.
Let me introduce you to the Eastern Seaboard and Gulf Coast of the United States of America, where people often run generators intermittently for weeks after a hurricane in order to keep their refrigerators and freezers cold until power can be fully restored.
In that case, the Prius Prime battery alone is probably sufficient for several days. And it's 30K less than the eF150. And you can actually buy one off the lot today. :)
I don't heat my home with electricity, and I am not home at the moment. Without lighting and cooking it does 4 kWh/day.
When I am home and cook, wash and have the lights on I do about 7.
Factoring in the heating (Swedish "fjärrvärme", i.e. district heating: hot water from a central plant), I do A LOT more. Something like an extra 30 kWh/day in the winter months for a 120 m2 home with half-decent insulation by Swedish standards.
I measured a ground-source heat pump consuming about 30-50 kWh per day at -20°C for hot water and heating of a 100 m2 house. About 22°C indoors in a '70s, somewhat poorly insulated, 1-story brick/stone house. A good 1/3 of that energy goes to hot water.
In houses where hot water is heated with electricity the ratio might be even worse.
Another data point: 180 m2 house renovated in 2009, very well insulated, brand new 5.1-COP GSHP equipment; we averaged 36 kWh per day for heat in January. We generate some hot water from that, but our primary hot water is resistive electric.
Another data point: our fairly old house, heated with an air heat exchanger, uses about 60 kWh per day during the winter months. The power draw for heating drops to almost nothing during the coldest period, when the outside air is too cold and we use firewood.
Apart from heating the house, we also use some power for hot water and pumping water from the well into the house.
When my apartment was empty for a few days last month, it used 2.2kWh/day.
I used 1700kWh of electricity last year, presumably mostly on cooking and the fridge-freezer. I don't have the district heating (fjernvarme) bill to hand, but that wouldn't be comparable to a house anyway.
Full-size fridge and freezer from 2015, a forced ventilation fan (which is probably around 35-60 W), one server (a repurposed office computer), 2 WiFi hotspots, one of those Google speakers, a router (USG), and a PoE switch. Those are the big ones.
We didn't build the house, so there are all kinds of standby stuff (including needing smart lights for most lights, stove, towel heaters).
I didn't turn these things off because my mother in law is using the apartment a little while we are gone.
4 kWh/day is 166 W average, which is a refrigerator and a couple of WiFi access points, a camera, a home assistant device, and some other random plugged-in devices (e.g. a cordless phone).
Indeed. I just looked up the heat loss for my house: 213 W/K. So, maintaining 20°C inside with -20°C outside requires about 8.5 kW of heat energy (200 m2 single-family house built around 2008), or approx. 200 kWh/day.
Generally by using the dimensions and heat conduction properties of the exterior surfaces of the house (area of foundations, walls, windows, roof), taking into account the heat gain due to sun, with some local fudge factors applied (loss due to wind, natural ventilation, outside temperature).
Then, after the building is actually built and inhabited, the calculations are adjusted by the actual energy consumption over the year.
As pointed out, competent HVAC companies should have people on staff comfortable with such calculations. However, my experience shows that this is not universally true, and many are just guided by intuition/experience with other projects (e.g. the roof insulation thickness on the previous project was X, so that's good enough for you, or "well, on average we recommend 50 W/m2 of heating power when selecting a heat source"). Which probably works fine for many cases (e.g. renovating an older building, where even if the material properties when new are known, you can only guess the values after 20 years of service).
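As a toy version of that calculation: sum U-value times area over the building envelope to get a W/K figure, then multiply by the design temperature difference. The U-values, areas, and ventilation term below are made-up illustrative numbers, not anyone's actual house from this thread:

# Steady-state envelope heat loss: Q = sum(U_i * A_i) * dT, ignoring solar and internal gains.
surfaces = {            # (U in W/m^2K, area in m^2) -- assumed values
    "walls":   (0.25, 180),
    "roof":    (0.15, 100),
    "floor":   (0.30, 100),
    "windows": (1.10, 30),
}
ventilation_w_per_k = 40   # air-change losses (assumed)

ua = sum(u * a for u, a in surfaces.values()) + ventilation_w_per_k   # total W/K
dt = 20 - (-20)            # 20 C inside, -20 C design outdoor temperature

heat_kw = ua * dt / 1000
print(f"UA = {ua:.0f} W/K -> {heat_kw:.1f} kW continuous, ~{heat_kw * 24:.0f} kWh/day")

# With these inputs: UA = 163 W/K, about 6.5 kW and ~156 kWh/day -- the same ballpark
# as the 213 W/K and ~200 kWh/day figures above.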
Even a very well insulated home can use quite a bit of electricity.
The most rigorous standard for home efficiency, the Passive House standard, stipulates that no more than 15 kWh/m^2/yr is used for space heating. For a 200 m^2 house, that's 3000 kWh/yr.
Given a 120 day (4 month) heating season, that's 25kWh/day average just for space heating. Obviously it varies quite a bit, with some days much higher and others much lower. Add in other electricity uses, like refrigeration, laundry, and you're easily at 40+ kWh/day even in an efficient Passive House.
That is obscenely high. Where I live, it would cost over ~~150k~~ 7k USD to use that much power every day for a whole year. How is it even possible to use this much energy??
This means that you pay ~4 USD per kWh. That's almost 12 times more expensive than in Germany, where electricity is supposedly expensive. Where do you live?
Wow, how is 100 kWh/day even possible? We consume about 3 kWh/day (excl. heating) in our standard-sized 2-person Dutch household, living pretty normal life.
It's going to be 39°C at about 60% humidity here today, with a bright sun beating down. My home has some decent insulation: double-pane low-E windows without metal framing, thick blown-in attic insulation, etc. The AC is set for about 26°C. I'll probably still use about 70 kWh of power today, with the majority of that being the AC. The pool pump uses a good bit of power too, though; pumping about 60,000 gallons of water through the filters adds up.
You can maybe do 3 kWh if you are not cooking with electricity or running a dishwasher or washing machine.
I installed an electricity meter 2 weeks ago, and the lowest it got was 4.8 kWh/day in a 2-person Croatian household, although I do have a small Synology NAS running 24/7 and we have a TV on for a couple of hours.
2 Adults, 2 kids, also close to 10 kWh per day (cooking on electricity (induction) but showering on natural gas, 0.6-0.8 m3/day). When we are not home, it's about 4 kWh per day (2 freezers, 1 fridge, home server, router etc). Big sources are Laundry, dishwasher, hot water in the kitchen (5L boiler).
But we heat the house on gas, and last December we burned about 180 m3 of it. Now, during summer (in the Netherlands), we don't need heating or air conditioning.
2 adults, 1 kid, Belgium. One adult is always WFH (we alternate). Average of 13 kWh per day. There is a server rack running 24/7 in the basement, though, but it's optimized (NUCs and RPis, no costly energy-burning servers) and this rack alone accounts for 3-4 kWh per day (out of the 13).
We heat and cook using natural gas.
The biggest consumers are the same here: dishwasher and laundry.
Not sure how you are managing this; are you sure your numbers are correct? When I turn my kettle on it consumes more than 2 kW. Yes, it only runs for a few minutes at a time, but it all adds up, not to mention the electric oven.
With all major appliances off (except the fridge/freezer) I draw about 0.14 kW, which adds up to 3.36 kWh in 24 hours.
24kWh a day here. Fairly large house near Cape Town, South Africa. This is excluding heating in winter, for which we mainly use a slow combustion fireplace and also natural gas. Stove is also natural gas. Rarely use AC for cooling.
Easy, live in a house 4x the size of a 2-person Dutch household and in a climate that averages 10 degrees C warmer, like would be common in the southeast US.
I ran your figures through a local price comparison engine. [0] Cheapest rate for you here would be ~12800€, or ~$13400/year. For reference, that's slightly less than half the median net income here. [1]
I also made a quick price comparison between the US and here:
My conclusion is that the US provides a reference framework of cheap, abundant energy. The environmentally conscious have to deal with a framework that stimulates unbridled energy consumption, with hardly any real incentives for conserving energy.
Update: the local price comparison site was just updated this morning. The price here would be between 14,569€ (0.40€/kWh) and 17,308€ (0.47€/kWh) depending on the supplier chosen. This includes all taxes and surcharges.
Something is off here. Even if both your cars use about 40 kWh per day, every day (which is humongous), that would still leave an absolutely staggering 60 kWh/day. That would be the same as heating a house in Nordic Lapland during winter with electricity alone.
Given a COP of 5 for a ground-source heat pump, you can calculate: 60/24 is 2.5 kW of power draw around the clock. Multiplying by 5 gives 12.5 kW of heat non-stop. That is a lot of heat.
2000 sqft is 185m2 which is a mansion by my standards though :)
The average size for a house in the US is around 2,500 sq ft, so 2,000 sq ft is below average. OK starter house but not more than that. I understand that houses in Europe are much smaller, though, due to lower incomes and higher utility costs.
I have 2800 sq ft house and use around 30-35 kWh/day in summer. 60kWh/day is high but not outrageously high.
I just ran my last year's electricity. I'm around 26 kWh/d in the DC area in an old, inefficient house of around 1500 ft^2. While it has a gas stove, gas water heating, and gas house heating, it does have electric window AC units. I'd be really interested in how the previous author got to 6-8 kWh/d.
I used about 8 kWh/day when I lived alone in a 1550 sqft townhouse with gas heat and cooking. That ran my IT equipment, refrigerator, and blower fans for heating and exhaust. No AC usage.
Now I'm in single family home and my energy use is bonkers, but most of that is heating while I'm missing part of my roof and an entire exterior wall. It should be criminal for a town to take two years to approve permits.
My last bill I ran 95 kWh/day. Smaller house, but we have several adults and even more kids who all shower and do laundry, as well as other things. I wish I could get it down by half, but I don't see how I could at this point.
An air-to-water heat pump for heating that shower water would pay for itself in a few months. Solar panels would cover the rest. Hot water is probably your biggest energy sink.
I use around 70kWh/day (but nearly zero in summer and probably 4x that on colder days).
Heating is a heat pump with probably 300% efficiency (i.e. 3kW heat for 1kW electricity). Walls are 300mm insulated wood frame. Triple glass windows. -20C for at least one week every winter. Could probably lower the consumption by recycling more heat (none of the wastewater heat from hot water running down sinks is recycled for example).
That's incredible. We use up to 30 kWh/day (heating a poorly insulated house), which is almost double the national average household daily usage here in Australia. I guess you really do use a lot more power living in a freezing climate.
In the climate I live in (Latvia, -20°C for a few weeks in winter), a reasonable energy consumption for heating a single-family house is around 100 kWh/m2/year.
My 200m2 house is slightly worse at ~120kWh/m2/year, or around 24MWh of energy per year (that also includes domestic hot water though).
With a ground/water heat pump it should translate to ~5MWh of electricity per year, or about the same as my current yearly electricity consumption.
I was fairly resistant to Ugg boots and the like, but I realised why they are so popular this year. You really can get by quite comfortably in a Sydney winter without heating. And my home gets cold.
Just to add a point as everybody is going on about how much they use per day or what: You are expected to generate power, too.
With a battery around 9 kWh, you'll probably install a solar system around 3 times as big, or something like that.
So that capacity has to buffer for the nighttime when you're sleeping; heating probably goes down somewhat, nobody is cooking on 4 induction plates, etc.
I'm in a relatively small UK house. Selling power back to the grid doesn't get you lots of money these days, so the emphasis is on using batteries for nighttime use. Running off-grid isn't really something that is done, in urban areas anyway.
Heating is gas, the hob is gas.
I have 3 kW of panels and a 5 kWh battery. During the summer, hot water is heated through an immersion heater from solar; my electricity bill is roughly zero and I get to sell a bit back. During the winter, forget about it.
That's exactly how you should do it, I think. Generate as much power as you can use for yourself, fill a battery for the night, repeat that cycle.
My landlord installed a 30 kW peak solar system (two households, six kids/people), installed a heat pump, and insulated some walls that were not insulated before. Surely a hefty investment, but external power usage has dropped pretty much to zero from February to October. Even after that it's minimal.
There is no reason for such a comparison: the point of a small battery is to ensure 24h of full autonomy for critical loads (fridge, freezer, mechanical ventilation (VMC) if it's a new home, lights, computers, etc.). Batteries are NOT cheap, so choosing to limit what's backed up is reasonable.
Beyond that, for far bigger backups, a vehicle might be a game-changer: it needs a far bigger battery anyway for its own performance, so battery cost doesn't matter much for this use case, and you might as well use it once you own the battery...
I'm a little sad that there aren't any comments describing what is technically novel around Fuchsia and why it is/why it isn't interesting from an OS design standpoint.
I get the sense that its advances are probably too low-level for most app developers to care, but that's kind of precisely why I'd love a comment elucidating them a bit.
Edit: For example, the Fuchsia docs list the primary talking points as secure, updatable, inclusive, and pragmatic. How well does it live up to those principles? Will they bring practical benefits? What's exciting/new about what's being done here?
I don't see what there is to lament. If this is necessary spending (it seems it is) and there is other necessary spending, then it seems we must pay for both of those things.
I guess we could wish that the hornets never arrived here, but the life of a government is solving thousands of problems like these. If not the hornets, it would be something else. The uplifting moment is that this was a successful containment. Money well-spent :)
Yes, we should do both things, and the hornet issue is essential spending. But resources are limited, and there are plenty of people who don't consider safety-net spending essential. That's my lament: between hornets and people, few will dispute the necessity of dealing with the first, while plenty will dispute the second.
I have to say, after reading those posts, Matt is looking more trustworthy than the OP.
Especially considering the definitely-not-a-sock-puppet post by jimboykin [1], an account that was created immediately after this thread and has but a single post on HN.
I mean part of this issue is that there is no feedback to the website owner, which is shown by the guy only getting a response here on HN.
Like, isn't it kind of insane that the top Google search engineer provided customer service to the OP, but he couldn't get a response through a normal channel?