There is probably more DC power stuff in service now than at any other time in history.
Over the past 20 years, with the growth of ISPs, telecom, and internet infrastructure, a huge percentage of that equipment has come to run entirely from giant -48VDC battery banks, rectifiers, bus bars, fuse panels and distribution systems. In general, once you get to the really big, important core routers, DWDM systems and such, a lot of them don't even exist in AC-powered versions.
The difference is that it's normal 3-phase AC power distributed to the building, and turned into DC within it.
The origin of this is within the Bell system and other similar legacy telco equipment manufacturers (Northern Telecom, etc), all of which standardized on -48VDC power systems more than 65 years ago.
It is just not apparent to most people how common and widespread -48VDC equipment is, since unless you work at OSI layer 1 in the telecom business, you'll rarely have a reason to lay eyes on it in person.
48 V DC is popular mostly because electrical safety regulations don't require special handling at 48 V. Voltages above roughly 48-50 V normally require testing/certification/etc.
48 V is basically a voltage chosen to avoid paperwork, while still being as high as possible so you don't spend too much money on copper.
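Rough numbers to make the copper point concrete (a quick Python sketch; the 1 kW load, 10 m run, and 3% allowed drop are just illustrative, not from any standard):

    # For a fixed power and a fixed allowed voltage drop (as a fraction of the
    # supply), the required copper cross-section scales roughly with 1/V^2.
    RHO_CU = 1.68e-8      # resistivity of copper, ohm*m
    POWER_W = 1000.0      # example load
    RUN_M = 10.0          # one-way cable length (example)
    DROP_FRACTION = 0.03  # allowed voltage drop (example)

    for volts in (12, 48, 120):
        current = POWER_W / volts
        max_resistance = DROP_FRACTION * volts / current   # total loop resistance
        area_m2 = RHO_CU * (2 * RUN_M) / max_resistance    # A = rho * L / R
        print(f"{volts:>4} V: {current:6.1f} A, {area_m2 * 1e6:7.2f} mm^2 per conductor")

Going from 12 V to 48 V cuts the copper area by a factor of 16; going higher helps even more, but above ~50 V the certification burden kicks in.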
I don't have a -48VDC power supply to test it with, but as far as I know +48VDC has the same effect. Firmly grabbing a lead with both hands results in absolutely nothing happening.
Doing it with wet sweaty hands is somewhere between "pretty painful" and "deadly". You're right that with dry non-sweaty hands, you can't even notice 48 volts.
I've certainly wondered many times why we don't wire homes for DC right now. Almost everything in my house runs on DC, and the Solar Panels on the roof generate DC, but it gets converted to AC in between with losses on both sides of that conversion.
Solar (DC) > AC Inverter > DC Converter > Device.
(And in the future, there will be a home battery in there that converts that Solar Inverter AC back to DC, then discharges DC back to AC)
All the electronics, all the LED lights, everything that charges, even my car: all are native DC. What's stopping us from having a parallel DC connection in homes for these things, so they could be powered directly (ahem) via DC without the conversion?
They want to run transmission lines at high voltage so the same power can be delivered at lower current (meaning smaller conductors and lower resistive losses), then step that down to lower voltages as you get closer and closer to the customer. High-voltage AC transformers are very simple and have been around for 100+ years; they're just loops of wire around a core.
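To put numbers on the I²R argument (Python sketch; line resistance and delivered power are made-up example values):

    # Resistive loss on a fixed line for the same delivered power at different voltages.
    LINE_RESISTANCE_OHM = 5.0   # total conductor resistance of the line (example)
    DELIVERED_POWER_W = 10e6    # 10 MW (example)

    for kv in (10, 100, 500):
        volts = kv * 1e3
        current = DELIVERED_POWER_W / volts
        loss_w = current ** 2 * LINE_RESISTANCE_OHM       # P_loss = I^2 * R
        print(f"{kv:>4} kV: I = {current:8.1f} A, loss = {loss_w / 1e3:9.1f} kW "
              f"({100 * loss_w / DELIVERED_POWER_W:.2f}% of delivered power)")

Ten times the voltage means one hundredth the loss on the same conductors, which is the whole reason for transmission-level voltages.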
The high-voltage DC equivalent of a transformer would require very modern switching technology, or a difficult and inefficient conversion to AC and back again.
True. But while a high-voltage converter might be expensive, it's far more efficient and would pay for itself over the life of a home. Having a single one of those per home should surely be a net win: the utility remains responsible for transforming high-voltage AC down to the home, and the homeowner fronts the cost of the centralized, expensive but efficient converter that powers the DC circuit within the home.
If solar and local generation are thrown in the mix, they bypass the converter and directly power the home circuit.
As it is, typical home electrical systems have no active components. There are just wires, panels and bimetallic circuit breakers. These systems are nearly maintenance-free over a lifespan similar to that of the structure.
A DC distribution system in the home would require both a high power rectifier at the main panel to something like 125 VDC, then many smaller DC/DC converters throughout the home for your usable voltages like 5/9/15/20 V that are too low to be effectively distributed.
All of those things would need to be maintained and upgraded over the years, because there is no such thing as power electronics that last forever. After a few electrician visits, you might find that you haven't saved any money at all.
Even if you have solar, you still need a DC converter because it will not output a constant voltage let alone all of the DC voltages you need for your devices. And generation any further away than your own rooftop is going to need to be stepped up to higher-than-home voltages and then back down for use in your home - all of which is exactly why we currently use AC for distribution.
You forgot the magnetic trip of the breakers and the now-mandatory RCDs. The latter are far more complex than a simple rectifier would be.
And even then, there's no reason such a rectifier module couldn't be a pluggable module. They still last 10~20 years, easily.
I don't see what all those low voltage rails should be for. Computers typically work fine on 300~350 V DC, and if anything, there is reason to go from 12 V to a higher supply bus voltage, actually deployed in some modular servers by now (with a 48 V bus between the local battery backup modules, AC-fed supplies, and motherboards).
The ostensible benefit of DC distribution in homes is to be more economical and simpler for devices that already run on DC, not to redesign every device ever made to accept mains-voltage DC. If your iPad and your laptop and your blender still need a power brick to work, what's the point?
Using high-voltage unnecessarily to avoid using a DC converter is also not going to save money. Yeah, you can use a 300 V DC motor in a coffee grinder, but why? It's just going to cost more money to make.
90% of things in your home would happily run from 150V DC, even though they aren't rated for it.
Source: I sometimes connect my solar panels directly to my AC wiring without an inverter, and my house works entirely, except my washing machine and fridge (both of which have AC motors in them). Even my vacuum cleaner works (although its on-off switch doesn't, since it uses a thyristor!). Phone charger, laptop charger, oven, microwave, doorbell, furnace, routers, TV, monitors, desktop PC, all work fine.
If some country declared tomorrow that all electrical devices must accept AC or DC, not that much would have to change.
I had no idea about this. Can it damage things that won't work (e.g. things with AC motors)?
I've been building out the electrical system in a campervan and have always wondered whether there were DC equivalents to a lot of things.
My point is that European accidentally-DC-capable mains equipment can be expected to complain or sustain overcurrent damage if it isn't able to handle US residential voltages.
Hence you might as well take the opportunity and switch to a higher in-house distribution voltage than the typical 120 V.
And that 300 V DC motor may actually be cheaper, as you could run a BLDC driver directly from the DC supply with just minimal filtering.
The enhanced power density and copper-efficiency of these high-frequency 3-phase motors may make up for the cost of said inverter, even neglecting the considerably increased energy efficiency over a typical single-phase-capable "oldschool" motor.
Yes, but single-stage conversion from 48 V to ~1.2 V core/memory voltages is inefficient with the typical buck topology, due to the low duty cycle.
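A minimal sketch of the duty-cycle issue (ideal buck equation only; real converters are messier):

    # Ideal buck converter: D = Vout / Vin (continuous conduction, no losses).
    # 48 V -> 1.2 V leaves the high-side switch on only ~2.5% of each cycle,
    # which means high peak currents and poor efficiency in a single stage.
    def buck_duty_cycle(v_in: float, v_out: float) -> float:
        return v_out / v_in

    for v_in in (48.0, 12.0, 5.0):
        print(f"{v_in:>5.1f} V -> 1.2 V: D = {buck_duty_cycle(v_in, 1.2) * 100:.1f}%")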
There are solutions based on ZCS (+ZVS) (semi-)resonant switched-capacitor topologies that could (technically) do this in essentially one stage. But they are still somewhat recent, and they rely on either GaN enhancement-mode FETs or low-average-blocking-voltage topologies that use e.g. small 5 V-capable IC process nodes and some tricks to keep the individual power transistors floating, so they haven't seen wide use yet.
> But while a high voltage converter might be expensive, it's far more efficient and would pay off costs over the duration of the life of a home
AC is easier to transform. Those transformers are cheap and rugged. DC is very difficult to monitor and control, especially in larger voltage and current levels.
That's actually what Edison's original DC power network was designed around: DC power lines, and a coal-chugging power station every mile. Turns out all the extra pollution and expense was a bad idea, so AC won out. It's been working just fine for over 100 years.
50 or 60 Hz AC, especially 3-phase, makes it really easy to build and operate electric motors, too, and to transform between different voltages.
The OP’s argument is that solar power generation, plus the fact that most electrical consumption is now fundamentally DC-friendly (LEDs, electronics, electric cars, etc.), may change the equation. The concept of a whole-house rectifier is an interesting one, and something that is already used in some data centers. You still have the problem of different electronics wanting different voltages, though...
As you note, for facilities like data centres that are engineered for a specific type of load, a building-level rectifier makes sense. For a home where you have many devices using a little bit of power at a variety of voltages, you're going to end up with lots and lots of small DC converters everywhere which defeats the point. Just run 120 VAC.
> 50 or 60 Hz AC, especially 3-phase, makes it really easy to build and operate electric motors, too
Three-phase motors are never used in residential settings, and most residential motors would be more efficient as brushless DC. There's no need for sinusoidal AC motors any more except in specialized industrial applications.
Whilst most devices may use DC internally, that may not be true of consumption. You still have electric showers, cookers, hobs, washing machines, dryers, heaters, air conditioning, vacuum cleaners, and kitchen appliances like blenders.
Even if power generation is distributed and localized, you still need higher voltages than what is in a home to transmit it unless you're talking about not having a power grid at all. You still need voltages on the order of 1 to 40 kV to distribute power around a neighbourhood, for instance. You aren't going to wire your house at distribution voltage - it would be expensive and unsafe.
Sure, in an ideal world. But economies of scale do exist, and everyone having solar panels and expensive personal batteries remains difficult and isn't a viable alternative to the larger, distributed system that lets people draw what they need when they need it.
Long-range high-voltage DC (HVDC) interconnects are being discussed in Europe, and have already been widely deployed in China.
For local (~100 km) connections AC is still used almost exclusively, the only exception being some underground cables where induction losses would be too high.
You are entirely correct. Power between different AC regions is mostly transferred using HVDC-links because it avoids synchronisation requirements between them.
Additionally the capacitive losses of an undersea HVAC cable are prohibitive, leaving only HVDC as an option.
You'll notice almost all of these are undersea cables. HVDC is used there because you can't really hang overhead lines across the ocean, and the capacitive losses of HVAC cables under the sea make them prohibitive.
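Rough idea of why the charging current matters (Python sketch; the capacitance per km is an assumed ballpark for an HV XLPE cable, and the voltage is just an example, not data for any real link):

    import math

    # Capacitive charging current of a long AC submarine cable: I = V * 2*pi*f*C.
    C_PER_KM = 0.2e-6     # farads per km (assumed ballpark, not a datasheet value)
    FREQ_HZ = 50.0
    VOLTS_PHASE = 132e3   # phase-to-ground voltage (example)

    for length_km in (10, 50, 100):
        i_charging = 2 * math.pi * FREQ_HZ * (C_PER_KM * length_km) * VOLTS_PHASE
        print(f"{length_km:>4} km: charging current ~ {i_charging:6.1f} A per phase")

Past some length the cable's whole ampacity goes into charging itself every half-cycle, so DC becomes the only practical option.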
Everything in your house runs off and/or generates vastly different DC voltages. Yes, you could get rid of some AC-DC conversions in there, but your various solar panels, batteries, infeeds from the grid, electric car, etc are all going to function at dramatically different DC voltages, such that you’re going to need a ton of DC-DC converters on everything anyway.
On top of that, high-voltage DC is quite a bit more dangerous than AC - it is much better at generating sustained arcs that are very good at catching things on fire.
Yeah, but isn't DC-DC voltage conversion still much better than DC - AC - AC - DC?
Also, most devices have to internally change voltage for various elements anyway. So we do already have tons of DC-DC converters in everything.
Another important advantage of a DC-only grid: it is smooth.
DC derived from AC has lots of fluctuations (because AC always goes up and down). Most devices of course expect that, and sensitive electronics, like those in hospitals, don't run on the ordinary grid for that reason. But it would be nice if common devices could also have a stable electrical input. Who knows, maybe quite a few tricky computer bugs or hardware failures have their root in this.
Another aspect of DC: it does not constantly create a changing magnetic field around it, like any AC power line does.
DC-DC is fine at lower amperages; once you go up to a few kW, it gets more difficult.
DC - AC - AC - DC has the advantage that you can use a transformer to shift voltage levels, which does this very efficiently; usually between 95% and 99% of the energy is passed along.
A DC-DC converter, on the other hand, will usually top out at 90% in ideal conditions; thyristors seem to be able to do 95, maybe 98%, but those only become economical in the megawatt range. You lose more if you want this to work bidirectionally (i.e., so you can feed energy from both sides of the converter).
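Just to show how those per-stage figures compound (Python; the inverter and rectifier efficiencies are my own rough assumptions, the transformer and DC-DC figures are the ones quoted above):

    # End-to-end efficiency of chained conversion stages.
    def chain(*stage_efficiencies):
        total = 1.0
        for eff in stage_efficiencies:
            total *= eff
        return total

    dc_ac_ac_dc = chain(0.95, 0.98, 0.95)  # inverter, transformer, rectifier (rough assumptions)
    dc_dc_direct = chain(0.90)             # single DC-DC stage (figure from above)

    print(f"DC -> AC -> transformer -> DC: ~{dc_ac_ac_dc * 100:.0f}%")
    print(f"single DC-DC stage:            ~{dc_dc_direct * 100:.0f}%")

Which chain wins depends heavily on those assumed per-stage numbers; the point is just that the transformer itself is not the lossy part.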
There are also some other issues that DC brings along. If you were doing it on a household scale, you'd have to worry about magnetic fields in the house significantly shifting; DC cables can't be safely cut from power as easily as AC (DC will arc and it will not stop arcing until interrupted, so you need special fuses, whereas AC stops arcing on the zero crossing); switching heavy DC loads can cause cables to jump at much lower wattages; etc.
Additionally, a DC-only grid would not be smooth. Every device that joins or leaves the grid will cause voltage fluctuations. It'll maybe stay within 10 V or so, but within that band it'll be moving fast.
For bigger grids there is also the disadvantage that it's rather difficult to sync up a DC connection. Worst case you get a lot of current until the grid normalizes. On AC grids you only really need to synchronize the frequency, then you can connect and any currents should cease over the next zero crossing.
"if you were doing it on a household scale, you'd have to worry about magnetic fields in the house significantly shifting"
You mean every time something gets switched on or off? (Or when a device's energy demand changes?)
But with AC the magnetic field changes constantly all the time. That doesn't sound better. Or is it that the constant change is pretty much constant, so the effective influence on other devices is lower?
AC causes radio emissions (which is lost power); it doesn't affect the magnetic field unless it forms coils. And most coils on AC will be put into iron cores to contain the field nice and close to where it belongs. Even then, an AC coil doesn't really behave magnetically very much, since the field reverses polarity so often.
A DC wire will constantly produce a static magnetic field. With only a few kilowatts, you can probably measurably shift a compass needle within your house, since all your wires will necessarily have to form a loop (positive and ground).
Magnetic fields aren't created by shifting the DC voltage or current, they're created by current flowing through a wire, AC just has the advantage that 20ms later, the field goes the other direction so few things care.
What will happen when the DC load shifts is that the magnetic field changes, and it'll dissipate or absorb power from the DC line to do so. (Relays require you to put in a diode so that when they are switched off, they can dissipate the stored field energy safely without damaging electronics. This dissipation happens at much higher voltage levels than what induced it and is of opposite polarity to your input voltage.)
> thyristors seem to be able to do 95 maybe 98%, but those only become economical in the megawatt range.
IGBTs and SiC are plenty economical for boosting and bucking in the kilowatt range. For very light loads (like inside the home), switched supercapacitors would work fine and are quite efficient. The need for a big spool of copper is over.
In-house DC could easily be limited to 48V which is not dangerous. Efficient DC buck converters could be built for pennies inside each outlet and negotiate with the load to provide the right DC voltage.
If you want to cut the voltage by 60%, from 120V to 48V, then you’ll have to triple the thickness of all of the conductors in your walls. The costs would be immense, as would the challenge of stuffing such thick wires through the wall.
That said, lots of people use 48V wiring for things like LED lighting and home audio. It makes perfect sense for those applications.
> If you want to cut the voltage by 60%, from 120V to 48V, then you’ll have to triple the thickness of all of the conductors in your walls.
And that's not even getting into 240V appliances like dryers, ovens, cooktops, and (increasingly) EV chargers.
The (e.g.) newly released VW ID.4 can charge at up to 11 kW, which at 240 V would draw 45.8 A, but (IIRC) you can only continuously load a circuit to 80% of its rating, so you would need wiring that could handle ~60 A. At 48 V, we're talking >200 A: those are some thick cables (or a busbar).
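The arithmetic behind those figures (Python; the 80% continuous-load factor is the usual NEC-style rule of thumb):

    # 11 kW charger at 240 V vs 48 V.
    CHARGER_W = 11_000
    CONTINUOUS_LOAD_FACTOR = 0.80  # continuous loads limited to 80% of circuit rating

    for volts in (240, 48):
        current = CHARGER_W / volts
        print(f"{volts:>3} V: {current:6.1f} A load -> circuit rated for "
              f"~{current / CONTINUOUS_LOAD_FACTOR:.0f} A")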
So you need high-voltage (i.e. >48V) into the home anyway.
And most home applications are in that lower-power category. You'll still need 120VAC and 220VAC but only on particular, dedicated circuits. Like maybe one or two per room. Other outlets and lighting would work fine at 48VDC on 12 gauge Romex because 20A is more than sufficient for the majority of loads.
Because all your DC appliances use low voltage. Low voltage requires thick cables at long distances: For example, look at how thick auto battery cables are. Wiring a house with cables that thick is absurdly expensive.
In theory, we could use high voltage DC, but then each appliance will still need something to drop the voltage.
I personally like how USB is becoming the low voltage standard. You can find outlets with USB built in.
48VDC at 20A is more than enough for most loads in a home (960W) and 12 gauge wire (standard in many homes) will handle that just fine. Auto battery cables are thick because they have to supply hundreds of amps at 12V. Different situation.
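The ampacity works, but it's worth checking the voltage drop too (Python sketch; the 15 m run is an example, the 12 AWG resistance is the standard ~5.2 mΩ/m figure):

    # Voltage drop for a 960 W load (48 V, 20 A) on 12 AWG copper.
    R_12AWG_OHM_PER_M = 0.00521
    VOLTS, AMPS = 48.0, 20.0
    RUN_M = 15.0  # one-way distance from panel to outlet (example)

    drop = AMPS * R_12AWG_OHM_PER_M * 2 * RUN_M  # out and back
    print(f"drop: {drop:.2f} V ({100 * drop / VOLTS:.1f}% of 48 V), "
          f"delivered: {(VOLTS - drop) * AMPS:.0f} W of {VOLTS * AMPS:.0f} W")

That's roughly a 6% drop at full load on a longer run, so it's workable but not generous.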
Out of about 100 things that plug into the wall in an average home, you've just listed 5 that won't work. (You missed the water heater and the clothes dryer, BTW.) This is exactly what I meant by "most."
There's no reason homes couldn't distribute 120 V DC over the same wire gauge used for AC, since all appliance motors are now DC and all electronic PSUs are switched-mode. Each appliance already has "something to drop the voltage".
Just go to a higher voltage already, if you go for DC. Most computers e.g. are fine with 300~350 V DC (and would not be fine with 120 V DC, as the high-voltage side of their power supplies isn't designed for the current you'd have at "just" 120 V).
The US uses 120v AC because incandescent light bulbs and heaters that ran at 100v DC would work at 120v AC. It allowed upgrading the original DC grids to AC with minimal impact.
DC faults are far harder to interrupt than AC ones at the same power (may not apply above 5~15 kV).
Also, power balancing via frequency-tracking is fairly easy. I guess you could just rely on voltage targeting for the local level (up to an apartment complex), and do something else to handle distribution infrastructure.
But yes, there is nothing fundamentally stopping you from putting separate 300-ish V DC wiring in your house and feeding that directly to compatible switched-mode power supplies.
Rule of thumb: universal (120-240 V) supplies that don't need to be switched between low and high range can handle the current in the input rectifier (you'd only use 2 of the 4 diodes, but at 120 V they have to handle double the current anyways), but active PFC could get confused.
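Rough current numbers behind that rule of thumb (Python; the 500 W figure is an arbitrary example, and this ignores power factor and efficiency):

    # Average input current for a 500 W supply at different input voltages.
    SUPPLY_W = 500.0

    for label, volts in (("120 V AC", 120.0), ("240 V AC", 240.0), ("~300 V DC", 300.0)):
        print(f"{label}: ~{SUPPLY_W / volts:.1f} A")

A rectifier sized for 120 V operation already handles more current than it would ever see at ~300 V DC.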
If you have fixed ceiling LED lighting, you may be able to re-wire it for series operation, but that requires sufficient understanding of the isolation voltages between cases and input connections.
In the end, there would be a DC to DC step-down regulator. E.g. Meanwell has suitable LED drivers in their catalog that officially take ~300 V DC.
For the solar panels, you'd just need a high voltage DC MPPT, likely a step-up converter for ~320 V DC (~230 V AC peak).
Theoretically such a DC supply could be provided by just rectifying a 3-phase supply in star configuration, as that would get a decent power factor (actually better than most passive PFC SMPSs) with just 6 diodes (and 9.5% RMS ripple). There is a related configuration that needs a transformer and uses 12 diodes to get 1 % RMS ripple, but as I just realized, they produce too much voltage for the somewhat-common DC-tolerant SMPSs: 538 V DC from a 3-phase 230 V (phase-to-neutral) AC supply.
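For reference, here's where the 538 V figure comes from (Python; this is just the textbook average output for ideal full-wave rectification of the line-to-line voltage):

    import math

    V_PHASE_RMS = 230.0
    V_LINE_RMS = V_PHASE_RMS * math.sqrt(3)               # ~398 V line-to-line
    V_DC_AVG = (3 * math.sqrt(2) / math.pi) * V_LINE_RMS  # ideal 6-pulse average

    print(f"line-to-line: {V_LINE_RMS:.0f} V rms")
    print(f"average DC:   {V_DC_AVG:.0f} V")                               # ~538 V
    print(f"single-phase 230 V peak: {V_PHASE_RMS * math.sqrt(2):.0f} V")  # ~325 V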
You'd need active PFC or a transformer to get decent power factor at compatible DC voltages.
Just to be clear though: the solar panel wants to feed a switching regulator for MPPT, even if you specially configure it to supply some LEDs straight from the outdoor sun (no battery buffer, brightness fluctuating accordingly).
You could however totally shift that MPPT switching regulator to the battery charging, so your computer and LED drivers take the raw panel string voltage, as they have internal regulation. The MPPT would likely still have a current-sense shunt at the panel output to accurately perform its MPPT duties, but that'd be faulty if it got toasty (precision and heat don't like each other).
DC kills, and quickly too. It never passes through zero volts (AC cycles from positive peak to negative peak about 50 to 60 times a second, depending on which country you are in).
DC is dangerous. Anything above 100 Volts is risky.
Most home lighting and fans can be run with 48 Volts though. So we could have separate wiring for appliances and low voltage DC stuff.
AC above 100 Volts kills quickly too. It can actually kill more easily.
You need about 4 times as much DC current going through the body compared to AC. 220 V DC is safer than 220 V AC. This is in part because our body (specifically the heart) can handle a constant-voltage shock much better than a constant frequency interfering with the nervous system. When you get hit with DC, all your muscles contract once, then adapt. When you get hit with AC, all your muscles contract 50 times a second. In fact the IEEE did experiments concluding that it's easier to let go of a possibly lethal DC source than an AC source for this reason.
Additionally, the human body has a higher impedance to DC than to low frequency AC. So you already need more DC voltage to induce the same current.
Mostly inertia. The 14-gauge Romex in your walls is big enough for DC LED lighting as long as your lights are on separate circuits from your wall outlets and the voltage is around 24 V DC or more. And we need lighting power supplies that can convert 24 V DC to constant current, but those are simpler than the 120 V AC-to-constant-current supplies in light bulbs now.
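A quick sanity check on the 14-gauge claim (Python sketch; the 30 W lighting load and 12 m run are examples, the 14 AWG resistance is the standard ~8.3 mΩ/m figure):

    # Voltage drop for a modest LED lighting load at 24 V DC on 14 AWG copper.
    R_14AWG_OHM_PER_M = 0.00828
    VOLTS = 24.0
    LOAD_W = 30.0   # a few LED fixtures on one circuit (example)
    RUN_M = 12.0    # one-way distance (example)

    amps = LOAD_W / VOLTS
    drop = amps * R_14AWG_OHM_PER_M * 2 * RUN_M
    print(f"{amps:.2f} A, drop {drop:.2f} V ({100 * drop / VOLTS:.1f}% of 24 V)")

That works out to roughly a 1% drop, so existing lighting circuits have plenty of margin at these power levels.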
For power outlets you probably need at least 12 gauge wire and 48VDC. And we need a new standard DC smart outlet that can negotiate with the load and locally buck the voltage down to whatever the load needs.
All this is easier with new construction than with retrofitting of course. Then it's a chicken/egg problem: We need lights and appliances built for this infrastructure but we need the infrastructure before anybody will build devices for it. The solution is to build bridge devices that can replace wall warts that negotiate with smart outlets. Lights are trickier: Someone's going to have to build "raw" LED "bulbs" that assume the power supply will be added elsewhere.
I've already started retrofitting my table lamps but the rest will be harder.
I'm not really sure I'd call two reverse-parallel (DC) LEDs an "AC LED". The design in figure 1b seems basically equivalent to a dropper capacitor[1] and DC rectifier feeding standard LEDs (except you're lighting different LEDs on each half-wave instead of the same ones), which is how a lot of LED lamps are already driven.
I noticed that the automotive industry has a Power over Ethernet standard that can deliver close to 100 W at 48 V. I feel that has some advantages for house wiring: it can run lights and small appliances, and you get a data connection at the same time.
It's because AC power equipment (transformers, panelboards, etc.) has zero semiconductors. This makes it extremely reliable over a long period of time compared to a DC rectifier.
Meh, AC->DC conversion works fine. I think we'd be better off finding a way to migrate US outlets to 240V so we can plug in more powerful stuff. That'll never happen unless someone invents a cheap AC step-down dongle for legacy appliances, because it's impractical to run a fourth wire to existing outlets.
I doubt this is possible in a country so democratic. For the same reason you can't really switch to the metric system: you will have to convince the people the change is worth the inconvenience. The value of it is only obvious to expert engineers and economists (and some geeks like us) but they won't make the decision because ordinary people won't vote.
It would be a fire hazard to run low-voltage DC with the high current needed. It's better to run high-voltage at low current. And AC is better at changing voltage when needed, because they can work with transformers.
AC is much easier on switches (the arc which occurs when opening the switch is extinguished at the next cycle, instead of the more sustained arc which occurs with DC).
AC is safer for humans, especially at high voltages. DC causes muscles to clench onto whatever the person touched, while AC makes you kind of jump away a bit more.
AC works with the rest of the power grid: synchronous generators, transformers and other heavy-duty equipment all rely on AC. DC alternatives are much more complex.
Edit: ah, I see you're not necessarily proposing we get rid of AC. In that case, having two systems would be complicated.
They do, but we make exceptions for appliance power all the time. Laundry appliances frequently have a higher voltage circuit dedicated for them, at least in California.
Why not treat the exception as the exception? Appliances might consume disproportionately more power, but they're much more static compared to the variety of devices frequently plugged in and out of a home's sockets, most of which have to lug around an inefficient rectifier with them.
I have these on my eBike. They're interesting, but trying to use them is twitchy. You end up doing the USB thing of trying it one way, then flipping, then flipping back, then staring at both ends intently, trying to line up the yin/yang of the openings to match them up. Then they disconnect much more easily than I think they should, as they're just plastic.
Much rather just move to some sort of USB-C reversible socket design.
(Reposting my comment from 2013 with a current link to the tariff.)
With so few customers (and no new ones allowed), it’s not surprising that the DC distribution system in San Francisco isn’t widely known, but it’s not a secret.
Regulated electric utilities like PG&E provide their services pursuant to tariffs, and this is no exception. See Tariff A-15, “Direct-Current General Service,” on the PG&E website:
I used to ride one of these elevators, which would stall and stutter and generally freak out anyone who rode it regularly. But this, from the article (which I remember reading way back in 2012 when it was first posted here), is always what I thought about when I was in it:
> If a winding drum’s control system fails, its motor can drive the elevator through the roof, according to San Francisco–based elevator consultant Richard Blaska.
Yes. The Peninsula line is potentially part of the California high speed rail system, so it's being electrified like high speed rail, not local transit.
This seems obvious, doesn't it? Why would the utility spend all that money maintaining a complex and risky DC grid when they could just mandate AC-DC rectifiers?
I can't speak to undersea cables, but back-to-back AC/DC inter-connections certainly are used between grid sections, as they allow for easier control of both frequency and phase. One languishing project called Tres Amigas originally intended to tie all three US power grids together in northern Texas, though, as with many large power infrastructure projects, that one suffered from the bureaucracy and conflicting goals of too many stakeholders.
There are a few high voltage DC transmission lines, and more planned or proposed, which allow for much greater control of power flow in a network. Places with high penetration of renewables and relatively low demand (like Iowa, for instance) could see value in more directly exporting wind power, delivered to the point of load (say, a population center like Chicago). Designing AC transmission for such a project would be more complicated due to the way power flows around AC grids.
Yes, beyond fairly short runs submarine power cables have to be DC, due to the capacitance between the cable and the ocean. Compensating for that is neither easy (middle of the ocean) nor cheap.