Posted by jnord 20 hours ago
Installing a ceiling fan used to be treacherous, and the fans themselves were heavy. They were also loud and buzzy once installed. Now the fans in these things are so lightweight and easy to install.
Seeing the same in many more areas (lighting, etc.)
The irony is all the recessed lights I picked out are DC, they all have little AC-DC boxes hanging off them using a proprietary connector. If I hadn't needed to pass a rough-in inspection going all DC would've been trivial.
It is silly to have AC-to-DC converters in all of my wall-connected electronics (LED bulbs, home controller, computer equipment, etc.)
You could wire your house for 12, 24 or 48V DC tomorrow, and some off-grid dwellers have done just that. But since inverters have become cheap, such installations are becoming more and more rare. The only place where you still see them is in cars, trucks and vessels.
And if you thought boiling water in a camper on an inverter is tricky, wait until you start running things like washing machines and other large appliances off low-voltage DC. You'll be using massive cables whose cost will outweigh any savings.
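To put rough numbers on the cable problem (a quick Python sketch; the 2.4kW appliance is an assumed, illustrative figure):

    # Current needed for a 2.4 kW appliance at various voltages
    # (assumed, illustrative load; I = P / V for a resistive load)
    def current_amps(power_w, voltage_v):
        return power_w / voltage_v

    for v in (240, 120, 48, 12):
        print(f"{v:>3} V: {current_amps(2400, v):6.1f} A")

    # 240 V:   10.0 A
    # 120 V:   20.0 A
    #  48 V:   50.0 A
    #  12 V:  200.0 A  <- needs enormous (and expensive) copper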
It would be relatively easy for the US to go to 240V: swap out single-pole breakers for double-pole, and change your NEMA 5 plugs for NEMA 6.
For a transition period you could easily have 240V and 120V plugs right next to each other (because of split phase you can 'splice in' 120V easily: just run cable like you would for a NEMA 14 plug: L1/L2/N/G).
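A toy Python check of that split-phase arithmetic (ideal 60Hz sine waves assumed):

    import math

    # Two 120 V RMS legs, 180 degrees apart around a center-tapped
    # neutral: leg-to-neutral gives 120 V, leg-to-leg gives 240 V.
    def leg_volts(t, rms, phase_deg):
        return rms * math.sqrt(2) * math.sin(2 * math.pi * 60 * t + math.radians(phase_deg))

    t = 1 / 240                  # a quarter of a 60 Hz cycle, where L1 peaks
    l1 = leg_volts(t, 120, 0)    # ~ +169.7 V
    l2 = leg_volts(t, 120, 180)  # ~ -169.7 V
    print(l1 - l2)               # ~339.4 V, the peak of a 240 V RMS waveform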
What would be the real challenge would be going from 50 to 60Hz.
Other way around, no? The US is already 60Hz.
Edit: I mostly remember this because the SNES games I used to buy in the US and brought back to Europe ran noticeably slower.
...There was some kind of switch involved, I hope?
In all likelihood not worth the trouble. When I moved to Canada I gave away most of my power tools for that reason, and when I moved back I had to do it all over again.
If you ever have to do it again, you can probably get a transformer rated high enough for power tools for less than the cost of replacing all of them.
Killed a few tapes with a transformer on a US tape deck before buying a 220V 50Hz unit. No, I don’t remember if the pitch was grossly off, but I’m guessing it wasn’t.
I think the answer to your question is that it mostly doesn't matter for personal, mug-sized quantities of hot water, and if it does matter to you, there are readily available competing options such as dedicated hot-water taps for your kitchen sink.
Perhaps the biggest reason is that a traditional kettle on any half-decent electric range will match, if not exceed, the power output of any imported electric kettle. Many ranges even go well beyond that, with one burner marked "quick boil" or similar.
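Rough numbers behind that claim (Python; the wattages are assumed nominal figures, not measurements):

    # Assumed typical ratings:
    us_kettle  = 1500      # W: common rating for a 120 V / 15 A circuit kettle
    uk_kettle  = 230 * 13  # ~2990 W from a UK-style 13 A plug
    quick_boil = 3000      # W: a typical "quick boil" burner rating
    print(us_kettle, uk_kettle, quick_boil)  # 1500 2990 3000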
I’m surprised that American exceptionalism can tolerate half powered sockets.
No one in the USA drinks hot tea. The choice (and it tends to be regionally based) is sweet or unsweet iced tea. No need to boil a kettle quickly for that.
... Unless you're buying it pre-made, does this not still start with making hot tea the regular way? Or what exactly are you doing with the tea bags and loose tea from the supermarket?
There are dozens of us.
Perplexingly, I was traveling in one of the iced-tea regions of the country in need of a cup of hot tea, and they had no way to make it. Like, you have a commercial coffee maker and hot cups, and the coffee maker has a hot(ish) water tap. All you need is a $4 box of teabags that’ll last until the heat death of the universe. Nope.
Still though, I don't seem to see most of those people seriously clamoring for the electric kettle to go a bit faster. The cost for the wiring difference and dealing with odd imported kettles just isn't worth it generally.
How expensive would a proper AC->DC->AC brick for that power level be?
A pure sine-wave inverter for that kind of power is maybe 600 to 1000 bucks or so, and then you'd still need the other side, plus maybe a smallish battery in the middle to stabilize the whole thing. Or you could use one of those single-phase inverters they use for motors.
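For a sense of scale, the energy in one boil is small (Python; losses ignored):

    # Energy to heat 1 L of water from 20 C to 100 C: E = m * c * dT
    m, c, dT = 1.0, 4186, 80   # kg, J/(kg*K), K
    e_wh = m * c * dT / 3600   # joules -> watt-hours
    print(round(e_wh))         # ~93 Wh per litre boiled

    # So even a few hundred Wh of battery can absorb a 2-3 kW kettle
    # burst, provided the inverter can deliver the peak power.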
I can watch 1080p video on YouTube and it runs in an up-to-date web browser using less than 50% CPU on 12-year-old hardware with 8GB of RAM and a graphics card that was a budget option at the time (my searches indicate it draws at most 80W, though it expects a 500W PSU for some reason).
I end up converting stuff anyhow, because all my loads run at different voltages. Even though I had my lights, vent fan, and heater fans running on 12V, I still ended up having to change voltages for most of the loads I wanted to run, or generate AC to charge my computer and run a rice cooker.
Not to mention that running anything that draws real power quickly needs a much thicker wire at 12V. So you're either running higher-voltage DC than all your loads for distribution and then lowering the voltage when it gets to the device, or you simply can't draw much power.
Not that you can't have higher-voltage DC; with my newer system the line from my solar panels to my charge controller is around 350VDC and I can use 10AWG for that... but none of the loads I own that draw much power (saws, Instant Pot, rice cooker, Hammond organ, tube guitar amp) take DC :D
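A quick illustration of the wire-thickness point (Python; the 10m run, 10AWG resistance and 1kW load are assumed figures):

    # Voltage drop over a 10 m run (20 m of conductor, out and back)
    # of 10 AWG copper (~3.3 mOhm/m) feeding a 1 kW load.
    loop_r = 2 * 10 * 0.0033   # ~0.066 ohm round trip

    for v in (12, 48, 350):
        i = 1000 / v           # amps for 1 kW
        drop = i * loop_r      # V = I * R
        print(f"{v:>3} V: {i:6.1f} A, drop {drop:5.2f} V ({100 * drop / v:4.1f}%)")

    #  12 V:   83.3 A, drop  5.50 V (45.8%)
    #  48 V:   20.8 A, drop  1.38 V ( 2.9%)
    # 350 V:    2.9 A, drop  0.19 V ( 0.1%)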
4kW of panels, a 400W 48V EG4 6000XP charge controller/inverter, 3x EG4 LifePower4 48V batteries, and a Raspberry Pi running Solar Assistant.
It feels like a bit of overkill, and there is still a whole MPPT unused on the 6000XP, so I could still double my panel input. Also, Solar Assistant tells me that I rarely go below 75% battery storage. If I just wanted to run my fridge and assorted convenience loads (and ran things like table saws off a generator), then I could get away with a lot less of a system.
But I'm operating a recording studio, and there were a couple days this winter where I had a full-band session and a couple days of storms and got down to below 50%.
Thus, even if you had DC in the walls, it would be 100+ volts, and you'd still have conversion down to the lower voltages that electronics use. If you look at the comments in this thread from people who work in telco, they talk about how power enters equipment at -48V and is then further stepped down.
For 800V DC, a simple UPS could interface with the main supply using just a pair of (large) diodes, and a more complex and more efficient one could use some fancy solid state switches, but there’s no need for anything as complex as a line-interactive AC UPS.
However, higher DC voltage is riskier, and it's not at all standard for electrical and building code reasons. In particular, breaking DC circuits is more difficult because there's no zero-crossing point to naturally extinguish an arc, and 170V (US/120VAC) or 340V (Europe/240VAC) is enough to start a substantial arc under the right circumstances.
Unfortunately for your lighting, it's also both simple and efficient to stack enough LEDs together such that their forward voltage drop is approximately the rectified peak (i.e. targeting that 170/340V peak). That means that the bulb needs only one serial string of LEDs without parallel balancing, making the rest of the circuitry (including voltage regulation, which would still be necessary in DC world) simpler.
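The arithmetic behind that (Python; the ~3.1V forward drop is an assumed typical figure for white LEDs):

    # LEDs in series to reach roughly the rectified mains peak:
    for rms, label in ((120, "US"), (240, "EU")):
        peak = rms * 2 ** 0.5   # ~170 V / ~339 V
        print(f"{label}: peak {peak:.0f} V -> ~{round(peak / 3.1)} LEDs in series")

    # US: peak 170 V -> ~55 LEDs in series
    # EU: peak 339 V -> ~109 LEDs in series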
IEEE 802.3bt can deliver up to 71W at the destination: just pull Cat 5/6 everywhere.
* https://en.wikipedia.org/wiki/Power_over_Ethernet#Standard_i...
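Back-of-the-envelope budget (Python; the 9W-per-bulb figure is my assumption for a 60W-equivalent LED, not from the standard):

    # 802.3bt Type 4 guarantees up to 71.3 W at the powered device.
    print(int(71.3 // 9))   # 7 -> about seven bulbs per port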
In the commercial/industrial space this may be worth it: How long do these bulbs last? How much (per hour, or equivalent) do you pay your facilities folks? How much time does it take for employees or tenants to report an outage, and for your folks to get a ladder (or scissor lift) to change the bulb?
The part that would genuinely be cheaper is avoiding problematic flicker. It takes a reasonably high quality LED driver to avoid 120Hz flicker, but a DC-supplied driver could be simpler and cheaper.
The gain from swapping AC-DC converters for DC-DC ones is small, and DC devices are a small part of usage compared to appliances. There is no way it would pay back the cost of replacing all the appliances.
(Am I just showing my age here? How many of you have ever bought incandescent globes for house lighting? I vaguely recall it may be illegal to sell them here in .au these days. I really like quartz halogen globes, and use them in 4 or 5 desk lamps I have, but these days I need to get globes for em out of China instead of being able to pick them up from the supermarket like I could 10 or 20 years ago.)
If your house gets 800V DC you're still gonna need "bricks" to convert that to the 5VDC or 12VDC (or maybe 19VDC) that most of the things that currently have "bricks" need.
And if your house gets lower voltage DC, you're gonna have the problem of worth-stealing sized wiring to run your stove, water heater, or car charger.
I reckon it'd be nice to have USB-C PD ports everywhere I have a 220VAC power point, but 5 years ago that'd have been a USB Type-A port - and even now those'd be getting close to useless. We use a Type I (AS/NZS 3112) power point plug here - and that hasn't needed to change in probably a century. I doubt there's ever been a low-voltage DC plug/socket standard that's lasted in use for anything like that long - probably the old "car cigarette lighter" 12V DC thing? I'm glad I don't have a house full of those.
My understanding is that DC breakers are somewhat prone to fires for this reason, too.
The electricians I was working with also told me stories about how with the really big breakers, you don't stand in front of it when you throw it, because sometimes it can turn into a cloud of molten metal vapor. And that's just using them as intended.
Allegedly
While on "work experience" from high school I was put on washing power lines coming straight out of the local power station near the ocean - lots of salt buildups to clear.
Same deal, flashover suits and occasional arcs .. and much laughter from the ground operators who drifted the work bucket close.
Another story along the same lines: I heard that a horse was killed by contact with a lantern battery, but I don't have any reference for that, just a story from a family member who collected coaches.
It would have self-extinguished if you waited long enough for the probe to vaporize.
I think it's that DC breakers are more expensive, so people use AC-rated breakers instead. They are both rated for 400V @ 10A, so it's the same thing, right?
It turns out they are not, and most people, even electronics types, rarely play with 200V+ of DC.
(My stand mixer is the lone sad exception)
I spent a few years getting flown out around the world to service gear at different datacenters. I learned to pack an IEC 60320 C14 to NEMA 5-15R adapter cable and a dumb, unprotected* NEMA 5-15R power strip. While on-site at the datacenters, an empty PDU receptacle was often easy to find. At hotels, I'd use a cable with the local plug, borrowed from or given to me by the datacenter staff, or I'd ask the hotel front desk to borrow a "computer power cable" (more often, I'd just show them a photo), and they generally were able to lend me one. It worked great. I never found a power supply that wasn't content with 208 or 240V.
Example adapters: https://www.amazon.com/dp/B0FD7PHB7Y or https://www.amazon.com/dp/B01IBIC1XG
*: Some fancier power strips with surge suppression have a MOV (metal-oxide varistor) across the line that may burn up if given 200V+, rendering the power strip useless. Hence, unprotected strips are necessary.
AC arcs are easier to extinguish than DC arcs, but DC will creep much easier than AC and so on.
From a personal point of view: I've worked plenty with both, up to about 1kV at appreciable power levels and much higher than that at reduced power. Up to 50V or so I'd rather work with DC than AC, but they're not much different. From there up to 400V or so I'd much rather have AC, and above 400V the answer is 'neither', because you're in a kind of gray zone where creep is still low, so you won't know something is amiss until it is too late. Above 1kV in normal settings (say, the picture tubes in old small b&w TVs, and higher up when they're color and larger) it will throw you right across the room, but you'll likely live because the currents are low.
HF HV... now that's a different matter, and I'm very respectful of anything in that domain; I still have a burn from a Tronser trimmer more than 45 years after it happened. Note to self: keep an eye on the SWR meter/spectrum analyzer and finger position while trimming large output stages.
Can you say more about "creep"? Is the resistance changing? Or is material actually migrating?
Also curious why it's worse using DC.
Electromagnets don't work for DC, so your breaker will never trip. For thermal protection you need current, so that checks out, and it would make sense for it to be rated under 50V, as that's considered the highest voltage that's not life-threatening on touch.
PV batteries in general supply very high current (100s of A) at ~50V-ish, so I don't think there's a major use case for using household breakers with them.
I'm still not getting your point, BTW; switches and breakers are two separate things with different workings, and household (and datacenter) DC would be, I think, around 400-ish V, which is a bit higher than the peak voltage of AC but still within the arc limits of household wiring (at least in 230V countries).
The advantage of DC is that you use your wiring more efficiently, as the mean and peak wattage are the same at all times. Going with 48V would mean high resistive losses.
If electromagnets don't work for DC then what am I supposed to do with this pile of DC solenoids and relays? ;)
> PV batteries in general supply very high current (100s of A) at ~50V-ish, so I don't think there's a major use case for using household breakers with them.
That's what the SCCR rating is for. When there's a fault you're going to have a LOT of current flowing until your safety kicks in. Something like the grid or a battery bank will happily provide thousands of amps almost instantaneously. Breakers designed for protecting building wiring are rated for this. Now, most household breakers aren't dual DC/AC rated, but you can actually buy DC rated breakers that fit in a home panel (Square D QO series).
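For a sense of scale (Python; the 5 mOhm source-plus-cabling resistance is an assumed figure):

    # Prospective fault current from a stiff 48 V bank, by Ohm's law:
    print(48.0 / 0.005)   # 9600.0 A into a bolted short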
> I'm still not getting your point, BTW; switches and breakers are two separate things with different workings, and household (and datacenter) DC would be, I think, around 400-ish V, which is a bit higher than the peak voltage of AC but still within the arc limits of household wiring (at least in 230V countries).
My point is that there isn't any material reason why DC can't be as safe as AC, all the proper safety equipment already exists. Extinguishing a DC arc during a fault is a solved problem for equipment at household scale.
> The advantage of DC is that you use your wiring more efficiently, as the mean and peak wattage are the same at all times. Going with 48V would mean high resistive losses.
I just mentioned 48V because it's a common equipment voltage for household DC systems. 400V would be good for big motors and resistive heating loads.
Regarding DC vs AC and wiring efficiency, talking about mean vs peak wattage just confuses the issue. 1 volt DC is 1 volt RMS. It is an apples-to-apples comparison. If you want to say "we can use 170VDC or 120VAC with the same insulation withstand rating, and at lower current for the same power", then that is absolutely true. But your common 600V THHN building wire won't care if you're using 400V AC or DC, so it's mostly immaterial.
Thinking about the failure modes gave me the heebie jeebies, but the gas had been disconnected ages prior.
Once you get into higher power (laptops and up), switching and distribution get harder, so the advantages fade.
For bigger appliances (fridge, etc), AC is fine + practical.
However, there's also PoE (24 or 48V!), so maybe that's the right approach. It's not like each outlet is going to run a heater anyway.
Unless you mean running AC and installing inverters in the wall? What is this even for? All my electronics are DC but critically they all require different voltages. The only thing I might benefit from would be higher voltage service because there are times that 15 A at 120 V doesn't cut it.
The irony...
I always thought AC’s primary benefit was its transmission efficiency??
Would love to learn if anyone knows more about this
To expand on this, a given power line can only take a set maximum current and peak voltage before it becomes a problem. DC can sit at that maximum voltage constantly, while AC spends most of each cycle below its peak (its RMS is peak/√2), so AC delivers less power over the same line.
The transmission efficiency of AC comes from the fact that you can pretty trivially make a 1 megavolt AC line. The higher the voltage, the lower the current has to be to provide the same amount of power. And lower current means less power in line loss due to how electricity be.
But that really is the only advantage of AC. DC at the same voltage as AC will ultimately be more efficient, especially if it's humid or the line is underwater. Due to how electricity be, a changing current in a line induces currents in nearby conductive materials. A portion of AC power is drained simply because the current on the line is constantly alternating. DC doesn't alternate, so it never loses power that way.
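Both effects in one quick sketch (Python; all figures are assumed and illustrative):

    # 1) A line insulated for a given PEAK voltage can run DC at that
    #    peak, while AC is limited to peak/sqrt(2) RMS:
    peak = 500e3                       # assumed 500 kV insulation limit
    print(peak / (peak / 2 ** 0.5))    # 1.414... -> ~41% more power on DC

    # 2) Resistive loss scales as I^2 * R, so higher voltage (and thus
    #    lower current) wins for either AC or DC:
    r_line, power = 10.0, 1e9          # assumed line resistance; 1 GW delivered
    for v in (500e3, 1e6):
        i = power / v
        print(f"{v / 1e3:.0f} kV: {i ** 2 * r_line / 1e6:.0f} MW lost")
    # 500 kV: 40 MW lost; 1000 kV: 10 MW lost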
Another key benefit of DC is that it can bridge grids. The problem with interconnecting grids comes entirely from the nature of AC power. AC has a frequency and a phase. If two grids don't share a frequency (as in Japan, which is split between 50Hz and 60Hz) or aren't synchronized in phase (which happens everywhere: the separate US interconnections, and the several unsynchronized zones within Europe), they cannot be tied together directly. Otherwise the power generators end up fighting each other rather than providing power to a load. A DC link between them sidesteps this, because DC has neither frequency nor phase.
In short, AC won because it was cheap and easy to make high-voltage AC. DC is coming back because it's only somewhat recently become affordable to make the equivalent high-to-low and low-to-high voltage conversions with DC, and DC carries further benefits that AC does not.
BTW, megavolt DC converters are a sight to behold: https://en.wikipedia.org/wiki/File:Pole_2_Thyristor_Valve.jp...
There are many factors involved, and "efficiency" is only one. Cost is the real driver, as with everything.
AC is effective when you need to step voltage down frequently. Think transformers on poles everywhere. Stepping down AC with transformers means you can use smaller, cheaper conductors as you move from high-voltage transmission, to lower-voltage distribution, and finally to low-voltage consumers. Without this, you need massive conductors and/or high voltages and all the costs that go with them.
AC is less effective, for instance, when transmitting high power over long, uninterrupted distances or feeding high density DC loads. Here, the reactive[1] power penalty of AC begins to dominate. This is a far less common problem, and so "Tesla won" is the widely held mental shortcut. Physics doesn't care, however; the DC case remains and is applied when necessary to reduce cost.