
If we could magically pick a frequency and voltage for electrical systems to use (without sunk costs), what would it be?

What's the most efficient for modern grids and electronics?

Would it be a higher frequency (1000hz)?

I know higher voltage systems are more dangerous but make it easier to transmit more power (toaster ovens in the EU are better because of 240v). I'm curious if we would pick a different voltage too and just have better/safer outlets.



> If we could magically pick a frequency and voltage for electrical systems to use (without sunk costs), what would it be?

> What's the most efficient for modern grids and electronics?

I do not think it is possible to answer the question as posed. It is a trade-off. Higher frequencies permit smaller transformers in distribution equipment and smaller filtering capacitors at point of use. On the other hand, the skin effect increases transmission losses at higher frequencies.

If you want minimum losses in the transmission network, especially a very long distance transmission network, then low frequencies are better.

If you want to minimize the size and cost of transformers, higher frequencies might be better. Maybe the generator is close to the user so transmission loss is less important.

If you want smaller end-user devices, high frequency or DC might be more desirable.

You have to define some kind of objective function before the question becomes answerable.
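
For a rough feel of the transformer side of that trade-off, here's a back-of-the-envelope sketch using the standard transformer EMF equation V_rms = 4.44 · f · N · A · B (the winding count and flux density below are illustrative assumptions, not a real design):

  # Rough sketch of how transformer size scales with frequency.
  # Turns and peak flux density are assumed values for illustration only.
  def core_area_cm2(v_rms, freq_hz, turns=100, b_peak_tesla=1.5):
      return v_rms / (4.44 * freq_hz * turns * b_peak_tesla) * 1e4  # m^2 -> cm^2

  for f in (50, 400, 1000):
      print(f"{f:>5} Hz: ~{core_area_cm2(230, f):.1f} cm^2 of core for 230 V, 100 turns")
  # Core cross-section (and roughly core mass) scales as 1/f:
  # a 50 Hz transformer needs ~20x the core of a 1 kHz one for the same voltage.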


I think the question could be constrained as "what frequency uses the minimum amount of copper to remake the electrical distribution network that exists today?"

This would be a pretty good approximation of the ratio of transmission lines to transformers.


You could build the lot with DC and massively reduce transformers, but transformers are probably a lot more reliable than switching converters everywhere. Not sure which would be cheaper tbh.


Sure, the transformers would be smaller, but your transmission lines would be thicc.


Isn’t it the other way around?

>Generally, for long-distance power transmission, DC lines can be thinner than AC lines because of the "skin effect" in AC, which concentrates current flow near the surface of the conductor, making thicker wires less efficient; therefore, for the same power transmission, a DC line can be smaller in diameter than an AC line


The issue is actually that DC voltage conversion is much harder than AC, because AC can use transformers, and DC can’t.

This is especially a problem at high voltages and currents.

Also, DC arcs don’t self extinguish as well as AC arcs do, so DC arcs are a lot more dangerous and destructive.

It’s why HVDC lines are still relatively rare (and capital expensive), and typically used for long haul or under salt water, where the inductive loss from AC would cost more than the higher capital costs required for DC voltage conversion and stability.


Distribution lines are aluminium.


Why isn't silver used though? It's a better conductor isn't it?


Cost and weight. High voltage electrical lines use aluminium because of the weight, they are mostly aerial lines. Silver is too expensive to use for almost anything.


But suppose it was cheap enough, would it be useful?


This rests on the wrong assumption that the optimal distribution network is _the same_ for different frequencies,

or that it, itself, isn't the consequence of its own series of sunk-cost fallacies.


It doesn't. That's why I said it would be a good approximation. It's a classic optimization problem. Two iterations is a lot more informative than one.


Wouldn't it make sense to do both then? Low frequency or even dc long distance transmission that gets converted to standard frequency closer to the user?


There are considerable losses involved when you want to convert between frequencies. DC also has considerable losses over long distances, so there's a lower bound on the frequency before the efficiency starts to go down again.


It is not that DC has more losses; it is that transforming DC voltage is non-trivial. With AC you just need a couple of magnetically coupled inductors: no moving parts, easy to build, efficient, reliable. With DC this does not work; you need to convert it to AC first, do the transform, then convert it back. Nowadays we can achieve pretty good efficiency doing this with modern semiconducting switches. But historically you needed to use something like a motor-generator, and the efficiency losses were enough that just transmitting in AC was the clear winner.

The losses-over-distance thing is the fundamental conflict between desired properties. For transmission you want as high a voltage as possible, but high voltage is both very dangerous and tricky to contain, so for residential use you want a much lower voltage. We picked ~200 volts as fit for purpose for that, but 200 volts has high losses over long-distance transmission. So having a way to transform between voltages is critical.

Some of our highest voltage most efficient long distance transmission lines are DC, but this is only possible due to modern semiconducting switches.

https://en.wikipedia.org/wiki/High-voltage_direct_current


Right, with modern technology my understanding is HVDC (essentially 0 Hz?) is the way to go now (high voltage to minimize resistive loss, DC to minimize skin effect) if we were building a new grid with the same wires, but it's not economical to retrofit an existing system, since it is already working.


Power line losses are proportional to I^2R, so whether it's DC or AC isn't really the concern. V=IR, so assuming R is constant, a higher transmission voltage results in exponentially lower power losses. DC is actually what's currently used for long distances to achieve the lowest power line losses (HVDC).
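
To put rough numbers on that (losses fall with the square of the transmission voltage for a fixed delivered power; the line resistance and delivered power below are made-up illustrative values):

  # Loss for a fixed delivered power P over a line of resistance R:
  # I = P / V, so P_loss = I^2 * R = (P / V)^2 * R  (double the voltage, quarter the loss)
  P = 100e6   # 100 MW delivered (assumed)
  R = 10.0    # 10 ohm line resistance (assumed)
  for V in (110e3, 220e3, 440e3):
      loss = (P / V) ** 2 * R
      print(f"{V/1e3:.0f} kV: {P/V:.0f} A, {loss/1e6:.2f} MW lost ({100*loss/P:.1f}%)")
  # 110 kV: ~8.3 MW lost; 220 kV: ~2.1 MW; 440 kV: ~0.5 MW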


The skin effect is an important difference (and advantage in favor of HVDC) so it is in fact a concern of AC vs DC.


True, the skin effect limits the conductor size (~22mm in Al @60Hz), but overhead transmission already uses bundled conductors to address that, as well as to improve mechanical strength, cooling, and reactance. The advantage of HVDC is in the lower dielectric and reactive losses, while the skin effect is a minor factor.


Also inductive losses, which are only a thing in AC.


>exponentially

quadratically


Objective function: minimize operational cost. Decision variable(s): electrical system frequency. Scope: the global electrical grid.


2 meaningless statements.

For instance, I would say that the scope of the global electrical grid includes every phone charger. Not just because the last-foot devices are technically connected, but because they are the reason the rest even exists in the first place. So nothing that serves either the long haul or the local at the expense of the other can be called "minimal operational cost".

So trains use their own 25 Hz or even lower because that's good for long haul. But that would mean phone chargers are undesirably large and heavy. Or maybe it would mean that every house has its own mini power station that converts the 25 Hz utility to something actually usable locally.

Meanwhile planes use 400 Hz 200 V 3-phase for some mix of reasons I don't know, but it will be a balance of factors that really only applies on planes. Things like not only the power-to-weight ratio but also the fact that there is no such thing as a mile-long run on a plane, the greater importance of avoiding wires getting hot from high current, etc.

Simply saying "the objective function is 'what is best?' and the scope is 'global'" doesn't turn an undefined scope and objective into defined ones.


Not the original commenter, but here is another angle: If the original grid designers had a time machine and spent 2 decades studying electrical engineering in modern times before going back, what frequency would they have chosen?

Does this help you understand the original commenter's question?


In this imaginary world, does every house and business have its own power station? Are small electronics powered by DC or AC? Do power strips perhaps incorporate something like a power supply that takes the long-haul power and provides something locally more usable, like how many power strips today incorporate USB power supplies? Is it worth making the grid slightly less efficient for household/office usage in trade for making it more efficient for EV charging at every parking spot, or is there something totally different, like wireless power all along the roads...

You're asking how high is up.


Any simulation is an "imaginary world". Anyway, you clearly have no answers and add zero value to the conversation with your lame horse laugh fallacy responses. So, please, the next time someone asks a question out of curiosity (as the original commenter did), spare us your condescending and useless, zero value response.


Maximizing what is best (i.e. NPV) is the goal of many uncertainty studies.


You could push it to 100Hz, MAYBE 200Hz at most. Higher than that, transmission losses due to skin effect would make it a bad idea. Also, generators would require too high an RPM.

240v is a good middle ground for safety and power.


For 1kHz, you'd run the generators at the same speeds you run them for 50Hz but with 20 times as many poles.
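
That follows from the synchronous-machine relation f = pole_pairs × rpm / 60; a quick check of the arithmetic:

  # Total poles needed for a given electrical frequency at a given shaft speed
  def poles_needed(freq_hz, rpm):
      return 2 * freq_hz * 60 / rpm   # 2 poles per pole pair

  print(poles_needed(50, 3000))    # 2.0  -> the usual 2-pole machine at 3000 rpm for 50 Hz
  print(poles_needed(1000, 3000))  # 40.0 -> the same shaft speed needs 20x the poles for 1 kHz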


Higher Voltage: Less conductor material needed, smaller wires. But need more spacing inside electronics to prevent arcing, and it becomes more zappy to humans. Also becomes harder to work with over 1kV as silicon kind of hits a limit around there.

Higher Frequency: Things that use electricity can be made smaller. But losses in long transmission become much worse.

DC instead of AC: Lower losses in transmission, don't need as much spacing inside electronics for arcing. But harder and less efficient to convert to different voltages.


The skin effect causes the AC current density J in a conductor to decrease exponentially from its value at the surface J_S with depth d from the surface. The skin depth is inversely proportional to the square root of frequency. This means that the effective power a wire can carry decreases with increasing AC frequency.

https://en.wikipedia.org/wiki/Skin_effect#Formula
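
A quick way to put numbers on that: δ = sqrt(ρ / (π f μ)). The resistivities below are standard handbook values, and the ~22 mm aluminium conductor limit mentioned elsewhere in the thread is roughly 2δ at 60 Hz:

  from math import pi, sqrt

  MU0 = 4e-7 * pi                       # permeability of free space (non-magnetic conductors)
  RHO = {"Cu": 1.68e-8, "Al": 2.82e-8}  # resistivity, ohm*m, ~20 C

  def skin_depth_mm(metal, freq_hz):
      return sqrt(RHO[metal] / (pi * freq_hz * MU0)) * 1000

  for f in (50, 60, 400, 1000):
      print(f"{f:>4} Hz: Cu {skin_depth_mm('Cu', f):.1f} mm, Al {skin_depth_mm('Al', f):.1f} mm")
  # 60 Hz: Cu ~8.4 mm, Al ~10.9 mm (so ~22 mm diameter before a solid Al conductor
  # stops helping); at 1 kHz the useful diameter shrinks to a few millimetres.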


At the same time, high frequency makes high voltage safer (or at least less dangerous): I've received 15 kV discharges at high frequency and I live to write about it.


Impulse Labs has built an induction range that has super high power; their trick is that they have a battery that they recharge from the mains. Might be expensive but the same technique could work for a toaster (or for a water kettle) to basically keep a whole bunch of energy in reserve and deliver it when needed.


That's a great idea - I wonder if electric kettles would be more popular in the US if they worked as quickly as they do on 240V? How large a volume of battery would one need to accomplish this, I wonder?


Unfortunately quite a large one. Electric kettles are typically one of the highest power-draw items in a household. A 3kW kettle would need a ~250Wh battery for a 5 minute boil (+ electronics capable of sustaining 3kW for that time period and recharging the battery at a reasonable rate afterwards). This would likely be larger than the entirety of the rest of the kettle with current technology.
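
The 250Wh figure is just power × time (values from above, ignoring conversion losses):

  # Battery energy to run a 3 kW kettle for a 5 minute boil
  power_w, boil_minutes = 3000, 5
  print(power_w * boil_minutes / 60)  # 250.0 Wh -- a couple of large power-tool packs' worth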


That's not so bad; why not cut the battery draw in half by pulling from the wall circuit in parallel? 120Wh in power tool batteries would cost ~$170-$180, so we can imagine a $200 kettle. Expensive, not a mass market item - but give it a few more years of continued battery improvement, and maybe this could happen. The base would certainly be much bulkier than that of a normal electric kettle, but it would still be smaller than the base of an average blender, so not unreasonable for a North American kitchen.


I'm not sure it'd be commercially viable. A stove is a major home appliance, and the Impulse Labs unit is a high-end one with a price tag of $6000. An electric kettle, on the other hand, is considered a near-disposable piece of home electronics with a price closer to $50; it'd be hard to build a substantial battery into one at an affordable price.


It would cost less than $50 to equip a kettle with an appropriately sized battery. You only need something like 0.2 kWh of capacity.


Electric kettles mostly aren’t popular because of a perceived lack of need.

Most Americans don’t drink tea and most coffeemakers heat water themselves. For most other applications using a pot on a stove is not a deal breaker.


I wouldn't want a kettle that wears out after only 3-10 years of use.


That is a reasonable criticism, but getting three entire years of use from a hypothetical battery-electric kettle sounds like a genuine improvement to me. With a conventional 120V kettle, I get maybe 2-3 uses out of it before its overwhelming gutlessness frustrates me so much I can't stand to have it around anymore.


> electric kettles would be more popular in the US if they worked as quickly as they do on 240V

Euro standards are 8-10A 240V circuits. I have an EU kettle, and it draws max 2200W.

US standards are 15A 120V circuits. It could draw 1800W, though some kettles might limit to 12A and draw 1440W.

So a Euro kettle might have 22%-52% more power than a US one, which stretches a 4 minute boil to 4m53s or 6m07s in the worst case.

So it seems like it's not a significant factor, though it would be useful if US kettles really maximized power.
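
Those times follow directly from the power ratio, taking the 4-minute boil at 2200W as the baseline (as above):

  # Boil time scales inversely with power for the same amount of water
  eu_watts, eu_minutes = 2200, 4.0
  for us_watts in (1800, 1440):
      minutes = eu_minutes * eu_watts / us_watts
      m, s = int(minutes), round(minutes % 1 * 60)
      print(f"{us_watts} W: about {m}m{s:02d}s")   # 1800 W -> ~4m53s, 1440 W -> ~6m07s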


In the UK, where almost everyone has an electric kettle, almost all are 3kW.

Our electric circuits aren't rated for 13A continuous draw (e.g. plug-in EV chargers should be set to 8A or 10A), but they are fine at 13A for the few minutes it takes to boil a kettle. 2.2kW kettles would be a major drain on productivity: millions of extra minutes spent every day waiting for a cup of tea!


Most electric circuits are not rated for continuous load at rated power. The US nominal 15A circuit can only supply 12A continuous. Thus heaters of all types and EV charging being limited to about 1.5kW.


I think 13A continuous draw is OK, as 3kW electric heaters are widely available: https://www.dimplex.co.uk/products/portable-heating/convecto...

Perhaps electric cars limit their draw below 13A as they're much more likely to be connected using extension leads.


You would certainly hope that selling 3kW heaters was an indication of that, and that's what I used to think, but what I've read about EV charging makes me think that it isn't.


In Europe the 16A Schuko (https://en.wikipedia.org/wiki/Schuko) is the most common socket and 230V the most common voltage; 240V is mostly used in the UK. It allows for a max of ~3.7kW, but devices over 3.5kW are rare.


Unregulated continuous resistive load per Schuko plug is limited to 10A (2.2kW). 3kW for a few minutes at a time or with active monitoring.


The context is electric kettles. Boiling water takes a few minutes at a time.


> Euro standards are 8-10A 240V circuits.

Hrm, which country is that? Something between 13 and 16 amps is normal everywhere in Western Europe that I can think of, at 230V.

In Ireland, any random kettle that you buy is likely to be 3kW (pedantically, 2.99kW); you do sometimes see 2.2kW ones, generally _very_ cheap ones.


This was a kettle bought in Germany, then used in Switzerland.

The Swiss outlets in my recent construction apartment were 8A. The standard allows a different outlet with higher amperage but I only ever saw that in commercial settings, similar to US 20A outlets.


Random German 3kW kettle: https://www.mediamarkt.de/de/product/_grundig-wk-6330-176647...

That said, 2.2kW kettles definitely do seem to be more common there than here in Ireland (where 13 amp outlets are standard).


As an Aussie, it's weird to think that induction and other cooking appliances are so slow or expensive elsewhere. We have a $200 AUD induction stovetop from Aldi that draws up to 7.2kW across 4 pans.


Aircraft AC electrical systems are 115V 400Hz, allegedly to minimize component weight.


> Induction motors turn at a speed proportional to frequency, so a high frequency power supply allows more power to be obtained for the same motor volume and mass. Transformers and motors for 400 Hz are much smaller and lighter than at 50 or 60 Hz, which is an advantage in aircraft (and ships). Transformers can be made smaller because the magnetic core can be much smaller for the same power level. Thus, a United States military standard MIL-STD-704 exists for aircraft use of 400 Hz power.

> So why not use 400 Hz everywhere? Such high frequencies cannot be economically transmitted long distances, since the increased frequency greatly increases series impedance due to the inductance of transmission lines, making power transmission difficult. Consequently, 400 Hz power systems are usually confined to a building or vehicle.

* https://aviation.stackexchange.com/questions/36381/why-do-ai...


Do electric car motors make use of this property too? I know some cars, such as Porsches, use higher voltage to enable faster charging.


Yes, but not just electric cars. There's a push to move the entire electrical system of all cars from 12V to 48V to reduce the amount of low-gauge wiring in a vehicle.


Boats are also making that move. 24V is pretty common now with newer boats, and with electric propulsion 48V is quite attractive.

The NMEA2000 standard is confined to 12V however, meaning that all boats still need a 12V system as well. Maybe just with DC-DC conversion, or maybe with also a backup battery.


Which is why trucks already have 24v as otherwise they would need too much wire.


Very much true. A higher switching frequency means that a smaller transformer is needed for a given power load.

In reference to consumer power supplies, the only reason why GaN power bricks are any smaller than normal is that GaN can be run at a much higher frequency, needing a smaller inductor/transformer and thus shrinking the overall volume.

Transformers and inductors are often the largest (and heaviest!) part of any circuit as they cannot be shrunk without significantly changing their behavior.

Ref: Page 655, The Art of Electronics 3rd edition and Page 253, The Art of Electronics the X chapters by Paul Horowitz and Winfield Hill.


The higher the voltage, the less power is lost to resistance and the less money is spent on copper.

Short protection at the breaker for every circuit would probably be necessary at that voltage


Why are toasters better at 240V? Can’t you just pull more current if you’re only at 120V (or whatever it is in the US) and get the same power?

I guess there’s some internal resistance or something, but…


Correct. You can get the same power with half the voltage by doubling the current.

The trouble is the wires. A given wire gauge is limited in its ability to conduct current, not power. So if you double the current, you'll need roughly twice as much copper in your walls, in your fuse panel, in your appliance, etc.

Additionally, losses due to heat are proportional to the current. If you double the current and halve the voltage, you'll lose twice as much power by heating the wires. For just a house, this isn't a lot, but it's not zero.

This is why US households still have 240V available. If you have a large appliance that requires a lot of power, like an oven, water heater, dryer, L2 EV charger, etc, you really want to use more voltage and less current. Otherwise the wires start getting ridiculous.

This is not to say that higher voltage is just necessarily better. Most of the EU, and the UK in particular, has plugs/outlets which are substantially more robust and harder to accidentally connect the line voltage to a human. Lots of people talk about how much safer, for instance, UK plugs/outlets are than US plugs. If you look at the numbers though, the UK has more total deaths per year from electrocution than the US, despite the fact the US is substantially more populous. This isn't because of the plugs or the outlets; US plugs really are bad and UK plugs really are good. But overall, the US has fewer deaths because we have lower voltage; it's not as easy to kill someone with 120V as 240V.

So there's a tradeoff. There is no best one size fits all solution.


This is a very well written comment overall, but the energy loss in the wire is even worse than stated!

By modelling the wire as an (ideal) resistor and applying Ohm's law, you can get P = I^2*R. The power lost in the wire is actually proportional to the square of the current through it!

Therefore, if you double the current, the heat quadruples instead of doubling! You actually have to use four times the copper (to decrease resistance by 4x and get heat under control), or the wasted energy quadruples too.

Crucially, voltage is not in the equation, so high voltages - tens or hundreds of kilovolts - are used for long distance power transmission to maximise efficiency (and other impedance-related reasons).
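
A small sketch of both points together (the 1800W load and the wiring resistance are assumed for illustration only):

  # Same appliance power delivered at 240 V vs 120 V through the same run of wire
  R_WIRE = 0.1   # ohms of round-trip branch wiring (assumed)
  POWER = 1800   # watts drawn by the appliance (assumed)

  for volts in (240, 120):
      amps = POWER / volts
      print(f"{volts} V: {amps:.1f} A, {amps**2 * R_WIRE:.1f} W lost heating the wire")
  # 240 V: 7.5 A, ~5.6 W;  120 V: 15 A, ~22.5 W -- half the voltage, double the current,
  # four times the heat in the same copper.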


I was also surprised to read that heat is proportional to the current. In addition, increasing the temperature also increases the resistance of the conductor (Cu). It's around 0.4% per 1°C for Cu, so around 20% more heat at 70°C.

Not sure about the US, yet some high-current lines (think of three-phase ~400V x 36A; IEC 60502-1) in households are actually made of Al, not Cu. They tend to be underground though; the wires in the walls are still Cu.


Al wire is used a lot more than most people think. Here are the big differences between Al and Cu wire.

Cu is more conductive than Al so an Al wire has to have a cross section area about 1.56 times that of a Cu with the same current capacity.

But Cu is also denser than Al so the Al wire is only about 0.47 times the weight of the Cu wire.

Al is much cheaper than Cu, so the Al wire is only about 13% the cost of the Cu wire.


Al wire is prone to oxidation and thus needs an antioxidant paste applied at connection points.


Hasn't anyone tried using silver wires?


Silver wire is used for some things, but it is a lot more expensive than copper which rules it out for most applications.

Here is a table of the conductivity (in units of 10^7 S/m), the density, and the cost of copper (Cu), aluminum (Al), silver (Ag), and gold (Au).

               Cu    Al      Ag       Au
  Conductivity 5.96  3.5     6.3      4.1
  g/cm^3       8.96  2.6    10.5     19.3
  $/kg         9.03  1.2  1030    92100
If we had a copper wire with a specified capacity in amps, here is what aluminum, silver, and gold wires of the same length and capacity would weigh and cost as a percentage of the weight and cost of the copper wire, and what their diameter would be as a percentage of the diameter of the copper wire.

      Weight  Cost  Diameter
  Al    49       7    139
  Ag   110   12646     97
  Au   310 3190000    121
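
Those percentages can be reproduced from the first table (and they match the Al/Cu ratios a couple of comments up); a quick sketch using the table's values as given:

  # Relative weight and cost of a wire with the same length and current capacity as a
  # copper reference, assuming capacity ~ conductivity * cross-section area.
  # (conductivity in 1e7 S/m, density in g/cm^3, price in $/kg -- values from the table)
  metals = {"Cu": (5.96, 8.96, 9.03), "Al": (3.5, 2.6, 1.2),
            "Ag": (6.3, 10.5, 1030), "Au": (4.1, 19.3, 92100)}
  cu_sigma, cu_density, cu_price = metals["Cu"]
  for name, (sigma, density, price) in metals.items():
      area = cu_sigma / sigma              # relative cross-section for equal conductance
      weight = area * density / cu_density
      cost = weight * price / cu_price
      print(f"{name}: {weight:.2f}x the weight, {cost:.2f}x the cost of the Cu wire")
  # Al ~0.49x / 0.07x, Ag ~1.11x / ~126x, Au ~3.1x / ~32000x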


Suppose Ag was as cheap as, or even cheaper than Al, would it be more useful?


Sure, we could alloy and/or plate it. You have a source on Ag cheaper than Al you'd like to disclose to the class?


Silver and gold could be a lot more useful to society if they weren't so expensive. Plenty of asteroids out there to mine.


In 2017, there were 13 electrocution-related deaths in the UK. In the US, there are between 500 and 1,000 electrocution deaths per year. This translates to 0.019 deaths per 100,000 inhabitants in the UK and between 0.149 and 0.298 deaths per 100,000 inhabitants in the US.
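
Those rates check out against rough populations (~67 million for the UK, ~335 million for the US; the populations here are my assumption, the death counts are from the comment above):

  # Electrocution deaths per 100,000 inhabitants
  uk_pop, us_pop = 67e6, 335e6   # rough populations (assumed)
  print(13 / uk_pop * 1e5)                         # ~0.019 (UK, 2017)
  print(500 / us_pop * 1e5, 1000 / us_pop * 1e5)   # ~0.15 to ~0.30 (US range)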


Note that there is 240V in every US home. Low-power loads run from a center-tapped 120V circuit. Also, I wonder if people manage to electrocute themselves on "high voltage" circuits (typically 7.5kV) which, due to the 120V thing, have to be run everywhere and so are more available to the casual electrocutee. In the UK, although there are high voltage transmission lines, they terminate at substations, are often buried at that point, and there are high fences that make it hard to electrocute oneself. E.g. a guy around here managed to electrocute himself very badly (but still survived) out in the woods by touching a bear that had itself been electrocuted by a live unprotected 7.5kV line.


Your deaths claim surprised me. AFAICT England has ~10 deaths by electrocution per year. The US has ~100 domestic electrocutions and even more occupational electrocutions.


How many of those deaths are attributable to electrical plugs, though? Given the US CPSC report at [1], I suspect that most of them aren't - in fact, one of the leading causes of electrocution deaths is "fractal wood burning".

[1]: https://www.cpsc.gov/s3fs-public/Electrocutions-2011-to-2020...


Hard to compare. Doesn't the US allow people to work on their own outlets etc. in their own house, while in the UK you need to hire an electrician?


I don't know if toasters are close to max power draw, but kettles certainly are.

Most places with 240V regularly have 16A sockets, allowing a maximum draw of 3840W. That's the limit. Cheap fast kettles will often draw 3000W and boil 250ml of room-temperature water in 30s.

Kettles in the US are often limited to 15A and thus max 1800W (usually 1500W) and take twice as long (60s)

Technology Connections has a great video on this: https://youtu.be/_yMMTVVJI4c


I mention Impulse Labs and their battery-assisted 120V high power induction range in the comments elsewhere. Seems like a similar concept could be used to make an incredibly powerful kettle; throw in a battery that charges from the mains, and when you ask to boil, send in 20kW and boil the 250ml in 4 seconds.

    4.18 J/g/C * 250g * (1 / 20,000 J/s) * 75C = 3.918s


For that order of magnitude to work, in practice, the most challenging aspect will be getting enough surface area between the water and heater.

Otherwise, you will very quickly vaporize the water near the heater and the resulting lack of contact will inhibit heating the rest of the water volume.


Yes — in fact this is a problem with the high-power setting on our induction hob (which I think is something like 5kW). Liquids bubble vigorously before they're even warm.


Microwave radiation could work to transfer heat in even as boiling initiates in spots. All the best kettles are BIS dual use items.


If you can transmit that amount of heat that quickly, I think it'd be much more convenient and feasible to have it in the form factor of an instant-boiling-water spout next to the sink, rather than a kettle. Then, rather than having to fill the kettle first, you directly get the amount of hot water you need into the vessel you want it in, and you can put a precise amount of heat into a precise amount of water flowing through a pipe to emit it at the right temperature.


By the way, you can already have a boiling water tap today: you just buy a device that uses a hot water tank to store the energy rather than a battery. Insinkerator sells these. It might not be as energy efficient as the hypothetical tankless water boiler you describe, because you have some losses from the heat slowly leaking away from the tank, but given battery costs, I suspect that over the lifetime of the device these losses add up to less than what the battery costs.


Yeah that's a great way to start a fire with a lot of steam first :)


Houses are wired for 16A per circuit on both sides of the pond, with high-power appliances typically pulling around 10A to avoid tripping the fuse when something else is turned on at the same time. It's just a nice point where wires are easy to handle, plugs are compact, and everything is relatively cheap.

The US could have toasters and hair dryers that work as well as European ones if everything was wired for 32A, but you only do that for porch heaters or electric vehicle chargers.


No, the standard in the US is 15 or 20A. 20 is more popular nowadays.

240V appliances typically get a 35 or 50A circuit.

But then you also have to deal with the fact that a lot of homes have wiring that can only handle 10A, but someone has replaced the glass fuse with a 20A breaker. Fun stuff.


I still haven't seen a single 20A domestic appliance though


I have, though it's semi-prosumer equipment. The largest UPS systems intended for standard setups, like someone might want for a small home-lab rack, can be found with 20A 120V plugs that work in normal outlets if they're on a 20A-rated circuit, like a kitchen refrigerator or bathroom sink outlet (the two most common sorts in rental units).

I suspect some beauty products might also use 20A, or in combination easily reach that.


Very common in light commercial applications, but almost unheard of in residential because nobody installs NEMA 20A sockets in houses even if the circuit can handle it.


More current needs thicker wires. The average US outlet is wired for 120V 15A. 20A circuits are somewhat common, though 20A receptacles are not. Certainly not enough for commodity appliances to rely on.

Going to more than 20A requires a multiphase circuit, which is much more expensive, and the plugs are unwieldy and not designed to be plugged and unplugged frequently.


> Going to more than 20amp requires a multiphase circuit

There is no multi-phase power available in the vast majority of US houses. A typical residence has a 120/240 split-phase service, which is single-phase only. A service drop is two hot conductors from two of the three transformer phases and a center-tapped (between the two hot legs) neutral conductor. Either hot leg is 120v to ground and line to line is 240V.

> https://en.m.wikipedia.org/wiki/Split-phase_electric_power

Single-phase breakers are also available in sizes larger than 20A, usually all the way up to 125A.


Having more current running through a wire means thicker wires. Higher voltage means less current to achieve the same power, so thinner wires for the same power. The tradeoff for higher voltage is it's more dangerous (higher chance of arcing etc).


You need thicker wires for the same power. Which is why Americans live in constant fear of power extension cords, and people in EU just daisy-chain them with abandon.


If you're in a country that uses type-G plugs, almost* all of those extension cords have fuses that break well below the current that will cause a problem.

* Currently using a cable spool which will have problems before blowing the fuse if it's wound up and I draw too much current. It has a thermal cutoff, but I still unspool some extra wire on the floor.


You need the same thickness of wire for 10A regardless of which voltage you have. So with 230V, your 10A wire will let you draw 2.3 kW while someone with 120V and 15A wire would only get 1.8 kW and pay more for the wiring.


You have a 240v toaster?


Well, closer to 230.


*goes to check real quick* Between 238V and 245V at my outlets.


Living in an a country with 240v mains, yep.


Today, with switch-mode supplies, I think DC would make the most sense. We lose much power to frequency deviation and to inductive loads causing phase shift.

It would also make sense to have high voltage and low voltage nets in houses: low voltage for lighting and other low power equipment, high voltage for power hungry equipment. For example 48V and 480V.


DC high voltage (200V) is very dangerous and pernicious, and more difficult to switch on and off because of arcs.


Very old US light switches often had both AC and DC current ratings, and the DC rating was noticeably lower due to the arcing problem, even with fast-acting snap action.

My grandfather worked in a Chicago office building that had 110V DC service even into the 1960s. He had to be careful to buy fans, radios, etc. that could run on DC.


Yes, it’s more dangerous, but technically “high voltage” doesn’t start until 1500 V DC (and 1000 V RMS for AC) by most standards (e.g. IEC 60038)


Do we really lose much to phase shift? Increasing apparent load only causes a tiny fraction of that much waste, and local compensation isn't very expensive.

DC or a significant frequency boost would be good inside a house for lights and electronic items. Not so great for distribution.

I'm not convinced multiple voltages would be a net benefit outside of the dedicated runs we already do for big appliances.


Interesting to think about what the future could look like. What if breaker boxes were "smart" and able to negotiate higher voltages like USB-C does? It would avoid the problem of a kid sticking a fork in an outlet, or a stray wire getting brushed accidentally when installing a light fixture.

Time will tell!


> What if breaker boxes were "smart" and able to negotiate higher voltages like USB-C does?

That'd be difficult; a breaker typically feeds an entire room, not a single outlet. (And when it does feed a single outlet, that's typically because it's dedicated to a specific large appliance, like an air conditioner or electric stove, which wouldn't benefit from being able to dynamically negotiate a voltage.)


Appliances, like electric range, that need higher power have dedicated 240V circuits. My understanding is that 240V circuits use thicker cable because they usually also have higher current. But it is possible to convert 120V to 240V if only one device, sometimes done for imported electric kettles.


> It would avoid the problem of a kid sticking a fork in an outlet

One of the best things about living in the UK! https://www.youtube.com/watch?v=UEfP1OKKz_Q


Without sunk costs, objectively it's 0 Hz (DC). Voltage is mostly irrelevant too.

https://en.wikipedia.org/wiki/High-voltage_direct_current


>Would it be a higher frequency (1000hz)?

Totally not, that would mean both worse skin effect and worse impedance. Likely the best option (if you really don't care about the existing infrastructure) would be DC, 0Hz. There are some downsides of DC, of course.


This would be excellent for solar because it removes the efficiency loss from the inverters. AFAIK it's very hard to add/remove loads without spiking power to everything else on the line. A friend of mine applied for a job at a company that was trying to run DC in the home but it's not a trivial endeavor.


Is there even a use to running DC in a home?

Almost everything complex does run on DC internally, but you feed those via AC adapters that convert it to DC. You'd have to get bespoke DC-DC adapters (converters, really) for everything.


AC-DC adapters effectively go: AC -> power-factor-correction boost to an even higher voltage -> high-voltage DC -> AC again (high frequency) -> transformer -> low-voltage DC, with feedback to the primary side; {then potentially a further DC-DC stage}.

Feeding in high-voltage DC would let you skip the first few of those AC->DC steps.


That is very interesting to know, thank you!

I always assumed it was just one induction and transformation step.


> Is there even a use to running DC in a home?

Lights, first and foremost. LEDs are DC.

> Almost everything complex does run on DC internally

Almost everything runs on either 5V or 12V DC. What you would need are appliances that bypass the wall-warts/adapters and tap directly off DC, but this comes with some significant challenges. I'm already talking way outside my wheelhouse though, so I'll stop before I make a mockery of this topic.


I meant it more in the sense of: "[in the current appliances environment] is there even a use to running DC in a/your home?"

It's very cool as a theoretical exercise and you could probably make a proof-of-concept house, but if you want to live in it and use literally anything non-bespoke, you have to convert DC to AC, which kind of defeats the purpose.


Absolutely, in the here-and-now. But as a system, if the entire appliance/solar industry committed to a household DC standard (and if it was technically feasible, which it very well may not be), then you might have something. Solar would just provide regulated DC (which presumably would be less lossy than inverting to AC). But the devil is in the details: a DC system would have to have a comms channel announcing "hey, I'm appliance X, and I intend to draw Y amps from the DC bus" and coordinate it with the supply, otherwise it will cause all other appliances to experience a drop in voltage. That's just one of the problems; there are others.

However, if that were worked out, you could have DC plugs in the house for all existing appliances and fixtures, and theoretically you would get a gain on both sides (no inverter, and no v-regulator (or a simpler, less lossy one)).
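
Purely as a toy illustration of that comms-channel idea (nothing like this exists as a standard; the bus voltage, current limit, and message fields are all invented):

  # Hypothetical sketch: an appliance announces its intended draw on a shared DC bus
  # and the supply accepts or refuses it, instead of letting the bus voltage sag.
  from dataclasses import dataclass

  @dataclass
  class LoadRequest:
      appliance: str
      amps: float

  BUS_LIMIT_AMPS = 60.0   # assumed supply limit on a 48 V bus
  active_amps = 0.0

  def request(load: LoadRequest) -> bool:
      global active_amps
      if active_amps + load.amps > BUS_LIMIT_AMPS:
          return False    # refused: appliance must wait or ask for less
      active_amps += load.amps
      return True

  print(request(LoadRequest("EV charger", 40)))  # True
  print(request(LoadRequest("kettle", 30)))      # False -- 70 A would exceed the 60 A limit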


I think one of the "easiest" steps would probably be to run a concurrent circuit of DC from a central inverter, stepped at the same voltage as the solar / home battery output.

Then you wire your EV charger and big electrical appliances like stoves, ovens, microwaves, fridges (?), electric boilers, central AC, heat pumps, etc. into that DC circuit.

That alone would switch the majority of your electrical consumption to DC. Maybe long-term, you could have a special DC socket that you plug toasters, kettles, crock pots, vacuums etc in to, if they became available and cheaper.


Probably about the same as we have.

Higher frequencies have terrible transmission (skin effect, transmission line length limit) and would start to interfere with radio.

Lower frequencies need larger transformers.

DC while nice is too expensive.

So about where we are now.


DC. Modern electronics make AC grids unnecessary.

48VDC inside homes would be enough for most applications except heating and it would be kid-safe.

240V for heating applications.


I don't care what the frequency is, I just want my LEDs to not flicker!


Normal LED lightbulbs shouldn't flicker on standard 60Hz circuits. Do you have a dimmer switch or smart switch in the circuit? I've noticed these cause flickering that's visible on camera or out of the side of my eyes.


Get better ones. I installed expensive LED lamps at home and they’re fine. The guys in the office picked the cheapest ones and I don’t want to turn these ugly things on.

Edit: Paulmann Velora are the expensive lamps at home.


That means that either the breaker is faulty, or the power stabilizer in the lamp itself is junk.


https://lamptest.ru tests for flicker too. I don't know if those models are sold in the US, though. Philips, Osram and IKEA should be.


What about if they flickered at 10^-15Hz?


It would be really annoying trying to tell if they're broken, or just currently in the "off" phase :)


Assuming they started on, you’d be lucky if they lasted long enough to ever reach their off state (~32 million years later).



