mrb a month ago

The 60 Hz grid is why we have HDDs spinning at 7200 RPM. Because of 60 Hz, it was simpler to design AC electrical motors to spin at 3600 RPM or 7200 RPM (a 1x or 2x factor). When DC motors were made, they often spun at the same speeds (3600 or 7200 RPM) for compatibility with AC motors, even though there was no electrical reason to do so. When HDD manufacturers selected DC motors for their drives, they picked 3600 RPM models because they were very common. Then, in the 1990s, 7200 RPM eventually became more common than 3600 for performance reasons.

  • geor9e a month ago

    Maybe this was obvious, but I will spell it out for slow people like myself:

    60 Hz = 60 cycles per second = 3600 cycles per minute

    A simple 2-pole AC motor spins 1 rotation per cycle, so 3600 RPM. AC is a sine-wave cycle of current: current flows one way to attract the rotor to one pole, then flows the other way to attract it to the other pole.

    For a big mainframe disc drive, it sounds like the obvious choice. Why they stuck with it after switching to DC, who knows. Maybe they didn't want to redesign the controller circuits.
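
    The rule of thumb behind this, for reference: synchronous speed = 120 × f / poles. A tiny sketch (Python, purely illustrative):

        # Synchronous speed of an AC motor from grid frequency and pole count
        def sync_rpm(freq_hz, poles):
            return 120 * freq_hz / poles

        print(sync_rpm(60, 2))  # 3600.0 -> the classic spindle speed on a 60 Hz grid
        print(sync_rpm(50, 2))  # 3000.0 -> the 50 Hz equivalent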

    • SoftTalker a month ago

      10k RPM is also not too uncommon.

      • newsclues a month ago

        And 15k SAS drives

        • MrDrMcCoy a month ago

          Their capacity is indeed sad. Would love it if they were keeping up with the density of slower drives.

  • emmelaich a month ago

    There was a brief stop-off at 5400 along the way.

    Actually, still available. Good to have something quieter when performance requirements are not as high.

    • kees99 a month ago

      5400 is (was?) very common RPM for laptop HDDs.

  • Tempest1981 a month ago

    And apparently why we have LED car taillights that flicker annoyingly.

    If anyone here works at a car manufacturer, please try to get this fixed. Apparently not everyone notices.

    • sathackr a month ago

      It's PWM dimming. I see it most on Cadillac SUVs. Drives me nuts too.

      They could have designed it to not be that low a frequency, but apparently someone decided 40-50 Hz was imperceptible to humans and went with it.

    • lutorm a month ago

      I find it most annoying when moving my eyes. Anyone will notice the flicker if their eyes aren't stationary, since then each "flick" ends up on a different part of the retina. So instead of normal motion blur, you see a dashed line, which is hard to ignore.

    • hunter2_ a month ago

      What's the cause? Alternators run at very high frequencies with good rectifiers, so I'm guessing the flicker is introduced by PWM dimming, but why would that be a low enough frequency to bother people?

      I'm sensitive to flicker myself, but only on the more extreme half of the spectrum. For example, half rectified LED drivers on 60 Hz AC drive me nuts, but full rectified (120 Hz) I very rarely notice. I don't notice any problem with car tail lights, except in the case of a video camera where the flicker and the frame rate are beating. The beating tends to be on the order of 10 Hz (just shooting from the hip here) so if frame rates are 30/60/120 then I guess the PWM frequency is something like 110 or 130 Hz?
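
      A quick sketch of that beat arithmetic (Python; the 110/130 Hz guesses above are the inputs):

          # Perceived beat: distance from the PWM frequency to the nearest
          # harmonic of the camera frame rate, f_beat = |f_pwm - k * fps|
          def beat_hz(f_pwm, fps):
              k = round(f_pwm / fps)
              return abs(f_pwm - k * fps)

          for f_pwm in (110, 130):
              print(f_pwm, [beat_hz(f_pwm, fps) for fps in (30, 60, 120)])
          # 110 -> [10, 10, 10] and 130 -> [10, 10, 10]: both give ~10 Hz beats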

      • SV_BubbleTime a month ago

        Automotive engineer here! I don’t do lights, so I have no idea!

        BUT… I kinda do. You want the lowest PWM frequency you can get away with. In this case, at the back of the car, furthest from the battery, you really don't want an 8 kHz PWM, nor do you need it. It costs money to isolate the supply from that switching noise, so you don't want a noisy field for no reason. The “good enough” frame rate is 60Hz static, not moving, no other flashing lights, not using a camera, etc.

        60Hz or 60fps has issues though. If you expose a PWM LED to another flashing light or movement you get really bad imaging. Imagine you took an LED in your hand and shook it, at 60 Hz you will see snapshots of where the LED was as you’re shaking it. At 240Hz you will see a blur. Guess which is better for a vehicle?

        I figure most car LED taillights internal to their housing would be 200-1000Hz depending on factors but I haven’t ever measured.

        200Hz PWM is a really common value. No need for Samaritan base-12 here.

        For halogen and incandescent, we use PWM, fun fact. Low Hz though! About 88Hz, depending on voltage. You might wonder why. We can get 98% of the light output with 85% of the required wire. It’s all about weight and cost. Although not many vehicles use this anymore.

        • flimflamm a month ago

          Imagine your eye's effective integration time (the period over which it "samples" light) shrinks during a rapid saccade—say down to about 5–10 ms. Under steady conditions, our flicker fusion threshold is around 60 Hz, but that’s because our retina integrates over a longer window.

          If we want to “smooth out” the PWM cycles so we don’t see discrete pulses, we need a few cycles (say, 3–5) within that 5–10 ms window. In other words:

              For a 10 ms integration window:
                  3 cycles → f ≥ 3/0.01 s = 300 Hz
                  5 cycles → f ≥ 5/0.01 s = 500 Hz
              For a 5 ms window:
                  3 cycles → f ≥ 3/0.005 s = 600 Hz
                  5 cycles → f ≥ 5/0.005 s = 1000 Hz
          
          So, to cover worst-case scenarios (rapid eye movement, bright conditions where the eye's temporal resolution is higher), PWM for LED lights should be in the ballpark of 300–1000 Hz rather than 200 Hz. That is assuming one is viewing the lights from some 2 meters away, which affects how large and fast the viewing-angle change is.
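
          As a toy calculation (Python; the window lengths are the assumptions above):

              # Minimum PWM frequency so n full cycles fit in the eye's
              # integration window
              def f_min_hz(n_cycles, window_s):
                  return n_cycles / window_s

              for window in (0.010, 0.005):
                  for n in (3, 5):
                      print(n, "cycles in", window * 1000, "ms ->", f_min_hz(n, window), "Hz")
              # 300, 500, 600, and 1000 Hz, matching the figures above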

          And yes, what we are now seeing in cars is super annoying. A similar disregard for user comfort can be seen with HUD displays (at least with Volvo).

        • hunter2_ a month ago

          Fascinating! I've played with VW's BCM stuff, but for some reason everything except the lighting is translated into English on the common factory-equivalent scan tools, so it's a matter of sorting through a bunch of "lichtfunktion" this and "dimmwert" that.

          > We can get 98% of the light output with 85% of the required wire.

          I guess a 12v bulb can survive a much higher voltage than it's rated for (which needs fewer amps for the same watts, so thinner wire) if it's pulsed, like 120v assuming 10% duty cycle, as long as the frequency is dialed in such that the filament is just about ramped up to its rated temperature when power is cut. Very clever!

        • devit a month ago

          Actually, you want to use a DC-to-DC converter that properly delivers the constant desired reduced voltage, rather than making a ridiculous stroboscopic light (aka PWM).

          • SV_BubbleTime a month ago

            Constant current is more expensive and does not scale well to a lot of LEDs or high-power LEDs. The scale wouldn't be an issue with the tail lights, but the power kind of is. Take a look at modern taillights sometime; they all have heatsinks on them. I'm sure there are some, but I am not aware of any constant-current taillights.

        • consp a month ago

          > For halogen and incandescent, we use PWM,

          Must be fancy cars then. All I've ever owned used relays to switch the lights: just straight 12V onto the bulb, no PWM whatsoever.

          • SV_BubbleTime a month ago

            That used to be the case, of course.

            We PWM those now because the PWM drivers are there anyhow, and with a filament, it’s still lit and bright during the off pulses. It’s cooling, but you really can’t detect it.

      • brandmeyer a month ago

        > introduced by PWM dimming, but why would that be a low enough frequency to bother people?

        The human fovea has a much lower effective refresh rate than your peripheral vision. So you might notice the flickering of tail lights (and daytime running lights) seen out of the corner of your eye even though you can't notice when looking directly at them.

        • hunter2_ a month ago

          For sure. I summarized my particular sensitivity too aggressively earlier, but I tend to see flicker straight on up to 65 Hz and peripherally up to 120 Hz if it's particularly egregious (i.e., long valleys) but usually up to something less. In any case, I've never noticed car tail lights flickering even peripherally, despite video artifacts revealing that they do flicker.

    • arcanemachiner a month ago

      I see it, but only in my peripheral vision. Drives me nuts.

      • taneq a month ago

        This is because your cones (colour-specific 'pixels' in your retina) cluster around your fovea (high-resolution area around the center of focus) while the rods (black-and-white, bigger, dumber, but much more sensitive and therefore faster-responding 'pixels') fill the rest of your visual field. Rods respond fast enough to see the flicker but cones don't.

        You might also notice in light that's almost too dim to see, you can see things better if you look a bit to the side of them, for the same reason.

      • Tempest1981 a month ago

        For me, it's also noticeable when my eyes are sweeping from one object to another. Like you should be doing when driving. I guess that's somewhat peripheral, but not entirely.

        Find a Cadillac Escalade with the vertical LED lights, and give it a try.

        What's a good way to capture this on video? Or to measure the frequency?

        • taneq a month ago

          That's a different thing. Your eyes automatically filter out the blurry signal during a saccade (ie. while they're doing a fast pan from one location to another.)

          Rapidly strobing lights (eg. LEDs being PWM'd) mess with this process and cause you to perceive the intermediate images as your eye moves, leading to some weird visual effects. This was a big thing in the early days of DMD projectors, where they projected sequential red/green/blue frames, and while it looked great when you stared at it, any time your eyes moved the whole image strobed with overlaid RGB fields. (more info: https://www.prodigitalweb.com/rainbow-effect/)

      • spockz a month ago

        Or for some reason in the rear view and side mirrors!

  • hcarvalhoalves a month ago

    Also 5400 RPM HDDs were common – that would be a 1.5x factor.

  • lutorm a month ago

    Did European hard drives run at 3000 rpm then?

Ericson2314 a month ago

https://en.wikipedia.org/wiki/Amtrak%27s_25_Hz_traction_powe... predates standardized 60 Hz, and still hasn't been converted (!!)

  • _trampeltier a month ago

    Swiss trains still use 16.7 Hz (16 2/3 Hz)

    • Ericson2314 a month ago

      Yes indeed, but the German-Swiss-Austrian 16.7 Hz system is many orders of magnitude bigger than the Southend Electrification! The path dependency is much more understandable in that case.

      • cenamus a month ago

        It's also precisely 16.7 Hz (save for the fluctuations) and not 16 2/3 Hz, for some electrical reasons regarding transforming 50 Hz to 16.7 Hz mechanically, which I don't completely understand

        • schobi a month ago

          I understand that historically it was 16 2/3 Hz, and power transfer to and from the 50 Hz grid was done via a 1:3 mechanical coupling with a motor and generator. That was a ratio easily achievable with gears.

          Nowadays with switched power supplies, this is not a problem any more. Keeping track of 16.7 Hz seems a little easier. Imagine building a numeric display for a power plant operator to see how far off you are.

          • Symbiote a month ago

            > Imagine building a numeric display for a power plant operator to see how far off you are.

            You could build a display with 3 as the denominator, and a decimal numerator:

               | |_   2.1
               | | \  ---
               | \_/   3

layer8 a month ago

To cite the main origin:

> Stillwell recalled distinctly the final meeting of the committee at which this recommendation was agreed upon. They were disposed to adopt 50 cycles, but American arc light carbons then available commercially did not give good results at that frequency and this was an important feature which led them to go higher. In response to a question from Stillwell as to the best frequencies for motors, Scott said, in effect, “Anything between 6,000 alternations (50 Hz) and 8,000 alternations per minute (67 Hz).” Stillwell then suggested 60 cycles per second, and this was agreed to.

freeqaz a month ago

If we could magically pick a frequency and voltage for electrical systems to use (without sunk costs), what would it be?

What's the most efficient for modern grids and electronics?

Would it be a higher frequency (1000 Hz)?

I know higher voltage systems are more dangerous but make it easier to transmit more power (toaster ovens in the EU are better because of 240v). I'm curious if we would pick a different voltage too and just have better/safer outlets.

  • burnerthrow008 a month ago

    > If we could magically pick a frequency and voltage for electrical systems to use (without sunk costs), what would it be?

    > What's the most efficient for modern grids and electronics?

    I do not think it is possible to answer the question as posed. It is a trade-off. Higher frequencies permit smaller transformers in distribution equipment and smaller filtering capacitors at point of use. On the other hand, the skin effect increases transmission losses at higher frequencies.

    If you want minimum losses in the transmission network, especially a very long distance transmission network, then low frequencies are better.

    If you want to minimize the size and cost of transformers, higher frequencies might be better. Maybe the generator is close to the user so transmission loss is less important.

    If you want smaller end-user devices, high frequency or DC might be more desirable.

    You have to define some kind of objective function before the question becomes answerable.
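
    To put rough numbers on the skin-effect side of that trade-off (a sketch; copper constants assumed):

        import math

        RHO_CU = 1.68e-8        # resistivity of copper, ohm*m
        MU_0 = 4e-7 * math.pi   # permeability, H/m (copper is ~non-magnetic)

        # Skin depth: delta = sqrt(2 * rho / (omega * mu))
        def skin_depth_mm(freq_hz):
            omega = 2 * math.pi * freq_hz
            return math.sqrt(2 * RHO_CU / (omega * MU_0)) * 1000

        for f in (50, 60, 400, 1000):
            print(f, "Hz ->", round(skin_depth_mm(f), 1), "mm")
        # ~9.2 mm at 50 Hz shrinking to ~2.1 mm at 1 kHz: thick conductors
        # stop helping as frequency rises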

    • willis936 a month ago

      I think the question could be constrained as "what frequency uses the minimum amount of copper to remake the electrical distribution network that exists today?"

      This would be a pretty good approximation of the ratio of transmission lines to transformers.

      • Panzer04 a month ago

        You could build the lot with DC and massively reduce transformers, but transformers are probably a lot more reliable than switching converters everywhere. Not sure which would be cheaper tbh.

        • mgiampapa a month ago

          Sure, the transformers would be smaller, but your transmission lines would be thicc.

          • Mistletoe a month ago

            Isn’t it the other way around?

            >Generally, for long-distance power transmission, DC lines can be thinner than AC lines because of the "skin effect" in AC, which concentrates current flow near the surface of the conductor, making thicker wires less efficient; therefore, for the same power transmission, a DC line can be smaller in diameter than an AC line

            • lazide a month ago

              The issue is actually that DC voltage conversion is much harder than AC, because AC can use transformers, and DC can’t.

              This is especially a problem at high voltages and currents.

              Also, DC arcs don’t self extinguish as well as AC arcs do, so DC arcs are a lot more dangerous and destructive.

              It’s why HVDC lines are still relatively rare (and capital expensive), and typically used for long haul or under salt water, where the inductive loss from AC would cost more than the higher capital costs required for DC voltage conversion and stability.

      • TheSpiceIsLife a month ago

        Distribution lines are aluminium.

        • mrlonglong a month ago

          Why isn't silver used though? It's a better conductor isn't it?

          • AdrianB1 a month ago

            Cost and weight. High voltage electrical lines use aluminium because of the weight, they are mostly aerial lines. Silver is too expensive to use for almost anything.

            • mrlonglong a month ago

              But suppose it was cheap enough, would it be useful?

      • NooneAtAll3 a month ago

        this gives a wrong assumption that the optimal distribution network is _the same_ for different frequencies

        or that it, itself, isn't a consequence of its own series of sunk-cost fallacies

        • willis936 a month ago

          It doesn't. That's why I said it would be a good approximation. It's a classic optimization problem. Two iterations is a lot more informative than one.

    • im3w1l a month ago

      Wouldn't it make sense to do both then? Low-frequency or even DC long-distance transmission that gets converted to the standard frequency closer to the user?

      • connicpu a month ago

        There are considerable losses involved when you want to convert between frequencies. DC also has considerable losses over long distances, so there's a lower bound on the frequency before the efficiency starts to go down again.

        • somat a month ago

          It is not that DC has more losses; it is that transforming DC voltage is non-trivial. With AC you just need a couple of magnetically coupled inductors: no moving parts, easy to build, efficient, reliable. With DC this does not work; you need to convert it to AC first, do the transform, then convert it back. Nowadays we can achieve pretty good efficiency doing this with modern semiconductor switches, but historically you needed to use something like a motor-generator, and the efficiency losses were enough that just transmitting in AC was the clear winner.

          The losses-over-distance thing is the fundamental conflict between desired properties. For transmission you want as high a voltage as possible, but high voltage is both very dangerous and tricky to contain, so for residential use you want a much lower voltage. We picked ~200 volts as fit for purpose for this task, but 200 volts has high losses during long-distance transmission. So having a way to transform between voltages is critical.

          Some of our highest voltage most efficient long distance transmission lines are DC, but this is only possible due to modern semiconducting switches.

          https://en.wikipedia.org/wiki/High-voltage_direct_current

        • manwe150 a month ago

          Right, with modern technology my understanding is that HVDC (essentially 0 Hz?) is the way to go now (high voltage to minimize resistive loss, DC to minimize skin effect) if we were building a new grid with the same wires, but it's not economical to retrofit an existing system that is already working, since it is already working.

        • mlyons1340 a month ago

          Power line losses are proportional to I^2R, so whether it's DC or AC isn't really the concern. V=IR, so assuming R is constant, a higher transmission voltage results in exponentially lower power losses. DC is actually what's currently used for long distances to achieve the lowest power line losses (HVDC).

          • choilive a month ago

            The skin effect is an important difference (and advantage in favor of HVDC) so it is in fact a concern of AC vs DC.

            • mlyons1340 a month ago

              True, skin effect limits the conductor size (~22mm in Al @60Hz) but overhead transmission already uses bundled conductors to address that as well as to improve mechanical strength, cooling, and reactance. The advantage of HVDC is in the lower dielectric and reactive losses while skin effect is minimal.

            • lazide a month ago

              Also inductive losses, which are only a thing in AC.

          • ahartmetz a month ago

            >exponentially

            quadratically
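
            A quick check of the quadratic scaling (a sketch; the numbers are made up):

                # Loss when delivering fixed power P over line resistance R:
                # I = P / V, so P_loss = (P / V)^2 * R
                def line_loss_mw(p_w, v_volts, r_ohms):
                    i = p_w / v_volts
                    return i * i * r_ohms / 1e6

                for v_kv in (100, 200, 400):
                    print(v_kv, "kV ->", line_loss_mw(100e6, v_kv * 1e3, 10), "MW lost")
                # 10.0, 2.5, 0.625: doubling the voltage quarters the loss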

    • osigurdson a month ago

      Objective function: minimize operational cost
      Decision variable(s): electrical system frequency
      Scope: global electrical grid

      • Brian_K_White a month ago

        2 meaningless statements.

        For instance, I would say that the scope of the global electrical grid includes every phone charger. Not just because the last-foot devices are technically connected, but because they are the reason the rest even exists in the first place. So nothing that serves either the long haul or the local at the expense of the other can be called "minimal operational cost".

        So trains use their own 25 Hz or even lower because that's good for long haul. But that would mean phone chargers are undesirably large and heavy. Or maybe it would mean that every house has its own mini power station that converts the 25 Hz utility to something actually usable locally.

        Meanwhile planes use 400 Hz 200 V 3-phase for some mix of reasons I don't know, but it will be a balance of factors that really only applies on planes. Things like not only power-to-weight but also the fact that there is no such thing as a mile-long run on a plane, the greater importance of avoiding wires getting hot from high current, etc.

        Simply saying "the objective function is 'what is best?' and the scope is 'global'" doesn't turn an undefined scope and objective into defined ones.

        • osigurdson a month ago

          Not the original commenter, but here is another angle: If the original grid designers had a time machine and spent 2 decades studying electrical engineering in modern times before going back, what frequency would they have chosen?

          Does this help you understand the original commenter's question?

          • Brian_K_White a month ago

            In this imaginary world, does every house and business have its own power station? Are small electronics powered by DC or AC? Do perhaps power strips incorporate something like a power supply that takes the long-haul power and provides something locally more usable, like how many power strips today incorporate USB power supplies? Is it worth making the grid slightly less efficient for household/office usage in trade for making it more efficient for EV charging at every parking spot, or is there something totally different, like wireless power in the roads all along the roads...

            You're asking how high is up.

            • osigurdson a month ago

              Any simulation is an "imaginary world". Anyway, you clearly have no answers and add zero value to the conversation with your lame horse laugh fallacy responses. So, please, the next time someone asks a question out of curiosity (as the original commenter did), spare us your condescending and useless, zero value response.

        • osigurdson a month ago

          Maximizing what is best (i.e. NPV) is the goal of many uncertainty studies.

  • redox99 a month ago

    You could push it to 100 Hz, MAYBE 200 Hz at most. Higher than that, transmission losses due to the skin effect would make it a bad idea. Also, generators would require too high an RPM.

    240v is a good middle ground for safety and power.

    • anticensor a month ago

      For 1kHz, you'd run the generators at the same speeds you run them for 50Hz but with 20 times as many poles.

  • Workaccount2 a month ago

    Higher Voltage: Less conductor material needed, smaller wires. But need more spacing inside electronics to prevent arcing, and it becomes more zappy to humans. Also becomes harder to work with over 1kV as silicon kind of hits a limit around there.

    Higher Frequency: Things that use electricity can be made smaller. But losses in long transmission become much worse.

    DC instead of AC: Lower losses in transmission, don't need as much spacing inside electronics for arcing. But harder and less efficient to convert to different voltages.

  • UltraSane a month ago

    The skin effect causes the AC current density J in a conductor to decrease exponentially from its value at the surface J_S with depth d from the surface: J = J_S e^(−d/δ), where the skin depth δ decreases as the square root of frequency. This means that the effective power a wire can carry decreases with increasing AC frequency.

    https://en.wikipedia.org/wiki/Skin_effect#Formula
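
    A sketch of that falloff (Python; the ~8.5 mm skin depth of copper at 60 Hz is an assumed constant):

        import math

        # Current density at depth d: J(d) = J_S * exp(-d / delta)
        def j_fraction(depth_mm, delta_mm=8.5):
            return math.exp(-depth_mm / delta_mm)

        for d in (0, 8.5, 17, 25.5):
            print(d, "mm deep:", round(j_fraction(d) * 100), "% of surface density")
        # 100%, 37%, 14%, 5%: most of the current hugs the surface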

    • Zardoz84 a month ago

      At the same time, high frequency makes high voltage safer (or at least more survivable): I have received 15 kV discharges at high frequency and lived to write about it.

  • andrewla a month ago

    Impulse Labs has built an induction range that has super high power; their trick is that they have a battery that they recharge from the mains. Might be expensive but the same technique could work for a toaster (or for a water kettle) to basically keep a whole bunch of energy in reserve and deliver it when needed.

    • marssaxman a month ago

      That's a great idea - I wonder if electric kettles would be more popular in the US if they worked as quickly as they do on 240V? How large a volume of battery would one need to accomplish this, I wonder?

      • nicoburns a month ago

        Unfortunately quite a large one. Electric kettles are typically one of the highest power-draw items in a typical household. A 3kW kettle would need a ~250Wh battery for a 5 minute boil (plus electronics capable of sustaining 3kW for that time period and recharging the battery at a reasonable rate afterwards). This would likely be larger than the entirety of the rest of the kettle with current technology.
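
        The arithmetic behind that figure, as a sketch (3 kW and a 5 minute boil assumed, per the above):

            # Battery energy for one boil: E[Wh] = P[W] * t[h]
            def boil_energy_wh(power_w, minutes):
                return power_w * minutes / 60

            print(boil_energy_wh(3000, 5))  # 250.0 Wh, i.e. the ~250 Wh above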

        • marssaxman a month ago

          That's not so bad; why not cut the battery draw in half by pulling from the wall circuit in parallel? 120 Wh in power tool batteries would cost ~$170-$180, so we can imagine a $200 kettle. Expensive, not a mass-market item, but give it a few more years of continued battery improvement and maybe this could happen. The base would certainly be much bulkier than that of a normal electric kettle, but it would still be smaller than the base of an average blender, so not unreasonable for a North American kitchen.

      • duskwuff a month ago

        I'm not sure it'd be commercially viable. A stove is a major home appliance, and the Impulse Labs unit is a high-end one with a price tag of $6000. An electric kettle, on the other hand, is considered a near-disposable piece of home electronics with a price closer to $50; it'd be hard to build a substantial battery into one at an affordable price.

        • xyzzyz a month ago

          It would cost less than $50 to equip a kettle with an appropriately sized battery. You only need something like 0.2 kWh of capacity.

      • bobthepanda a month ago

        Electric kettles mostly aren’t popular because of a perceived lack of need.

        Most Americans don’t drink tea and most coffeemakers heat water themselves. For most other applications using a pot on a stove is not a deal breaker.

      • Borealid a month ago

        I wouldn't want a kettle that wears out after only 3-10 years of use.

        • marssaxman a month ago

          That is a reasonable criticism, but getting three entire years of use from a hypothetical battery-electric kettle sounds like a genuine improvement to me. With a conventional 120V kettle, I get maybe 2-3 uses out of it before its overwhelming gutlessness frustrates me so much I can't stand to have it around anymore.

      • avidiax a month ago

        > electric kettles would be more popular in the US if they worked as quickly as they do on 240V

        Euro standards are 8-10A 240V circuits. I have an EU kettle, and it draws max 2200W.

        US standards are 15A 120V circuits. It could draw 1800W, though some kettles might limit to 12A and draw 1440W.

        So a Euro kettle might have 22%-52% more power than a US one, meaning a 4 minute Euro boil stretches to 4m53s, or 6m7s worst case, on US power.

        So it seems like it's not a significant factor, though it would be useful if US kettles really maximize power.
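
        That time scaling, sketched (the 4 minute / 2200 W baseline above is the assumption):

            # Boil time scales inversely with power: t2 = t1 * p1 / p2
            def boil_time(base_s, base_w, new_w):
                t = base_s * base_w / new_w
                return f"{int(t // 60)}m{round(t % 60)}s"

            print(boil_time(240, 2200, 1800))  # 4m53s
            print(boil_time(240, 2200, 1440))  # 6m7s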

        • gmac a month ago

          In the UK, where almost everyone has an electric kettle, almost all are 3kW.

          Our electric circuits aren't rated for 13A continuous draw (e.g. plug-in EV chargers should be set to 8A or 10A), but they are fine at 13A for the few minutes it takes to boil a kettle. 2.2kW kettles would be a major drain on productivity: millions of extra minutes spent every day waiting for a cup of tea!

          • chipsa a month ago

            Most electric circuits are not rated for continuous load at rated power. The US nominal 15A circuit can only supply 12A continuous. Thus heaters of all types and EV charging being limited to about 1.5kW.

          • Symbiote a month ago

            I think 13A continuous draw is OK, as 3kW electric heaters are widely available: https://www.dimplex.co.uk/products/portable-heating/convecto...

            Perhaps electric cars limit their draw below 13A as they're much more likely to be connected using extension leads.

            • gmac a month ago

              You would certainly hope that selling 3kW heaters was an indication of that, and that's what I used to think, but what I've read about EV charging makes me think that it isn't.

        • AdrianB1 a month ago

          In Europe the 16A Schuko (https://en.wikipedia.org/wiki/Schuko) is the most common socket and 230V the most common voltage; 240V is mostly used by the UK. It allows for a max of ~3.7kW, but devices over 3.5kW are rare.

          • anticensor a month ago

            Unregulated continuous resistive load per Schuko plug is limited to 10A (2.2kW). 3kW for a few minutes at a time or with active monitoring.

            • AdrianB1 a month ago

              The context is electric kettles. Boiling water takes a few minutes at a time.

        • rsynnott a month ago

          > Euro standards are 8-10A 240V circuits.

          Hrm, which country is that? Something between 13 and 16 amps is normal everywhere in Western Europe that I can think of, at 230V.

          In Ireland, any random kettle that you buy is likely to be 3kW (pedantically, 2.99kW); you do sometimes see 2.2kW ones, generally _very_ cheap ones.

          • avidiax a month ago

            This was a kettle bought in Germany, then used in Switzerland.

            The Swiss outlets in my recent construction apartment were 8A. The standard allows a different outlet with higher amperage but I only ever saw that in commercial settings, similar to US 20A outlets.

    • jamesy0ung a month ago

      As an Aussie, it's weird to think that induction and other cooking appliances are so slow or expensive. We have a $200 AUD induction stovetop from Aldi that draws up to 7.2kW across 4 pans.

  • buildsjets a month ago

    Aircraft AC electrical systems are 115V 400Hz, allegedly to minimize component weight.

    • throw0101c a month ago

      > Induction motors turn at a speed proportional to frequency, so a high frequency power supply allows more power to be obtained for the same motor volume and mass. Transformers and motors for 400 Hz are much smaller and lighter than at 50 or 60 Hz, which is an advantage in aircraft (and ships). Transformers can be made smaller because the magnetic core can be much smaller for the same power level. Thus, a United States military standard MIL-STD-704 exists for aircraft use of 400 Hz power.

      > So why not use 400 Hz everywhere? Such high frequencies cannot be economically transmitted long distances, since the increased frequency greatly increases series impedance due to the inductance of transmission lines, making power transmission difficult. Consequently, 400 Hz power systems are usually confined to a building or vehicle.

      * https://aviation.stackexchange.com/questions/36381/why-do-ai...
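
      The size effect drops out of the transformer EMF equation, V = 4.44 f N A B: for a fixed voltage, turn count, and flux density, the required core area A scales as 1/f. A sketch (the 115 V / 100 turns / 1.2 T figures are made-up assumptions):

          # Required core cross-section: A = V / (4.44 * f * N * B)
          def core_area_cm2(v_rms, freq_hz, turns, b_tesla):
              return v_rms / (4.44 * freq_hz * turns * b_tesla) * 1e4

          for f in (60, 400):
              print(f, "Hz ->", round(core_area_cm2(115, f, 100, 1.2), 1), "cm^2")
          # 60 Hz -> 36.0 cm^2, 400 Hz -> 5.4 cm^2 for the same winding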

      • kristianp a month ago

        Do electric car motors make use of this property too? I know some cars, such as Porsches, use higher voltage to enable faster charging.

        • choilive a month ago

          Yes, but not just electric cars. There's a push to move the entire electrical system of all cars from 12V to 48V to reduce the amount of low-gauge wiring in a vehicle.

          • bergie a month ago

            Boats are also making that move. 24V is pretty common now with newer boats, and with electric propulsion 48V is quite attractive.

            The NMEA2000 standard is confined to 12V however, meaning that all boats still need a 12V system as well. Maybe just with DC-DC conversion, or maybe also with a backup battery.

          • consp a month ago

            Which is why trucks already have 24v as otherwise they would need too much wire.

    • 6SixTy a month ago

      Very much true. A higher switching frequency means that a smaller transformer is needed for a given power load.

      In reference to consumer power supplies, the only reason GaN power bricks are any smaller than normal is that GaN can be run at a much higher frequency, needing a smaller inductor/transformer and thus shrinking the overall volume.

      Transformers and inductors are often the largest (and heaviest!) part of any circuit as they cannot be shrunk without significantly changing their behavior.

      Ref: Page 655, The Art of Electronics 3rd edition and Page 253, The Art of Electronics the X chapters by Paul Horowitz and Winfield Hill.

  • DrBenCarson a month ago

    The higher the voltage, the less power lost to resistance and the less money spent on copper.

    Short protection at the breaker for every circuit would probably be necessary at that voltage

  • xanderlewis a month ago

    Why are toasters better at 240V? Can’t you just pull more current if you’re only at 120V (or whatever it is in the US) and get the same power?

    I guess there’s some internal resistance or something, but…

    • nwallin a month ago

      Correct. You can get the same power with half the voltage by doubling the current.

      The trouble is the wires. A given wire gauge is limited in its ability to conduct current, not power. So if you double the current, you'll need roughly twice as much copper in your walls, in your fuse panel, in your appliance, etc.

      Additionally, losses due to heat are proportional to the current. If you double the current and halve the voltage, you'll lose twice as much power by heating the wires. For just a house, this isn't a lot, but it's not zero.

      This is why US households still have 240V available. If you have a large appliance that requires a lot of power, like an oven, water heater, dryer, L2 EV charger, etc, you really want to use more voltage and less current. Otherwise the wires start getting ridiculous.

      This is not to say that higher voltage is just necessarily better. Most of the EU, and the UK in particular, has plugs/outlets which are substantially more robust and make it more difficult to accidentally connect the line voltage to a human. Lots of people talk about how much safer, for instance, UK plugs/outlets are than US plugs. If you look at the numbers though, the UK has more total deaths per year from electrocution than the US, despite the fact that the US is substantially more populous. This isn't because of the plugs or the outlets; US plugs really are bad and UK plugs really are good. But overall, the US has fewer deaths because we have lower voltage; it's not as easy to kill someone with 120V as with 240V.

      So there's a tradeoff. There is no best one size fits all solution.

      • Liftyee a month ago

        This is a very well written comment overall, but the energy loss in the wire is even worse than stated!

        By modelling the wire as an (ideal) resistor and applying Ohm's law, you can get P = I^2*R. The power lost in the wire is actually proportional to the square of the current through it!

        Therefore, if you double the current, the heat quadruples instead of doubling! You actually have to use four times the copper (to decrease resistance by 4x and get heat under control), or the wasted energy quadruples too.

        Crucially, voltage is not in the equation, so high voltages - tens or hundreds of kilovolts - are used for long distance power transmission to maximise efficiency (and other impedance-related reasons).

        • xxs a month ago

          I was also surprised to read that heat is proportional to the current. In addition, increasing the temperature also increases the resistance of the conductor (Cu). It's around 0.4% per 1°C for Cu, so around 20% more heat at 70°C.

          Not sure about the US, yet some high-current lines (think three-phase ~400V x 36A; IEC 60502-1) in households are actually made of Al, not Cu. They tend to be underground though; the wires in the walls are still Cu.

          • tzs a month ago

            Al wire is used a lot more than most people think. Here are the big differences between Al and Cu wire.

            Cu is more conductive than Al, so an Al wire has to have a cross-section area about 1.56 times that of a Cu wire with the same current capacity.

            But Cu is also denser than Al so the Al wire is only about 0.47 times the weight of the Cu wire.

            Al is much cheaper than Cu, so the Al wire is only about 13% the cost of the Cu wire.

            • SoftTalker a month ago

              Al wire is prone to oxidation and thus needs an antioxidant paste applied at connection points.

            • mrlonglong a month ago

              Hasn't anyone tried using silver wires?

              • tzs a month ago

                Silver wire is used for some things, but it is a lot more expensive than copper which rules it out for most applications.

                Here is a table of the conductivity (in units of 10^7 S/m), the density, and the cost of copper (Cu), aluminum (Al), silver (Ag), and gold (Au).

                               Cu    Al      Ag       Au
                  Conductivity 5.96  3.5     6.3      4.1
                  g/cm^3       8.96  2.6    10.5     19.3
                  $/kg         9.03  1.2  1030    92100
                
                If we had a copper wire with a specified capacity in amps, here is what aluminum, silver, and gold wires of the same length and capacity would weigh and cost as a percentage of the weight and cost of the copper wire, and what their diameter would be as a percentage of the diameter of the copper wire.

                      Weight  Cost  Diameter
                  Al    49       7    139
                  Ag   110   12646     97
                  Au   310 3190000    121
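
                Those ratios fall straight out of the conductivity, density, and price columns (a sketch; it reproduces the table above to rounding):

                    # Equal length & capacity: area ~ 1/conductivity, so
                    # weight ~ density/sigma, cost ~ density*price/sigma,
                    # diameter ~ 1/sqrt(sigma)
                    metals = {  # sigma (1e7 S/m), g/cm^3, $/kg
                        "Cu": (5.96, 8.96, 9.03),
                        "Al": (3.5, 2.6, 1.2),
                        "Ag": (6.3, 10.5, 1030),
                        "Au": (4.1, 19.3, 92100),
                    }
                    s0, d0, p0 = metals["Cu"]
                    for name, (s, d, p) in metals.items():
                        weight = (d / s) / (d0 / s0) * 100
                        cost = (d * p / s) / (d0 * p0 / s0) * 100
                        diameter = (s0 / s) ** 0.5 * 100
                        print(name, round(weight), round(cost), round(diameter))
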
                • mrlonglong a month ago

                  Suppose Ag was as cheap as, or even cheaper than Al, would it be more useful?

                  • haneefmubarak a month ago

                    Sure, we could alloy and/or plate it. You have a source on Ag cheaper than Al you'd like to disclose to the class?

                    • mrlonglong a month ago

                      Silver and gold could be a lot more useful to society if they weren't so expensive. Plenty of asteroids out there to mine.

      • shaky-carrousel a month ago

        In 2017, there were 13 electrocution-related deaths in the UK. In the US, there are between 500 and 1,000 electrocution deaths per year. This translates to 0.019 deaths per 100,000 inhabitants in the UK and between 0.149 and 0.298 deaths per 100,000 inhabitants in the US.
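
        The per-capita arithmetic, for reference (populations assumed: UK ~67M, US ~336M):

            # Deaths per 100,000 inhabitants
            def per_100k(deaths, population):
                return round(deaths / population * 100_000, 3)

            print(per_100k(13, 67e6))      # 0.019 (UK, 2017)
            print(per_100k(500, 336e6))    # 0.149 (US, low estimate)
            print(per_100k(1000, 336e6))   # 0.298 (US, high estimate)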

        • dboreham a month ago

          Note that there is 240V in every US home; low-power loads run from a center-tapped 120V circuit. Also, I wonder if people manage to electrocute themselves on "high voltage" circuits (typically 7.5 kV), which due to the 120V thing have to be run everywhere and so are more available to the casual electrocutee. In the UK, although there are high-voltage transmission lines, they terminate at substations, are often buried at that point, and high fences make it hard to electrocute oneself. E.g. a guy around here managed to electrocute himself very badly (but still survived) out in the woods by touching a bear that had itself been electrocuted by a live unprotected 7.5 kV line.

      • masfuerte a month ago

        Your deaths claim surprised me. AFAICT England has ~10 deaths by electrocution per year. The US has ~100 domestic electrocutions and even more occupational electrocutions.

        • victorbjorklund a month ago

          Hard to compare. Doesn't the US allow people to work on their own outlets etc. in their own house, while in the UK you need to hire an electrician?

    • andruby a month ago

      I don’t know is toaster are close to max power draw, but kettles certainly are.

      Most places with 240V regularly have 16A sockets, allowing a maximum draw of 3840W of power. That's the limit. Cheap fast kettles will often draw 3000W and boil 250ml of water at room temperature in 30s.

      Kettles in the US are often limited to 15A and thus max 1800W (usually 1500W) and take twice as long (60s)

      Technology Connections has a great video on this: https://youtu.be/_yMMTVVJI4c

      • andrewla a month ago

        I mention Impulse Labs and their battery-assisted 120V high power induction range in the comments elsewhere. Seems like a similar concept could be used to make an incredibly powerful kettle; throw in a battery that charges from the mains, and when you ask to boil, send in 20kW and boil the 250ml in 4 seconds.

            4.18 J/(g·°C) * 250 g * 75 °C / 20,000 J/s = 3.92 s

        • burnerthrow008 a month ago

          For that order of magnitude to work in practice, the most challenging aspect will be getting enough surface area between the water and the heater.

          Otherwise, you will very quickly vaporize the water near the heater and the resulting lack of contact will inhibit heating the rest of the water volume.

          • gmac a month ago

            Yes — in fact this is a problem with the high-power setting on our induction hob (which I think is something like 5kW). Liquids bubble vigorously before they're even warm.

          • wbl a month ago

            Microwave radiation could work to transfer heat in, even as boiling initiates in spots. All the best kettles are BIS dual-use items.

          • JoshTriplett a month ago

            If you can transmit that amount of heat that quickly, I think it'd be much more convenient and feasible to have it in the form factor of an instant-boiling-water spout next to the sink, rather than a kettle. Then, rather than having to fill the kettle first, you directly get the amount of hot water you need into the vessel you want it in, and you can put a precise amount of heat into a precise amount of water flowing through a pipe to emit it at the right temperature.

            • xyzzyz a month ago

              By the way, you can already have a boiling water tap today; you just buy a device that uses a hot water tank to store the energy rather than a battery. Insinkerator sells these. It might not be as energy efficient as the hypothetical tankless water boiler you describe, because you have some losses from the heat slowly leaking away from the tank, but given battery costs, I suspect that over the lifetime of the device these losses add up to less than what a battery costs.

          • hobs a month ago

            Yeah that's a great way to start a fire with a lot of steam first :)

    • wongarsu a month ago

      Houses are wired for 16A per circuit on both sides of the pond, with high-power appliances typically pulling around 10A to avoid tripping the fuse when something else is turned on at the same time. It's just a nice point where wires are easy to handle, plugs are compact, and everything is relatively cheap.

      The US could have toasters and hair dryers that work as well as European ones if everything was wired for 32A, but you only do that for porch heaters or electric vehicle chargers.

      • mystified5016 a month ago

        No, the standard in the US is 15 or 20A. 20 is more popular nowadays.

        240V appliances typically get a 35 or 50A circuit.

        But then you also have to deal with the fact that a lot of homes have wiring that can only handle 10A, but someone has replaced the glass fuse with a 20A breaker. Fun stuff.

        • bardak a month ago

          I still haven't seen a single 20A domestic appliance though

          • mjevans a month ago

            I have, though it's semi-prosumer equipment. The largest UPS systems for standard use, like someone might want for a small home-lab rack, can be found with 20A 120V plugs; those work if they're on a 20A-rated circuit, like a kitchen refrigerator or bathroom sink outlet (the two most common sorts in rental units).

            I suspect some beauty products might also use 20A, or in combination easily reach that.

          • bcoates a month ago

            Very common in light commercial applications but almost unheard of in residential, because nobody installs NEMA 20A sockets in houses even if the circuit can handle it.

    • bbatha a month ago

      More current needs thicker wires. The average US outlet is wired for 120V 15A. 20A circuits are somewhat common, though 20A receptacles are not; certainly not enough for commodity appliances to rely on.

      Going to more than 20amp requires a multiphase circuit which are much more expensive and the plugs are unwieldy and not designed to be plugged and unplugged frequently.

      • quickthrowman a month ago

        > Going to more than 20amp requires a multiphase circuit

        There is no multi-phase power available in the vast majority of US houses. A typical residence has a 120/240 split-phase service, which is single-phase only. The service drop is two hot conductors from a center-tapped transformer winding fed by a single distribution phase, plus a neutral conductor from the center tap (between the two hot legs). Either hot leg is 120V to ground, and line to line is 240V.

        > https://en.m.wikipedia.org/wiki/Split-phase_electric_power

        Single-phase breakers are also available in sizes larger than 20A, usually all the way up to 125A.

    • generallee5686 a month ago

      Having more current running through a wire means thicker wires. Higher voltage means less current to achieve the same power, so thinner wires for the same power. The tradeoff for higher voltage is that it's more dangerous (higher chance of arcing, etc.).

    • ajuc a month ago

      You need thicker wires for the same power. Which is why Americans live in constant fear of power extension cords, and people in EU just daisy-chain them with abandon.

      • extraduder_ire a month ago

        If you're in a country that uses type-G plugs, almost* all of those extension cords have fuses that break well below the current that will cause a problem.

        * Currently using a cable spool which will have problems before blowing the fuse if it's wound up and I draw too much current. It has a thermal cutoff, but I still unspool some extra wire on the floor.

    • tgsovlerkhgsel a month ago

      You need the same thickness of wire for 10A regardless of which voltage you have. So with 230V, your 10A wire will let you draw 2.3 kW, while someone with 120V and 15A wire would only get 1.8 kW and pay more for the wiring.

    • mypgovroom a month ago

      You have a 240v toaster?

      • xanderlewis a month ago

        Well, closer to 230.

        • qayxc a month ago

          *goes to check real quick* Between 238V and 245V at my outlets.

      • stuaxo a month ago

        Living in a country with 240V mains, yep.

  • Max-q a month ago

    Today, with switch-mode supplies, I think DC would make the most sense. We lose much power to frequency deviation and to inductive loads causing phase shift.

    It would also make sense to have high-voltage and low-voltage nets in houses: low voltage for lighting and other low-power equipment, high voltage for power-hungry equipment. For example, 48V and 480V.

    • frrlpp a month ago

      High-voltage DC (200V) is very dangerous and pernicious, and more difficult to switch on and off because of arcs.

      • flyinghamster a month ago

        Very old US light switches often had both AC and DC current ratings, and the DC rating was noticeably lower due to the arcing problem, even with fast-acting snap action.

        My grandfather worked in a Chicago office building that had 110V DC service even into the 1960s. He had to be careful to buy fans, radios, etc. that could run on DC.

      • stephen_g a month ago

        Yes, it’s more dangerous, but technically “high voltage” doesn’t start until 1500 V DC (and 1000 V RMS for AC) by most standards (e.g. IEC 60038)

    • Dylan16807 a month ago

      Do we really lose much to phase shift? Increasing apparent load only causes a tiny fraction of that much waste, and local compensation isn't very expensive.

      DC or a significant frequency boost would be good inside a house for lights and electronic items. Not so great for distribution.

      I'm not convinced multiple voltages would be a net benefit outside of the dedicated runs we already do for big appliances.

  • freeqaz a month ago

    Interesting to think about what the future could look like. What if breaker boxes were "smart" and able to negotiate higher voltages like USB-C does? It would avoid the problem of a kid sticking a fork in an outlet, or a stray wire getting brushed accidentally when installing a light fixture.

    Time will tell!

    • duskwuff a month ago

      > What if breaker boxes were "smart" and able to negotiate higher voltages like USB-C does?

      That'd be difficult; a breaker typically feeds an entire room, not a single outlet. (And when it does feed a single outlet, that's typically because it's dedicated to a specific large appliance, like an air conditioner or electric stove, which wouldn't benefit from being able to dynamically negotiate a voltage.)

      • ianburrell a month ago

        Appliances, like an electric range, that need higher power have dedicated 240V circuits. My understanding is that 240V circuits use thicker cable because they usually also have higher current. But it is possible to convert 120V to 240V for a single device, as is sometimes done for imported electric kettles.

  • xxs a month ago

    >Would it be a higher frequency (1000hz)?

    Totally not, that would mean both worse skin effect and worse impedance. Likely the best option (if you really don't care about the existing infrastructure) would be DC, 0Hz. There are some downsides of DC, of course.

    • gosub100 a month ago

      This would be excellent for solar because it removes the efficiency loss from the inverters. AFAIK it's very hard to add/remove loads without spiking power to everything else on the line. A friend of mine applied for a job at a company that was trying to run DC in the home but it's not a trivial endeavor.

      • jorvi a month ago

        Is there even a use to running DC in a home?

        Almost everything complex does run on DC internally, but you feed those via AC adapters that rectify it to DC. You'd have to get bespoke DC-DC adapters (switching converters, really) for everything.

        • xxs a month ago

          AC-DC supplies are effectively: AC -> power-factor-correction boost to an even higher AC voltage -> high-voltage DC -> AC again (high frequency) -> transformer -> low-voltage DC, with feedback to the primary side; {then potentially another DC->DC stage}.

          Feeding in (high-voltage) DC would skip the first few steps of that AC->DC chain.

          • jorvi a month ago

            That is very interesting to know, thank you!

            I always assumed it was just one rectification and transformation step.

        • gosub100 a month ago

          > Is there even a use to running DC in a home?

          Lights, first and foremost. LEDs are DC.

          > Almost everything complex does run on DC internally

          Almost everything runs on either 5V or 12V DC. What you would need are appliances that bypass the wall-warts/adapters and tap directly off DC, but this comes with some significant challenges. I'm already talking way outside my wheelhouse though, so I'll stop before I make a mockery of this topic.

          • jorvi a month ago

            I meant it more in the sense of: "[in the current appliances environment] is there even a use to running DC in a/your home?"

            It's very cool as a theoretical exercise and you could probably make a proof-of-concept house, but if you want to live in it and use literally anything non-bespoke, you have to convert DC to AC, which kind of defeats the purpose.

            • gosub100 a month ago

              absolutely, in the here-and-now. But as a system, if the entire appliance/solar industry committed to a household DC standard (and if it was technically feasible, which it very well may not be), then you might have something. Solar would just provide regulated DC (which presumably would be less lossy than inverting to AC). But the devil is in the details: a DC system would have to have a comms channel announcing "hey, I'm an appliance X, and I intend to draw X amps from the DC bus" and coordinate it with the supply, otherwise it will cause all other appliances to experience a drop in voltage. That's just one of the problems, there are others.

              However, if that were worked out, you could have DC plugs in the house for all existing appliances and fixtures, and theoretically you would get a gain on both sides (no inverter, and no v-regulator (or a simpler, less lossy one)).

              • jorvi a month ago

                I think one of the "easiest" steps would probably be to run a concurrent circuit of DC from a central inverter, stepped at the same voltage as the solar / home battery output.

                Then you wire your EV charger and big electrical appliances like stoves, ovens, microwaves, fridges (?), electric boilers, central AC, heat pumps, etc. into that DC circuit.

                That alone would switch the majority of your electrical consumption to DC. Maybe long-term, you could have a special DC socket that you plug toasters, kettles, crock pots, vacuums etc in to, if they became available and cheaper.

  • pwrson a month ago

    Probably about the same as we have.

    Higher frequencies have terrible transmission (skin effect, transmission line length limit) and would start to interfere with radio.

    Lower frequencies need larger transformers.

    DC while nice is too expensive.

    So about where we are now.

  • dreamcompiler a month ago

    DC. Modern electronics make AC grids unnecessary.

    48VDC inside homes would be enough for most applications except heating and it would be kid-safe.

    240V for heating applications.

  • IncreasePosts a month ago

    I don't care what the frequency is, I just want my LEDs to not flicker!

    • sxp a month ago

      Normal LED lightbulbs shouldn't flicker on standard 60Hz circuits. Do you have a dimmer switch or smart switch in the circuit? I've noticed these cause flickering that's visible on camera or out of the side of my eyes.

    • lnsru a month ago

      Get better ones. I installed expensive LED lamps at home and they’re fine. The guys in the office picked the cheapest ones and I don’t want to turn these ugly things on.

      Edit: Paulmann Velora are the expensive lamps at home.

    • SSLy a month ago

      that means that either the breaker is faulty, or the power regulator in the lamp itself is junk.

    • homebrewer a month ago

      https://lamptest.ru tests for flicker too. I don't know if those models are sold in the US, though. Philips, Osram and IKEA should be.

    • xanderlewis a month ago

      What about if they flickered at 10^-15Hz?

      • viraptor a month ago

        That would make it really annoying to tell if they're broken, or just currently in the "off" phase :)

        • xanderlewis a month ago

          Assuming they started on, you’d be lucky if they lasted long enough to ever reach their off state (~32 million years later).

UltraSane a month ago

Fun fact: Japan uses BOTH 60Hz and 50Hz for mains electricity due to historical generator purchases. This means the Japanese electric grid is split into two regions that cannot easily share electricity.

  • _trampeltier a month ago

    The US alone has 3 grids (East, West and Texas), with the same frequency but still not connected.

    In Switzerland trains use 16.7Hz, but the networks are connected with large frequency converters. Before, it was done with large motor-generator sets; now it's just static electronics.

    • trothamel a month ago

      The same frequency, but not connected via AC. There are multiple DC and Variable Frequency Transformer ties between the various interconnections.

      https://en.wikipedia.org/wiki/North_American_power_transmiss...

      • cf100clunk a month ago

        Exactly, DC is used for those links that would otherwise be out of synchronization. In Canada one of the DC links goes from the North American grid on the British Columbia Lower Mainland in Delta via a single underwater cable over to Vancouver Island. The water is the other conductor, and also keeps the cable cool.

    • Aloha a month ago

      We have more than three interconnections that are not in synchronous connection to each other.

      They can share power, however, as they are somewhat connected via HVDC interconnections.

      • viraptor a month ago

        Same with mainland Australia and Tasmania with HVDC cable going through the sea.

  • fyrn_ a month ago

    This is covered in some detail in the paper itself: they even discuss the two engineers who made the purchases and who manufactured the generators.

  • themaninthedark a month ago

    I want to say this was part of the issue after the Tohoku Earthquake: my recollection is that some generators were brought in to support the ones that were flooded, but turned out to be incompatible. However, I can't find any mention of it in the timelines and after-action reports that showed up when I searched.

    So I'm possibly misremembering, or it was fog-of-war reporting, or perhaps it wasn't important enough for the summaries.

  • krackers a month ago

    I like the hypothesis by fomine3 on HN that "Japan's divided frequency situation made Japanese electric manufacturers to invent inverter for electrics (example: air conditioners, microwave, refrigerators). Inverter primary makes electrics controllable efficiently, also they can ship same product (same performance) for both 50/60Hz area in Japan."

  • userbinator a month ago

    It also has the most "Metric" of mains voltages, 100V nominal instead of the 110/115/120 or 220/230/240 that's common everywhere else in the world.

antithesis-nl a month ago

(1997), which I wondered about due to the "Many people continue to be affected by the decisions on frequency standards made so very long ago" phrasing, and the intro bit about the need for adapters in the paper itself.

Because these days, voltage and especially frequency are pretty much irrelevant for mains-power AC, so "ignorant" would be more accurate than "affected" when it comes to "many people"...

  • bluGill a month ago

    They don't know it, but they likely have a motor someplace in their house that runs at the speed it does because of the line frequency. They are ignorant of it, but it affects them.

    • theamk a month ago

      It is less and less likely... motor-based clocks are a thing of the past; hand appliances (like mixers and blenders) use either DC or universal motors to allow speed control. Even refrigerators feature "variable speed motors" nowadays, which means they are frequency-independent.

      I think fans will likely be the last devices that care about frequency... but new ones are often 12 V / 24 V based, with little step-down modules.

      • quickthrowman a month ago

        Most commercial AC fan and pump motors are already powered by variable frequency drives, and a lot of newer residential appliances have EC motors to allow for speed control.

        I'm seeing more and more EC motors in commercial applications, for things like 2-3 HP fan motors and pumps.

      • satiric a month ago

        What about dryer motors? I mean, I don't much care what RPM the dryer runs at, but it should change speed with the grid frequency, right?

        • bluGill a month ago

          Modern dryers generally run the motor through a phase converter, so while the motor is AC, the frequency is controlled by a computer.

        • kencausey a month ago

          I wonder if resistive heating devices like ovens, which have a tuned temperature control component, would become systematically less accurate if the frequency changed significantly.

          • smallmancontrov a month ago

            Nah, the thermal time constant is a low-pass filter on the order of 0.01 Hz; all of the line frequencies in this thread are waaaay higher than the control loop bandwidth. The loop would never notice the substitution.

            You might be able to trip up a fancy soldering iron where loop bandwidth is intentionally maximized, but I still suspect the first thing to go would be the magnetics on anything with a transformer.
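
            A rough way to see it, treating the oven as a first-order low-pass with the 0.01 Hz cutoff assumed above (a sketch, not a thermal model):

                import math

                # First-order low-pass gain: |H(f)| = 1 / sqrt(1 + (f/fc)^2)
                def lowpass_gain(f: float, fc: float) -> float:
                    return 1.0 / math.sqrt(1.0 + (f / fc) ** 2)

                fc = 0.01  # assumed thermal cutoff, per the estimate above
                for f in (50.0, 60.0):
                    db = 20 * math.log10(lowpass_gain(f, fc))
                    print(f"{f:.0f} Hz ripple attenuated by {db:.0f} dB")
                # ~ -74 dB at 50 Hz, ~ -76 dB at 60 Hz: the loop never sees it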

            • exmadscientist a month ago

              > first thing to go would be the magnetics

              Yes, but not for the reason you'd think: 50 Hz magnetics have to be physically larger to work (peak flux density for a given voltage is higher at the lower frequency), and magnetics are so big and heavy that they're not designed with much margin. So 60 Hz transformers will often not work at all at 50 Hz, while 50 Hz transformers will sometimes perform pretty badly at 60 Hz (though going in this direction sometimes works fine).
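
              The transformer EMF equation makes the margin problem concrete. A minimal sketch with made-up core numbers (230 V winding, 500 turns, 10 cm² core):

                  # Transformer EMF equation: B_peak = V_rms / (4.44 * f * N * A)
                  def peak_flux_density(v_rms, f, turns, area_m2):
                      return v_rms / (4.44 * f * turns * area_m2)

                  for f in (60.0, 50.0):
                      b = peak_flux_density(230, f, 500, 10e-4)
                      print(f"{f:.0f} Hz: B_peak = {b:.2f} T")
                  # 60 Hz: ~1.73 T; 50 Hz: ~2.07 T. The same core carries 20%
                  # more flux at 50 Hz and may saturate if designed tightly.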

      • creeble a month ago

        True for consumers (houses), not true for industrial applications where motors are in the >100HP range.

      • bluGill a month ago

        Fans, because a furnace only needs two speeds at most.

alexjplant a month ago

Tangentially related: I have a vague memory of reading somewhere that the PCM sampling frequency for CD audio was decided between Sony and Philips by way of a surf contest. Their respective frequencies of choice were such that they could fit some arbitrarily long piece of classical music onto a single disc, so they decided to battle it out on the waves (the conference where they were hashing this out was in Hawaii). Philips supposedly won, so we got 44.1 kHz.

I just did a cursory web search to find this anecdote and was unsuccessful. Did I make this up out of whole cloth, or is it just buried someplace? Or was I bamboozled at a young age by some random forumite on a now-defunct site?

EDIT: This comprehensive account [1] seems to confirm that the story is completely apocryphal.

[1] https://www.dutchaudioclassics.nl/The-six-meetings-Philips-S...

  • hunter2_ a month ago

    Sony won, not Philips. The rationale seems to be this: a higher rate would not have been compatible with both NTSC and PAL VCRs, and a lower rate would shrink the transition band for the antialiasing filter (or alternatively, reduce usable audio bandwidth to an extent that encroaches on the generally accepted limit of human hearing). Although the latter hardly seems relevant when the alternatives being debated were so close (Philips' 44056 Hz, for example)!

    https://en.wikipedia.org/wiki/44,100_Hz#Origin
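
    The derivation in that article is simple enough to check: the PCM adaptors of the era stored 3 samples per usable video line, and both TV systems happen to give the same product:

        # Figures per the Wikipedia article above:
        ntsc = 60 * 245 * 3   # fields/s * usable lines/field * samples/line
        pal  = 50 * 294 * 3
        assert ntsc == pal == 44100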

    • alexjplant a month ago

      > a lower rate would shrink the transition band for the antialiasing filter (or alternatively, reduce usable audio bandwidth to an extent that encroaches on the generally accepted limit of human hearing)

      I've seen people smarter than me argue that the ideal sampling rate is actually somewhere around 64 kHz because it would allow for a gentler anti-aliasing filter with fewer phase artifacts.

    • anticensor a month ago

      Why couldn't they have used a sampling rate with six samples per video line (which would give exactly 88.2 kHz, by the way)?

      • wappieslurkz a month ago

        Because doubling the sample rate would halve the playing time of a CD, or require twice the density, while not bringing any audible increase in quality for the human ear.
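
        The arithmetic, assuming the commonly cited 74-minute disc target:

            rate = 44100 * 2 * 2         # samples/s * channels * bytes/sample
            print(rate * 74 * 60 / 1e6)  # ~783 MB of raw audio on one disc
            # At 88.2 kHz the same disc would hold only ~37 minutes.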

Aloha a month ago

I know Southern California Edison had 50 Hz power; as a kid I was always finding old clocks and radios with a conversion sticker.

I've always kept an eye out for good papers about the effort to convert, but they're hard to find.

pkulak a month ago

I always assumed it's because 60 is a highly composite number (superior, in fact!). It's kinda the best number if you're ever going to need to divide. 50 is kinda garbage in that regard. :/

  • layer8 a month ago

    Well, 50 Hz means the period is a round 20 ms instead of 16.6666… ms.

    And PAL got a higher resolution thanks to it.

    • pkulak a month ago

      At the cost of framerate. No free lunch!

  • tzs a month ago

    Another point in favor of 60 is there are 60 seconds in a minute and 60 minutes in an hour. If you make the frequency 60 cycles/second then you only need divide-by-60 circuits to derive 1 cycle/second, 1 cycle/minute, and 1 cycle/hour.

    With 50 cycles/second you would need both divide-by-50 and divide-by-60.
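
    A toy model of that divider chain (hypothetical, just to show the cascading):

        def divide_by(n: int):
            """Return a tick() that emits True once every n input pulses."""
            count = 0
            def tick() -> bool:
                nonlocal count
                count = (count + 1) % n
                return count == 0
            return tick

        to_seconds = divide_by(60)   # 60 Hz line pulses -> 1 pulse per second
        to_minutes = divide_by(60)   # second pulses -> minute pulses

        minutes = 0
        for _ in range(60 * 60 * 2):            # two minutes of line pulses
            if to_seconds() and to_minutes():   # cascade via short-circuit
                minutes += 1
        print(minutes)  # -> 2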

  • beeflet a month ago

    It's worth mentioning that the 60 Hz vs 50 Hz distinction has ended up having a knock-on effect on framerate standards, because early TVs used the mains frequency for timing; not to mention market segmentation, because different products had to be manufactured for the US and European markets.

    here is a well animated video about it: https://www.youtube.com/watch?v=DyqjTZHRdRs&t=49s

  • PortiaBerries a month ago

    Yes! I was looking for this comment. Maybe because I didn't grow up with the metric system, but 60 feels like a much "rounder" number to me than 50.

  • makerdiety a month ago

    Maybe the number sixty is highly composite partly because it's a multiple of three? In which case, I can see why Nikola Tesla liked the number three and its multiples.

    It would let him do exploratory electrical science and analysis with flexible cases.

throw0101c a month ago

Meta: the IEE(E) has been around for a little while. One of the references:

> [5] L. B. Stillwell, "Note on Standard Frequency," IEE Journal, vol. 28, 1899, pp. 364-66.

That's 126 years ago.

gorfian_robot a month ago

Relatives living in Australia (a 50 Hz country) but with family in the US decided to provide both 50 Hz power with Australian-style outlets and 60 Hz with US-style outlets in their new home, since many household appliances are much cheaper and more diverse in the US. Since they built the place with a solar/battery system, it wasn't too big of a deal to use two different inverters for the AC side of things.

cwmoore a month ago

"Effects of a 60 Hz Magnetic Field Exposure Up to 3000 μT on Human Brain Activation as Measured by Functional Magnetic Resonance Imaging"

2015

Alexandre Legros, Julien Modolo, Samantha Brown, John Robertson, Alex W. Thomas

https://pubmed.ncbi.nlm.nih.gov/26214312/

alana314 a month ago

As a result I hear B0 everywhere: in power lines, electric arcs, industrial motors, speaker buzz, etc.

  • fahrnfahrnfahrn a month ago

    My digital guitar tuner oscillates between B and A# with no input.

1-6 a month ago

Duodecimal Society members are happy.

aetherspawn a month ago

50 factors:

1 2 5 10 25 50

60 factors:

1 2 3 4 5 6 10 12 15 20 30 60

Seems for AC, 60 is overall more flexible in the design of AC motors, transformers and other resonant devices.

  • Dylan16807 a month ago

    Only if you need to align to a second for some reason. 50Hz is still 60 cycles per something.

netfortius a month ago

From a practical standpoint, as someone who moved from the US to Europe and was forced to leave a lot of appliances behind (not because of voltage - that could have been addressed!): f*ck this!

  • comrade1234 a month ago

    I moved to Europe too. I have a few transformers that let me use my expensive American kitchen equipment (angel juicer, kenwood giant mixer, etc) here. Slowly I’ve been getting rid of the American equipment and replacing it with European as it’s obvious that I’m not moving back.

mrlonglong a month ago

All that's academic now that we have power supplies capable of handling both frequencies and the various voltages in use across the globe.

cjohnst a month ago

When f is 60 Hz, it makes for some nice round numbers, and is easier for mental calculations.

ω = 2πf

At 60 Hz, ω is 376.99..., very near the integer 377.

Also, Z₀, the impedance of free space, is not far off at 376.73... Ω.
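
For anyone wanting to check the convenience claim (the 10 mH / 10 µF parts below are arbitrary examples):

    import math

    w = 2 * math.pi * 60      # ~376.99 rad/s, memorable as 377
    print(round(w, 2))

    # Reactances become a single multiply or divide by 377:
    print(w * 10e-3)          # X_L = wL:     ~3.77 ohms for 10 mH
    print(1 / (w * 10e-6))    # X_C = 1/(wC): ~265 ohms for 10 uF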

  • kalleboo a month ago

    I'm not an electrical engineer, so what makes the integer 377 easy for mental calculations?

    It seems like 50 Hz would be even more convenient, because ω = 100π.

  • venusenvy47 a month ago

    This is interesting, but isn't this more like a coincidence? Power line engineering and wireless engineering are fairly distinct fields in EE.

    • cjohnst a month ago

      Coincidence, maybe, but convenient.

      It may currently be the case that many or most engineers rarely use both.

      But, from a historical perspective, when power generation and distribution were new, there was probably not such a distinction.

      Even today, large industrial users of power need to know and understand the types of equipment they're using, their impedance and power factor, and their aggregate effect on the power grid, and adjust for it.

      I imagine that many of the early large users of power were radio broadcasters.

      There was likely significant overlap of power engineering and wireless broadcasting. An engineer would need to understand what effect the broadcast system was having on the grid and adjust for it, calculating the impedance of the entire system using ω and the associated capacitance and inductance of its parts. The broadcasting antenna would certainly be part of calculating power draw and power factor.

alex_young a month ago

60 Hz sure makes it easy to keep clocks on time.

  • theamk a month ago

    As far as clocks are concerned, 60 Hz and 50 Hz are very similar; just make sure the number of teeth on the gears matches the frequency.

  • Jedd a month ago

    The two clocks I reference the most - my wrist watch, and my mobile phone - don't really benefit from the alleged advantage of having mains power cycling every 0.0167 seconds.

    Perhaps you could expound further on this hypothesis?

    I'd always assumed people who spent 3+ years studying electrical engineering had solved this problem. Certainly in Australia (~240V / 50Hz) we don't seem to have a problem with all our clocks being 20% wrong all the time.

  • hatsunearu a month ago

    1/60th of a second isn't a common unit of time though

    • toast0 a month ago

      It's convenient to count 60 pulses to make a seconds pulse, then 60 of those to make a minute pulse, then 60 of those to make an hour pulse. Then 60 of those to make 2 and a half days :P

  • ThisNameIsTaken a month ago

    Not sure it's precise enough, though. In 2018, many clocks in Europe were off because the grid frequency had drifted, due to (as I understood it) the network being out of sync across various countries. Some here might actually understand the details of this.

    • mystified5016 a month ago

      In the US, we modulate (or used to) the grid frequency specifically for these analog clocks, such that over a 24-hour period it averages to exactly 60 Hz.

      It doesn't really matter on a second-to-second timescale how accurate the grid frequency is. If you keep the average frequency right, all the clocks will speed up and slow down in sync, and average out to 24 hours per day.

    • wongarsu a month ago

      The frequency drifts up and down whenever demand doesn't exactly match supply. Higher demand slows the frequency down, higher supply speeds it up. This is actually the main way power companies know if supply and demand match, and if power stations have to ramp up or down.

      The frequency changes are pretty small in normal operation, but on a clock that uses the frequency to keep time they accumulate. They only work reliably because power companies know about them and occasionally deliberately run a bit over or under capacity to make the average match again.
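
      A toy simulation of that accumulation, with a made-up error profile (the grid runs 0.02 Hz slow for an hour, then 0.02 Hz fast for an hour to pay the time back):

          nominal = 60.0
          profile = [nominal - 0.02] * 3600 + [nominal + 0.02] * 3600

          cycles = 0.0
          for f in profile:                 # one entry per real second
              cycles += f
          clock_seconds = cycles / nominal  # a mains-synced clock counts cycles
          print(clock_seconds - len(profile))  # -> 0.0, the error averaged out
          # Halfway through, though, the clock was running 1.2 s behind.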

      • andlier a month ago

        Fun fact, there are databases of the exact frequency vs. time and it can be used to accurately time stamp audio/video recordings by correlating the ~50/60hz noise in the recording with the database. Good writeup on the technique and how it has been used in court cases: https://robertheaton.com/enf/
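
        A toy version of the matching step (synthetic data; real ENF analysis extracts the hum with a bandpass filter and uses fancier alignment):

            import numpy as np

            rng = np.random.default_rng(0)
            reference = 50 + 0.02 * rng.standard_normal(10_000)  # logged Hz, one per second
            snippet = reference[6200:6500]  # frequency trace from a "recording"

            # Slide the snippet along the log; the best alignment has least error
            errs = [np.sum((reference[i:i + 300] - snippet) ** 2)
                    for i in range(len(reference) - 300)]
            print(int(np.argmin(errs)))     # -> 6200: when the recording was made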

      • kps a month ago

        In 2018 the European grid lost a cumulative 6 minutes due to a Serbia/Kosovo dispute.

        • Gare a month ago

          Which has been corrected since.

      • folli a month ago

        This is fascinating, didn't know. Why does higher demand lower the frequency?

        • bonzini a month ago

          With a lot of simplification: consuming electricity acts as a brake on giant wheels inside the power plants, which usually spin in step with the mains frequency. The plants accelerate those same wheels, so with too much demand the braking wins, and with too little, the acceleration wins.

        • avidiax a month ago

          There is conservation of energy. Energy in strictly equals energy out.

          The electrical grid is a bunch of heavy spinning motor-generators electrically connected to other heavy spinning motor-generators and to loads like lightbulbs. The motor-generators are electrically identical, except that we expect to add energy on one side and extract energy on the other*.

          So what happens if the energy added by power plants is less than the energy extracted by lightbulbs and the loads on the motor-generators? Conservation of energy means that we must get the energy by slowing down the generators, extracting their kinetic energy. That lowers the grid frequency.

          The same thing can happen in reverse to increase the grid frequency. Too much power generation must increase the kinetic energy of the motor-generators.

          * Many of the loads on the grid are intentional or unintentional flywheels, so they may actually add energy to the grid if the grid is slowing, increasing stability.
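
          A minimal sketch of that bookkeeping (all numbers invented): the lumped rotating mass obeys J·ω·dω/dt = P_gen - P_load, so a shortfall drags the frequency down:

              import math

              J = 4.0e6                # assumed aggregate inertia, kg*m^2
              w = 2 * math.pi * 60     # rad/s at nominal 60 Hz
              imbalance = -50e6        # 50 MW more load than generation
              dt = 0.1

              for _ in range(50):      # five seconds of shortfall
                  w += (imbalance / (J * w)) * dt   # from J*w*dw/dt = P_net
              print(w / (2 * math.pi))              # ~59.97 Hz, sagging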

        • creeble a month ago

          Because generators -- where virtually _all_ AC power is created -- start running slower with high demand. They catch up, via increased power input through governors, but changes in load will necessarily have some impact on speed.

        • duskwuff a month ago

          The frequency is generated by rotating electrical generators. Higher electrical demand increases the mechanical load on the generator, making it rotate more slowly, producing a lower frequency.

    • nottorp a month ago

      I don't think 60 vs 50 Hz matters wrt this.

      The only thing that matters is that a clock that expects a certain frequency gets that frequency, and not 1% more or 1% less.

  • timw4mail a month ago

    I'm sure it makes the gear ratios easier to calculate for the power-frequency-synced gearing on those old electric clocks.

    But now? It's pretty much just an implementation detail.

  • nom a month ago

    think again

ethbr1 a month ago

Tl;dr - Because Westinghouse (60 Hz) beat out GE (50 Hz) in the early (~1910+) American AC electrical equipment market.

  • SigmundA a month ago

    My Tl;dr would be a little longer: early systems used both higher frequencies (around 130 Hz), which caused issues with resonance and difficulties in making induction motors, and lower ones (around 30 Hz), which caused light flicker.

    50-60 Hz solved these issues. Westinghouse thought 60 Hz was better for light flicker, and beat out GE, which had settled on the 50 Hz standard used by its European affiliate, itself moved up from 40 Hz due to flicker.

    • Analemma_ a month ago

      In Seattle you can take a tour of the Georgetown Steam Plant, which was an early power station. At one point they mention that the plant had two totally separate generators at different frequencies: one for household mains power and one for the electric streetcars.

    • cf100clunk a month ago

      25 Hz too, mentioned in another thread.

      • SigmundA a month ago

        25 Hz is part of the "around 30 Hz" I mentioned. It was a compromise for the Niagara Falls turbines, between the 16.7 Hz that was good for large motors at the time and the flickering lights, for which Westinghouse wanted at least 33.3 Hz.

  • MBCook a month ago

    But that TLDR doesn’t answer why those companies chose those frequencies.
