Why is it that we use voltages and watts more often than amperages? 9V batteries, 12V car batteries, 1000W microwave ovens. But amperages, not so much, even though it’s “half” of what makes power, A×V=W. What property of amperes makes it so “unnecessary” to be aware of?

Bonus: how many amps and volts does a typical 1000W microwave use?

  • ggtdbz@lemmy.dbzer0.com · 17 hours ago

    Where I live most people actually do think of household electricity in terms of instantaneous current draw. The power grid is insufficient, so you get rationed grid power throughout the day depending on what area you live in, with the rest filled in by mob-run local power generators.

    You pay a subscription based on the maximum amperage and you have to manage your power use accordingly.

    🤓☝️ Um that’s functionally the same as a power limit

    Yes, but it’s ampere based because it’s managed by a breaker out on the street, and the ampere limit is colloquially understood. “I’m paying for 5A, the bastard won’t give me more” is a perfectly understandable statement. When I was 7 years old I already understood that “we only have 5 amperes during most of the day” and that it meant the microwave wasn’t available during that time. And since the breaker is out on the street, you learn your limit very quickly, since you have to get dressed and go down a few flights of stairs in the freezing cold to turn it back on. If you have an elevator in your building, it sure as shit isn’t running when the generator power is active.

    Annoying when you’re using solar to escape this hell of an electrical system and everyone has to re-learn to think in terms of watts/VA. I have a table printed out and stuck to the wall to “convert” between amps and watts at 230V. Do you want to explain to grandma which devices are intuitively at 1kW≈1kVA and which are not? No? Then let her keep using amperes, it’s fine.
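
    For anyone making their own wall chart, a minimal sketch (assuming a 230 V nominal supply and purely resistive loads, so W ≈ V×A):

    ```python
    # Quick amps-to-watts chart for a 230 V supply.
    # Assumes power factor 1 (resistive loads), so W = V * A.
    VOLTS = 230

    for amps in (1, 2, 5, 10, 16, 25):
        print(f"{amps:>2} A  ->  {VOLTS * amps:>5} W")
    ```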

    Yes, the power generators run off diesel; yes, the diesel fumes and generator noise are a problem; yes, we get price gouged by both the generator mobs and the government grid; yes, I hate dieselpunk and think diesel is the most disgusting fuel. The generators do give you a much cleaner waveform, closer to 230V@50Hz, than the grid, so they have that going for them. Was solar the most expensive thing we’ve ever paid for? Yes. Does it make me feel like a king with a 24-hour battery-backed microwave? Also yes.

    • ____@infosec.pub · 16 hours ago

      Five amps?

      USian here. My refrigerator expects (and is legally required to have) a dedicated branch circuit of fifteen or twenty amps with nothing else on it.

      Been years since I had to actually apply Ohm’s Law, but I believe since we’re on 120V and most of the world is on 240V, you’d only need half the amperage we do.

      You (and apparently most of your fellow citizens?) are expected to run your home most of the day on half the power available to an average American fridge? (Figuring that as 7.5A for a US fridge’s circuit at 240V.)
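
      Spelling out that arithmetic (assuming a 15 A breaker and nominal 120 V / 240 V):

      ```python
      # P = V * I: power available on a dedicated 15 A, 120 V fridge circuit,
      # and the current needed to carry the same power at 240 V.
      breaker_amps = 15
      power_w = 120 * breaker_amps    # 1800 W
      amps_at_240 = power_w / 240     # 7.5 A
      print(power_w, amps_at_240)
      ```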

      Puts our privilege in perspective for sure. If you don’t mind, what general area of the world are you in? (Need not be overly specific, just curious about the region.)

      • ggtdbz@lemmy.dbzer0.com · 4 hours ago

        > USian here. My refrigerator expects (and is legally required to have) a dedicated branch circuit of fifteen or twenty amps with nothing else on it.

        Does any fridge use 1,800W? Even at peak? A small freeze-dryer? I’m holding off on making cheap jokes at the expense of American cuisine; your fridge might be legally over-provisioned for some reason, but it’s not drawing that much power. My entire house idles at about 1.1~1.5A, or about 250~350W, if nothing is running but the absolute essentials. And that’s a relevant number for me: I do solar! I do batteries! Every little bit counts. The fridge is the most important thing to power, and literally everything else comes after.

        During the financial meltdown/pandemic, people ran their fridges for eight or fewer hours a day because there was just not enough diesel. That’s when we decided to splurge on solar. If you want more fun anecdotes: during that time I was waiting in line for 3-4 hours for fuel, and then bribing the attendant more than the value of the fuel to let me fill over 20 liters. Not fun times. And I’m someone who was lucky enough to be able to pay his way through the worst of it.

        I’m in Lebanon. I too count my blessings. It’s not culturally mandatory to strand your kids at 18 here, nor culturally accepted to have them dodge bullets at school. So, you know. Even at the peak of people being harassed by Hafez’s secret police, people were not getting snatched off the street en masse. I don’t have the mental scaffolding to even begin to grapple with your reality, my dear. Using the gas because the microwave is unavailable until tomorrow, temporarily stealing my neighbor’s water, angling for favor with feudal lords’ bureaucrats… Problems, yes, but problems I understand.

        I hope I’m not being too mean here, the US still fascinates me in a way no other potential new home does, even with everything happening right now. How’s that for perspective.

  • XeroxCool@lemmy.world · 18 hours ago

    It depends on what’s useful to know.

    A microwave is a heating device. It’s not useful on its own to know you have a 7A microwave. Is it 120V or 220V? What’s important is the wattage, as an indicator of how much heat it can put into food in a given time. A 700W microwave is going to take longer than the instructions say, but it could be a 3.5A euro oven or a 7A North American oven.

    With lights, wattage likewise ignores the difference in voltages. But it relied on tungsten incandescents being ubiquitous. The socket type defines the voltage, so you just want to know whether it’s a soft 25W reading light or a 100W bulb for a garage bay. Now, with the prevalence of fluorescent and then LED lights, wattage has become almost irrelevant. They usually list actual wattage in pale text and “incandescent wattage equivalent” in bold. I’m happy to say I’m finally seeing bulbs state actual lumens now, which is what really matters to the end user. LED lighting is now the least of your electric bill worries.

    With a car battery, you’re seeing the options in a later stage of market uniformity. Cars used to very commonly have 6V systems, so the 12V system was a distinction worth noting. Large trucks use 24V (though I think with dual 12V batteries). But when you buy a car battery today, just about all passenger cars are 12V. It’s a specific size like “group 65”, i.e. a 12V battery of a certain size and terminal placement. You do have some options for amperage, listed as CCA. You can’t push more amps into the starter than it asks for; rather, a higher-rated battery drops less voltage under load and lasts longer per charge.

    But you will actually see amps listed on power tools. I guess they traditionally saw very little export, so the voltage is constant within the market. You can compare amps and take a good guess at how two saws will compare. Even so, many tools now list actual specs like rpm and torque.

  • Decq@lemmy.world · 1 day ago (edited)

    Many batteries these days are rated in Ah, so that’s something.

    One reason is that most power supplies are voltage sources, not current sources (though they do report the max current they can supply). Voltage is in those cases (arguably) more important: with the wrong voltage, a device probably won’t work at all, or may even break, whereas with too little current it probably still does something, just at reduced power, or it cuts out at some point.

    • Iconoclast@feddit.uk · 1 day ago

      Ah or mAh might be a little confusing, though: two different batteries can have the same Ah rating but wildly different energy capacities if their voltages differ.

      I always convert everything to watt-hours by multiplying ampere-hours by voltage.
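
      A minimal sketch of that conversion (the pack voltages below are illustrative assumptions):

      ```python
      # Wh = Ah * V: two packs with the same amp-hour rating but very
      # different energy capacities, because their voltages differ.
      def watt_hours(amp_hours: float, nominal_volts: float) -> float:
          return amp_hours * nominal_volts

      print(watt_hours(10.0, 3.7))   # 37.0 Wh  (single Li-ion cell)
      print(watt_hours(10.0, 14.8))  # 148.0 Wh (4-cell pack, same Ah rating)
      ```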

      • Decq@lemmy.world · 1 day ago

        It is confusing, but it is what they do. I’m not sure why; probably marketing, seeing as a lot of people think bigger number == better. Of course, if you know the nominal voltage of the battery pack it’s not a big issue. But yeah, Wh or joules would be better.

      • jdnewmil@lemmy.ca · 1 day ago

        Batteries have both a charge capacity (cumulative) rating and a current capacity (rate) rating. The chemistry and size determine how much charge (amperes times hours) can be stored, and the conductor sizes (including within the cells) determine how quickly the battery can be charged or discharged in sustained operation (without permanent damage).

        A car battery can be shorted with a screwdriver and discharged at a high current, but only for a short time without damage to the cells. A 100Ah car battery can supply rated current for roughly twice as long as a 50Ah battery.

        Sometimes people call these ratings energy and power ratings by multiplying each by rated voltage, but the voltage does vary with charge state and rate of current flow so those “ratings” are rather approximate.
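
        As a rough numeric sketch of the two ratings (the figures are typical-looking assumptions, and, as noted, the nominal voltage makes both results approximate):

        ```python
        # Energy rating ~ capacity * nominal voltage (cumulative);
        # power rating ~ max sustained current * nominal voltage (rate).
        nominal_v = 12.0
        capacity_ah = 100.0     # how much charge the battery stores
        max_current_a = 500.0   # assumed sustained-discharge limit

        print(capacity_ah * nominal_v)    # ~1200 Wh of energy
        print(max_current_a * nominal_v)  # ~6000 W of power, while it lasts
        ```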

  • Valthorn@feddit.nu · 1 day ago

    We used to see it on phone chargers: 5V with different amperages. Nowadays it’s mostly given in watts, though.

  • DigDoug@lemmy.world · 2 days ago

    While constant current sources exist, they’re very uncommon. A battery’s voltage is constant (or at least we consider it so), but the current it needs to supply is dependent on the load. In general, we consider current to be a result of voltage, to the extent that some teachers prefer that Ohm’s Law be written as I = V/R (rather than V=IR) to show this relationship more clearly (where I is current, V is voltage and R is resistance).
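
    A small worked example of that relationship (the load resistances are made up for illustration):

    ```python
    # I = V / R: hold the voltage constant and the current is simply
    # a consequence of whatever load is attached.
    volts = 120.0
    for load_ohms in (240.0, 60.0, 12.0):
        print(f"{load_ohms:>5} ohm load -> {volts / load_ohms:.1f} A")
    # 0.5 A, 2.0 A, 10.0 A: same source, three different currents.
    ```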

    • litchralee@sh.itjust.works · 2 days ago

      For an example of where constant current sources are used – and IMO, deeply necessary – we can look to the humble LED driver circuit. LEDs are fickle devices, on account of their very sharp voltage-current curve, which also changes with operating temperature and is not always consistent from the factory. As a practical matter, the current through an LED is what predominantly controls the brightness, so constant current sources will provide very steady illumination. If instead an LED were driven with a constant voltage source, it would need to be exceedingly stable, since even a few tens of millivolts off can destroy some LEDs through over-current and/or over-heating.

      For cheap appliances, some designs will use a simple resistor to set the LED current, and this may be acceptable provided that the current is nowhere near overdriving the LED. Think of small indicator LEDs that aren’t that bright anyway. Whereas for expensive industrial LED projectors, it would be foolish not to have an appropriately designed current source, among other protective features.
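
      A sketch of that simple resistor approach (the forward voltage and target current are typical datasheet-style assumptions, not values from any specific part):

      ```python
      # R = (V_supply - V_forward) / I_target for an indicator LED.
      v_supply = 5.0     # volts
      v_forward = 2.0    # assumed red-LED forward drop
      i_target = 0.010   # 10 mA, far from over-driving the LED

      r = (v_supply - v_forward) / i_target
      print(f"~{r:.0f} ohm")  # ~300 ohm; round up to the next standard value
      ```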

  • HubertManne@piefed.social · 22 hours ago

    Devices draw the amperage they want, so there is not much risk of over-amperage. But plug something that wants fewer volts into a higher-voltage source and... erm... not good. Watts are the bigger deal because they’re more a measurement of total power, although I see people arguing we should use joules or such.

    • XeroxCool@lemmy.world · 19 hours ago

      A high amperage device is not what’s at risk. The wiring is. With defined voltages (by way of plug type), devices can’t draw extra amperage, but you can certainly ask the wires for more amperage than they can safely provide. Fuses and circuit breakers do not protect the device, they protect the wires from burning off their insulation, shorting, or catching fire.

      But as a caveat, a 120v device plugged into a 220v source will draw too many amps for the device.

      • HubertManne@piefed.social · 16 hours ago

        Yeah, for most things 120 is more than enough, though, and the plug differences prevent 220 from being used on 110. I was just explaining why you do not see it much; it’s the reason for fuses. I guess the Christmas-lights thing used to be common enough to be warned against, though. One of the rare times I looked into the total amps being supplied to my condo was when I was contemplating an on-demand water heater.

  • Infrapink@thebrainbin.org · 1 day ago

    Because current (amps) has nothing to do with energy. Formally, an ampere is the current of about 10¹⁹ electrons moving past a given point in 1.6 seconds; in more reasonable terms, it’s 1 coulomb per second. The amount of energy in those electrons doesn’t matter to the amount of current, but energy is very relevant to making machines do things.

    Potential (volts) does include energy; specifically, 1 volt is 1 joule per coulomb. Add more energy and you get more volts, but the current remains the same. So volts are more relevant to how much use you can get out of your electrons.

    Power (watts), meanwhile, tells you how effective your machine is at extracting that energy. 1 watt is 1 joule per second. Suppose you are running a 6W heater. Every second, that heater converts 6J of electrical energy into heat energy, while the current remains the same.

    Thus, knowing current is important for electrical engineering, but potential and power matter more for operation.
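
    Chaining those definitions together also shows why volts times amps gives watts: 1 W = 1 J/s = (1 J/C) × (1 C/s) = 1 V × 1 A.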

    • jdnewmil@lemmy.ca · 1 day ago

      The idea that “power is only related to voltage” is nonsense. Current and voltage contribute equally to power (P = I × V × pf, but I am not going to discuss power factor here).
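
      A quick numeric check, taking pf = 1 for a resistive load: a 1000 W heater on a 230 V circuit draws 1000 / 230 ≈ 4.3 A, while the same heater built for 120 V draws ≈ 8.3 A. Voltage and current simply split the job differently.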

      The reason current is less frequently mentioned is that our electric power system supplies power (current and voltage) to many users, and wiring that delivers power to many locations is simpler and more reliable when we try to keep voltage relatively constant and let the power-using devices demand just as much current as they need to extract the power they need. This means current values can vary wildly between circuits, so it is not very informative to talk about current unless you know a lot about what is hooked up and consuming power nearby. The circuit that supplies the lights uses much less current than the one that supplies the air conditioner (when it is on), but there are often many lights on a single circuit, so the current is quite different in different segments of wire even while the voltage varies only a bit in each circuit.

      You could devise a wiring system that held current constant by running the current in a loop through all the consuming devices, but, like old series-wired Christmas tree lights, if any load disconnected then all the loads would stop receiving power, which would make it very unreliable.

  • givesomefucks@lemmy.world · 2 days ago

    Amps are the variable part of the equation…

    There are other parts of the equation; if every one were constant, every electrical component would be binary: either full power or no power.

    That’s why we really only see variable amperage on battery chargers, to force a slower charge rate for the health of the battery. On something like a radio, you could think of the volume knob as amperage control: the more power, the louder the sound comes out of the speaker.

    A steady amperage current would “lock” the volume at one setting forever.

      • givesomefucks@lemmy.world · 2 days ago

        It’s been a minute since I learned electrical stuff, so I might be off on details.

        Like maybe a pretty steady amperage from the cord and it’s regulated inside thru resistance or something more complicated?

        But that’s the general gist of why not all parts of the equation can be static.

        The advertised wattage is also the “max” it can use/produce.

        Like, an 850-watt power supply can handle an 850-watt draw, but if all the computer is doing is playing YouTube, it’s going to draw a lot fewer amps, and produce a lot fewer watts as a result. If it needs more watts, it “pulls” more amps to make them.

        Steam turbines are actually self-regulating because of this. The more power being used, the more amps are automatically produced. Once you spin it up, it manages its own speed.

        • cecilkorik@lemmy.ca · 2 days ago

          > Steam turbines are actually self-regulating because of this. The more power being used, the more amps are automatically produced. Once you spin it up, it manages its own speed.

          This is sort of true, within a narrow operating window and an idealized environment, but also pretty simplified. That sort of application of Ohm’s law only works according to the naive interpretation when you’re talking about ideal DC devices. In reality, inductance and capacitance become significant and muddy the waters a lot when you start getting into real power grids with huge inductive loads like motors and transformers all over them, and steam turbines trip and/or bypass all the time to avoid overload or overspeed.

  • Zwuzelmaus@feddit.org · 2 days ago

    Popularity, tradition, laziness :)

    The danger of electricity is the first thing you learn about it as a little child, and later you learn: the more voltage, the more danger. Even people who never learn anything else about electricity know that all their lives. So voltage has unbeatable popularity.

    Power sources with a fixed voltage were also invented before ones with a fixed amperage, so voltage became “the” number used to state a source’s size. People like to be lazy; they like to have one single number to tell the size of things. Accordingly, the tradition for devices that use electricity became to state only the voltage (it often must match the fixed voltage of the source) and be satisfied with that.

  • susi7802@sopuli.xyz · 2 days ago

    Amperage is only relevant for a short moment, just before you go through the pearly gates.

  • LadyMeow@lemmy.blahaj.zone · 2 days ago

    Interestingly, your examples include a car battery, yet CCA, or cold cranking amps, is about the only interesting thing about the battery, other than physical size.

    • bearoftheisle@europe.pub · 20 hours ago

      Sure, for someone in the know. Most people just know that they’re 12V (if that) and have no idea about A or Ah.

  • etchinghillside@reddthat.com · 2 days ago (edited)

    Everyone is aware of amps when their breakers trip.

    If you know the voltage of an outlet you know the amps of a 1000w microwave.
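
    Spelling out OP’s bonus question (assuming the rated 1000W is the electrical draw; many microwaves actually pull more from the wall than their cooking power):

    ```python
    # I = P / V: current drawn by a 1000 W load at common mains voltages.
    power_w = 1000
    for volts in (120, 230):
        print(f"{volts} V: {power_w / volts:.1f} A")
    # 120 V: 8.3 A;  230 V: 4.3 A
    ```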

  • Skyrmir@lemmy.world · 2 days ago

    Amps are used when sizing the wiring, so that afterwards you know that if your plug matches, it will work in that outlet.

  • FiskFisk33 · 2 days ago

    Because in most relevant scenarios, amperage is essentially just a function of voltage and resistance/impedance.