Comment by bilsbie a day ago

Offshoot question. Why don’t we make resistors by making wire so thin that only a certain current can fit through?

Wouldn’t that be more efficient than converting current to heat?

grues-dinner a day ago

Resistance is V/I. You literally cannot have current flowing in a resistor without a voltage across it (either the voltage causes the current to flow, or the resistor in the path of a flowing current has a voltage appear across it).

A voltage drop with a flowing current is power (P=VI).

There is literally nothing you can do to avoid resistors dissipating that power as heat; it's just what they are. If they didn't do it, they wouldn't be resistors.

What you can do is use larger resistances, which pass less current at the same voltage (e.g. change a pull-up from 10k to 100k or higher, though that's more sensitive to noise); or smaller resistances that drop less power from a given current (e.g. a milliohm-range current shunt, which then needs a more sensitive input circuit); or find another way to do what you want (e.g. a switched-mode power supply is far more efficient than a voltage divider at stepping down voltage). This is usually much more complex and often requires fiddly active control, but it's worth it in power-constrained applications, and with modern integrated technology there's often a chip that does what you need "magically" for not much money.
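
For a sense of scale on the pull-up example, a quick back-of-the-envelope sketch (3.3 V rail assumed, values purely illustrative):

    # Power burned in a pull-up while the line is held low: P = V^2 / R
    V = 3.3
    for R in (10e3, 100e3):
        print(f"{R/1e3:.0f}k pull-up: {V**2 / R * 1e6:.0f} uW")
    # 10k -> ~1089 uW, 100k -> ~109 uW: ten times the resistance,
    # one tenth the wasted power, at the cost of more noise sensitivity.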

mort96 a day ago

You're describing thin film resistors, and they exist.

They also just convert current to heat, though. Current moving through a material with some amount of resistance always produces a voltage drop in accordance with Ohm's law, and that voltage drop times the current is power dissipated as heat. You can't really get away from that.

  • bilsbie a day ago

    Thanks. So is there a physical reason resistors have to make heat? Is it theoretically possible to find a material that limits current but produces very little heat?

    I guess the explanations always confuse me. Let's say a short circuit with no resistors has a certain amount of power. Then we add a resistor and the power in the circuit goes down. The resistor isn't turning the difference in power into heat, right?

    • mort96 a day ago

      There is no such thing as a short circuit with no resistance, because everything (other than superconductors) has a resistance. If you had a circuit with a magical ideal voltage source and no resistance, you'd have infinite current.

      But let's talk about short-circuiting lithium batteries, for example. They have roughly 50 milliohms (i.e. 0.05 ohms) of "equivalent series resistance".

      That means, if you short circuit a lithium battery with a superconductive wire (aka with 0 resistance), the circuit has a resistance of 0.05 ohms. We can compute the current with Ohm's law: I=V/R. V is typically 3.6 volts for li-ion batteries, R is 0.05 ohm, so I (aka current) is 3.6/0.05 = 72 amperes. 72 amperes * 3.6 volts is 259 watts. Now in the real world, the battery's chemistry would step in here and limit current in complicated ways, but this means that under the assumption that our battery would work as an ideal voltage source + a 0.05 ohm resistor, and if there was no extra heat coming from the chemical reactions, a shorted battery would produce 259 watts of heat.

      We can add a 1 ohm resistor to the circuit, which means our circuit's combined resistance would be 1.05 ohm. Using Ohm's law again, we find that the current would be 3.6/1.05 = approx 3.43 amperes. 3.43 amperes * 3.6 volts is 12.35 watts of heat.

      So thanks to our resistor, we're now producing 12.35 watts instead of 259 watts of heat, because the resistor limits the current going through the circuit. With a higher resistance resistor we'd produce even less heat.
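
      The same arithmetic as a minimal sketch (using the illustrative 3.6 V and 0.05 ohm figures from above, still treating the battery as an ideal source plus a series resistor):

        # Shorted li-ion cell modeled as ideal 3.6 V source + 0.05 ohm ESR
        V = 3.6
        esr = 0.05
        for extra in (0.0, 1.0):   # bare short vs. added 1 ohm resistor
            R = esr + extra
            I = V / R              # Ohm's law
            P = V * I              # power dissipated as heat
            print(f"R = {R:.2f} ohm: I = {I:.2f} A, P = {P:.1f} W")
        # R = 0.05 ohm: I = 72.00 A, P = 259.2 W
        # R = 1.05 ohm: I = 3.43 A, P = 12.3 W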

      A core idea here is that power consumption equals heat. I don't understand the physics reasons why, but "this thing consumes 10 watts of power" means the same as "this thing produces 10 watts of heat". Higher resistance means less current, which means fewer watts, which means both less heat and less power consumption, because those are the same.

    • raron a day ago

      Resistors don't "limit the current flow"; they just (let's say) make it harder to "push" the same amount of current through. (If you increase the voltage, you will get a higher current flowing through any resistor.)
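
      A tiny illustration of that point (values arbitrary):

        # A resistor fixes the ratio I = V/R; it doesn't cap the current.
        R = 1000.0                 # 1 kilohm, picked arbitrarily
        for V in (1.0, 5.0, 10.0):
            print(f"{V:>4.1f} V across {R:.0f} ohm -> {V / R * 1000:.0f} mA")
        # Ten times the voltage gives ten times the current; nothing is "limited".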

      There are ways to limit the current through some components regardless of the applied voltage (within sane limits) without producing much heat: active switch-mode DC-DC converters, widely used e.g. for driving the LEDs in (higher-quality) light bulbs and for charging batteries.
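
      A minimal sketch of why that approach wins (assuming an ideal, lossless buck converter; numbers arbitrary):

        # Ideal buck converter: Vout = D * Vin and Pin = Pout (no losses),
        # so the input current is only D * Iout.
        Vin, Vout, Iout = 12.0, 5.0, 1.0
        D = Vout / Vin             # duty cycle
        Iin = D * Iout
        print(f"D = {D:.2f}, Pin = {Vin * Iin:.1f} W, Pout = {Vout * Iout:.1f} W")
        # Dropping the same 12 V -> 5 V resistively at 1 A would burn
        # (12 - 5) * 1 = 7 W as heat; the ideal converter burns none.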

    • analog31 a day ago

      It's useful to look at the units of measure. Voltage is energy per unit charge. As the charge carriers go across the resistors, their energy changes, and that energy has to go somewhere. It's not always lost as heat in all devices. In an LED, some of the energy is "lost" as light. But still, the sum total of heat and light power generated by an LED is equal to the product of the current and the forward voltage.

      Another useful heuristic is that heat is generated from what's left after all of the other ways of converting energy are used up, such as light, chemical potential, and so forth. It's energy's last resort. The usefulness of a resistor lies in its simple voltage-current relationship, which is equivalent to saying that the only thing it generates is heat.
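
      As a worked example of that energy balance (the 2 V / 20 mA operating point and the 30% radiant efficiency are assumed figures, purely for illustration):

        # LED: electrical power in = light out + heat out
        I_f = 0.020                 # forward current, 20 mA (illustrative)
        V_f = 2.0                   # forward voltage, 2 V (illustrative)
        P_total = I_f * V_f         # 40 mW total electrical power
        eff = 0.30                  # assumed fraction emitted as light
        P_light = P_total * eff
        P_heat = P_total - P_light  # everything that isn't light becomes heat
        print(f"light: {P_light * 1e3:.0f} mW, heat: {P_heat * 1e3:.0f} mW")
        # light: 12 mW, heat: 28 mW -- the sum is always I_f * V_f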

    • marcosdumay a day ago

      > So Is there a physical reason resistors have to make heat?

      Look at it like this: resistors produce heat by definition.

      What you are describing isn't a resistor. It's effectively a switched-mode power supply, though that's probably a more heterogeneous "material" than you wanted.
