
Apologies for my lack of deep knowledge here - this is not my background and I am definitely an amateur so I greatly appreciate your willingness to help me!

Background: I was curious about the negotiation between a USB power supply and the device plugged into it, so I took a power brick rated for 5 V at 1 A and plugged it in. I then stripped a USB 2.0 cable to expose the VCC and GND wires and hooked my multimeter directly to the ends of those wires, because I wanted to take readings without a device attached. Measuring the voltage, I got the 5 V I expected. Measuring the current, I got about 100 mA, but the reading oscillated between that and 0 mA very consistently.

Question: I assume I see the 100 mA because I don't actually have a device plugged in, but why does it bounce back to 0 so consistently? Is the power brick waiting for some negotiation from the device before supplying any meaningful current?

Relevant Post: At the bottom of this relevant post Tom Carpenter writes:

How does the master know the device needs power? Simple, all USB devices are allowed to draw an amount of current without requesting it - up to 100mA as far as I recall. This gives the device enough power to turn on, assert its presence (with a pull up resistor on the D+ line - again probably too in depth). Once the master is aware of the device, it allocates a power allowance to that device and asks if it will need more (e.g. for a high power device, if it wants the 500mA high current allowance).

I assume the above is why I measured the 100 mA, but I am still unsure why it bounces. I have a recording of the setup if it would help to see it.
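For context, the "asks if it will need more" step Tom Carpenter mentions happens during enumeration: a device states its requirement in the bMaxPower field of its configuration descriptor (in units of 2 mA), and the host accepts or rejects that configuration. Below is a minimal sketch of that descriptor with made-up values; the byte layout follows the standard USB 2.0 configuration descriptor, but no particular device is being modelled.

```python
# Sketch of a USB 2.0 configuration descriptor header, showing how a device
# declares its current requirement during enumeration. Values are illustrative.

def configuration_descriptor(max_current_ma: int, self_powered: bool = False) -> bytes:
    """Build the 9-byte standard configuration descriptor header."""
    bMaxPower = max_current_ma // 2           # field is in units of 2 mA
    bmAttributes = 0x80 | (0x40 if self_powered else 0x00)
    return bytes([
        9,             # bLength: descriptor size in bytes
        0x02,          # bDescriptorType: CONFIGURATION
        9, 0,          # wTotalLength (low, high) - header only in this sketch
        1,             # bNumInterfaces
        1,             # bConfigurationValue
        0,             # iConfiguration (no string)
        bmAttributes,  # bus-powered, remote wakeup off
        bMaxPower,     # declared current draw in 2 mA units
    ])

# A low-power device stays within the default 100 mA budget;
# a high-power device asks the host for the full 500 mA.
print(configuration_descriptor(100).hex())   # bMaxPower = 0x32 (50 -> 100 mA)
print(configuration_descriptor(500).hex())   # bMaxPower = 0xfa (250 -> 500 mA)
```

A plain wall charger has no host controller to run this exchange with, so whatever it supplies is governed by its own hardware limits rather than by this negotiation.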

Updated: [photos/video of the measurement setup]

Doc
  • Did you connect the multimeter between VCC and GND when you did the current measurement? – MrGerber Jan 20 '24 at 17:12
  • Please edit the question to add schematics and/or high quality photos of your measurement setup. The photos should be good enough so that anyone just looking at them could reproduce your setup. I bet there's some basic instrumentation setup mistake that should be easy to rectify as soon as we can see what it is. – Kuba hasn't forgotten Monica Jan 20 '24 at 17:16
  • @MrGerber - Yes, was this a mistake? I thought that this wouldn’t short because the power supply would have a resistance. – Doc Jan 20 '24 at 17:33
  • @Kubahasn'tforgottenMonica - I will add the pics/video, thanks! – Doc Jan 20 '24 at 17:33
  • @Doc Power supplies are typically designed to output a constant voltage no matter what. This causes problems when you short them out. If the power supply had enough internal resistance to avoid this being an issue, that same resistance would cause large voltage drops when it tries to supply any current to a device. – Hearth Jan 20 '24 at 17:38
  • @Doc Power supplies are typically rated for a maximum current. Many of them will internally limit their output to that maximum, or something a little higher than it. But this limiting doesn't come without a cost: the power has to be dissipated somewhere. A few power supplies may be rated for a continuous short circuit, but the vast majority of them can only handle their output being shorted for brief periods (typically to allow a short surge to charge capacitors in the load device), before they either break or shut themselves off to avoid breaking. – Hearth Jan 20 '24 at 17:41

2 Answers


Most likely, your supply is using a linear current-limit circuit that limits at 100 mA, overheating (or tripping a temperature sensor or a time-delayed current-limit sensor) while it holds that limit, and cycling off to protect itself from damage.

Then it attempts to turn back on after a few milliseconds to see if the fault is gone. Since the fault is still present, it quickly cycles back off again.
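As a rough illustration of that on/off cycling, here is a toy sketch only; the limit, trip delay, and retry interval below are assumed values, not the actual design of this brick.

```python
# Toy model of a supply with a 100 mA current limit that trips into
# protection and periodically retries ("hiccup mode"). Every number here
# is an illustrative guess, not a measurement of any real power brick.

LIMIT_A       = 0.100   # current limit while the output is on
TRIP_AFTER_S  = 0.020   # how long it tolerates sitting at the limit
RETRY_EVERY_S = 0.200   # off time before the next restart attempt
STEP_S        = 0.005   # simulation time step

def simulate_shorted_output(duration_s: float = 1.0):
    """Return (time, current) samples for a shorted, hiccuping output."""
    samples, t = [], 0.0
    output_on, timer = True, 0.0
    while t < duration_s:
        current = LIMIT_A if output_on else 0.0
        samples.append((round(t, 3), current))
        timer += STEP_S
        if output_on and timer >= TRIP_AFTER_S:         # limiter overheats -> shut off
            output_on, timer = False, 0.0
        elif not output_on and timer >= RETRY_EVERY_S:  # retry to see if fault cleared
            output_on, timer = True, 0.0
        t += STEP_S
    return samples

# Bursts of ~100 mA separated by longer off periods: a slow meter display
# ends up flipping between roughly 100 mA and 0 mA.
for t, i in simulate_shorted_output()[:50:5]:
    print(f"t={t:.3f} s  I={i*1000:5.1f} mA")
```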


The supply tries to output 5 V no matter what, but your multimeter in current-measurement mode is essentially a short circuit, so that 5 V has to be dropped down to a handful of millivolts. That means \$5\ \mathrm{V} \times 100\ \mathrm{mA} = 500\ \mathrm{mW}\$ has to be dissipated somewhere, and that somewhere is inside the current-limit circuit. This circuit is probably not designed to dissipate that much power for extended periods (ordinarily, almost all of the 5 V would be dropped across the device plugged in), so it has to shut off to protect itself from damage. Most of these fault-detection circuits, apparently including the one in your power supply, restart on their own after a fixed time.
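Putting rough numbers on where the heat goes, with the meter's burden (shunt) resistance assumed to be about 0.1 Ω:

```python
# Where the ~0.5 W ends up when the only "load" is an ammeter.
# The meter's burden (shunt) resistance is an assumed typical value.

V_SUPPLY = 5.0     # V, what the brick tries to regulate to
I_LIMIT  = 0.100   # A, the limit the brick appears to enforce
R_METER  = 0.1     # ohm, assumed burden resistance of the ammeter

v_across_meter   = I_LIMIT * R_METER           # ~0.01 V
p_in_meter       = v_across_meter * I_LIMIT    # ~1 mW in the meter
v_across_limiter = V_SUPPLY - v_across_meter   # ~4.99 V dropped inside the brick
p_in_limiter     = v_across_limiter * I_LIMIT  # ~0.5 W heats the limit circuit

print(f"meter drops   {v_across_meter*1000:.0f} mV, dissipates {p_in_meter*1000:.1f} mW")
print(f"limiter drops {v_across_limiter:.2f} V, dissipates {p_in_limiter*1000:.0f} mW")
```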

Hearth
  • Thank you so much for such a clear description! Just need to digest this and see if it all clicks so I don’t waste your time with half-baked questions. – Doc Jan 20 '24 at 18:10

Based on your description, you shorted out the power supply with the multimeter in current measurement mode.

You are not supposed to do that; it is not the way to measure current out of a supply.

The supply protected itself by shutting down and trying to restart periodically.

The good thing is that it did not blow the (possibly special and expensive) multimeter fuse.
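For comparison, this is the intended use of the ammeter: in series with an actual load, so the load sets the current. A quick sketch with an assumed 50 Ω, 1 W resistor as the load and the same assumed 0.1 Ω meter burden:

```python
# The intended way to use the ammeter: in series with an actual load,
# so the load (not the meter or the supply's limiter) sets the current.
# The 50-ohm resistor and 0.1-ohm meter burden are assumed example values.

V_SUPPLY = 5.0    # V
R_LOAD   = 50.0   # ohm, example load resistor (needs at least a 1 W rating here)
R_METER  = 0.1    # ohm, assumed ammeter burden resistance

i = V_SUPPLY / (R_LOAD + R_METER)   # series circuit: one current everywhere
p_load = i**2 * R_LOAD              # almost all the power lands in the load

print(f"expected reading: {i*1000:.1f} mA")   # ~99.8 mA, well under the 1 A rating
print(f"load dissipates:  {p_load:.2f} W")    # ~0.50 W, hence the 1 W resistor
```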

Justme