
Sorry if my question is too naive; I'm new to this. I am trying to design a controlled current source for currents up to about 1 A (diagram below). My requirement is that the current depend only on the value of a single voltage source and a single resistance (which I can choose precisely).

[schematic – created using CircuitLab]

The problem is that the output voltage of the op-amp (= gate voltage of the MOSFET) rises to the rail voltage, causing saturation at about 300 mA, i.e. well before reaching 1 A. Is there a way I can improve the above circuit to prevent the op-amp from saturating and get a 1 A current?

EDIT: I've changed the design following τεκ's answer and swapped the FET and the load. This should ensure the FET's source voltage doesn't go above 5 V, but the gate voltage still goes to 14 V and saturates the op-amp. Thanks in advance.

VP06

2 Answers


With a 5 Ω sense resistor and a 5 V control signal you will get 1 A through the load, provided the load resistance is low enough. For instance, if the load is 10 Ω, you need 15 V at the source of the MOSFET to drive 1 A through 10 Ω plus 5 Ω.

On top of that you need another 2 to 10 V on the gate to get enough conduction (generalism alert) through the MOSFET. So, given your values, even if the load were 0 Ω you would need a supply voltage of at least 7 V, and more likely 10 to 15 V. This assumes the op-amp output is rail-to-rail.
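The headroom arithmetic above can be sketched as a quick sanity check (a hypothetical helper, assuming an N-channel MOSFET with the load and sense resistor both in the source leg, and taking the optimistic 2 V end of the Vgs range; check your actual MOSFET's datasheet):

```python
def min_supply(i_load, r_load, r_sense, v_gs_on=2.0):
    """Minimum rail-to-rail op-amp supply needed to drive i_load
    through a load and sense resistor stacked under the MOSFET source."""
    v_source = i_load * (r_load + r_sense)  # voltage at the MOSFET source
    v_gate = v_source + v_gs_on             # gate must sit Vgs above the source
    return v_gate                           # op-amp output must reach this

print(min_supply(1.0, 10.0, 5.0))  # 17.0 -> 10 ohm load needs a ~17 V rail
print(min_supply(1.0, 0.0, 5.0))   # 7.0  -> even a 0 ohm load needs ~7 V
```

With a realistic Vgs of 4–10 V at 1 A, the required rail climbs correspondingly higher.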

I will note that you have shown V2 as 1 V, and that could only ever push 200 mA through 5 Ω. I assume this is an error on your part, as is the missing ground symbol at the bottom of R1.

Andy aka

Swap the load and MOSFET.

[schematic – created using CircuitLab]
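The point of the swap can be seen with the same arithmetic (a sketch with a hypothetical helper; the 2 V Vgs figure is an assumed turn-on voltage, not a datasheet value): with the load moved into the drain, only the sense resistor remains between the source and ground, so the gate voltage the op-amp must produce stays low.

```python
def gate_voltage(i, r_sense, r_below_source, v_gs_on=2.0):
    # Gate must sit Vgs above the source, which sits above ground by
    # the drop across everything in the source leg.
    return i * (r_sense + r_below_source) + v_gs_on

# Load in the drain; only the 5 ohm sense resistor below the source:
print(gate_voltage(1.0, 5.0, 0.0))   # 7.0 V gate drive
# Load in the source leg as well (original arrangement):
print(gate_voltage(1.0, 5.0, 10.0))  # 17.0 V gate drive
```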

τεκ