My idea is to create a “general purpose” board with a MOSFET gate driver and the appropriate MOSFET for it. This board should be able to work with a 3.3 V/5 V logic-level signal and be capable of driving LED strips, PC fans, or kind of “anything” that needs a little bit more juice and is controllable via PWM.
So I found the UCC27517, which is able to source and sink 4 A peak.
Now my question is: what are those 4 A linked to?
The maximal drain-source current of the MOSFET?
Or is it the total load that appears when the MOSFET gate is driven?
Shouldn’t I be able to drive a logic-level MOSFET directly from the output of the microcontroller?
What does the driver do internally that is better than the direct output of the microcontroller?
And yes, I know there are plenty of resources on Google that may explain the topic, but those just brought me more confusion.
That is the peak load of the driver, i.e. when switching the MOSFET on or off, the maximum current that flows through the IC should not exceed 4 A.
More knowledgeable people will probably answer with science, but my amateur understanding is that MOSFET gate capacitance is the main reason MOSFET drivers exist. Big power MOSFETs have a large gate capacitance, which can cause large current spikes when driven directly by a microcontroller output and will easily blow its push/pull output stage.
To limit that current you have to add a limiting resistor between the output and the gate (that’s the case even with dedicated MOSFET driver ICs). But because of the small output current capability of a generic microcontroller, the resistor will have to be so big that your MOSFET turns on and off too slowly for any kind of fast PWM. It will also waste a lot of energy, because it will spend more time in the non-saturated region where it’s half-open.
Chapter 10 of the datasheet you linked talks about these considerations.
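To put rough numbers on that trade-off, here is a small sketch using the simple t ≈ Q_g / I approximation. All component values (gate charge, MCU pin current limit, resistor values) are assumed for illustration and are not taken from any specific datasheet:

```python
# Compare gate-charge time through a big MCU-safe resistor vs. a
# low-impedance gate driver, using t = Q_gate / I_avg.
# All values below are assumed example numbers.

V_DRIVE = 5.0        # V, gate drive voltage
Q_GATE  = 30e-9      # C, total gate charge of a hypothetical power MOSFET

# MCU pin limited to ~20 mA -> needs a large series resistor
I_MCU = 20e-3
R_MCU = V_DRIVE / I_MCU              # 250 ohm to keep the pin safe
t_mcu = Q_GATE / I_MCU               # ~1.5 us per edge

# Dedicated driver: ~1 ohm output resistance plus a small gate resistor
I_DRV = V_DRIVE / (1.0 + 2.2)        # ~1.6 A average during the edge
t_drv = Q_GATE / I_DRV               # tens of ns per edge

print(f"MCU drive:    R = {R_MCU:.0f} ohm, edge ~ {t_mcu * 1e6:.2f} us")
print(f"Driver drive: I ~ {I_DRV:.2f} A, edge ~ {t_drv * 1e9:.0f} ns")
```

The roughly 100x faster edge is the whole point: less time spent half-open means far less heat in the FET.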
Not really a KiCad related thread but I’ll leave it in projects for now. If none of the power experts want to give you some pointers then I’ll close the topic. You might consider asking in an electronics forum like eevblog.
The reason you need a gate driver IC is not so much to protect the uP, but to turn the FET on fast. The longer it takes to charge the Gate, the more power is wasted in the FET’s linear region. Spend too much time there, and the FET will overheat.
That is pretty much correct. I would quibble that it is usually not called capacitance, because it tends to be non-linear. But if you check the spec sheet of most MOSFETs, you will find numbers relating to “gate charge”.
Some significant current is required to get that charge into or out of the MOSFET gate quickly in order to achieve efficient power switching. The gate driver might have an output resistance of 1 ohm, give or take. The other thing is voltage: Depending on the MOSFET you may need 5V or 8V or 12V in order to completely turn on the device. I do not think most microcontrollers can deliver that voltage.
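To see why “significant current” is needed, the average gate current is just charge over time. A one-line sketch with assumed datasheet values (not from any specific part):

```python
# Average gate current needed for a target switching time,
# using the simple relation I = Q_gate / t. Example values are assumed.

q_gate = 60e-9       # C, total gate charge from a hypothetical datasheet
t_switch = 50e-9     # s, desired turn-on time

i_avg = q_gate / t_switch
print(f"Average gate current: {i_avg:.2f} A")   # 1.20 A
```

Over an ampere for a 50 ns edge, which is far beyond what a microcontroller pin can deliver.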
To turn a MOSFET on/off you need to move charge to and from the gate.
The peak current needed is associated with the gate drive voltage and the rate of charge transfer for a given switching speed. This is controlled by the supply voltage of the driver and the TOTAL gate resistance, which is made up of:
- on-die gate resistance
- lead/trace resistance
- the dedicated gate resistor
- the driver ON/OFF resistance (which is also what governs the peak current)
The power of the driver is therefore governed by the switching frequency.
NOTE: if you intend to switch some reasonable current, you need to get the gate voltage quite high. This isn’t really a concern for “logic level FETs”, where the load might just need to sink 100 uA via a pullup, but for some power (LED, relay, motor) keep in mind that the “threshold voltage” is just the voltage at which something like 1 mA starts to flow. To give you an idea: a SiC device I am presently using has a threshold of 2 V, but I need to hit it with 8 V before reasonable current flows, and I drive it with 20 V to ensure the Ron is good enough for 600 A to flow efficiently. The driver is capable of 9 A peak with a gate resistance of 2R, which permits switching in 20 ns.
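The two relations above can be sketched numerically. The 20 V drive and ~2 ohm total gate resistance loosely follow the SiC example; the gate charge and switching frequency are assumed for illustration:

```python
# Peak gate current from drive voltage over total gate resistance,
# and average driver power from charge * voltage * frequency.
# The gate charge and frequency below are assumed example values.

v_drive = 20.0       # V, driver supply (from the SiC example above)
r_total = 2.2        # ohm, driver + external + on-die resistance (assumed split)
q_gate  = 200e-9     # C, assumed total gate charge
f_sw    = 50e3       # Hz, assumed switching frequency

i_peak = v_drive / r_total           # worst-case initial current into the gate
p_drv  = q_gate * v_drive * f_sw     # average power the driver must deliver

print(f"Peak gate current: {i_peak:.1f} A")
print(f"Driver power:      {p_drv * 1e3:.0f} mW")
```

Note how the peak current depends only on voltage and resistance, while the driver power scales with switching frequency, which matches the point above.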
Okay guys, thank you very much.
The answers I got cleared up a LOT of my confusion. As soon as I get home, I am going to try to implement what you wrote and see how much I really understood.
I recommend building something on a breadboard to get more insight into how those power MOSFETs work.
You can build quite a decent MOSFET driver with a NE555. It can deliver an output current of 500 mA (continuous, not peak; the peak is probably higher but unspecified).
Then put some series resistors between the MOSFET driver and the gate of the FET, and observe the result on your oscilloscope. If the resistor is relatively high, charging of the gate is slow and you will see a “plateau” on the gate voltage, just like in the graph BobZ posted. During this time there is both a voltage across the MOSFET and a drain current at the same time, and this means a high power dissipation.
When the gate is charged quickly, the MOSFET acts more like a switch. When off there is a drain-source voltage, and when on there is a drain current, but not both at the same time, and the FET will stay cool.
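The “voltage and current at the same time” situation can be put into rough numbers with the common triangular switching-loss approximation. All load values and edge times here are assumed for illustration:

```python
# Rough switching-loss estimate: during each edge the FET sees roughly
# half the load voltage times half the load current, so
# P_sw ~= 0.5 * V * I * (t_rise + t_fall) * f_sw. Values are assumed.

v_load = 12.0     # V across the FET when off
i_load = 4.0      # A through the FET when on
f_sw   = 25e3     # Hz, PWM frequency

for t_edge in (2e-6, 50e-9):   # slow (big gate resistor) vs. fast drive
    p_sw = 0.5 * v_load * i_load * (2 * t_edge) * f_sw
    print(f"edge {t_edge * 1e9:7.0f} ns -> switching loss {p_sw:.3f} W")
```

With the slow 2 us edges the FET dissipates watts just from switching and needs a heatsink; with 50 ns edges the same load switches almost for free.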