I was testing an over-current and over-voltage protection chip. When I connect a small resistor to the output to force an over-current condition, the chip shuts off and then retries every 2 seconds to check whether the fault has cleared. But if I short the output terminal directly, the chip retries much more often and each detection period lasts much longer, which makes the chip heat up badly.
I thought a short circuit was just an extreme case of over-current, so the chip should handle both the same way, yet the behavior differs greatly.
Why does this happen?
The main difference is that over-current protection engages when the current rises above a design limit; the overload may be anything from less than 1% over that limit up to whatever amount of current the device is specified to tolerate. A short circuit, by contrast, is a fault where there may be no real limit on, or knowledge of, the current and the heat it dissipates, which could cause catastrophic damage. An old-style fuse is very useful here!
In other words, over-current protection covers a current which should not cause damage and can be tolerated for a while. Short-circuit currents are not so forgiving in most devices, so they typically get a separate, harder-acting protection path.
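To make the two-path idea concrete, here is a minimal sketch of how a protection controller might classify a fault and pick a retry behavior. Everything here is hypothetical, the thresholds, names, and timings are invented for illustration and are not taken from any real datasheet; the actual chip's values would be in its documentation.

```python
# Hypothetical two-threshold protection logic: a soft over-current
# limit handled by slow "hiccup" retries, and a hard short-circuit
# threshold that takes a different, faster-cycling path. All values
# are invented for illustration.

OCP_LIMIT_A = 3.0    # assumed over-current threshold (amps)
SCP_LIMIT_A = 10.0   # assumed short-circuit threshold (amps)

def classify_fault(current_a: float) -> str:
    """Return which protection path a measured current would take."""
    if current_a >= SCP_LIMIT_A:
        return "short-circuit"   # near-zero load impedance: hard fault path
    if current_a >= OCP_LIMIT_A:
        return "over-current"    # tolerated overload: slow hiccup retry
    return "normal"

def retry_interval_s(fault: str) -> float:
    """Hypothetical retry interval for each fault class."""
    return {"normal": 0.0, "over-current": 2.0, "short-circuit": 0.2}[fault]

# A mild overload takes the 2-second hiccup path; a dead short
# trips the separate short-circuit path with its own timing.
print(classify_fault(4.0), retry_interval_s(classify_fault(4.0)))
print(classify_fault(50.0), retry_interval_s(classify_fault(50.0)))
```

If the chip really does implement two such paths, that alone would explain why a dead short produces retry timing quite different from a mild overload.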
You might find it interesting (if you haven't done so already) to get the datasheet of the device you are looking at and go through the technical details. There are datasheets for countless PSUs, fuses, current limiters and the like on the net.
Hope this helps.