Hi,
I've read a few times online where people say to watch the voltage ratings of resistors. Is this necessary? I thought that since the resistance of a resistor is fixed (that's the whole point of a resistor), then by P = V^2/R, as long as the power rating isn't exceeded the voltage must be okay. Too high a voltage across a resistor would draw too much current (and therefore power), so for example a 100 ohm, 5 watt resistor at maximum power would have sqrt(100 x 5) = about 22 volts across it, and any more voltage than that would be too high?
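Here's the quick calculation I'm basing that on (just P = V^2/R rearranged, and assuming the power rating is the only limit, which is exactly the assumption I'm asking about):

```python
import math

# Assuming an ideal resistor where only the power rating matters
# (no separate voltage limit) -- that's my assumption, not a given.
R = 100.0    # ohms
P_max = 5.0  # watts (rated power)

# P = V^2 / R  =>  V = sqrt(P * R)
V_at_rated_power = math.sqrt(P_max * R)
print(f"Voltage at rated power: {V_at_rated_power:.1f} V")  # ~22.4 V
```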
So (summary of above paragraph) the voltage rating is built into the wattage rating? And other than that there's no need to consider voltage ratings of resistors?
Or am I missing something?
Those were my thoughts, but looking online again I found conflicting answers, e.g. this one: "A resistor can be used at any combination of voltage (within reason) and current so long as its “Dissipating Power Rating” is not exceeded, with the resistor power rating indicating how much power the resistor can convert into heat or absorb without any damage to itself."
which is what I thought, and then this, which says the opposite:
"The voltage rating is for the resistor series typically and specifies the maximum peak voltage you can apply without danger of damaging the resistor due to corona, breakdown, arcing, etc.The power rating is completely independent of the voltage rating. It specifies the maximum steady state power the package is able to dissipate under given conditions.
You have to conform to both specs. If placing the maximum voltage across the resistor results in more power than the spec allows you have to reduce the voltage until you meet the spec. Likewise you can't increase the voltage above the rating just because you're not hitting the maximum power limit.
"
So which is true?
The second answer seems counterintuitive to me: if you assume there is a resistor whose voltage rating can be exceeded without exceeding the power rating, doesn't that also mean it would be impossible to run that resistor at its actual power rating, since reaching the power rating would put a higher voltage across it than its voltage rating allows? Does that make sense or am I talking cr#p?
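To put some numbers on what I mean, here's a rough sketch (the 0.25 W / 200 V "resistor series" below is completely made-up for illustration, not from any datasheet). If the second answer is right, then for a high enough resistance the voltage that would reach the rated power is above the voltage rating, so the rated power could never actually be used:

```python
import math

def limiting_voltage(R, P_rated, V_rated):
    """Max allowed voltage if both the power and voltage ratings must be respected."""
    v_from_power = math.sqrt(P_rated * R)  # voltage that just hits the power rating
    return min(v_from_power, V_rated), v_from_power

# Hypothetical 0.25 W resistor series with a 200 V rating (made-up numbers)
for R in (1e3, 1e6):
    v_allowed, v_power = limiting_voltage(R, 0.25, 200.0)
    print(f"R = {R:>9.0f} ohm: power rating reached at {v_power:.0f} V, "
          f"allowed max {v_allowed:.0f} V")
```

For the 1 kohm case the power rating limits first (about 16 V), but for the 1 Mohm case the voltage rating would limit at 200 V even though the power rating isn't reached until 500 V, which is exactly the situation I find confusing.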