I agree with Jazz's replies. Following are some additional comments:
This comes up quite often. If you drive the amp hard enough with the tone controls dimed, and 50 to 100mV is hard enough, then Fender BF & SF amps will produce full clean power at the volume control setting you described. When you turn the volume control up higher, the circuits start to distort more, but the amplitude shown on the scope trace doesn't increase much beyond that.
That’s not normal at all. I’m assuming that you remembered the earlier discussion and had the resistor disconnected from the amp when you made both of those measurements.
The resistance of all resistors changes as the temperature changes. The temperature coefficient of resistance (TCR) of a resistor like the one you are using is such that the resistance increases as the resistor heats up. The percentage change is modest and the resistance should not decrease as you described. You could do an experiment to test the resistor as a standalone part just to make sure there wasn’t a problem with your setup.
Just connect the resistor to your Ohmmeter and monitor the resistance value while you heat the resistor with a heat gun or hairdryer. You should observe the resistance increase a little as you heat it and then return to the starting value as the resistor cools back to room temperature.
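To put rough numbers on what "a little" means, here's a minimal Python sketch of the linear TCR model. The 8-ohm value and the 250 ppm/°C TCR are just illustration numbers, not your resistor's actual specs; substitute the figures from your part's datasheet.

```python
# Rough estimate of how much a load resistor's value should change
# with temperature, using the linear TCR model:
#   R(T) = R_nominal * (1 + TCR * delta_T)
# The 8-ohm nominal value and 250 ppm/C TCR are assumed examples.

def resistance_at_temp(r_nominal, tcr_ppm_per_c, t_rise_c):
    """Resistance after a given temperature rise above ambient."""
    return r_nominal * (1 + tcr_ppm_per_c * 1e-6 * t_rise_c)

r_hot = resistance_at_temp(8.0, 250, 100)  # 8 ohm load, +100 C rise
print(f"{r_hot:.3f} ohms")  # 8.200 ohms - only a fraction of an ohm up
```

The point of the exercise: even a big temperature swing should only nudge the value upward by a couple of percent, never downward.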
You can also do an experiment to investigate the apparent low amplitude display on your scope. Just connect both the scope and the DVM (set to AC volts) directly to the output of your signal generator and compare the results. Remember that the meter is displaying Vrms while the scope is showing the whole peak-to-peak sine wave (Vpp).
The conversion equation for a sine wave is Vpp = 2√2 x Vrms ≈ 2.83 x Vrms.
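If it helps to sanity-check your readings, here's a small Python sketch of the conversion plus the power calculation that goes with it. The 13 Vrms reading and 8-ohm load are made-up example numbers, not anything from your amp.

```python
import math

def vpp_from_vrms(vrms):
    """Peak-to-peak voltage of a sine wave from its RMS value."""
    return 2 * math.sqrt(2) * vrms

def power_from_vrms(vrms, load_ohms):
    """Average power dissipated in the load: P = Vrms^2 / R."""
    return vrms ** 2 / load_ohms

# Example: a meter reading of 13 Vrms across an 8-ohm dummy load
print(f"{vpp_from_vrms(13):.1f} Vpp")     # 36.8 Vpp on the scope
print(f"{power_from_vrms(13, 8):.1f} W")  # 21.1 W into the load
```

If your scope trace is much smaller than what the conversion predicts from the meter reading, something in the setup (probe attenuation, coupling, scaling) is off.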
The differences in the readings you described (and showed) seem way off.
One last comment... Under normal conditions your amp should not be blowing the fuse (not even a 2.5A Slo-Blo) when putting out continuous full power.
Cheers,
Tom