Hi,
I am fairly new to tube amp maintenance, but I'm quite good with electronics generally and understand most of the principles. Having previously modded and rebiased my EL84-equipped Carvin Nomad with absolutely no trouble, I have had a go at my (brand new) Peavey 6505 head: firstly to install an adjustable bias pot (as the 6505, and the 5150 before it, have a fixed bias), and secondly to then bias it a bit hotter, as I've read that these amps are shipped with a very cold bias.
I replaced the standard 15K resistor with a 10K multiturn cermet pot in series with a 6.8K resistor, uneventfully. (The procedure for this mod is described in several online articles on 5150 bias mods.) After getting some really weird readings when trying to bias the amp, I replaced the stock (very poorly matched) Ruby tubes, which I understand are fairly cheap, with a matched quad of JJ Tesla 6L6GCs, and now I'm at least getting consistent readings.
I am measuring the bias current using a Fluke 77 Series II meter, using the output transformer shunt method. The internal resistance of the meter is about 10 ohms.
OK, now the problem. I'm measuring a current of about 31 mA, which is a believable bias current, BUT this is for TWO tubes (since it's a push-pull amp with two tubes per side), so it's really about 15.5 mA per tube, which is quite cold.
I have read that some amps (Marshalls, on which much of the 5150 is based) have a 'low' resistance in the output transformer primary, and that in these amps one should NOT trust a bias current measured with a meter with a 'high' internal resistance. In this context, what counts as a 'low' transformer primary resistance, and what counts as a 'high' meter resistance?
I feel in my bones that 10 ohms (meter) is a significant fraction of 28-37 ohms (output transformer primary), so I'm reluctant to believe that enough of the current is being shunted through my meter for the reading to be accurate.
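For what it's worth, here is a rough sketch of that current-divider worry, using the winding and meter resistances from this post (the assumption being that during the shunt measurement the meter simply sits in parallel with one half-primary):

```python
# Current-divider sketch for the transformer-shunt method: the meter is
# clipped across one half of the OT primary, so the cathode current
# splits between the meter burden and the winding resistance.
# Values are the ones measured in this post (illustrative only).

R_METER = 10.0     # ohms, Fluke 77 internal resistance on the mA range
R_WINDING = 29.5   # ohms, red-to-brown half of the primary

# Fraction of the total current that actually flows through the meter:
meter_fraction = R_WINDING / (R_WINDING + R_METER)

# Correct the observed reading back to an estimate of the true current:
observed_mA = 31.0
true_mA = observed_mA / meter_fraction

print(f"meter sees {meter_fraction:.0%} of the current")
print(f"31 mA observed -> roughly {true_mA:.1f} mA actual")
```

On these numbers the meter would only see about 75% of the current, so a 31 mA reading would correspond to roughly 41-42 mA per side, i.e. around 20-21 mA per tube, which is not far off what the V = IR method below gives.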
I have used another method of measuring bias current - measuring the output transformer primary resistances (with amp off) and then measuring the voltage drop across the respective windings with the amp on and warmed up fully. Using this method I get:
(red wire to brown) 1.105 VDC divided by 29.5 ohms = 37 mA, and
(red wire to blue) 1.384 VDC divided by 37.5 ohms = 37 mA,
divided by two (for two tubes per side) = 18.5 mA per tube.
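The arithmetic above can be sanity-checked in a couple of lines (values copied from the measurements in this post):

```python
# Voltage-drop method: bias current = measured DC drop across each
# half-primary divided by that winding's (cold) resistance.
measurements = {
    "red-to-brown": (1.105, 29.5),  # (volts dropped, winding ohms)
    "red-to-blue":  (1.384, 37.5),
}

per_tube_mA = {}
for side, (volts, ohms) in measurements.items():
    side_mA = volts / ohms * 1000.0    # mA through that half of the primary
    per_tube_mA[side] = side_mA / 2.0  # two tubes share each side
    print(f"{side}: {side_mA:.1f} mA/side -> {per_tube_mA[side]:.1f} mA/tube")
```

One caveat: copper winding resistance rises with temperature, so dividing the warmed-up voltage drop by the cold resistance will tend to read slightly high.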
If this is right, then going by the V = IR method there is definitely scope to increase the bias a little. I was planning on running the tubes a little colder than the usually recommended 30-34 mA anyway, as they will be getting some heavy work in use.
QUESTIONS:
1) Can I believe the (very low-sounding) bias current measured with the output transformer shunt method, bearing in mind that the primary winding and meter resistances are quite similar?
2) Can I believe the 'calculated' bias current (V=IR method)?
3) The plate voltage is supposed to be about 500V on the 5150/6505, but should I actually measure this as well?
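On question 3, measuring the plate voltage is probably worthwhile, since a bias current only means something relative to plate dissipation. A rough sketch, assuming the nominal ~500 V figure from this post and the 30 W maximum plate dissipation quoted for the 6L6GC:

```python
# Plate dissipation at idle = plate voltage * bias current per tube.
PLATE_V = 500.0    # volts, nominal figure from the post -- measure to confirm
MAX_DISS_W = 30.0  # watts, 6L6GC rated maximum plate dissipation

dissipation_W = {}
# Shunt reading / 2, the V = IR figure, and the 'usual' 30 mA target:
for bias_mA in (15.5, 18.5, 30.0):
    watts = PLATE_V * bias_mA / 1000.0
    dissipation_W[bias_mA] = watts
    print(f"{bias_mA:.1f} mA -> {watts:.2f} W ({watts / MAX_DISS_W:.0%} of max)")
```

Even the warmest of these figures sits around 50% of maximum dissipation, on the cold side of the 60-70% rule of thumb often quoted for fixed-bias amps, which is consistent with these amps shipping cold.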
Cheers,
Marcus