I spend lots of time at the Carvin forums - carvinbbs.com and carvinmuseum.com. I give advice and whip up simple mods for folks to try. For amps with 4x6L6GC or 4xEL34 (plate voltage is around 450V), Carvin recommends biasing to 100mA measured at the standby switch on amps that have bias pots (most do now). Screen resistors, depending on vintage, are 270 or 470 ohms, though a few early 50W models have no screen resistors at all.
Now, some folks are recommending bias settings (at the standby switch) up to 180mA. An article at http://www.carvinmuseum.com/pdf/Ka-Boom_v1a.pdf recommends 70% to 80% of rated plate dissipation, and it measures and subtracts out the screen current, preamp current, and phase inverter current.
I've used 65% to 75% of Pa measured on each tube with a bias meter (this provides a cushion, since the screen current is included in the measurement), and I generally install 1K screen resistors.
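To make the arithmetic behind those percentages concrete, here's a minimal sketch of the per-tube calculation. The Pa figures (30W for a 6L6GC, 25W for an EL34) are the common datasheet ratings, so treat them as assumptions and check the sheet for your actual tubes:

```python
def max_cathode_current_mA(pa_watts, plate_volts, fraction):
    """Cathode current (mA) that puts idle dissipation at `fraction` of Pa.

    A bias meter measures at the cathode, which includes screen current,
    so true plate dissipation is a bit lower -- a built-in cushion.
    """
    return fraction * pa_watts / plate_volts * 1000.0

# Assumed ratings: 6L6GC Pa ~= 30 W, EL34 Pa ~= 25 W, at 450 V and 70% of Pa:
print(round(max_cathode_current_mA(30, 450, 0.70), 1))  # ~46.7 mA per tube
print(round(max_cathode_current_mA(25, 450, 0.70), 1))  # ~38.9 mA per tube
```

So at 450V, 70% of Pa works out to roughly 47mA per 6L6GC or 39mA per EL34 at the cathode.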
I went through Kevin O'Connor's TUT series, and he recommends 50% of Pa measured with a 1 or 10 Ohm cathode resistor, but he admits 33% to 66% is safe.
I understand that there's no sense turning up the bias current if you can't hear a difference, and that running it lower increases tube life.
I did a bit of searching here, because the topic seemed likely to have come up a lot, and perhaps to have generated heated arguments (pun intended). A 70% of Pa figure comes up a bit, but the question doesn't seem to have been beaten to death.
So let me ask you:
For 6L6GCs or EL34s in a class AB1 amp with 450V plate voltage and 470 Ohm screen resistors:
1. What percentage of Pa do you try to stay under, measuring current at the cathodes, a) on your own gear, and b) on other people's gear?
2. If you did a standby switch measurement, and subtracted out a measurement taken with no power tubes installed, what percentage would you try to stay under? Here you have to account for the normal variation of tubes from various sources that claim to be "matched", and the fact that the set's match may have drifted over time. I always advocate the safer bias-meter method (especially for newbie users - Eurotubes will sell you a bias probe attachment for $25 now), but the standby switch is the usual method people are using.
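For reference, here's a rough sketch of what the standby-switch method implies per tube. The 10mA preamp/PI draw below is purely an illustrative guess (the Ka-Boom article actually measures it), and the split assumes perfectly matched tubes, which is exactly the weakness mentioned above:

```python
def standby_percent_pa(total_mA, no_tubes_mA, n_tubes, plate_volts, pa_watts):
    """Percent of Pa per tube implied by a standby-switch current reading.

    Subtracts a no-power-tubes measurement (preamp + phase inverter draw),
    then splits the remainder evenly -- valid only if the tubes are matched.
    """
    per_tube_A = (total_mA - no_tubes_mA) / 1000.0 / n_tubes
    return per_tube_A * plate_volts / pa_watts * 100.0

# 180 mA at the standby switch, a hypothetical 10 mA preamp/PI draw,
# four 6L6GCs (assumed Pa ~= 30 W) at 450 V:
print(round(standby_percent_pa(180, 10, 4, 450, 30), 2))  # ~63.75% of Pa

# Carvin's recommended 100 mA under the same assumptions:
print(round(standby_percent_pa(100, 10, 4, 450, 30), 2))  # ~33.75% of Pa
```

Under those (assumed) numbers, the 180mA setting lands around 64% of Pa per tube, while Carvin's 100mA works out near 34% - close to the low end of O'Connor's safe range.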