While investigating the current drain of the low voltage circuit referred to in an earlier post, I thought I'd check the bias current of the EL34s. It's the 50W version, so only V5 and V6 are present in the attached schematic. The amp has 1R resistors soldered in, with test points for checking bias current. At the CT of the output transformer I get 490V.
V6 was running 42mA at idle, V5 32mA. This equates to 82% and 63% respectively of the EL34's 25W rated maximum plate dissipation.
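For anyone checking those percentages, a quick sketch of the arithmetic (assuming the 490V CT reading is the effective plate voltage and the EL34's 25W plate dissipation rating):

```python
# Sanity check of the idle dissipation figures quoted above.
# Assumes plate voltage ~= 490 V (measured at the OT centre tap)
# and a 25 W maximum plate dissipation rating for the EL34.
B_PLUS = 490.0  # V, at the output transformer CT
P_MAX = 25.0    # W, EL34 rated plate dissipation

for tube, i_ma in [("V6", 42), ("V5", 32)]:
    p = B_PLUS * i_ma / 1000.0  # idle plate dissipation in watts
    print(f"{tube}: {p:.1f} W = {p / P_MAX:.0%} of rating")
# V6: 20.6 W = 82% of rating
# V5: 15.7 W = 63% of rating
```

This neglects screen dissipation and any voltage drop across the 1R resistors, which are small at these currents.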
There are two bias trim presets. The minimum V6 could be set to was 40mA, but when set to that minimum, the bias current of V5 drops to about 15mA!
I haven't checked the resistors involved in setting the bias voltage for faults yet, but why would the bias of one of the output pair affect the other in this way?
Thanks for advice.