My home mains is currently sitting at roughly 240V AC instead of the nominal 230V. One of my DIY amplifiers, a Marshall 2204 (50W, master volume) based circuit, has a Hammond 290GX power transformer with primary taps for 120V, 220V, and 240V.
Currently it's wired to the 220V tap, but I am changing it over to 240V because my heaters are reading 3.8V per side and the overall amp voltages are higher than I'd like.
While I'm in the amp, I'm also thinking of elevating the heater supply to better manage the V2 cathode follower, whose cathode currently sits at 192VDC. Following Valve Wizard's page on heater elevation (https://valvewizard.co.uk/heater.html), I believe I could use a divider of 1M over 100k to ground (with a 10 or 16µF smoothing cap across the 100k) fed from the screens B+ (508V at the time of writing) *** see below. My calculations show this should provide roughly +46VDC for my heater center tap.
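For reference, here's the back-of-envelope divider math I'm using (assuming the Valve Wizard arrangement of a 1M upper arm and 100k lower arm across the screens node):

```python
# Sanity check of the heater elevation divider.
R1 = 1_000_000   # upper arm, ohms (from screens B+ to the tap)
R2 = 100_000     # lower arm, ohms (from the tap to ground)
B_PLUS = 508.0   # measured screens supply, volts

# Voltage at the divider tap, which feeds the heater center tap
v_elev = B_PLUS * R2 / (R1 + R2)

# Standing current bled through the divider
i_divider = B_PLUS / (R1 + R2)

print(f"Elevation voltage: {v_elev:.1f} V")          # ~46.2 V
print(f"Divider current:   {i_divider * 1e3:.2f} mA")  # ~0.46 mA
```

So the divider only wastes about half a milliamp, in line with Merlin's advice to keep its resistance fairly high.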
Something has me confused though… *** The docs say that "The elevation voltage can be taken from a potential divider across the HT (it doesn't matter where you position the divider)". I see a lot of people suggest taking this from the screens supply on 1959 and 2203/2204 circuits, but schematics always seem to show that node after the standby switch. In newer schematics, Marshall takes the heater elevation from the CT supply before the standby switch.
So when the amp is in standby, the heater supply center tap just sees the 100k/16µF to ground? Is that okay to do? Merlin's doc mentions: "The divider should have a fairly high resistance so as not to waste current, although the lower arm (R2) should not be excessively large or Rhk(max) may be grossly exceeded, so it is advisable not to make it greater than 100k."
I can't find anything in the tube datasheets I have on hand explaining Rhk… I assume it's the resistance between the heater and cathode? The only reference I could find is something like "cathode–heater insulation resistance". I'd love to understand this better.
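For what it's worth, here's the heater-cathode stress check I've been doing for the cathode follower. The 180V figure is the Vhk(max) commonly quoted for a 12AX7/ECC83; verify against your own datasheet:

```python
# Rough heater-to-cathode voltage check for the V2 cathode follower,
# assuming a 12AX7/ECC83 with Vhk(max) around 180V (check your datasheet).
V_CATHODE = 192.0  # measured cathode voltage on the follower
V_ELEV = 46.0      # divider output from 508V through 1M/100k

vhk_unelevated = V_CATHODE - 0.0     # heater CT grounded
vhk_elevated = V_CATHODE - V_ELEV    # heater CT elevated to +46V

print(f"Vhk without elevation: {vhk_unelevated:.0f} V")  # 192 V
print(f"Vhk with elevation:    {vhk_elevated:.0f} V")    # 146 V
```

If those assumptions hold, elevating the heaters brings the follower's Vhk back under the limit, which is the whole point of the exercise.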
Could someone fill in some of these blanks?