Okay, looks like I can just run a resistor from HT to the cathode to provide the extra current to keep the LED conducting in a linear region. Sweet. That's easy.
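If I scratch out some example numbers, it seems manageable. A rough sketch below, with assumed values (250 V HT, a 33 k bleed resistor, ~1.8 V for a red LED, and a made-up 1 mA of tube cathode current) just to see the order of magnitude of the extra current:

```python
# Rough sizing sketch for the bleed resistor from HT to cathode.
# All values here are example/assumed numbers, not measurements.

HT = 250.0        # HT (B+) rail, volts
V_LED = 1.8       # approximate red LED forward drop, volts
R_BLEED = 33e3    # bleed resistor from HT to cathode, ohms
I_TUBE = 1.0e-3   # assumed tube cathode current, amps (example only)

i_bleed = (HT - V_LED) / R_BLEED   # extra current injected by the resistor
i_led_total = i_bleed + I_TUBE     # all of it returns through the LED

print(f"Bleed current:     {i_bleed * 1e3:.2f} mA")
print(f"Total LED current: {i_led_total * 1e3:.2f} mA")
```

With those numbers the resistor adds roughly 7.5 mA, which should keep the LED sitting well up on its curve regardless of what the tube is doing.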
One of my biggest problems is going to be undoing the bent pins. Since a couple of these tubes are grid-leak biased, they bent the cathode pin over to the socket's center ground post and soldered it there. So that's going to be loads of fun to undo, and I'm not even entirely sure how to do it. I've got a solder sucker and solder wick, but my solder wick sucks (can someone recommend a good solder wick?).
This is actually an area where my knowledge of electronics falls short. When I do the LED bias, the LED is going to bring the cathode voltage up by whatever its forward drop is, about 1.8 V or so. So if I put a 33K resistor from, say, 250V to the cathode, why is it that it supplies current but doesn't change the voltage? Is it the LED that forces the voltage to stay at 1.8V or whatever it is?
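If I understand it right, it's the LED's exponential I-V curve doing the work: a big change in current only moves the forward voltage by a hundred millivolts or so, so it behaves almost like a low-impedance voltage reference. A quick sketch with the Shockley diode equation (the saturation current and ideality factor below are made-up but plausible values for a small red LED, not datasheet numbers):

```python
import math

# Illustrative only: why the LED "pins" the cathode voltage.
# I_S and N are assumed values for a generic red LED.

I_S = 1e-18     # assumed saturation current, amps
N = 1.9         # assumed ideality factor
V_T = 0.02585   # thermal voltage at ~25 C, volts

def led_vf(i_fwd):
    """Forward voltage at a given forward current (Shockley equation)."""
    return N * V_T * math.log(i_fwd / I_S + 1.0)

for i_ma in (1.0, 2.0, 5.0, 8.0, 10.0):
    vf = led_vf(i_ma * 1e-3)
    r_dyn = N * V_T / (i_ma * 1e-3)   # dynamic resistance at that current
    print(f"{i_ma:5.1f} mA -> Vf = {vf:.3f} V, r_d ~ {r_dyn:.1f} ohms")
```

With those assumed parameters, going from 1 mA to 10 mA only moves Vf from about 1.70 V to about 1.81 V, and the dynamic resistance is just a few ohms. So the 33K resistor sets how much extra current flows, while the LED itself holds the cathode at roughly the same voltage no matter how that current splits.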