I've noticed that many popular designs and modern clone-based amps seem to use the same ubiquitous NFB resistor ratios regardless of other circuit aspects. Examples:
Fender Deluxe Reverb 820/47 at 8 ohms 20W
Fender Vibrolux 820/47 at 8 ohms 50W
Fender Bandmaster 820/100 at 4 ohms 50W
Fender Twin Reverb 820/100 at 4 ohms 100W
What I see is that the higher-wattage amps are getting more NFB with these ratios. And between the two power-tube models, the move from 820/47 at 8 ohms to 820/100 at 4 ohms is a straight 2x change in the divider, when I'd think it should be closer to 1.5x to compensate for the lower voltage on the 4-ohm tap, right?
And this holds all over the BF and SF era Fender amps. It always seems to be 820/47 or 820/100, and the amps listed above all use the same AB763 circuit otherwise.
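To put rough numbers on that hunch, here's a quick sanity check. This is a sketch, assuming the usual Fender NFB topology (series resistor from the speaker tap, tail resistor to ground, feedback taken across the tail) and equal rated output power on each tap; the function name `fb_fraction` is just for illustration:

```python
import math

def fb_fraction(r_series, r_tail):
    """Fraction of the speaker-tap voltage fed back (simple divider model)."""
    return r_tail / (r_series + r_tail)

# Fender-style divider values from the schematics above
f8 = fb_fraction(820, 47)     # amps on the 8-ohm tap
f4 = fb_fraction(820, 100)    # amps on the 4-ohm tap
print(f"820/47  feedback fraction: {f8:.3f}")   # ~0.054
print(f"820/100 feedback fraction: {f4:.3f}")   # ~0.109

# For the same output power P, tap voltage is V = sqrt(P * Z),
# so the 4-ohm tap carries sqrt(2) ~ 1.41x less voltage than the 8-ohm tap.
voltage_ratio = math.sqrt(8 / 4)
print(f"8-ohm vs 4-ohm tap voltage: {voltage_ratio:.2f}x")

# Net feedback voltage of the 820/100 @ 4-ohm case vs the 820/47 @ 8-ohm case
net = (f4 / f8) / voltage_ratio
print(f"Net feedback change: {net:.2f}x")
```

So the divider fraction roughly doubles, but the lower tap voltage claws back a factor of about 1.41, leaving the 820/100 amps with roughly 1.4x the feedback voltage, not 2x.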
And then there are Marshall's classic designs, which use either 100k/4.7k or 47k/4.7k on different taps depending on gain differences in the preamp circuits. But they seem to have totally missed the 50W-to-100W difference. For an equivalent era and amp type (like a 50W '70 "Super Lead" compared to a 100W '70 "Super Lead"), they usually went with the same secondary tap for both models, but the 50W got the 100k series resistor and the 100W got the 47k??? So that would be A LOT more NFB for the 100W models.
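The same back-of-the-envelope check can be applied to the Marshall values. Again this is a sketch under the same assumed divider model (series resistor from the tap, 4.7k tail), with both amps taken off the same secondary tap at full rated power:

```python
import math

def fb_fraction(r_series, r_tail):
    """Fraction of the output-tap voltage fed back (simple divider model)."""
    return r_tail / (r_series + r_tail)

# Marshall-style values from the question: 4.7k tail, 100k or 47k series
f_50w  = fb_fraction(100e3, 4.7e3)   # 50W model
f_100w = fb_fraction(47e3, 4.7e3)    # 100W model
print(f"50W  (100k/4.7k): {f_50w:.4f}")
print(f"100W (47k/4.7k):  {f_100w:.4f}")

# Off the SAME secondary tap, the 100W amp swings sqrt(2) ~ 1.41x MORE
# voltage at full power (V = sqrt(P * Z)), so the differences stack:
net = (f_100w / f_50w) * math.sqrt(100 / 50)
print(f"Net feedback voltage, 100W vs 50W: {net:.2f}x")
```

If that model is right, the 100W model ends up with nearly 3x the feedback voltage of the 50W, which is what makes the choice look backwards.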
Not being very technical, I have to ask about this. Is it arbitrary? Or maybe even a technical error? Or is there a reason for the higher NFB ratio on the higher-wattage amps?