Saw a video several months back where a gent claims you can measure the AC voltage at your speakers with a multimeter, measure the speaker's impedance in ohms, and then use Ohm's law to approximate the power (at any level of distortion you like).
I recently applied this technique to one of my amps, and while playing I could keep it right around 15 volts without too much trouble (sometimes higher, sometimes lower). So at least in terms of my meter, and for what I was playing (single notes in the key of B), the average was about 15 volts.
So using Ohm's law I would come up with 15 VAC / 8 ohms = 1.875 amps, and then 1.875 amps x 15 VAC = roughly 28 watts (with a highly distorted signal in this case).
Or just use the shortcut of squaring the voltage and dividing by the impedance: (15^2) / 8 = roughly 28 watts.
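For anyone who wants to plug in their own numbers, here's a minimal sketch of that same arithmetic in Python. It assumes the meter reading is an RMS voltage and uses the speaker's nominal impedance; the 15 V and 8 ohm figures are just my example values.

    def estimate_power(v_rms, impedance_ohms):
        """Approximate power via Ohm's law: P = V^2 / R.

        Assumes the meter reads true RMS voltage and the speaker's nominal
        impedance is close to its actual impedance at the frequencies being
        played -- both rough approximations.
        """
        current_amps = v_rms / impedance_ohms   # I = V / R
        power_watts = v_rms * current_amps      # P = V * I  (same as V^2 / R)
        return current_amps, power_watts

    # My example reading: ~15 VAC across an 8-ohm speaker
    amps, watts = estimate_power(15.0, 8.0)
    print(f"{amps:.3f} A, {watts:.1f} W")       # prints: 1.875 A, 28.1 W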
One thing I got from all this is that wattage is a moving target, and depending on what you play it's quite variable, not to mention the efficiency of the speakers and cabinet you use. So SPL can vary tremendously regardless of what the wattage spec 'says' the amp should be doing.
Is there any validity to this method of measurement as an approximation, or is it all rubbish unless you measure with a scope, a dummy load, and a specific test frequency?
Thanks for any input!
Here's the video from Gerald Weber.
https://www.youtube.com/watch?v=1b2jQWK8xlQ