Does amplifier wpc @ 4 or 2 ohm really matter at "low" volumes?

Scorpius

New Member
This is the question I've tried to find an answer for...

There is much talk about how a good amplifier design should ideally double its wattage as the load impedance halves: for example, 100 wpc into 8 ohms should ideally become 200 wpc into 4 ohms. However, very few amplifiers actually do this; even most high-end brands like Luxman, Accuphase, etc. fail to achieve it in most of their amp designs.

Which got me thinking: is it really, really necessary for an amplifier to do this? People usually listen to their music at a few watts at most, sometimes briefly reaching perhaps 10-50 watts.

Let's make an example:
All three given amplifiers from the same brand are identical (aside from different toroidal transformers and secondary capacitors), and the minimum impedance of the speakers being listened to is ~4.5 ohms.

A: 100wpc to 8 ohm, 150wpc to 4 ohm (very common)
B: 100wpc to 8 ohm, 200wpc to 4 ohm (very rare)
C: 100wpc to 8 ohm, 100wpc to 4 ohm (imaginary example)

Should these amps deliver an identical musical experience, since 100 wpc into 8 or 4 ohms is never exceeded?

Or am I getting this wrong?
 
First--scenario C is not imaginary--it is more common than you think among cheaper amps with limited power supplies. I have even seen amps rated lower at 4 ohms than at 8 ohms.

Wattage is a function of resistance, voltage and current--refer to Ohm's Law. Play with some values here https://testguy.net/content/266-Ohm-s-Law-Watt-s-Law-Cheat-Sheet and you will see how the current requirement increases, to maintain a given watt output, as resistance drops.

So basically, the answer is no--it will not be an identical musical experience, because a higher current demand is being placed on the power supply at any level of operation, and the power supply may or may not be up to the task. In order to maintain a given voltage and corresponding watt output, the current will double as resistance is halved.
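That relationship is just Ohm's and Watt's law algebra; a minimal sketch, not tied to any particular amplifier:

```python
import math

# RMS voltage needed for 100 W into 8 ohms: V = sqrt(P * R)
voltage = math.sqrt(100 * 8)  # ~28.3 V

# Hold that voltage constant and halve the load resistance:
for r_load in (8, 4, 2):
    current = voltage / r_load       # I = V / R
    power = voltage ** 2 / r_load    # P = V^2 / R
    print(f"{r_load} ohms: {power:.0f} W, {current:.2f} A")
```

At a fixed voltage, each halving of the load doubles both the power and the current the supply must deliver.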
 
Nicely put. Plus add different amplifier circuit types into the mix.
 
Generally speaking, a direct-coupled amplifier will generate more power into 4 ohms than 8 ohms but only a very few have the ability to double the power. This is only part of the story however, as the lower impedance is also more taxing on the output circuits which can be destructive when so driven for extended periods of time. This will be dictated by the quality and engineering that went into the unit.

Conversely, there are indeed other amplifiers designed to provide identical amount of power into a wide range of loads- this is not a hypothetical scenario. Such units usually feature large output transformers to match the output to the load which helps maintain cooler operation and helps to extend product life and reliability. Output transformers add considerable weight and expense however.

As to the effect on sonic performance, none of these factors will have much of an impact so long as each unit is driven within its safe operational limits.
 
Are you absolutely sure? Do you want to lock in your answer? ;)

See, I made that example to match exactly what you just described. I know Ohm's law etc., but in my example the power capabilities were never exceeded in any scenario. If the maximum peak of the music was 50 W @ 8 ohms, it stayed under 100 W @ 4 ohms because the minimum impedance of the speakers was 4.5 ohms.

Let's make it simpler: let's say the listener in the previous example listens to music at ~1 W (a very common situation). Do all the amplifiers in the example sound identical?
 
First--all amplifiers--even those with identical specifications--do not sound identical. Second, music is not a static sine wave test signal at a set frequency and amplitude, so a single cymbal crash or kick drum hit can be 100X that baseline 1 W "nominal" listening level. Third, speakers are not a fixed "dummy load"--their impedance rating is a "composite," so even an 8 ohm speaker may have dips in impedance far lower than 8 ohms at certain points in the frequency response curve.

So, yes, I'll stick to my answer--components (mainly the power supply and output transistors) can become "stressed" even at lower listening levels--higher levels just increase the amount of stress.
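The 100X-peak point can be made concrete with a crest-factor calculation; the 20 dB figure below is an illustrative assumption, not a measurement:

```python
def peak_power(avg_power_w: float, crest_db: float) -> float:
    """Peak power implied by an average level plus a power crest factor in dB."""
    return avg_power_w * 10 ** (crest_db / 10)

# A "quiet" 1 W average with 20 dB of dynamics demands 100 W transient peaks.
print(peak_power(1.0, 20.0))  # 100.0
```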
 
Ah, that's what I think too. However, it seems that doubling the wattage of a given amplifier is quite costly and apparently can even double its price. Let's take Luxman for example:
Power amplifier M-700u: 120 W into 8 ohms, 210 W into 4 ohms, weight 27.5 kg, price 8000€
Power amplifier M-900u: 150 W into 8 ohms, 300 W into 4 ohms, weight 48 kg (!!!), price 15000€ (!!!)

So you pay almost double for a ~1 dB increase in music performance @ 4 ohms... geez o_O
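The ~1 dB figure is easy to check with the standard 10·log10 power ratio; a quick sketch using the ratings quoted above:

```python
import math

def db_gain(p_new_w: float, p_old_w: float) -> float:
    """Loudness capability difference between two power ratings, in dB."""
    return 10 * math.log10(p_new_w / p_old_w)

print(db_gain(150, 120))  # ~0.97 dB more at 8 ohms (M-900u vs M-700u)
print(db_gain(300, 210))  # ~1.55 dB more at 4 ohms
```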
 
Then all other things are not equal.

On paper--as in a technical spec sheet with typically published specs, you could theoretically find two identically spec'd amps, but I have to agree, not all amps are designed and created equal, even if they meet the same spec numbers.
 
Well, you have just identified with unmistakable clarity the meaning of the "law of diminishing returns" as it pertains to hi-fi.
 
All other things equal - the answer is no.
Can you clarify why they would sound different, and how different? This question has been troubling me for quite some time, so it would be nice to get explained, scientific answers. :)

Could you clarify more deeply how it can sound different? In your answer you bypassed the scenario I described: the maximum (even impulse) output never exceeds 1 W at any time during listening. And yes, speakers are not a dummy load; I specifically stated that the minimum impedance (i.e. the "dip" you described) was 4.5 ohms.

And you mention that the main PSU and transistors can become "stressed," which apparently reduces sound quality. Can you point to any references about this phenomenon? Has it been measured? Or discovered in various listening tests? Links please :)
 
I cannot comment on what specs can or cannot reveal in terms of sonic performance. The point is to isolate all other variables except an amplifier's maximum output capability in response to the OP's question.
 
Well, this gets rather deep into the weeds, where there are really no concrete answers. On the question of sonic differences between units of differing power ratings, some contend that a low-power amplifier will have better response linearity at low output levels (i.e., one watt +/-) than, say, a 200 watt unit operated at the same level. This is the idea behind the "first watt" principle advanced most notably by Nelson Pass.
 
Indeed, this was my point. Whether it happens in real life or not, I narrowed the question down to this so I can understand the phenomenon more clearly.

Ok, this is pretty interesting, I'll have to google this!
Why hasn't this "First Watt" principle been tackled by designing Class AB amps to have a larger region of Class A operation? Do you reduce the power in the Class B state by doing so?

My no-feedback SET tube Class AB amp is designed in such a way that it operates in Class A up to 3 W, and in Class B from 3 W to 15 W (@ 8 ohms). Usually Class AB amplifiers have Class A operation only for a few milliamps, which I find very weird...
 
Then let's go with the OP's example with the Luxmans above:

120 W @ 8 ohms / 210 W @ 4 ohms vs. 150 W @ 8 ohms / 300 W @ 4 ohms, and just for grins, we'll throw in my Krell KMA 160 monoblocks: 160 W @ 8 ohms / 1280 W @ 1 ohm

As per our handy-dandy calculator:

120w/8 ohm = 31 V/3.9 A
210w/4 ohm = 29V/7.25A

150w/8 ohm = 34.6V/4.3A
300w/4 ohm = 34.6v/8.7A

160w/8 ohm = 35.8V/4.5A
1280w/1 ohm = 35.8V/35.8A

That is the basic Ohm's Law calculation. Now, even at lower listening levels (ie, less wattage) the same effect occurs, so the limitations of the power supply and current handling capabilities of the output transistors come into play.

Let's just say (as an example) the bridge rectifier of the power supply can only handle 4A--and we back-calculate
4A/8 ohm = max output of 128w with a required rail voltage of 32V
4A/4 ohm = max output of 64w with a required rail voltage of 16V

Now let's go the other way--let's just say (as an example) the maximum rail voltage supplied by the power supply is 46V--and we (again) back-calculate
46V/8 ohm = max output of 264.5w and requires 5.75A current
46V/4 ohm = max output of 529w and requires 11.5A current
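The two back-calculations above (current-limited vs. voltage-limited maximum output) can be sketched as:

```python
def max_power_current_limited(i_max_a: float, load_ohm: float) -> float:
    """Maximum output when the rectifier caps the deliverable current."""
    return i_max_a ** 2 * load_ohm       # P = I^2 * R

def max_power_voltage_limited(v_max_v: float, load_ohm: float) -> float:
    """Maximum output when the rail voltage caps the output swing."""
    return v_max_v ** 2 / load_ohm       # P = V^2 / R

print(max_power_current_limited(4, 8))   # 128 W (needs 32 V)
print(max_power_current_limited(4, 4))   # 64 W  (needs 16 V)
print(max_power_voltage_limited(46, 8))  # 264.5 W (needs 5.75 A)
print(max_power_voltage_limited(46, 4))  # 529 W  (needs 11.5 A)
```

Note how a current-limited supply delivers less power into the lower impedance, while a voltage-limited supply delivers more.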
 
In the early days of hi-fi, the advantage of Class A was lower distortion, but it came at the price of operating efficiency--namely the amount of heat generated at low output levels. I once owned a similar "hybrid" Class A type amplifier but I didn't like it for just that reason--it would become extremely hot just idling (no signal) and had to be played moderately to loud to keep it cool. Heat is a well-known enemy of electronics and drastically reduces the life of a unit.

Some still like Class A, but the simple fact is that this compromise is no longer necessary in order to have the best of both worlds--namely high operating efficiency and low distortion.
 
The reason a Class A or AB amplifier will not quite double its output when driving 4 ohms instead of 8 ohms is not related to Ohm's law. It is a function of power supply droop at higher current levels, falling output-device hFE at very high currents, and the effect of the emitter resistors in the output chain. These need to be there, and have as low a resistance as possible, for stability reasons. This is why an amp rated at 100 watts @ 8 ohms will probably produce 192 watts @ 4 ohms, not 200 watts--much to the chagrin of the salesmen, who love to see nice big round numbers that sell well.
It is possible to build an amp that will double its output power into half the load impedance, but this would need a fully regulated power supply that would likely be far more complex than the amplifier circuit it is serving. It would also, by default, be very expensive. One of the designs by John Linsley Hood, for example, used 22 transistors in the power supply for each channel.
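That 100 W into 8 ohms vs. ~192 W into 4 ohms behavior drops out of a very simple model: treat the amp as an ideal voltage source behind a small series resistance (emitter resistors plus supply droop lumped together). The ~0.17 ohm value below is an illustrative assumption chosen to match the round numbers in the post, not a measured figure:

```python
import math

def output_power(load_ohm: float, src_v_rms: float, src_ohm: float) -> float:
    """Power delivered to a load by a source with internal series resistance."""
    i = src_v_rms / (src_ohm + load_ohm)
    return i ** 2 * load_ohm

R_SERIES = 0.17  # lumped emitter resistance + supply droop (assumed)

# Choose the source voltage so the amp makes exactly 100 W into 8 ohms.
e = math.sqrt(100 * 8) * (8 + R_SERIES) / 8
print(output_power(8, e, R_SERIES))  # 100.0 W
print(output_power(4, e, R_SERIES))  # ~192 W, not the "ideal" 200 W
```

Only a fraction of an ohm of effective source resistance is enough to shave ~8 W off the "doubled" figure, which is why a stiff, regulated supply gets expensive fast.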
 
Let's see if I understood the logic. You claimed:
Now, even at lower listening levels (ie, less wattage) the same effect occurs, so the limitations of the power supply and current handling capabilities of the output transistors come into play.

Let's play more with our calculator:
Let's actually listen at lower levels, and even imagine that the maximum rail voltage is 46V and the bridge rectifier can only handle 4A, but we listen @ 1 W (largest instantaneous peak):

1w/8ohm = 2.83V/0.35A
2w/4ohm = 2.83V/0.71A

So our bridge rectifier and trafo are practically sleeping -> sonic performance should be identical.

So as long as we don't exceed the rail voltage of the trafo and/or the current capability of the bridge rectifier, there should not be any difference in sonic performance.

Is this correct?
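Plugging the 1 W scenario into those same hypothetical supply limits shows how much margin is left; a sketch of the poster's arithmetic:

```python
import math

RAIL_V, RECT_A = 46.0, 4.0  # the hypothetical supply limits from above

def demands(power_w: float, load_ohm: float):
    """(RMS voltage, RMS current) needed for a given power into a load."""
    return math.sqrt(power_w * load_ohm), math.sqrt(power_w / load_ohm)

v8, i8 = demands(1, 8)  # ~2.83 V, ~0.35 A
v4, i4 = demands(2, 4)  # ~2.83 V, ~0.71 A
print(f"voltage used: {v8 / RAIL_V:.1%}, current used: {i4 / RECT_A:.1%}")
```

The same voltage swing into the 4 ohm dip draws twice the current, but both figures sit far below the assumed 46 V / 4 A ceilings.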
 
I thought I would put a second comment on here about this. I run a fully active system with the power amps turned down to about 50% input level. Why? Remember that we are talking hi-fi here, not PA systems. We should not be taking amplifiers (or any other component) anywhere near their maximum capacity. Any amp running at maximum will be (a) struggling and (b) distorting/clipping--neither of which is good. My own general rule of thumb is: if you need to take a hi-fi amp above half volume for your desired listening level, you need a bigger amp. We should always have sufficient headroom available to handle signal transients without clipping, and that actually implies quite a low average power level (probably just a few watts).
 