steveUK
With bipolar outputs, if you watch a distortion analyzer, you adjust the bias until the distortion falls away, usually at around 10 mV. The idea is to get both complementary pairs conducting so the switching distortion goes away.
Some feel there is no benefit to more quiescent current once both transistors are on.
Douglas Self writes that a range of about 43-52 mV is enough for the commonly used emitter resistors, i.e. 0.22, 0.33, 0.47 and 1 ohm.
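For concreteness, here is a quick back-of-envelope sketch (in Python, with illustrative values, not anything from a service manual) of the arithmetic behind those numbers, assuming the bias voltage is read across a single emitter resistor, so Iq = Vq / Re:

```python
# Quiescent current implied by a bias voltage measured across one
# emitter resistor: Iq = Vq / Re. All values are illustrative.
emitter_resistors = [0.22, 0.33, 0.47, 1.0]   # ohms, the common values named above

for re_ohms in emitter_resistors:
    for vq_mv in (10, 43, 52):                # bias voltages from the thread
        iq_ma = vq_mv / re_ohms               # mV / ohm = mA
        print(f"Re = {re_ohms:4.2f} ohm, Vq = {vq_mv:2d} mV -> Iq = {iq_ma:6.1f} mA")
```

The point of the table this prints is that the same millivolt reading means a very different quiescent current depending on which emitter resistor the designer used, which is why Self quotes a voltage range per resistor value rather than a single current.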
But there are obviously differing opinions among amp designers.
Hope this helps.
Thanks for the info. Well, 10 mV or 43-52 mV it is then. Or, if one decides to stick to the Pioneer spec, 100 mV.
Question: if one is using the receiver in a situation where not much volume/power is needed, and the ambient temperature is not high, then setting a higher quiescent current (up to 100 mV) gives less distortion, is that right? Notwithstanding the view that "some feel there is no benefit to more quiescent current if both transistors are switched on."

I'm thinking that Pioneer, amidst the receiver wars raging at the time, may have used a higher value to reduce the distortion figure, even just a tad, to gain spec points in the war. Could that be the crux of the matter, even though they knew that in doing so the receiver would run on the hot side?
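To see why the 100 mV setting would run hot, here is a rough sketch of idle dissipation; the rail voltage and emitter resistor below are assumed values for illustration, not the Pioneer's actual figures. At idle the quiescent current flows from the positive rail through both output devices to the negative rail, so each channel dissipates roughly Iq times the rail-to-rail voltage:

```python
# Idle dissipation per channel for a complementary output stage.
# The quiescent current flows from +rail to -rail through both
# devices, so each channel idles at roughly P = Iq * (V+ - V-).
rail_v = 42.0          # assumed +/- supply rail (volts), illustrative only
re_ohms = 0.47         # assumed emitter resistor, illustrative only

for vq_mv in (10, 45, 100):           # bias settings discussed in the thread
    iq_a = vq_mv / 1000.0 / re_ohms   # quiescent current (amps)
    p_idle = iq_a * 2 * rail_v        # watts per channel at idle
    print(f"Vq = {vq_mv:3d} mV -> Iq = {iq_a * 1000:5.1f} mA, "
          f"idle dissipation ~ {p_idle:4.1f} W per channel")
```

On those assumed numbers, going from 10 mV to 100 mV raises idle dissipation about tenfold, which fits the observation that the receiver runs on the hot side while the distortion gain past the point where both devices conduct is, per the opinions above, small.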