Amplifier Sensitivity, Decibels, and You!

Most likely

zenith2134 said:
If you are shooting 2 volts into an input which was designed for 150mV, would the input be overloaded, meaning would distortion be produced?
Sounds like you're thinking about trying to feed a line level signal into a phono preamp. Bad idea on many levels. :no:

zenith2134 said:
I mean, there weren't many 2-volt sources in the seventies, now were there?
Well, if you exclude tuners and tape decks, you might be right.
 
Actually this is not what I'm attempting at all, since all of my 70s stuff lists tape inputs and Aux at 150mV, while all the built-in phono stages are rated at 2.5mV. Big difference there. I still highly doubt that tape decks and tuners were anywhere near 2 volts until the late 80s.
 
zenith2134 said:
Actually this is not what I'm attempting at all, since all of my 70s stuff lists tape inputs and Aux at 150mV, while all the built-in phono stages are rated at 2.5mV. Big difference there.
Well, since you failed to state whether that was a minimum or a maximum (which you DID imply), I think you should take a little of that heat yourself.

zenith2134 said:
I still highly doubt that tape decks and tuners were anywhere near 2 volts until the late 80s
Really? How about an example from the 60's?

http://home.indy.net/~gregdunn/dynaco/components/FM3/index.html

There are advantages to having been around when these things were new.
 
My question is - Wouldn't distortion rise when you exceed the maximum input voltage of the input? I would assume that the preamp diodes handling the input path would be driven a bit past their safe operating parameters.
 
zenith2134 said:
My question is - Wouldn't distortion rise when you exceed the maximum input voltage of the input?
Yes, but you need to define what that "maximum" voltage is. A sensitivity rating is not a maximum.

Using that 2.5 mV rating for a phono input: that's not a maximum rating, it's the input voltage at which a certain output level is reached, and the input can accept a much higher signal than that. Re-read your specs and you'll see a maximum for the phono input. If you exceed that maximum rating then you will most certainly get distortion.

Likewise, for those line level inputs, odds are that 150 mV figure is what's needed to attain a certain output level. If the signal goes higher than that, you simply turn the gain control down, so I guess you could say the input is basically unlimited.

Line level stages have the volume control on the amp to limit their input level.

The phono stage has no gain control on its input and it can easily be overdriven. The volume control on the amp comes after the phono preamp stage. That's why line-level signals fed in there can sound perfectly horrible (not to mention the RIAA curve).

Many source units allow adjustment of their output so the gain control on the amp stays pretty constant. Virtually all tuners and tape decks did back in the day.
 
zenith2134 said:
If you are shooting 2 volts into an input which was designed for 150mV, would the input be overloaded, meaning would distortion be produced?

I mean, there weren't many 2-volt sources in the seventies, now were there? I have been thinking about this lately. Can someone please clear this up for me? TIA.
zenith2134, I thought the maximum allowable input level would be easy to find from the amplifier's specifications - but I was wrong!
The specifications I have seen only give a maximum figure for the phono input, there is nothing at all for the line level inputs (tuner/tape/aux).

I've had a quick look at a few schematics to see how the line level inputs are typically handled.
Pioneer SX-737 Receiver and SA-9500 Amplifier : Signal goes through source selector switches, then to the balance control, then to the volume control.
Pioneer SX-850 Receiver : Signal goes through source selector switches, then to the balance control, then to a buffer amplifier operating from the 51-volt supply rail, then to the volume control.

If these units are typical of most vintage gear then there is nothing at all to worry about when connecting the 2-volt output of a CD player into the line level inputs of a vintage amplifier - simply exercise some restraint with the volume control and all will be well. :thmbsp:

- Richard B.
 
Oh okay, now I see. So just heed the warnings about setting gain and all is cool. Nice.

Guess I was thinking of more modern and cheaper gear where, if the input is too hot, the pre clips like crazy... But I digress: I gave away all of my equipment that does that.
 
got me thinking (dangerous stuff)

Well, this thread got me thinking about the whole 2V P-P and input sensitivity issue, and I was wondering: "If one were to take, say, a 5532 op-amp and whip up a conditioner circuit to chop that 2V P-P down to about 200mV P-P, wouldn't that balance out the gain structure of the receiver/power amp so you can really drive the outputs and give yourself some headroom?"

what do you guys think?:beer:
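For a rough feel of the numbers involved, here's a minimal sketch in Python of the simplest possible approach - a plain two-resistor divider rather than an op-amp stage (the 9k/1k values are purely illustrative, not a recommendation):

Code:
import math

def divider_output(v_in, r_top, r_bottom):
    # Output of a simple two-resistor voltage divider
    return v_in * r_bottom / (r_top + r_bottom)

def attenuation_db(v_out, v_in):
    # Level of v_out relative to v_in, in dB (negative = attenuation)
    return 20 * math.log10(v_out / v_in)

v_in = 2.0                        # 2V P-P from the source
r_top, r_bottom = 9000.0, 1000.0  # 10:1 divider - illustrative values only
v_out = divider_output(v_in, r_top, r_bottom)
print(f"{v_out * 1000:.0f} mV P-P, {attenuation_db(v_out, v_in):.1f} dB")
# prints: 200 mV P-P, -20.0 dB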
 
Vintage Panasonic Stereo

I have a 1969 Panasonic AM/FM stereo receiver/amplifier with turntable.
Turning the volume knob on the receiver/amp creates extreme static. Is this due to age and/or dust? What is the best way to clean/correct it? Thanks.

Steve
Maine
 
Time for one of those long, boring semi-technical posts that no one here reads... :boring:


There have been various posts from people who have just acquired a "new" vintage amplifier, have connected it up to their speakers and fed it with a nice clean signal from a (relatively) modern CD player, and have commented along the lines of "I only turned the volume up to 10 o'clock and the whole house was shaking - boy that amp is powerful".
They never seem to consider that their amplifier might well be producing near to full output power even though the volume control is nowhere near maximum - a consequence of the sensitivity mis-match between vintage and modern equipment.

It seemed to me that it would be useful to go over a few basics regarding the decibel (dB) scale and how it relates to the sensitivity of the inputs on vintage amplifiers.


First a few basics about the dB scale (and a little bit of math - but nothing too difficult :) ) :

1. The decibel is a relative rather than an absolute measurement, i.e. it is used to measure the ratio of one signal to another. I am sure everyone is most familiar with its use in representing the signal-to-noise ratio of equipment.

2. Positive dB values mean that a signal is greater than the reference value (ratio greater than 1), negative dB values that it is less than the reference (ratio less than 1).

3. If a signal passes through a number of amplification (+dB) or attenuation (-dB) stages, then the overall gain is found by simply adding up the dB values of each stage. For example, assume a signal passes through components with the following typical gain : phono amplifier +60dB, pre amp -20dB, power amp +30dB : The overall gain is therefore +70dB.

4. To convert the ratio between two voltages, V1 & V2, to decibels we use the formula : dB = 20 * log(V1 / V2)
A doubling of voltage = +6dB.
Conversely, to convert a dB value to the ratio between two voltages use the formula : Voltage Ratio = 10^(dB value / 20)
(The symbol "^" means "to the power of").

5. To convert the ratio between two powers, P1 & P2 to decibels we use the formula : dB = 10 * log(P1 / P2)
A doubling of power = +3dB.
Conversely, to convert a dB value to the ratio between two powers use the formula : Power Ratio = 10^(dB value / 10)
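For anyone who likes to check the arithmetic, the formulas in points 4 and 5 drop straight into a few lines of Python (nothing here beyond what is written above):

Code:
import math

def voltage_ratio_to_db(v1, v2):
    # dB = 20 * log(V1 / V2)
    return 20 * math.log10(v1 / v2)

def power_ratio_to_db(p1, p2):
    # dB = 10 * log(P1 / P2)
    return 10 * math.log10(p1 / p2)

def db_to_voltage_ratio(db):
    # Voltage Ratio = 10^(dB / 20)
    return 10 ** (db / 20)

def db_to_power_ratio(db):
    # Power Ratio = 10^(dB / 10)
    return 10 ** (db / 10)

print(voltage_ratio_to_db(2, 1))   # doubling of voltage -> about +6dB
print(power_ratio_to_db(2, 1))     # doubling of power   -> about +3dB
print(db_to_voltage_ratio(60))     # +60dB phono gain    -> 1000x voltage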


So back to the original question : Just how far do we have to turn up the volume to get maximum output power from our amplifier?

Look at your amplifier's manual and find the sensitivity value for the input you are using. The value it gives is the input voltage required in order to produce maximum rated power when the volume control is set to maximum. If the input signal you are feeding to the amplifier is greater than the sensitivity value, then maximum output power will be produced before full volume on the control.

Let's work through a real example to make things easier to follow (using data for my Pioneer SX-1250 receiver and Marantz CD-65SE CD player - typical of many vintage amp / modern CD player combinations) :

Input sensitivity of SX-1250 "Aux" (and "Tape") inputs : 150mV
Output voltage from CD-65SE at maximum signal level : 2000mV (2V)
So the CD player output signal is 20 * log(2000 / 150) = +22dB higher than that required for full output power from the SX-1250.

What this means is that the receiver's full power of 160 watts will be produced with the volume control set to -22dB (referenced to maximum = 0dB). Fortunately, in common with much high-end equipment, the SX-1250 has a volume control marked in -dB, making it easy to see where this occurs.

Perhaps the most "surprising" thing to come out of this calculation is just how little the volume needs to be turned up to get full output : -22dB on the SX-1250 corresponds to somewhere between the 11 and 12 o'clock positions.
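To repeat the calculation with your own equipment, the whole worked example boils down to a couple of lines. The figures below are the SX-1250 / CD-65SE values quoted above; substitute your own sensitivity and source output numbers:

Code:
import math

amp_sensitivity_mv = 150.0   # input needed for full rated power (volume at maximum)
source_output_mv = 2000.0    # maximum output of the source

excess_db = 20 * math.log10(source_output_mv / amp_sensitivity_mv)
print(f"Source is {excess_db:.1f} dB hotter than needed")
print(f"Full rated power arrives at roughly -{excess_db:.0f} dB on the volume control")
# Source is 22.5 dB hotter than needed
# Full rated power arrives at roughly -22 dB on the volume control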


So the next time someone tells you how loud their system is even at low "volume" settings (and implying that it would be ten times louder if turned up fully), just pause for a minute before you get too impressed and instead consider if they may have an amplifier/source sensitivity "mis-match". :scratch2:


Finally, we can apply the same calculations to the use of graphic equalizers and bass / treble / loudness controls. If +3dB of boost is applied at some frequency, then the power required at that frequency is doubled. If +10dB of boost is applied, then the power required increases by a factor of 10. So applying high levels of equalization (for example to compensate for the falling bass response of a speaker) massively increases both the power requirements of the amplifier and the handling capacity of the speakers.
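A quick numerical check of that last point, using nothing but the power formula from point 5:

Code:
for boost_db in (3, 6, 10):
    power_ratio = 10 ** (boost_db / 10)   # Power Ratio = 10^(dB / 10)
    print(f"+{boost_db}dB boost -> {power_ratio:.1f}x the amplifier power at that frequency")
# +3dB  -> 2.0x
# +6dB  -> 4.0x
# +10dB -> 10.0x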

- Richard B.

OK I understood some of it. My maths is ok (though I'm not sure what the * represents). May I ask a few simple (stupid?) questions?

1: "with the following typical gain : phono amplifier +60dB, pre amp -20dB, power amp +30dB : The overall gain is therefore +70dB" -- why is there a reduction in voltage in the pre-amp?

2: What should I look for in an integrated amp (on the vintage second-hand market) to be reasonably sure I can put a CD or DVD player through it safely? Is there any way of knowing what the sensitivity of the amp is if there is no manual?

3: If there is a mismatch, what can one do about it? Is there a way of reducing the signal voltage from the modern gear?
 
alright, I'll try, here goes;
1) The * indicates "times", or multiply.
2) The preamps usually do not have a gain stage, so they only "absorb" power, and do not "produce" any.
3) No, unless you have a schematic and/or can do the math, you won't know the mismatch amounts - you can, however, measure it.
4) Yes, that is what Bogie was referring to (...a direct box...). However, due to the nature of the circuitry in general, some people may notice a "change" in the sound.

Help?? Hope so...:thmbsp:
 
Excumbrian, sorry for the delay in responding to your questions, I'd just like to expand a little bit on the answers from klm1 above.

Under "normal" operating conditions the output from a pre-amp will usually be lower than the input. For example, if the input signal is the typical 2volt maximum from a CD player and the output is feeding a power-amp with an input sensitivity of 1volt (for full power), then listening at anything less than maximum volume involves the signal level being reduced (maybe by -20dB to -40dB at average listening levels, depending of course on speaker efficiency and personal preference).

Some very high quality pre-amps are purely "passive" devices, i.e. they consist only of source selection switches and a volume control. By their very nature the output must always be lower than the input.

Usually the higher voltage from the modern gear does not cause any problem with older amplifiers. If the volume control has to be set so low that it's difficult to accurately adjust it, then it's always possible to fit in-line attenuators between the source and the amplifier.

Hope this helps to clarify things for you, if not then don't hesitate to ask again. :yes::yes:

- Richard B.
 
Richard B,

First of all many thanks for this thread!
Now, I think I understand a little this voltage issue.

I have in my system :

1. Technics SH-X1000 DAC with output voltage 2.5 Vrms and impedance 600 ohms

2. Yamaha preamp CX-1000 with 150mV input sensitivity and 1.5V pre out

3. Yamaha amp with 1.62V input sensitivity and 60 kOhm impedance.

I understand that there is a perfect match between the pre and the amp, and a very big mismatch between the DAC and the pre/amp.
That's the reason why I only turn the volume up a little, to about 8 o'clock, and the room is overloaded with sound!
Am I right?

Should I simply remove the active preamp and use a high-quality passive controller?

Thanks,
 
Re the question about 'back in the 70s 150mV input sensitivity was the norm, now we have 2 volts' or something like that: CDs have much greater dynamic range than records or tape could ever cope with, therefore the headroom (the level above 'normal' before the onset of clipping in the input stages) must be able to cope with this range. In theory there is no noise from a CD, so the dynamic range could be over 90dB, as good as your modern pre-amps and probably a lot better than the room that you are listening in!
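As a rough numerical check of that figure, assuming standard 16-bit CD audio:

Code:
import math

bits = 16                                    # standard CD audio word length
dynamic_range_db = 20 * math.log10(2 ** bits)
print(f"{dynamic_range_db:.1f} dB")          # about 96.3 dB - comfortably over 90dB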
 