One thing I've noticed with pretty much all vintage tube amps is that they have an input capacitor on the RCA jack to block DC. However, in Fisher amplifiers, the value of that capacitor seems rather small compared to other tube gear. For example: the Fisher 30A series uses a 0.022uF cap on the input (into a 500kohm input impedance), while a Dynaco ST35 uses 0.1uF into an identical 500kohm input impedance. Is there any potential harm (not magic smoke, but sound quality-wise) in increasing the value of the input cap from 0.022uF to 0.1uF? From what I understand, the cap and input impedance together act as a high-pass filter, so the smaller value actually decreases the bandwidth of the amplifier on the bottom end.
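For reference, here's a quick sketch of the -3 dB corner frequencies for the two cap values, assuming the 500kohm input impedance is purely resistive (a simplification; the real input impedance varies a bit with frequency):

```python
import math

def hp_corner_hz(r_ohms: float, c_farads: float) -> float:
    """-3 dB corner of a first-order RC high-pass: f = 1 / (2 * pi * R * C)."""
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

# Fisher 30A: 0.022 uF into 500 kohm -> corner around 14.5 Hz
print(round(hp_corner_hz(500e3, 0.022e-6), 1))

# Dynaco ST35: 0.1 uF into 500 kohm -> corner around 3.2 Hz
print(round(hp_corner_hz(500e3, 0.1e-6), 1))
```

So the Fisher's stock cap is already passing essentially the full audible band; the larger cap just pushes the roll-off further below audibility.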
Thanks!
-D