Damping Factor Debunked from Linea Research – White Paper

What is amplifier Damping Factor?

The damping factor specification for an amplifier is simply the ratio between the connected load impedance and the amplifier's output impedance. In order to give a high number for their specifications, manufacturers generally use an 8 Ohm load for the measurement. For example, from a quoted damping factor figure of 200 you can deduce that the amplifier has an output impedance of 8/200 = 0.04 Ohms, which, relative to the other resistances in the system, is pretty low.
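To make the relationship explicit, here is a minimal Python sketch of that arithmetic (an illustration added here, not part of the original paper), assuming the conventional 8 Ohm reference load:

    # Damping factor and the amplifier output impedance it implies,
    # assuming the conventional 8 Ohm reference load for the specification.
    nominal_load = 8.0        # Ohms, reference load used for the spec
    damping_factor = 200.0    # quoted amplifier damping factor
    output_impedance = nominal_load / damping_factor
    print(f"Implied amplifier output impedance: {output_impedance} Ohms")  # 0.04 Ohms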

Some history

Damping factor has its origins in the late 1960s hi-fi world, when amplifier manufacturers were trying to come up with a specification that made their new transistor-based products appear better than those using valves (tubes for our friends in North America). Because they can use relatively large amounts of global negative feedback and do not need an output transformer, transistor amps can have an inherently lower output impedance and hence a higher damping factor number than those based on valves. Job done, from a specification point of view at least.

The issue

Don't be misled into believing that a high published damping factor equals better speaker control, because it won't be realisable in the real world. The reason for this is that, counter to intuition and decades of marketing, the amplifier can have little influence on the figure. This is because when calculating the damping factor of a system, the DC resistance of the cabling and the driver must be included.

Please explain …

For the moment let's concentrate solely on the effect that the driver has on a system's damping factor. As an example, if we take the typical DC resistance of an 8 Ohm driver to be about 80% of its nominal impedance (the precise numbers here do not matter much) and use an amplifier with the, for a modern professional amplifier, incredibly high output impedance of 1 Ohm (which corresponds to an amplifier damping factor of only 8), we get:

Driver DC resistance = 8 x 0.8 = 6.4 Ohms; adding the 1 Ohm amplifier output impedance gives 7.4 Ohms. Dividing the driver's 8 Ohm nominal impedance by this 7.4 Ohms gives a system damping factor of a little less than 1.1, which is clearly not a high number.

You might think that this is because of the very high amplifier output impedance (and hence poor amplifier damping factor), but that is not the case. If we do the calculation again, this time for an amplifier with a 0.1 Ohm output impedance, corresponding to an amplifier damping factor that is still only 80, we get a system damping factor of 8/(6.4+0.1) = 1.2, not much of an 'improvement'.

Even when an amplifier has the minimum possible output impedance of zero Ohms, which by the way corresponds to an amplifier damping factor of infinity (!), we can still only achieve a system damping factor of 8/(6.4+0) = 1.25, which is hardly different.
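The same sums can be written as a short Python sketch (an illustration added here, not part of the original paper), using the assumed 6.4 Ohm driver DC resistance:

    # System damping factor barely moves as the amplifier output
    # impedance falls from 1 Ohm all the way to zero.
    nominal_load = 8.0                  # Ohms, driver's nominal impedance
    driver_dc_resistance = 8.0 * 0.8    # assumed 80% of nominal = 6.4 Ohms
    for amp_output_impedance in (1.0, 0.1, 0.0):
        system_df = nominal_load / (driver_dc_resistance + amp_output_impedance)
        print(f"Amp Zout = {amp_output_impedance} Ohms -> system damping factor = {system_df:.2f}")
    # Prints roughly 1.08, 1.23 and 1.25 - hardly different.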

Really?

After many years of reading that an amplifier with a high damping factor is a good thing, and therefore one with twice that must be better, I realise that the above is difficult to believe. But it is true: the amplifier can have little influence on a system's performance from a damping factor point of view.

The consequence of this is that any hand waving about a particular amplifier holding a speaker in a 'vice-like grip' due to its 'incredible damping factor' is just plain nonsense, because the damping in any real system is dominated by other resistances, particularly that of the driver, which will swamp the output impedance of any sensible amplifier many, many times over.


I am still not convinced …

Supposing that you still take issue with the idea that the speaker dominates the system damping factor, OK, let's ignore it! There are other resistances in the circuit that must also be considered. For example:

Amplifier PCB to internal wire crimp: 2mR (0.002 Ohms)
Internal wire to rear panel Speakon socket: 2mR
Speakon plug to customer speaker cable: 2mR
Customer speaker cable (say 25m of 4mm2): 210mR
Customer cable to Speakon plug: 2mR
Cabinet Speakon socket to internal wire: 2mR
Internal wire to driver crimp: 2mR

Adding these gets us to 222mR or 0.222 Ohms, so even ignoring the driver and the amplifier, the maximum possible system damping factor assuming an 8 Ohm load is only 8/0.222 = 36. With a 4 Ohm load it reduces to 18, and it will only be 9 at best with a 2 Ohm load.
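Again as an illustration (not part of the original paper), the same budget can be totted up in a few lines of Python:

    # Connection and cable resistances from the example above, in Ohms.
    resistances = {
        "amplifier PCB to internal wire crimp": 0.002,
        "internal wire to Speakon socket": 0.002,
        "Speakon plug to speaker cable": 0.002,
        "25 m of 4 mm^2 speaker cable": 0.210,
        "speaker cable to Speakon plug": 0.002,
        "cabinet Speakon socket to internal wire": 0.002,
        "internal wire to driver crimp": 0.002,
    }
    total = sum(resistances.values())   # 0.222 Ohms
    for load in (8.0, 4.0, 2.0):
        print(f"{load} Ohm load: best-case system damping factor = {load / total:.0f}")
    # Roughly 36, 18 and 9, even with a perfect amplifier and ignoring the driver.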

OK, so why is this an issue now?

If it is unimportant from a technical point of view, why do manufacturers continue to produce amplifiers with unnecessarily high damping factors?

There are at least a couple of reasons. Certainly it has become audio folklore that a high damping factor is necessary to keep drivers, particularly high power drivers, under 'control'. Also, with lower power non-Class D designs, creating an amplifier with a high damping factor is relatively easy without compromising other aspects of its performance, so there is no need to rock the marketing boat.

However, with modern high power Class D amplifiers, rather like their valve (tube) based predecessors, it is not desirable to use large amounts of negative feedback, either to make a poor design measure well or to create unnecessarily large damping factors. In fact, to do so can seriously compromise other important amplifier characteristics, probably the most important being unconditional stability into difficult loads.

To conclude

In summary, it is never necessary to get into a debate about the merits or otherwise of a silly-high amplifier damping factor specification, because in the real world the damping factor of the system is not defined by the amplifier.
