Many embedded applications use an A/D converter. However, many users unknowingly ruin the A/D measurements by connecting incorrect circuitry to the A/D converter input.
Figure 1 shows a typical application circuit of an A/D converter with an integrated Sample & Hold (S/H) circuit - an application so simple that it seems nothing could go wrong. Yet it is wrong, and the reading delivered by the A/D converter will be lower than it should be. To understand why, we have to look at the S/H circuit. Modern S/H circuits are much more complicated than the one outlined here, but the basic principle is still the same.
At sampling time, the switch is closed and the sampling capacitor is charged. To protect the external circuitry from the shock of the capacitor being suddenly connected to its output, an on-chip analog buffer is used. Ideal buffers exist in theory and everyone uses them on paper, but there is no such thing as an ideal buffer in the real world of electronics. The buffer acts as an impedance transformer, and a change of capacitance on its output is transformed into a change of capacitance on its input.
The sampling process is very fast - much faster than the bandwidth of the external amplifier we have connected to the A/D converter input. Therefore, whatever happens at the A/D input will be unaffected by the external amplifier.
Figure 2 shows the equivalent circuit, which lets us work out what happens at sampling time. Before sampling takes place, the combined capacitance of the PCB track and chip pin (CT + CP) is charged to the input voltage VIN. At sampling time the discharged S/H capacitance, transformed across the on-chip input buffer (CS), is connected in parallel with these capacitances, and the voltage on the input pin drops. The only component capable of delivering more charge to the capacitors and lifting the input voltage back up is the external amplifier, but it will not react until it is too late. How much will the input voltage drop?
Let us assume some reasonable values, for example (CT + CP) = 5 pF and CS = 0,5 pF. Charge conservation gives the sampled voltage as VIN · (CT + CP)/(CT + CP + CS), so the input voltage drops to about 91% of its original value! It is clear that we can reduce the drop by making the capacitance attached to the A/D converter input larger. Let us calculate the minimum capacitance needed to keep the voltage drop below 1/2 LSB of the A/D converter.
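The charge-sharing arithmetic above can be sketched in a few lines of Python; a minimal sketch, using the example component values from the text (not measured data):

```python
def sampled_voltage(v_in, c_ext, c_s):
    """Voltage left on the input node after the discharged S/H
    capacitance c_s is connected in parallel with c_ext.
    Charge conservation: v_in * c_ext = v_out * (c_ext + c_s)."""
    return v_in * c_ext / (c_ext + c_s)

# Example values from the text: (CT + CP) = 5 pF, CS = 0.5 pF, 3.3 V input
v = sampled_voltage(3.3, 5e-12, 0.5e-12)
print(round(v / 3.3 * 100, 1))  # remaining input voltage in percent -> 90.9
```

The same function shows why a larger external capacitance helps: the ratio c_ext/(c_ext + c_s) approaches 1 as c_ext grows.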
The conditions are worst at the maximum allowable input voltage. For an A/D converter with a resolution of N bits, the maximum input voltage equals 2^N · LSB. Since the sampling drop is approximately VIN · CS/C, keeping it below 1/2 LSB at VIN = 2^N · LSB requires C > 2^(N+1) · CS.
For example, if we intend to use a 12-bit A/D converter with an input capacitance change of 0,5 pF, the minimum capacitance connected to the A/D input must be larger than 2^13 · 0,5 pF, i.e. about 4 nF, to keep the voltage drop below 1/2 LSB.
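The half-LSB rule of thumb is easy to check numerically; a small sketch of the criterion, with the 12-bit example from the text:

```python
def min_input_capacitance(n_bits, c_s):
    """Minimum external capacitance that keeps the sampling
    voltage drop below 1/2 LSB: C > 2**(n_bits + 1) * c_s."""
    return 2 ** (n_bits + 1) * c_s

# 12-bit converter, 0.5 pF change of input capacitance
c_min = min_input_capacitance(12, 0.5e-12)
print(c_min)  # -> 4.096e-09, i.e. about 4 nF
```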
It is a well-known truth that chip manufacturers often do not provide the data a designer needs. I have seen many A/D converter datasheets where the change of input capacitance at sampling time was not given. All is not lost, however, because we can measure this parameter very easily. All that is needed is an oscilloscope and a signal generator. This simple measurement is not precise, but it will give at least a rough estimate. We will turn the situation around and compute the change of capacitance from the voltage drop observed at the chip pin.
The measurement set-up is outlined in Figure 3. The input resistor RI should be quite large to make the time constant of the RC network long enough for comfortable measurement. I used 1 MΩ in my measurements.
First of all we have to measure the capacitance connected to the A/D input (C = CO + CT + CP). We do this by applying a square wave signal to the input resistor and observing the time constant of the resulting waveform on the scope (see Figure 4). The time constant can be estimated directly from the scope, but it is better to transfer the data into a spreadsheet and calculate it more accurately. In the example shown in Figure 4 the time constant comes out at approximately 28,5 µs, which with RI = 1 MΩ corresponds to C = τ/RI ≈ 28,5 pF.
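The spreadsheet step can just as well be a short script. A sketch of the curve fit, assuming the sampled points lie on a clean RC charging curve toward a known final voltage (the sample points below are synthetic, generated for illustration - they are not the Figure 4 scope data):

```python
import math

def time_constant(samples, v_final):
    """Least-squares estimate of tau from (t, v) points on a charging
    curve v(t) = v_final * (1 - exp(-t / tau)).
    Linearised form: ln(1 - v / v_final) = -t / tau."""
    t_y = sum(t * math.log(1.0 - v / v_final) for t, v in samples)
    t_t = sum(t * t for t, _ in samples)
    return -t_t / t_y

# Synthetic points for tau = 28.5 us and a 3.3 V swing (illustrative only)
tau = 28.5e-6
pts = [(t, 3.3 * (1 - math.exp(-t / tau))) for t in (5e-6, 10e-6, 20e-6, 40e-6)]
tau_est = time_constant(pts, 3.3)
print(tau_est)              # ~2.85e-05 s
print(tau_est / 1e6)        # C = tau / RI with RI = 1 MOhm -> ~28.5 pF
```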
We now apply the highest allowable DC voltage to the input resistor and start the A/D conversion. The resulting voltage drop can be seen in Figure 5 - it is around 176 mV at 3,3 V.
Once we have measured the voltage drop, we can calculate the change of A/D input capacitance from the charge-sharing relation: CS = C · ΔV/(VIN − ΔV). With C ≈ 28,5 pF, ΔV = 176 mV and VIN = 3,3 V, this gives CS ≈ 1,6 pF.
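The back-calculation can be sketched in one short function; a minimal sketch, assuming C ≈ 28,5 pF as obtained from the time-constant measurement with the 1 MΩ resistor:

```python
def delta_cs(c_ext, v_in, v_drop):
    """Change of S/H input capacitance from the observed sampling drop.
    Charge sharing gives v_drop = v_in * c_s / (c_ext + c_s),
    hence c_s = c_ext * v_drop / (v_in - v_drop)."""
    return c_ext * v_drop / (v_in - v_drop)

# Measured values from the text: C = 28.5 pF, 176 mV drop at 3.3 V
print(delta_cs(28.5e-12, 3.3, 0.176))  # ~1.6e-12 F, i.e. about 1.6 pF
```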
Amplifiers usually do not like capacitive loads, so an extra input resistor is added in Figure 6. This RC network also provides additional filtering against high-frequency noise. The input leakage current of an A/D converter is usually below 1 µA, so if we choose a resistor of a few hundred ohms, the voltage drop across it will be around 100 µV - well below 1/2 LSB.
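The leakage-drop sanity check can be written out explicitly; a sketch assuming the 12-bit, 3,3 V example used earlier and a hypothetical 100 Ω series resistor:

```python
def leakage_drop_ok(r_in, i_leak, v_ref, n_bits):
    """True when the DC drop across the series input resistor
    stays below 1/2 LSB of the converter."""
    half_lsb = v_ref / 2 ** n_bits / 2
    return r_in * i_leak < half_lsb

# 100 ohm series resistor, 1 uA worst-case leakage, 12-bit converter at 3.3 V:
# the 100 uV drop is well under the ~403 uV half-LSB limit
print(leakage_drop_ok(100.0, 1e-6, 3.3, 12))  # -> True
```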
For more information: Norman Ballard, Motorola Development Manager SA, 011 800 7800, [email protected]
© Technews Publishing (Pty) Ltd | All Rights Reserved