Wednesday, August 17, 2011

ADC LSb mystery

Does a 10-bit ADC really have a resolution (not accuracy) of 10 bits? Apparently it might not. I've been having problems with the LSb (least significant bit) of the ADC output of PICs. What's happening is that even with a stable, fixed-voltage input signal to the ADC, the digitized output--depending on the VDD level and the input signal--may either be stable or flip-flop between two values (one LSb apart) from one conversion to the next.

The LSb problem depends on both VDD and the ADC input signal level. For example, the problem might not occur at an input signal of X volts with VDD = 4.950V but will appear when VDD is within the range of 3.0 to 3.6V. And/or it might get worse (i.e., the LSb changes more frequently) as one approaches a certain voltage range. Or for a certain signal voltage Y, the problem may be entirely absent with VDD from 2.5 to 5.0V. Or for signal Z the LSb glitch may be present over practically the entire operating voltage range of the MCU/ADC.

I've tried everything I can think of to get rid of this hair-tearing problem. I've used a linear power supply instead of a switch-mode one to minimize power supply noise. I've used separate power and ground lines for digital and analog and employed star connections. Actually, digital noise should be minimal since, as we'll see, the MCU goes to sleep during the ADC read. I've also peppered the circuit with filter and bypass caps. The circuit is in open air, but I still allowed it to settle down thermally so that drifts due to temperature are eliminated. I admit the circuit is breadboarded and not soldered onto a PCB, so contact problems may be the culprit; to minimize that possibility I've used redundant connections. I also tried buffering the voltage signal with an op amp. These measures reduced the problem but have not completely eliminated it.

Here's the test circuit:

Input to the bridge rectifier is approximately 12VAC. The potentiometer is a 5K multi-turn Bourns 3590S-2-502L. Because this is the only highly reliable pot that I have and because the output voltage of the LM317 must not exceed 5V (the PIC's max VDD), I had to add the 560 and 220 ohm resistors. When the pot is turned all the way down to zero, the 560 ohm resistor is shunted, leaving only the 220 ohms. When the pot is at its maximum 5K, it's in parallel with the 560, giving a net resistance of 504 ohms. This in turn is in series with the 220 ohms, for a total of 724 ohms. Plug those values into the LM317 equation and the effective output range of this power supply is 2.4 to 5.0V.

Firmware is as follows. It's the 3rd or 4th revision. The ADC uses VDD as its positive voltage reference and ground as its negative voltage reference. Thus the ADC output will be ratiometric with respect to the power supply, and the analog-to-digital conversion is immune to drifts in the power supply. An ADC read is performed approximately every 8ms. Because the PIC12F1822's ADC has silicon flaws, the ADC read is performed during sleep; thus, although the timer0 tick is set to 8ms, several hundred microseconds are added while the system clock is shut down. The ADC read takes so long because there are actually two sets of reads: one is a single read and the other is a set of 16 reads which are then averaged. I tried averaging the reads to see if I could obtain a value that doesn't keep flipping around. After the reads, the current and previous reads are compared. If they're different then the corresponding MCU pins (neflag and neflagave) are made high, else they're made low. Those pins are connected to an oscilloscope and logic analyzer so I can see how (un)stable the ADC reads are. In this final revision of the firmware I've also enabled the EUSART so I can see by how many bits the ADC reads differ (the Saleae Logic analyzer automatically decodes the serial data stream). Only the lower byte of the ADC output (register ADRESL) is transmitted.


/*
   processor = PIC12F1822
   compiler  = mikroC Pro v5.0.0
   August 2011

   Test of ADC LSb
   internal oscillator is used; WDT disabled, power-up timer and brown-out reset disabled, MCLR disabled
*/


#define  neflag              PORTA.f5            // not equal flag; 1 = previous and current ADC values are not equal
#define  neflagave           PORTA.f4            // not equal flag average; 1 = previous and current averaged ADC values are not equal
#define  t0ini               256 - 250           // value loaded into TMR0 every time it overflows

void IniReg()
{
  OSCCON    = 0b1101000;     // internal clock 4MHz
  TRISA     = 0b100;         // RA2 as input
  ANSELA    = 0b100;         // RA2 (AN2) as analog
  PORTA     = 0;

  TMR0      = t0ini;
  OPTION_REG = 0b10000100;   // weak pull-ups disabled,
                             // timer0 uses internal clock,
                             // prescaler assigned to timer0, prescaler = 1:32

  ADCON0 = 0b1001;           // channel 2, ADC on
  ADCON1 = 0b11110000;       // right justified, use Frc, Vdd as positive reference voltage
  INTCON.GIE = 0;            // global interrupts disabled
  PIE1.ADIE = 1;             // ADC interrupt enabled

  // baud rate = 2400
  SPBRGH = 0;
  SPBRGL = 25;
  TXSTA.TXEN = 1;            // enable EUSART transmitter
  RCSTA.SPEN = 1;            // enable serial port
} // void IniReg()

/* =========================================================================================================

According to PIC12F1822 Silicon Errata sheet DS80502C the ADC unit in certain silicon revisions is buggy:

"Under certain device operating conditions, the ADC conversion may not complete properly.
When this occurs, the ADC Interrupt Flag (ADIF) does not get set,
the GO/DONE bit does not get cleared and the conversion result
does not get loaded into the ADRESH and ADRESL result registers."

mikroC Pro v5.00 compiler's Adc_Read() built-in ADC function uses the GO/DONE bit to check conversion completion (without putting the MCU to sleep, of course)
so the function cannot be used. And indeed the MCU hangs (when WDT is disabled) or the MCU resets when WDT is enabled.

The workaround used here is as per method 1 in the said errata:

"Select the dedicated RC oscillator as the ADC conversion clock source,
and perform all conversions with the device in Sleep."

=========================================================================================================  */

unsigned int ADC()
{
  PIR1.ADIF = 0;                       // clear ADC interrupt flag
  INTCON.PEIE = 1;                     // peripheral interrupts enabled
  ADCON0.GO = 1;                       // start ADC conversion
  asm {sleep}                          // zzzzzzzzzzzzzzz.... when ADIF = 1 the MCU wakes up and program execution continues
  PIR1.ADIF = 0;                       // clear ADC interrupt flag
  INTCON.PEIE = 0;                     // disable peripheral interrupts
  return ADRESH*256 + ADRESL;          // return the 10-bit ADC value as a 16-bit number
} // unsigned int ADC()

void Compare()
{
  unsigned char i;
  static unsigned int CURR = 0;        // current single ADC reading
  static unsigned int CURRAVE = 0;     // current average of 16 ADC readings
  static unsigned int PREV = 0;        // previous single ADC reading
  static unsigned int PREVAVE = 0;     // previous average of 16 ADC readings
  unsigned int SUM;                    // sum of 16 ADC readings

  SUM = 0;
  for (i=0; i<16; i++)
    SUM += ADC();

  CURR = ADC();
  TXREG = CURR;                        // because the ADC read is performed during sleep while serial transmission depends on the system clock,
                                       // transmission can only begin after all ADC reads are done; only the low byte (ADRESL) gets sent

  if (CURR == PREV)
    neflag = 0;
  else
    neflag = 1;
  PREV = CURR;

  CURRAVE = SUM / 16;
  if (SUM % 16 >= 8)                   // round to nearest
    CURRAVE += 1;

  if (CURRAVE == PREVAVE)
    neflagave = 0;
  else
    neflagave = 1;
  PREVAVE = CURRAVE;
} // void Compare()

void main()
{
  IniReg();

  while (1)
  {
    if (INTCON.TMR0IF)
    {
      TMR0 = t0ini;
      INTCON.TMR0IF = 0;
      Compare();
    }
  } // while(1)
} // void main()

I used the oscilloscope with the time base in roll mode, one channel monitoring neflag and the other neflagave. I opted not to take screenshots of the scope--it would take too much effort. But here are some of the logic analyzer captures at particular voltages. As you can see, the averaging routine sometimes works and sometimes doesn't, depending on how bad the LSb problem is. Resistor values are as per their color code. VDD values are actual measured voltages using a Fluke 87V.

I. Ra = 5.1K, Rb = 4.3K

A. VDD = 2.502V

B. VDD = 4.000V

C. VDD = 5.000V

II. Ra = 5.1K || 330 = 310, Rb = 4.3K

I provide only one screenshot, for VDD = 5.000V. There is absolutely no LSb problem for VDD = 2.5V to 5.0V.

III. Ra = 5.1K, Rb = 4.3K || 330 = 306

A. VDD = 2.999V

B. VDD = 3.499V

C. VDD = 4.500V

D. VDD = 5.000V

So what could be causing this ADC read glitch? Well, no ADC is 100% accurate. It will have various sources of error--among them INL, DNL, offset, and gain errors--due to the silicon and its fabrication. But accuracy isn't our problem; resolution is. Apparently none of these is the error that I've encountered. What it does look very much like is conversion error due to transition noise, also known as code-transition noise or code-edge noise. Transition noise is

"the range of input voltages that cause an ADC output to toggle between adjacent output codes. As you increase the analog input voltage, the voltages that define where each code transition occurs (code edges) are uncertain due to the associated transition noise."

With the input signal held constant over many ADC reads, the distribution of ADC output codes is supposedly Gaussian (a normal distribution). In the tests with the PIC, however, there are apparently only two codes, i.e., the output varies by only one LSb. I haven't done it, but it should be easy to have the MCU read a thousand or ten thousand times and tell us whether the codes vary by more than one LSb.

Interestingly, averaging, which I used in the firmware above, is the recommended technique for addressing LSb flickering. However, as we've seen, even averaging (at least as implemented above) isn't a panacea. Maybe I'm doing it wrong.
