P2 ADC question
JRoark
Posts: 1,215
@JonnyMac posted some Spin code in his "Nuts & Volts" article that reads ADC pins and returns the value. I did a quick translation to FlexBASIC (which works fine) and then added some trivial features.
The question I have is this: In the following code, if I swap "P_ADC_1X" for "P_ADC_10X", shouldn't that give a range of 0 to 0.330 volts instead of 0 to 3.3 volts (i.e., 10x gain)? That simple substitution breaks the code and I don't understand why: it returns a full-scale reading even when the pin is driven at 0.125 volts. The smart pin docs seem a little sketchy on this. Can anyone shed some light on why this doesn't work?
Here is the code:
SUB ADCStart(pin_num as ulong)
  direction(pin_num) = input          ' reset smart pin
  wrpin(pin_num, P_ADC OR P_ADC_GIO)  ' read ground reference
  wxpin(pin_num, %00_1101)            ' 8192 samples, 14-bit resolution, SINC2 filter
  wypin(pin_num, 0)
  pinlo(pin_num)                      ' enable
  waitcnt(getcnt() + 8192 << 3)       ' allow 8x readings
  ADC.CalLO(pin_num) = rdpin(pin_num) ' save ground cal level

  direction(pin_num) = input          ' reset smart pin
  wrpin(pin_num, P_ADC OR P_ADC_VIO)  ' read VIO reference
  wxpin(pin_num, %00_1101)            ' 8192 samples, 14-bit resolution, SINC2 filter
  wypin(pin_num, 0)
  pinlo(pin_num)                      ' enable
  waitcnt(getcnt() + 8192 << 3)       ' allow 8x readings
  ADC.CalHI(pin_num) = rdpin(pin_num) ' save VIO cal level

  direction(pin_num) = input          ' reset smart pin
  wrpin(pin_num, P_ADC OR P_ADC_1X)   ' read the pin voltage  <=== THIS LINE
  wxpin(pin_num, %00_1101)            ' 8192 samples, 14-bit resolution, SINC2 filter
  wypin(pin_num, 0)
  pinlo(pin_num)                      ' enable
END SUB
If I need to post the entire program, I will, but this is the bit that seems to be misbehaving. The complete program configures 58 pins for ADC input and displays all 58 values every 5 seconds.
EDIT: For clarity, this is running on a P2-EVAL with Rev B silicon using @ersmith FlexBASIC V4.2.7
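The usual way to apply those CalLO/CalHI readings downstream is a two-point linear scale from counts to volts. Here is a minimal sketch of that scaling in Python for illustration (the count values are hypothetical examples, not from this program):

```python
def counts_to_volts(raw, cal_lo, cal_hi, vio=3.3):
    """Two-point linear scaling: cal_lo reads as 0 V, cal_hi as VIO."""
    return (raw - cal_lo) / (cal_hi - cal_lo) * vio

# Hypothetical 14-bit calibration readings (ground and VIO levels)
cal_lo, cal_hi = 2788, 13653

print(counts_to_volts(cal_lo, cal_lo, cal_hi))  # ground cal reads 0 V
print(counts_to_volts(cal_hi, cal_lo, cal_hi))  # VIO cal reads 3.3 V
print(counts_to_volts(8220, cal_lo, cal_hi))    # mid-scale, roughly 1.65 V
```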
Comments
The calibration values are not directly usable, so the higher-gain modes are mainly for AC measurements, like audio.
Andy
As a point of clarification: so if you are trying to measure small voltages with anything other than P_ADC_1X, you would either need to run the signal through a voltage follower (maybe an op amp) with a DC offset towards ~1.65v, or you would need to capacitively couple the signal to the pin and use a weak voltage divider on the pin side to achieve the 1/2 VIO bias?
For DC you will need to generate some offset, yes. Maybe the average of the LO- and HI-calibration values will be usable for correction of the mid-point drift in this case.
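Andy's midpoint suggestion can be expressed numerically: the bias estimate is just the average of the two calibration readings. A one-liner in Python (count values are hypothetical, typical of a 14-bit result):

```python
def adc_midpoint(cal_lo, cal_hi):
    """Estimate the ~VIO/2 bias point, in counts, as the average of the
    ground (GIO) and supply (VIO) calibration readings."""
    return (cal_lo + cal_hi) / 2

# Hypothetical GIO/VIO cal readings on a 14-bit scale
print(adc_midpoint(2788, 13653))  # mid-scale count for the bias point
```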
Andy
It has been mentioned that there is some kind of voltage divider; is it capable of handling up to maybe 14V? I would like to data-log battery activity, including charge/discharge cycles; would the P2 be able to do this?
Ray
I've got a clientless morning so I'll be playing with the 10x ADC mode shortly. Thanks to all for your input!
- The P_ADC_3X input range is from 0.867v to 2.455v. 1.661v midpoint
- The P_ADC_10X input range is from 1.40079v to 1.91163v. 1.65621v midpoint
- The P_ADC_30X input range is from 1.57316v to 1.73787v. 1.65552v midpoint
- The P_ADC_100X input range is from 1.62780v to 1.68242v. 1.65511v midpoint
I wasn't even going to attempt measuring in P_ADC_100X mode; 30x is all I can reasonably manage (I'm using a potentiometer as a voltage divider to drive the pin, and I've already had some caffeine). I went there anyway, but take the 100x numbers with a grain of salt. The ADCs seem to center around a 1.66v-ish reference voltage.
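A quick sanity check on those measurements is to compute the span and midpoint from each pair of endpoints; all four modes land on the same ~1.65–1.66 V center. In Python, using the ranges measured above:

```python
# Measured input ranges (low, high) in volts, from the tests above
ranges = {
    "P_ADC_3X":   (0.867,   2.455),
    "P_ADC_10X":  (1.40079, 1.91163),
    "P_ADC_30X":  (1.57316, 1.73787),
    "P_ADC_100X": (1.62780, 1.68242),
}

for mode, (lo, hi) in ranges.items():
    span = hi - lo            # usable input swing in this gain mode
    mid = (lo + hi) / 2       # center of the range
    print(f"{mode}: span={span:.5f} V, midpoint={mid:.5f} V")
```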
I had read that the ADCs can read slightly above and below the supply rails in P_ADC_1X mode. At a hard ground the ADC reports 2,771 counts, and at the positive rail it reports 13,543 counts, so that seems to be borne out (and the code snippet by @JonnyMac for Nuts & Volts accounts for this possibility in the way he does the pre-calibration).
A fun morning! Many thanks to all for helping me get my brain sorted on ADCs.
That's actually where I'm going with this. I guess great minds think alike?
I'll be using the 1X mode and feeding the Prop pin from a voltage divider in a 5:1 ratio. Whatever voltage the ADC measures, multiply it by 5 to get the real-world voltage. This gives a 0 to 16.5 volt battery range (not counting the area the ADCs can measure above the rails, which I haven't played with yet).
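The 5:1 divider arithmetic checks out numerically; here is a sketch in Python (the resistor values are hypothetical — any pair with R_top = 4 × R_bottom gives the 5:1 ratio):

```python
def divider_out(v_in, r_top, r_bottom):
    """Output of a simple two-resistor divider: v_in across
    r_top + r_bottom, tapped at r_bottom."""
    return v_in * r_bottom / (r_top + r_bottom)

# Hypothetical values: 40k over 10k gives a 5:1 ratio
r_top, r_bottom = 40_000, 10_000

print(divider_out(16.5, r_top, r_bottom))  # full-scale battery -> 3.3 V at the pin
print(divider_out(12.0, r_top, r_bottom))  # 12 V battery -> 2.4 V at the pin
```

Multiplying the measured pin voltage by 5 then recovers the real-world battery voltage, as described above.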
The one fixed number is the usable +0.3 Volts above VIO and -0.3 Volts below GND.
EDIT: Had quoted the wrong piece.
I see what you did there...
Honestly I'm not sure.
The ADCs are power hungry and act like a stiff on-board load. All together the board pulls ~500+ mA with all of them running and the red warning light glowing. I'm presently driving the ADC inputs with a lab supply monitored by a 5-1/2 digit meter, so the input value shouldn't change. By making two measurements of the same ADC pin, one with the 3.3 volt bus lightly loaded and the other with a heavy load, we can get a clue: if the 3.3v bus drops a bit and the ADC changes its output value accordingly, wouldn't that tell us the ADCs use a simple voltage divider as a reference?
I’ll try this in a bit and post the result. The “new normal” of Covid means I’ve got time to play!
The question remaining is about the effect of supply volts on span/range, ie: what are the numerical calibration readings at VIO and GIO at different supply voltages? Do the numbers stay roughly the same? That should answer the question, but I haven't got any time to test it right now.
Seems reasonable. I'll take a shot at this in a bit and report back.
Yes, there is no band-gap voltage reference; it is just a 50% (matched-resistor) divider from VIO to create a virtual mid-ground, and every ADC has its own 50% divider (so it will vary from pin to pin with the resistor matching).
That also means power supply drift and noise * 50%, are added to the measurement.
You need to account for the 50% Vcc 'virtual reference' as the ADC measures relative to that.
The very simplest 12V interface would be a single series resistor, but that would be less precise than a 2 resistor divider.
With a single resistor you have internal fab-variance and temperature changes on the feedback resistor, but the dropper resistor does not track that.
A 2 resistor divider would give you a 0~24V range if you set 12V as the ~1.66V midpoint. If you need less span than that, the choices would be:
* A ~6V Zener + 2 resistors would give 6~18V span ( example PDZ6.8B,115 ±2% 500µA slope specified)
* A '431'-like shunt regulator + 4 resistors would give 6~18V span, choose a better 431, with lower min current like the AS431 ±0.5% 50µA
* An opamp + 5 resistors allows any gain/offset you like.
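For the plain 2-resistor option, the required ratio follows directly: fitting 0–24 V into 0–3.3 V needs a divider gain of 3.3/24, which lands 12 V at the ~1.65 V midpoint. A numerical sketch in Python (the 10k bottom resistor is a hypothetical choice):

```python
target_gain = 3.3 / 24     # divider gain to fit 0-24 V into 0-3.3 V
r_bottom = 10_000          # hypothetical bottom-leg resistor
r_top = r_bottom * (1 / target_gain - 1)   # ~62.7k for a 10k bottom leg

def divider(v_in):
    """Two-resistor divider output for the ratio computed above."""
    return v_in * r_bottom / (r_top + r_bottom)

print(divider(24.0))  # top of range -> 3.3 V
print(divider(12.0))  # 12 V -> 1.65 V, right at the ADC midpoint
```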
- Calibrate the ADC at full 1x scale
- Measure the ADC's bias point with the %100010 mode that used to be PinB mode before Revision C
- Have a neighboring smartpin read and filter the ADC's digital output bitstream, and configure the ADC pin's own smartpin to drive the output in %00111 NCO DUTY mode
- Calibrate the DUTY output against the previously calibrated full-scale ADC by outputting a few different duty cycle values and seeing what voltages you get back
- Put the ADC in 10x mode
- Calibrate the ADC, now in 10x mode, against the previously calibrated DUTY output by outputting a few voltages near the ADC's bias point and seeing what measurements you get back
- Disable the DUTY output, allowing the input to work again, and use ADC pin's own smartpin (or its already-configured neighbor) to do ADC filtering on real input data
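The "calibrate against the DUTY output" steps boil down to fitting a line through (measured counts, known volts) pairs near the bias point. A sketch of that fit step in Python (the sample points are hypothetical, not measured values):

```python
def fit_line(points):
    """Least-squares line volts = a*counts + b through (counts, volts) pairs."""
    n = len(points)
    sx = sum(c for c, _ in points)
    sy = sum(v for _, v in points)
    sxx = sum(c * c for c, _ in points)
    sxy = sum(c * v for c, v in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical 10x-mode readings at known DUTY-generated voltages near 1.65 V
samples = [(3000, 1.55), (8200, 1.65), (13400, 1.75)]
a, b = fit_line(samples)
print(a, b)  # slope (V/count) and offset; afterwards volts = a*raw + b
```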
The package may not be moisture-proof?
BTW, latest finding is those four pins give valid ADC readings when VIO supply voltage is at 2.5 Volts.
There is some change below 2.5 volts, presumably that's due to other factors though:
- With VIO supply at 3.3 Volts: VIO reads 83.3% (13653 ±445 of 16384) and GIO reads 17.0% (2788 ±460 of 16384).
- With VIO supply at 3.0 Volts: VIO reads 83.3% (13656 of 16384) and GIO reads 17.1% (2795 of 16384).
- With VIO supply at 2.75 Volts: VIO reads 83.3% (13652 of 16384) and GIO reads 17.1% (2806 of 16384).
- With VIO supply at 2.5 Volts: VIO reads 83.3% (13641 of 16384) and GIO reads 17.2% (2824 of 16384).
- With VIO supply at 2.25 Volts: VIO reads 83.0% (13606 of 16384) and GIO reads 17.5% (2872 of 16384).
- With VIO supply at 2.0 Volts: VIO reads 82.0% (13439 of 16384) and GIO reads 18.3% (3006 of 16384).
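Those percentages are simply the raw counts over the 14-bit full scale, which is what makes the readings look ratiometric. Checking the 3.3 V row in Python:

```python
FULL_SCALE = 16384  # 14-bit SINC2 result

def pct(counts):
    """Reading as a percentage of the 14-bit full scale."""
    return 100 * counts / FULL_SCALE

# Counts from the VIO = 3.3 V row above
print(round(pct(13653), 1))  # VIO reading, ~83.3 %
print(round(pct(2788), 1))   # GIO reading, ~17.0 %
```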
Also, if you go much below 3.3V, you unbias the level translators, unless you also drop the 1.8V core voltage. That could be why you were seeing 100% duty - no transitions were getting through.
I excluded those four pins from the above testing.
My question is this: "what are the values of R1 and R2 in the attached schematic?" I can't find anything in the P2 hardware docs or on the forum that describes this.
FWIW I'm trying to design an AFE that needs to know what impedance it is driving.
Provided the ADC is within range, that node is always very close to 50% of VIO.
Somehow I erroneously read your answer above: “...Yes, there is no band-gap voltage REFERENCE. it is just a 50% (matched resistors) divider from Vcc, and every adc has their own 50% divider...” to apply to the input bias, not to the *reference*.
I blame trifocals. Lol