MCP3208 only reading 1/2 of supplied voltage
Benj
Posts: 66
I am breadboarding an MCP3208 ADC on a BS2P24/40 demo board and am having a tough time figuring out why the inputs are only reading half of the voltage I'm putting on the input channels.
MCP3208 pin out:
1 (Channel 0) - +5v
2 (Channel 1) - +5v
9 (DGND) - vss
10 (CS) - BS pin 0
11 (DIN) - BS pin 3
12 (DOUT) - BS pin 2
13 (CLK) - BS pin 1
14 (AGND) - vss
15 (VREF) - vdd (+5)
16 (VDD) - vdd (+5)
Code:
' {$STAMP BS2}
' {$PBASIC 2.5}
'
' -----[ I/O Definitions ]-------------------------------------------------
CS        PIN 0                 ' Chip Select
Clock     PIN 1                 ' Clock
DataOut   PIN 2                 ' DOUT
DataIn    PIN 3                 ' DIN

' -----[ Constants ]-------------------------------------------------------
Cnts2Mv   CON $0139             ' x 1.22 (To Millivolts)
Start_bit CON 1
SE_0      CON %1000
SE_1      CON %1001

' -----[ Variables ]-------------------------------------------------------
result0   VAR Word              ' Conversion Result CH0
result1   VAR Word              ' Conversion Result CH1
mVolts0   VAR Word              ' Result0 --> mVolts
mVolts1   VAR Word              ' Result1 --> mVolts

' -----[ Init Setup ]------------------------------------------------------
DEBUG CLS, "ADC CH 0 : ", CR, "Volts :", CR, "ADC CH 1 : ", CR, "Volts :"

' -----[ Program Code ]----------------------------------------------------
DO
  LOW CS                                                  ' Enable ADC
  SHIFTOUT DataIn, Clock, MSBFIRST, [Start_bit\1, SE_0\4] ' Select CH0, Single-Ended
  SHIFTIN DataOut, Clock, MSBPOST, [result0\12]           ' Read ADC
  HIGH CS                                                 ' Disable ADC
  mVolts0 = result0 */ Cnts2Mv                            ' Convert To Millivolts

  LOW CS                                                  ' Enable ADC
  SHIFTOUT DataIn, Clock, MSBFIRST, [Start_bit\1, SE_1\4] ' Select CH1, Single-Ended
  SHIFTIN DataOut, Clock, MSBPOST, [result1\12]           ' Read ADC
  HIGH CS                                                 ' Disable ADC
  mVolts1 = result1 */ Cnts2Mv                            ' Convert To Millivolts

  DEBUG HOME, CRSRXY, 11, 0, DEC result0, CLREOL,         ' Displays voltages & digital value
        CRSRXY, 11, 1, DEC mVolts0 DIG 3,                 ' for both channels
        ".", DEC3 mVolts0,
        CRSRXY, 11, 2, DEC result1, CLREOL,
        CRSRXY, 11, 3, DEC mVolts1 DIG 3,
        ".", DEC3 mVolts1
  PAUSE 100
LOOP
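A note on the scaling for anyone reading along (this is my own reading of the BS2's `*/` operator, not something stated in the post): `*/` multiplies by a 16-bit fixed-point value whose high byte is the integer part and whose low byte is 256ths, so `Cnts2Mv = $0139` means multiply by 1 + 57/256, about 1.2227, close to the ideal 5000 mV / 4095 counts (about 1.2210 mV per count). A quick sanity check:

```basic
' result */ $0139  is computed as  result * 313 / 256
mVolts0 = 4095 */ Cnts2Mv   ' full scale: 4095 * 313 / 256 = 5006 (about 5.006 V)
mVolts0 = 2047 */ Cnts2Mv   ' half scale: 2047 * 313 / 256 = 2502 (the 2.502 V reported below)
```

So the millivolt conversion itself is sound; a half-scale raw count of 2047 is what produces the 2.502 V reading.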
When I apply +5 V to channel 0 and/or channel 1, the debug screen reads only 2.502 V (2047 counts, half of the 12-bit full scale of 4095). The +5 VDC on channels 0 and 1 comes from VDD on the demo board, so it is the same supply as the chip's, but that shouldn't matter, should it? This code was originally written for an MCP3202 and didn't work until I altered the SHIFTOUT string. Is there something else in the code that isn't compatible with the 3208? Anyone see anything?
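For what it's worth, one likely culprit, going by the MCP3208 datasheet's serial timing (a guess, not confirmed against this hardware): the 3208 clocks out a null bit before the 12 data bits, so a 12-bit SHIFTIN captures the null bit plus only the top 11 data bits, which divides the result by exactly two (full scale 4095 reads as 2047). A sketch of a fix for the channel-0 read, clocking in 13 bits so the leading null bit falls outside the 12-bit result:

```basic
LOW CS                                                   ' Enable ADC
SHIFTOUT DataIn, Clock, MSBFIRST, [Start_bit\1, SE_0\4]  ' Select CH0, Single-Ended
SHIFTIN DataOut, Clock, MSBPOST, [result0\13]            ' 13 clocks: null bit + 12 data bits
HIGH CS                                                  ' Disable ADC
result0 = result0 & $0FFF                                ' Keep the low 12 bits (null bit should be 0 anyway)
```

The same change would apply to the channel-1 read. If this is the issue, a full-scale input should then read 4095 rather than 2047.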
Comments
So Benj - that's good news! Care to mark your thread as "solved"?