Proton BASIC Compiler - Auto setting Voltage Input

    Hi guys,

    Most of you probably do this already, but some of the new guys may find this helpful.

    When you take an A/D reading, it is always a bugger to calibrate the input voltage to read accurately because of resistor tolerances. The voltage you calculate for the divider is seldom what the PIC actually reads due to this tolerance variation; in other words, you end up reading too high or too low.

    I know if it's a one-off it's probably OK to trim with a pot, which is more expensive than a resistor, but when you have to do a bunch, or there is no space for a pot on the board, it becomes a pain (trust me). You really want to program the PIC and have it calibrate the A/D reading itself.

    In this example I used a divider on the PIC's A/D input: a 1k to GND and a 6k8 to an input voltage of 12.0V, expecting the PIC to display 12.0V. You need to multiply by one factor and divide by another to "tune" the value to represent 120, which with a decimal point reads as 12.0V.
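    To see roughly where the factors come from, here is a host-side sketch of the arithmetic in Python. It assumes a 5.0V reference and a 10-bit conversion (neither is stated in the post); the 110 multiplier is the author's value and the computed DvdR is only the ideal starting point.

    ```python
    # Sketch of the divider and scaling arithmetic.
    # Assumptions: Vref = 5.0 V, 10-bit ADC (0..1023 counts).
    R_BOTTOM = 1000.0    # 1k from the A/D pin to GND
    R_TOP = 6800.0       # 6k8 from the 12 V input to the A/D pin
    VREF = 5.0
    VIN = 12.0

    v_pin = VIN * R_BOTTOM / (R_BOTTOM + R_TOP)   # voltage at the A/D pin
    count = round(v_pin / VREF * 1023)            # 10-bit conversion result

    # Scale the raw count so 12.0 V displays as 120 (one implied decimal):
    # count * 110 / DvdR should equal 120, so the ideal DvdR is about:
    dvdr = count * 110 / 120
    print(count, dvdr)    # roughly 315 counts, DvdR near 289
    ```

    With real resistors the actual count will drift a few counts either way, which is exactly why DvdR gets trimmed at run time instead of being hard-coded.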

    I made a fixed 12V power supply using a 7812; by sheer luck the output is actually 12.01V. If the output had differed, you could simply adjust the target reading in the code so the auto calibration matches the actual supply.

    So what I do is power the app from the 12.0V supply, which is connected through the voltage divider to the A/D input of the PIC. When I then program the PIC, it checks whether the value 120 is in an EEPROM location. If it is, the divider is already calibrated, so it runs the app; if not, it trims the A/D scaling until it reads 120, saves the factor in the EEPROM location, and then runs the app. Hence the auto calibrate is done only once. You can of course also trigger the calibration with a switch press; I make mine auto calibrate directly after programming.
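    The one-shot trim described above can be sketched as a small host-side simulation in Python. The `eeprom` dict and the fixed raw count of 315 are hypothetical stand-ins for the PIC's EEPROM and a steady A/D reading of the 12.0V input:

    ```python
    # Simulation of the one-shot auto-calibration.
    # Assumptions: the A/D returns a steady 315 counts for 12.0 V
    # (see the divider maths), and a dict stands in for the EEPROM.
    eeprom = {}          # pretend EEPROM: address -> value
    RAW_COUNT = 315      # what ADIn would return for 12.0 V
    TARGET = 120         # displayed as 12.0 V

    def calibrate():
        if eeprom.get(1) == TARGET:          # already calibrated?
            return eeprom[5]                 # just load the tuned factor
        dvdr = 280                           # start with an approximate value
        while True:
            vin = RAW_COUNT * 110 // dvdr    # integer maths, as on the PIC
            if vin > TARGET:
                dvdr += 1                    # trim divider out
            elif vin < TARGET:
                dvdr -= 1                    # trim divider in
            else:
                eeprom[5] = dvdr             # save divide factor
                eeprom[1] = TARGET           # set as calibrated
                return dvdr

    d1 = calibrate()     # first run: trims and saves
    d2 = calibrate()     # later runs: just read the saved factor
    print(d1, d2)
    ```

    Because the division is integer, a small range of DvdR values all land on exactly 120, so the loop always terminates once the reading falls into that band.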

    A 10-bit A/D can for all practical purposes display only three digits, e.g. 99.9 (since the count only runs up to 1023), so expect a value accurate to +/-0.1V in this range. (On a higher-resolution 12-bit A/D you can add a second place after the decimal point for +/-0.01V, or extend the voltage range to 400.0V.)
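    The resolution claim can be sanity-checked with a quick calculation (Python; again assuming a 5.0V reference and the 1k/6k8 divider, i.e. a 7.8:1 ratio, which are not stated in the post):

    ```python
    # Quick check of the achievable resolution.
    # Assumptions: Vref = 5.0 V, 1k/6k8 divider, so full scale = 5.0 * 7.8 V.
    FULL_SCALE = 5.0 * (1000 + 6800) / 1000   # 39.0 V at the divider input
    lsb_10bit = FULL_SCALE / 1023             # volts per count, 10-bit
    lsb_12bit = FULL_SCALE / 4095             # volts per count, 12-bit
    print(round(lsb_10bit, 3), round(lsb_12bit, 4))
    ```

    One 10-bit count is worth roughly 0.04V at the input, so a 0.1V display digit is about as fine as is honest; a 12-bit converter brings that below 0.01V per count, which supports the second decimal place.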

    This is the code I use.

    Dim Vin As DWord
    Dim Test As Byte
    Dim DvdR As Word

            Test = ERead 1                    ' Check if calibrated or not
            If Test = 120 Then GoTo Prp2      ' Already calibrated, just load DvdR
            DvdR = 280                        ' Start with an approximate value
    Prp1:
            DelayMS 200
            Vin = ADIn 7                      ' Read voltage input
            Vin = Vin * 110                   ' Multiply factor
            Vin = Vin / DvdR                  ' Divide factor
            If Vin > 120 Then DvdR = DvdR + 1 : GoTo Prp1    ' Trim divider out
            If Vin < 120 Then DvdR = DvdR - 1 : GoTo Prp1    ' Trim divider in
            EWrite 5, [DvdR]                  ' Trim value correct, save divide factor
            EWrite 1, [120]                   ' Set as calibrated
    Prp2:
            DvdR = ERead 5                    ' Load the tuned divider factor
            ' Continue with the app from here
    That's it.

    Wherever in the code you read the A/D input, you use:

    Vin = ADIn 7                          ' Read voltage input
    Vin = Vin * 110                       ' Multiply factor
    Vin = Vin / DvdR                      ' Divide with tuned factor

    or, on a single line:

    Vin = ADIn 7 : Vin = Vin * 110 : Vin = Vin / DvdR  ' Read A/D

    to come up with a fairly accurate reading.

    You can use any standard voltage to auto calibrate your A/D value to read accurately; if your voltage range is, say, 60V, you can do the same with a suitable voltage divider.

    BTW, you should invest in, say, a 10.00V precision voltage reference against which you can compare multimeters and other measuring gear. It makes life a lot easier if you know you are reading the right value.

    Hope this helps someone.
    This article was originally published in forum thread: Auto setting Voltage Input started by fanie