What's the best way to do this, interpolate? — Parallax Forums

What's the best way to do this, interpolate?

charleyshf Posts: 165
edited 2011-11-26 12:57 in Propeller 1
Good morning,

I've been working on a project of mine on and off for over a year now, and I have been trying to do the following.

I have a vision system (a Blackfin camera) that can give me information on a specified color blob by returning rectangle values (x1 - left, x2 - right, y1 - top, y2 - bottom); I am just interested in the x1, x2 values for now. If I add x1 and x2 and then divide that by 2, I get the center of the object. Now what I want to do is have my camera actually track the object using a servo, and this is where I am having a hard time. I know that the servo values are 500 (far left) to 2500 (far right), with 1500 being center. The values returned from the Blackfin camera are from 1-254 for x1 and also x2. One term that came up is "interpolate"; googled it and, well, here I am.....

I am trying to avoid using floating-point math if possible, but it's looking more and more like I might end up going that route. I was looking at some C code, but it didn't help, so I am wondering if there's a better or simpler way to go about doing this. I'd really appreciate any help.

Comments

  • kwinn Posts: 8,697
    edited 2011-11-22 06:37
    With values between 1 and 254 as inputs and a range of 2000 (2500 - 500) as the required output this is almost perfect for integer math with shifts for multiply/divide. Add the two inputs together to obtain a value between 2 and 508, shift the result left by 2 to get a number between 8 and 2032, then add 480 to get a number between 488 and 2512 as your output. This can be clipped to 500 - 2500 for the servo signal but that may not be necessary since the maximum error is so small.
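    That arithmetic is easy to check. A minimal sketch of the same mapping (Python stands in for Spin here, and the function name is mine):

```python
def blob_to_servo(x1, x2):
    # x1 + x2 is in 2..508; << 2 scales that to 8..2032; + 480 shifts it
    # so a centered blob (x1 + x2 = 255) lands at exactly 1500.
    pulse = ((x1 + x2) << 2) + 480
    # Clip to the servo's legal 500..2500 range, as a safety net.
    return max(500, min(2500, pulse))
```

    With x1 = x2 = 1 the raw value is 488 and with x1 = x2 = 254 it is 2512, so the clip only trims a few counts at the extremes.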
  • MagIO2 Posts: 2,243
    edited 2011-11-22 07:03
    Hi kwinn ... I think this would work if the camera is fixed and the servo points something else at the object.

    But as I understand it, the camera should follow the object. The point here is that the difference between 127 (the middle of the camera coordinate system) and the middle of the object only tells you in which direction you have to move the camera. So, if your servo is at 1500 and (127 - objectx) is negative, you'd add a value to 1500. For objects that are always at the same distance you could find a factor to multiply by (127 - objectx). If the object varies in distance this is not possible; then you simply need a loop which approximates the servo position.
  • Dave Hein Posts: 6,347
    edited 2011-11-22 07:37
    Your camera control is basically a closed-loop control system. Unless you get the offset exactly right, the camera will pan all the way to one corner or it will oscillate back and forth, depending on the sign of the offset error. You should use the (x,y) values from the camera as a correction to the current position, as MagIO2 suggested. You could do something like this:

    x_servo += (x1 - 256) << 2
    y_servo += (y1 - 256) << 2

    x1 and y1 are the sums of the coordinate pairs provided by the camera, as described by kwinn. x_servo and y_servo are the servo pulse times in microseconds. As kwinn said, you may need to limit them to a range from 500 to 2500.

    This will cause the (x1,y1) values to converge to (256,256). There will be a time delay, caused by the camera processing and the servo movement, that will tend to destabilize the control loop, so you will probably need to use a lower gain to prevent oscillation. In general, you should do something like this:

    x_servo += ((x1 - x_offset) * x_scale) >> 8
    y_servo += ((y1 - y_offset) * y_scale) >> 8

    Start with a value of 256 for x_offset and y_offset, and adjust them to center the image. Start with a value of 1024 for x_scale and y_scale; increase this to get a quicker response, or decrease it to reduce overshoot and prevent oscillation.

    Note, x_scale and/or y_scale may actually need to be negative values if the servos are connected to move in the opposite direction of the camera values.
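    A Python sketch of that update for one axis (the names and the clamp are mine; as noted above, the scale may need to be negative depending on servo orientation):

```python
X_OFFSET = 256    # value of the coordinate sum when the object is centered
X_SCALE  = 1024   # loop gain; the >> 8 makes the effective gain 4

def correct(x_servo, x_sum):
    # Proportional correction: the scaled error is added to the current
    # pulse width, then the result is clamped to the servo's limits.
    x_servo += ((x_sum - X_OFFSET) * X_SCALE) >> 8
    return max(500, min(2500, x_servo))
```

    Each call nudges the servo toward the position where the error term goes to zero, which is the closed-loop behavior described above.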
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2011-11-22 08:46
    To do this accurately, you need to know two things ahead of time: R, the total angle of rotation of the servo from lock to lock (i.e. the actual angle covered by servo units 500 to 2500); and F, the X-angle of the camera's field of view (which should be given in the camera's spec sheet; otherwise you'll have to measure it). Here's an illustration:

    [Attachment: figure showing the camera's field-of-view angle F, the servo's rotation range R, and the angular deviation dF]

    The camera's field of view can be thought of as a plane against which subjects are projected. The position of a subject relative to the field of view's center will be proportional to the sine of the angle from the center. But as that position approaches the center, we can forget about sines and say that the position relative to the center is proportional to the angle from the center. That makes the math easier.

    In the above figure, the angle of deviation, dF, is given by:
    dF = F * (X - 127.5) / (254 - 1)

    For X = 1, this yields -F/2; for X = 254, it yields +F/2, as expected.

    What we want is that same angle in terms of servo units, dS. This is given by:
    dS = (dR / R) * (2500 - 500),

    where dR == dF, from the above figure.

    Knowing dS and the current position, you can calculate the new position. It's probably better to underestimate the correction a little, though, and take a new reading, so you can home in on the target without overshooting it.
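    The two formulas combine into a few lines. A Python sketch using the thread's numbers (F = 90° per the camera spec; R is assumed to be 180° here and must be measured):

```python
F = 90.0    # camera's horizontal field of view, degrees
R = 180.0   # servo travel from 500 to 2500, degrees (assumed; measure it)

def servo_correction(x):
    # Angular deviation of blob center x (1..254) from the optical axis,
    # using the small-angle approximation described in the post.
    dF = F * (x - 127.5) / (254 - 1)
    # The same angle expressed in servo units (dS).
    return dF / R * (2500 - 500)
```

    For x = 254 this gives +500 servo units (a quarter of the range, since the 90° view is half the assumed 180° travel), and for x = 1 it gives -500.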

    -Phil
  • MagIO2 Posts: 2,243
    edited 2011-11-22 10:27
    I'd propose to add a hysteresis, which means that you don't move if the distance from the current position to the calculated position is smaller than a predefined value. This avoids frequent moves because of small deviations which could for example come from the camera.
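    The hysteresis is a one-liner. A Python sketch (the threshold value is arbitrary):

```python
DEADBAND = 5  # smallest move, in servo units, worth acting on

def with_hysteresis(current_pos, target_pos):
    # Stay put unless the requested move exceeds the deadband; this
    # suppresses jitter from small frame-to-frame camera noise.
    if abs(target_pos - current_pos) <= DEADBAND:
        return current_pos
    return target_pos
```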
  • charleyshf Posts: 165
    edited 2011-11-22 11:41
    First, THANK YOU everyone for responding! This has really been a challenge for me. I had gotten some help here on the forums a while back when I was working on the strings received from the Blackfin camera, and that took me some time to do. I figured that after getting that part out of the way and being able to make use of all the values, the rest shouldn't be "too hard" to deal with :smile: . I'll be honest, at first glance this is going to take me some time to understand.

    Okay, so..

    @Phil - I know that the camera's field of view is 3.6mm f2.0 (90-deg). As for the servo, it's an HS-425BB and I am running it at a regulated 5vdc; the spec sheet says Speed (4.8V/6.0V): 0.21 / 0.16 sec @ 60 deg, so I'd say the speed of the servo is 0.18 sec @ 60 deg, but that is with no load, so maybe 0.20 @ 60 deg?


    @MagIO2 - I was thinking about limiting the minimum size of the returned rectangle, so the tracking won't be all over the place if the object is too tiny. Also, this camera returns the number of pixels it finds in the specified object, so if that's too low I can have it just ignore the object. This made me think of something else: if the returned object's pixel count keeps going up, wouldn't that basically mean the object is getting closer to the camera?

    So everyone knows, the camera can store values of the colors you want to track (in YUV, but I can convert to RGB), up to 16 total, and it can track them all at the same time (somehow I don't think I want to go that route). I had a lot of fun writing Spin code to take all these values, but being able to assign them so the Prop can make use of them was a pain, especially when the camera detects multiple objects: it returns all of them in order of highest pixel count, x1, x2, y1, y2, then slaps a \r\n at the end of each one, sometimes spaces, sometimes not. It was a real eye opener, err, that might have been all the coffee and late nights working on it. For now though I am going to focus on just one object; the others I am planning on having my homemade robot react to different colors and do various things, that is if I can get the tracking part to work, let alone understand it properly.

    Thank you again everyone..
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2011-11-22 11:53
    The servo speed is not what I was after for R but, rather, the total range of motion between the 500 and 2500 values. You may have to determine that experimentally with a protractor.

    -Phil
  • charleyshf Posts: 165
    edited 2011-11-22 12:06
    Hi Phil,

    I just caught that, sorry. Something else I just thought about: if the camera returns values for x1 and x2, let's say x1 = 10 and x2 = 30, and I add those two together and then divide the result by 2, giving me the center, would that scale then be 1-127?

    Thanks again.

    Phil Pilgrim (PhiPi) wrote: »
    The servo speed is not what I was after for R but, rather, the total range of motion between the 500 and 2500 values. You may have to determine that experimentally with a protractor.

    -Phil
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2011-11-22 12:08
    No, the average will still be in the range 1-254.

    -Phil
  • kwinn Posts: 8,697
    edited 2011-11-22 20:02
    @MagIO2, I can see that my response was less than clear or complete. Unfortunately I am much better at visualizing problems and solutions than I am at putting them into words.

    I know the camera has a 90 degree field of view and was assuming a servo with a 90 degree range of motion. If that were the case it would be relatively simple to calculate the servo value required to center (or nearly center) the object in the camera field of view.

    Let's assume the servo is at its center position of 1500 and an object is in the camera's view where X1 = 20 (left) and X2 = 50 (right).

    That would place the center point between the two edges at (20+50)/2 = 35

    The center of the camera field of view is at (1 + 254)/2 = 127.5, rounded to 127

    To center the object in the camera field of view would require a change of 127 – 35 = 92 camera steps.

    The scale factor between the camera and servo would be (2500 – 500)/(254 – 1) = 7.905 which is close enough to 8 to make integer math and shifting instead of floating point practical.

    The 92 camera steps multiplied by 8 (shifted left 3 bits) = 736 servo steps which would be added or subtracted from the current servo position.

    The calculations can be simplified even further.

    254 – (X1 + X2) gives twice the number of camera steps: 254 – (20 + 50) = 184

    184 x 4 = 736 servo steps

    By using signed integer arithmetic the direction the servo turns can be automatically accounted for as part of the calculation, and if a small error in centering is permitted the servo loop will only require a few iterations.
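    The simplified correction fits in one line. A Python sketch (signed, so the direction falls out of the arithmetic):

```python
def correction(x1, x2):
    # 254 - (x1 + x2) is twice the pixel error from center, so it is
    # multiplied by 4 instead of the full scale factor of 8.
    return (254 - (x1 + x2)) * 4
```

    The sign of the result tells the servo which way to turn, and its magnitude is the number of servo steps to move.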
  • charleyshf Posts: 165
    edited 2011-11-23 05:59
    Good morning,

    I spent the better part of the day yesterday trying to understand what's been posted here so far. I really appreciate the input, but honestly I am still lost; I'm not the greatest at programming and I have to go over things multiple times to understand them.

    One thing I was thinking about: I already know that a change of about 11 µs in the pulse width moves the servo 1 degree.

    So if the camera sees an object at x1=20 and x2=50 and finding the center of that (x1+x2)/2 = 35

    assuming the servo is at center 1500 (and can fully turn 180 degrees from far left to far right)
    this also assumes that anything to the left of the camera's center point of view (e.g. 126) will subtract from the current servo position and anything to the right (e.g. 128) will add to the servo's current position.

    The center of the object is at 35(to the left)

    using kwinn's scaling factor between camera & servo (2500 – 500)/(254 – 1) = 7.905 or rounded to 8

    then wouldn't I multiply 8 * 11 = 88 and tell the servo to move to (1500 - 88) = 1412? Or am I on the wrong path?

    Thanks again
  • kwinn Posts: 8,697
    edited 2011-11-23 07:52
    @charleyshf, the first thing you need to determine is the actual scaling factor between the camera positions and the servo positions. You can do this by mounting the camera on the servo and recording the servo, X1, and X2 positions as you move the servo to place an image in several positions across the camera field of view. I would suggest 5 readings with the object at the far left, mid left, center, mid right, and far right of the camera field of view. This will give us the actual scaling factor and linearity.
  • charleyshf Posts: 165
    edited 2011-11-23 13:04
    Hello,

    It's taking me some time because I have to re-wire and do a few other things before I can do this step. But I will tell you, I was told a while back that the camera returns the blob in coordinates of X1, X2, Y1, Y2 (X horizontal, Y vertical), and that each value could be from 1-254. This is WRONG, and it has led me down the wrong path for a while now. The fact is that the camera returns values for x1 and x2 anywhere from 1-160 at my current resolution; if I changed the resolution to 320x240, the returned values could be from 1-320 on the x and 1-240 on the y. This was the first test I did on the camera today because my numbers just weren't adding up. I will be starting the other tests shortly.

    With my test setup the camera is set to 160x120 resolution. It can be set to larger resolutions, but I am sticking with 160x120 until I can get a grip on this.

    kwinn wrote: »
    @charleyshf, the first thing you need to determine is the actual scaling factor between the camera positions and the servo positions. You can do this by mounting the camera on the servo and recording the servo, X1, and X2 positions as you move the servo to place an image in several positions across the camera field of view. I would suggest 5 readings with the object at the far left, mid left, center, mid right, and far right of the camera field of view. This will give us the actual scaling factor and linearity.
  • kwinn Posts: 8,697
    edited 2011-11-24 14:58
    By Blackfin camera do you mean the SRV-1 Blackfin camera with 640x480, 320x256 or 160x128 pixel resolutions as found at “http://www.surveyor.com/index_blackfin.html” ?
  • charleyshf Posts: 165
    edited 2011-11-24 17:31
    Yes, that's the Blackfin camera I have been working with..
    kwinn wrote: »
    By Blackfin camera do you mean the SRV-1 Blackfin camera with 640x480, 320x256 or 160x128 pixel resolutions as found at “http://www.surveyor.com/index_blackfin.html” ?
  • charleyshf Posts: 165
    edited 2011-11-24 18:50
    Okay, I was able to take a few readings. I have to resolve some issues with lighting and cats being curious :thumb: . Each reading was taken 5 times for each position; the object was left stationary while the servo was moved.

    Actual values on the x axis from the camera, versus servo position:

    Servo position   X1    X2   Centroid
    1300 (left)     120   150        135
    1400 (left)      90   122        106
    1500 (center)    60    94         77
    1600 (right)     30    62         46
    1700 (right)      2    30         16

    I tried to get more readings, but the cats made it almost impossible for me(they wanted the object I was working with).
  • kwinn Posts: 8,697
    edited 2011-11-24 21:23
    I have taken the readings you provided, entered them into a spreadsheet, performed some calculations, and plotted the results (see attached picture).

    The first four columns are the numbers you provided.

    The Servo steps column is (1700 – the Servo pos column). This is how far the servo rotated the camera.

    The Camera steps column is the (Centroid column – 16). This is the number of pixels the image Centroid moved on the camera.

    The Scale Factor column is the (Servo steps / Camera steps). The average scale factor is at the bottom of this column, and for this camera resolution it is 3.33 which can be rounded to 3.

    The plot shows a comparison of the servo position and Centroid x Scale Factor. Good correlation and linearity.
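    The spreadsheet columns can be reproduced directly from the five readings. A Python sketch, using the 1700/16 reading as the origin as in the post:

```python
# (servo position, centroid) pairs from the calibration readings
readings = [(1300, 135), (1400, 106), (1500, 77), (1600, 46), (1700, 16)]

servo_ref, cam_ref = readings[-1]
# Servo steps per camera step for each of the other four readings
ratios = [(servo_ref - servo) / (centroid - cam_ref)
          for servo, centroid in readings[:-1]]
scale = sum(ratios) / len(ratios)   # averages to about 3.33
```

    The four individual ratios all fall close to 3.33, which is the linearity the plot shows.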

    The calculation steps for the new servo position (assuming camera resolution is 160x120) would be:

    Repeat:

    Read blob x1 and x2 values from camera

    Calculate Centroid

    If the absolute value of (Centroid – 80) is greater than 5: ServoPosition = ServoPosition – ((80 – Centroid) x 3)

    Move servo to new position

    This should be a fair starting point for the calculations. Try it and see how it works.
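    One pass of that loop in Python (the camera read and the servo driver are left out; the constants come from the steps above):

```python
CENTER   = 80   # image center at 160-pixel horizontal resolution
DEADBAND = 5    # ignore errors of 5 pixels or less
SCALE    = 3    # servo steps per camera pixel, from the calibration

def track_step(servo_pos, x1, x2):
    # Compute the blob centroid and nudge the servo toward it; errors
    # inside the deadband leave the servo where it is.
    centroid = (x1 + x2) // 2
    if abs(centroid - CENTER) > DEADBAND:
        servo_pos -= (CENTER - centroid) * SCALE
    return max(500, min(2500, servo_pos))
```

    With the first calibration reading (servo 1300, blob edges 120/150) this moves the servo to 1465, i.e. toward the position where the centroid reads 80.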
  • charleyshf Posts: 165
    edited 2011-11-25 09:09
    Hi kwinn,

    It's working! There's quite a bit of oscillation; however, it appears to be working. I think I have the code right:
    x_result := ||(x_centroid - 80)
    PST.Str(string("x_centroid = "))
    PST.Dec(x_centroid)
    PST.Str(string(pst#NL))
    PST.Str(string("x_result = "))
    PST.Dec(x_result)
    PST.Str(string(pst#NL))
    if x_result > 5
      cam_x_location := cam_x_location - ((80 - x_centroid) * 3)
      PST.Str(string(pst#NL))
      PST.Str(string("X Servo position result = "))
      PST.Dec(cam_x_location)
      PST.Str(string(pst#NL))
      SERVO.Set(ServoPAN, cam_x_location)

    I really appreciate all of your help in understanding this. If I have this right, then I would think the next step would be to deal with the oscillation I am getting (going back and forth trying to center on the object); I will also have to set limits so that the servo doesn't go outside its range.

    Thanks again
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2011-11-25 09:13
    The oscillation comes from overcorrecting. Just back off on the corrections a little, and it should go away.

    -Phil
  • kwinn Posts: 8,697
    edited 2011-11-25 16:29
    Good to hear that it's working. As Phil posted, the oscillation is probably from over-correcting. Try replacing the 3 in "(80 - x_centroid) * 3" with something smaller like 2, and see if it makes it better or worse. Another thing to try is increasing the 5 in the "if x_result > 5" statement. It also appears you are printing/logging the variables each time through the loop; this will help in figuring out where the oscillation is coming from. It may be due to the servo stopping each time through the loop.
  • charleyshf Posts: 165
    edited 2011-11-25 16:54
    Hello,

    I actually did just that: I changed the "if x_result" threshold to > 12 and I also replaced the 3 with 2, and it's now right on target with no oscillation. I also have a limit set on the minimum number of pixels to react to, so it doesn't try to track something way too small; the camera can track something as small as 5 pixels, at which point it would track just about everything, depending on lighting conditions.

    Thanks again

    kwinn wrote: »
    Good to hear that it's working. As Phil posted, the oscillation is probably from over-correcting. Try replacing the 3 in "(80 - x_centroid) * 3" with something smaller like 2, and see if it makes it better or worse. Another thing to try is increasing the 5 in the "if x_result > 5" statement. It also appears you are printing/logging the variables each time through the loop; this will help in figuring out where the oscillation is coming from. It may be due to the servo stopping each time through the loop.
  • kwinn Posts: 8,697
    edited 2011-11-26 11:24
    charleyshf wrote: »
    Hello,

    I actually did just that: I changed the "if x_result" threshold to > 12 and I also replaced the 3 with 2, and it's now right on target with no oscillation. I also have a limit set on the minimum number of pixels to react to, so it doesn't try to track something way too small; the camera can track something as small as 5 pixels, at which point it would track just about everything, depending on lighting conditions.

    Thanks again

    Even better news. Now you just have to mount it on a robot and program it to chase the cats. Then vengeance can be yours ; - )
  • charleyshf Posts: 165
    edited 2011-11-26 12:57
    EXACTLY! The only thing I am still resolving is the y-axis. I have it working great, except that with the lighting it doesn't take much for it to lose the object. I ended up enabling the auto white balance/gain on the Blackfin, placing the robot on the floor, letting the camera adjust to the lighting, and then disabling the auto white balance/gain, which works much better. But I think the y-axis is going to be set not to look up too far..


    kwinn wrote: »
    Even better news. Now you just have to mount it on a robot and program it to chase the cats. Then vengeance can be yours ; - )