What's the best way to do this, interpolate?
charleyshf
Posts: 165
Good morning,
I've been working on a project of mine on and off for over a year now, and I have been trying to do the following.
I have a vision system (Blackfin camera) that can give me information on a specified color blob by returning rectangular values (x1 - left, x2 - right, y1 - top, y2 - bottom); I am just interested in the x1, x2 values for now. If I add x1 and x2 and then divide by 2, I get the center of the object. Now what I want to do is have my camera actually track the object using a servo, and this is where I am having a hard time. I know that the servo values are 500 (far left) to 2500 (far right), with 1500 being center. The values returned from the Blackfin camera are from 1-254 for x1 and also x2. One term that came up is "interpolate"; I googled it and, well, here I am.....
I am trying to avoid using floating point math if possible, but it's starting to look more and more like I might end up going that route. I was looking at some C code but it didn't help..... So I am wondering if there's a better or simpler way to go about doing this? I'd really appreciate any help.
Comments
As I understand it, the camera should follow the object. The point here is that the difference between 127 (the middle of the camera coordinate system) and the position of the middle of the object only tells you in which direction you have to move the camera. So, if your servo is at 1500 and (127 - objectx) is negative, you'd add a value to 1500. For objects that always stay at the same distance you could find a factor to multiply by (127 - objectx). If the object varies in distance this is not possible; then you simply need a loop which approximates the servo position.
x_servo += (x1 - 256) << 2
y_servo += (y1 - 256) << 2
x1 and y1 are the sums of the coordinate pairs provided by the camera, as described by Kwinn. x_servo and y_servo are the servo pulse times in microseconds. As Kwinn said, you may need to limit them to the range from 500 to 2500.
This will cause the (x1, y1) values to converge to (256, 256). There will be a time delay, caused by the camera processing and the servo movement, that will tend to destabilize the control loop, so you will probably need to use a lower gain to prevent oscillation. In general, you should do something like this:
x_servo += ((x1 - x_offset) * x_scale) >> 8
y_servo += ((y1 - y_offset) * y_scale) >> 8
Start with a value of 256 for x_offset and y_offset, and adjust them to center the image. Start with a value of 1024 for x_scale and y_scale. Increase this to get a quicker response, or decrease it to reduce overshoot and to prevent oscillation.
Note, x_scale and/or y_scale may actually need to be negative values if the servos are connected to move in the opposite direction of the camera values.
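A minimal C sketch of the update rule above, with the 500-2500 clamping mentioned earlier (the clamp helper and function names are my additions; the thread's actual code is in Spin):

```c
/* Clamp a servo pulse value to the mechanical range 500..2500 us. */
static int clamp_servo(int v)
{
    if (v < 500)  return 500;
    if (v > 2500) return 2500;
    return v;
}

/* One iteration of the proportional update from the post:
 *   servo += ((pos - offset) * scale) >> 8
 * pos is the sum x1+x2 (or y1+y2), so the loop drives it toward
 * offset (256). Note: right-shifting a negative value is an
 * arithmetic shift on most compilers, which matches the intent here.
 */
int servo_step(int servo, int pos, int offset, int scale)
{
    servo += ((pos - offset) * scale) >> 8;
    return clamp_servo(servo);
}
```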
The camera's field of view can be thought of as a plane against which subjects are projected. The position of a subject relative to the field of view's center will be proportional to the sine of the angle from the center. But as that position approaches the center, we can forget about sines and say that the position relative to the center is proportional to the angle from the center. That makes the math easier.
In the above figure, the angle of deviation dF for a camera reading X, given a total field of view F, is given by:

    dF = F * (X - 127.5) / 253

For X = 1, this yields -F/2; for X = 254, it yields +F/2, as expected.
What we want is that same angle in terms of servo units, dS. Since the servo's 2000-unit range (500 to 2500) spans its total rotation angle R, this is given by:

    dS = 2000 * dR / R

where dR == dF, from the above figure.
Knowing dS and the current position, you can calculate the new position. It's probably better to underestimate the correction a little, though, and take a new reading, so you can home in on the target without overshooting it.
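Phil's geometric approach can be sketched numerically as follows. Here F is the camera's field of view and R the servo's total rotation, both in degrees; the function name and the floating-point math are my own (the thread is trying to avoid floats, and the integer versions elsewhere in the thread do that):

```c
/* Angle of deviation dF for a camera reading X (1..254) with total
 * field of view F:
 *   dF = F * (X - 127.5) / 253
 * Converted to servo units dS, where the servo's 2000-unit range
 * (500..2500) spans R degrees of rotation:
 *   dS = 2000 * dF / R
 */
double servo_correction(double X, double F, double R)
{
    double dF = F * (X - 127.5) / 253.0;
    return 2000.0 * dF / R;
}
```

For a 90-degree field of view on a 180-degree servo, a blob at the far right edge (X = 254) gives a correction of +500 servo units, i.e. a quarter of the servo's travel, as you would expect.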
-Phil
Okay, so..
@Phil - I know that the camera's field of view is 90 degrees (3.6mm f2.0 lens). As for the servo, it's an HS-425BB and I am running it at a regulated 5 VDC; the spec sheet says Speed (4.8V/6.0V): 0.21 / 0.16 sec @ 60 deg, so I'd say the speed of the servo is about 0.18 sec @ 60 deg, but that is with no load, so maybe 0.20 sec @ 60 deg?
@MagIO2 - I was thinking about limiting the minimum size of the returned rectangular value, this way it won't be all over the place if the object is too tiny. Also with this camera, it also returns the number of pixels it finds in the specified object, so if it's too low I can have it just ignore the object. This made me think of something else, if the returned object's pixel count keeps going up in numbers, wouldn't that basically say the object is getting closer to the camera?
So everyone knows, the camera can store the values of colors you want to track (in YUV, but I can convert to RGB), up to 16 total, and it can track them all at the same time (somehow I don't think I want to go that route). I had a lot of fun writing Spin code to not only take all these values, but to assign them so the Prop can make use of them; that was a pain, especially when the camera detects multiple objects. It returns all of them in order of highest pixel count, x1, x2, y1, y2, then slaps a \r\n at the end of each one, sometimes with spaces, sometimes not. It was a real eye opener, err, that might have been all the coffee and late nights working on it. For now though I am going to focus on just one object; for the others I am planning on having my homemade robot react to different colors and do various things, that is, if I can get the tracking part to work, let alone understand it properly.
Thank you again everyone..
-Phil
I just caught that, sorry. Something else I just thought about: if the camera returns values for x1 and x2, let's say x1=10 and x2=30, and I add those two together and then divide the result by 2, giving me the center, would that scale then be 1-127?
Thanks again.
-Phil
I know the camera has a 90 degree field of view and was assuming a servo with a 90 degree range of motion. If that were the case it would be relatively simple to calculate the servo value required to center (or nearly center) the object in the camera field of view.
Let's assume the servo is at its center position of 1500 and an object is in the camera's view where X1 = 20 (left) and X2 = 50 (right).
That would place the center point between the two edges at (20+50)/2 = 35
The center of the camera field of view is at (254-1)/2 = 127.5 or rounded to 127
To center the object in the camera field of view would require a change of 127 – 35 = 92 camera steps.
The scale factor between the camera and servo would be (2500 – 500)/(254 – 1) = 7.905 which is close enough to 8 to make integer math and shifting instead of floating point practical.
The 92 camera steps multiplied by 8 (shifted left 3 bits) = 736 servo steps which would be added or subtracted from the current servo position.
The calculations can be simplified even further.
254 – (X1 + X2) gives twice the number of camera steps: 254 – (20 + 50) = 184
184 x 4 = 736 servo steps
By using signed integer arithmetic the direction the servo turns can be automatically accounted for as part of the calculation, and if a small error in centering is permitted the servo loop will only require a few iterations.
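Kwinn's simplified integer calculation could be sketched in C like this (the clamping to 500-2500 is added per the earlier advice, and the function name is mine; the sign of the correction may need flipping depending on how the servo is mounted):

```c
/* Kwinn's shortcut for a 1..254 camera range:
 * 254 - (x1 + x2) is twice the number of camera steps needed to center
 * the blob, and the camera-to-servo scale factor is ~7.9, rounded to 8,
 * so multiplying the doubled step count by 4 gives the servo correction
 * directly. Signed arithmetic handles the direction automatically.
 */
int new_servo_position(int servo, int x1, int x2)
{
    servo += (254 - (x1 + x2)) * 4;
    if (servo < 500)  servo = 500;
    if (servo > 2500) servo = 2500;
    return servo;
}
```

Working Kwinn's example: x1 = 20, x2 = 50 gives 254 - 70 = 184, times 4 = 736 servo steps, so from center the servo moves from 1500 to 2236 (or the mirror of that if the mounting reverses the sense).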
I spent the better part of the day yesterday trying to understand what's been posted here so far. I really appreciate the input, but honestly I am still lost; I'm not the greatest at programming and I have to go over things multiple times to understand them.
One thing that I was thinking about was that I already know an 11 µs change in the pulse width moves the servo about 1 degree.
So if the camera sees an object at x1=20 and x2=50, finding the center gives (x1+x2)/2 = 35,
assuming the servo is at center, 1500 (and can fully turn 180 degrees from far left to far right).
This also assumes that anything to the left of the camera's center point of view (e.g. 126) will subtract from the current servo position and anything to the right (e.g. 128) will add to it.
The center of the object is at 35 (to the left).
using kwinn's scaling factor between camera & servo (2500 – 500)/(254 – 1) = 7.905 or rounded to 8
then wouldn't I multiply 8 * 11 = 88 and tell the servo to move to (1500 - 88) = 1412? Or am I on the wrong path?
Thanks again
It's taking me some time because I have to re-wire and also do a few other things before I can do this step. But I will tell you I was told a while back that the camera returns the blob in coordinates of X1, X2, Y1, Y2, where X is the horizontal and Y the vertical, and that each value could be from 1-254. This is WRONG, and it's something that has led me down the wrong path for a while now. The fact is that the camera returns values for x1 and x2 anywhere from 1-160 at the current resolution; if I changed the resolution to 320x240, the values returned could be from 1-320 on the x and 1-240 on the y. This was the first test I did on the camera today because my numbers just weren't adding up. I will be starting the other tests shortly.
With my test setup the camera is set to 160 x 120 resolution, it can be set to larger resolutions, but I am sticking with the 160x120 until I can get a grip on this.
Actual values on the x axis from camera and servo position:

    Servo position    X1    X2    Centroid
    1300 (left)      120   150      135
    1400 (left)       90   122      106
    1500 (center)     60    94       77
    1600 (right)      30    62       46
    1700 (right)       2    30       16
I tried to get more readings, but the cats made it almost impossible for me(they wanted the object I was working with).
The first four columns are the numbers you provided.
The Servo steps column is (1700 - Servo pos column). This is how far the servo rotated the camera.
The Camera steps column is (Centroid column - 16). This is the number of pixels the image centroid moved on the camera.
The Scale Factor column is (Servo steps / Camera steps). The average scale factor is at the bottom of this column, and for this camera resolution it is 3.33, which can be rounded to 3.
The plot shows a comparison of the servo position and Centroid x Scale Factor. Good correlation and linearity.
The calculation steps for the new servo position (assuming camera resolution is 160x120) would be:
Repeat:
  Read blob x1 and x2 values from camera
  Calculate Centroid = (x1 + x2) / 2
  If abs(Centroid - 80) > 5 then ServoPosition = ServoPosition - ((80 - Centroid) x 3)
  Move servo to new position
This should be a fair starting point for the calculations. Try it and see how it works.
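One iteration of that loop, written as a self-contained C sketch (the servo and camera I/O are left to the caller since they are hardware-specific; the deadband of 5 and gain of 3 come from the steps above):

```c
#include <stdlib.h>

/* One iteration of the tracking loop for a 160x120 camera: drive the
 * blob centroid toward camera x = 80, with a deadband of 5 pixels and
 * a gain (scale factor) of 3 servo units per camera pixel. The sign
 * matches the measured table: a higher servo position gives a lower
 * centroid reading.
 */
int track_step(int servo, int x1, int x2)
{
    int centroid = (x1 + x2) / 2;
    if (abs(centroid - 80) > 5)
        servo -= (80 - centroid) * 3;
    /* keep the pulse width inside the servo's 500..2500 us range */
    if (servo < 500)  servo = 500;
    if (servo > 2500) servo = 2500;
    return servo;
}
```

Checking against the table: with the servo at 1700 the centroid reads 16, so one step gives 1700 - (80 - 16)*3 = 1508, close to the 1500 center reading, and further iterations home in from there.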
It's working! There's quite a bit of oscillation, however it appears to be working. I think I have the code correct:
I really appreciate all of your help in understanding this. If I have this right, then I would think the next step would be to deal with the oscillation that I am getting (going back and forth trying to center on the object). I will also have to set limits so that the servo doesn't go outside its range.
Thanks again
-Phil
I actually did just that. I changed the if x_result to > 12 and I also replaced the 3 with 2, and it's actually right on target with no oscillation. I also have a limit set on the minimum number of pixels to react to; this way it doesn't try to track something way too small, as the camera can track something as small as 5 pixels (at which point it would track just about everything, depending on lighting conditions).
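With those two tweaks (deadband widened to 12, gain lowered from 3 to 2), the loop sketched earlier becomes something like the following. This is again a hedged C sketch, not the poster's actual Spin code; min_pixels is my name for the minimum-blob-size check described above:

```c
#include <stdlib.h>

/* Tuned tracking step: deadband 12, gain 2, plus a minimum blob size
 * so tiny noise blobs are ignored instead of chased.
 */
int track_step_tuned(int servo, int x1, int x2, int pixels, int min_pixels)
{
    if (pixels < min_pixels)
        return servo;               /* blob too small: ignore it */
    int centroid = (x1 + x2) / 2;
    int x_result = centroid - 80;   /* error from camera center (160-wide image) */
    if (abs(x_result) > 12)
        servo += x_result * 2;      /* gain of 2 instead of 3 */
    if (servo < 500)  servo = 500;
    if (servo > 2500) servo = 2500;
    return servo;
}
```

The wider deadband stops the hunting near center, and the lower gain trades a little response speed for stability, which matches what the poster observed.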
Thanks again
Even better news. Now you just have to mount it on a robot and program it to chase the cats. Then vengeance can be yours ; - )