
NEW PRODUCT: TSL1401-DB Linescan Imaging Sensor


Comments

  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-04 16:37
    'Good idea! It would have to go on the front, I suppose, since the backside is often hidden by any board it might be plugged into. I'll have to see if I can fit it on there somehow.

    -Phil
  • Lawson Posts: 870
    edited 2009-06-04 18:22
    Just saw this! Really useful looking device. By any chance, would this also be compatible with the TSL3301? The digital interface with on-chip ADC of the TSL3301 looks a lot simpler to deal with on a Prop/SX.

    Lawson

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Lunch cures all problems! Have you had lunch?
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-04 19:21
    Lawson,

    It's not available with the TSL3301. 'Sorry.

    I've programmed for both chips. In fact, I designed TAOS's TSL3301EVM for them. The TSL1401 is much easier to work with, IMO, which is the reason I picked it for this module. The '3301 has an onboard ADC but uses a rather complex isosynchronous serial protocol. Also having to clock out the analog pixel levels one bit at a time can slow things down unless you're using a really fast processor. (The TSL3301EVM required a 50MHz SX.) Since the MoBoStamp-pe already has onboard ADCs, and the Propeller can do sigma-delta, the analog output seemed the more natural choice. Plus, it's accessible with a BS2 without the analog processing, just using the Stamp's input logic threshold on each pixel to form a binary (light/dark) image.

    -Phil
  • ElectricAye Posts: 4,561
    edited 2009-06-04 23:44
    Any idea how close up this camera can "see" and still be in focus?

    thanks,
    Mark
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-05 00:05
    With the lens hanging by a thread (i.e. screwed almost all the way out), a field of view about 1/2" wide is in focus. That corresponds to a subject distance of the same amount from the front of the lens.

    I determined this by printing a set of parallel lines 0.04" apart, then setting up the monitor program to count light-to-dark edges. At the best focus, there were 12 of them in the field of view (12 x 0.04" = 0.48", or about 1/2").

    BTW, one is not stuck using the lens and holder provided. Any lens with a standard M12 x 0.5 thread will fit the holder, and any holder with a standard 22mm mounting hole spacing will fit the board.

    -Phil
  • ElectricAye Posts: 4,561
    edited 2009-06-05 02:53
    Phil Pilgrim (PhiPi) said...
    ...

    BTW, one is not stuck using the lens and holder provided. Any lens with a standard M12 x 0.5 thread will fit the holder, and any holder with a standard 22mm mounting hole spacing will fit the board.

    -Phil

    Thanks, Phil, that's very good to know. :)
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-07 20:40
    Since there was another thread about using the TSL1401-DB for distance sensing, I decided to cobble together a setup with a cheap laser line generator from China and try it out. The specs say 3.5V - 4.5V, but mine runs fine on 5V. I connected it to the daughterboard's mezzanine connector using a mating plug from DigiKey. (I recommend plugging it in before soldering the wires to the pins, just to keep them aligned when the plastic shell softens from the heat.) Here's a diagram from the other thread that illustrates the principle:

    attachment.php?attachmentid=61405

    And here's a photo that shows my implementation:

    attachment.php?attachmentid=61412

    I found that a red filter works best to increase contrast, so I drilled a hole in the lens cover and cut out a piece of dark red filter material I got from Edmund Scientific. You can also use Rubylith, which may still be carried by some graphic art dealers. If you can't find it, a local print shop would probably just give you a piece if they have it, since few places use it any more. The inset in the photo above shows the filter installed, with a sub-inset showing the pieces.

    Here is what the system looks like aimed at a piece of paper:

    attachment.php?attachmentid=61413

    The monitor program is helpful for setting everything up. Here is some sample output from the laser line:

    attachment.php?attachmentid=61414

    I wrote a short program in PBASIC to read the location of the brightest pixel and display it in the DEBUG screen, along with a moving pointer. Here is the main program. This can be copied and pasted into the template provided on the TSL1401-DB product page:

    ' -----[ Program Code ]----------------------------------------------------

    ' Your program code goes here.

    value           VAR     Byte

    OWOUT owio, 0, [SETEXP, 200]                  'Set exposure to 200.
    DO
      OWOUT owio, 0, [ACQBIN]                     'Snap a picture.
      GOSUB Ready                                 'Wait for picture.
      OWOUT owio, 0, [DUMPADR, MAXLOC]            'Read the location of brightest pixel.
      OWIN owio, 2, [value]
      DEBUG HOME, DEC value, CLREOL, CR           'Display it on debug screen,
      DEBUG REP "_"\value - 1, "|", REP "_"\(128 - value)  ' and as a pointer.
    LOOP
    END

    Here's some sample output:

    attachment.php?attachmentid=61415

    -Phil


  • ElectricAye Posts: 4,561
    edited 2009-06-07 21:48
    Very impressive! My Propeller is just dying to get this new set of eyes.


    :)
  • W9GFO Posts: 4,010
    edited 2009-06-08 16:40
    To avoid having to use a line generator: how would it work if the laser were carefully aligned and defocused slightly? My thinking is that with the line generator, nearly all of the laser light is wasted because the sensor cannot see it. But if it were a spot, and maybe a couple of pixels off target, defocusing should bring more of the laser light onto the sensor than a line would. With a proper sturdy mount and careful alignment, even defocusing may not be needed.

    Sound plausible?

    Rich H
  • Beau Schwabe Posts: 6,568
    edited 2009-06-08 17:29
    Phil Pilgrim,

    This is the same principle used in some laser range finders. There was a lengthy thread several months ago about a method similar to the one you posted; it suggested using two TAOS light-to-frequency converters in a differential relationship to one another and went over the math on how to calculate the distance from a known laser angle and a known distance between the laser and the TAOS sensors. I posted it a while ago, but for some reason I can't find it now. The example was titled something like "The Trig method" and also had a video demonstration.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    Beau Schwabe

    IC Layout Engineer
    Parallax, Inc.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-08 17:55
    Rich, you're right of course: a spot or shorter line would concentrate more light energy and produce a higher response in the sensor. The devil is in the details, and trying to align a spot would be an exercise in frustration, I'm afraid.

    But you can easily generate a shorter line with a spot laser and a cylindrical lens (e.g. the kind used with magnifying rulers). Attached is a set of photos to illustrate. There are also holographic diffusers available that can shape a beam of light almost any way you want. I would steer clear of cylindrical Fresnel lenses, though. They tend to produce a dashed line and the sensor could easily get lost between dashes.

    BTW, defocusing is a good way to wring more precision from 128 pixels. The shape of the peak will become Gaussian, and by observing the response on the pixels around the highest one, you can do a weighted average to home in more finely on the peak position.
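
    For illustration only, here's a minimal sketch of that weighted-average refinement, written in plain Python rather than Stamp or Propeller code; the array name and window size are assumptions, not anything from this module's firmware:

    # Illustrative sketch only: sub-pixel peak location by a weighted average
    # of the pixels around the brightest one. Assumes 'pixels' holds the 128
    # grayscale values and the (defocused) peak is several pixels wide.
    def subpixel_peak(pixels, half_window=4):
        p = max(range(len(pixels)), key=lambda i: pixels[i])   # brightest pixel
        lo = max(0, p - half_window)
        hi = min(len(pixels), p + half_window + 1)
        window = pixels[lo:hi]
        total = sum(window)
        if total == 0:
            return float(p)
        # centroid of the window, in (fractional) pixel coordinates
        return sum(i * w for i, w in zip(range(lo, hi), window)) / total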

    -Phil
  • Tubular Posts: 4,705
    edited 2009-06-08 21:31
    Great experimenting as always, Phil.

    I think it would be well worth you designing up a 'laser mezzanine'. Admittedly the resolution may be lower with the laser so close to the lens, but you could recover that using a Gaussian fit as you describe. Line scanning is pretty neat, but laser range finding? Who _wouldn't_ want to add that to their robot?

    I regularly use industrial sensors based on the same principle (Balluff, Leuze ODSL 8, SICK OD25, etc.). The distance between the rx and tx axes would be about 20mm at the sensor, which is comparable to what might work on the mezzanine.

    tubular
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-09 23:43
    In the "Stamps in Class" forum, I've posted a thread showing how to use the TSL1401-DB with a BS2 for object tracking — in this case a pendulum.

    -Phil
  • Humanoido Posts: 5,770
    edited 2009-06-10 10:02
    Do you think you can add a distance function and if so, what would be the estimated range? Can I use this on my kite to determine its distance in the first 500 feet or so? What would be the upper end range limit?

    humanoido
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-10 16:10
    The laser technique I outlined depends on the strength of the laser and its distance from the camera's optical axis. Outdoors, the laser would be swamped by ambient lighting.

    Another approach, which I have not tried, would be to use two linescan sensors, with their sensor axes aligned, but separated by some distance. Then you would compare the two images, shifting one of them left or right to get the best match. The amount of shift would determine the distance. This is the way one's eyes work for perceiving distance (binocular vision) and the way optomechanical rangefinders work. But the accuracy depends on the separation of the two sensors and upon their resolution. It also depends on the feature richness of the image. If you're looking at a blank wall, for example, there's nothing to correlate between the two images.
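
    Purely to illustrate that shift-and-compare idea (this is not code from either thread), here's a small Python sketch that scores each shift of one 128-pixel scan against the other with a sum of absolute differences and keeps the best one; the names and the scoring choice are assumptions:

    # Illustrative sketch: estimate the disparity between two linescan images
    # by shifting one against the other and keeping the shift with the best
    # match (smallest mean absolute difference over the overlapping pixels).
    def best_shift(left, right, max_shift=32):
        best, best_score = 0, float("inf")
        for s in range(-max_shift, max_shift + 1):
            pairs = [(l, right[i + s]) for i, l in enumerate(left)
                     if 0 <= i + s < len(right)]
            if not pairs:
                continue
            score = sum(abs(l - r) for l, r in pairs) / len(pairs)
            if score < best_score:
                best, best_score = s, score
        return best   # with a known sensor separation, the shift maps to distance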

    For your kite, and for the distances required, even the rangefinder technique would be pretty dicey, I think. At the very least, you'd need a Propeller or some other processor with enough memory to hold both grayscale images and that's capable of the correlation math. Also, the imager separation required for accuracy at that distance might be greater than the size of your kite.

    -Phil
  • David B Posts: 592
    edited 2009-06-10 16:12
    I'm trying to think of what would be a good algorithm for finding the position of a wide peak on an array like this, assuming that we initially know nothing about the position of the peak.

    So what if we write some code to find the simple average for a window of maybe 16 cells. We slide that window along the entire array, saving the location of the window having the greatest simple average.

    Then we find the weighted mean of the data at the position of that window, to give us a final value for the position of the peak.

    I used the value of 16 just as an example. I guess that in practice you'd want this window wide enough to take in all of the peak, or else we'll be throwing away useful data, but not so wide that it extends beyond the peak, or we'll be including excess noise.

    Does this make sense?
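
    To make that concrete, here's a rough Python sketch of the sliding-window search plus the weighted mean, using the 16-cell window only as the example value; none of this is tested code:

    # Illustrative sketch: find the window with the greatest average, then
    # take the weighted mean of the pixels inside it as the peak position.
    def wide_peak(pixels, window=16):
        best_start, best_sum = 0, -1
        for start in range(len(pixels) - window + 1):
            s = sum(pixels[start:start + window])
            if s > best_sum:
                best_start, best_sum = start, s
        w = pixels[best_start:best_start + window]
        total = sum(w)
        if total == 0:
            return best_start + window / 2.0
        return sum((best_start + i) * v for i, v in enumerate(w)) / total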
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-06-10 16:21
    Yes, that makes perfect sense. Another approach, if you know the width of the peak you're looking for, would be to slide a model of the peak across the image and compute the correlation at each point. Then pick the point with the highest correlation. This can be done pixel-by-pixel first, then on a sub-pixel basis, once the gross location has been determined. It would require a processor with enough memory to hold an entire grayscale scan, such as a Propeller.
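
    Again purely as an illustration (not Phil's code), a tiny Python sketch of sliding a peak model across the scan and keeping the offset with the highest correlation; a real version might subtract the mean or normalize before correlating:

    # Illustrative sketch: matched-filter style search for a known peak shape.
    def match_template(pixels, template):
        best_pos, best_score = 0, float("-inf")
        for pos in range(len(pixels) - len(template) + 1):
            segment = pixels[pos:pos + len(template)]
            score = sum(p * t for p, t in zip(segment, template))
            if score > best_score:
                best_pos, best_score = pos, score
        return best_pos

    # Example model: a triangular peak roughly 9 pixels wide
    template = [1, 2, 3, 4, 5, 4, 3, 2, 1]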

    -Phil
  • W9GFO Posts: 4,010
    edited 2009-09-07 07:24
    Hey Phil, do you have any Spin code ready that you could share? Just to get me started?

    Rich H

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    The Servo Boss, a 12 channel servo tester kit from Gadget Gangster.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-09-07 18:02
    Rich,

    No, I'm sorry, I don't yet. However, with the impending debut of the Propeller Backpack, which the TSL1401-DB will plug into, it's nearing the top of my to-do list.

    -Phil
  • W9GFO Posts: 4,010
    edited 2009-09-09 08:47
    Here is an attempt at translating the BS2 program into Spin.

    It's not working yet; it might have to do with the SHIFTIN part, or who knows what else. The terminal display is not working right either: things like HOME and CRSRXY aren't working, but I think that is because the BST terminal doesn't support them.

    I'll try it on a windoze computer later, but for now here it is, such as it is...

    Rich H

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    The Servo Boss, a 12 channel servo tester kit from Gadget Gangster.
  • RobertW Posts: 66
    edited 2009-09-18 02:28
    Phil,

    We grind some tool bits at the place I work. The parts come in two different widths, 0.500 and 0.750 inches wide. We measure the width with micrometers, but if they are not held correctly there can be errors in measuring, especially with new guys. I have thought about methods to measure the parts 'non-contact' and learned of this sensor. I don't fully understand the relationship between the distance between the sensor and the part and the resulting field of vision, to know if this sensor could measure the width of the parts within 0.001 inch per side. Is it possible? Or, is it possible to measure within the tolerance if I scanned one side, moved the part a set distance, scanned the other side, and calculated the width?

    Thank you,

    Rob W.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2009-09-18 18:32
    Rob,

    The precision you need is one part in 750 for a 3/4" bit. The TSL1401 chip has 128 pixels; it would need 1500 to measure to that precision directly. You could, as you suggest, make a fixture that moves the bit from side-to-side by a known distance. In order to avoid errors due to parallax, it should be moved so that the edge being inspected is on the camera's optical axis. In other words, each bit would have to be moved very precisely by its own diameter. Also, if you're inspecting the fluted part, you will want to rotate the bit in front of the camera (without any wobble) so you can get a maximum reading. Finally, for maximum optical contrast, you will want to use a diffuse backlight.

    Another factor to consider is the lens itself. To get the required 0.0005" precision, the total field of view would have to be 0.0005" x 128 = 0.064". With the supplied 8mm f.l. lens, the subject distance equals the field of view, so the subject would have to be 0.064" from the lens. Unfortunately, this lens will not focus that closely without screwing it all the way out of its holder. You could use a longer focal length lens to narrow the field of view (i.e. magnify the edge). Basically, the lens would become a microscope for the linescan array and would have to magnify a subject by at least 5X. The reason for this is that the sensor's pixels are spaced at 400 pixels per inch. You need the equivalent of 2000 pixels per inch to get the required precision.
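
    A quick back-of-the-envelope restatement of those numbers, shown as a Python snippet purely for convenience (all values come from the paragraph above):

    pixels        = 128                          # TSL1401 resolution
    precision     = 0.0005                       # required precision, inches
    fov           = pixels * precision           # 0.064" total field of view
    sensor_ppi    = 400                          # pixel spacing on the chip, pixels per inch
    needed_ppi    = 1 / precision                # 2000 pixels per inch at the subject
    magnification = needed_ppi / sensor_ppi      # 5x, hence a microscope-style lens
    print(fov, magnification)                    # 0.064 5.0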

    All of these options, of course, will require very tight fixturing to get the required accuracy. It's not impossible, but you'd be looking at some significant optical and mechanical development to make it work.

    -Phil
  • RobertW Posts: 66
    edited 2009-09-18 23:21
    Hi Phil,

    Thank you for the quick reply and the thorough explanation of the sensor. The bits being measured are more rectangular in nature, so we would not have to worry about rotating them to find the largest diameter of the tip like a drill or endmill, but I can see that the standard sensor and lens does not have a wide enough field of vision to work 'out of the box' in this application. Making a fixture that moves the bits with the required precision is currently beyond my level of knowledge. It is an interesting idea and I will have to give it some thought on how it could work. Thanks again.

    -Rob W.
  • W9GFO Posts: 4,010
    edited 2009-09-19 16:34
    Is there any reason why a go, no-go gage would not work?

    Rich H

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
    The Servo Boss, a 12 channel servo tester kit from Gadget Gangster.
  • Brian_B Posts: 842
    edited 2010-01-17 16:25
    I don't know if anyone has seen this yet, and I hope I don't get in trouble for posting it :-). Here is what the new Parallax MoBo project box looks like. AWESOME as always.



    Brian


    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔





    "Imagination is more important than knowledge..." Albert Einstein

  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2010-01-17 18:52
    Brian,

    I'm glad you like the enclosure, and thanks for posting! Hopefully, these will be available from Parallax soon!

    -Phil

    Addendum: I can't tell from your photo. Did you get the elevator socket with the kit, to raise the LED display to window level? Thanks, -P.

  • Brian_B Posts: 842
    edited 2010-01-17 19:25
    Phil,
    No elevator socket, but I think they look fine down low. I really see you taking this a long way. Do you have other face plates in the works (LCDs, membrane switches, sensors)?

    VERY Cool !

    Brian

    P.S. I have a line scan camera project that I'm going to start in February. I'm going to start off sorting washers by size on a moving conveyor and then see how fast I can sort potatoes by size. Be warned, the questions are coming.

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔





    "Imagination is more important than knowledge..." Albert Einstein
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2010-01-17 20:31
    Brian_B said...
    ...and then see how fast I can sort potatoes by size.
    Oh boy! You're embarking on a brave endeavor! That's how I spent the mid-80s, but with apples, kiwis, papayas, pears, and peaches. Fortunately, I was working with a true mechanical genius, as my mechanical skills at the time were zilch. Even then, it was quite a ride!

    To start, you will need an incremental shaft encoder (with a flex coupling — lesson learned the hard way) for the conveyor, so you can acquire scans at regular spacings. That's what the TSL1401-DB's trigger input is for. How were you planning to do the sorting (i.e. getting the spuds off the conveyor at their proper locations)? You might consider interfacing the MoBo/1401 to a PLC to keep track of stuff on the conveyor after it's sized. I've got an RS485 daughterboard in the works that would work well for that.

    -Phil
  • Brian_B Posts: 842
    edited 2010-01-17 21:09
    Phil,
    We have a 2 lane "Haugen" at work and we need an extra lane ($75,000, I've been told). I've been studying it for about 6 months now and I think I've got the basic concept of how to line up the potatoes and how to kick them off the belt at the right point without copying their hardware.

    This is how I see it working:
    1) The camera is triggered at first sight of the potato; the controller marks the count on the encoder.
    2) After the camera has seen the whole potato, the controller figures out how many pixels the potato took up (say 1500).
    3) If the potato is 250-500 pixels it goes into box 1; 500-1000, box 2; 1000-2000, box 3; etc.
    4) The controller then adds the number of encoder pulses it takes to get to box 3 to the first-sight count.
    5) When the potato arrives at the right count on the encoder, it is kicked off the belt.

    I'm hoping that I'm not totally wrong on how I thought this should work.
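
    A rough, purely hypothetical sketch of the bookkeeping in steps 1-5, written in Python just to show the logic; the box thresholds and kicker offsets below are made-up numbers, not anything from the Haugen machine or this thread:

    # Illustrative only: queue each sized potato with a target encoder count,
    # and fire the kicker for its box when the conveyor reaches that count.
    from collections import deque

    KICKER_OFFSETS = {1: 400, 2: 800, 3: 1200}   # encoder counts from camera to each box

    pending = deque()                            # (target_count, box) for potatoes on the belt

    def on_potato_sized(first_seen_count, size_pixels):
        box = 1 if size_pixels < 500 else 2 if size_pixels < 1000 else 3
        pending.append((first_seen_count + KICKER_OFFSETS[box], box))

    def on_encoder_count(count):
        while pending and pending[0][0] <= count:
            _, box = pending.popleft()
            fire_kicker(box)                     # hypothetical output routine

    def fire_kicker(box):
        print("kick into box", box)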


    I also have 10 micro smart PLCs that I bought cheap.


    Thanks,


    Brian

    ▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔





    "Imagination is more important than knowledge..." Albert Einstein

  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2010-01-17 22:00
    What's the conveyor speed (fpm)?

    -Phil