Why does the TSL1401R-LF lose accuracy when you reduce the delay time between CLK pulses?

I'm trying to achieve 30 fps with my linescan camera module by reducing the delays between the SI and CLK pulses. However, I noticed that as I reduce the delays, the accuracy falls in proportion. With a 170 microsecond delay the accuracy was quite high: the camera's output values dropped noticeably when an object passed through its line of vision. When I reduce that to a 20 microsecond delay, I still see lower values if I cover the camera, or higher values if I shine a light on it, but the change is slight, and there is no change at all when I hold an object at a distance.

The lower the delay time, the worse the accuracy. This confuses me because the datasheet states that the minimum delay time can be as low as 20 nanoseconds for the SI pulse and 50 nanoseconds for the CLK pulses. Since I'm still clearly above the minimum, why am I losing so much accuracy? Am I doing something wrong, or is this just how the TSL1401R-LF works? Is there no way to achieve 30 fps without losing accuracy?

Comments

  • It's not clear to me how reducing the SI->CLK delays will help you get to 30 fps. The frame rate is solely dependent upon the SI->SI interval, which has to be 33.33ms for 30 fps. That's a lot of time to clock in 128 pixels (260 usec per pixel, to be exact). So you really do not need nanosecond-scale timing to achieve your objective.

    -Phil
    “Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away.” -Antoine de Saint-Exupery
  • Phil Pilgrim (PhiPi) wrote:
    It's not clear to me how reducing the SI->CLK delays will help you get to 30 fps. The frame rate is solely dependent upon the SI->SI interval, which has to be 33.33ms for 30 fps. That's a lot of time to clock in 128 pixels (260 usec per pixel, to be exact). So you really do not need nanosecond-scale timing to achieve your objective.

    In order to achieve 30 fps I need my program to finish running in 33 seconds or less. The thing that is slowing down my program is the delays after the HIGH and LOW commands that send digital signals to the camera. I don't necessarily have to reduce the delays after the SI signal, but I'm just confused as to why the accuracy fell even though I'm well above the minimum hold time.
  • Yemen4u2 wrote:
    In order to achieve 30 fps I need my program to finish running in 33 seconds or less.

    33 milliseconds.

    Yemen4u2 wrote:
    I don't necessarily have to reduce the delays after the SI signal, but I'm just confused as to why the accuracy fell even though I'm well above the minimum hold time.

    Without seeing your program, it would be hard for me to say what's happening. But the main thing is that you don't have to reduce the delays nearly that much, and you can quite comfortably reach your fps goal using a timing that gives you the sensitivity required for your application.

    If you're still concerned about this, you should probably contact AMS customer support directly.

    -Phil
  • Phil Pilgrim (PhiPi) wrote:
    quite comfortably reach your fps goal using a timing that gives you the sensitivity required for your application.

    What exactly does that mean? How would I do this?

    This is my code. If you can think of a way for me to reach the 33 millisecond time without reducing the delays, I'd be happy to implement it.
    int delayTime = 170;

    void readPixels()
    {
      // Start sequence: SI high, CLK high, SI low, CLK low
      digitalWriteFast(SI, HIGH);
      delayMicroseconds(delayTime/2);
      digitalWriteFast(CLK, HIGH);
      delayMicroseconds(delayTime/2);
      digitalWriteFast(SI, LOW);
      delayMicroseconds(delayTime/2);
      digitalWriteFast(CLK, LOW);
      delayMicroseconds(delayTime);

      // Clock out and sample all 128 pixels from the three sensors
      for(int i = 0; i < 128; i++)
      {
        digitalWriteFast(CLK, HIGH);
        pixelsArray1[i] = analogRead(Cam1Aout);
        pixelsArray2[i] = analogRead(Cam2Aout);
        pixelsArray3[i] = analogRead(Cam3Aout);
        delayMicroseconds(delayTime);
        digitalWriteFast(CLK, LOW);
        delayMicroseconds(delayTime);
      }

      delayMicroseconds(20);
    }

    
  • Phil Pilgrim (PhiPi), edited 2016-06-25 - 19:01:29
    I don't know which processor you're writing this code for or what the C instruction overhead is. Also, how much time does the analogRead function require?

    Perhaps some oscilloscope traces would be helpful.

    -Phil
  • Phil Pilgrim (PhiPi), edited 2016-06-28 - 16:19:22
    I think I see the problem now. The exposure (integration) time is exactly the SI->SI time interval (minus the first 18 clock periods). The shorter the integration time, the lower the output response will be, i.e. the darker the image. By shortening the CLK pulses, and without adding delays after the 129th CLK, the SI->SI time also gets shorter, resulting in less total light integration.

    If, by shortening your integration time to 33 ms, you get insufficient response, your only recourse is to increase the illumination on your subject.

    -Phil
  • Phil,

    Since you seem to be familiar with this sensor, let me ask you a question. I'm using just the sensor, not the Parallax board (I'm doing full-contact edge sensing, so I couldn't use the Parallax board and lens).

    Maybe it's because I'm running it at 1000+ fps (with the appropriate illumination to match), but I'm seeing a signal output on my scope (as I scan the 128 pixels) that looks like a ski slope, as if pixels on the far side get more integration time than those on the near side of the array. So I tried slowing it down some, but that didn't seem to help: same overall shape, just more amplified. Maybe I need to slow it all down to 30 fps (and reduce the lighting) to get a flat, linear response from each pixel?

    So right now I'm just planning to do something in software to adjust the level. Not a big deal; the sensor otherwise works great. I'm simply curious whether what I'm seeing is a normal characteristic of these sensors. As I scan from left to right through the pixel array, the pixels towards the right are as much as twice as responsive. Then, before the scan reaches the very end of the array, the response starts going back down again! That last detail especially makes me think this is just a normal characteristic of these sensors, due to normal variation throughout the wafer during manufacture (and why analog chips are harder to make than digital).
    I am the Master, and technology my slave.
  • Phil Pilgrim (PhiPi), edited 2016-06-28 - 16:20:22
    The_Master,

    I suspect that uneven lighting is causing the "ski-slope" output you're seeing. Try rotating the array (but not the lighting) 180 degrees and see if you get the opposite response. Also, don't forget to add that 129th CLK pulse before the next SI sequence.

    -Phil
  • You're right. I'm a little embarrassed now.

    To make a long story short, there's a lot more 'directionality' to my LED light source than I realized.

    This sensor is great.
    I am the Master, and technology my slave.
  • No reason to be embarrassed. Lighting is one of the trickiest -- yet most critical -- factors to master when designing an optical sensing system. Good luck with your project! :)

    -Phil