Video question — Parallax Forums

Monroe Posts: 18
edited 2014-04-01 17:49 in General Discussion
There is a built-in video generator in the Propeller, I understand. What I'm interested in is whether this video generator can output NBTV (narrow-band TV) at 48 lines of resolution, rather than the standard NTSC or PAL signal, or whether it is fixed to those systems only. In other words, can we manipulate the way the Propeller generates video, or is it locked to only those standards?

Monroe

Comments

  • Mark_T Posts: 1,981
    edited 2014-03-05 04:31
    The Propeller video generation hardware is very flexible, but I can't see that anyone's written
    a driver for NBTV before, so it might take some work. Is there a good description of the NBTV
    signal somewhere?
  • Monroe Posts: 18
    edited 2014-03-05 06:37
    The NBTV band is very flexible. The standard is 32 lines, but we need a little more than that. Our application is broadcasting video from space over long distances. Right now my team, Team Prometheus, is working with Gary Millard on a software solution for the R-Pi. http://users.tpg.com.au/users/gmillard/nbtv/nbtv.htm This is some preliminary work to determine the project's feasibility and to test transmission of the narrowband signal.

    The reason I'm asking is we are also working with Team Sol-X and their GDB http://www.solarsystemexpress.com/ which uses the propeller. Our main interest in the propeller is for a Cubesat we are designing along with our UAV Spaceplane the Condor https://www.facebook.com/photo.php?fbid=737764666249732&set=a.117123151647223.17748.100000486155457&type=1&theater

    We have been working with the ArduPilot as our autopilot because we can do HIL testing, and that helped us develop the spaceplane project rapidly. We are looking at the Propeller, and possibly the Propeller II, as a solution there also. Moving over to the Propeller will take a good deal of work, but being the team captain means that if I decide to move to Parallax, we will.

    Monroe
  • potatohead Posts: 10,261
    edited 2014-03-05 06:54
    I would be surprised if the Prop can't do this. You can easily display it too, perhaps with an ordinary TV / VGA driver.

    Prop video is software defined. You write a kernel which outputs the signal, and maybe does some other things. Best start is one of the existing driver templates and modify from there. Eric Ball has a few out there to start from.
  • Monroe Posts: 18
    edited 2014-03-05 07:06
    That is indeed good news! One step forward. Now if we can get the Propeller to function as an SDR and a modem, and at the same time handle the reaction wheels for the satellite, we would have a perfect solution. Not being a programmer, I am feeling around in the dark, but if I can determine that the Propeller can handle our objectives I will make an effort to learn with the Propeller. If anyone can consult with me on the viability of using the Propeller, please contact me at monroe@teamprometheus.org. Any help that can direct me toward a solution would be appreciated. It is a large task; I am the manager of the project, and I need guidance to go in the direction of the Propeller.
  • kwinn Posts: 8,697
    edited 2014-03-05 08:21
    There is already a modem object in the OBEX, so that leaves only the SDR function and the reaction wheels from your list. I am pretty sure the Prop can handle both, but it would be best to post a description of what those functions involve to be sure.

    When it comes to controlling mechanical systems the propeller is an excellent choice.
  • Dave Hein Posts: 6,347
    edited 2014-03-05 08:39
    Monroe, welcome to the Parallax forum. I don't think the Propeller video hardware will help you, since that is used to drive a display, and it seems like what you need is video acquisition and transmission. What kind of video sensor are you using to capture the image? The Prop may be able to interface directly to the sensor, or you may need a video ADC to capture the image.
  • Monroe Posts: 18
    edited 2014-03-05 14:40
    Dave, you are correct. We do not need to drive a display. We are flexible on the sensor resolution; it is CMOS (whatever works best for the encoder with a usable resolution; so far 48 lines of resolution is good and 12.5 fps is fast enough). Yes, what we will need is an encoder to create the audio output for the transmitter. Thanks for the welcome; hope all is well.

    Monroe
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-05 18:51
    Monroe,

    How many pixels wide? 64? If that's the case, do the math; your baud rate for 12.5 fps will have to be
    64 pixels/line * 48 lines/frame * 12.5 frames/sec * 11 asynchronous bits/pixel == 422.4 kBaud (bits/sec).

    That's well above what an audio modem/codec can accommodate -- not to mention FEC (forward error correction) redundancy. What's your carrier frequency? And what modulation modes and RF bandwidth are you allowed to use?

    -Phil
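Phil's arithmetic can be double-checked with a short script (plain illustrative Python, not Propeller Spin):

```python
# Sanity check of the asynchronous-serial rate quoted above.
pixels_per_line = 64
lines_per_frame = 48
frames_per_sec = 12.5
bits_per_pixel = 11          # 8 data bits plus start/stop framing overhead

pixel_rate = pixels_per_line * lines_per_frame * frames_per_sec
baud = pixel_rate * bits_per_pixel
print(pixel_rate, baud)      # 38400.0 pixels/sec, 422400.0 baud
```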
  • Monroe Posts: 18
    edited 2014-03-06 03:38
    The video is analog. The bandwidth is 25 kHz for now, but the goal is 6 kHz using SSB, with 1200 baud data when not transmitting video. We are looking at putting telemetry in the video frame as well, perhaps using one line of the frame. PACTOR is an option we are also looking at, to allow access to the satellite via the internet. That's all pretty complex, I suppose, but the key factor right now is just the video encoding and conversion to an audio signal.

    For the April mission we hope the R-Pi is ready; we are using that to see the horizon only for flight. Right now we can get 70-mile range with the 900 MHz system, and we can use that if we have to. The goal is to work it into a satellite system as we go.

    The idea came from the Apollo 11 video transmission using NBTV: they had 500 kHz to use, but they also had 320 lines of resolution at 12.5 fps. We need far less for what we are doing; 48 lines is enough at 12.5 fps. NBTV is 32 lines as a standard for the mechanical systems; what we are doing is somewhat different, but in the same resolution range.

    What we are doing is trading BW for SNR for really long range comms with affordable/manageable antennas.

    Bandwidth and freq: right now we are using dual band, 433 MHz down and 144 MHz up, full duplex, 25 kHz max BW. The uplink is 1200 baud data only, right now using the AX.25 protocol. Yeah, right now FM; the SSB system would be AM. We can go as high as 12 kHz SSB, and that's a limit I set. There are no BW limits on the amateur bands, but we want to be civil. Some might complain if we went to 12 kHz on eSSB, so we'll see.

    It's all in a state of flux and experimental for sure.

    A 56k modem on a phone line uses 3 kHz of BW over audio, so this is far less than that.
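The bandwidth-for-SNR trade Monroe describes is the Shannon-Hartley relation; a small sketch with illustrative numbers (plain Python, not Prop code):

```python
import math

def shannon_capacity(bw_hz, snr_db):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits/sec."""
    return bw_hz * math.log2(1 + 10 ** (snr_db / 10))

# A ~3 kHz phone channel at a nominal ~35 dB SNR lands near dial-up rates:
print(round(shannon_capacity(3000, 35)))     # ~35 kbit/s
# Halving bandwidth halves capacity at fixed SNR; the lost capacity can
# be bought back with higher SNR, which is the long-range trade here.
print(round(shannon_capacity(1500, 35)))
```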
  • PropGuy2 Posts: 360
    edited 2014-03-06 06:13
    I would point you to AMSAT, the amateur radio satellite people. They have numerous satellites already in orbit, and they have the hardware, software, satellite tracking, and expertise to do what you are contemplating. What is also nice is that AMSAT already has a working, spacecraft-approved base-frame satellite package, whereby all you have to do is add your instrumentation. As I recall, there might already be an NBTV setup that has been in service. They may be amateur radio operators, but most have worked as top-level professional engineers in the space industry. In any case, they are very open to helping any and all organizations that are interested in satellite projects. - Ken W1HV aka PropGuy2
  • Dave Hein Posts: 6,347
    edited 2014-03-06 07:05
    Phil Pilgrim (PhiPi) wrote:
    That's well above what an audio modem/codec can accommodate -- not to mention FEC (forward error correction) redundancy. What's your carrier frequency? And what modulation modes and RF bandwidth are you allowed to use?
    If you can achieve a 4800 bps transmit rate you could use image compression to transmit the video as a modified JPEG stream. JPEG can compress typical images by a factor of 10:1 to 20:1 without much degradation. However, you can push it to 100:1 compression and still have a usable image. If you can't achieve 4800 bps and have to run at 1200 bps you would have to reduce the frame rate to around 3 frames/second.

    A JPEG image normally contains quantization and encoding tables, which require about 2 Kbytes, which is a lot of overhead for a small image. If you use a fixed set of tables you could remove them from the JPEG data that you transmit. Another issue with doing JPEG on the Prop is the Discrete Cosine Transform (DCT) that requires doing multiplications. You could replace this with a Walsh-Hadamard Transform that only uses additions and subtractions. There will be a loss in coding performance, but it will still work fairly well.
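Dave's Walsh-Hadamard suggestion fits in a few lines; the butterfly below uses only additions and subtractions, no multiplies (illustrative Python, not Prop assembly):

```python
def fwht(block):
    """In-place fast Walsh-Hadamard transform -- additions and
    subtractions only (block length must be a power of two)."""
    data = list(block)
    h = 1
    while h < len(data):
        for i in range(0, len(data), h * 2):
            for j in range(i, i + h):
                a, b = data[j], data[j + h]
                data[j], data[j + h] = a + b, a - b
        h *= 2
    return data

# A flat 8-sample block concentrates all its energy in one coefficient,
# much as a DCT would -- which is why it can stand in for the DCT here.
print(fwht([5] * 8))   # [40, 0, 0, 0, 0, 0, 0, 0]
```

Applying the transform twice returns the input scaled by the block length, so the same routine serves as its own inverse on the decoder side.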
  • Monroe Posts: 18
    edited 2014-03-06 07:07
    Yes, we are aware of AMSAT and many other programs. We were involved in ArduSat, and we have been working seven years on new aerospace applications.
  • Monroe Posts: 18
    edited 2014-03-06 07:16
    Dave, yeah, if I remember, you're pretty much an expert in video compression. 4800 baud is possible; I'm not yet sure how practical. The launch/drop in April will give us more insight as to how practical that may be. Lots of things work on paper, but real-world application is another matter. Dropping the frame rate is a little tough to fly by; we still have testing to do. If anyone is really interested, you can download the NBTV software Gary created, have some fun with the parameters, and see for yourself what NBTV can look like.

    Right now so far I like the interference in the analog signal more than I like the effect of interference on a digital signal at this bandwidth. But that could change.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 08:22
    Monroe,

    Okay, then, analog. So do the math again:
    64 pixels/line * 48 lines/frame * 12.5 frames/sec == 38400 pixels/sec

    You will not squeeze that signal into 6 kHz without losing a lot of detail in the transmitted picture.

    -Phil
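Phil's point follows from the Nyquist bound; a quick illustrative check (plain Python):

```python
# An analog channel of bandwidth B carries at most ~2B independent
# samples (here, pixels) per second.
pixel_rate = 64 * 48 * 12.5               # 38400 pixels/sec
for bw_hz in (6000, 12000, 25000):
    shortfall = pixel_rate / (2 * bw_hz)
    print(bw_hz, 2 * bw_hz, round(shortfall, 2))
# At 6 kHz the channel is ~3.2x too narrow; at 25 kHz the signal fits.
```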
  • Monroe Posts: 18
    edited 2014-03-06 08:38
    That's why I mentioned 6 kHz as ideal and opened it up to 12 kHz eSSB; like I said, whatever we can get to work best. It does get you thinking along the right lines.
  • Monroe Posts: 18
    edited 2014-03-06 11:11
    This is an example of 32 lines at 6-10 kHz, picked up on the 160 meter band at 15 km by a simple transistor radio with a ferrite-wound antenna. This is actually better than we need; all we need is to see the horizon, and it's plenty contrasty. It's more like an artificial horizon that's not artificial. No details are needed.

    http://www.youtube.com/watch?v=7klqrR3TAoQ

    [attachments: s1.jpg, s2.jpg]

    These two are snapshots of some 48-line test footage from 50,000 ft showing the horizon we are looking for.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 12:07
    Here are two 64x48-pixel images (upsampled 3x so you can see them). The first was converted to audio at 38400 pixels/sec. The audio was then lowpass filtered by a 15-pole Chebyshev with cutoff at 6 kHz. From that, the second image was reconstructed.

    [attached: original and reconstructed images]

    -Phil
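The effect of a 6 kHz lowpass on a sharp sky/ground edge can be reproduced approximately with a windowed-sinc FIR, a stand-in for Phil's 15-pole Chebyshev (plain Python sketch):

```python
import math

def lowpass_sinc(cutoff_hz, fs_hz, n_taps=63):
    """Windowed-sinc FIR lowpass (Hamming window), unity DC gain."""
    fc = cutoff_hz / fs_hz
    m = n_taps - 1
    taps = []
    for i in range(n_taps):
        x = i - m / 2
        h = 2 * fc if x == 0 else math.sin(2 * math.pi * fc * x) / (math.pi * x)
        h *= 0.54 - 0.46 * math.cos(2 * math.pi * i / m)
        taps.append(h)
    s = sum(taps)
    return [t / s for t in taps]

def convolve(signal, taps):
    """Direct-form FIR convolution (zero-padded edges)."""
    out = []
    for n in range(len(signal)):
        acc = 0.0
        for k, t in enumerate(taps):
            if 0 <= n - k < len(signal):
                acc += t * signal[n - k]
        out.append(acc)
    return out

# One scan line at the 38.4k pixels/sec rate: a hard sky/ground edge.
line = [0.0] * 64 + [1.0] * 64
taps = lowpass_sinc(6000, 38400)
blurred = convolve(line, taps)
# The step now ramps over several pixels -- the detail loss visible in
# the reconstructed image above.
```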
  • Monroe Posts: 18
    edited 2014-03-06 13:51
    The image you provided is more than acceptable to serve our purpose. I'm not seeing anything unacceptable. Our purpose for needing it is valid. Shall we move forward?
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 14:31
    Monroe wrote:
    The image you provided ...
    Just to be sure, you're referring to the right-hand image, correct?

    -Phil
  • potatohead Posts: 10,261
    edited 2014-03-06 14:36
    It appears he is. I'm guessing edge detection to understand orientation using the horizon.
  • Monroe Posts: 18
    edited 2014-03-06 14:48
    Yes, what we need is to be able to determine only the horizon. We need zero detail other than the difference between the sky and the ground. The image is only used for reference to attitude, much like an artificial horizon. We could go near infrared with the sensor to make the contrast even higher if necessary. This is not for viewing enjoyment by any means it is a tool to get us home.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 14:49
    potatohead wrote:
    I'm guessing edge detection to understand orientation using the horizon.
    'Makes sense. That being the case, I think the Prop could do the necessary analysis of the full-resolution image on the sat itself and just send orientation as part of the digital telemetry. OTOH, if such an image has to be transmitted, the Prop could certainly handle those details, too.

    -Phil
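As a rough illustration of doing that analysis on board, a horizon estimate needs only a per-column edge search and a line fit. This is a hypothetical Python sketch of the idea (a real Prop version would be Spin/PASM, and all names here are made up):

```python
import math

def horizon_roll(image):
    """Estimate roll angle from a grayscale image (rows of pixel values,
    bright sky over dark ground): find the strongest vertical edge in
    each column, then least-squares fit a line through those points."""
    edges = []
    for col in range(len(image[0])):
        grads = [abs(image[r + 1][col] - image[r][col])
                 for r in range(len(image) - 1)]
        edges.append(grads.index(max(grads)))
    n = len(edges)
    xs = list(range(n))
    mx = sum(xs) / n
    my = sum(edges) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, edges))
    den = sum((x - mx) ** 2 for x in xs)
    return math.degrees(math.atan(num / den))

# A tilted sky/ground boundary in a toy 8x8 frame:
frame = [[255 if r < c // 2 + 2 else 0 for c in range(8)] for r in range(8)]
print(round(horizon_roll(frame), 1))
```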
  • Monroe Posts: 18
    edited 2014-03-06 15:03
    OK! Both are very useful! First we only need the Prop to encode and relay the audio to the transmitter. Edge detection would be useful for satellite orientation. Now we are on the same page! :)
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 15:12
    Converting the pixels to analog audio should be a snap: just keep streaming the raw data as DUTY-mode DAC, and run it through an analog low-pass filter. You'll have to include vertical synchronization via some blank lines. There's probably not enough bandwidth for an effective horizontal sync, so the receiver will have to reconstruct that by timing off of the vertical sync. In only 48 lines, it shouldn't drift too much.

    -Phil
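The frame structure Phil describes, with blank lines as vertical sync and the receiver timing lines off that, can be sketched like so (the counts here are illustrative, not a fixed spec):

```python
# Sketch of the frame format: 48 active lines preceded by a few blank
# (black) lines whose long low run serves as vertical sync.
LINE_PIXELS = 64
ACTIVE_LINES = 48
BLANK_LINES = 4          # assumed vsync length, for illustration only

def build_frame(image):
    """Flatten one image into the outgoing sample stream, vsync first."""
    stream = [0] * (BLANK_LINES * LINE_PIXELS)
    for line in image:
        stream.extend(line)
    return stream

def find_frame_start(stream):
    """Receiver side: index of the first active sample after a run of
    zeros at least as long as the vsync blanking interval."""
    run = 0
    for i, s in enumerate(stream):
        if s == 0:
            run += 1
        elif run >= BLANK_LINES * LINE_PIXELS:
            return i
        else:
            run = 0
    return None

image = [[128] * LINE_PIXELS for _ in range(ACTIVE_LINES)]
stream = build_frame(image)
print(find_frame_start(stream))   # 256, i.e. 4 blank lines * 64 pixels
```

Once the receiver has the frame start, horizontal sync is reconstructed by counting off 64-pixel intervals, exactly as Phil suggests.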
  • Monroe Posts: 18
    edited 2014-03-06 15:26
    Ok, how long will it take you to do it?
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 16:21
    Will you be using a PropCAM, along with a Propeller Backpack?

    -Phil
  • Monroe Posts: 18
    edited 2014-03-06 17:00
    For sure, let's go with that! Input rez looks good for our application. I believe we can integrate that into our system nicely. In fact, it looks ideal.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-03-06 17:35
    I think a sensible first approach would be to capture every other line. This could be controlled by a Spin cog and done continuously. A separate cog could then scan the acquired data asynchronously and output the audio via the "Aud. Out/Vid. In" port. The RC filtering alone may be enough to limit the bandwidth to the transmitter. This output also includes a PTT circuit for transmitters that support it on their mic input.

    If the RC filtering is inadequate or not sharp enough, the video could also be subjected to FIR filtering. Preliminary calculations show that there is enough compute speed available to pull this off at 38.4 kHz (sending every other horizontal pixel).

    For telemetry data, the Bell 202 modem object will likely suffice.

    -Phil
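For reference, Bell 202 is 1200 baud AFSK with 1200 Hz mark and 2200 Hz space tones, the same physical layer AX.25 packet runs over. A minimal phase-continuous generator, sketched in Python rather than Spin (the sample rate is an arbitrary choice):

```python
import math

FS = 48000               # sample rate, illustrative
BAUD = 1200
MARK, SPACE = 1200, 2200  # Bell 202 tone frequencies, Hz

def afsk_samples(bits):
    """Phase-continuous Bell 202 AFSK: the phase accumulator carries
    across bit boundaries, so tone switches produce no clicks."""
    samples = []
    phase = 0.0
    for bit in bits:
        freq = MARK if bit else SPACE
        for _ in range(FS // BAUD):      # 40 samples per bit at 48 kHz
            phase += 2 * math.pi * freq / FS
            samples.append(math.sin(phase))
    return samples

tone = afsk_samples([1, 0, 1, 1])
print(len(tone))   # 4 bits * 40 samples = 160
```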
  • Monroe Posts: 18
    edited 2014-03-06 18:01
    Sounds like a good starting point. Bell 202 is ok for a start. The PTT circuit will also be useful. That also means I can use my cheaper half duplex radios for testing. So for this first test you have nearly 25khz to work with. So these first hoops we need to jump are not as hard. How can I help you help us? What do you need?
  • Mark_T Posts: 1,981
    edited 2014-03-06 19:10
    Phil Pilgrim (PhiPi) wrote:
    The audio was then lowpass filtered by a 15-pole Chebychev with cutoff at 6 kHz. From that, the second image was reconstructed.
    But you wouldn't use a Chebyshev for video; a phase-linear FIR filter would be appropriate.
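Mark_T's point is that a symmetric-tap FIR has exactly linear phase, so every video frequency sees the same delay and edges aren't smeared asymmetrically the way an IIR Chebyshev smears them. A small design sketch (illustrative Python):

```python
import math

def fir_lowpass(n_taps, fc):
    """Windowed-sinc lowpass (Hamming window). The taps come out
    symmetric, which is the linear-phase property: a constant group
    delay of (n_taps - 1) / 2 samples at every frequency."""
    m = n_taps - 1
    taps = []
    for i in range(n_taps):
        x = i - m / 2
        h = 2 * fc if x == 0 else math.sin(2 * math.pi * fc * x) / (math.pi * x)
        taps.append(h * (0.54 - 0.46 * math.cos(2 * math.pi * i / m)))
    return taps

# 6 kHz cutoff at the thread's 38.4 kHz pixel rate:
taps = fir_lowpass(31, 6000 / 38400)
print(len(taps))   # 31
```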