Recording Video Using The Propeller
marcla
Posts: 19
Hello!
Is it possible to record composite video and save it on an SD card?
If it is possible, how? I guess I need an ADC?
/Martin
Comments
Leon
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Amateur radio callsign: G1HSM
Suzuki SV1000S motorcycle
The amount of data and the data rate are too large without compression, and the Propeller is not fast enough to do even simple compression in real time. Think about it: video runs at 30 frames per second, each frame with maybe 640 x 480 pixels. That's just under 10 million pixels per second.
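As a rough check of that figure: 640 x 480 pixels x 30 frames/s = 9,216,000 pixels per second - just under 10 million, and that's before you count bits per pixel.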
The SRAM I'm using is the CY62167EV30, which is a 2M x 8 static RAM (2 Mbytes).
The data I'm storing is 16-bit RGB pixels (R5, G6, B5). At a resolution of 640 x 480,
this takes 614,400 bytes per frame. So in theory, I can store up to 3 frames per chip.
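To check that arithmetic: 640 x 480 pixels x 2 bytes per pixel = 614,400 bytes per frame, and 2,097,152 / 614,400 is about 3.4, so three full frames fit in the chip with a little room left over.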
The circuit is also set up to capture partial frames, by specifying a start and stop line number.
Since this is dual port, you can control when to start or stop the video storage to the SRAM.
This should mean you can process the data with the Prop at whatever frame rate it can handle.
It could easily be expanded to handle more frames if needed.
The application I'm starting with requires capturing only a partial frame.
If anyone's interested, I'll keep you posted on the progress.
Jim
I'm interested. I've always been intrigued by dual-port memory. Never quite grokked it though.
There is example code and a hardware solution included for capturing NTSC.
The website is at:
http://mydancebot.com/viewport/
TJ
The video chip will be controlling the clock and storage of data when it's selected. When it has stored what was asked of it, the Prop will take over control of the SRAM and retrieve what it wants.
In my case, I'm using a camera chip. So the camera will write to the SRAM and the Propeller will read from it.
Let me know if you have more questions.
Jim
An embedded processor is of course used, but only for start/stop/delete and for other functions such as providing a simple OSD menu system.
So instead of using the Propeller to compress and decompress video, why not just interface to dedicated components?
People only did things the HARD way!
Come on guys, look at the digital video we did with toys in 1987!
en.wikipedia.org/wiki/PXL-2000
Hint: Use an LM3914 dot/bar display driver and some diodes to convert composite video to binary for just a few bucks.
@mynet43, how many IO pins are you using for the SRAM?
2 MB = 21 just for the address lines, then there's CE etc.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
http://www.propgfx.co.uk/forum/ - home of the PropGFX Lite
Hi! I thought this thread was dead from lack of interest :)
I have the memory configured for 8-bit I/O, so I only need 8 pins there.
The addressing will be handled entirely by external logic. I'll need a few control lines, but basically the memory is used in two ways:
1. Data is stored in it directly from the camera. For this application, I'm using a subset of a frame. I set two latches, one with the starting line number and the other with the ending line number of the frame. When a comparator matches the starting line number, the camera stores data directly to the SRAM using an address counter and the camera's clock. When all requested lines have been stored, the camera stops storing data to the SRAM and turns control back to the Propeller.
2. The Propeller uses similar external logic to clock data in from the SRAM. The data direction is gated so that the camera can only write to the SRAM and the Propeller can only read from it. Hence my reference to dual-port, which it really isn't.
So basically I don't use ANY address lines directly from the Propeller. I set up the address in the program and shift it out to a latch, then start an external address counter, which is limited by a comparator.
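To give a feel for the Propeller side of that, here's a rough Spin sketch of how the readback could look. The pin numbers, strobe polarity and the mapping of the data bus to P8..P15 are just placeholders (the real ones depend on the external logic), but the idea is: shift the start address into the latch serially, then clock bytes in over an 8-bit bus while the external counter advances the address.

CON
  ' Hypothetical pin assignments - adjust to match the real external logic
  ADDR_DATA = 0                  ' serial data into the external address latch
  ADDR_CLK  = 1                  ' shift clock for the address latch
  RAM_RD    = 2                  ' read strobe to the SRAM / external counter
  ' SRAM data bus assumed on P8..P15

PUB ReadBytes(startAddr, count, pBuf) | i
  dira[ADDR_DATA] := 1
  dira[ADDR_CLK]  := 1
  dira[RAM_RD]    := 1
  dira[15..8]     := 0           ' data bus pins are inputs

  repeat i from 20 to 0          ' shift the 21-bit start address, MSB first
    outa[ADDR_DATA] := (startAddr >> i) & 1
    outa[ADDR_CLK]  := 1
    outa[ADDR_CLK]  := 0

  repeat i from 0 to count - 1   ' clock bytes in; the external counter advances the address
    outa[RAM_RD] := 1
    byte[pBuf][i] := ina[15..8]
    outa[RAM_RD] := 0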
The design isn't complete but it's coming along pretty well. It's a little tricky to keep track of all the stuff.
I hope that answers your question. Let me know if you have more...
Jim
Ah, that explains things a lot more clearly. I was wondering how you'd manage all those IO pins, thinking it was direct Prop manipulation of the SRAM, not external gubbins.
Thanks for the reply :)
Cheers,
Jim.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
http://www.propgfx.co.uk/forum/ - home of the PropGFX Lite
Hanno
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Prop Tools under Development or Completed (Index)
http://forums.parallax.com/showthread.php?p=753439
My cruising website http://www.bluemagic.biz
Thanks for the feedback on ViewPort.
You didn't say what resolution video your robot was using. I looked at your great video; the camera pictures there seem to be low-resolution grayscale.
The project I'm working on is VGA 16-bit color at 30 frames/second. To handle this on the Propeller would require loading and processing more than 18 Mbytes per second. That's about 4 clock cycles per byte to load and process it. What am I missing?
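(Working that out, assuming the usual 80 MHz Propeller clock: 640 x 480 pixels x 2 bytes = 614,400 bytes per frame, times 30 frames/s is roughly 18.4 Mbytes/s, which leaves about 80,000,000 / 18,400,000 = 4.3 clock cycles per byte.)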
Also, I tried to locate the tutorial for integrating video, using the link you provided. I poked around on the web site for a while but was unable to locate it. Do you have a better link?
I haven't used Viewport, but it looks fantastic.
Jim
An upcoming feature article in Circuit Cellar will cover all this in detail.
This array can be streamed to ViewPort at up to 2 Mbps, yielding a frame rate of around 10 fps. ViewPort can draw the location and path of the object being tracked, show vertical and horizontal scanlines, and show a histogram of pixel values.
Of course ViewPort can also show the Propeller's IO states at up to 80 Msps, track Spin variables over time, and change Spin variables with intuitive controls like textboxes, dials and sliders.
ViewPort has been around for close to 2 years. It's complete with a 60-page manual, a PE Lab written by Andy Lindsay (famous for "What's a Microcontroller?"), videos, a dozen documented Spin tutorials (including 2 for vision) and a thriving support and developer community.
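(The frame-rate estimate follows from the array size: assuming the 240 x 200, 4-bit grayscale image mentioned later in the thread, one frame is 240 x 200 x 4 = 192,000 bits, and 2,000,000 bps / 192,000 bits is about 10 frames per second.)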
A 30 day free trial version is here: mydancebot.com/viewport After 30 days pay just $29 to keep going or upgrade to advanced features.
And thanks to its plugin architecture, the best is yet to come: an integrated Spin debugger (pause/step Spin code, breakpoints, profiler, ...), OpenCV (full access to the state-of-the-art computer vision library)...
Try it out- you'll love it!
Hanno
Yes, grabbing full color would require at least 2 cogs. I'm sure it could be done- (ViewPort can sample up to 80Msps by using 4 interleaved cogs). I might attempt it, but would prefer to wait for Propeller2 which will greatly simplify this task.
Cluso,
Thanks for the compliment- you're doing some amazing stuff too!
Yes, I believe the OpenCV integration will bring great things to the Propeller- hopefully people will try it out when it's ready. 1 line of code to detect faces and store the result in spin variables- what more do you want?
Sorry for the large file and slow download- I'll post a YouTube video as soon as I'm happy with the Color and Circle finders...
Hanno
Is your Propeller frame grabber downloadable from OBEX?
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
www.fd.com.my
www.mercedes.com.my
No, neither ViewPort nor its parts are available on the OBEX or under open license. I'm treating my Propeller hobby as a full-time profession. My wife would be upset if I didn't make money to pay bills. Thanks to everyone who's supported ViewPort so far- I'm having lots of fun creating a very full featured development platform for the Propeller.
Hanno
Is it similar in functionality to CMUCam?
Why do you need a PC to implement the robot vision?
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
www.fd.com.my
www.mercedes.com.my
Thanks so much for the feedback to my questions.
I calculated the throughput requirements of 16-bit VGA, compared to your 240 x 200 x 4 camera.
The VGA takes more than 25 times as much bandwidth to download the data, assuming you're using the same frame rate.
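(For the numbers: a 640 x 480 frame at 16 bits is 640 x 480 x 16 = 4,915,200 bits, while a 240 x 200 frame at 4 bits is 192,000 bits, so the ratio works out to about 25.6 - just over 25 times the bandwidth at the same frame rate.)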
I think this might be tricky to do in 2 cogs :)
I envy you supporting yourself with the Propeller... Who knows, I may be close :) It sure is a fantastic chip.
Thanks for the help.
Jim
The Propeller by itself can do simple video grabbing (grayscale, 200 x 240, full frame rate) and simple vision processing. That's what I have running on my DanceBot, which is currently balancing a flute of champagne in an art exhibit here in Christchurch, NZ. No PC required...
To do state of the art vision (light years ahead of CMUCAM) you need plenty of memory and processing power. OpenCV has been the leading Computer Vision library for 10 years, it was used by Stanford to win the DARPA race. Until now, it was difficult to do vision processing with OpenCV and control real-world devices. With the ViewPort integration, people will have the best of all worlds- easy integration with all sorts of real world sensors and actuators and state of the art vision algorithms presented with a simple GUI.
Hanno
Yes, uncompressed full-bandwidth video takes much more than 2 Mbps. However, the Propeller is capable of digitizing the full NTSC signal into memory. For the PropScope (Parallax's next-generation oscilloscope) I'm able to continuously read the INA port at variable speeds up to 20 Msps while looking for a complex trigger condition. (No, not a typo.) This speed allows you to nicely resolve the NTSC signal. Once it's inside the Propeller's memory you just need a fast way of transferring it somewhere else - other people have solved that problem by using multiple IO lines...
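(To put 20 Msps in perspective: an NTSC scan line lasts about 63.5 µs, so sampling at 20 Msps gives roughly 1,270 samples per line, which is plenty to resolve the active video.)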
Hanno
I attended the last two DARPA races. Awesome to see a car speed up and pass another car in traffic, with no one on board and totally autonomous. Find a parking lot, pull in to the right parking space, stop at every stop sign, etc, etc.
I look forward to your integration of OpenCV, it should open up all kinds of possibilities.
I think I'll continue with my project, to capture the video I need in SRAM, then let the Propeller play with subsets of it at its leisure.
Keep us posted.
Jim
2 Minute YouTube video
Here's a thread dedicated to OpenCV
Hanno
Here's the Beta Thread
Hanno