How do you handle FPS formats?
lardom
Posts: 1,659
It just occurred to me that there might be a syncing problem with film @24fps vs. NTSC @30fps. There are also AVI, DVD, PAL, etc. I've never thought about it in engineering terms; I only noticed a sync problem when seeing a monitor on another monitor. How do you solve the phase problem?
(Since the Propeller can do video I decided to ask the question here.)
Comments
24fps to 30/60Hz NTSC is typically handled via a process called telecine, or 3:2 pulldown. The basic idea is that you show one frame for two 60Hz fields, then the next frame for three 60Hz fields (where each field is half the lines of an interlaced frame). Handling the 0.1% difference between 60Hz and the actual 59.94Hz field rate can be done either by speeding up the audio or by adding an extra field at the right interval. 25/50Hz PAL is often handled by simply speeding up the playback, or by a similar pulldown scheme.
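To make the cadence concrete, here's a rough C sketch of how a 3:2 schedule maps frames onto fields (indices only; nothing hardware-specific):

```c
#include <stdio.h>

int main(void) {
    int field = 0;
    for (int frame = 0; frame < 4; frame++) {
        int holds = (frame % 2 == 0) ? 2 : 3;   /* 2,3,2,3,... cadence */
        for (int i = 0; i < holds; i++, field++)
            printf("field %2d (%s) <- frame %d\n",
                   field, (field % 2 == 0) ? "top" : "bottom", frame);
    }
    /* 4 frames -> 10 fields, so 24 frames/s becomes 60 fields/s. */
    return 0;
}
```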
Resolution differences are next, i.e. converting from 720x480 NTSC DVD or 720x576 PAL DVD to 640x480 VGA. The first thing to consider is aspect ratio - how square the pixels or the screen are. DVD will be either a 4:3 or a 16:9 screen, and depending upon the desired output you can letterbox, windowbox, or take what you want out of the middle. Next is handling the difference in pixel counts. The easy way is to use only every nth pixel (or repeat pixels); however, better results are achieved if the pixels are filtered first. A related (but more annoying) issue is that each pixel isn't full color: DVD is 4:2:0 sampled, so the color-difference information is halved in both the horizontal and vertical directions.
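As an illustration of the quick-and-dirty "every nth pixel" approach (without the filtering mentioned above), here's a sketch of scaling one 720-pixel line down to 640 pixels with nearest-neighbour sampling:

```c
#include <stdint.h>

/* Walk the source in 16.16 fixed point: 720/640 = 1.125 input pixels per
   output pixel, so roughly one source pixel in nine gets skipped. */
void scale_line_720_to_640(const uint8_t *src, uint8_t *dst) {
    uint32_t step = (720u << 16) / 640u;
    uint32_t pos  = 0;
    for (int x = 0; x < 640; x++, pos += step)
        dst[x] = src[pos >> 16];
}
```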
Next is codecs, i.e. how the video or audio is compressed. For the most part this isn't too difficult - decode the video to raw pixels, then re-encode as necessary. The challenge is reducing or eliminating any steps which might cause quality loss, e.g. YCrCb to RGB to YCrCb conversions.
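For reference, that conversion step looks roughly like this - a common integer approximation of BT.601 YCbCr to RGB for studio-range video (Y 16..235, Cb/Cr 16..240). Real decoders keep more precision, which is exactly why round-tripping loses quality:

```c
#include <stdint.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

void ycbcr_to_rgb(uint8_t y, uint8_t cb, uint8_t cr,
                  uint8_t *r, uint8_t *g, uint8_t *b) {
    int c = (int)y - 16, d = (int)cb - 128, e = (int)cr - 128;
    *r = clamp8((298 * c + 409 * e + 128) >> 8);
    *g = clamp8((298 * c - 100 * d - 208 * e + 128) >> 8);
    *b = clamp8((298 * c + 516 * d + 128) >> 8);
}
```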
Finally, the container format (e.g. AVI). Again, this is usually fairly simple, as the purpose of the container is just to organize the audio and video data into a single file. The difficulty comes when the container isn't suited to the codec (e.g. using AVI for anything current) and in avoiding synchronization issues.
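To show how thin the container layer really is: an AVI is a RIFF file, i.e. a tree of [fourcc][u32 size][payload] chunks. A minimal walker might look like the sketch below (it only lists the top-level chunks; a real demuxer also descends into the LIST chunks and parses the stream headers):

```c
#include <stdio.h>
#include <stdint.h>

/* Print the fourcc and size of each chunk inside a RIFF/AVI body. */
void walk_avi(FILE *f) {
    uint8_t hdr[12];
    if (fread(hdr, 1, 12, f) != 12) return;          /* "RIFF" <size> "AVI " */
    while (fread(hdr, 1, 8, f) == 8) {
        uint32_t size = hdr[4] | hdr[5] << 8 | hdr[6] << 16 | (uint32_t)hdr[7] << 24;
        printf("chunk %.4s, %u bytes\n", (char *)hdr, size);
        fseek(f, (long)(size + (size & 1)), SEEK_CUR);  /* chunks are word-aligned */
    }
}

int main(int argc, char **argv) {
    FILE *f = (argc > 1) ? fopen(argv[1], "rb") : NULL;
    if (f) { walk_avi(f); fclose(f); }
    return 0;
}
```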
Wacky, yes, but specific to NTSC DV. I figured you for a native 24p man. OTOH, I can't even afford the lens cap for one of those cameras...
-- Gordon
It does, but 24pA isn't really meant as a display format, just for editing (with compatible editing software). Panasonic was never really clear about that aspect, and it caused confusion. The workflow is meant to efficiently convert to 24p in an NLE; then it can be finished in any way desired (not just NTSC). Once pulldown is removed (merely by ignoring the 5th and 6th fields in the cadence) and the material is deinterlaced, the frames are just like any 24p material. From what I recall, there are no field dominance or other issues to contend with.
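The field bookkeeping behind that is simple. In the 2:3:3:2 "advanced" cadence, four film frames occupy ten fields (A1 A2 B1 B2 B3 C1 C2 C3 D1 D2); dropping fields 5 and 6 discards the one mixed frame and leaves four clean pairs. A toy sketch (indices only, no real video I/O):

```c
#include <stdio.h>

int main(void) {
    const char *fields[10] = {"A1","A2","B1","B2","B3","C1","C2","C3","D1","D2"};
    for (int i = 0, kept = 0; i < 10; i++) {
        if (i == 4 || i == 5) continue;   /* ignore the 5th and 6th fields */
        printf("%s ", fields[i]);
        if (++kept % 2 == 0) printf(" -> progressive frame %d\n", kept / 2);
    }
    return 0;
}
```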
-- Gordon
Sorry, I've helped take this thread a bit off track. That said... I've had several EFX-TEK customers ask me when we're going to design a "cheap" video player with the Propeller. Answer: not any time soon. As with many technologies, it's harder than it looks.
It seems to me that the obvious solution to the 2:3:3:2 issue is to record everything at 120 fps. Then you can down-sample as needed for both 24 fps and 30 fps without introducing uneven sampling.
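The arithmetic works because 120 is the least common multiple of 24 and 30, so both decimations come out even - keep every 5th frame for 24 fps, every 4th for 30 fps:

```c
#include <stdio.h>

int main(void) {
    for (int f = 0; f < 120; f++) {
        if (f % 5 == 0) printf("frame %3d -> 24 fps output\n", f);
        if (f % 4 == 0) printf("frame %3d -> 30 fps output\n", f);
    }
    /* 120/5 = 24 and 120/4 = 30 frames per master second, exactly -
       no uneven cadence required. */
    return 0;
}
```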
-Phil
My calculator blew up trying to figure out the bitrate for a 4K resolution stream at 120 fps.
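For the record, the raw number isn't hard to compute - assuming 3840x2160, 24 bits per pixel, and no subsampling or compression:

```c
#include <stdio.h>

int main(void) {
    double bits = 3840.0 * 2160 * 24 * 120;      /* bits per second, uncompressed */
    printf("%.1f Gbit/s (~%.1f GB/s)\n", bits / 1e9, bits / 8e9);
    /* ~23.9 Gbit/s, i.e. roughly 3 GB of data every second. */
    return 0;
}
```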
In any case, NTSC is a "legacy" format, so I don't think there's much point in accommodating it in any special way. Shooting at 24p and bumping up seems to suit most NTSC applications, and all the cameras and workflows now support that.
Jon, I've been blown away with the video features on the new DSLRs. Do you recognize the camera these guys are using? Have you ever used one of these Fig Rig jobbies? It seems to work very well, if you look at the finished work, also on YouTube. I figure if anyone knew this stuff, you would.
-- Gordon
Hard to tell what the camera is, but based on the logo on the young man's shirt I'm betting it's a Canon 5D Mark II, or perhaps a 7D (I use the 60D).
I absolutely LOVE my FigRig and use it all the time. Funny, I have taken it out to sets and parties where pros (i.e., much better at shooting movies than me -- I'm an actor) are present and they're always blown away by the "steering wheel." As camera rigs are getting smaller the FigRig is gaining in popularity. It's great for the run-and-gun stuff I tend to do.
Here's me in a tub with Lauren De Long (you may recognize her from a Microsoft commercial when they were trying to be cool) shooting on the aforementioned DVX100.
And here's my pal, Lynda Reynoso, with my FigRig loaded for a shoot that we did about a year ago (my first DSLR project) -- you can see it here: http://www.youtube.com/watch?v=_2hGic9hh94. We did this as an exercise: an attempt to shoot and finish a short piece in a day (we did, but it was a long day).
That little piece is, essentially, a monologue, so I shot it on a FigRig; this allowed me to act (without dialog) while holding the camera on Lynda. Also mounted to the rig are a shotgun mic and a Zoom H4n audio recorder. DSLRs shoot great images and video but do not record sound well.
@Phil: I think James Cameron is on record suggesting we should shoot and project at 120 fps for a more realistic experience. Funny, 24 fps was born out of economics (the practical length of celluloid film) and has now become an aesthetic standard that most digital shooters attempt to emulate (mostly in post production).
Don't know if you're a Mitchell and Webb fan, but I figured they used something similar (strapped to Webb's waist or something?) on the reverse POV shots for 'Sir Digby Chicken Caesar'. The effect is hilarious.
-- Gordon
Pause the video at 0:23 and you'll see something we all face: reflection problems. When we shot our little Doritos commercial for the Super Bowl contest we had that, too. Luckily, our DP and Director, Peter Montgomery (PJMonty here in the forums), was able to remove the offending reflections and errant equipment with After Effects. In a past life he did special effects work for Disney and is an expert compositor.
Back in the Propeller world, it will be interesting to see if it has the horsepower to play low-compression video from external media. Chip told me that he'll be writing a driver for an SD card, but I think it may take parallel access to get the bandwidth up enough to be meaningful.
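Some rough numbers on why bandwidth is the worry (the resolution and bit depth here are just for illustration):

```c
#include <stdio.h>

int main(void) {
    double bytes = 640.0 * 480 * 2 * 30;         /* 16 bpp at 30 fps */
    printf("640x480x16bpp@30fps needs %.1f MB/s\n", bytes / 1e6);
    /* ~18.4 MB/s - far more than a bit-banged single-bit SPI SD driver
       typically delivers, hence the interest in parallel access or
       fairly heavy compression. */
    return 0;
}
```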