Also, when playing about with my original beta P2 video driver in VGA output mode @evanh, remember that colour text mode regions still need a 4-5x P2-to-pixel-clock ratio to operate. If you push the VGA resolution far you'll have issues with those two bundled demos unless you remove the text regions. I have extensions in the new codebase that now allow mono text regions to operate at lower ratios, down to 2x, which is how I collected those hires mode captures earlier with the mono text status lines at the top of the screen.
This feature and the PAL fixes will be part of the next release, bundled with the HyperRAM driver to give full support for external memory frame buffers. I have to document the changes too. In terms of PASM code it's all pretty much done now; mainly a tidy-up, a final sanity test and documentation remain. I feel some of the high-level APIs may require improvements, and I'm still thinking about that. It would be good to get this working with Chip's SPIN2 version too, not just Fastspin, but I don't have a Windows test setup created for that at this stage.
Also, I haven't tested out HyperFlash yet, and I wonder whether to worry about that before releasing this driver. Writes to HyperFlash are complex and need some high-level control functions to sequence them; that's really another level on top of this driver. Perhaps I'll just check that dummy flash reads and writes to the flash control registers work at the lowest level, and assume full flash capability will come later. Anything broken that is specific to HyperFlash can always be fixed then too.
I'm running sysclock at 10x for all these tests so the text demo is working fine.
I have just struck one issue though: In DVI mode, I can't seem to reduce horizontal sync pulse below 32 dot clocks without losing the picture entirely. Same for back porch.
BTW: 960x540 is a recognised mode! I never thought it was. 60 Hz refresh is out of reach with it though.
I'm running sysclock at 10x for all these tests so the text demo is working fine.
I have just struck one issue though: In DVI mode, I can't seem to reduce horizontal sync pulse below 32 dot clocks without losing the picture entirely. Same for back porch.
Hmm, I guess that could be hitting a limit. Did the same thing happen with VGA mode or only DVI? I wonder if it is the display device having issues rather than the driver output. Be sure not to enable any borders, to give the driver the most processing time.
BTW: 960x540 is a recognised mode! I never thought it was. 60 Hz refresh is out of reach with it though.
Cool, I didn't know if it would be either. I thought 60Hz might just be achievable if the blanking is reduced sufficiently at 300MHz or so. EDIT: Actually no, 960x540x60 is already over the raw limit with DVI; it would only work with VGA. However, you could try 50Hz, which might squeeze in. Try 1072 total pixels per line with 560 total lines per frame at 300MHz; that's pretty close to 50Hz. Or something like (960+32+56+32) x 550 lines @ 297MHz gets you spot on.
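For quickly checking these trade-offs, here's the arithmetic as a small Python sketch (the names are my own); it assumes the sysclk/10 pixel clock ratio being used for these tests:

```python
# Derive pixel clock, h-sync and refresh rates from candidate timings.
# Assumes the sysclk/10 pixel clock ratio used in these tests.

def mode_rates(sysclk_hz, htotal, vtotal, pixel_div=10):
    pixclk = sysclk_hz / pixel_div
    hfreq = pixclk / htotal           # horizontal sync frequency (Hz)
    vfreq = hfreq / vtotal            # refresh rate (Hz)
    return pixclk, hfreq, vfreq

# 1072 x 560 totals at 300MHz: close to 50Hz
print(mode_rates(300_000_000, 1072, 560))

# (960+32+56+32) = 1080 x 550 totals at 297MHz: spot on 50Hz
print(mode_rates(297_000_000, 960 + 32 + 56 + 32, 550))
```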
I'm thinking 960x540 would start to look pretty good on a FHD panel. Nice square pixels.
VGA works with smaller h-blanking, but a tint is kicking in, so it's not far from losing that either. So it's possibly a limit of the monitors/TVs rather than the prop2 output. Impressively, the old 5:4 LCD is still showing a stable picture! Albeit a truncated one.
Here's the most extreme I've gone:
hd960x540_timing '59 Hz with 33.0 MHz pixel clock (htot 1032: 31.98 kHz, vtot 546: 58.56 Hz)
long CLK330MHz
long 330000000
'_HSyncPolarity___FrontPorch__SyncWidth___BackPorch__Columns
' 1 bit 7 bits 8 bits 8 bits 8 bits
long (SYNC_POS<<31) | ( 8<<24) | ( 32<<16) | ( 32<<8 ) | (960/8)
'_VSyncPolarity___FrontPorch__SyncWidth___BackPorch__Visible
' 1 bit 8 bits 3 bits 9 bits 11 bits
long (SYNC_POS<<31) | (1<<23) | ( 2<<20) | ( 3<<11) | 540
long 10 << 8
long 0
long 0 ' reserved for CFRQ parameter
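As a cross-check, those two longs can be unpacked and the rates recomputed. This Python snippet follows the bitfield comments above (my reading of them, so treat the layout as an assumption), with the sysclk/10 pixel clock used in these tests:

```python
# Unpack the horizontal/vertical timing longs from the listing and
# recompute the rates in the header comment.
# Field layout is taken from the bitfield comments in the listing.

SYNC_POS = 1

htiming = (SYNC_POS << 31) | (8 << 24) | (32 << 16) | (32 << 8) | (960 // 8)
vtiming = (SYNC_POS << 31) | (1 << 23) | (2 << 20) | (3 << 11) | 540

hfp   = (htiming >> 24) & 0x7F
hsync = (htiming >> 16) & 0xFF
hbp   = (htiming >> 8)  & 0xFF
hvis  = (htiming & 0xFF) * 8          # columns are stored as pixels/8

vfp   = (vtiming >> 23) & 0xFF
vsync = (vtiming >> 20) & 0x07
vbp   = (vtiming >> 11) & 0x1FF
vvis  = vtiming & 0x7FF

htotal = hvis + hfp + hsync + hbp     # 960+8+32+32 = 1032
vtotal = vvis + vfp + vsync + vbp     # 540+1+2+3   = 546

pixclk = 330_000_000 / 10             # sysclk/10, as in these tests
print(htotal, vtotal)                 # 1032 546
print(round(pixclk / htotal))         # ~31977 Hz, the 31.98 kHz h-sync above
print(pixclk / htotal / vtotal)       # ~58.6 Hz, the '59 Hz' in the header
```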
Notably, VGA inputs (I think because the h-sync is over 31 kHz) will detect that as either 640x480 or 640x350, but HDMI inputs are happy to understand its real resolution. So that's really good to know. A lot of my experience with VGA timings can be thrown out when it comes to DVI/HDMI signalling.
Another thing that is surprising is how low the h-sync frequency can be set. I was pretty certain that 30 kHz was the typical minimum they'd accept, maybe 29 kHz, and this applies to both VGA and DVI. I'm really surprised, since I've definitely struck h-sync limits before.
This one works!
hd960x540_timing '49 Hz with 30.0 MHz pixel clock (htot 1120: 26.79 kHz, vtot 546: 49.06 Hz)
long CLK300MHz
long 300000000
'_HSyncPolarity___FrontPorch__SyncWidth___BackPorch__Columns
' 1 bit 7 bits 8 bits 8 bits 8 bits
long (SYNC_POS<<31) | ( 16<<24) | ( 64<<16) | ( 80<<8 ) | (960/8)
'_VSyncPolarity___FrontPorch__SyncWidth___BackPorch__Visible
' 1 bit 8 bits 3 bits 9 bits 11 bits
long (SYNC_POS<<31) | (1<<23) | ( 2<<20) | ( 3<<11) | 540
long 10 << 8
long 0
long 0 ' reserved for CFRQ parameter
Does it look pretty clean @evanh? No issues at all? I really like the idea that we can feed a 1080p panel natively from a P2 over DVI with this 960x540 signal, even at 50Hz. 300MHz might just allow a frame buffer too with HyperRAM; I can (just) get 297MHz working at room temperature anyway. If you can tweak the horizontal timing a fraction lower you might get to use 297MHz, which is a well-known frequency multiple for HD.
Oh wow! My TV can go well below the minimum 250 MHz TMDS clock. I got right down to 140 MHz, and I'm not sure that's the bottom either. My older 16:10 monitor, with DVI input, can't though.
If you can tweak the horizontal timing a fraction lower ...
Easy peasy, that example was pushing the h-blanking to max length for 300 MHz.
EDIT: Basically, DVI/HDMI inputs are way better at detecting the output resolution than VGA inputs are. The syncs aren't the dictating factor any longer. It's a whole new ballgame when ditching VGA compatibility.
Evan, assuming you've tried it, could you please post the P2 long values for 50Hz * 1080 pixels * 550 lines at 297 MHz?
Minimum h-blanking I put at 72 dot clocks, maximum - lots. Doesn't seem to matter.
Any resolution works. I can be arbitrary with both width and height and the TV correctly detects it. Basically no modes any more. This has the feel of the old days again! I'm in heaven!
Roger,
Need a cleaner, faster, simpler way to make a mode setting. The copy'n'pasting I'm doing is a tad batty. Three places in the driver and two in the demo for each case.
Tony,
I just made a mode of 688x512 with a 25 MHz dot clock to suit the DVI minimum 250 MHz TMDS clock and, sure enough, even the old 16:10 monitor correctly detects the chosen resolution.
There is no fixed mode list with HDMI. The old mode specs are worthless now. It's so cool!
EDIT: Of course, there are still the native panel specs. The TV/monitor's internal scan converter has to handle not just pixel scaling but also framerate conversion. The native panels usually run at a fixed scan rate internally; they'll track v-sync a few percent either side.
Beyond that, the easiest way is to allow "tearing". I have no idea if this is the norm or not.
EDIT2: Bed time. I'm going to be short on sleep for the fourth night running.
Roger,
Need a cleaner, faster, simpler way to make a mode setting. The copy'n'pasting I'm doing is a tad batty. Three places in the driver and two in the demo for each case.
Ok yeah. Despite the current video driver's Fastspin interface API in the beta still being slightly limited, it was never really the intent that you'd need to change the driver to test out new modes, though that is one way to do it. I have this getTiming API that can be passed a known video mode constant; it returns a pointer to a timing structure for that video mode's resolution, which can then be passed in during video driver creation with initDisplay as the userTiming argument in the demo. However, there is nothing stopping the caller from having its own custom timing configs and passing those in instead. That way you aren't adding to the driver but to your own application when you want to experiment with custom modes. Only one place to change it then.
That being said, it sort of depends on some documented knowledge of how to create the timing structure in the first place, and on configuring the appropriate PLL settings for the P2 clock rate. It may make sense to have an API that figures more of this out for you: you pass something a bit like a modeline (or a set of timing arguments) plus a pointer to a buffer, and it fills the buffer in for you to use later as the userTiming.
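To make that concrete, here's a rough Python sketch of the sort of helper I mean. The function name, argument order, and the vertical porch split in the example are all made up for illustration, not the driver's actual API; the bitfield packing mirrors the comments in the timing listings earlier in the thread:

```python
# Hypothetical modeline-style helper: given active size, porch/sync widths
# and a refresh target, compute the required sysclk and pack the two
# timing longs. Names and packing are illustrative, not the real API.

SYNC_POS = 1

def make_timing(width, height, hfp, hsw, hbp, vfp, vsw, vbp,
                refresh_hz, pixel_div=10):
    htotal = width + hfp + hsw + hbp
    vtotal = height + vfp + vsw + vbp
    pixclk = htotal * vtotal * refresh_hz
    sysclk = pixclk * pixel_div        # P2 clock the PLL must supply
    hlong = (SYNC_POS << 31) | (hfp << 24) | (hsw << 16) | (hbp << 8) | (width // 8)
    vlong = (SYNC_POS << 31) | (vfp << 23) | (vsw << 20) | (vbp << 11) | height
    return sysclk, hlong, vlong

# 960x540 with 1080 x 550 totals at 50 Hz; the 1/2/7 vertical split is an
# arbitrary choice just to reach 550 total lines.
sysclk, hlong, vlong = make_timing(960, 540, 32, 56, 32, 1, 2, 7, 50)
print(sysclk)   # 297000000
```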
Setting up the P2 timing itself raises a question: will people always want the video driver to change the PLL for them (which can sometimes be convenient), or do they need full control of the PLL themselves? I do keep the P2 clock frequency and pixel divider (or XFRQ) settings in this timing structure along with the PLL settings, so I have the knowledge of what to do at the SPIN layer anyway.
If we create another API to set up this timing structure, there may be a couple of extra parameters needed in some cases, such as CFRQ values for PAL/NTSC, and some extensions for extra-wide front porches, colour bursts, breezeways, etc. This needs a little more thought as to how to make things easy for users when setting up this structure. Suggestions are welcome.
Until now I've been far more focussed on the low-level PASM, and this may have resulted in a non-ideal Fastspin interface API being seen by people. In fact, as I recall, this API and the demos were originally whipped up late on the night I released the first beta, and it probably shows. I recall that was an all-nighter too.
... I have this getTiming API that can be passed a known video mode constant ...
That's what I'm using. Adding more entries is cumbersome.
... It may make sense to have an API that figures out more of this for you and you just pass something a bit like a modeline (or set of timing arguments) and a pointer to a buffer and it fills it in for you which you can use later for the userTiming.
I don't have a preference in mind but anything to eliminate the multi-place edits would be welcome.
Setting up the P2 timing itself raises a question: will people always want the video driver to change the PLL for them (which can sometimes be convenient), or do they need full control of the PLL themselves?
I've got no problem with the driver changing the sysclock on me. It needs doing anyway.
... I have this getTiming API that can be passed a known video mode constant ...
That's what I'm using. Adding more entries is cumbersome.
Until there is a more convenient way, just try not to use the getTiming API if you still are; that was more for pre-defined existing (typical) modes. Just set up your own timing structures in the DAT section of your outer object and point your userTiming at that. You should only need a PLL line and the updated mode data each time you try out a new resolution; just 2-3 lines altered in your file each time you test/design a new mode.
PS: I'm not as interested in a mode-line builder routine as I am in just submitting the parameters to the driver's mode-set routine. As I've just found out, DVI/HDMI displays don't have any preset mode list; that's a VGA-only thing.
There is no fixed mode list with HDMI. The old mode specs are worthless now. It's so cool!
EDIT: Of course, there is still the native panel specs. The TV/monitor's internal scan converter has to handle not just pixel scaling but also framerate conversion too. The native panels are usually a fixed scan rate internally. They'll track v-sync a few percent either side.
RBv2 allows horizontal blanking of only 80 pixels, e.g. Htotal 1040 for 960x540. There is supposed to be a different sync polarity for RB but is that irrelevant for DVI/HDMI?
Just set up your own timing structures in DAT memory of your outer object and set your userTiming pointer to that. Should only need a PLL line and the updated mode data each time you try out a new resolution. Just 2-3 lines altered in your file each time you test/design a new mode.
I'm not at all familiar with Spin, hang on ... okay, it needs the HUBSET modes and "SYNC_POS" defined ... ah-ha! Just had to prefix them with "VID#". Worked a treat, thanks.
Another mode is 720p24 (1280x720 at 24Hz). Chip had this working with DVI/HDMI output using standard blanking (Htotal = 1650, Vtotal = 750, sysclk = 297MHz). Has anyone else got this mode going digitally?
For RBv1, Htotal = 1440, Vtotal = 790, sysclk = 273MHz. You might be able to do the following if the monitor/TV supports RBv2: Htotal = 1360, Vtotal = 772, sysclk = 252MHz.
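Those Htotal x Vtotal x 24Hz products line up with the quoted sysclk figures (to rounding), assuming the 10x sysclk-to-pixel-clock ratio used throughout this thread. A quick check:

```python
# Check the 720p24 variants: pixel clock = Htotal * Vtotal * 24 Hz,
# sysclk = 10x pixel clock (the ratio used throughout this thread).

def sysclk_for(htotal, vtotal, refresh_hz, pixel_div=10):
    return htotal * vtotal * refresh_hz * pixel_div

print(sysclk_for(1650, 750, 24))   # 297000000 - standard blanking
print(sysclk_for(1440, 790, 24))   # 273024000 - RBv1, ~273MHz
print(sysclk_for(1360, 772, 24))   # 251980800 - RBv2, ~252MHz
```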
Evan, the values you've been trying seem rather ad hoc. Keeping to the VESA rules/guidelines for reduced blanking could lead to greater success with more TVs/monitors.
A digital panel monitor without flyback requirements can certainly tolerate a much wider range of blanking values than the old analog CRTs. I guess, depending on how they work internally, there may be some baggage left over from the VGA signal-detection days in some older implementations if they try to share common VGA timing circuitry, but I'd hope the newer displays are a lot more flexible nowadays.
True for VGA inputs only - from what I'm seeing here.
Both the DVI and HDMI inputs have none of those old hang-ups. The only significant limit, and not really one from the VGA days, is the 250 MHz minimum TMDS clock for DVI, which doesn't seem to apply to HDMI at all.
EDIT: I need to get a long USB cable out to the lounge and plug the prop2 into the Plasma TV. That'll be a good test for HDMI compatibility on older equipment.
I guess I could set up push button mode select booting from SD card. Won't need the extra long USB that way.
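Back on that 250 MHz TMDS figure: DVI's floor comes from its roughly 25 MHz minimum pixel clock, and TMDS encodes each 8-bit pixel byte into 10 bits per data channel, so the bit rate quoted here is 10x the pixel clock. A quick illustration (plain Python, figures per the thread's usage):

```python
# TMDS serialises 10 bits per pixel per data channel, so the bit rate
# is 10x the pixel clock. DVI's ~25 MHz minimum pixel clock therefore
# corresponds to the 250 MHz TMDS figure being quoted in this thread.

def tmds_bit_rate(pixclk_hz):
    return pixclk_hz * 10

print(tmds_bit_rate(25_000_000))   # 250000000 - nominal DVI floor
print(tmds_bit_rate(14_000_000))   # 140000000 - below spec, yet the TV locked
```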
For general control keywords it should mostly behave the same as P1 Spin if you don't need P2-specific stuff. The P1 Spin documentation can be a rough basis for you if you don't know Fastspin.
Evan, can your TV do 720x576 50Hz at 28 MHz, instead of 27 MHz?
27 MHz (standard):
28 MHz (non-standard): * might need tweaking
I'm not sure about the assumption but the intention is to emulate a retro system with 4:3 display using DVI/HDMI output.
... If you can tweak the horizontal timing a fraction lower you might get to use 297MHz ...
50Hz * 1080 pixels * 550 lines equals exactly 29.7MHz.
Evan, assuming you've tried it, could you please post the P2 long values for 50Hz * 1080 pixels * 550 lines at 297 MHz?
I think TonyB means to try out the 1080 total pixels x 550 total lines @ 297MHz (but 960x540 active pixels), which should be 16:9 and 50Hz.
Maybe the reduced blanking (RB) discussion should be continued here:
https://forums.parallax.com/discussion/169518/vesa-coordinated-video-timing-generator-cvt
CVT generator C source code:
https://cgit.freedesktop.org/xorg/xserver/tree/hw/xfree86/modes/xf86cvt.c
https://cgit.freedesktop.org/xorg/xserver/tree/hw/xfree86/utils/cvt/cvt
... There is supposed to be a different sync polarity for RB but is that irrelevant for DVI/HDMI? ...
Experience tells me polarity has been irrelevant to VGA for decades now too. Probably not relevant since the days of actual EGA CRTs.