Using XINPUT with external 8MHz oscillator
feilipu
Posts: 22
I've an interesting failure with a home built Propeller system, using an 8MHz oscillator as a clock source.
It is linked here, and circuits attached.
I'm able to successfully operate the PS/2 keyboard and the VGA output, and of course upload and store programs via the serial port and EEPROM.
Using the demo files flagged with these headers
''***************************************
''* PS/2 Keyboard Driver v1.0.1 *
''* Author: Chip Gracey *
''* Copyright (c) 2004 Parallax, Inc. *
''* See end of file for terms of use. *
''***************************************
''***************************************
''* VGA Text 32x15 v1.0 [Modified] *
''* Author: Chip Gracey *
''* Copyright (c) 2006 Parallax, Inc. *
''* See end of file for terms of use. *
''***************************************
If I configure the clockmode with XINPUT + PLL16X then the keyboard driver works as expected, but the VGA driver does not synchronise.
If I configure the clockmode with XTAL1 + PLL16X then the keyboard driver continually reports key repeats (even though no keys are pressed), but the VGA driver syncs with a perfect screen.
I've verified that the only difference is the single bit marked X: _clkmode = %0_1_1_0X_111
I think the right thing to do is to use the XINPUT setting for an external oscillator, but then the VGA driver doesn't work, so something must be wrong.
Is there something that I'm missing, here? Any help welcome.
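For reference, a sketch of the two CON sections I'm switching between (the bit patterns in the comments are my reading of the CLK register encoding in the datasheet, so treat them as my assumption):

```spin
CON
  ' Keyboard OK, VGA won't sync:   %0_1_1_00_111 (OSCM = 00, XI pin driven directly)
  '_clkmode = XINPUT + PLL16X

  ' VGA OK, keyboard auto-repeats: %0_1_1_01_111 (OSCM = 01, crystal feedback enabled)
  _clkmode  = XTAL1 + PLL16X

  _xinfreq  = 8_000_000           ' 8 MHz external oscillator module
```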
EDIT & TL;DR: Running at 8MHz with PLL16X is not successful. But it works successfully with XTAL1, a 7.3728MHz oscillator, and PLL16X, giving a system clock of approximately 118MHz.
Phillip
Comments
You could run 8MHz into XI without the PLL multiplier, but then the P1 runs at only 8MHz. Not really useful.
If you want to use a common clock, you need at least to divide it down to, say, 4MHz and feed that to the Propeller; 8MHz is just too fast as a PLL input frequency.
Enjoy!
Mike
I'm surprised to hear that the PLL doesn't run at 128MHz, since that figure is quoted quite often in the manual, and even appears in the example code, as the maximum PLL rate.
To quote Table 1.7... "The Clock PLL's internal frequency must be kept within 64 MHz to 128 MHz – this translates to an XIN frequency range of 4 MHz to 8 MHz."
My hardware is running and the VGA interface is working properly at that rate too, when the XTAL1 setting is selected (just not when XINPUT is selected).
So, I'm happy to continue in this manner.
So, perhaps I'm looking in the wrong place, and the timing on the keyboard driver needs to be adjusted to suit the faster system clock?
I thought this was done automatically in the keyboard driver, but I can go back and look again.
Cheers, Phillip
The documentation is sort of misleading there, and you are not the first to misread it.
With an 8MHz input you would need PLL8X, giving a 64MHz clock, or a 5MHz input with PLL16X, giving 80MHz. The sweet spot is 6.25MHz x PLL16X to run at 96MHz. @Cluso99 even runs at 104MHz with a 6.5MHz input, but with very well designed boards with very short traces, because he builds very small boards.
Enjoy!
Mike
It seems the datasheet is a little misleading in that respect.
I read it as giving an input range of 4MHz to 8MHz, and since I never like leaving cycles on the table, of course I set it up to run at 8MHz.
As I've got some 7.3728MHz oscillators to hand (base clock for the RC2014), I've just tried that base frequency with PLL16.
It works perfectly with XTAL1+PLL16. So that's my issue resolved.
So, if anyone else is ever interested this is a 117.9MHz result.
4 layer board, well buffered 3V3 flood-fill supply and ground. Short traces.
Cheers, Phillip
But the cogs' processors just don't work reliably any more at such a high clock.
Andy
Well done. This might get hot, depending on your use, but it's interesting that it runs stably.
Enjoy!
Mike
The 4 to 8MHz input spec is correct, but like all specs you have to take it together with all the other specs. For instance, I regularly use a 10MHz crystal with a x8 PLL, which still results in the standard recommended operating frequency of 80MHz, although the 10MHz input violates the PLL maximum of 128MHz. Remember that the PLL works by multiplying x16 first and then dividing down, so a 10MHz input runs the internal node at 160MHz; but the Prop was rated taking generous process corners and voltage and temperature extremes into account. We never actually expect or rely on the core operating at or near 128MHz, though.
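As a sketch of that point, the PLLxX setting just selects a tap off the divider chain below the x16 VCO, so Peter's 10MHz crystal works out like this:

```spin
CON
  ' 10 MHz x 16 = 160 MHz internal VCO -- over the 128 MHz rating,
  ' but tolerable in practice per the discussion above.
  _clkmode = XTAL1 + PLL8X      ' VCO / 2: 10 MHz x 8 = 80 MHz system clock
  _xinfreq = 10_000_000
```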
BTW we found the DIP performs just a tad faster than the QFP for some weird reason.
Thanks Peter.
It is for a "retrocomputer" called the RC2014.
It is fun designing with through-hole for 1970s and 1980s technology, and 4 layer boards don't cost too much in the 50x100mm size. Retrocomputing: one of the last justified reasons to use assembly.
I'm sure that the application is for use at "room temperature" only, and I've got a brownout / reset TPS7333 device in the design. So I'm not too worried about the corner cases of temperature and supply variations.
But, if there are manufacturing variations to worry about, then I guess I'll find out when I build a few more.
Cheers, Phillip
Thanks for the tip. Funny you should say that. @macca did two cards early, and they were announced on RC2014 forum. I looked at them closely, but I thought they didn't exactly hit the mark for me, so I designed this one to suit what I needed, an "all in one" solution. As soon as I discussed it on the forum @macca mentioned a third design that he'd also done but not announced. His third design and this design are quite similar. If I'd known about his final design, I probably wouldn't have done this one.
But anyway, now it is done, and there are some subtle but important differences, and the list of differences is linked here.
Also, we discussed your design for the high level graphics driver. I have experience using the FTDI EVE graphics devices, and I'm keen to get a GL language interface onto this UX Module too. It is a stretch goal though, as I've not the PASM experience to make much contribution.
This UX Module is the first Propeller design I've done. The OBEX makes the "on-ramp" very easy. The only novel component to this project is to build an ACIA MC68B50 emulation, on a Z80 RC2014 bus. Everything else has been done before, which is comforting.
[ As an aside, I've been testing the highres_text_vga driver, with 2 cogs, and it works nicely now too. Just needed to adjust the pixel rate calculation for 7MHz granularity and it is good to go up to 1024x768@76Hz ].
You mean the VJET polygon drawing stuff? Yes, it is quite similar in concept to the FTDI EVE hardware, just significantly less powerful - with the 4 rasterizer cogs @80MHz I usually use, that is. 117 MHz is a good bit faster and you may be able to use 5 or 6 cogs. Also, the display list doesn't contain triangles/lines/polygons per se; it contains "trapezoid stacks", essentially a representation of a y-monotone polygon as a set of trapezoids defined by height and left/right slopes. If you've ever written a triangle/poly fill algorithm, you probably understand what I mean. There's some computation involved in generating these from a polygon (triangles, as a special case, are quite simple though). The code for this stuff is in Spin, so if you can read past the spaghetti, here it is.
If you need more help, just ask any questions, I have near infinite time for answering them (especially since it's procrastination season for me right now...).
Yeah, that's a really nice thing, innit? Just copy the files into your project and you're set - it's how embedded development should be.
I don't think the cog budget will stretch so far. Cog 0 for Spin, 1 for ACIA Z80, 1 for Tx/Rx serial, 1 for keyboard, 2 for VGA. I make only 2 cogs spare for rendering. Hmm. Still might be enough to get something.
I just watched the Youtube video on Hexagon. Game looks beautiful. Crazy that you can actually play it well enough to capture that video.
No, sorry. I don't really understand, and I've not written such an algorithm. Only some 2D stuff for an I2C LCD screen, and high level stuff for an ATmega synthesizer using EVE.
More learning needed.
Thanks I've kept 3 base addresses free in the UX Module design. 0x80 is for the ACIA emulation so it will work with the standard RC2014 software. Another is to separate the serial Tx/Rx onto another "TTY" port.
The final base address is to build a graphics device. It might be TMS9918 like, or better I'd prefer it to be based on VJET like solution. Ideally it would work to the same VGA memory as the HiRes VGA driver. A bit like a graphic overlay.
I'm sure I'll have lots of questions on this. I'd love to hear more.
I'll open a new discussion here, or on the github repository if that works better?
Cheers, Phillip
... and because I'm too impatient, I'll just post them here, now, too.
Running both hires text and any sort of graphics at the same time is just not going to happen. However, if you're OK with switching between highres text and lowres text + graphics, it'll work out. One cog is needed to drive the VGA from line buffers, leaving 3 render cogs. 3 might be OK at 108 MHz, but 4 is really the sweet spot. If we can get rid of that pesky keyboard cog that spends most of its time doing absolutely nothing, that'd be 4 render cogs. We just need to find some other place to stuff the PS/2 bitbanging; the scancode conversion and so on can then just be done in Spin.
Anyway, the terminal emulation can be incorporated into the graphics rendering by simply placing some text commands into the display list that point into the terminal buffer. It can be an overlay or an underlay depending on where the commands are inserted. However, the code handling the terminal buffer will need to be adjusted for the smaller screen size. VJET as-is runs at 256x238 with non-square pixels. It could be bumped to 256x240 (the reason it's 238 is that I couldn't get my code to do 240 lines, and it literally makes zero difference on TV due to overscan, so I didn't bother fixing it). If we assume most of that isn't overscanned, you can reasonably get 40x28 characters (each character being 6x8, TMS9918 style). (However, I think to actually render a screen full of text, it will need 4 render cogs.)
Also, it'd really help if the terminal emulation would automagically zero-terminate each line of display memory instead of just leaving spaces to the end of the line.
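On the zero-termination point, a minimal sketch of what the terminal code could do after touching a line (`screen`, `COLS`, and `ROWS` are hypothetical names for illustration, not from the actual driver):

```spin
CON
  COLS = 40                     ' assumed 40x28 character screen
  ROWS = 28

VAR
  byte screen[COLS * ROWS]      ' one byte per character cell

PRI trim_line(row) | i
  ' Replace trailing spaces with zero terminators so the renderer
  ' can stop at the first 0 instead of drawing blanks to end-of-line.
  i := COLS - 1
  repeat while i => 0 and screen[row * COLS + i] == " "
    screen[row * COLS + i] := 0
    i--
```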
The other thing is: what fills the display list? Letting the Z80 write it directly would not only be quite slow, it would also require the Z80 code to know about hub RAM layout (and if it messed up, the render cogs would spill their temp values all over and cause a really bad time). So instead it should probably be a command interface. If the bus interface code can just dump whatever data arrives at its port to the Spin code, that code could generate the display list from commands, similar to the Spin interface of the existing VJET geometry processing code. The problem is that normal Spin is kind of slow at doing that. Flexspin compiles to LMM assembler and is a lot faster, but it'll make your code size explode and as such requires careful optimization to fit everything. Also, when you use it, you'll find and report a bug every 2 days; it's inevitable.
I also do think that the geometry code (especially the polygon function) could be optimized more. Or if flexspin is used, anyways, custom inline ASM could also help.
Thanks! As you correctly imagined, playing well through the capture preview is nigh impossible. So I actually cheated on that a bit by splitting the signal into both the capture and my TV. Although I feel my new PCI capture card has quite a bit less latency than the USB devices.
I'm actually more inclined to give up the serial interface cog (at least once development is done) as a preference, as the keyboard needs to be "fast" (in its own cog?) to allow gaming control.
And, absolutely no problem to make the video modal (lowres text + graphics) and highres.
I'm going to focus on getting the highres text working, as that's just a matter of matching terminal emulation codes and keyboard scanning. This part is critical for CP/M and general usage.
The graphics part is a "stretch" goal.
I'd like to have a command processor, so that the nice GL language can exist. It might be useful without the Z80 bus interface for other people too, so I wouldn't want the Z80 to know too much.
The EVE hardware has 32-bit GL commands which are executed out of an on-chip buffer. The command buffer is rendered line by line for display. I'm not sure, but I'm imagining a cog rendering a slice of the screen and then displaying it. With 4 cogs sharing the render / display it would be a real dance. It looks like the VJET geometry processing code is quite similar in function.
The EVE also has a co-processor that will execute groups of GL commands (saving the user from implementing buttons, boxes, etc.), but implementing co-processor functionality would be a stretched-stretch. I would guess not possible.
Start small.
P.
Yeah, it is like that a bit; the main difference is that the high-level commands have to be pre-chewed by the geometry code. The display list also has space for temp values in it, used to strength-reduce some multiplications out of the rendering cogs.
Adding some macro commands to draw common GUI elements is definitely very easy to do.
However, do know that moving the keyboard handling somewhere else would not make it measurably slower - the reason it generally is handled in its own cog is that the keyboard controls the data transfer speed, and thus the pins have to be monitored constantly. Compare with something like a SNES controller, where the speed is controlled entirely by the host - that is generally read once per frame in a simple Spin routine.
Excuse the slow Basic programming. I had a Basic language WTF moment.
User Experience Module for RC2014
The UX Module has a major port (0x40, 0x80, 0xC0), and a minor port at one higher. I'm thinking the Major port can be a Command/Status port (like the ACIA), and the Minor port can be for Data.
It would be easy to upload 32-bit words in LSB->MSB order from the Z80 to Parallax.
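The LSB-first assembly on the Propeller side could look something like this (a sketch only; `read_data_port` is a hypothetical hook into the bus-interface cog, not existing code):

```spin
PRI read_long : value | i
  ' Assemble a 32-bit command word from four consecutive writes
  ' to the minor (Data) port, least significant byte first.
  repeat i from 0 to 3
    value |= read_data_port << (i * 8)

PRI read_data_port : b
  ' Placeholder for the real bus-interface handshake:
  ' block until the Z80 writes the Data port, then return the byte.
  b := 0
```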
And, If I can dig into the EVE DL Commands a bit more then it might even be possible to replicate them (to some extent).
More when the next PCB is available.
I don't think there's anything that actually needs a 32-bit parameter; it all works with 16 bits (the X positions technically have a fractional part, but that's really not very necessary given how imprecise the rendering is).
Yeah. Here's a random first idea:
Random observation: you seem to be using a 32k EEPROM - I'd recommend upgrading to a 64k one, which will give you some extra space up top to store extra code/data or maybe some user settings.
I did not know that. I had assumed that the EEPROM had to exactly match the expectations of the hardware boot loader, so I selected the usual choice.
Can you recommend a 64kB EEPROM option or part number, please?
P.
Most of us who used it ran it at 64MHz. I'm pretty sure I still have my uOLED 96 Prop somewhere in my collection. There wasn't much programming space left after all the OLED driving code was loaded. IMO, it was pretty much a novelty.
BTW you're not going to run reliably at 128MHz no matter how good you are! There were lots of exhaustive tests done many years ago. With a proper design I run reliably at 104MHz and am totally confident that 108MHz works reliably too. But I wouldn't guarantee anything higher.