Where does official SPIN2 store the CLKFREQ these days?
I have a PASM2 video driver that gets a delay value passed in units of microseconds and needs to be converted into clock ticks for a waitx instruction. For this I need to read the current P2 clock frequency and do some calculations.
Do Flexspin and official Parallax SPIN2 now both store the clock frequency in HUBRAM at address $14 or is there some difference here?
If there is a difference what is a good way to detect and handle this in PASM2 so the code can run on both platforms?
Update: just found it - looks like it is still at a different location. So the question becomes: what's the best way to differentiate the two platforms so the same code can be used on both?
"The current clock frequency, located at LONG[$44]. Initialized with the 'clkfreq_' value."
I know flexspin lets you do @clkfreq to find out the address; not sure if that works on Chip's compiler.
Aha, yes, the documentation for Chip's SPIN2 mentions @clkfreq as well. Should be fine if both platforms support it. I'll give it a try. Cheers!
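To tie this back to the original question, here's a minimal PASM2 sketch of the microseconds-to-ticks conversion using the CORDIC. It assumes the caller passes the hub address of the clock-frequency long in PTRA (e.g. @clkfreq from Spin2, so it stays compiler-agnostic); the register names are placeholders.

```spin2
DAT
                org
' entry: PTRA -> clock-frequency long in hub, "delay" = microseconds
delay_us        rdlong  freq, ptra              ' current sysclock in Hz
                qmul    freq, delay             ' 64-bit product: freq * us
                getqy   hi                      ' upper 32 bits of product
                getqx   ticks                   ' lower 32 bits of product
                setq    hi                      ' 64/32-bit divide: product / 1_000_000
                qdiv    ticks, ##1_000_000
                getqx   ticks                   ' ticks = freq * us / 1_000_000
                waitx   ticks                   ' stall for that many sysclocks
                ret

freq            long    0
delay           long    100                     ' example: 100 us
hi              long    0
ticks           long    0
```

The 64-bit intermediate matters: at 320 MHz, freq * us overflows 32 bits for delays above about 13 us.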
If you intend adjusting that value, and setting PLL clock mode, at run-time, I've come to realise that also knowing the frequency of the installed crystal or external oscillator is the most flexible way to support all boards. That's something Peter grappled with early on with his 12 MHz crystal.
At this stage, the only solution I have for achieving this is to extract the info from the compile-time clkmode_ symbol. Details here - https://forums.parallax.com/discussion/comment/1528982/#Comment_1528982
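For illustration, here's a hedged Spin2 sketch of that kind of decode. It assumes the standard PLL clock-mode field layout %E_DDDDDD_MMMMMMMMMM_PPPP_CC_SS, where sysclock = XIN / (D+1) * (M+1) / P; calc_xinfreq() is a hypothetical name, and the bit fields should be verified against the Silicon documentation before relying on it.

```spin2
PUB calc_xinfreq() : xin | d, m, p
  d := (clkmode_ >> 18) & $3F                     ' XI divider field
  m := (clkmode_ >> 8) & $3FF                     ' PLL multiplier field
  p := (clkmode_ >> 4) & $F                       ' post-divider field
  p := (p == $F) ? 1 : (p + 1) * 2                ' %1111 means divide-by-1
  xin := muldiv64(clkfreq_, (d + 1) * p, m + 1)   ' invert: XIN = sysclock * (D+1) * P / (M+1)
```

Plain shifts and masks are used rather than bitfield indexing, since (as noted below) Flexspin's bitwise indexing of a constant isn't legal in Pnut.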
Yes, I've tried to get at this information in the past when computing the PLL settings, but ran into problems doing so in a tool-agnostic way (supporting both flexspin and PropTool). Maybe these days it is doable...
On that note, my use of Flexspin's bitwise indexing of a constant isn't legal in Pnut either. Chip said he'll have a look at maybe adding it as a new feature, a few posts down from my link there.
Here's the generic solution for now: (Tested and working with both Pnut and Flexspin)
EDIT: Added the missing first line.
EDIT2: Bug fix - divp wasn't bounded correctly.
And here's a spin test program that doesn't need DEBUG() to print. Pnut's DEBUG() has a notable limitation: it simply can't handle the sysclock frequency being adjusted at run time.
EDIT: Updated with above bug fix.
EDIT2: Added print of the critical XIN frequency
Update 26-10-2021: Arrg! Must have been tired - it wasn't accessible to print. Right, changed the setting method to return xinfreq instead of the redundant clkmode. It actually will print the XIN frequency now.
Update 19-1-2022: Found a lingering regression in div33(). At some stage early on I'd changed from using the C flag to the Z flag without testing. That was a mistake, because Z wasn't being set in the same manner as C previously was. Reverted back to using the C flag now.
It's not limited to PNut. In flexspin too, I found I couldn't change the P2 clock frequency at run time for my video driver without the serial output being corrupted. The solution was to debug at the startup frequency with no video output, which was fine for the bug I had; but in cases where real-time operation is required during debug that wouldn't be okay, and you'd want to run at the regular frequency. I can see it would be nice to have the debug adapt to the current P2 clock frequency on the fly, but it would have to add a frequency check to each debug output operation, which might slow it down a little. Basically it needs a RDLONG, a comparison to the last known frequency, and a jump to a handling routine that updates it and resets the serial baud timing if they differ.
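A minimal PASM2 sketch of that check: re-read the frequency before each output and retune the transmit smartpin only when it has changed. TX_PIN and the hub address in PTRB (e.g. @clkfreq) are assumptions, and the WXPIN value shown is the simple integer case - bit period in X[31:16], word size minus 1 in X[4:0], no fractional baud.

```spin2
DAT
                org
' call before each transmit; PTRB -> clock-frequency long in hub
baud_check      rdlong  pa, ptrb                ' current sysclock
                cmp     pa, last_freq   wz      ' changed since last output?
   if_z         ret                             ' no - transmit as usual
                mov     last_freq, pa
                qdiv    pa, baud                ' integer clocks per bit
                getqx   pa
                shl     pa, #16                 ' bit period into X[31:16]
                or      pa, #7                  ' 8-bit words (size - 1)
                wxpin   pa, #TX_PIN             ' retune the async-TX smartpin
                ret

last_freq       long    0
baud            long    230_400                 ' example baud rate
```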
Have a look at my source code. In Flex, I'd had that part sorted for years. Even Flex's debug() is sorted with a single wxpin(). Pnut's debug() is a different story - it's locked inside the protected hubRAM - https://forums.parallax.com/discussion/comment/1528269/#Comment_1528269
@evanh Are you implying flexspin's DEBUG can now handle clock frequency changes, or do you need some other stuff patched? If it's meant to work now that's not what I see with this version I grabbed from the top of tree recently. As soon as the PLL changes it still messes up the output.
Version 5.9.3-beta-v5.9.2-33-g35412c83 Compiled on: Sep 26 2021
EDIT: Ok, so I looked at your code in post #8 and see how you have done this in flexspin, evanh. It's your own pieces added on top; it's not part of flexspin itself. Though it would be nice if it was at some point.
Actually, sorry, that source is not how I use Flex's debug() at all. I've removed all use of debug() there because it can't be used in Pnut/Proptool. Not until Chip makes a feature change at least.
The Flex debug() baud adjustment is a simple wxpin() to the hardware. In C you can use _setbaud() instead.
Duh! _setbaud() is present in FlexSpin too, not just FlexC. Here it is, quickly ported back to debug(), for FlexSpin only. Note: the source above works with Pnut too, so use that instead if you need Pnut compatibility.
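To illustrate the approach being described (FlexSpin only, and hedged - the DEBUG_BAUD constant and set_sysclock() wrapper here are assumptions made for the example, not code from the thread):

```spin2
CON
  DEBUG_BAUD = 230_400                    ' assumed debug baud rate

PUB set_sysclock(mode, freq)
  clkset(mode, freq)                      ' switch clock mode/frequency (Spin2 builtin)
  _setbaud(DEBUG_BAUD)                    ' FlexSpin builtin: retime the debug serial pin
  debug("sysclock now ", udec(clkfreq))   ' subsequent debug() output at the new timing
```

The point is simply that _setbaud() can be called again after every clkset(), as many times as needed.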
_setbaud() works? Because it really shouldn't - the debugger resets the serial pin to the correct mode on entry.
_setbaud() does work with the regular -g software-only debug, but that doesn't work in PASM.
The more correct solution to the issue is to set your initial clock to match the clock you're going to switch to.
The Flex compiled debug() is just a wrapper around printf(), even for Spin code. There's no locked debugger. It will use whatever baud you set with the _setbaud() function, as many times as you like. Try my second example - pllmode-debug.spin2. It adjusts the timing hundreds of times each run. No trickery, just using Eric's built-in function to do what it's intended for.
Then there's no user switching at all, clkset() becomes entirely redundant.
Chip just needs to improve Pnut's debug() to handle resetting the baud after or during a clkset().
That's not true anymore; real BRK-based debugging is a thing now.
What is BRK?
And when did Flex get it? It's true I've been using -g for debug features. It's the only option I knew about.
At any rate, that just means -gbrk also needs improving, is all.
Just like clkset(), if debug() doesn't support _setbaud() it's kind of pointless having _setbaud() at all.
BRK is the instruction used for setting the DEBUG break condition, I think. See SETBRK/BRK/GETBRK, etc.
Oh, heh, wasn't expecting a machine instruction, never looked at that section of the docs.
Huh, GETBRK works at user level code too. Pulls out a bunch of hidden status flags.