That's right. ABORT now executes TASKSTOP if no \ is in the method-call chain. And then TASKSTOP will do a COGSTOP when no tasks are left. I thought that was the logical thing to do. What do you think?
This is OK; old programs which use ABORT without tasking work as expected.
Pnut 47.2 is now working.
@ersmith said:
Please let's not be mixing in CON constants with the preprocessor, I really like what Chip has now. Having a way to define symbols in the child might be useful but perhaps some new syntax could be implemented for that.
Yeah we'd want a nice clean way to do it. Doesn't have to use the CON constants but would be handy to have some way for parent code to control what the child implements or exposes to the parent. That way you don't have to modify the child code at all.
Although another solution might end up being some common included "config" file that gets modified for a customized solution. I'm just trying to avoid a situation where you'd need to hand-edit multiple files to configure the same macro symbols in each file and keep that information consistent across those files. Thinking further: if the tool always controls these shared macro symbols by defining them on the command line, this becomes a non-issue.
Command-line DEFINE/UNDEF of master pre-processor symbols is clean and hopefully sufficient. Getting preprocessor symbols tangled up with CON symbols could pose chicken-and-egg problems.
Maybe, and I suggest this very tentatively as something to look at down the road, we could also have a #define_global that acts like #define but applies to all subsequent files (just as though it were defined on the command line). I don't know if this would be tricky to add in PNut or not. Flexspin has something like this with #pragma exportdef NAME which is ugly but compatible with C. It's handy for some kinds of things, as it allows a main program to define symbols to be consumed by its subobjects.
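A rough sketch of how the two mechanisms compare (the #define_global directive is only the suggestion above, not an existing feature; the BAUD symbol and "serial" object name are made up for illustration; only #pragma exportdef exists today, in flexspin):

#define BAUD 230_400
#pragma exportdef BAUD        ' flexspin today: export BAUD to all sub-objects
'#define_global BAUD 230_400  ' hypothetical PNut equivalent of the two lines above

OBJ
  ser : "serial"              ' this child could now test #ifdef BAUD

Either way, the parent decides the configuration and the child files need no editing.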
@Rayman said:
Shouldn’t define have global scope to start with?
No, preprocessors have always traditionally worked on a file-by-file basis. That way you don't have to worry about a #define in one file corrupting or conflicting with one in a totally different file.
If an #include file feature is eventually supported we could just put common defines in a single shared file there and include them in the required files. This might also be useful for including shared CON constants from different levels in the object hierarchy.
Being able to #include different files from #ifdef logic is a particularly useful thing, otherwise everything conditional needs to be put into a single file - or at least on a per included object basis (which itself is going to be very useful).
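For example, a conditional include might pick one shared config file per build (the file names and the DEBUG symbol are made up for illustration; neither #include nor this usage exists in PNut yet):

#ifdef DEBUG
#include "config_debug.spin2"    ' shared CON constants and defines for debug builds
#else
#include "config_release.spin2"  ' shared CON constants and defines for release builds
#endif

Every file needing the shared symbols would include the same single file, giving one point of edit.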
One thing that is probably going to be important down the line is to automatically predefine some internal symbols we can use, like the fact this compiler tool is PNut, and the version of the compiler so these can be checked if deviations in the code are required for handling minor differences with different compilers - PNut vs Flexspin for example, or features only available after some version of PNut.
Chip, I'm keen on having a compile-time symbol indicating the max size of an "inline" code block that can fit in cog RAM. I note the documented value has changed, albeit not by a great amount, a couple of times.
I've discovered that bad things happen if the code doesn't fit, yet no warning or error is issued at compile time. I'd like to be able to do something like FIT _INLINE_MAX at the end of some inline PASM code and have it emit a compile error if it doesn't fit.
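The requested usage might look something like this (the _INLINE_MAX symbol is the proposal here, not an existing PNut feature; the PASM body is just filler):

PUB blip(pin)
  org
        drvnot  pin              ' toggle the pin
        waitx   ##10_000
        drvnot  pin
        fit     _INLINE_MAX      ' proposed: compile error if block exceeds cog RAM space
  end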
EDIT: Oops, actually testing it with Pnut, I see you're doing all this automatically during compile anyway ...
@rogloh said:
One thing that is probably going to be important down the line is to automatically predefine some internal symbols we can use, like the fact this compiler tool is PNut, and the version of the compiler so these can be checked if deviations in the code are required for handling minor differences with different compilers - PNut vs Flexspin for example, or features only available after some version of PNut.
Good idea. I will add these symbols:
__PNUT__
__PNUT_V48__ 'current version
Will __PNUT_V48__ remain defined in all future versions that remain compatible with v48 code?
I guess it would need to, since we only have defined and undefined states.
I could make this dynamic inside the preprocessor, so that we wouldn't actually need a bunch of separate symbols in the table.
I don't think __PNUT_V48__ needs to be defined, since if there's a preprocessor present at all then PNut is at least version 48. For later versions, maybe, although perhaps rather than version numbers you'd want to define symbols for various new features (to take an older example, something like __PNUT_HAS_TASKS__).
If an #include file feature is eventually supported we could just put common defines in a single shared file there and include them in the required files. This might also be useful for including shared CON constants from different levels in the object hierarchy.
Being able to #include different files from #ifdef logic is a particularly useful thing, otherwise everything conditional needs to be put into a single file - or at least on a per included object basis (which itself is going to be very useful).
#include will be very very useful.
But I have another big problem. My Propeller 2 project is very big, with many objects.
The last PNut version that can compile my software is v35u.
All newer versions give the error: OBJ data exceeds 1024k limit.
I think Chip cannot expand the OBJ data space.
So I must wait for PNut48 TS, which has the #include command and no OBJ data space limit.
Try Spin Tools IDE?
Spin Tools IDE is an excellent suggestion, and worth trying. You could also try FlexProp; the bytecode compiler in FlexProp is quite different from the standard one but can usually create smaller code.
I don't need a new IDE. I am happy with the VSC Editor.
And the compiler from Macca doesn't compile unused methods.
I frequently call such "unused" methods via callbacks.
If you can call the method then it isn't really unused, is it? If the compiler is removing them that sounds like a bug in the compiler. @macca is usually very quick to fix this kind of bug when it's reported. (Taking the address of a method should mark it as "used".)
It is not a bug in the compiler. I modify a pointer to a method in order to call the "unused" methods.
You can disable the unused method removal from the preferences page, and the command line compiler doesn't remove unused methods by default anymore.
I posted a new v48.1 to fix a bug introduced in v47 which caused SEND() and RECV() to not work.
When I added cooperative multitasking, I had moved the locations of the MRECV and MSEND registers by one location and I had forgotten to update the compiler with this change.
@ersmith said:
I don't think __PNUT_V48__ needs to be defined, since if there's a preprocessor present at all then PNut is at least version 48 . For later versions, maybe, although perhaps rather than version numbers you'd want to define symbols for various new features (to take an older example, something like __PNUT_HAS_TASKS__).
Yeah it's much better to separate the fact this code is compiled by PNUT vs the version number/feature. Then we can check commonly with a single symbol for PNUT and, if we ever need to worry about it further, then a version or feature number. Doing it by feature makes it simpler, otherwise we sort of need a version macro defined meaning the version is greater than xxx, rather than xxx itself.
Eg. generally we just need to do this and won't care so much about sub-versions
#ifdef PNUT
' PNUT version implementation
#else ' assume flex, or instead could do a flex specific Macro name check here
' flexspin version implementation
#endif
But occasionally, if a newer feature is essential to the code logic, we may need this; checking lots of different version numbers for a match would get messy:
#ifdef PNUT
#ifdef PNUT_HAS_TASKS
' new task feature
#else
return ERR_NOT_SUPPORTED ' not supported in older versions
#endif
#else ' assumes flex, or instead could use a flex specific Macro name check here
' flexspin version no tasks yet
return ERR_NOT_SUPPORTED
#endif
I added a preprocessor symbol 'PNUT' that is always defined.
Also, no more -U on the command line to undefine symbols. All symbols have the same rules now - command-line symbols are no longer special.
Comments
#define scope_in_current_file
##define scope_in_current_file_and_overrides_all_obj_files
I just posted a new v48 which includes a preprocessor and flash-image saving. P2_PNut_Public is updated, as well.
I tested the flash file creation by adding an External Tool to Spin Tools IDE.
I put the created file onto a uSD, ran a program to copy it into the flash and reboot. Everything worked as expected. Thanks, Chip.
Hi Chip,
I cannot use {Spin2_v48} in my program and compile it with PNut v48.
Error: Highest selectable Spin2 version is v47.
Uwe
Because no new keywords were introduced in v48, {Spin2_v47} is as high as you can select. I will make a note of this.
The preprocessor is always enabled, but only does something when a preprocessor command is encountered.
Good news! Thanks for testing it, Jon.
Hi Chip,
send() is no longer working in PNut v48.
Uwe
Oh, no! Sorry about this. I will fix it.
P2_PNut_Public repository is now updated, too.
Chip,
Please consider adding a digit to your .exe file name so that bug-fix versions can be differentiated. Perhaps pnut_v48x1.exe.