PNut/Spin2 Latest Version (v44 -Data Structures Added, New Methods for Memory Testing/Manipulation) - Page 62 — Parallax Forums


Comments

  • @cgracey said:

    @macca said:

    @ke4pjw said:
    Having a VER block could also (possibly?) help with changes in how directives behave, e.g. the REPEAT loop enumeration change that happened a while back. Though, I bet that would make Chip and Eric's jobs more complex.

    If I'm not wrong, the repeat behaviour is implemented in the runtime interpreter; the compiler should have a copy of the old interpreter (or be able to patch it) and upload the appropriate version.
    While this may be "trivial", I remember that some time ago the introduction of a new keyword (I don't remember which, maybe FIELD) caused a renumbering of all bytecodes (existing binaries were not compatible with the new interpreter). In such a case, the compiler not only needs the old interpreter but also has to generate completely different bytecode. This would be very bad!

    We almost need to have each version of PNut available and have the source code name which version it needs.

    The source code does need to name which version it needs (I thought we had pretty much converged on having a version identifier or section at the start of files that need new keywords?)

    But I don't think we need to keep old versions of PNut around, or old interpreters. We just need to avoid making breaking changes to the existing keywords (of the REPEAT loop sort). Instead any new behavior should be marked by new keywords.

    Maybe we could take a stealthy approach, where if we see a keyword being used as a method, variable, or constant name, that keyword is removed from the symbol table and re-entered as the type the user has declared?

    I think that's kind of what @macca is doing. That will work well for new keywords that are restricted to PUB and PRI sections, but less well for things that update OBJ, CON, VAR, or DAT (where keywords and declarations can be mixed).

  • cgracey Posts: 14,134
    edited 2023-11-15 14:25

    @ersmith said:

    @cgracey said:

    @macca said:

    @ke4pjw said:
    Having a VER block could also (possibly?) help with changes in how directives behave, e.g. the REPEAT loop enumeration change that happened a while back. Though, I bet that would make Chip and Eric's jobs more complex.

    If I'm not wrong, the repeat behaviour is implemented in the runtime interpreter; the compiler should have a copy of the old interpreter (or be able to patch it) and upload the appropriate version.
    While this may be "trivial", I remember that some time ago the introduction of a new keyword (I don't remember which, maybe FIELD) caused a renumbering of all bytecodes (existing binaries were not compatible with the new interpreter). In such a case, the compiler not only needs the old interpreter but also has to generate completely different bytecode. This would be very bad!

    We almost need to have each version of PNut available and have the source code name which version it needs.

    The source code does need to name which version it needs (I thought we had pretty much converged on having a version identifier or section at the start of files that need new keywords?)

    But I don't think we need to keep old versions of PNut around, or old interpreters. We just need to avoid making breaking changes to the existing keywords (of the REPEAT loop sort). Instead any new behavior should be marked by new keywords.

    Maybe we could take a stealthy approach, where if we see a keyword being used as a method, variable, or constant name, that keyword is removed from the symbol table and re-entered as the type the user has declared?

    I think that's kind of what @macca is doing. That will work well for new keywords that are restricted to PUB and PRI sections, but less well for things that update OBJ, CON, VAR, or DAT (where keywords and declarations can be mixed).

    We could identify all user symbols in each section. Why not? This could totally solve the problem, couldn't it? A warning could pop up after compile, identifying which keywords were canceled and became user symbols.

    The REPEAT-var debacle involves v35n-v37. That is a real pain to handle. Edit: And it's impossible to resolve when mixing objects that used either type of REPEAT. This will have to be chalked up as a youthful indiscretion with lasting consequences.
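
    As a user-side sketch of that demotion idea (hypothetical code and behavior, not actual PNut output):

    CON field = 5      ' FIELD is also a newer Spin2 keyword; under this scheme it would be
                       ' demoted to a user constant, with a post-compile warning

    PUB main() : r
      r := field       ' resolves to the user constant, not the keyword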

  • @cgracey said:

    Maybe we could take a stealthy approach, where if we see a keyword being used as a method, variable, or constant name, that keyword is removed from the symbol table and re-entered as the type the user has declared?

    I think that's kind of what @macca is doing. That will work well for new keywords that are restricted to PUB and PRI sections, but less well for things that update OBJ, CON, VAR, or DAT (where keywords and declarations can be mixed).

    We could identify all user symbols in each section. Why not? This could totally solve the problem, couldn't it? A warning could pop up after compile, identifying which keywords were canceled and became user symbols.

    That's fine for something like a compiler that's already doing multiple passes over the source code, but it seems like it could be a real pain for other tools like editors. EDIT: Actually, come to think of it, it may not always be fine: if the user redefines something like IF or WHILE then reading the code could become very "interesting". I still think the VER 43 solution (or something like it) is more straightforward in general.

    The REPEAT-var debacle involves v35n-v37. That is a real pain to handle. Edit: And it's impossible to resolve when mixing objects that used either type of REPEAT. This will have to be chalked up as a youthful indiscretion with lasting consequences.

    Yes. I think we have to draw a line and say v41 (or something similar) is the "base" Spin2 and nothing earlier is supported. After that though we should try not to break existing code, so that we can get a stable OBEX.

  • cgracey Posts: 14,134

    @ersmith said:

    @cgracey said:

    Maybe we could take a stealthy approach, where if we see a keyword being used as a method, variable, or constant name, that keyword is removed from the symbol table and re-entered as the type the user has declared?

    I think that's kind of what @macca is doing. That will work well for new keywords that are restricted to PUB and PRI sections, but less well for things that update OBJ, CON, VAR, or DAT (where keywords and declarations can be mixed).

    We could identify all user symbols in each section. Why not? This could totally solve the problem, couldn't it? A warning could pop up after compile, identifying which keywords were canceled and became user symbols.

    That's fine for something like a compiler that's already doing multiple passes over the source code, but it seems like it could be a real pain for other tools like editors. EDIT: Actually, come to think of it, it may not always be fine: if the user redefines something like IF or WHILE then reading the code could become very "interesting". I still think the VER 43 solution (or something like it) is more straightforward in general.

    The REPEAT-var debacle involves v35n-v37. That is a real pain to handle. Edit: And it's impossible to resolve when mixing objects that used either type of REPEAT. This will have to be chalked up as a youthful indiscretion with lasting consequences.

    Yes. I think we have to draw a line and say v41 (or something similar) is the "base" Spin2 and nothing earlier is supported. After that though we should try not to break existing code, so that we can get a stable OBEX.

    Agreed.

  • If you're going to create a new block/section name just for versioning, I wonder whether it would be useful to make it a more general container for any future compiler version requirements and/or switches we come up with, rather than just a statement of a single version number. For now we might only need a version number in it, but later, if you change any behaviour again or add features that might break stuff, that could be indicated by another keyword in this section. So instead of just a VER 41 or something, we could have this as a USES block:

    USES
       V41
       REPEAT_VAR_END
       ' ... (any future things that aren't just version related but might alter the compiler behavior - like compiler command line switches)
    
    PUB blah()
    

    This would allow us to "solve" the repeat-var stuff, at least as a one-off. If the compiler sees REPEAT_VAR_END (or a suitable name) it could still generate the repeat code that "special" way vs the current/original way. Then the only thing that has to change (for the small number of people who really insist it needs to work like that because they can't use the latest compiler otherwise) is the code requiring that behaviour, just once, to include this USES block in the file if it hasn't already been changed back to behave the current way. An old compiler will simply barf on the USES block, forcing an upgrade to the newer version that supports it.

    This new USES block should remain fairly generic and could also indicate other things later, such as libraries, if the language ever evolves in that way.

    Moving forward, however, you will really need to make your compilers support the superset of all old versions, honouring their old features and performing the legacy behaviour indicated by the older version number if it differs from the new; otherwise old code will not be usable anymore with upgraded compilers. And the worst thing then will be if you bring in someone's older code as a shared object and have to go hunt down an older matching compiler version just to build it, which may not be compatible with your own code if it uses any newer features requiring a newer compiler. Ending up in situations like that would truly suck hard.

    USES could be some other short name like NEEDS, USE, REQ (requires/request) etc. Examples...

    USE
         V41 ' minimum spin compiler version
         OPTS ' optimization enabled
         FP_MATH ' floating point math library included
         NO_DEAD_CODE ' dead code removal
    
  • macca Posts: 770
    edited 2023-11-16 07:26

    I'm not very keen on flags that imply changes to the interpreter's behavior, but one possibility is to modularize the interpreter's source with preprocessor directives and recompile it along with the program and the appropriate flags (this of course requires PNut to implement the preprocessor directives; Spin Tools IDE and flexprop already implement them, albeit in slightly incompatible variants). This will not solve the problem of two sources requiring different behavior; that should generate an error whenever the source requirements are not compatible with the interpreter. And it would be a real pain if the change required renumbering the bytecodes, as has happened in the past.

  • Let's not let the perfect become the enemy of the good. Please, can we just agree on a very simple, basic version standard that will allow us to extend the Spin2 language without breaking existing code? If we decide some other new cool features are needed in the future, we'll be able to add those.

    The proposal that seemed to have the most momentum was a VER block with a single entry (a version number) and which has to come first in the file (preferably on the first line). Once that's in, we can expand on it.

    The only disadvantages I see to the VER block are (1) it introduces yet another keyword which could break existing code, and (2) code with a VER block cannot compile with an older compiler, even if the code does not actually use any new features. Both of these could be fixed by putting the version declaration in a comment. But again, let's not let the perfect get in the way of the good. (2) is not really that big a deal in practice, and (1) can be avoided if the VER keyword can only appear first in the file (the compiler can remove "VER" from the keyword table if we see some other section like CON or DAT before VER). So I'm perfectly happy to add VER to flexspin as is.

    Ultimately the choice is up to @cgracey though, because his compiler defines the Spin2 standard. All I will say is that the status quo isn't very sustainable, unless we want the OBEX to bit-rot.
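
    For concreteness, the proposed VER block (hypothetical syntax: a single version entry, first in the file) might look like:

    VER 43                      ' single entry, preferably on the first line

    CON _clkfreq = 180_000_000

    PUB main()
      ' code below may rely on v43 features such as LSTRING()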

  • VonSzarvas Posts: 3,426
    edited 2023-11-21 15:30

    @ersmith said:
    The only disadvantages I see to the VER block are (1) it introduces yet another keyword which could break existing code, and (2) code with a VER block cannot compile with an older compiler, even if the code does not actually use any new features. Both of these could be fixed by putting the version declaration in a comment....

    Those were my immediate thoughts too. Using a {comment tag} or verbose CON statement (just like the DEBUG configuration statements do), would totally avoid any further breakage, and also set a precedent for future expansion (just as the CON debug_* options exist and might grow, then the CON compiler_* options might grow).

    Introducing yet another keyword just to work around some other new keyword problem doesn't feel like the right solution to me. I'd rather see the original issue dealt with, i.e. use BYTE() etc. instead of introducing BYTES(), and set a simple comment flag for moving forward. Otherwise we're making a mountain out of a molehill.

    I always liked the elegant way the BasicSTAMP compiler dealt with language versions and would have liked that to be adopted for Spin. It would have saved all the file-switching pain with PropellerTool too.
    {$P1} {$P2} {$V32} etc.

    Between comments and CON blocks, there's already two ways there to set the compiler version and processor type, without needing to add new keywords and break things.
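
    A CON-based variant of that idea (hypothetical compiler_* symbol names, in the spirit of the existing CON debug_* options) might look like:

    CON
      compiler_version = 43     ' hypothetical option name: minimum Spin2 language version

    An older compiler would parse this as an ordinary constant declaration, which is exactly the no-breakage property described above.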

  • Don't go the Arduino direction!
    After some time, you can't build your working code again with the updated tool.
    I've been frustrated many times, finding breakage inside tested and validated programs due to an updated IDE and all the surrounding stuff.

    And just as Jon said, how do we deal with different versions extracted from the OBEX?
    Can a compiler jump between different versions inside the "OBJ" part of a Spin2 program?

    This is one reason I don't really use the P2 now; I only play and look.
    I'm waiting for the P2 ecosystem to get mature and stable.

  • Rayman Posts: 14,517

    Guess this is the reason PropTool has the option of including the tool itself in the archive...

    Must have been a time when Spin1 code was also evolving...

  • @cgracey : Have you thought about how, or if, you want to provide version protection in PNut?

    For now in flexprop I've done kind of what Macca does: by default the new keywords (starting at FIELD) can be overridden by user variables. When this happens a warning is printed. I've also added an option to have an explicit version like {$ver 41} in the file, in which case the keywords up to that version become "hard" and can no longer be overridden, and keywords with a later version are ignored completely. This is probably all overkill, but in the absence of official guidance I'm trying to keep the compiler flexible.

  • cgracey Posts: 14,134

    @ersmith said:
    @cgracey : Have you thought about how, or if, you want to provide version protection in PNut?

    For now in flexprop I've done kind of what Macca does: by default the new keywords (starting at FIELD) can be overridden by user variables. When this happens a warning is printed. I've also added an option to have an explicit version like {$ver 41} in the file, in which case the keywords up to that version become "hard" and can no longer be overridden, and keywords with a later version are ignored completely. This is probably all overkill, but in the absence of official guidance I'm trying to keep the compiler flexible.

    I am thinking about all this.

    I kind of like the syntax {v42}, since it doesn't need a new block and it's terse. It would just have to be documented up front.

    I am kind of bummed about requiring ANYTHING to be added for new programs. That is my hang-up.

    What appeals to me is checking for name conflicts against NEW keywords, and then letting them be user names, with a warning.

  • Electrodude Posts: 1,651
    edited 2023-11-21 15:11

    @cgracey said:

    @ersmith said:
    @cgracey : Have you thought about how, or if, you want to provide version protection in PNut?

    For now in flexprop I've done kind of what Macca does: by default the new keywords (starting at FIELD) can be overridden by user variables. When this happens a warning is printed. I've also added an option to have an explicit version like {$ver 41} in the file, in which case the keywords up to that version become "hard" and can no longer be overridden, and keywords with a later version are ignored completely. This is probably all overkill, but in the absence of official guidance I'm trying to keep the compiler flexible.

    I am thinking about all this.

    I kind of like the syntax {v42}, since it doesn't need a new block and it's terse. It would just have to be documented up front.

    I am kind of bummed about requiring ANYTHING to be added for new programs. That is my hang-up.

    What appeals to me is checking for name conflicts against NEW keywords, and then letting them be user names, with a warning.

    If you go this route, please make it e.g. {$v42} and not just {v42}. The latter is just a funny-looking comment, while the former is clearly a directive.

    But, as others have said, all these versioning problems could be eliminated by just calling the new keywords byte, word, and long. Every syntax highlighter already has to deal with those keywords (as well as e.g. and and or) already having multiple fundamentally different meanings - what's one more overloading?

  • cgracey Posts: 14,134

    @Electrodude said:

    @cgracey said:

    @ersmith said:
    @cgracey : Have you thought about how, or if, you want to provide version protection in PNut?

    For now in flexprop I've done kind of what Macca does: by default the new keywords (starting at FIELD) can be overridden by user variables. When this happens a warning is printed. I've also added an option to have an explicit version like {$ver 41} in the file, in which case the keywords up to that version become "hard" and can no longer be overridden, and keywords with a later version are ignored completely. This is probably all overkill, but in the absence of official guidance I'm trying to keep the compiler flexible.

    I am thinking about all this.

    I kind of like the syntax {v42}, since it doesn't need a new block and it's terse. It would just have to be documented up front.

    I am kind of bummed about requiring ANYTHING to be added for new programs. That is my hang-up.

    What appeals to me is checking for name conflicts against NEW keywords, and then letting them be user names, with a warning.

    If you go this route, please make it e.g. {$v42} and not just {v42}. The latter is just a funny-looking comment, while the former is clearly a directive.

    But, as others have said, all these versioning problems could be eliminated by just calling the new keywords byte, word, and long. Every syntax highlighter already has to deal with those keywords (as well as e.g. and and or) already having multiple fundamentally different meanings - what's one more overloading?

    We are going back to BYTE(), WORD(), and LONG().

    I was thinking comment words like Spin2_v42 could be used to set compiler level. You could put a $ in front of it. Maybe we'll require it. Not sure.

  • @cgracey said:

    @Electrodude said:

    @cgracey said:

    @ersmith said:
    @cgracey : Have you thought about how, or if, you want to provide version protection in PNut?

    For now in flexprop I've done kind of what Macca does: by default the new keywords (starting at FIELD) can be overridden by user variables. When this happens a warning is printed. I've also added an option to have an explicit version like {$ver 41} in the file, in which case the keywords up to that version become "hard" and can no longer be overridden, and keywords with a later version are ignored completely. This is probably all overkill, but in the absence of official guidance I'm trying to keep the compiler flexible.

    I am thinking about all this.

    I kind of like the syntax {v42}, since it doesn't need a new block and it's terse. It would just have to be documented up front.

    I am kind of bummed about requiring ANYTHING to be added for new programs. That is my hang-up.

    What appeals to me is checking for name conflicts against NEW keywords, and then letting them be user names, with a warning.

    If you go this route, please make it e.g. {$v42} and not just {v42}. The latter is just a funny-looking comment, while the former is clearly a directive.

    But, as others have said, all these versioning problems could be eliminated by just calling the new keywords byte, word, and long. Every syntax highlighter already has to deal with those keywords (as well as e.g. and and or) already having multiple fundamentally different meanings - what's one more overloading?

    We are going back to BYTE(), WORD(), and LONG().

    I was thinking comment words like Spin2_v42 could be used to set compiler level. You could put a $ in front of it. Maybe we'll require it. Not sure.

    Going back to BYTE(), WORD(), and LONG() won't help with LSTRING(). Honestly I think it's time to bite the bullet and provide some kind of long term solution for the compiler, whether it's a version number or having a way for user variables to override new keywords. Both @macca and I have already implemented the new keywords in a backwards compatible way in our respective compilers.

  • cgracey Posts: 14,134
    edited 2023-12-13 09:34

    I posted a new v43 at the top of this thread.

    • Renamed BYTES()/WORDS()/LONGS() methods to BYTE()/WORD()/LONG() to conserve name space.

    • New AUTO keyword added to DEBUG SCOPE Display to auto-scale trace data.

    • New %"Text" added for expressing constants of up to four characters within a long, little-endian, zero-padded.

    • Implemented Spin2 keyword gating to inhibit namespace conflicts as new keywords are added in the future.

      • The comment {Spin2_v##} is sought before any Spin2 code, to enable new keywords.
      • {Spin2_v43}, for example, will enable the new LSTRING keyword (actually introduced in v42).
      • {Spin2_v41} is the default if no {Spin2_v##} comment was found.
      • As you enable newer keywords, you may need to change your symbol names to resolve conflicts.
      • This way, existing code is not automatically rendered uncompilable by Spin2 namespace growth.
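
    Putting those notes together, a minimal file using the gating comment might look like (a sketch, not from the release notes):

    {Spin2_v43}                 ' enables keywords up to v43; v41 is assumed without this comment

    CON tag = %"P2!"            ' up-to-four-character constant in a long, little-endian, zero-padded

    PUB main() | p
      p := LSTRING("Hello")     ' LSTRING (introduced in v42) is enabled by the comment above
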
  • Electrodude Posts: 1,651
    edited 2023-12-13 15:37

    @cgracey said:
    I posted a new v43 at the top of this thread.

    • Renamed BYTES()/WORDS()/LONGS() methods to BYTE()/WORD()/LONG() to conserve name space.

    Thanks

    • Implemented Spin2 keyword gating to inhibit namespace conflicts as new keywords are added in the future.
      • The comment {Spin2_v##} is sought before any Spin2 code, to enable new keywords.
      • {Spin2_v43}, for example, will enable the new LSTRING keyword (actually introduced in v42).
      • {Spin2_v41} is the default if no {Spin2_v##} comment was found.
      • As you enable newer keywords, you may need to change your symbol names to resolve conflicts.
      • This way, existing code is not automatically rendered uncompilable by Spin2 namespace growth.

    Can you make them start with a dollar sign inside the brackets, like {$Spin2_v41}, to follow Pascal-style compiler directives and to make it more obvious that they actually mean something to the compiler and aren't just some convention?

  • cgracey Posts: 14,134

    @Electrodude said:

    @cgracey said:
    I posted a new v43 at the top of this thread.

    • Renamed BYTES()/WORDS()/LONGS() methods to BYTE()/WORD()/LONG() to conserve name space.

    Thanks

    • Implemented Spin2 keyword gating to inhibit namespace conflicts as new keywords are added in the future.
      • The comment {Spin2_v##} is sought before any Spin2 code, to enable new keywords.
      • {Spin2_v43}, for example, will enable the new LSTRING keyword (actually introduced in v42).
      • {Spin2_v41} is the default if no {Spin2_v##} comment was found.
      • As you enable newer keywords, you may need to change your symbol names to resolve conflicts.
      • This way, existing code is not automatically rendered uncompilable by Spin2 namespace growth.

    Can you make them start with a dollar sign inside the brackets, like {$Spin2_v41}, to follow Pascal-style compiler directives and to make it more obvious that they actually mean something to the compiler and aren't just some convention?

    We'll talk about this on the live forum today.

  • cgracey Posts: 14,134

    We discussed this {Spin2_v43} vs {$Spin2_v43} and the consensus was to leave it as it is, which was kind of a relief to me. I think needing curly braces around it makes it stand out enough.

  • cgracey Posts: 14,134

    New PNut_v44.exe at the top of this thread.

    Added data structures:

    {Spin2_v44}
    
    CON  sPoint(byte x, byte y)
         sLine(sPoint a, sPoint b, byte color)
    
         LineCount = 100
    
    VAR  sLine Line[LineCount]              'Line is an array of sLine structures
    
    PUB go() | i
    
      debug(`plot myplot size 256 256 hsv8x update)
    
      repeat
        repeat LineCount with i             'set up random lines
          Line[i].a.x := getrnd() & $FF
          Line[i].a.y := getrnd() & $FF
          Line[i].b.x := getrnd() & $FF
          Line[i].b.y := getrnd() & $FF
          Line[i].color := getrnd() & $FF
    
        drawLines(Line, LineCount)          'draw them by passing Line base-structure address
    
    PRI drawLines(^sLine pLine, count) | i  'pLine is a structure pointer of type sLine
    
      debug(`myplot clear linesize 2)
    
      repeat count with i
        debug(`myplot color `(pLine[i].color))
        debug(`myplot set  `(pLine[i].a.x, pLine[i].a.y))
        debug(`myplot line `(pLine[i].b.x, pLine[i].b.y))
    
      debug(`myplot update)
    
  • B) That's great. That will enable using Spin2 in most cases where I was forced to use C until now.

    Is the documentation also updated already? Or can you post a short summary of the implementation details (result of this discussion)? AFAIK...

    • no index range checking
    • size of individual structures limited to $FFFF
    • (sub) indices limited to $FFFF
    • no limit on overall array size
      ... correct me if I'm wrong.
  • cgracey Posts: 14,134

    @ManAtWork said:
    B) That's great. That will enable using Spin2 in most cases where I was forced to use C until now.

    Is the documentation also updated already? Or can you post a short summary of the implementation details (result of this discussion)? AFAIK...

    • no index range checking
    • size of individual structures limited to $FFFF
    • (sub) indices limited to $FFFF
    • no limit on overall array size
      ... correct me if I'm wrong.

    That's all correct.

    The documentation is updated, but I haven't made a complete section on Data Structures, yet. So, search the doc for "structure" and you will see all the details. It's pretty simple, I think. It was hard to think about, though, at first.

    Remember these things...

    If you state the name of a structure or structure.substructure (not ending in an actual BYTE/WORD/LONG element), it becomes a read-only address of where that structure or substructure exists.

    Structure pointers are the same, but if no [index] or "." is expressed, they behave exactly as normal LONGs, so they can be assigned. And of course, by themselves they will just return their value, which is the structure pointer.

    So, to assign a structure pointer, you can just do:

    structPtr := structInstance

    Of course, you could have indexes and substructures as part of the assignment as well. Like just has to be assigned to like.

    I hope it helps. Let me know if you have any more questions and how it works out for you.
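
    Restating those rules as a sketch (reusing the sLine declaration from the v44 example; check the v44 docs for the exact syntax):

    VAR  sLine Line[100]

    PUB example() | ^sLine p
      p := Line              ' bare structure name yields its address; the pointer assigns like a LONG
      p[3].color := $1F      ' with [index] and ".", the pointer dereferences into the structure
      p[3].a.x := p[3].b.x   ' like assigned to like: both sides are BYTE elements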

  • Rayman Posts: 14,517

    That looks good. How can one have preloaded structures? Something in a dat section?

    Or is this runtime assignment only?

  • Wuerfel_21 Posts: 4,984
    edited 2024-03-13 11:20

    Uh, that seems like it's missing a notion of value semantics. As I was saying, if passing around a struct of 2 to 4 values isn't practically identical to passing them individually, that's really worthless and stupid. Also, the ^ symbol may be annoying to type for many people due to dead keys.

  • cgracey Posts: 14,134

    @Rayman said:
    That looks good. How can one have preloaded structures? Something in a dat section?

    Or is this runtime assignment only?

    Right now it is runtime assignment, but I will implement some way to do this in a DAT section with constants.

  • cgracey Posts: 14,134

    @Wuerfel_21 said:
    Uh, that seems like it's missing a notion of value semantics. As I was saying, if passing around a struct of 2 to 4 values isn't practically identical to passing them individually, that's really worthless and stupid. Also, the ^ symbol may be annoying to type for many people due to dead keys.

    My example was kind of simplistic, perhaps.

    What do you mean by value semantics?

    What would be better than ^ ?

  • Wuerfel_21 Posts: 4,984
    edited 2024-03-13 16:29

    What I mean is that structures need to be able to exist on their own as values, not just as pointers. That is, you'd be able to pass an sLine into the function directly, just as if it were two individual variables.

    The pointer adds cruft in many cases. Basically, ask any Java programmer (Java has only pointers) and they'll cry you a river :) Often you really just want the value without the baggage of identity.

    Maybe negligible with the bytecode interpreter, but with flexspin's ASM backend (which, you know, I've worked on), pass-by-value is also vastly faster. It's just moving stuff into argument registers, whereas when you pass a pointer, each field has to round-trip through WRLONG/RDLONG at some point, wasting many cycles.

    To demonstrate with two C programs (compiled with inlining disabled to show it better):

    struct vec3 {
        int x,y,z;
    };
    
    void PassByValue(struct vec3 v) {
        // Writing to OUTA just to use the value
        _OUTA = v.x;
        _OUTA = v.y;
        _OUTA = v.z;
    }
    
    void main() {
        struct vec3 a = {1,2,3};
    
        PassByValue(a);
    }
    

    compiles as:

    _PassByValue
        mov outa, arg01
        mov outa, arg02
        mov outa, arg03
    _PassByValue_ret
        ret
    
    _main
        mov COUNT_, #3
        call    #pushregs_
        mov local01, #1
        mov local02, #2
        mov local03, #3
        mov arg01, local01
        mov arg02, local02
        mov arg03, local03
        call    #_PassByValue
        mov ptra, fp
        call    #popregs_
    _main_ret
        ret
    

    struct vec3 {
        int x,y,z;
    };
    
    void PassByReference(struct vec3 *v) {
        // Writing to OUTA just to use the value
        _OUTA = v->x;
        _OUTA = v->y;
        _OUTA = v->z;
    }
    
    void main() {
        struct vec3 a = {1,2,3};
    
        PassByReference(&a);
    }
    

    turns into this:

    _PassByReference
        rdlong  outa, arg01
        add arg01, #4
        rdlong  outa, arg01
        add arg01, #4
        rdlong  outa, arg01
    _PassByReference_ret
        ret
    
    _main
        mov COUNT_, #0
        call    #pushregs_
        add ptra, #12
        wrlong  #1, fp
        add fp, #4
        wrlong  #2, fp
        add fp, #4
        wrlong  #3, fp
        sub fp, #8
        mov arg01, fp
        call    #_PassByReference
        mov ptra, fp
        call    #popregs_
    _main_ret
        ret
    

    It's actually not as bad as it could be for a simple example like this, but all those WRLONGs and RDLONGs are really worth avoiding; they're so slow (especially when they might conflict with FIFO fetching of instructions).

    (Of course a function that has to modify the caller's original structure is better off getting a pointer. It's about the right thing for the right situation)


    I already suggested @ to denote pointer types in the other thread, since that's already related to pointers. C uses the star *, so maybe that's more comfortable for most people.

    For reference, it's the backtick ` and circumflex ^ that are dead keys on default German layouts. You need to hit space afterwards to get the standalone character (if I type ^ then e, I get ê, which is the point of them). On Windows, hitting the dead key twice will inexplicably result in two of the character.

  • pik33 Posts: 2,366

    For reference, it's the backtick ` and circumflex ^ that are dead keys on default German layouts

    I discovered that ~ works the same strange way in the Polish layout. Type ~e and you get... ę. Why, what for? What an idiotic "feature": to write ę on a Polish keyboard you simply use RAlt+e. There is no need for such monstrosities in the Polish layout. All 18 (2x9) additional characters are written using right Alt plus the unmodified character.

    Hopefully the Polish layout didn't do stupid things with ^ and `; only ~ is affected.

  • Rayman Posts: 14,517

    Doesn't Spin2 have some limit on the number of longs you can pass by value?
    Is it 2 longs?

  • @cgracey said:
    What would be better than ^ ?

    Perhaps PRI drawLines(pLine: sLine, count) | i, i.e. var: type? This is what Python and Rust do. But would this interact badly with specifying types on return variables?
