
PNut/Spin2 Latest Version (v50 - PLOT bitmaps, DITTO code, ORGH inline, @\"string\n", IF_??? DEBUG)


Comments

  • cgracey Posts: 14,274
    edited 2025-01-20 00:13

    Thinking more about object pointers...

    They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.

    Maybe it could all work like this:

    OBJ objtype = "objfile"
    
    VAR ^objtype t
    
    PUB ......
      t := @realobject
      t.method()
      x := t.consymbol * 3
    
  • JonnyMac Posts: 9,225

    Does that work for methods with parameters?

  • cgracey Posts: 14,274

    @JonnyMac said:
    Does that work for methods with parameters?

    Yes, I just gave the simplest example.

  • Ideally make sure there's compatibility with flexspin's version of the feature (that also uses the OBJ equals syntax)

  • rogloh Posts: 5,882
    edited 2025-01-20 03:12

    @cgracey said:
    Thinking more about object pointers...

    They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.

    Maybe it could all work like this:

    OBJ objtype = "objfile"
    
    VAR ^objtype t
    
    PUB ......
      t := @realobject
      t.method()
      x := t.consymbol * 3
    

    Interesting. Where is this "realobject" defined? In objfile somewhere? I like the idea of being able to dynamically select the executed code/data based on a dynamic object pointer's value rather than a static association - it could be useful for my memory driver abstraction and it is heading a bit closer to actual OOP. Although ideally we want the object pointer to be assignable to one of several instances of the same base object type (class) with different implementations for each subclass (if you want some inheritance+polymorphism).

  • JonnyMac Posts: 9,225
    edited 2025-01-20 05:15

    @cgracey said:
    Yes, I just gave the simplest example.

    Okay, found the detail in the docs: when calling through a method pointer you have to explicitly declare the number of return values if it's not zero.
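
    If I'm reading that part of the docs right, a call through a method pointer ends up looking something like this (method and variable names here are just placeholders):

    VAR long fptr
    
    PRI double(n) : r
      r := n * 2
    
    PUB demo() | x
      fptr := @double       ' take a method pointer
      x := fptr(21) : 1     ' call through the pointer, declaring one return value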

  • wummi

    @cgracey said:
    Thinking more about object pointers...

    They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.

    Maybe it could all work like this:

    OBJ objtype = "objfile"
    
    VAR ^objtype t
    
    PUB ......
      t := @realobject
      t.method()
      x := t.consymbol * 3
    

    This is exactly what I would want.

    By the way, I have begun rewriting my programs using structs and find that I cannot access structure definitions in a child obj from the parent obj.
    We can access constant definitions with obj.CONST_NAME,
    but not structure definitions with obj.STRUCT_NAME.

    Maybe it would be better to have a #include in the preprocessor,
    so we could write all structure definitions and global constants in one file and #include it in every object that needs those definitions.

  • cgracey Posts: 14,274

    @rogloh said:

    @cgracey said:
    Thinking more about object pointers...

    They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.

    Maybe it could all work like this:

    OBJ objtype = "objfile"
    
    VAR ^objtype t
    
    PUB ......
      t := @realobject
      t.method()
      x := t.consymbol * 3
    

    Interesting. Where is this "realobject" defined? In objfile somewhere? I like the idea of being able to dynamically select the executed code/data based on a dynamic object pointer's value rather than a static association - it could be useful for my memory driver abstraction and it is heading a bit closer to actual OOP. Although ideally we want the object pointer to be assignable to one of several instances of the same base object type (class) with different implementations for each subclass (if you want some inheritance+polymorphism).

    The "realobject" could come from anywhere. It would just need to exist in memory and be the same type of object.

  • cgracey Posts: 14,274
    edited 2025-01-20 16:24

    @wummi said:

    @cgracey said:
    Thinking more about object pointers...

    They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.

    Maybe it could all work like this:

    OBJ objtype = "objfile"
    
    VAR ^objtype t
    
    PUB ......
      t := @realobject
      t.method()
      x := t.consymbol * 3
    

    This is exactly what I would want.

    By the way, I have begun rewriting my programs using structs and find that I cannot access structure definitions in a child obj from the parent obj.
    We can access constant definitions with obj.CONST_NAME,
    but not structure definitions with obj.STRUCT_NAME.

    Maybe it would be better to have a #include in the preprocessor,
    so we could write all structure definitions and global constants in one file and #include it in every object that needs those definitions.

    I need to make child-object STRUCT definitions readable by the parent. I will look into this today.

    #INCLUDE would be good, but it would need to be implemented in a way where it wouldn't interfere with source file offsets for error reporting.

  • ersmith Posts: 6,124
    edited 2025-01-20 16:40

    @cgracey said:

    #INCLUDE would be good, but it would need to be implemented in a way where it wouldn't interfere with source file offsets for error reporting.

    Most preprocessors deal with this by having a #line directive which can reset the file name and line number for error reporting, and then inserting appropriate #line during and after #include. See e.g. https://learn.microsoft.com/en-us/cpp/preprocessor/hash-line-directive-c-cpp?view=msvc-170

  • rogloh Posts: 5,882

    @ersmith said:

    @cgracey said:

    #INCLUDE would be good, but it would need to be implemented in a way where it wouldn't interfere with source file offsets for error reporting.

    Most preprocessors deal with this by having a #line directive which can reset the file name and line number for error reporting, and then inserting appropriate #line during and after #include. See e.g. https://learn.microsoft.com/en-us/cpp/preprocessor/hash-line-directive-c-cpp?view=msvc-170

    Perhaps Chip's concern is the line number reporting during compilation, not so much execution, and how potentially nested include files would upset that. Also, the #ifdef'd code could upset it too, depending on how/when that is handled, although I believe those lines are just replaced with blanks IIRC, so the original line numbers should still be countable.

    I imagine some type of stack structure that tracks the actual line number for each currently included file might be useful, if it could refer back to the original source file in the nested group and the line number of the #include line. The line numbers restart at 1 for each new file the compiler includes and are restored to the original line number of the "parent" source file when each include file ends, which then continues incrementing. Of course there may still be problems I don't understand, as I might be missing how/when the compiler uses the line numbers if multiple passes are done and some of this information isn't present in later passes. For that you may need to map a global line number to an original file/line number, for example.

    This is a common and solvable problem for compiler error reporting. For example GCC reports errors found in include files using a hierarchy where each included file and related line number is known and displayed like this:

    In file included from /usr/include/stdio.h:28:0, 
    from ../.././gcc-4.7.0/libgcc/../gcc/tsystem.h:88,
    from ../.././gcc-4.7.0/libgcc/libgcc2.c:29:
    /usr/include/features.h:324:26: fatal error: bits/predefs.h: No such file or directory
    compilation terminated.
    
  • ersmith Posts: 6,124

    @rogloh said:

    @ersmith said:

    @cgracey said:

    #INCLUDE would be good, but it would need to be implemented in a way where it wouldn't interfere with source file offsets for error reporting.

    Most preprocessors deal with this by having a #line directive which can reset the file name and line number for error reporting, and then inserting appropriate #line during and after #include. See e.g. https://learn.microsoft.com/en-us/cpp/preprocessor/hash-line-directive-c-cpp?view=msvc-170

    Perhaps Chip's concern is the line number reporting during compilation, not so much execution, and how potentially nested include files would upset that.

    That's exactly what I was talking about. Typically a preprocessor is implemented as a first pass, before any other part of the compiler, that replaces macros with their definitions, removes code that fails #ifdef tests, and substitutes the contents of a #include file for the #include line. So code that starts off looking like:

    CON
    #ifdef NOT_DEFINED
    #include "a.defs"
    #else
    #include "b.defs"
    #endif
    DAT
       byte "this is the original spin2"
    ...
    

    will after preprocessing look something like:

    CON
    
    
    
    #line 1 "b.defs"
    '' contents of file b.defs
    MyVal = "B"
    
    #line 6 "orig.spin2"
    DAT
      byte "this is the original spin2"
    

    The preprocessor deletes lines in #ifdef as appropriate, replacing them with blank lines to keep the line count straight. But around #include it inserts #line directives so the subsequent compiler passes can figure out what lines to give for errors. You can see this kind of thing in action if you run a stand-alone C preprocessor (like gcc -E). #line directives are the only ones the rest of the compiler has to deal with, and they're pretty simple.

    Some modern compilers fold the preprocessing into other passes, so they no longer have a separate preprocessor. That's OK too, but it means you need to keep track of the line numbers/file names in another way.

  • rogloh Posts: 5,882

    Ok, gotcha @ersmith. I see now how the #line directive is used in the preprocessor output and how it tracks positions in the original source files. The Microsoft example posted only showed it used for output at run time, which is also common in C, but this embedded source information can be used to track files and line numbers during compilation as well. Hopefully Chip can follow something like that.

  • cgracey Posts: 14,274
    edited 2025-01-21 11:40

    @wummi said:

    @cgracey said:
    Thinking more about object pointers...

    They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.

    Maybe it could all work like this:

    OBJ objtype = "objfile"
    
    VAR ^objtype t
    
    PUB ......
      t := @realobject
      t.method()
      x := t.consymbol * 3
    

    This is exactly what I would want.

    By the way, I have begun rewriting my programs using structs and find that I cannot access structure definitions in a child obj from the parent obj.
    We can access constant definitions with obj.CONST_NAME,
    but not structure definitions with obj.STRUCT_NAME.

    Maybe it would be better to have a #include in the preprocessor,
    so we could write all structure definitions and global constants in one file and #include it in every object that needs those definitions.

    I've got objects outputting their STRUCTs now, and parent objects are fully receiving them into their context, but I still need to implement the syntax handling in the compiler, so you'll be able to do 'object.struct'. That part should be much easier to implement. I think I've got the hard parts all done.

  • cgracey Posts: 14,274
    edited 2025-02-02 13:08

    I posted a new PNut_v49 which exports CON STRUCT's, just like CON integers and floats have always been.

    https://obex.parallax.com/obex/pnut-spin2-latest-version/

    CON STRUCT StructX(Object.StructA x[10]) 'StructX is ten StructA's, gets exported
    CON STRUCT StructY = Object.StructA      'StructY is a copy of StructA, gets exported
    VAR Object.StructA StructJ               'StructJ is an instance of StructA
    VAR ^Object.StructA StructK              'StructK is a pointer to StructA
    PUB Name(^Object.StructA StructL)        'StructL is a pointer to StructA
    DAT StructM Object.StructA               'StructM is an instance of StructA
    

    This took a long time to work out, because I started out keeping STRUCT definitions all isolated, but when exporting to the parent, things got hairy due to all the interdependencies. Now, when STRUCTs are defined that involve other STRUCTs, the other STRUCT definition is added in, instead of being referenced. This simplified many things. It also enabled STRUCTs to be exported upwards using "=" syntax (see 2nd line above).

    I found and fixed a bug in the SmoothLine routine that the DEBUG displays use. I had optimized the routine in v44, but didn't realize that I needed to make two variables into 64-bit integers to avoid overflow. The bug caused lines whose slope was near 1 to be drawn in the wrong direction with a vertical or horizontal segment added on.

  • evanh Posts: 16,235
    edited 2025-02-02 13:28

    [haven't been reading]

  • avsa242

    @cgracey
    Just to clarify, Object is the symbol assigned to a child object? i.e., you're getting structure definitions from an external file? And by 'gets exported' I'm assuming you mean it's available for use in creating an instance, like the VAR, PUB and DAT examples?

  • cgracey Posts: 14,274
    edited 2025-02-02 14:56

    @avsa242 said:
    @cgracey
    Just to clarify, Object is the symbol assigned to a child object? i.e., you're getting structure definitions from an external file? And by 'gets exported' I'm assuming you mean it's available for use in creating an instance, like the VAR, PUB and DAT examples?

    We can use the STRUCTure definitions from a child object by using the syntax: object.structname.

    Any STRUCT that we declare, in turn, is available to any parent object in the future. We can pass structures from the lowest object all the way to the highest by doing this in every object in the chain: CON STRUCT structname = object.structname.
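
    A rough sketch of such a chain (file names and members here are made up for illustration):

    ' lowest.spin2 - defines the structure at the bottom of the chain
    CON STRUCT point(LONG x, LONG y)
    
    ' middle.spin2 - re-exports the child's structure to its own parent
    OBJ low : "lowest"
    CON STRUCT point = low.point
    
    ' top.spin2 - uses the structure that originated two levels down
    OBJ mid : "middle"
    VAR mid.point p
    
    PUB setp()
      p.x := 1
      p.y := 2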

  • cgracey Posts: 14,274
    edited 2025-02-16 09:38

    I posted a new PNut_v50 at the top of this thread.

    PNut_v50 has several new features and fixes a bug introduced in PNut_v49 that caused structure sizes to be wrong. Here's what's new in v50:

    • New DEBUG PLOT commands allowing up to 8 bitmap layers that you can selectively copy into the PLOT window. This is useful for doing photo-realistic displays, where pre-drawn images are copied into the plot window to show, say, a toggle switch being in an ON or OFF position.

    • New DITTO directive for DAT blocks and inline PASM sections, which can iteratively generate code.

    • ORGH is now available for inline PASM code in PUB/PRI methods. ORGH has the same usage syntax as ORG, but executes PASM code in-place from hub memory, without loading it into register space.

    • New @\"string" method is like @"string", but allows escape characters like \n (new line, 10) and \xFF ($FF).

    • Predefined registers, like PR0, DIRA, OUTA, and INA, can now be used in CON block expressions.

    • PASM DEBUG instructions can now be expressed with a conditional prefix, like 'IF_C DEBUG'. This is accomplished by placing an opposite-condition 'SKIP #1' before the DEBUG (BRK) instruction.
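
    For example, something like this should exercise the escaped-string and conditional-DEBUG features (a rough, untested sketch):

    PUB demo() | p
      p := @\"Hello\nWorld\x21"        ' escaped string: embedded newline (10) and "!" ($21)
      debug(zstr(p))
    
      org
            test  p, #1          wc    ' C := LSB of the address, just to set a flag for the demo
      if_c  debug("odd address")       ' conditional PASM DEBUG (an opposite-condition SKIP #1 precedes the BRK)
      end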

  • Wuerfel_21

    @cgracey said:

    • PASM DEBUG instructions can now be expressed with a conditional prefix, like 'IF_C DEBUG'. This is accomplished by placing an opposite-condition 'SKIP #1' before the DEBUG (BRK) instruction.

    But doesn't that mess up the SKIP state? i.e. if skipping is paused during a subroutine call and the callee ends up doing this

  • cgracey Posts: 14,274

    @Wuerfel_21 said:

    @cgracey said:

    • PASM DEBUG instructions can now be expressed with a conditional prefix, like 'IF_C DEBUG'. This is accomplished by placing an opposite-condition 'SKIP #1' before the DEBUG (BRK) instruction.

    But doesn't that mess up the SKIP state? i.e. if skipping is paused during a subroutine call and the callee ends up doing this

    Yes. I will need to make a warning about that. Can you think of a better way to do it?

  • TonyB_ Posts: 2,204
    edited 2025-02-16 10:35

    @cgracey said:

    @Wuerfel_21 said:

    @cgracey said:

    • PASM DEBUG instructions can now be expressed with a conditional prefix, like 'IF_C DEBUG'. This is accomplished by placing an opposite-condition 'SKIP #1' before the DEBUG (BRK) instruction.

    But doesn't that mess up the SKIP state? i.e. if skipping is paused during a subroutine call and the callee ends up doing this

    Yes. I will need to make a warning about that. Can you think of a better way to do it?

    JMPREL #1?

  • evanh Posts: 16,235

    Huh? What's the history to BRK not obeying EEEE encoded bits?

  • rogloh Posts: 5,882

    @evanh said:
    Huh? What's the history to BRK not obeying EEEE encoded bits?

    I wanted to ask the same question. What happened to the EEEE bits for these instructions? Are they being used in some other way or cannot be used for some specific reason?

    The P2 instruction spreadsheet does mention this for BRK:

    "If in debug ISR, set next break condition to D. Else, set BRK code to D[7:0] and unconditionally trigger BRK interrupt, if enabled."

  • cgracey Posts: 14,274

    BRK gets detected very early in the pipeline. It is this way because things have to start happening early in order for everything to get done on time. BRK is the only instruction with this constraint.

  • evanh Posts: 16,235

    I presume then, that there is a distinct pipeline difference between a programmed branch and an IRQ?

  • cgracey Posts: 14,274

    @evanh said:
    I presume then, that there is a distinct pipeline difference between a programmed branch and an IRQ?

    On an IRQ, a JMP is fed into the pipeline.

  • rogloh Posts: 5,882
    edited 2025-02-16 23:20

    @cgracey said:
    BRK gets detected very early in the pipeline. It is this way because things have to start happening early in order for everything to get done on time. BRK is the only instruction with this constraint.

    Thanks Chip. Makes more sense now as to why it ended up being different to all the other instructions.

    For skipping the DEBUG in the conditional case, to avoid breaking SKIPF code being debugged, could you patch the next instruction into a NOP by using a prior conditional ALTI that changes the instruction field to zeroes? This might require one additional long somewhere to hold some upper zero bits, or alternatively you could patch in some other innocuous instruction, or something with EEEE bits opposite to the condition you want. Are any of the reserved registers suitable for use with ALTI here, ones that would already return zeroes when referenced as a D field, like INA & INB perhaps, or would they still return valid input pin data?

    e.g.

    if_c DEBUG("debug message") 
    

    becomes

    if_c ALTI zero,#%101_000_000 ' modify next instruction field
         BRK #code 
    

    Another way I thought of is to still break, but use a special reserved break-code value (all zeroes or all ones?) set via a preceding conditional ALTS, which gets detected and returned from immediately without incurring as much overhead as all the other "normal" codes. The former approach is preferable, however, for higher performance. JMPREL is good but it is slower for HUBEXEC and will also interfere with a REP loop.

  • evanh Posts: 16,235
    edited 2025-02-16 23:29

    @cgracey said:

    @evanh said:
    I presume then, that there is a distinct pipeline difference between a programmed branch and an IRQ?

    On an IRQ, a JMP is fed into the pipeline.

    That doesn't sound any different to a regular conditional branch.

  • TonyB_ Posts: 2,204

    @rogloh said:
    JMPREL is good but it is slower for HUBEXEC and will also interfere with a REP loop.

    REP blocks INT1/2/3. If it also blocks debug INT0 there's no point having BRK inside a REP loop.
