Thinking more about object pointers...
They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.
Maybe it could all work like this:
OBJ objtype = "objfile"
VAR ^objtype t
PUB ......
t := @realobject
t.method()
x := t.consymbol * 3
@cgracey said:
Thinking more about object pointers...
They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.
Maybe it could all work like this:
OBJ objtype = "objfile"
VAR ^objtype t
PUB ......
t := @realobject
t.method()
x := t.consymbol * 3
Interesting. Where is this "realobject" defined? In objfile somewhere? I like the idea of being able to dynamically select the executed code/data based on a dynamic object pointer's value rather than a static association - it could be useful for my memory driver abstraction and it is heading a bit closer to actual OOP. Although ideally we want the object pointer to be assignable to one of several instances of the same base object type (class) with different implementations for each subclass (if you want some inheritance+polymorphism).
@cgracey said:
Thinking more about object pointers...
They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.
Maybe it could all work like this:
OBJ objtype = "objfile"
VAR ^objtype t
PUB ......
t := @realobject
t.method()
x := t.consymbol * 3
This is exactly what I would want.
By the way, I have begun rewriting my programs using structs and find that I cannot access structure definitions in a child object from the parent object.
We can access constant definitions with obj.CONST_NAME,
but not structure definitions with obj.STRUCT_NAME.
Maybe it would be better to have a #include in the preprocessor,
so we could write all structure definitions and global constants in one file and #include it in every object that needs those definitions.
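For illustration, this is roughly the situation being described; the file and symbol names are made up, and the STRUCT definition assumes the CON-block syntax from recent Spin2 releases. Constant access across objects already works; declaring a variable of the child's structure type is the part that does not:
' shared.spin2 (child object)
CON
  MAX_ITEMS = 16
  STRUCT item(id, value)

' parent.spin2
OBJ
  shared : "shared"

PUB demo() | n
  n := shared.MAX_ITEMS           ' constant access across objects works today
  ' but there is no way (yet) to declare a variable of type shared.item here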
@cgracey said:
Thinking more about object pointers...
They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.
Maybe it could all work like this:
OBJ objtype = "objfile"
VAR ^objtype t
PUB ......
t := @realobject
t.method()
x := t.consymbol * 3
Interesting. Where is this "realobject" defined? In objfile somewhere? I like the idea of being able to dynamically select the executed code/data based on a dynamic object pointer's value rather than a static association - it could be useful for my memory driver abstraction and it is heading a bit closer to actual OOP. Although ideally we want the object pointer to be assignable to one of several instances of the same base object type (class) with different implementations for each subclass (if you want some inheritance+polymorphism).
The "realobject" could come from anywhere. It would just need to exist in memory and be the same type of object.
@cgracey said:
Thinking more about object pointers...
They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.
Maybe it could all work like this:
OBJ objtype = "objfile"
VAR ^objtype t
PUB ......
t := @realobject
t.method()
x := t.consymbol * 3
This is exactly what I would want.
By the way, I have begun rewriting my programs using structs and find that I cannot access structure definitions in a child object from the parent object.
We can access constant definitions with obj.CONST_NAME,
but not structure definitions with obj.STRUCT_NAME.
Maybe it would be better to have a #include in the preprocessor,
so we could write all structure definitions and global constants in one file and #include it in every object that needs those definitions.
I need to make child-object STRUCT definitions readable by the parent. I will look into this today.
#INCLUDE would be good, but it would need to be implemented in a way where it wouldn't interfere with source file offsets for error reporting.
Most preprocessors deal with this by having a #line directive which can reset the file name and line number for error reporting, and then inserting appropriate #line directives during and after #include. See e.g. https://learn.microsoft.com/en-us/cpp/preprocessor/hash-line-directive-c-cpp?view=msvc-170
Perhaps Chip's concern is the line number reporting during compilation, not so much execution, and how potentially nested include files would upset that. The #ifdef handling could upset it too, depending on how/when that is done, although I believe those lines are just replaced with blanks IIRC, so the original line numbers should still be countable.
I imagine some type of stack structure that tracks the actual line number per currently included file might be useful, with each entry recording the original source file in the nested group and the line number of its #include line. Line numbering restarts at 1 for each new file the compiler includes and is restored to the original line number of the "parent" source file when each include file ends, then continues incrementing. Of course there may still be problems I don't understand, as I might be missing how/when the compiler uses the line numbers if multiple passes are done and some of this information is not present in later passes. For that you might need to map a global line number back to an original file/line number, for example.
This is a common and solvable problem for compiler error reporting. For example GCC reports errors found in include files using a hierarchy where each included file and related line number is known and displayed like this:
In file included from /usr/include/stdio.h:28:0,
from ../.././gcc-4.7.0/libgcc/../gcc/tsystem.h:88,
from ../.././gcc-4.7.0/libgcc/libgcc2.c:29:
/usr/include/features.h:324:26: fatal error: bits/predefs.h: No such file or directory
compilation terminated.
Perhaps Chip's concern is the line number reporting during compilation, not so much execution, and how potentially nested include files would upset that.
That's exactly what I was talking about. Typically a preprocessor is implemented as a first pass, before any other part of the compiler, that replaces macros with their definitions, removes code that fails #ifdef tests, and substitutes the contents of a #include file for the #include line. So code that starts off looking like:
CON
#ifdef NOT_DEFINED
#include "a.defs"
#else
#include "b.defs"
#endif
DAT
byte "this is the original spin2"
...
will after preprocessing look something like:
CON
#line 1 "b.defs"
'' contents of file b.defs
MyVal = "B"
#line 6 "orig.spin2"
DAT
byte "this is the original spin2"
The preprocessor deletes lines in #ifdef as appropriate, replacing them with blank lines to keep the line count straight. But around #include it inserts #line directives so the subsequent compiler passes can figure out what lines to give for errors. You can see this kind of thing in action if you run a stand-alone C preprocessor (like gcc -E). #line are the only directives that the rest of the compiler has to deal with, and they're pretty simple.
Some modern compilers fold the preprocessing into other passes, so they no longer have a separate preprocessor. That's OK too, but it means you need to keep track of the line numbers/file names in another way.
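The nested case raised earlier works the same way; each file boundary just gets its own #line. Assuming, for example, that b.defs itself contained a #include "c.defs" on its second line, the preprocessed output would look roughly like:
CON
#line 1 "b.defs"
'' contents of file b.defs
#line 1 "c.defs"
'' contents of file c.defs
CVal = "C"
#line 3 "b.defs"
MyVal = "B"
#line 6 "orig.spin2"
DAT
byte "this is the original spin2"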
Ok, gotcha @ersmith. I see how the #line directive is used in the final output now and how it tracks the source input file positions. The Microsoft example posted had only shown it used for output at run time, which is common in C too, but this embedded source information can also be used to track files and line numbers during compilation. Hopefully Chip can follow something like that.
@cgracey said:
Thinking more about object pointers...
They would be like method pointers, minus the 12-bit method index. They would just need the VAR base and code base of an object. Also, maybe an association to an object should be part of the object pointer variable declaration, in order to reduce syntactic clutter during usage.
Maybe it could all work like this:
OBJ objtype = "objfile"
VAR ^objtype t
PUB ......
t := @realobject
t.method()
x := t.consymbol * 3
This is exactly what I would want.
By the way, I have begun rewriting my programs using structs and find that I cannot access structure definitions in a child object from the parent object.
We can access constant definitions with obj.CONST_NAME,
but not structure definitions with obj.STRUCT_NAME.
Maybe it would be better to have a #include in the preprocessor,
so we could write all structure definitions and global constants in one file and #include it in every object that needs those definitions.
I've got objects outputting their STRUCTs now, and parent objects are fully receiving them into their context, but I still need to implement the syntax handling in the compiler, so you'll be able to do 'object.struct'. That part should be much easier to implement. I think I've got the hard parts all done.
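Once that lands, usage would presumably look something like this (hypothetical names, following the 'object.struct' form described above):
OBJ
  proto : "protocol"              ' child defines:  STRUCT packet(len, crc, flags)

VAR
  proto.packet rx                 ' parent declares a variable using the child's structure type

PUB handle()
  if rx.len > 0
    rx.crc := 0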
Comments
Does that work for methods with parameters?
Yes, I just gave the simplest example.
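Under the proposed syntax, a call with parameters would presumably read the same as any other child-object call, just made through the pointer (the method names here are made up):
t := @realobject
t.config(115_200, 8)              ' parameters pass as usual
x := t.read()                     ' and a return value comes back as usual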
Ideally make sure there's compatibility with flexspin's version of the feature (that also uses the OBJ equals syntax)
Okay, found the detail in the docs: when using the method pointer you have to explicitly declare the number of return values if !0.