@macca, so after some wandering through alternatives... I'm thinking I'll leave the doc generator as-is (no constants, no structs) unless we detect {Spin2_Doc_CON} as the first line of a CON section; then all constants and structs from that section are documented. This way, one can mark as many or as few sections as needed, no matter where they are in the source file.
How does this sound in terms of simplicity and opt-in behavior?
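For what it's worth, the opt-in check could stay very small: scan for CON section headers and look at the first non-blank line inside each one. A rough TypeScript sketch of that idea (the function names and regexes here are illustrative, not the extension's actual code):

```typescript
// Sketch: a CON section opts into doc generation when its first
// non-blank line is the {Spin2_Doc_CON} marker.
const DOC_MARKER = "{Spin2_Doc_CON}";

function sectionOptsIn(sectionLines: string[]): boolean {
  // first line inside the section that isn't blank
  const firstLine = sectionLines.find((line) => line.trim().length > 0);
  return firstLine !== undefined && firstLine.trim().startsWith(DOC_MARKER);
}

// Example: report which CON sections in a source file are documented,
// by zero-based section index.
function documentedConSections(source: string): number[] {
  const lines = source.split(/\r?\n/);
  const result: number[] = [];
  let sectionStart = -1; // line index of the current CON header, or -1
  let sectionIndex = -1; // counts CON sections seen so far
  const flush = (end: number) => {
    if (sectionStart >= 0 && sectionOptsIn(lines.slice(sectionStart + 1, end))) {
      result.push(sectionIndex);
    }
  };
  lines.forEach((line, i) => {
    if (/^CON\b/.test(line)) {
      flush(i); // close out the previous CON section, if any
      sectionStart = i;
      sectionIndex++;
    } else if (/^(PUB|PRI|DAT|OBJ|VAR)\b/.test(line)) {
      flush(i); // any other block header also ends a CON section
      sectionStart = -1;
    }
  });
  flush(lines.length);
  return result;
}
```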
I'm using Google's agent AI thingy called Antigravity with some success to develop Spin2 code. The desktop interface is essentially VSCode (but pointed at a different extension site). I wanted syntax highlighting and all the other features of this Spin2 VSCode extension... which took a bit of hunting: getting the .vsix directly from the VSCode extension server and manually loading it into Antigravity. If anyone else is headed this direction, here is the URL to get the .vsix directly: https://marketplace.visualstudio.com/_apis/public/gallery/publishers/IronSheepProductionsLLC/vsextensions/spin2/2.10.6/vspackage
@refaQtor said:
I'm using Google's agent AI thingy called Antigravity with some success to develop Spin2 code. the desktop interface is essentially VSCode (but pointed at different extension site). I wanted to have syntax highlighting and all the other features of this Spin2 VSCode extension ...
It's fun to hear that you got that working.
You might also want to look at the P2 Knowledge Base MCP. I think Antigravity supports MCPs, so this would give your agent P2 architecture understanding and a full Spin2/PASM2 language reference.
I did get that plugged in following the same directions that you have for Claude/Cursor... as a config entry under "mcpServers": {}. I generally use Claude Opus 4.6 (thinking) beneath Antigravity.
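For anyone else wiring this up, the entry goes inside the "mcpServers" object of Antigravity's MCP config file, shaped roughly like this (the server name, command, and package below are placeholders, not the P2 Knowledge Base MCP's actual values; use whatever its own install directions give you):

```json
{
  "mcpServers": {
    "p2-knowledge-base": {
      "command": "npx",
      "args": ["-y", "<p2-kb-mcp-package>"]
    }
  }
}
```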
and I've been using it to build a test harness with pnut-term-ts. I have too many tests to fit into one executable (I tried), so now I have a separate binary for each piece: the 3-tier (HUB/PSRAM/uSD card, thanks to you) string-interning B-tree database in paged memory, the AST parser, the Eval and Compile stages, the XBYTE tail-call-optimized Lisp engine (on, so far, only one COG: ~430 SKIPF-packed PASM instructions), the !UNHOLY! 4-color garbage collector that binds it all together concurrently from a TASK on a separate COG, the Lisp REPL, 9P host filesystem access, the Octoserialports (again, thank you), and the Prolog/Datalog inference engine... ALL of that as my foundation to, finally, support smooth logical configuration of Smart Pins! This is my fourth go at it all, in as many months. I have implemented all the systems functionally, though feebly, and learned a lot. GO TIME, THIS TIME! Working on proving the GC now.
and that takes lots of builds of separate binaries... to speed that up, I kick off parallel builds of ALL the tests (I still have a few Ryzen cores left over) after copying the source folder once per build to work in... HUGE TIMESAVER, though I'm still doing the P2 tests serially. Maybe I'll get around to parallelizing that, too. I've got a half-dozen or so P2 EDGE 32MB modules on my bench here.
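The copy-then-fan-out trick generalizes nicely: give each job its own work dir, then run the builds concurrently with a cap at your core count. A rough Node/TypeScript sketch of the scheduling part (the commented-out copy and build steps are placeholders for whatever your real pipeline does, not pnut-term-ts's actual CLI):

```typescript
// Sketch: run independent build jobs in parallel with a concurrency cap.
// Each worker pulls the next unstarted job until none remain, so at most
// `limit` builds run at once and results land at their original indices.
async function runWithLimit<T>(
  jobs: Array<() => Promise<T>>,
  limit: number
): Promise<T[]> {
  const results: T[] = new Array(jobs.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < jobs.length) {
      const i = next++; // single-threaded JS: this claim is race-free
      results[i] = await jobs[i]();
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, jobs.length) },
    () => worker()
  );
  await Promise.all(workers);
  return results;
}

// Example wiring (copy/build calls are stand-ins for the real steps):
// const jobs = testNames.map((name) => async () => {
//   fs.cpSync("src", `work/${name}`, { recursive: true }); // per-job copy
//   return runBuild(`work/${name}`);                       // per-job build
// });
// await runWithLimit(jobs, os.cpus().length);
```

The per-index result array means build outputs stay matched to their test names even though completion order varies.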
so, yeah, now I'll head over and bump up your Patreon support! You've made some great tools and foundational libraries that we can count on.