OK, it took me a bit longer than I had hoped to get closures working, but they're in fastspin BASIC now (GitHub source code, that is; I haven't made a new binary release):
dim printer as sub(t as integer)
dim stepper as sub()

sub constructFuncs(n as integer, msg as string)
  var count = 1
  printer = sub(t as integer)
    print msg; count
  end sub
  stepper = sub()
    count = count + n
  end sub
end sub

pausems 1000 ' wait for terminal

constructFuncs(1, "step1: ")
for i = 1 to 5
  printer(i)
  stepper
next

constructFuncs(2, "try2: ")
for i = 1 to 5
  printer(i)
  stepper
next

print "done"
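If it helps to see the same semantics outside BASIC, here is a rough Python translation of the demo above (the names construct_funcs and lines are mine, not fastspin's; fastspin assigns to the global printer/stepper subs rather than returning a tuple):

```python
lines = []

def construct_funcs(n, msg):
    count = 1                 # local captured by both closures

    def printer(t):           # t is accepted but unused, as in the BASIC
        lines.append(f"{msg}{count}")

    def stepper():
        nonlocal count        # both closures share the same 'count'
        count += n

    return printer, stepper

printer, stepper = construct_funcs(1, "step1: ")
for i in range(1, 6):
    printer(i)
    stepper()

printer, stepper = construct_funcs(2, "try2: ")
for i in range(1, 6):
    printer(i)
    stepper()

print("\n".join(lines))
```

The second run counts by two because stepper closes over both count and the parameter n.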
Closures aren't quite complete yet -- they don't close over the object they're contained in, but that's mostly a matter of adding a pointer to the closure and hooking up some symbol tables. They do grab all the parameters and locals, at least.
Now that the architecture is there it could be possible to add this to fastspin's Spin language as well. I'm not sure how Chip would feel about adding first class functions to Spin2 though.
This brings up an interesting question though. How should closures in fastspin work? I guess you could argue that fastspin isn't really a dynamic language so maybe it should work more like C++ than JavaScript or any of the dynamic languages? Maybe what you already have is okay?
It would certainly be feasible to change it so that each subfunction gets a static snapshot of the locals. This way is actually a bit more efficient though, in that all the subfunctions of a given function share the same closure object. And I think you could emulate the C++ way with this, but going the other way is harder.
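The point about emulation can be made concrete: under the shared-closure model, a value snapshot is just a copy into a fresh local at creation time. A minimal Python sketch (names are illustrative, not fastspin's):

```python
def make_readers():
    x = 1

    def read_shared():
        return x              # shared capture: sees later updates to x

    x_copy = x                # snapshot emulated on top of shared capture
    def read_snapshot():
        return x_copy         # sees only the value copied at creation time

    x = 100                   # later change, visible only to read_shared
    return read_shared, read_snapshot

read_shared, read_snapshot = make_readers()
print(read_shared(), read_snapshot())   # 100 1
```

Going the other direction (building sharing out of per-closure copies) would require boxing the variable by hand, which is why sharing is the more general primitive.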
Well, if doing it right is actually easier I guess there is no issue!
I was thinking this is a world first, closures in BASIC. But it seems Visual BASIC got lambdas and closures in v9 in 2008. Totally different syntax, but hey it's BASIC right?
Hi AC. The early history of microcomputer languages is really the history of BASIC. As Heater points out there were others, but it was BASIC which became popular via the Apple and Tandy Radio-Shack, then Commodore product lines. Until well past the mid 1980's it was considered standard for a low-end computer to boot to BASIC's READY prompt. It was self-sufficient even on very limited hardware, easy to learn, and capable of doing practical things. Even the original IBM PC came with a built-in BASIC interpreter.
Microsoft shipped BASIC with all versions of MS-DOS and Windows (which was originally a shell running atop MS-DOS) until, IIRC, Windows 95. The command-line simple BASIC became QBASIC, a stripped-down version of QuickBASIC with a full-screen editor and integrated development environment; QuickBASIC itself was both an interpreter and a compiler capable of producing standalone executable program files (aka "professional looking software"). With Windows 9x Microsoft jumped to Visual Basic, which they did not feel like giving away, and which became the most popular programming environment in history.
Meanwhile, BASIC was getting serious competition from C, which was about the same age but had always been compiled, often not self-sufficient when applied to a microcomputer, and very fast and efficient compared to BASIC. Again there were other offerings but with small computers it really came down to C, usually supported by a compiler running on a "real computer," and BASIC which might be self-sufficient.
I don't remember line-number BASIC being "capable of doing practical things." I remember that it was purely a teaching language, and that even high-school students knew that was no good, and knew why it was no good (no functions with local variables, and slow as molasses).
This included GW-BASIC that was shipped with early MS-DOS. There were some commercial BASICs that had functions and local variables, and were reasonably fast --- I never used any except for QuickBasic (I preferred QuickPascal though).
Through the 1980s, the only two languages that were used were 6502 assembly-language and Z80 assembly-language --- Turbo Pascal got some use under CP/M and MS-DOS --- C started to come into use in the early 1990s.
I don't remember C being "very fast and efficient" --- compared to line-number BASIC though, anything would be fast and efficient (including pencil-and-paper, in many cases).
When I was 18 years old (1984) I wrote a program on the C64 in SuperForth to do turtle-graphics in four dimensions (W, X, Y, Z). The program displayed a 3D image of the wire-frame object drawn looking down at the turtle. This was a 2D representation of the 3D object (I had the lines going back to a vanishing point on the "horizon"). The C64 was too slow! Drawing a hypercube took several seconds. When I made my turtle orbit the hypercube I wanted to see the object spinning on the screen, but this was ponderously slow. Also, I realized too late that SuperForth doesn't allow the user to distribute programs that include the outer-interpreter (which was, of course, the basis for the turtle programming). It was a cool program though --- I may yet rewrite it for the modern desktop-computers that are much faster.
Some years later I wrote a 65c02 Forth cross-compiler hosted on MS-DOS (written in UR/Forth from LMI). This was based on ISYS Forth for the Apple-IIc, which generated fast code but had severe memory problems because it ran on the Apple-IIc itself rather than being a cross-compiler running on a separate machine. My cross-compiler included a source-level debugger that would single-step through a program, displaying where it was in the source-code as well as the target machine's state, such as what was on the data-stack, return-stack, etc. I used my cross-compiler to write a program for the Apple-IIc that did symbolic math. I could do derivatives and then simplify the result. My goal was to do integrals, but I had pushed the Apple-IIc's capability about as far as it could go with the derivatives, so I didn't continue with integrals, which are much more difficult. I also displayed equations on the screen in a nice format with Greek letters and other math symbols, and I graphed functions.
Realistically, there was no C or Pascal compiler available for the 6502 that would have been capable of either of these programs.
I assume that AwesomeCronk is a high-school student --- I would be interested in seeing the list of 21 approved subjects he has for his social-studies class.
I would also be interested in learning about the BASIC Stamp --- I never used it, but I read that it is popular for high-school science-fair projects, but not so much as the Arduino --- is it still sold, or has Parallax upgraded to a bigger-and-better educational board now?
I don't really know anything about Parallax --- only this P2 has sparked any interest in me --- it seems powerful enough to compete out in the real world.
When I was at Testra I encouraged them to produce an educational board based on the MiniForth, but they said no. I was told: "Hobbyists require too much support, and they don't have any money."
I was thinking this is a world first, closures in BASIC. But it seems Visual BASIC got lambdas and closures in v9 in 2008. Totally different syntax, but hey it's BASIC right?
I think my syntax matches VB's "long form" (multi line). Their short form for function is very nice though, maybe I'll try to get that in as well.
Hugh A, I remember Microsoft BASIC including GW-BASIC on the PC being the basis for thousands of business systems which would have been far more expensive to develop in other ways. For all its faults BASIC was also the original Rapid Application Development platform because you could break in, look at variables, fix bugs, and restart without a compile cycle. (FORTH could of course also do this sort of thing, but it never became nearly as popular.) When the performance wasn't good enough you failed over to machine language helper routines. An incredible amount of functionality was implemented this way.
And the next step was C because C was the language Microsoft migrated to for DOS and Windows development as they pried themselves away from x86 assembly language. The Windows API is basically a collection of C function calls. You can call them from other languages because of compatibility arrangements, but if you look at any documentation from the mid-late 1980's you will see it is all C based. In those days C was the only somewhat performant language with compilers that could practically run on low-end computers, as it had been designed for the PDP-11.
A system which follows the basic design principle of those early BASIC systems is, unsurprisingly, the Parallax Propeller. Its primary language, Spin, is interpreted and therefore slow and not demonstrative of the processor's true capabilities. But for most business logic Spin is good enough. When Spin isn't good enough you back up and punt with PASM. As in those BASIC systems you never use a lot of PASM code; in fact, the Propeller is designed with the idea that you _can't_ use a lot of PASM code, because cog memory is small. But you use PASM for the stuff that has to be fast and it turns out that's only a fraction of any typical application. That is pretty much how all small-scale practical software was developed before Visual Basic came along, which changed the game in other radical ways.
The big advance for VB was when Microsoft introduced the compiler for VB, which put it on an equal footing with VC. Of course, VB is nothing like the original BASIC, but its roots were there.
To me, the BASIC language in VB style is much simpler than C. Functions and Subs are easy. The void in C is a total disaster IMHO. It's said that C was designed by programmers who feared for their job security, so they invented an obscure language. I subscribe to the premise that C is unnecessarily complex just for the sake of it.
In regard to BASIC, I'm not entirely opposed to it. The 65ISR-chico doesn't support subroutines, so the only language that would make sense for it would be BASIC --- this would allow people to avoid assembly-language.
The BASIC code would be pretty simple, as the only data types supported would be:
8-bit byte
16-bit integer
byte array (256 elements) --- also can be used as a circular buffer
integer array (256 elements) --- also can be used as a circular buffer
string (255 char maximum)
Pointers would not be supported in this BASIC because the 65ISR-chico lacks the W register and hence can't do indirect addressing.
This might seem like a pretty crude language, but you could do quite a lot of micro-controller applications with it, and it would be easy enough that a high-school student could pick it up quickly without any previous exposure to computer-science --- switching to assembly-language wouldn't really buy you any more features --- asm might buy you a slight boost in speed.
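One reason 256 elements is a natural array size here: an 8-bit index wraps around on its own, so a circular buffer needs no bounds checks at all. A sketch of the idea in Python (masking with 0xFF stands in for what an 8-bit index register does for free; the 65ISR-chico BASIC itself is hypothetical):

```python
SIZE = 256
buf = [0] * SIZE
head = 0                      # next slot to write
tail = 0                      # next slot to read

def put(value):
    global head
    buf[head] = value
    head = (head + 1) & 0xFF  # 8-bit wraparound, no compare needed

def get():
    global tail
    value = buf[tail]
    tail = (tail + 1) & 0xFF
    return value

put(10)
put(20)
print(get(), get())           # 10 20, FIFO order
```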
Is that how closures work? I have no idea, I assumed there was a stack as usual but somehow a closure took a snapshot of it, on the heap presumably.
Yeah but if you just copy the variables into the closure then two functions that close over the same stack will have different copies of those variables. Shouldn't a change by one closure function be seen by the others?
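In most closure implementations the answer is yes: the captured variables live in one heap-allocated environment shared by every closure over them (and that environment outlives the parent's stack frame), so a write through one closure is visible through the others. A Python illustration:

```python
def make_counter():
    count = 0                 # lives on in a shared environment after return

    def inc():
        nonlocal count
        count += 1

    def peek():
        return count

    return inc, peek          # parent frame exits; environment survives

inc, peek = make_counter()
inc()
inc()
print(peek())                 # 2: peek sees the changes made through inc
```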
I have implemented rquotations in VFX and SwiftForth (should be easy to implement in any Forth).
I do not copy the local variables to a struct on the heap. That is a very bad idea because Forth lacks GC --- also this is very slow.
The local variables stay on the stack. The HOF (Higher Order Function) can have local variables of its own, but when it executes the rquotation the rquotation will be accessing the parent function's local variables. The parent function is the function that called the HOF. The rquotation is only valid so long as the parent function is in scope --- this is not like a Scheme closure that has the local variables in a struct on the heap and continues to be valid even if the parent function goes out of scope (exits) --- the Scheme scheme requires GC because there may be multiple closures in addition to the parent function all expecting the struct to continue to be valid, so the struct can only be deallocated after all of these have gone out of scope.
My rquotations are pretty simple, but they are hugely useful! They make general-purpose data-structures possible, the lack of which has always been Forth's primary weakness. Employers don't have time or money for employees to implement data-structures, such as self-balancing binary-trees, that are complicated to implement. Also, employers don't want to cut-and-paste code from old programs that had these data-structures, because cut-and-paste programming hugely complicates the source-code and is very error-prone. Rquotations solve all of these problems! :-D
The idea is that the HOF traverses a data-structure and it executes the rquotation for every node, giving the struct's address to the rquotation as a parameter --- the rquotation communicates information back to the parent function via the parent function's local variables.
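That pattern maps directly onto closures in other languages too: the HOF walks the structure and invokes the quotation once per node, and results flow back through the parent's locals. A rough Python analogue (the tuple tree layout is made up for illustration):

```python
# A node is (value, left, right); an empty subtree is None.
def each_node(tree, visit):
    if tree is None:
        return
    value, left, right = tree
    each_node(left, visit)
    visit(value)                  # the "rquotation" runs once per node
    each_node(right, visit)

def sum_and_max(tree):
    total = 0
    biggest = None

    def visit(v):
        nonlocal total, biggest   # results return via the parent's locals
        total += v
        biggest = v if biggest is None else max(biggest, v)

    each_node(tree, visit)
    return total, biggest

tree = (5, (2, None, None), (9, (7, None, None), None))
total, biggest = sum_and_max(tree)
print(total, biggest)             # 23 9
```

The difference from the Forth rquotations is lifetime: here the environment is garbage-collected, whereas an rquotation is only valid while the parent function is still on the return stack.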
Here is the code (requires my novice-package):
\ ******
\ ****** R[ ]R quotations.
\ ****** https://groups.google.com/forum/#!topic/comp.lang.forth/3LSqmBIZuzY
\ ****** This is highly non-standard! ANS-Forth (section 3.2.3.3.) says:
\ ****** A program shall not access values on the return stack (using R@, R>, 2R@ or 2R>) that it did not place there using >R or 2>R;
\ ******
\ In the stack-picture comments, RQ is a continuation (a vector to a quotation).
\ HumptyDumpty invented rquotations --- this was very good programming --- I hadn't thought of it.
\ What I call REX0 he called RCALL --- also, he didn't have REX which I invented (this only works in VFX and SwiftForth).
\ If only REX0 is used, rquotations can be used under any ANS-Forth system (theoretically non-standard though).
\ REX is a lot more useful though because the HOF almost always needs to have locals.
\ My improved version for VFX or SwiftForth should be easy to port to other ANS-Forth systems --- any Forth system with locals.
\ Some assembly-language is required, but it is pretty straightforward.
VFX? SwiftForth? or [if]
: rexit ( -- ) rdrop ;
: (r:) ( -- rq ) r@ 5 + ; \ 5 is the size of a JMP instruction in 32-bit x86
: r[ ( -- rq ) postpone (r:) postpone ahead ; immediate
: ]r ( -- ) postpone rexit postpone then ; immediate
: rex0 ( rq -- ) >r ; \ requires the HOF to not have locals
\ REX0 is the same as EXECUTE
\ We don't use EXECUTE however because in the other version (not VFX or SwiftForth) REX0 is different.
VFX? [if]
code rex ( rq -- ) \ requires the HOF to have locals
push edi \ this is the HOF's LF which won't be used by the quotation
mov edi, 0 [edi] \ this is the parent's LF which will be used by the quotation
mov eax, ebx
mov ebx, 0 [ebp] lea ebp, w [ebp]
call eax
pop edi \ restore HOF's LF
next, end-code
[then]
SwiftForth? [if]
156 constant lf-offset \ this is the offset for the local-frame in the user-variables (ESI is the user-variable base)
code rex ( rq -- ) \ requires the HOF to have locals
lf-offset [esi] edx mov
edx push \ this is the HOF's LF which won't be used by the quotation
-4 [edx] eax mov \ this is the old ESP
0 [eax] eax mov \ this is the parent's LF which will be used by the quotation
eax lf-offset [esi] mov
ebx eax mov [drop]
eax call
lf-offset [esi] pop \ restore HOF's LF
ret end-code
[then]
[else] \ this was written by HumptyDumpty and works on gForth, SwiftForth and VFX
: rexit ( -- ) RDROP ;
: (r:) ( -- rq ) R@ false ;
: r[ ( -- rq ) postpone (r:) postpone IF ; immediate
: ]r ( -- ) postpone REXIT postpone THEN ; immediate
: rex0 ( rq -- ) >R true ; \ requires the HOF to not have locals
\ REX is not supported in HumptyDumpty's code.
[then]
\ REX is used in a HOF that has local variables.
\ REX0 is used in a HOF that does not have local variables.
\ REX0 is also used in the parent function itself, when there is no HOF used.
It might be interesting to have a BASIC for the 65ISR-chico that also compiled code for the P1.
I said previously that the several ISRs were similar to the cogs on the P1 --- each one deals with some I/O, and can't be interrupted.
I assume that the 65ISR-chico would be less expensive than the P1 --- it could be used for simple projects, and if it proved inadequate, the user could upgrade to a P1 --- this would work best if the already-written code ported over without needing to be rewritten.
A little information-hiding with macros would allow hardware-specific aspects of the program to port easily --- just have compatible macro-libraries for the two systems.
Ironically, C was devised by the guys building Unix, and Unix was their way to quickly put together a "simple" operating system after having seen the massive complexity of Multics.
C might be complex to program in, only because it is such a simple language.
Yeah, I had a first-model Atari 520ST with the monochrome monitor. Then a 1040ST. Awesome, high-res bit-mapped graphics and a whole megabyte of RAM! They were quite the rage in Blighty. Spent hours playing Starglider and with my first ever C compilers.
I imagine working for Atari was interesting. They seemed to be developing all kinds of things in all kinds of directions until they ran out of steam. I was stunned to see them demoing a workstation based on Transputers at the Personal Computer World Show in London. The graphics on that were stunning for the time.
Anyway, sadly somehow I never heard of Mint or MultiTOS at the time.
I did a lot of my early C programming on an Atari 1040ST. It was a great machine with an external hard drive attached.
Oh boy, I was hungry for a hard drive. There was no way I could justify the hundreds of pounds cost for the 10MB drives I could get at the time.
Oh yes, the mega-drive 10. I had one. You had to switch it on first and give it some time to get up to speed. Like the start of a jet engine and about as loud. I later replaced the hard drive with a bigger one (20 MB!) --- what a huge drive that was.
About that time I also had to solder for the first time, a RAM expansion kit for the 520ST to get it up to 1MB. I think I spent a whole month's income just on the RAM DIP chips.
I don't remember line-number BASIC being "capable of doing practical things."
Simple, you were likely not around anyone actually doing practical things with it. That was going on all over the place though. Before I left high school I had written a little inventory control system, grading system, test generator, and a variety of computation utilities, most of which involved antennas.
And this:
Hugh A, I remember Microsoft BASIC including GW-BASIC on the PC being the basis for thousands of business systems which would have been far more expensive to develop in other ways.
Totally, one of my mentors was making good bucks doing that for trucking companies using BASIC. Started on CP/M, and moved it all to MSDOS as the companies did.
For a lot of people, they just needed it to work. Once it did, value was not hard to see. From there, moving up to bigger, faster made sense.
This too:
Its primary language, Spin, is interpreted and therefore slow and not demonstrative of the processor's true capabilities. But for most business logic Spin is good enough.
Interestingly, SPIN on a Propeller 1 runs about as fast as native machine language does on many 8 bit machines.
On the Propeller 2, interpreted SPIN is likely to run at speeds similar to native PASM on a Propeller 1.
The PICs were made by Microchip and the SXs were made by Scenix Semiconductor (now Ubicom).
Which PIC was this? This was the PIC16?
What was the difference between the SX and the PIC chip? Was the SX just a clone of the PIC chip?
The SX is discontinued now?
I was never much interested in the BASIC Stamp. The PIC chips have very little memory, and no way to access external memory.
In the 1990s, the Dallas 80c320 was by far the most popular micro-controller processor. It had 2KB of RAM in addition to the direct-access page. It ran at 6 times the speed of the 8032. It had 2 data-pointers, rather than just 1 as in the 8032.
Now Dallas has the 80c420 that is faster and has some improved features (the data-pointers can go down as well as up). If I were to design a hobbyist board for high-school robot builders, I would consider the '420 to be a good choice. I could write a compiler to generate reasonably efficient code, and the more advanced students could resort to assembly-language as needed.
If I did this, I would be competing against Parallax and their Propeller, which is unlikely to work very well --- the Propeller has a distinctive architecture which is pretty cool --- the Propeller dodges the issue of IRQs not getting serviced in time, causing data loss, so it is easier to analyze in regard to whether or not it is capable of doing a particular job without any stuttering.
OTOH, the '420 is likely capable of doing things that the Propeller is not capable of --- the '420 is a powerful processor! --- for the hobbyist market though, high performance isn't really needed because high-school students are mostly just building animatronic monsters for use in home movies, and the monsters don't move very fast (they just sit there slowly opening and closing their toothy jaws as if they are chewing their cud).
The primary argument against using the 80c420 is that if the hobbyist board becomes popular, Red China will begin selling clones. They can buy 80c420 chips too! They could build 80c420 clones if necessary.
I remember at Testra that they used the 80c320 for their motion-control board originally. They built the MiniForth processor because they needed something faster and less expensive.
Also, they were very concerned that Red China was going to clone their motion-control board. Their worst nightmare was that their customer would say: "We found another supplier that sells the exact same thing at half the price that you are asking. Unless you are willing to sell your boards at below cost, we won't buy from you anymore." In this case, all of their R&D would be lost! Obviously, filing a lawsuit against Beijing isn't going to work.
The MiniForth couldn't be cloned. Red China can buy the Lattice isp1048 PLD, but they can't implement the MiniForth processor because they don't have the needed files. Also, they can't develop their own MiniForth processor because they are relying on LDL (Lattice Design Language) which isn't adequate --- they don't have Testra's proprietary HDL that is needed.
They would have to do quite a lot of work to get a comparable product. Pirating however, is all about work avoidance. They want to just buy off-the-shelf components. They want their "work" to consist of copying the ROM image from the competitor's board and burning their own duplicate ROMs --- if their competitor spent 1 or 2 years developing a product, they want to clone it in 1 or 2 weeks and sell it for no more than 1/2 the price that the competitor is asking.
The Rabbit processor was primarily developed to prevent cloning. Z-World previously used the Z80, but anybody can buy Z80 chips and build their own boards (rather than buy the expensive Z-World boards) and then use Dynamic C, despite the fact that Dynamic C is legally restricted to being used only on Z-World boards. The Z80 boards from Z-World had a dongle that was supposed to prevent pirating, but it could be overcome (IIRC, it was a small PAL chip). With the Rabbit however, Dynamic C could only work on the Rabbit chip and would no longer work on an off-the-shelf Z80 chip. Nobody could build their own boards except by buying Rabbit chips, and the Rabbit chips were sold only by Z-World (actually by Rabbit Semiconductor, but that was a subsidiary of Z-World). Zilog actually built the Rabbit, but they made a deal in which Rabbit Semiconductor would have exclusive rights to the Rabbit. Zilog never sold Rabbits themselves. Zilog did come out with the eZ80, though, which competed against the Rabbit.
AFAIK, Arduino has largely stopped development. Their Arduino hobbyist boards were cloned by Red China and are now being sold for 1/2 the price that Arduino is asking. Because of this, there is no point in doing further development on the Arduino --- this would just be doing somebody else's R&D for free --- working for free is, of course, not a good way to survive in a capitalist economy.
If Testra had been willing to build a hobbyist board based on the MiniForth, I would have been a lot more interested in staying with them. They were totally focused on selling motion-control boards though. They had their laser-etcher customer that they made most of their money from.
The Scenix SX processors were an improvement over the PIC architecture. Microchip sued Scenix and Parallax for patent infringement and trade secret theft and won. Whether or not you might agree with the outcome of the suit, Microchip did allow Scenix to fill remaining orders for the SX chips which allowed Parallax to stockpile them for use in making Basic Stamps. Parallax has sufficient wafers to make their remaining projected demand for the SX-based Stamps and they will sell any chips above that to other interested parties until they run out. No more will be made!
I would strongly disagree with your characterization of the Dallas 80c320 in the 1990s as the "most popular" microprocessor. These days a good compiler can turn out efficient code for pretty much any microprocessor and customers are concerned more about I/O features, included memory, effective speed (on compiled C or C++ code), price, some on power requirements. Arduino really doesn't care what you run your programs on as long as they accept Arduino compiled code and perform adequately.
The Propeller is very carefully designed to implement software-defined peripherals rather than have specialized I/O engines on-chip. It also works well for creating interpreters for other architectures. There's a Z80 emulator that runs in a single cog of the Propeller at speeds similar to the Z80 chips. The other cogs (processors) can be used to implement a memory card controller, keyboard & mouse controller, video generator for a text display, and UARTs (4 per cog) for communications. It can also do speech synthesis and even sing 4-part harmony in a stereo field (with the voices appearing to come from different positions in the field).
Dallas no longer exists. It's now part of Maxim and the '420 is obsolete. You can buy the 89C430 but at around $15 for a chip it's not cheap. If you want cheap '51-core chips then look at Silabs.
OTOH, the '420 is likely capable of doing things that the Propeller is not capable of...
Such as?
Whilst 8051-family chips are still popular, with new models being introduced all the time, they are still 8-bit chips with an architecture going back to the very early '80s.
So the dozen or so new boards introduced in the last year is a sign of stopped development? There is even a new UNO using Microchip's new ATmega4809 AVR along with an FPGA board.
Comments
Every day a new C++ thing to learn.
Despite the fact that I have spent quite some years using C++!
@localroger - thanks for the trip down the rabbit hole - I was lost for several hours!
This included GW-BASIC that was shipped with early MS-DOS. There were some commercial BASICs that had functions and local variables, and were reasonably fast --- I never used any except for QuickBasic (I preferred QuickPascal though).
Through the 1980s, the only two languages that were used were 6502 assembly-language and Z80 assembly-language --- Turbo Pascal got some use under CP/M and MS-DOS --- C started to come into use in the early 1990s.
I don't remember C being "very fast and efficient" --- compared to line-number BASIC though, anything would be fast and efficient (including pencil-and-paper, in many cases).
When I was 18 years old (1984) I wrote a program on the C64 in SuperForth to do turtle-graphics in four dimensions (W, X, Y, Z). The program displayed a 3D image of the wire-frame object drawn looking down at the turtle. This was a 2D representation of the 3D object (I had the lines going back to a vanishing point on the "horizon"). The C64 was too slow! Drawing a hypercube took several seconds. When I made my turtle orbit the hypercube I wanted to see the object spinning on the screen, but this was ponderously slow. Also, I realized too late that SuperForth doesn't allow the user to distribute programs that include the outer-interpreter (which was, of course, the basis for the turtle programming). It was a cool program though --- I may yet rewrite it for the modern desktop-computers that are much faster.
Some years later I wrote a 65c02 Forth cross-compiler hosted on MS-DOS (written in UR/Forth from LMI). This was based on ISYS Forth for the Apple-IIc, which generated fast code but had severe memory problems because it ran on the Apple-IIc itself rather than being a cross-compiler running on a separate machine. My cross-compiler included a source-level debugger that would single-step through a program, displaying where it was in the source code as well as the target machine's state, such as what was on the data-stack, return-stack, etc. I used my cross-compiler to write a program for the Apple-IIc that did symbolic math. I could do derivatives and then simplify the result. My goal was to do integrals, but I had pushed the Apple-IIc's capability about as far as it could go with the derivatives, so I didn't continue with integrals, which are much more difficult. I also displayed equations on the screen in a nice format with Greek letters and other math symbols, and I graphed functions.
Realistically, there was no C or Pascal compiler available for the 6502 that would have been capable of either of these programs.
I assume that AwesomeCronk is a high-school student --- I would be interested in seeing the list of 21 approved subjects he has for his social-studies class.
I would also be interested in learning about the BASIC Stamp --- I never used it, but I read that it is popular for high-school science-fair projects, but not so much as the Arduino --- is it still sold, or has Parallax upgraded to a bigger-and-better educational board now?
I don't really know anything about Parallax --- only this P2 has sparked any interest in me --- it seems powerful enough to compete out in the real world.
When I was at Testra I encouraged them to produce an educational board based on the MiniForth, but they said no. I was told: "Hobbyists require too much support, and they don't have any money."
I think my syntax matches VB's "long form" (multi line). Their short form for function is very nice though, maybe I'll try to get that in as well.
And the next step was C, because C was the language Microsoft migrated to for DOS and Windows development as they pried themselves away from x86 assembly language. The Windows API is basically a collection of C function calls. You can call them from other languages because of compatibility arrangements, but if you look at any documentation from the mid-to-late 1980s you will see it is all C based. In those days C was the only somewhat performant language with compilers that could practically run on low-end computers, as it had been designed for the PDP-11.
A system which follows the basic design principle of those early BASIC systems is, unsurprisingly, the Parallax Propeller. Its primary language, Spin, is interpreted and therefore slow and not demonstrative of the processor's true capabilities. But for most business logic Spin is good enough. When Spin isn't good enough you back up and punt with PASM. As in those BASIC systems you never use a lot of PASM code; in fact, the Propeller is designed with the idea that you _can't_ use a lot of PASM code, because cog memory is small. But you use PASM for the stuff that has to be fast and it turns out that's only a fraction of any typical application. That is pretty much how all small-scale practical software was developed before Visual Basic came along, which changed the game in other radical ways.
To me, the BASIC language in VB style is much simpler than C. Functions and Subs are easy. The void type in C is a total disaster IMHO. It's said that C was designed by programmers who feared for their job security, so they invented an obscure language. I subscribe to the premise that C is unnecessarily complex just for the sake of it.
</rant>
The BASIC code would be pretty simple, as the only data types supported would be:
8-bit byte
16-bit integer
byte array (256 elements) --- also can be used as a circular buffer
integer array (256 elements) --- also can be used as a circular buffer
string (255 char maximum)
Pointers would not be supported in this BASIC because the 65ISR-chico lacks the W register and hence can't do indirect addressing.
This might seem like a pretty crude language, but you could do quite a lot of micro-controller applications with it, and it would be easy enough that a high-school student could pick it up quickly without any previous exposure to computer-science --- switching to assembly-language wouldn't really buy you any more features --- asm might buy you a slight boost in speed.
I have implemented rquotations in VFX and SwiftForth (should be easy to implement in any Forth).
I do not copy the local variables to a struct on the heap. That is a very bad idea because Forth lacks GC --- also this is very slow.
The local variables stay on the stack. The HOF (Higher Order Function) can have local variables of its own, but when it executes the rquotation the rquotation will be accessing the parent function's local variables. The parent function is the function that called the HOF. The rquotation is only valid so long as the parent function is in scope --- this is not like a Scheme closure that has the local variables in a struct on the heap and continues to be valid even if the parent function goes out of scope (exits) --- the Scheme scheme requires GC because there may be multiple closures in addition to the parent function all expecting the struct to continue to be valid, so the struct can only be deallocated after all of these have gone out of scope.
My rquotations are pretty simple, but they are hugely useful! They make general-purpose data-structures possible, the lack of which has always been Forth's primary weakness. Employers don't have time or money for employees to implement data-structures, such as self-balancing binary-trees, that are complicated to implement. Also, employers don't want to cut-and-paste code from old programs that had these data-structures, because cut-and-paste programming hugely complicates the source-code and is very error-prone. Rquotations solve all of these problems! :-D
The idea is that the HOF traverses a data-structure and it executes the rquotation for every node, giving the struct's address to the rquotation as a parameter --- the rquotation communicates information back to the parent function via the parent function's local variables.
Here is the code (requires my novice-package):
I said previously that the several ISRs were similar to the cogs on the P1 --- each one deals with some I/O, and can't be interrupted.
I assume that the 65ISR-chico would be less expensive than the P1 --- it could be used for simple projects, and if it proved inadequate, the user could upgrade to a P1 --- this would work best if the already-written code ported over without needing to be rewritten.
A little information-hiding with macros would allow hardware-specific aspects of the program to port easily --- just have compatible macro-libraries for the two systems.
Atari Falcon 030 | Nostalgia Nerd:
Ironically, C was devised by the guys building Unix, and Unix was their way to quickly put together a "simple" operating system after having seen the massive complexity of Multics.
C might be complex to program in, only because it is such a simple language.
Ah, that was an interesting blast from the past. Thanks for posting that, Heater.
I imagine working for Atari was interesting. They seemed to be developing all kinds of things in all kinds of directions until they ran out of steam. I was stunned to see they were demoing a workstation based on Transputers at the Personal Computer World Show in London. The graphics on that were stunning for its time.
Anyway, sadly somehow I never heard of Mint or MultiTOS at the time.
Oh yes, the mega-drive 10. I had one. You had to switch it on first and give it some time to get up to speed. Like the start of a jet engine, and about as loud. I later replaced the hard drive with a bigger one (20 MB!) --- what a huge drive that was.
About that time I also had to solder for the first time, some RAM expansion kit for the 512ST to get it up to 1MB. I think I spent a whole month's income just for the RAM DIP chips.
Mike
Simple, you were likely not around anyone actually doing practical things with it. That was going on all over the place though. Before I left high school I had written a little inventory control system, grading system, test generator, and a variety of computation utilities, most of which involved antennas.
And this:
Totally, one of my mentors was making good bucks doing that for trucking companies using BASIC. Started on CP/M, and moved it all to MSDOS as the companies did.
For a lot of people, they just needed it to work. Once it did, value was not hard to see. From there, moving up to bigger, faster made sense.
This too:
Interestingly, SPIN on a Propeller 1 runs about as fast as native machine language does on many 8-bit machines.
On the Propeller 2, interpreted SPIN is likely to run at speeds similar to native PASM on a Propeller 1.
What was the difference between the SX and the PIC chip? Was the SX just a clone of the PIC chip?
The SX is discontinued now?
I was never much interested in the BASIC Stamp. The PIC chips have very little memory, and no way to access external memory.
In the 1990s, the Dallas 80c320 was by far the most popular micro-controller processor. It had 2KB of RAM in addition to the direct-access page. It ran at 6 times the speed of the 8032. It had 2 data-pointers, rather than just 1 as in the 8032.
Now Dallas has the 80c420 that is faster and has some improved features (the data-pointers can go down as well as up). If I were to design a hobbyist board for high-school robot builders, I would consider the '420 to be a good choice. I could write a compiler to generate reasonably efficient code, and the more advanced students could resort to assembly-language as needed.
If I did this, I would be competing against Parallax and their Propeller, which is unlikely to work very well --- the Propeller has a distinctive architecture which is pretty cool --- the Propeller dodges the issue of IRQs not getting serviced in time, causing data loss, so it is easier to analyze in regard to whether or not it is capable of doing a particular job without any stuttering.
OTOH, the '420 is likely capable of doing things that the Propeller is not capable of --- the '420 is a powerful processor! --- for the hobbyist market though, high performance isn't really needed because high-school students are mostly just building animatronic monsters for use in home movies, and the monsters don't move very fast (they just sit there slowly opening and closing their toothy jaws as if they are chewing their cud).
The primary argument against using the 80c420 is that if the hobbyist board becomes popular, Red China will begin selling clones. They can buy 80c420 chips too! They could build 80c420 clones if necessary.
I remember at Testra that they used the 80c320 for their motion-control board originally. They built the MiniForth processor because they needed something faster and less expensive.
Also, they were very concerned that Red China was going to clone their motion-control board. Their worst nightmare was that their customer would say: "We found another supplier that sells the exact same thing at half the price that you are asking. Unless you are willing to sell your boards at below cost, we won't buy from you anymore." In this case, all of their R&D would be lost! Obviously, filing a lawsuit against Beijing isn't going to work.
The MiniForth couldn't be cloned. Red China can buy the Lattice isp1048 PLD, but they can't implement the MiniForth processor because they don't have the needed files. Also, they can't develop their own MiniForth processor because they are relying on LDL (Lattice Design Language) which isn't adequate --- they don't have Testra's proprietary HDL that is needed.
They would have to do quite a lot of work to get a comparable product. Pirating however, is all about work avoidance. They want to just buy off-the-shelf components. They want their "work" to consist of copying the ROM image from the competitor's board and burning their own duplicate ROMs --- if their competitor spent 1 or 2 years developing a product, they want to clone it in 1 or 2 weeks and sell it for no more than 1/2 the price that the competitor is asking.
The Rabbit processor was primarily developed to prevent cloning. Z-World previously used the Z80, but anybody could buy Z80 chips and build their own boards (rather than buy the expensive Z-World boards) and then use Dynamic C, despite the fact that Dynamic C was legally restricted to being used only on Z-World boards. The Z80 boards from Z-World had a dongle that was supposed to prevent pirating, but it could be overcome (IIRC, it was a small PAL chip). With the Rabbit, however, Dynamic C could only work on the Rabbit chip and would no longer work on an off-the-shelf Z80 chip. Nobody could build their own boards except by buying Rabbit chips, and the Rabbit chips were sold only by Z-World (actually by Rabbit Semiconductor, but that was a subsidiary of Z-World). Zilog actually built the Rabbit, but they made a deal in which Rabbit Semiconductor would have exclusive rights to it. Zilog never sold Rabbits themselves. Zilog did come out with the eZ80, though, which competed against the Rabbit.
AFAIK, Arduino has largely stopped development. Their Arduino hobbyist boards were cloned by Red China and are now being sold for 1/2 the price that Arduino is asking. Because of this, there is no point in doing further development on the Arduino --- this would just be doing somebody else's R&D for free --- working for free is, of course, not a good way to survive in a capitalist economy.
If Testra had been willing to build a hobbyist board based on the MiniForth, I would have been a lot more interested in staying with them. They were totally focused on selling motion-control boards though. They had their laser-etcher customer that they made most of their money from.
I would strongly disagree with your characterization of the Dallas 80c320 in the 1990s as the "most popular" microprocessor. These days a good compiler can turn out efficient code for pretty much any microprocessor and customers are concerned more about I/O features, included memory, effective speed (on compiled C or C++ code), price, some on power requirements. Arduino really doesn't care what you run your programs on as long as they accept Arduino compiled code and perform adequately.
The propeller is very carefully designed to implement software-defined peripherals rather than have specialized I/O engines on-chip. It also works well for creating interpreters for other architectures. There's a Z80 emulator that runs in a single cog of the propeller at speeds similar to the Z80 chips. The other cogs (processors) can be used to implement a memory card controller, keyboard & mouse controller, video generator for a text display, and UARTs (4 per cog) for communications. It can also do speech synthesis and even sing 4-part harmony in a stereo field (with the voices appearing to come from different positions in the field).
No. Maybe the combined total of 8051-core chips might just claim that title but I suspect that PIC chips outsold them.
Dallas no longer exists. It's now part of Maxim and the '420 is obsolete. You can buy the 89C430 but at around $15 for a chip it's not cheap. If you want cheap '51-core chips then look at Silabs.
Such as?
Whilst 8051 family chips are still popular, with new models being introduced all the time, they are still an 8-bit chip with an architecture going back to the very early '80s
So the dozen or so new boards introduced in the last year is a sign of stopped development? There is even a new UNO using Microchips new ATmega4809 AVR along with an FPGA board.
@cgracey, anything?