I can't answer for lardom, but I've been using the Propeller since it first came out. It has a number of distinctive features:
8 fast 32-bit processors (cogs), each with 512 x 32-bit words of local program/data memory. Instruction time is 50 ns for most instructions. All 32 I/O pins are accessed identically by all the processors, and each processor has a video generator and two multifunction counters.
There's a shared 32K-byte memory (the hub) that's accessed in round-robin fashion, with each processor getting one access (8, 16, or 32 bits) every 200 ns, and there are special instructions for semaphores/locks.
The small local memory is adequate for most I/O drivers and a variety of interpreters, which is how they're commonly used. There is a fast IEEE floating-point package that runs in a single cog. Several cogs can be synchronized to work together for increased throughput, for example a high-resolution VGA text display with each cog building a scan line and then displaying it.
For robotics, particularly during development, it's hard to decide ahead of time what peripherals might be needed and how they might function. With multiple processors, it's easy to just drop in a different I/O driver with little or no effect on any other part of the system as long as there's a free cog or two. When I was debugging an integer Basic interpreter, with a few minutes' work, I could add a TV text display for debugging that was specific to the problem I was having.
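To make that "drop in a driver" idea concrete, here is a minimal sketch in C, assuming the PropGCC toolchain's propeller.h helpers (cogstart, locknew, lockset, lockclr); the exact names and signatures should be checked against the toolchain documentation rather than taken from this sketch:

#include <propeller.h>                   /* PropGCC header: cogstart, hub locks, etc. */

static int uart_lock;                    /* hub lock protecting the shared buffer     */
static volatile char shared_buf[64];     /* hub memory visible to every cog           */
static unsigned int driver_stack[64];    /* stack for the driver cog (hub RAM)        */

/* Hypothetical I/O driver: runs forever in its own cog and never disturbs the rest. */
static void uart_driver(void *arg)
{
    (void)arg;
    for (;;) {
        while (lockset(uart_lock))       /* spin until we own the hub lock            */
            ;
        shared_buf[0] = 0;               /* ... poll a pin, move bytes into the buffer ... */
        lockclr(uart_lock);              /* release the lock for the other cogs       */
    }
}

int main(void)
{
    uart_lock = locknew();               /* claim one of the eight hardware locks     */
    /* Launch the driver in a free cog; the calling cog carries on unaffected.        */
    cogstart(uart_driver, NULL, driver_stack, sizeof(driver_stack));
    for (;;) {
        /* the main application (say, the Basic interpreter) continues here           */
    }
    return 0;
}

Swapping one driver for another is just a matter of launching different code in a spare cog, which is essentially the workflow described above.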
Thanks, that's helpful. There's no doubt that the thing is ingenious.
I only wish it were "bigger" -- especially that it had more memory. I think a "desktop" computer built on it would be a beautiful thing, with one COG to manage each physical device (say, a keyboard, a mouse, a display, a printer, a file system, and a network connection) with the remaining COGs running the memory manager, the user interface and the current application. Whoops! Ran out of COGs there. But elegant, in theory, in any case. No interrupts and predictable behavior throughout. A real programmer's dream. In fact, our original two-page manifesto imagined a future where such a deterministic system might become a reality. See the second paragraph here: osmosian.com/manifesto.pdf
But alas, we missed our target date, and unless I'm missing something, we're going to need a Propeller with way more memory to make that happen. After all, shouldn't a 32-bit machine have 32-bit addresses and 4 gigs of memory to go with them?
The Propeller is a micro-controller, not a desktop machine.
It's quite a weird and unique micro-controller:
What with having 32-bit processors at a time when most such micro-controllers were still 8-bitters. 32 bits makes the normal maths people do much easier, and faster.
What with having 8 of those processors, so that up to 8 independent tasks can be going on with perfect timing isolation; they all execute deterministically. Rather hard to do on a normal single-core micro-controller. It has the great value that adding a new task, possibly written by someone else, cannot upset the timing of anything else you already have running in your program. This is magic and makes sharing code and reusing code very easy. See OBEX on the Parallax site.
What with not having interrupts. If you have cores you don't need interrupts to schedule things. This makes programming your little tasks very much easier and again makes integrating, mixing and matching, code from elsewhere trivial.
What with only being able to execute 512 instructions in each COG at full speed. A necessary consequence of the deterministic architecture really.
You may not be aware, but the Propeller design is the vision of one man, Chip Gracey, who set out to build a really simple to use but powerful micro-controller in his own way. Chip has some strong views on the complexity of modern systems, from micro-controllers upwards. Which includes creating his own programming language, Spin, for the Propeller.
Rather like you and your vision for a programming language.
Now, I love and respect such ingenious, radical, independent thinking. Wish I could do it myself. But such monastic visions seem to have a down side. The resulting thing is so weird most people don't know what to do with it. For example, a typical micro-controller user will look at the Propeller for two minutes and dismiss it. They will see that it has no interrupts; that's obviously useless. They will see it has such a small native execution space; that's obviously useless. They will see it has no built-in peripherals, UART and so on; that is obviously useless. They will see that you cannot program it in C, you have to use this weird Spin language; that is obviously useless. All the while they will not see what the Propeller actually is and how it offers very simple solutions to some problems they have. Problems that are exacerbated by the MCUs they are trying to use.
(Yes, we can use C for the Prop now, but for many years that was not the case. In fact I am describing the thoughts that went through my mind on discovering the Propeller years ago. Thoughts that did indeed make me skip over it for quite some time. Till I got curious...)
Oh, but I'm rambling again...
Anyway, I'm still puzzling over how you can bootstrap the Plain English system. I mean, what if all the binary executables of the CAL-XXX program somehow disappeared and all you had left was the source text as I see it on github, how would you go about getting Plain English working again?
The Propeller is a micro-controller, not a desktop machine.
Ah, but with just a big chunk of memory it could be both! Think about that a minute. The same MCU in every kind of device: the programmer's development system and the robot; the artist's desktop canvas and the casual user's phone; etc, etc. And all of those devices programmable in Plain English!
What with having 32-bit processors at a time when most such micro-controllers were still 8-bitters. 32 bits makes the normal maths people do much easier, and faster.
And 4 gigs of memory with linear 32-bit addresses makes everything else much easier and faster. So near and yet so far!
What with having 8 of those processors, so that up to 8 independent tasks can be going on with perfect timing isolation; they all execute deterministically. Rather hard to do on a normal single-core micro-controller. It has the great value that adding a new task, possibly written by someone else, cannot upset the timing of anything else you already have running in your program. This is magic and makes sharing code and reusing code very easy. See OBEX on the Parallax site.
Yes. A true architectural advance. Stroke of genius. And great (as I've described) not only for micro-controller applications but for desktop systems as well. One COG for a print spooler; one for communications; one for the display, etc, etc.
If you have cores you don't need interrupts to schedule things. This makes programming your little tasks very much easier and again makes integrating, mixing and matching, code from elsewhere trivial.
You may not be aware, but the Propeller design is the vision of one man, Chip Gracey, who set out to build a really simple to use but powerful micro-controller in his own way. Chip has some strong views on the complexity of modern systems, from micro-controllers upwards. Which includes creating his own programming language, Spin, for the Propeller. Rather like you and your vision for a programming language.
Indeed. Chip and I should get together and figure out how to make a Propeller with 4 gigs of memory.
Now, I love and respect such ingenious, radical, independent thinking. Wish I could do it myself. But such monastic visions seem to have a down side. The resulting thing is so weird most people don't know what to do with it. For example, a typical micro-controller user will look at the Propeller for two minutes and dismiss it. They will see that it has no interrupts; that's obviously useless. They will see it has such a small native execution space; that's obviously useless. They will see it has no built-in peripherals, UART and so on; that is obviously useless. They will see that you cannot program it in C, you have to use this weird Spin language; that is obviously useless. All the while they will not see what the Propeller actually is and how it offers very simple solutions to some problems they have. Problems that are exacerbated by the MCUs they are trying to use.
"First they ignore you, then they laugh at you, then they fight you, then you win." - Mahatma Gandhi
In fact I am describing the thoughts that went through my mind on discovering the Propeller years ago. Thoughts that did indeed make me skip over it for quite some time. Till I got curious... Oh, but I'm rambling again...
Anyway, I'm still puzzling over how you can bootstrap the Plain English system. I mean, what if all the binary executables of the CAL-XXX program somehow disappeared and all you had left was the source text as I see it on github, how would you go about getting Plain English working again?
The same way someone would get C working again if all the C executables disappeared. We'd use some other language to create a very minimal CAL-1000 that could edit text and re-compile itself, then use that for the remainder of the development.
But we're not in that situation, so things are easier. To port the CAL to another suitable system, all we have to do is cross-compile and modify the CAL's Noodle (standard routine library) -- create a version of the CAL (using the current CAL) that puts out the kind of EXE the new hardware needs, and make appropriate changes to what used to be calls to Windows functions.
Your handle and the little picture that goes with it make me think you're familiar with tube amplifiers. If so, you might get a kick out of this (it's a PDF; you'll have to download it from filedropper): filedropper.com/5c1documentationfinalcompressed
I'm no audiophile or guitar effects man. I learned my first electronics in my early teens from a friend's father who was a TV transmitter builder. So it was all tubes; he had hundreds of them in his junk box. Used to do experiments and build HAM radio gear with tubes way back then.
That link shows someone putting a lot of time and effort into building a nice looking amplifier. That's not really my style. I only have one tube device in the house nowadays. It would make tube purists mad. It has a 12AX7-based pre-amplifier stage driving a bunch of massive BUZ-whatever MOSFETs. Good for about 100 watts. It's constructed on a plank. The interconnects are sort of wire wrapped and soldered to brass nails hammered into the plank. Hmmm... I should rebuild that on a proper nice looking breadboard.
Somewhere down in the cellar I have some old tube AM and FM radios. Some work. Some waiting to be fixed.
My prized possession is a huge VCR97 6 inch diameter cathode ray tube as used in aircraft radar in WWII. That is still waiting for a 1000v power supply to see if it still lights up.
I've implemented an IBM 1130 emulator on the Propeller, which could run COBOL (circa 1968) along with RPG I, Fortran IV, Assembly, and SL/1 (a variation of PL/1)...
...except that the COBOL compiler was sold as an "IBM licensed program product" and the people with the source code for the compiler don't want to step on IBM's toes. Frankly, I don't think that IBM gives a hoot about licensing something from 50 years ago that runs only on hardware not manufactured in decades.
This is frustrating to me because I have several programs that I'd like to run again (just for fun).
The emulator could also run APL but my implementation doesn't support the IBM Selectric element (typeball) with the APL character set that was used for the console printer.
Re: Propeller vs. desktop
Actually, it would be both a lousy micro-controller and a not-so-great desktop. IBM's System 360 is a good historical example of a computer architecture that successfully spanned that kind of range of sizes and speeds. They did it by fooling everyone: emulating the System 360 instruction set on the low-end models, using hardware not too different from other microcomputers of the time, while using state-of-the-art supercomputer hardware for the high-end models.
There's a Propeller 2 in development with lots of new features including more (16) and faster processors and more shared memory (512K bytes) along with the ability to execute directly from the shared memory. There are other features to help with I/O and signal processing (analog and digital ... better video too). There's an FPGA version of the Propeller 2 available here ... a work in progress ... with silicon expected sometime soon. The native Propeller instruction set is not suited for larger memories, but an interpreter for a suitable instruction set should be easy enough and could support software demand paging into shared memory for a 4GB external address space with good performance.
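For what it's worth, here's a rough illustration in C of that demand-paging idea. Nothing here comes from Parallax; the page size, the direct-mapped cache, and the ext_read routine are all invented for the sketch (a real version would talk to whatever external RAM the board actually has):

#include <stdint.h>
#include <string.h>

#define PAGE_SIZE 256u                         /* bytes per page (arbitrary)             */
#define NUM_SLOTS 64u                          /* pages cached in shared (hub) memory    */

static uint8_t  slot_data[NUM_SLOTS][PAGE_SIZE];  /* page cache living in hub RAM        */
static uint32_t slot_tag[NUM_SLOTS];              /* which external page each slot holds */
static uint8_t  slot_valid[NUM_SLOTS];

/* Stand-in for a board-specific routine that copies one page from external memory.
   Here it just reads from a small simulated array so the sketch is self-contained. */
static uint8_t fake_external[1u << 16];
static void ext_read(uint32_t ext_addr, uint8_t *dst, uint32_t len)
{
    memcpy(dst, &fake_external[ext_addr], len);
}

/* Fetch one byte from a large external address space, faulting its page in if needed. */
static uint8_t vm_read8(uint32_t addr)
{
    uint32_t page = addr / PAGE_SIZE;
    uint32_t slot = page % NUM_SLOTS;          /* direct-mapped cache, for simplicity    */

    if (!slot_valid[slot] || slot_tag[slot] != page) {
        ext_read(page * PAGE_SIZE, slot_data[slot], PAGE_SIZE);
        slot_tag[slot]   = page;
        slot_valid[slot] = 1;
    }
    return slot_data[slot][addr % PAGE_SIZE];
}

int main(void)
{
    fake_external[5000] = 42;
    return vm_read8(5000) == 42 ? 0 : 1;       /* the read faults the page into the cache */
}

An interpreted instruction set layered over reads like this is how a 32-bit address space could sit behind a few hundred kilobytes of hub RAM.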
C was added to the supported language list for the Propeller because of demand from Parallax's education customers as well as the interest and efforts of several members of the user community who did most of the work.
That's great. Really. But why use the Propeller on your wireless bot instead of something else? What advantages does it give you?
The Propeller was so much better than the Basic Stamp that I gave my Basic Stamp away.
The Propeller-based machine was much cheaper. My camera slider required an LCD, a keypad, and a stepper. I could drive all of them in software instead of buying breakout boards for each accessory, and it simplified my circuits.
One of the most amazing Propeller apps I've ever seen was one, written by Phil Pilgrim, that responded to several spoken commands. The only hardware requirement is a mic.
Another fun Propeller app is Chip's singing monks. He wrote a phoneme synthesizer that takes a text representation of phonemes including pitch and generates a digital audio signal. He then wrote a stereo synthesizer that takes a digital audio signal and places it at a specified point between two stereo sound sources using delays and attenuation. With 4 copies of the phoneme synthesizer each running in a cog feeding into the stereo synthesizer in another cog, there's a chorus of voices in stereo doing a Gregorian chant in 4-part harmony all in software on a general purpose multiprocessor.
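The delay-and-attenuation trick for placing a voice between two speakers is simple enough to sketch in a few lines of C. This is a toy illustration with made-up constants, not Chip's actual code: the nearer channel gets the sample as-is, the farther channel gets it slightly later and quieter.

#include <stdint.h>

#define MAX_DELAY 64                            /* max inter-channel delay, in samples       */

typedef struct {
    int16_t ring[MAX_DELAY];                    /* recent mono samples for the far channel   */
    int     pos;
    int     delay;                              /* samples of delay on the far channel       */
    int     gain_far;                           /* 0..256 fixed-point attenuation            */
    int     on_left;                            /* non-zero if the voice sits left of centre */
} place_t;

/* pan: 0 = hard left .. 256 = hard right. */
static void place_set(place_t *p, int pan)
{
    int off = pan - 128;                        /* signed offset from centre                 */
    p->on_left = (off < 0);
    if (off < 0) off = -off;
    p->gain_far = 256 - off;                    /* the far channel is quieter ...            */
    p->delay    = (off * (MAX_DELAY - 1)) / 128;  /* ... and arrives a little later          */
}

/* Feed one mono sample in, get a left/right pair out. */
static void place_step(place_t *p, int16_t mono, int16_t *left, int16_t *right)
{
    p->ring[p->pos] = mono;
    int16_t delayed = p->ring[(p->pos + MAX_DELAY - p->delay) % MAX_DELAY];
    p->pos = (p->pos + 1) % MAX_DELAY;

    int16_t near = mono;
    int16_t far  = (int16_t)((delayed * p->gain_far) >> 8);
    if (p->on_left) { *left = near; *right = far; }
    else            { *left = far;  *right = near; }
}

int main(void)
{
    static place_t voice;                       /* static so the ring buffer starts zeroed   */
    int16_t l, r;
    place_set(&voice, 64);                      /* put this voice left of centre             */
    for (int i = 0; i < 8; i++)
        place_step(&voice, (int16_t)(i * 1000), &l, &r);
    return 0;
}

Run one of these per synthesized voice, mix the left and right sums, and you have something like the stereo placement stage described above.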
Mike, you are absolutely right. I sometimes think of the "Propeller beanie" as a cranial heat-sink device for a rare genius like Chip Gracey.
Now, if Plain English would compile to C in the same way that would be great.
I don't see why that would be great. Antoine de Saint-Exupery (the guy who wrote The Little Prince) once said -- with ironic verbosity -- "Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away." I agree. The guy who gets the job done with the fewest parts is the winner. Elegance is a single word for the concept. So it seems that a system like this...
English > Compiler > Executable
...is preferable to a system that requires an artificial intermediate language and an extra step:
English > Translator > C > Compiler > Executable
In the former system, on the developer's end, there's less to design, less to code, less to document; and on the student's end, there's less to take in, less to digest, etc.
Of course the rub is that these two systems are not quite equivalent. The latter system, it will be argued, can produce executables that run on different kinds of machines. But there's a bit of sleight-of-hand going on there. Let's get into a little more detail. The former system really looks like this:
English + Library > Compiler > Executable
And the latter system really looks like this:
English + Library > Translator > C + Library > Compiler > Executable
To support different hardware configurations, both of these systems need to be expanded. The former:
English + Lib for X > Compiler for X > Exe for X
English + Lib for Y > Compiler for Y > Exe for Y
English + Lib for Z > Compiler for Z > Exe for Z
And the latter:
English + Lib for X > Translator > C + Lib for X > Compiler for X > Exe for X
English + Lib for Y > Translator > C + Lib for Y > Compiler for Y > Exe for Y
English + Lib for Z > Translator > C + Lib for Z > Compiler for Z > Exe for Z
And the former approach still wins. The latter approach might work if we reduced our program to a text-only shadow of its original self; but the "common intermediate" that you're imagining doesn't really exist when one considers everything that our system needs and does.
English + Library > Translator > Javascript + web browser environment.
Only one translation step. Runs everywhere. An audience and user base in billions. Job done.
But in general I'm with Antoine de Saint-Exupery, because if you have N different languages and M different targets then you have N*M translators to write. Or you can write translators from the N languages to some common thing and then write M translators from the common thing to the M targets. That is only N + M translators to write (for example, 5 languages and 4 targets means 20 direct translators, but only 9 via a common intermediate). Much simpler and more efficient.
The COBOL example above shows this. COBOL to C. C to anywhere.
English + Library > Translator > Javascript + web browser environment.
Only one translation step. Runs everywhere. An audience and user base in billions. Job done.
If only that were the case! Unfortunately, after translation, it won't be the same program we started with. For one thing, the student will have to work harder: he'll now have to learn Javascript as well as Plain English and assembler to see how we get from English to machine code. The student will also be less impressed: he'll think that Plain English can't stand on its own -- that it requires a "real" program (the Javascript interpreter/compiler), written in something like C, to run.
I've actually got about half of an English to Javascript translator written. It's not as easy as it sounds. Here's a typical difficulty: Plain English passes all parameters by reference; Javascript passes simple types by value and other types by reference. We got around this defect in Javascript by making simple types Javascript arrays, but took both a complexity hit and a performance hit for doing so.
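For readers who haven't hit this mismatch before, the same issue can be shown in C (an analogy only, not the translator's actual output): a scalar passed directly is copied, so the callee can't change the caller's variable, and the workaround is to box the value behind a pointer, which is essentially what wrapping a number in a one-element Javascript array does.

#include <stdio.h>

/* Pass-by-value: the callee gets a copy; the caller's variable is untouched. */
static void add_one_by_value(int n)      { n = n + 1; }

/* The workaround: "box" the value behind a pointer so the callee can reach it.
   The one-element-array trick in Javascript plays the same role as this pointer. */
static void add_one_by_reference(int *n) { *n = *n + 1; }

int main(void)
{
    int count = 5;
    add_one_by_value(count);
    printf("%d\n", count);               /* still 5 */
    add_one_by_reference(&count);
    printf("%d\n", count);               /* now 6   */
    return 0;
}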
And here's something that's pretty much impossible to overcome: Plain English assumes wysiwyg graphics; Javascript does not. So it's very difficult to port our page-layout facility with any degree of integrity. Case in point: our reference manual is written in such a way that each topic fits, exactly, on a single page. To make that happen, we need to know how the paragraphs are going to wrap -- exactly what fonts are being used and exactly how they will be rendered both on the screen (for the author) and on the printer (for the reader). Javascript cannot provide us with the needed information.
But in general I'm with Antoine de Saint-Exupery, because if you have N different languages and M different targets then you have N*M translators to write. Or you can write translators from the N languages to some common thing and then write M translators from the common thing to the M targets. That is only N + M translators to write. Much simpler and more efficient. The COBOL example above shows this. COBOL to C. C to anywhere.
True. IF, and it's a big IF, you can find that common intermediate. Show me how, using only standard C libraries common to "anywhere," one could write a page editor like the one included in our IDE. It's easy to make something run anywhere if you delete most of that something before you start!
For one thing, the student will have to work harder: he'll now have to learn Javascript as well as Plain English and assembler to see how we get from English to machine code.
Ah, sorry, perhaps I misunderstood. I thought the idea was that students of Plain English were not supposed to have to worry about all that messy machine code stuff.
The student will also be less impressed:...
Perhaps. But then again, I thought the idea was that Plain English enabled them to do stuff without worrying about all that. Besides, as it stands Plain English cannot stand alone; it needs Windows.
Here's a typical difficulty: Plain English passes all parameters by reference; Javascript passes simple types by value and other types by reference. We got around this defect in Javascript ...
I'm not sure I would call it a "defect". Seems eminently sensible to me. From the JS programmer's perspective it's hardly noticeable.
Anyway, when it comes to transpiling other languages to JS I'm sure that is not an issue. After all, one can compile C and C++ to JS. As you know, in those languages you can pass by value or by reference. It all works out. And runs nearly as fast as compiling to native code. See Emscripten and asm.js.
Now, the wysiwyg thing is a bit of a problem. Certainly Javascript does not expect anything in that regard, it's just a programming language. It's not Javascript's fault that the programming environment inside a browser is so awful, what with all that crappy HTML and CSS and the hideous DOM API. I agree it's a nightmare.
One approach is to ignore it, go around it. Reduce all your HTML and CSS to the bare minimum. Get yourself a canvas and draw on it. Then you can put everything exactly where you like it. Or perhaps use WebGL or SVG.
Pretty much what happens when the Unity game engine is running in the browser. Or Unreal.
And the former approach still wins. The latter approach might work if we reduced our program to a text-only shadow of its original self; but the "common intermediate" that you're imagining doesn't really exist when one considers everything that our system needs and does.
Whether it 'wins' depends on your code generator.
Before something can claim a win, it needs to first exist.
If you are using a small subset of 25 Intel opcodes, then a new code generator should not be too complex, but you could consider coding that small part in C, to make it more easily core-portable.
Treat C as a high level assembler.
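To make that suggestion concrete, here is roughly what "C as a high-level assembler" tends to look like as a compiler target. This is a generic sketch, not output from the CAL: globals stand in for registers, gotos stand in for jumps, and each generated statement maps onto one or two machine instructions on any platform that has a C compiler.

#include <stdint.h>

/* The generated program's "registers" live in globals (or a struct). */
static int32_t r0, r1, r2;
static int32_t memory[4096];               /* the program's flat data space */

/* One generated C function per source routine; gotos stand in for jumps. */
static void generated_routine(void)
{
    r0 = 0;                                /*   move 0 into r0     */
    r1 = 10;                               /*   move 10 into r1    */
loop:
    memory[r0] = r0 * r0;                  /*   store r0*r0        */
    r0 = r0 + 1;                           /*   increment          */
    if (r0 < r1) goto loop;                /*   conditional jump   */
    r2 = memory[3];                        /*   load a result      */
    (void)r2;
}

int main(void) { generated_routine(); return 0; }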
I don't know, Gerry, every suggestion we come up with for getting more exposure for Plain English, and hence perhaps users, is met with show-stopping reasons why it cannot be done. Something has to give. Else it is a dead end.
I do appreciate all your arguments for simplicity, direct compilation, and so on. Very much so.
I also admire the whole effort. I know nothing much of compiler technology or writing compilers. I was totally amazed at myself a few years back when I managed to get a simple C/Pascal-like language to generate x86 and then Propeller assembler. It's posted around here some place, not actually very useful, mind.
Now, if Plain English would compile to C in the same way that would be great.
Now you got me, @Heater.
Did this really run on the Propeller?
As far as I can follow, GnuCOBOL (as it is called now; OpenCOBOL was renamed after joining the GNU project) is based on dynamic linking, and PropGCC does not support this.
Ah, sorry, perhaps I misunderstood. I thought the idea was that students of Plain English were not supposed to have to worry about all that messy machine code stuff.
The user of Plain English shouldn't have to think about machine code. The student of Plain English, who wants to know how the thing works (or how, in general, to write a compiler) does.
Perhaps [they'll be less impressed with a non-standalone program]. But then again, I thought the idea was that Plain English enabled them to do stuff without worrying about all that.
We want Plain English to be suitable for both the beginner (writing "Hello, World!") and the expert (writing a complete IDE or a wysiwyg page-layout application) -- so nothing is abandoned or suddenly changes as the student makes step-by-step progress from beginner to expert.
Besides, as it stands Plain English cannot stand alone, it needs Windows.
I think the distinction between an operating system (like Windows) and an application program (like an IDE) is pretty well established. Granted, a system where both the operating system and the application programs were written in Plain English would be ideal. Unfortunately, that hasn't yet proven feasible because of the lack of standardization in the hardware arena. Even Microsoft can't keep up -- they, too, have to depend on the vendors of various printers, displays, etc, to code up the drivers. It's a mess.
I'm not sure I would call [passing parameters of different types different ways] a "defect". Seems eminently sensible to me. From the JS programmers perspective it hardly notices.
I called it a defect because it both complicates the compiler and makes the language harder to learn. Our approach is simple and natural: all parameters are passed by reference. JavaScript's approach is convoluted -- just google "javascript value reference" to see a list of nearly 49 million articles that attempt to explain it.
Anyway, when it comes to transpiling other languages to JS I'm sure that is not an issue. After all, one can compile C and C++ to JS. As you know, in those languages you can pass by value or by reference. It all works out. And runs nearly as fast as compiling to native code. See Emscripten and asm.js.
Translating C to Javascript is easier because C is more like Javascript to begin with; for example, both languages pass primitive types by value so the problem we have doesn't come up. And such translations are not complete. Using straight C I can access the complete file system on a machine; using C translated to Javascript I cannot. So it turns out you really can't compile C and C++ to Javascript -- unless you're willing to lose major functionality in the process.
Now, the wysiwyg thing is a bit of a problem. Certainly Javascript does not expect anything in that regard, it's just a programming language. It's not Javascript's fault that the programming environment inside a browser is so awful, what with all that crappy HTML and CSS and the hideous DOM API. I agree it's a nightmare.
It may not be Javascript's fault, but inserting Javascript into our process drags all that along with it. I don't see how that's an improvement. But I think the reason we're talking past each other is now clear. You seem to think "larger audience" is the goal, when in fact the overriding goal is "more (or at least equivalent) functionality with fewer pieces."
One approach is to ignore it, go around it. Reduce all your HTML and CSS to the bare minimum. Get yourself a canvas and draw on it. Then you can put everything exactly where you like it.
My son actually tried to do that when he first started his Javascript-based business. Didn't work; apparently there are too many quirks and shortcomings to make it both fully functional and reliable on all devices.
Pretty much what happens when the Unity game engine is running in the browser. Or Unreal.
Yes, products like that seem to have accomplished a similar goal. But they really haven't since they're primarily screen-only output-oriented programs. They're not editors; they don't print in a wysiwyg fashion; etc.
If you are using a small subset of 25 Intel opcodes, then a new code generator should not be too complex...
It's not the code generator that's the problem. It's the "standard library" that is difficult to port because analogs of the functions readily available in all versions of Windows do not always exist on other platforms. As bad as the Windows core is, one has to admit that it's both stable and complete.
...but you could consider coding that small part in C, to make it more easily core-portable. Treat C as a high level assembler.
I get the idea, I just don't like the thought of inserting C (or anything else) into a system that works just fine without it. Why add more parts? So the system will run on 1% more systems than the 85% of desktops it already runs on? And accomplishing even that is questionable, since it's not the "core" of the thing that's difficult to port; it's all the rest.
...and when doing the new code generator, look at doing both ARM and P2.
Aaaaarg! Why does everyone want us to do our work twice? No, if we go this route, we'll place our bets on the single system that we think will dominate the marketplace.
I don't know, Gerry, every suggestion we come up with for getting more exposure for Plain English, and hence perhaps users, is met with show-stopping reasons why it cannot be done. Something has to give. Else it is a dead end.
Yes, I suppose I do sound like an ornery old dinosaur. But unnecessary complexity is a creeping thing -- it sneaks in little by little -- and it's only extreme vigilance that keeps it out. When we were coding our system, our first task every day was to delete at least ten lines of code -- without reducing functionality or performance. On good days we'd have an epiphany and delete a hundred. So what may seem like orneriness is actually just a stubborn refusal to get less (beauty, functionality, performance) with more parts; or, to put it more positively, a determined persistence to get more (beauty, functionality, performance) with less stuff.
...and when doing the new code generator, look at doing both ARM and P2.
Aaaaarg! Why does everyone want us to do our work twice?
Hehe, now it seems you grasp why I suggested C as the common 'high level ASM' code generator...
( or, make your own 'high level ASM', but I would try the C avenue first )
You have said the code generator is small; now is the time to make it a little more portable.
If you just use 25 opcodes in your ASM output, they could easily be emulated in a COG. Just tell @Heater that it is impossible to do so. He has written a lot of emulator code -- 8080, Z80, and others, I think.
The Windows API, on the other hand, is nothing @Heater would touch. For some reason @Heater dislikes everything Microsoft does. Even I had starting problems with him, because my username here starts with MS. But I won't change my first and last name to avoid MS.
Over the years I found out that it is quite fun to read his posts and (for a short time) to work together with him on some insane stupid project.
But with an emulator running your existing opcodes and some cog drivers doing keyboard, mouse, and VGA, you might be able to build a minimal runtime to support what is needed by your current code generator.
Not to run the IDE (yet), but to run the created programs.
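The core of such an emulator is just a fetch/decode/execute loop. Here is a bare-bones sketch in C; the opcode set and three-byte encoding are invented for illustration (a real one would mirror whatever subset of x86 the CAL actually emits):

#include <stdint.h>

/* Invented three-byte instruction format: opcode, register index, operand. */
enum { OP_HALT, OP_LOADI, OP_ADD, OP_STORE, OP_JNZ };

static int32_t reg[8];
static int32_t data[256];

static void run(const uint8_t *code)
{
    uint32_t pc = 0;
    for (;;) {
        uint8_t op = code[pc], a = code[pc + 1], b = code[pc + 2];
        pc += 3;
        switch (op) {
        case OP_LOADI: reg[a]  = (int8_t)b;    break;   /* load a small signed immediate   */
        case OP_ADD:   reg[a] += reg[b];       break;   /* register-to-register add        */
        case OP_STORE: data[b] = reg[a];       break;   /* write a register to data space  */
        case OP_JNZ:   if (reg[a]) pc = b * 3; break;   /* branch to instruction number b  */
        case OP_HALT:  return;
        }
    }
}

int main(void)
{
    /* Count r0 down from 3 to 0, then store it: a six-instruction test program. */
    static const uint8_t prog[] = {
        OP_LOADI, 0, 3,              /* 0: r0 = 3             */
        OP_LOADI, 1, (uint8_t)-1,    /* 1: r1 = -1            */
        OP_ADD,   0, 1,              /* 2: r0 += r1           */
        OP_JNZ,   0, 2,              /* 3: if r0 != 0 goto 2  */
        OP_STORE, 0, 0,              /* 4: data[0] = r0       */
        OP_HALT,  0, 0,              /* 5: stop               */
    };
    run(prog);
    return 0;
}

A dispatch loop like this is easy to port; the real work is the runtime library underneath it, as noted elsewhere in the thread.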
As for the Linux/Win problem, I would like you to look at FASM, a very nice macro assembler. On their website are a couple of nice projects to unify Linux/Win development done in assembler. It is like a layer of common functions.
You would still be bound to x86, but Win and Linux would both be possible.
And for the RasPi -- there is a Windows 10 out for the RasPi, for free as I remember. Convert your 25 used opcodes to ARM and run Windows 10 on the RasPis.
Comments
I only wish it were "bigger" -- especially that it had more memory. I think a "desktop" computer built on it would be a beautiful thing, with one COG to manage each physical device (say, a keyboard, a mouse, a display, a printer, a file system, and a network connection) with the remaining COGs running the memory manager, the user interface and the current application. Whoops! Ran out of COGs there. But elegant, in theory, in any case. No interrupts and predictable behavior throughout. A real programmer's dream. In fact, our original two-page manifesto imagined a future where such a deterministic system might become a reality. See the second paragraph here:
osmosian.com/manifesto.pdf
But alas, we missed our target date, and unless I'm missing something, we're going to need a Propeller with way more memory to make that happen. After all, shouldn't a 32-bit machine have 32-bit addresses and 4 gigs of memory to go with them?
If only there was a way...
The Propeller is a micro-controller, not a desktop machine.
It's quite a weird and unique micro-controller:
What with having 32-bit processors at a time when most such micro-controllers were still 8-bitters. 32 bits makes the normal maths people do much easier, and faster.
What with having 8 of those processors, so that up to 8 independent tasks can be going on with perfect timing isolation; they all execute deterministically. Rather hard to do on a normal single-core micro-controller. It has the great value that adding a new task, possibly written by someone else, cannot upset the timing of anything else you already have running in your program. This is magic and makes sharing code and reusing code very easy. See OBEX on the Parallax site.
What with not having interrupts. If you have cores you don't need interrupts to schedule things. This makes programming your little tasks very much easier and again makes integrating, mixing and matching, code from elsewhere trivial.
What with only being able to execute 512 instructions in each COG at full speed. A necessary consequence of the deterministic architecture really.
You may not be aware, but the Propeller design is the vision of one man, Chip Gracey, who set out to build a really simple to use but powerful micro-controller in his own way. Chip has some strong views on the complexity of modern systems, from micro-controllers upwards. Which includes creating his own programming language, Spin, for the Propeller.
Rather like you and your vision for a programming language.
Now, I love and respect such ingenious, radical, independent thinking. Wish I could do it myself. But such monastic visions seem to have a down side. The resulting thing is so weird most people don't know what to do with it. For example, a typical micro-controller user will look at the Propeller for two minutes and dismiss it. They will see that it has no interrupts; that's obviously useless. They will see it has such a small native execution space; that's obviously useless. They will see it has no built-in peripherals, UART and so on; that is obviously useless. They will see that you cannot program it in C, you have to use this weird Spin language; that is obviously useless. All the while they will not see what the Propeller actually is and how it offers very simple solutions to some problems they have. Problems that are exacerbated by the MCUs they are trying to use.
(Yes, we can use C for the Prop now, but for many years that was not the case. In fact I am describing the thoughts that went through my mind on discovering the Propeller years ago. Thoughts that did indeed make me skip over it for quite some time. Till I got curious...)
Oh, but I'm rambling again...
Anyway, I'm still puzzling over how you can bootstrap the Plain English system. I mean, what if all the binary executables of the CAL-XXX program somehow disappeared and all you had left was the source text as I see it on github, how would you go about getting Plain English working again?
And 4 gigs of memory with linear 32-bit addresses makes everything else much easier and faster. So near and yet so far!
Yes. A true architectural advance. Stroke of genius. And great (as I've described) not only for micro-controller applications but for desktop systems as well. One COG for a print spooler; one for communications; one for the display, etc, etc.
Yes. Ditto.
Indeed. Chip and I should get together and figure out how to make a Propeller with 4 gigs of memory.
"First they ignore you, then they laugh at you, then they fight you, then you win." - Mahatma Gandhi
I would think (even hope) that addition broke Chip's heart.
It happens.
The same way someone would get C working again if all the C executables disappeared. We'd use some other language to create a very minimal CAL-1000 that could edit text and re-compile itself, then use that for the remainder of the development.
But we're not in that situation, so things are easier. To port the CAL to another suitable system, all we have to do is cross-compile and modify the CAL's Noodle (standard routine library) -- create a version of the CAL (using the current CAL) that puts out the kind of EXE the new hardware needs, and make appropriate changes to what used to be calls to Windows functions.
Your handle and the little picture that goes with it makes me think you're familiar with tube amplifiers. If so, you might get a kick out of this (it's a PDF; you'll have to download it from filedropper):
filedropper.com/5c1documentationfinalcompressed
If you're not into such things, just pretend this post never happened.
I'm no audiophile or guitar effects man. I learned my first electronics in my early teens from a friend's father who was a TV transmitter builder. So it was all tubes; he had hundreds of them in his junk box. Used to do experiments and build HAM radio gear with tubes way back then.
That link shows someone putting a lot of time and effort into building a nice looking amplifier. That's not really my style. I only have one tube device in the house nowadays. It would make tube purists mad. It has a 12AX7-based pre-amplifier stage driving a bunch of massive BUZ-whatever MOSFETs. Good for about 100 watts. It's constructed on a plank. The interconnects are sort of wire wrapped and soldered to brass nails hammered into the plank. Hmmm... I should rebuild that on a proper nice looking breadboard.
Somewhere down in the cellar I have some old tube AM and FM radios. Some work. Some waiting to be fixed.
My prized possession is a huge VCR97 6 inch diameter cathode ray tube as used in aircraft radar in WWII. That is still waiting for a 1000v power supply to see if it still lights up.
http://www.r-type.org/exhib/aaj0166.htm
I'm trying to.
I've implemented an IBM 1130 emulator on the Propeller, which could run COBOL (circa 1968) along with RPG I, Fortran IV, Assembly, and SL/1 (a variation of PL/1)...
...except that the COBOL compiler was sold as an "IBM licensed program product" and the people with the source code for the compiler don't want to step on IBM's toes. Frankly, I don't think that IBM gives a hoot about licensing something from 50 years ago that runs only on hardware not manufactured in decades.
This is frustrating to me because I have several programs that I'd like to run again (just for fun).
The emulator could also run APL but my implementation doesn't support the IBM Selectric element (typeball) with the APL character set that was used for the console printer.
Walter
Actually, it would be both a lousy micro-controller and a not-so-great desktop. IBM's System 360 is a good historical example of a computer architecture that successfully spanned that kind of range of sizes and speeds. They did it by fooling everyone: emulating the System 360 instruction set on the low-end models, using hardware not too different from other microcomputers of the time, while using state-of-the-art supercomputer hardware for the high-end models.
There's a Propeller 2 in development with lots of new features including more (16) and faster processors and more shared memory (512K bytes) along with the ability to execute directly from the shared memory. There are other features to help with I/O and signal processing (analog and digital ... better video too). There's an FPGA version of the Propeller 2 available here ... a work in progress ... with silicon expected sometime soon. The native Propeller instruction set is not suited for larger memories, but an interpreter for a suitable instruction set should be easy enough and could support software demand paging into shared memory for a 4GB external address space with good performance.
C was added to the supported language list for the Propeller because of demand from Parallax's education customers as well as the interest and efforts of several members of the user community who did most of the work.
The Propeller was so much better than the Basic Stamp that I gave my Basic Stamp away.
The Propeller-based machine was much cheaper. My camera slider required an LCD, a keypad, and a stepper. I could drive all of them in software instead of buying breakout boards for each accessory, and it simplified my circuits.
One of the most amazing Propeller apps I've ever seen was one, written by Phil Pilgrim, that responded to several spoken commands. The only hardware requirement is a mic.
If you are really weird that is.
Now, if Plain English would compile to C in the same way that would be great.
English > Compiler > Executable
...is preferable to a system that requires an artificial intermediate language and an extra step:
English > Translator > C > Compiler > Executable
In the former system, on the developer's end, there's less to design, less to code, less to document; and on the student's end, there's less to take in, less to digest, etc.
Of course the rub is that these two systems are not quite equivalent. The latter system, it will be argued, can produce executables that run on different kinds of machines. But there's a bit of sleight-of-hand going on there. Let's get into a little more detail. The former system really looks like this:
English + Library > Compiler > Executable
And the latter system really looks like this:
English + Library > Translator > C + Library > Compiler > Executable
To support different hardware configurations, both of these systems need to be expanded. The former:
English + Lib for X > Compiler for X > Exe for X
English + Lib for Y > Compiler for Y > Exe for Y
English + Lib for Z > Compiler for Z > Exe for Z
And the latter:
English + Lib for X > Translator > C + Lib for X > Compiler for X > Exe for X
English + Lib for Y > Translator > C + Lib for Y > Compiler for Y > Exe for Y
English + Lib for Z > Translator > C + Lib for Z > Compiler for Z > Exe for Z
And the former approach still wins. The latter approach might work if we reduced our program to a text-only shadow of its original self; but the "common intermediate" that you're imagining doesn't really exist when one considers everything that our system needs and does.
So it's:
English + Library > Translator > Javascript + web browser environment.
Only one translation step. Runs everywhere. An audience and user base in billions. Job done.
But in general I'm with Antoine de Saint-Exupery, because if you have N different languages and M different targets then you have N*M translators to write. Or you can write translators from the N languages to some common thing and then write M translators from the common thing to the M targets. That is only N + M translators to write. Much simpler and more efficient.
The COBOL example above shows this. COBOL to C. C to anywhere.
Here it is:
obex.parallax.com/object/542
I've actually got about half of an English to Javascript translator written. It's not as easy as it sounds. Here's a typical difficulty: Plain English passes all parameters by reference; Javascript passes simple types by value and other types by reference. We got around this defect in Javascript by making simple types Javascript arrays, but took both a complexity hit and a performance hit for doing so.
And here's something that's pretty much impossible to overcome: Plain English assumes wysiwyg graphics; Javascript does not. So it's very difficult to port our page-layout facility with any degree of integrity. Case in point: our reference manual is written in such a way that each topic fits, exactly, on a single page. To make that happen, we need to know how the paragraphs are going to wrap -- exactly what fonts are being used and exactly how they will be rendered both on the screen (for the author) and on the printer (for the reader). Javascript cannot provide us with the needed information.
True. IF, and it's a big IF, you can find that common intermediate. Show me how, using only standard C libraries common to "anywhere," one could write a page editor like the one included in our IDE. It's easy to make something run anywhere if you delete most of that something before you start!
Non-sampled vocal synthesis is a real trick. The best program I've found is Cantor, and even that only goes so far:
virsyn.de/en/E_Products/E_CANTOR_3/e_cantor_3.html
The sample snippet at the bottom, "I Can Do All Things through Christ", is my attempt at using it.
Anyway, when it comes to transpiling other languages to JS I'm sure that is not an issue. After all, one can compile C and C++ to JS. As you know, in those languages you can pass by value or by reference. It all works out. And runs nearly as fast as compiling to native code. See Emscripten and asm.js.
Now, the wysiwyg thing is a bit of a problem. Certainly Javascript does not expect anything in that regard, it's just a programming language. It's not Javascript's fault that the programming environment inside a browser is so awful, what with all that crappy HTML and CSS and the hideous DOM API. I agree it's a nightmare.
One approach is to ignore it, go around it. Reduce all your HTML and CSS to the bare minimum. Get yourself a canvas and draw on it. Then you can put everything exactly where you like it. Or perhaps use WebGL or SVG.
Pretty much what happens when the Unity game engine is running in the browser. Or Unreal.
Whether it 'wins' depends on your code generator.
Before something can claim a win, it needs to first exist.
If you are using a small subset of 25 Intel opcodes, then a new code generator should not be too complex, but you could consider coding that small part in C, to make it more easily core-portable.
Treat C as a high level assembler.
I do appreciate all your arguments for simplicity, direct compilation, and so on. Very much so.
I also admire the whole effort. I know nothing much of compiler technology or writing compilers. I was totally amazed at myself a few years back when I managed to get a simple C/Pascal-like language to generate x86 and then Propeller assembler. It's posted around here some place, not actually very useful, mind.
So what to do?
I'd suggest targeting the RaspPi 3, bare metal, along the Ultibo lines, and when doing the new code generator, look at doing both ARM and P2.
Now you got me, @Heater.
Did this really run on the Propeller?
As far as I can follow, GnuCOBOL (as it is called now; OpenCOBOL was renamed after joining the GNU project) is based on dynamic linking, and PropGCC does not support this.
So how did you get around this problem?
VERY interested,
Mike
We want Plain English to be suitable for both the beginner (writing "Hello, World!") and the expert (writing a complete IDE or a wysiwyg page-layout application) -- so nothing is abandoned or suddenly changes as the student makes step-by-step progress from beginner to expert.
I think the distinction between an operating system (like Windows) and an application program (like an IDE) is pretty well established. Granted, a system where both the operating system and the application programs were written in Plain English would be ideal. Unfortunately, that hasn't yet proven feasible because of the lack of standardization in the hardware arena. Even Microsoft can't keep up -- they, too, have to depend on the vendors of various printers, displays, etc, to code up the drivers. It's a mess.
I called it a defect because it both complicates the compiler and makes the language harder to learn. Our approach is simple and natural: all parameters are passed by reference. JavaScript's approach is convoluted -- just google "javascript value reference" to see a list of nearly 49 million articles that attempt to explain it.
Translating C to Javascript is easier because C is more like Javascript to begin with; for example, both languages pass primitive types by value so the problem we have doesn't come up. And such translations are not complete. Using straight C I can access the complete file system on a machine; using C translated to Javascript I cannot. So it turns out you really can't compile C and C++ to Javascript -- unless you're willing to lose major functionality in the process.
It may not be Javascript's fault, but inserting Javascript into our process drags all that along with it. I don't see how that's an improvement. But I think the reason we're talking past each other is now clear. You seem to think "larger audience" is the goal, when in fact the overriding goal is "more (or at least equivalent) functionality with fewer pieces."
My son actually tried to do that when he first started his Javascript-based business. Didn't work; apparently there are too many quirks and shortcomings to make it both fully functional and reliable on all devices.
Yes, products like that seem to have accomplished a similar goal. But they really haven't since they're primarily screen-only output-oriented programs. They're not editors; they don't print in a wysiwyg fashion; etc.
I get the idea, I just don't like the thought of inserting C (or anything else) into a system that works just fine without it. Why add more parts? So the system will run on 1% more systems than the 85% of desktops it already runs on? And accomplishing even that is questionable, since it's not the "core" of the thing that's difficult to port; it's all the rest.
Aaaaarg! Why does everyone want us to do our work twice? No, if we go this route, we'll place our bets on the single system that we think will dominate the marketplace.
I believe you do.
Thinking...
( or, make your own 'high level ASM', but I would try the C avenue first )
You have said the code generator is small; now is the time to make it a little more portable.
There is no 'single system' in Microcontroller land, and hardly even a single system in Raspberry Pi land.
Portability matters; without it, you are going to be a curiosity.
Lazarus/FPC just manages that, with the new Bare Metal work on Pi.
The Windows API, on the other hand, is nothing @Heater would touch. For some reason @Heater dislikes everything Microsoft does. Even I had starting problems with him, because my username here starts with MS. But I won't change my first and last name to avoid MS.
Over the years I found out that it is quite fun to read his posts and (for a short time) to work together with him on some insane stupid project.
But with an emulator running your existing opcodes and some cog drivers doing keyboard, mouse, and VGA, you might be able to build a minimal runtime to support what is needed by your current code generator.
Not to run the IDE (yet), but to run the created programs.
As for the Linux/Win problem, I would like you to look at FASM, a very nice macro assembler. On their website are a couple of nice projects to unify Linux/Win development done in assembler. It is like a layer of common functions.
You would still be bound to x86, but Win and Linux would both be possible.
And for the RasPi -- there is a Windows 10 out for the RasPi, for free as I remember. Convert your 25 used opcodes to ARM and run Windows 10 on the RasPis.
ducking now,
Mike