Man, I love that chip! Spent a ton of time programming it and the 6502. Yeah, the 6809 can be slow responding to something, but if there is more work to be done, it can complete that quickly.
Now, you can use it in more clever ways to cut down that cycle count. There is the full interrupt and a more limited one: the full IRQ stacks the entire register set, while the fast interrupt (FIRQ) stacks only the program counter and condition codes. I'm not where I can go read my Moto programmer's reference, but the key is the saving of the entire CPU state as opposed to just enough to respond.
A 6502 can respond much faster, for example, but then again, it's a much simpler chip too. And it's got the same choice really, but the decision is more about software.
Both of those chips were considered "fast" relative to RAM speeds at the time, and in DMA based systems, the higher clock speeds were balanced out by the CPU waiting. In the more clever design I mention below, the higher clock meant a much faster CPU, but better, faster RAM was needed too. More expensive.
The differences play out in notable ways. If one is wanting fast code to respond to events, a 6502 can get the jump on the 6809. However, if one wants to really pack a lot into the program space and write more powerful programs per byte, the 6809 is very difficult to beat in 8 bit land.
Both chips access memory fairly quickly. The key to running both at very nice speeds is to use fast RAM and give the CPU access to it on one clock phase, leaving the system bus free on the other phase for other things, such as DMA, video, etc. Or one needs to go cycle stealing and work accesses in when both chips are busy.

The 6809 can go off for a lot of cycles on one instruction. Some take, what, 14+ cycles for something like LDD ($FEED),X++. A 6502 takes a few instructions to do that, but it can do it in a similar amount of time. Since it executes a few instructions, it can be interrupted in less time than the 6809 can while chewing on that big, nice, luxury instruction, but given the higher code density and overall speed of getting things done, the two may well show very similar response times if the task requires much more than an instruction or two to complete.
Abusing both chips can squeeze a ton of speed out of them too. The nice stack capabilities in the 6809 can be used to block copy memory very quickly, for example. Using it that way is kind of backward, but it's quick! The 6502 can use illegal ops and self-modifying code to get similar kinds of things done.
I found the overall programming experience on the 6809 pure fun. Great 8 bit assembly language. With two stacks, two accumulators that could be combined to form a 16 bit one, a multiply instruction, auto increment / decrement, tons of great addressing modes, and full 16 bit index registers, that thing could do a ton in 64K and, coupled with a nice MMU, run big programs quickly. The 6502 is harder work, but simpler in terms of its overall instruction set, and it's really fast at responding, taking some brief action, and returning to proceed as normal.
There was a project completed to drop a 6809 into the Atari computers, which use the 6502. We haven't seen a lot of code yet, but I think the slower response times are going to make some "racing the beam" tricks harder and may take a few tight-cycle ones off the table entirely. On the other hand, being able to run more compact programs and compute larger values more easily means the machine could have a very different character too. I won't know, as I won't be tinkering with that mess, but somebody will. So I'll watch with interest as that "what if it had this CPU?" question gets answered so many years later.
"slow" means very different things. If you set aside the response times for software interrupts and plan out the hardware ones, a 6809 can respond very well. Other wise, it's probably the fastest and most powerful of the 8 bit chips in most things due to the powerful instructions and very flexible addressing modes. A total joy to program in assembly language.
And there is a great example of competing designs. On most technical merits, the 6809 is a powerful chip, best in class. The 6502 is kind of simple by comparison, but it got used all over the place and remains in use today. I submit that at the time those chips were made, we didn't know how to make the best 8 bit chip, but we did know how to make pretty great 8 bit chips; the Z80, 6502, and 6809 were all great chips. What made them great? Well, one needs to go and look at the voodoo under the hood to understand that and build better ones.
It's no different today. If one goes looking, one can be a Chip type, who says just about anything anybody wants to know about chip design is on the Internet today. Or one might be someone who appreciates the work a Chip type of person does.
Seems to me, if one knows some basic machine language, not even assembly language, then it's possible to build a Forth on the target and then have that Forth build a much better one. Knowing Forth really means knowing how to build a Forth IMHO.
"Obtuse"? Perhaps. Or perhaps I just have a view of things from a different angle.
Ada could not be rejected from all quarters of the community because 95% of the programming community never knew...
It might be true that 95% of current programmers don't know, but as far as I recall, during the early days of the micro-computer revolution there were far fewer programmers than today, and Ada was on the table for them along with C and Pascal and a bunch of others.
Strangely enough, Wikipedia's article on Ada directly contradicts you: "Ada attracted much attention from the programming community as a whole during its early days." I certainly recall that it was talked about all over. It was known.
For whatever reason it did not hold the community's attention. Perhaps the compilers were terrible; that was certainly true of one I remember evaluating for 8086 machines in a military project in the early 1980s. Perhaps they were too expensive; that "few hundred dollars" was a lot at the time (not to the military, but to the wider world). Perhaps it was because Ada is sort of a continuation of the Pascal theme, which also proved unpopular.
Ada was a specific language for a specific purpose,
Not as far as I can see. My impression was that it was designed as a general purpose language to displace the myriad of languages that the DoD had in use. Certainly Ada aficionados have pitched it to me as a general purpose language over the years.
Nobody in their right minds would attempt to write a graphical operating system in Ada. Or a mobile phone app. Or a web site. Or a microwave oven controller.
I'm very curious as to why you say that. As far as I can tell Ada has all the features expected of a high level language in its class, comparable to C/C++, Pascal and so on. Certainly Jean Ichbiah saw it as a systems programming language, the sort of thing you would build operating systems from. There are now Ada bindings for Qt and OpenGL, so I guess someone thinks Ada has use for graphical apps. And why not the microwave oven controller? I would have thought that was Ada's perfect home.
And debugging and maintenance costs were massively reduced.
Having spent a couple of decades involved in military and avionics projects, I'm not totally convinced of that argument. Certainly when on the testing side of things I hadn't noticed that the defect rate of code thrown over the fence by programmers was any lower than with the other languages in use at the time. In fact I can point to at least two Ada projects that were totally screwed by the choice of Ada.
The amazingly low defect rate that is achieved on such projects is more due to the extensive up front specification and design. And the rigorous review and checking that goes on at all stages. Then there is the extensive testing. My observation is that the actual programming is a small part of all this effort and the language in use hardly matters.
Hardware manufacturers hated it...Software manufacturers hated it.
I believe you are right. At least when talking about the situation 30 years ago languages were a means of "lock-in" for vendors. Surprisingly standardization of languages has been going on very well since. Even MS managed to put together a decent C/C++ compiler, they cloned Java and JavaScript very well. Today the vendor lock in comes at a higher level, Windows or Mac, iOS or Android and so on.
But many software engineers loved it - not least because having once learned it, their skills were applicable to just about every company in the industry. They could move at will anywhere they liked and be productive the day they arrived.
Oh yeah. The happy band of freelance engineers I mixed with years ago certainly loved Ada for that reason. Not for the language as such but the continuous stream of lucrative contracts you could get with it. But they were clever guys, they did not care which language they used, and like the DoD the British military firms had many languages in use. To them the language in use was only a small detail.
The question is: "What 5 programming languages should every programmer know?"
The first and foremost programming language all programmers should know is assembler. Yes I know there are many machine architectures and no single "assembler language" but let's go with "assembler" in general.
At least that is my view this morning after having been watching a bunch of DEF-CON videos last night.
Why?
Well how else are we going to hack stuff?
Somehow we need to be able to create Java exploits, overflow those C buffers and so on to get control of our machines. How else do we root iOS or Android devices? How do we get around DRM? How do we crack eBooks, etc., etc.?
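For illustration, a minimal sketch of the kind of C buffer overflow being talked about (the function name and sizes are invented; the point is that the stack layout the compiler produces, which you only really see at the assembly level, is what turns a bug like this into an exploit):

    #include <string.h>

    /* Hypothetical example: a fixed-size buffer filled from
       attacker-controlled input with no length check. */
    void greet(const char *name)
    {
        char buf[16];          /* 16 bytes on the stack */
        strcpy(buf, name);     /* no bounds check: a longer name overwrites
                                  whatever the compiler placed after buf,
                                  typically saved registers and the
                                  return address */
    }

    int main(void)
    {
        /* 40 'A's into a 16-byte buffer: undefined behaviour, and the
           classic entry point for a crafted payload. */
        greet("AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA");
        return 0;
    }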
Increasingly computer device and media providers are trying to take control of all computing and lock us out of the loop. This can not be allowed to happen.
Assembler skills are a valuable tool that will be essential in the coming rebellion!
Is the DEF-CON paranoia getting to me? It seems to be getting to many; DEF-CON has grown to 15,000 visitors this year.
Strangely enough, Wikipedia's article on Ada directly contradicts you: "Ada attracted much attention from the programming community as a whole during its early days." I certainly recall that it was talked about all over. It was known.
That matches my recollection. Ada was talked about a great deal in that era. Lots and lots of discussion in all the trade magazines.
The reasons for assembly that I gave were the usual things: ability to do low level tasks, understand what the higher level languages boil down to, etc...
While I'd taken an undergrad course that touched on Fortran, Algol and APL (late 1960s), it was a big ho-hum sleepy trip to the black box of computer central. Around 1971 in grad school I took an assembly language class, taught on a PDP-7. Now, that was exciting. It is the magic of it that I emphasize. The PDP-7 had 18-bit words, of which 4 bits were instruction and 14 were memory address. It left me with a lasting impression that with only 16 instructions, the machine could build up a higher level language, solve differential equations, run a display, etc. So I would invert your statement, pedagogically, to "understand what higher level languages boil up from." This is from the standpoint of someone who is mostly hardware oriented, definitely not a professional programmer. That same year the 4004 came out.
Got the most interesting comment: Assembly language isn't a language, but more of a description language for a binary.
There can be no doubt that assembly and its tools are a big step of abstraction above the binary. One can be coding in assembly and not realize how different instructions are substantially the same machine code, one bit or flag different. One has to study the assembly breakdowns (which are always provided), or actually hand code a bit for the machine. There are words and a syntax to be learned. And alternative assembly languages, for example, comparing the Microchip native PIC asm with the one that Chip Gracey came up with.
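A concrete case of "substantially the same machine code, one bit or flag different": on the 6502, all the LDA variants share one operation field and differ only in the addressing-mode bits. A small sketch (in C rather than assembly, just to pull the bit fields apart; the opcodes are the documented ones):

    #include <stdio.h>

    /* 6502 "group one" opcodes have the form aaabbbcc with cc = 01:
       aaa selects the operation (LDA = 101), bbb the addressing mode. */
    int main(void)
    {
        const unsigned char lda[] = {
            0xA9,   /* LDA #immediate */
            0xA5,   /* LDA zeropage   */
            0xAD,   /* LDA absolute   */
            0xB1,   /* LDA (zp),Y     */
        };

        for (int i = 0; i < 4; i++) {
            unsigned op   = (lda[i] >> 5) & 7;   /* aaa: operation       */
            unsigned mode = (lda[i] >> 2) & 7;   /* bbb: addressing mode */
            unsigned grp  =  lda[i]       & 3;   /* cc: group            */
            printf("%02X  op=%u mode=%u group=%u\n", lda[i], op, mode, grp);
        }
        /* All four lines show op=5 (LDA) and group=1; only the mode
           field changes -- the "same" instruction in all but a few bits. */
        return 0;
    }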
Strangely enough, Wikipedia's article on Ada directly contradicts you: "Ada attracted much attention from the programming community as a whole during its early days." I certainly recall that it was talked about all over. It was known.
Attention, yes. Talked about, yes. Talked to death in fact. Mostly by people who knew very little more about it than its name and lineage, and who wrongly assumed that any initiative instigated by the Department of Defense must necessarily be a "bad thing".
However, your post also reminded me of a few other influential groups that contributed to the death of Ada:
Software and hardware consultants hated it. All their vaunted and highly overpriced advice about what software language or hardware architecture to use on a particular project became completely irrelevant, since the answer on software was always "Ada" and the answer on hardware was always "whatever is the cheapest available that will do the job" - and if you got that wrong you could change it later anyway.
Software tool vendors hated it. Most of their overpriced tools were designed to overcome specific language deficiencies that simply didn't exist in Ada (Ada has a few of its own, of course - but not many of them). Also, Ada had its own tools - but they were often free, since they were developed under contract to the DoD. In fact, Ada had its own Programming Support Environment specified, which performed the job of most such tools anyway.
Software "gurus" hated it. It deprived them of their guru status, which was often based on hoarding little bits of knowledge about various language "tricks" or "quirks" that are almost entirely absent in orthogonal languages such as Ada. In the place of gurus, Ada generated a community of "language lawyers" - who instead of hoarding their knowledge, would argue about it incessantly in public forums, until everyone understood all the nuances of all the features of the language. Amongst its practitioners, Ada is probably the best understood language ever used.
Anyone who wants to learn what is probably the best strongly-typed, object-oriented, modular, structured programming language ever devised specifically for real-time safety critical applications should study Ada.
The SPARK Ada initiative (a fully compatible subset of Ada, which restricts the use of certain non-provable Ada features) should also be mandatory for anyone who does real-time safety critical work. SPARK programs can be formally and unambiguously proven to satisfy safety or security constraints.
Which brings me to what may have been the final and most decisive contributor to the downfall of Ada. Ada compilers are written in Ada, which means an Ada system can be entirely bootstrapped from itself if required, and (with sufficient effort) proven to be free of the backdoors and other exploits so beloved of various individuals, companies and even government agencies.
Ada was just too successful for its own good. It had to go.
Could a single cog 4 port serial driver capable of 4 x 115Kbaud be written for the Propeller in Forth? Don't get me wrong, I think Forth has its place, but there are some things you need assembly for.
Yes, of course a single cog 4 port serial driver capable of 4 x 115Kbaud can be written for the Propeller in Forth. And if throughput started to choke when all four channels are maxed out, the bottlenecks could be optimized in assembler. As done in Propforth, for example. Forth is not separate from assembler; Forth is the intro to assembler. But it's optional, and so I can stubbornly avoid PASM, until the next release when the docs need to be written.
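Rough numbers behind why that works but gets tight, assuming the usual 80 MHz Propeller 1 clock (just a back-of-the-envelope budget sketch, not measurements from any particular driver):

    #include <stdio.h>

    int main(void)
    {
        /* Assumed figures: standard 80 MHz P1 system clock, 115200 baud. */
        const double sysclk = 80e6;
        const double baud   = 115200.0;
        const int    ports  = 4;

        double clocks_per_bit = sysclk / baud;           /* ~694 clocks */
        double per_channel    = clocks_per_bit / ports;  /* ~174 clocks to
                                                            sample or drive
                                                            each channel's
                                                            bit, worst case */
        printf("clocks per bit      : %.0f\n", clocks_per_bit);
        printf("per-channel budget  : %.0f\n", per_channel);
        return 0;
    }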
Nobody commented on my "forth + 1 more = the five languages one needs" joke. Need to add humor next time.
BASIC, IBM 360 assembler, FORTRAN, PL/1, x86 assembler, Pascal, C, FORTH, Perl, python; in that order
Without proper requirements, testing, equipment, and time, none of these is of much use.
"Dynamic environment, aggressive schedule" means "poorly managed, failure inevitable", regardless of the programming language.
By the way, who was that idiot that thought GOTO was a bad idea and led us to the worse nightmare of exceptions?
Niklaus Wirth?
BTW, nobody has mentioned Modula 2, which I actually liked at the time, even though it had all the features that I despise now. Give me a typeless language, with a choice of object orientation or not, regular expressions, and automatic garbage collection (!), and I'm a happy man.
____________________
These are the languages I've at least been exposed to (in chronological order) and how I would rate them (0 to 5 stars). Those which I've pursued over the course of a year or more or with more than superficial motivation are in boldface.
FORTRAN IV (**), IBM 1130/1800 assembly (***), BASIC (* to ***, depending on dialect), PDP7 assembly (***), PL/I (*), PL/C (zero), Snobol (**), IBM 360 assembly (**), PL360 (***), Z80 assembly (***), Z8 assembly (****), Ladder logic (zero), S8 assembly (****), Forth (***), Postscript (****), Modula 2 (**), Motorola 56000 assembly (**), PIC assembly (*** but only with Parallax mnemonics, * otherwise), Perl (*****), Javascript (***), 8051 assembly (*), SX assembly (***), AVR assembly (**), Spin (*****), PASM (*****), C (*)
'Very subjective, of course!
I'm not sure how I'd rank the programming languages I've used, but there are clearly threads. Algol-like is one of those. Lisp-like is another. Assembly language is one that varies with the instruction set and hardware, but there are common features across many different assembly languages, macro facilities being one.
Let's see ... a partial list: Basic, Fortran II/III/IV, PL/I, Snobol, Forth, Pascal, Modula2, Lisp, Smalltalk, AWK, Perl, Javascript, Spin, C, PL360, RPG, Cobol, Assembly languages for IBM 7094/360/1401/1440/1130, Univac I/1107/1108, PDP-1/5/8/11, Datapoint 2200/5500/6600, PIC, Z80, 8008, 8080, 8051, Propeller ASM. I learned Ada at one point ... sort of academically. I never wrote a program in it other than to try out some of the features.
There was quite a debate back and forth on jumps. The essence of it was that they were vastly overused, that there were good reasons to have them occasionally, but probably less often than you wanted and you really should be encouraged or forced to use them only under well controlled conditions. Partly as a challenge from a mentor, I wrote an operating system for the IBM360 using PL360 without any GOTOs except for one place where tasks were resumed from a queue using a "return from interrupt" instruction and interrupts themselves, both hardware triggered and software interrupts that worked somewhat like a subroutine call but using the interrupt mechanism. It wasn't anywhere as hard as I thought it would be. Later I repeated the effort on a Z80 where most of the OS was written in an extended Pascal with its very limited GOTOs.
In random order, all with varying degrees of proficiency:
FORTRAN, BASIC, Assemblers: 8080, COSMAC 1802, 6502, 6800, Z80, 6809, Forth, COBOL, RPGII, IBM370 assembler, UNIVAC 1100 Series assembler, PLUS (like PL/1), Sperry DCP Assembler, Mapper, Pascal, C, C++, Prolog, Lisp, Snobol, Perl, Ruby, Python, Java, Javascript (Node), PL/SQL, Spin, PASM, Groovy, Falcon, Factor, R, Octave, Smalltalk, Go.....I'm probably forgetting some.
I confuse myself daily with context switching!
The 1100 assembler was my most prolific. Python is my current favorite. RPG shouldn't count. Prolog was fun - wrote a small decision support system. Perl is fun (once we came to terms with each other). Java, C, C++ all involve too much typing - header files, code files, class definitions, class implementations, includes, imports, braces, braces and more braces, blah, blah, blah... Javascript and Node are interesting.
My real problem isn't learning the languages as much as the libraries - there are so many things in some of the standard libraries, it's hard to remember all the goodies that can really make the language sing.
I didn't read every reply on this thread, but the subject of assembly caught my interest enough to chime in. I loved it when working with the 6502. During my stint as a s/w quality assurance engineer many years ago, I learned that as computers got more complicated, programming got worse. It appeared that most people calling themselves programmers were simply writing modules. They had no idea what they were for or what they were supposed to do in the big picture; they only had to make sure their module accepted specific X or X's as inputs and put out specific Y or Y's as outputs. No one person writes a program anymore. Computers have so much memory and other storage space there is no need to be tidy with your programming, so current software is filled with junk and redundancies.

And the oddest thing I found with higher level languages is that the simpler the task, the more complicated it is to write, proportionally speaking: add 2 and 2 in assembly and in C and tell me which task was simpler. But I wouldn't want to program our modern tasks in assembly. I certainly always appreciated knowing every step the computer was taking to accomplish the task, not to mention the puzzle factor of trying to get something done with limits and making your code efficient.
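To put the "add 2 and 2" point in concrete form (a toy comparison; the 6502 sequence in the comment is just the usual CLC/ADC idiom, not lifted from any particular program):

    #include <stdio.h>

    /* On a 6502 the whole job is roughly:
           CLC
           LDA #2
           ADC #2
           STA result
       Four instructions, and you can see every step the CPU takes.
       The C version needs the include, main, printf and the whole
       compiler/runtime apparatus before the one interesting line runs. */
    int main(void)
    {
        int result = 2 + 2;
        printf("%d\n", result);
        return 0;
    }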
Personally, I think if modern software was programmed in assembly (assuming some one figured out how to stop time), our software would probably take up 1/4 of the space it does and probably wouldn't be so buggy. You have to be accountable with assembly.
I believe that one of the biggest sources of error in modern software, using C as an example, has its roots in malloc. That's one reason that I like languages that support autovivification and automatic garbage collection. Although garbage collection can severely impact determinism, the routines that support it and autovivification only have to be written and debugged once, by the language developer. As a consequence, their own uses of memory allocation and deallocation can be vetted thoroughly across an extremely large number of apps, rather than each app programmer having to verify their use of dynamic memory allocation on his/her own.
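A couple of invented-but-typical examples of the kind of manual-allocation mistake meant here; every C programmer has to avoid these by hand in every program, whereas a garbage-collected runtime only has to get the equivalent machinery right once:

    #include <stdlib.h>
    #include <string.h>

    /* Hypothetical examples of classic malloc misuse. */
    char *make_copy(const char *s)
    {
        char *p = malloc(strlen(s));   /* off by one: no room for the
                                          terminating '\0'              */
        strcpy(p, s);                  /* also unchecked for NULL, and
                                          nobody is clearly responsible
                                          for freeing it later          */
        return p;
    }

    int main(void)
    {
        char *q = make_copy("hello");
        free(q);
        free(q);                       /* double free: undefined behaviour
                                          and a favourite exploit vector  */
        return 0;
    }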
Languages I've been paid to program and/or have formal education in: C, C++, Java, Javascript, Pascal, Perl, Python, Spin, Tcl/Tk, BASIC, Visual Basic (notnet), VisualC#, bash, batch, csh, sh, awk, sed, AHDL, ASM for x86, x386, 6502, 68K, AVR, MIPS3, MIPS4, MIPS64, 8051, PPC8560, PASM.
I agree with Phil about malloc, and generally it should be avoided unless you have a good reason to use it. It's easy for someone who is not well versed to f-it up.
BASIC, 6502 machine then assembly language, LOGO, 6809 assembly language, PASCAL, C, a touch of x86 assembly language, PERL, TcL/TK, SPIN, PASM, Forth.
Of those, TcL, x86, PERL were brief, project oriented adventures, though I did do some system automation in PERL on IRIX that got involved. Forth was fun to explore, but not a language for me right now. Perhaps later. I was very intrigued and it did make me think a bit differently. I've got revisiting C and Jscript on my list.
So it's not enough to use Forth to know Forth, I need to write my own? It really is a mind virus.
No, you absolutely don't have to build your own, but Forth has a very interesting property: if you know enough Forth to write effective programs, you will know what the kernel of Forth does, how words are built, etc...
So then you can always build a Forth! And once that Forth is made (on anything, mind you; it doesn't have to be assembler), you can then bring your programs along for the ride. The only thing it takes is having some knowledge outside of Forth to build the first few words you need in order to define all the other ones.
I think Forth is neat that way. There is very little to Forth itself. Forth programs depend on that little bit to do what they do. As a language, it's tiny. Assembler like.
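To illustrate how little "Forth itself" is, here is a toy sketch in C (not any real Forth; the word set and names are made up): a data stack, a few primitive words, and a way to define new words as lists of existing ones is most of what the kernel needs before it can start bootstrapping.

    #include <stdio.h>

    /* Toy illustration, not a real Forth: primitives are C functions,
       and a "colon definition" is a NULL-terminated list of words. */
    typedef void (*word_fn)(void);

    static int stack[64];
    static int sp = 0;                         /* next free slot */

    static void push(int v) { stack[sp++] = v; }
    static int  pop(void)   { return stack[--sp]; }

    /* a handful of primitive words */
    static void w_dup(void) { int a = pop(); push(a); push(a); }
    static void w_add(void) { int b = pop(), a = pop(); push(a + b); }
    static void w_mul(void) { int b = pop(), a = pop(); push(a * b); }
    static void w_dot(void) { printf("%d ", pop()); }

    /* words defined in terms of other words */
    static void run(const word_fn *def) { while (*def) (*def++)(); }

    static const word_fn square_def[] = { w_dup, w_mul, NULL };  /* : SQUARE DUP * ; */
    static void w_square(void) { run(square_def); }

    static const word_fn demo_def[] = { w_square, w_dot, NULL }; /* : DEMO SQUARE . ; */

    int main(void)
    {
        push(7);
        run(demo_def);              /* prints 49 */
        push(3); push(4);
        w_add(); w_dot();           /* prints 7  */
        putchar('\n');
        return 0;
    }

A real Forth adds an outer interpreter and a dictionary so new words can be defined at run time, but the kernel is not a great deal more than this.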
Five variants of one language does not count
ALGOL, FORTRAN, C, PL/M, Coral, Pascal, Ada, Spin etc etc are all much the same in some "structured programming" world.
C++, Eiffel etc move us into the object oriented world.
Lisp, Scheme, Self, JavaScript etc etc take us into space a bit.
Then there is Prolog and such that nobody understands.
That's like saying SPIN is like Pascal (which you vehemently objected to at one time).
Perhaps not variants of the same language exactly but basically the same ideas.
Anyone who is familiar with one can very soon assimilate the other.
Basically they all express the 1970's ideas of "structured programming" that is: sequence, selection and iteration.
Throw in some ideas about arrays, structures and types and that sums up all of ALGOL, C, Pascal, PL/M, Coral, Ada, etc etc ad nauseam.
Yes, yes Pascal, for example has some fussy rules about types. And yes there is no "main" etc. But really it is all the same conceptually.
It's not until we get to Simula, C++, Eiffel, and yes, Object Pascal that we get some new ideas about program organization.
Meanwhile, languages like Lisp, Scheme and Prolog belong in a different universe.
Yeah, OK.
Thing is Spin is in no way like Pascal if you think about types for example.
Spin is also not like Pascal if you think about objects.
On the other hand, it's all just sequence, selection and iteration. Same old, same old 1960's ALGOL style "structured programming" ideas.
By the way, who was that idiot that thought GOTO was a bad idea and led us to the worse nightmare of exceptions?
Edsger Dijkstra: http://www.u.arizona.edu/~rubinson/copyright_violations/Go_To_Considered_Harmful.html
And yes, definitely one of those situations where the cure was worse than the disease.