What lured me into the Propeller chip is the easy interface to TV, mouse, and keyboard. I had always thought it would take hard programming and lots of hardware to do that, but now I can do it all with the Propeller.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Toys are microcontrolled.
Robots are microcontrolled.
I am microcontrolled.
Mike Green said...
ElectricAye,
I'm afraid you will be stuck at lust and longing for a long time.
Hi Mike,
Mind you, I'm not saying that lust is a bad thing. Lust has carried our species through wars, famine, plague and depressions for millennia, much as my Propeller lust has carried me from utter cluelessness to a working prototype. It's just that a nice comment now and then would maybe help me bridge that gulf between lust and love. I'm not asking for flowers and candy or for the OBEX to do my laundry; I just need some sweet non-nothings whispered into the code here and there.
Mike Green said...
Pick an object in the Object Exchange that either interests you or solves a problem that interests you and maybe has minimal documentation... Get it to work for you, then learn how it works by experimenting with it, modifying it, expanding it, then documenting it for others. You'll learn a lot and you'll help the community....
Yes, I've thought about that. But I'm guessing members of the community would evict me for littering their spartan OBEX objects with my wordy, often moronic, and sometimes DADAistic, if not surrealistic, commentary. Unfortunately, much of what makes the OBEX objects work is done in assembly language, of which I know next to nothing. And what little I know of the objects was derived from reverse engineering them, sticking in a value to see what happens. I would therefore have to scrub my code comments of the many obscenities with which it is seismically isolated. A monumental task that would be.
It's a nice thought, though. Thank you for having it.
Thanks, SRLM. It's refreshing to know that someone out there has shed the ancient shackles of limited memory allotments in embedded systems and has aspired to unite art and science with the epiphanies of technology. I'm looking forward to holding yours up as a shining example.
Mike Green said...
... The objects in the object exchange are mostly contributed by enthusiasts who are interested in sharing their efforts. It takes a lot of effort and time to document code well, even more to document it adequately for beginners to use. There's little incentive to put in extra time to document well unless you get paid for it (and chained to your desk until it gets done).
A good way to learn about the Propeller and both Spin and assembly is to learn how someone else designed and coded something...
Mike,
even professional developers - i.e. those who 'get paid for it and are chained to a desk' - don't often document their code well - unless it's an enforced policy. That said, I don't think it is a good idea to perpetuate the "oh, that's just the way it is because we don't have time" attitude. Once you have to maintain it - even your own code - it becomes quickly obvious how important documentation is. Sure, studying and tweaking others' code is really the *only* way to learn to substantially program anything. Yet simple comments make learning faster and easier.
A code repository, of ALL places, needs good commenting and docs. I submit to you the Perl and Python communities as shining examples of how to do it right.
Mind you, I'm not necessarily talking about all-encompassing documents, rather about simple things that either 1) self-document, or 2) have guiding comments, e.g.:
{{ GOOD - self-documenting code AND comments --- just code and write each line (see bottom of post too, please) }}
{ ... stuff omitted ... }
DAT
{Toggle P16}
        org     0               'Begin at Cog RAM addr 0
Toggle  mov     dira, Pin       'Set Pin to output
        mov     Time, cnt       'Calculate delay time
        add     Time, #9        'Set minimum delay here
:loop   waitcnt Time, Delay     'Wait
        xor     outa, Pin       'Toggle Pin
        jmp     #:loop          'Loop endlessly

Pin     long    |< 16           'Bit mask for P16
Delay   long    6_000_000       'Clock ticks between toggles
Time    res     1               'Workspace for system counter value
{ BAD - ah, the old a, b, c, x, y, z and t vars - oh, wait, what's "d" do? Darn, Doggy, Doo-doo --- oh, "Delay"! Oh, I get it }
DAT
        org     0
Toggle  mov     dira, 3
        mov     t, x
        add     t, #9
:here   waitcnt t, d
        xor     outa, 3
        jmp     #:here
{ UGLY - CountDracula, eh? ... must be the automatic blood draw and test routine }
{{ and, yes, I have seen code like that - one of our overseas developers *and his code-reviewing manager* loved Mister Rogers' Neighborhood --- he named his vars and routines after things in the show. When we inherited their 100,000 lines of C and C++ code, several people quit after looking at what they'd have to maintain.
The way I train programmers to code and comment is easy to do and implement:
Write the pseudo code as a comment before you code it ... as they tell you to do in English class, make an outline before you write. This will actually improve your coding and --- you won't believe me until you've done it for some months --- it will make coding *much* faster and less error prone.
Leave the pseudo code there as a comment once the code is in.
}}
:EndRant
mov Browser, NewLocation
' thanks
; Howard
// in Florida
/* */
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
No matter how much you push the envelope, it'll still be stationery.
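Howard's pseudo-code-first habit translates to any language. Here is a minimal C++ sketch (an invented example, not from the thread): the outline is written as comments first, then each comment line gets its code underneath and stays in place afterwards.

```cpp
#include <vector>
#include <cstddef>

// Pseudo code, written first and left in place as the comments:
//
//   average(samples):
//       if there are no samples, return 0   (avoid divide-by-zero)
//       add up every sample
//       return the sum divided by the count
//
double average(const std::vector<double>& samples) {
    // if there are no samples, return 0 (avoid divide-by-zero)
    if (samples.empty()) return 0.0;

    // add up every sample
    double sum = 0.0;
    for (double s : samples) sum += s;

    // return the sum divided by the count
    return sum / static_cast<double>(samples.size());
}
```

The outline doubles as the documentation once the code is in, which is exactly the "leave the pseudo code there" step.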
CounterRotatingProps said...
Write the pseudo code as a comment before you code it ... as they tell you to do in English class, make a outline before you write. This will actually improve your coding and --- you won't believe me until you do it some months --- it will make coding *much* faster, and less error prone.
Yep, I agree. That's the way that I've been trying to teach beginning C++ programmers this quarter: think about your code before you write it, and at the very least make a picture on paper. The programming part is the easy part: it's designing a program that is hard.
[...] The programming part is the easy part: it's designing a program that is hard.
Indeed - and if you can't *easily* explain the design to yourself in words, then it's a *bad* design before it's even been coded. Consider the Mars orbiter that was lost in 1999 because two different groups of engineers used two different standards of measurement, causing a failure in data transmission from the Lockheed team to the mission's navigators.
Perhaps the systems integration team would have noticed the potential deadly flaw if the offending code modules had ONE comment? (*)
;
; These routines use SI units
;
A $125 million comment?
(*) This is speculation by way of example - but it was a failure of communications, the transmission of which is always via hardware which uses code.
cheers,
Howard
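Howard's hypothetical SI comment can be made concrete. In the actual Mars Climate Orbiter loss, ground software produced thruster impulse data in pound-force seconds while the navigation software expected newton-seconds. A toy C++ illustration (function names invented; the real interface is not public) shows how one labeled conversion, with the unit comment Howard imagines, is the whole fix:

```cpp
// These routines use SI units: impulse in newton-seconds (N*s).
// Software reporting in pound-force seconds (lbf*s) must convert
// before calling them. One constant, one comment:
constexpr double NS_PER_LBFS = 4.44822;  // 1 lbf*s = 4.44822 N*s

// Hypothetical trajectory-model entry point: expects SI values.
double accumulate_impulse_Ns(double total_Ns, double delta_Ns) {
    return total_Ns + delta_Ns;
}

// Adapter for an imperial-unit producer. The famous bug was, in effect,
// skipping this conversion and feeding lbf*s values in directly,
// understating every impulse by a factor of about 4.45.
double accumulate_impulse_from_lbfs(double total_Ns, double delta_lbfs) {
    return accumulate_impulse_Ns(total_Ns, delta_lbfs * NS_PER_LBFS);
}
```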
Well said, Howard.
Now, this is pure speculation, but I'm guessing the reason so many programmers use "electronic grunt" (i.e. one-letter variable names, zero comments, etc) has to do with the legacy of limited computer memory. Nowadays, memory and storage are incredibly cheap compared to yesteryear, but old habits die hard. People are still stuck in the old-fashioned mode of whittling everything down to the boney bits. Maybe that was cool years ago, or maybe that flaunted some kind of superior intellectual efficiency. But today it looks like semaphore without the flags. In SPIN, I think variable names can be something like 30 characters long, which allows you to write almost a biography of the variable, or at least its resume. Mind you, my whining about this issue isn't directed at all those noble volunteers out there who share their sumptuous code, but it's aimed more at Parallax who could've easily provided, it seems to me, a few bread crumbs of commentary in their essential OBEX codes so that the rest of us could use them not only as Objects per se but also as learning tools.
In SPIN, I think variable names can be something like 30 characters long, which allows you to write almost a biography of the variable, or at least its resume.
make your variables/names THE comments. :)
CounterRotatingProps said...
Consider the Mars orbiter that was lost in 1999 because two different groups of engineers used two different standards of measurement, causing a failure in data transmission from the Lockheed team to the mission's navigators.
It was not an engineering fault at all... it was the aliens on Mars that shot it down for coming close to the Face... The engineering error was a NASA cover-up...

No... seriously... I am in total agreement... this is an amazingly good motivational example for students to learn the value of commenting... I love this example...

All the talk about them having to come back in a future time to read the code means nothing to them... Future?? What is that?? Come back and visit something I did a month ago????? Why, dude??? I have better things to do than revisit ANCIENT history!!!!! Oh, what it is to be young again... but then I never thought that way even when I WAS young...

BUT... I am sure you all know this already... Documentation is not just in CODE... It is EVEN MORE important when you do circuits... I had occasion to want to use a PCB I did 5 years ago and found myself looking for the schematics to figure out what is what... but even more than the schematics, I found myself going back to my LOG to find out why I did this or that...

When I design software or hardware, I keep a LOG... in addition to comments and pseudo code etc. ... this log has in it all the changes chronicled and comments for why this or that, what decision drove this to be that way rather than the other, and any errors found and why they occurred and how they were resolved... etc. etc.

Just like when you sail a boat you keep a daily log, so you should when you program or design a PCB or really any design process... Comments, Log, Documentation, and even old and superseded designs and schematics.

I love the Mars example... very motivational...

Samuel
@Chad George
"I do wish the eeprom was on-chip and secure"
I too lament the fact that the EEPROM is external; there should be at least 128KB on the silicon. But that is not a deal breaker.
EEPROM on the chip is still not secure; there are still ways of getting the data if it is worth much.
"I love not having to mess around with interrupts and constantly worrying if some new bit of code will mess up some timing critical old bit of code."
I created a program generator for the AVR that creates a skeletal C program containing all the interrupt code your program needs. It uses a set of questions to build the code and gives you the max time you can waste in each interrupt routine before you will get into trouble. It is used a lot here, a real time saver. I hope to write a few simple program generators for the Propeller... I love writing code generators, and they can save a lot of effort and are good learning tools if you also generate comments along with the code.
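A generator of that kind might be sketched like this in C++ (hypothetical names and output format; this is not the actual AVR tool described above): given a handler name and how often the interrupt fires, it emits a commented skeleton that records the timing budget right where the user will type their code.

```cpp
#include <string>

// Hypothetical sketch of a tiny ISR-skeleton generator. The emitted
// comments carry the timing budget, so the generated code documents itself.
std::string generate_isr_skeleton(const std::string& name, unsigned period_us) {
    std::string budget = std::to_string(period_us);
    return
        "/* Auto-generated skeleton for '" + name + "'.\n"
        "   This interrupt fires every " + budget + " us; the handler body\n"
        "   must finish well inside that budget or events will be lost. */\n"
        "void " + name + "_isr(void)\n"
        "{\n"
        "    /* user code here -- keep it under " + budget + " us */\n"
        "}\n";
}
```

The same pattern scales up: answers to a questionnaire become parameters, and every parameter is echoed back as a comment in the generated file.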
CounterRotatingProps said..."Write the pseudo code as a comment before you code it"
SRLM said.. "Yep, I agree. That's the way that I've been trying to teach beginning C++ programmers this quarter:"
@SRLM: A serious question: "What is the pseudo code for C++?"
Back in the day I worked in a team that coded in HEX because we had no assembler. That HEX was manually derived from ASM we had written. That ASM was derived from pseudo code (which looked like ALGOL) which we had designed and written first. All was well.
In later years pseudo code disappeared as we then had high level languages: PL/M, Pascal, C etc. But at least we had data-flow diagrams, structure charts etc. for the overall design.
Now I'm finding C++ code harder to read and understand than a lot of assembler code I've worked on! And I find myself wishing there was some "pseudo code" or design left behind by the authors to help. Getting the Qt library to do anything complicated is proving to be an uphill struggle.
So SRLM, as a teacher of C++, what is the "pseudo code" I'm looking for? Perhaps UML is the thing, but looking at those diagrams for anything complicated makes my head spin.
Do you have a recommended reading for the "design and documentation part" in a C++ project?
To keep this on topic: Spin's idea of "objects" does actually seem to help keep code under control. Witness the ease of use of stuff from OBEX.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
For me, the past is not over yet.
Don't forget that the C language was originally intended to be simply a 'platform independent' assembly language. That's why the language has such nice low-level features.
C++ is therefore a hybrid language that tries to bolt a very high level of abstraction ... onto assembly language.
The result works about as well as bolting a rocket engine onto a skateboard. Yes, it can get you from A to B - but wear a crash helmet and watch out for the corners!
Ross.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Catalina - a FREE C compiler for the Propeller - see Catalina
heater said...
So SRLM, as teacher of C++, what is the "pseudo code" I'm looking for?
BTW, I've just finished my first quarter teaching C++ as part of a tutoring job (about 6-12 hours a week, depending on how hard the assignments are). I never realized how much work it takes just to prepare for a session.
I'm afraid I'm using "pseudo code" loosely. It's not "official" pseudo code (the language). Since program design isn't part of the course, I can't actually require them to use anything like real pseudo code. Mostly, the point that I hammered at was to write down all the steps of your program on paper before you step to the computer. My definition of pseudo code was the logical steps that make up a program, written in English (broken down into statements and expressions, with indentation to indicate what is tied to what). This worked for this course since it was introduction level (no classes or multithreading) and it was all procedural, so it's fairly straightforward to write out the steps.
heater said...
Now I'm finding C++ code harder to read and understand than a lot of assembler code I've worked on!
Documentation was probably the worst part of the students' programming. Almost nobody had comments, and many fell into the trap of bad/duplicate/similar variable names. One guy even named his functions after whatever came into his head at the moment (Matrix, bricks, clinton, ...). Those were always the people who got lost: the fewer comments they put in while writing the code, the more help they needed to fix the code. I heard "I'll put comments in later. Just show me how to fix the program" a bunch of times...
RossH: Point taken about C and C++. C is a wonderful thing, C++ might be too much of a good thing.
Seems to be this "object oriented" business causing the problem. In "normal" languages you can follow the execution of your code, through the ifs and elses around the loops occasionally taking time out in a function here and there. In object oriented code there tends to be thousands of little fragments of the total "action" you are trying to follow spread around thousands of methods, hidden away in constructors/destructors, some in the header files, some hidden away deep in a tree of inheritance. You end up having to understand the whole code before you can understand any small part of it. Trying to draw a line of execution through this forest is an exercise in madness.
Object orientedness was added to Ada with the same result, but worse. Last year I found tutorial examples of object oriented programming in Ada written by a professor in the States where he called the language "absurd" because of the weird way he had to go about getting his inheritance example working. Turned out he had totally misunderstood how you should do polymorphism in Ada. Not his fault; it is convoluted.
SRLM: Now you've got me: what is "official" pseudo code? Never heard of such a thing. As I said, we used a pseudo code that "looked like ALGOL". That's because ALGOL is a very nice language for describing algorithms. Of course, as we never compiled it, it was not rigorously correct, but it was enough to express the ideas we were implementing.
To keep this on topic: I love the Prop too. Spin and PASM don't give me all these headaches I have to deal with professionally.
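heater's point about the "action" fragmenting across a class hierarchy can be shown in miniature in a few lines of C++ (an invented example, not code from the thread): the loop at the bottom cannot be read on its own, because which `area()` body runs is decided elsewhere.

```cpp
#include <memory>
#include <vector>

// One logical operation -- totaling areas -- spread across a base class,
// two derived classes, and whatever code built the container.
struct Shape {
    virtual ~Shape() = default;
    virtual double area() const = 0;  // the body that runs is chosen at run time
};

struct Square : Shape {
    double side;
    explicit Square(double s) : side(s) {}
    double area() const override { return side * side; }
};

struct Rect : Shape {
    double w, h;
    Rect(double w, double h) : w(w), h(h) {}
    double area() const override { return w * h; }
};

// Reading this loop alone tells you nothing about WHICH area() executes;
// you must already know the whole hierarchy -- the "fragmentation" in miniature.
double total_area(const std::vector<std::unique_ptr<Shape>>& shapes) {
    double total = 0.0;
    for (const auto& s : shapes) total += s->area();
    return total;
}
```

With two classes this is tame; with hundreds, following the line of execution through the dispatch becomes the "exercise in madness" described above.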
What it really amounts to is that you can be a bad programmer in any language. You can program well in C++ - but most programmers don't - many of them do not even understand the language constructs they use every day.
Your Ada example is interesting - I love Ada (have I mentioned that before?) but it still gets a lot of bad press from people who never bothered to learn it properly - even when their job was supposedly to teach it. The early versions of Ada were as simple and orthogonal as Pascal (which itself originated as a teaching language based on Algol) but with all the low level expressive power of C - i.e. it incorporated machine level constructs that languages like Pascal deliberately tried to keep programmers away from. And Ada had an astonishingly simple but powerful multitasking kernel built right into the language. After years of fork(), setjmp(), longjmp(), P(), V(), signal() etc - plus all the obscure message passing techniques and message processing loops that single-threaded programs have to resort to - Ada was a breath of fresh air.
Ada's biggest problem was that it so nearly made it. For a while it looked like putting a whole slew of software companies and software consultants out of business - in the most lucrative end of the software business. There were two serious attempts to nobble Ada - the first one didn't succeed, but the second one did. The first was based on how 'complex' Ada was to learn, and how resource hungry it was to run. But Ada is a simpler language than C++ by a country mile - and it could run comfortably on machines that would still struggle to host a decent C++ development environment. I remember having a fully compliant Ada development environment on an 80286 PC - a PC that could barely run Windows 3.0, and would have struggled to run a C++ compiler.
Ada's final downfall (and you can speculate endlessly whether this was deliberate or accidental) was its ill-fated decision to jump on the 'object orientation' bandwagon. Even I would agree that object orientation in Ada is a bit of a shambles. It's not as idiotic as C++, but it's not particularly brilliant, either. But at least it's well documented and consistent, which C++ never was. I have literally dozens of C++ books littering my shelf, but only one Ada book - the Ada Language Reference Manual (LRM).
In summary, it's easy to be a bad programmer in C++, but you have to work a little harder to be a bad programmer in Ada.
[... soapbox.descend() ...]
I'm trying to think of a way to bring this back to the thread topic, but I can't - except to say that I wish there was an Ada compiler for the Prop (but now that I've nearly finished Catalina ....)
Ross.
P.S. And the 'official' pseudo code language for C++ has surely got to be UML - aka "Unusable Modelling Language".
heater said...
...
Seems to be this "object oriented" business causing the problem. In "normal" languages you can follow the execution of your code, through the ifs and elses around the loops occasionally taking time out in a function here and there. In object oriented code there tends to be thousands of little fragments of the total "action" you are trying to follow spread around thousands of methods, hidden away...
For me, SPIN was my first and only exposure to an object oriented language, and I know what you mean about the "fragmentation". However, such "fragmentation" would probably not be a problem if the objects were documented. My biggest gripe about items in the OBEX is that the lack of comments precludes a "black box" approach to using the objects, which is what I thought the philosophy of creating "objects" was supposed to be. There should be commentary that describes what the limitations of each object are, or at least tells us whether it returns an integer or a float, etc. Most humorous are the "demos" that do nothing but call up the undocumented object and provide a serene moment of "Gee, that's really cool," which is soon followed by an infernal slog of "But how do I get this to work for me?"
Black boxes can be great so long as you know what the plugs are!
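The kind of "plug list" being asked for might look like this (a hypothetical sensor object sketched in C++; the class, names, and numbers are all invented for illustration): every public member states its units, range, error value, and blocking behavior before you ever open the implementation.

```cpp
// Sketch of a documented "black box": the contract is stated up front,
// so a user never needs to read (or reverse engineer) the internals.
class TempSensor {
public:
    static constexpr int READ_ERROR = -9999;

    // Reads the sensor once.
    //   Returns: temperature in tenths of a degree Celsius (215 means 21.5 C).
    //            Valid range: -550 .. 1250. On bus failure: READ_ERROR.
    //   Blocking: may wait up to ~100 ms for a conversion (simulated here).
    int read_tenths_celsius() const { return simulated_reading; }

    int simulated_reading = 215;  // stands in for real hardware in this sketch
};
```

A few contract comments like these are cheap to write and are exactly what turns an OBEX-style object into a usable black box.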
UML seems to be totally useless for describing what goes on in the average event driven, object oriented system, except at some very high level.
@ElectricAye: Spin may well have "objects", but I don't think many would agree that it is an "object oriented" language; those are generally expected to possess encapsulation, inheritance and polymorphism.
Spin has "encapsulation" but lacks the other two. It's those other two that cause a lot of the "fragmentation" I was referring to and make following the execution path through a C++ program "an exercise in madness".
I have only used a few objects from OBEX, and perhaps they are lacking in documentation, but generally it seems a few minutes grokking the "humorous" demo code and checking for pin usage, clock settings, etc. is enough to get them going.
Pseudo-code is very useful for describing algorithms (at least one classic text on algorithms uses it) and I've found it quite easy to translate it into C or assembler.
Leon
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Amateur radio callsign: G1HSM
Suzuki SV1000S motorcycle
I was once berated by a professor here for saying that what I had was pseudo code. Apparently it's actually standardized and formalized. He said that this book in particular used "official" or real pseudo code.
Call me cynical if you want, but I am pretty sure that if you asked 100 professors about pseudo code you would discover at least 100 "official" versions of pseudo code.
I used an "official" pseudo code way back when writing software for a Collins 8400 computer that was running a 1401 emulator, which was running programs written in 1401 SPS. Official or not, I find outlining the program in pseudo code very helpful, and the pseudo code becomes the comments in the program which is a great help in documenting it.
> ... a way to being this back to the thread topic, but I can't - except to say that I wish there was an Ada compiler for the Prop (but now that I'm nearly finished Catalina .... )
Ross,
IMO, you're not really off track - our discussing Ada/C/C++ is appropriate because it helps us appreciate Spin and the Prop Tool IDE all that much more. The way the IDE lays out the color blocks for sections is a great help - and a clear nod to OOP concepts. And, although it might not be defined as a full-blown OO language (whatever that, like "pseudo code", is), Spin has very clean OO structuring. This is perhaps one of the reasons it's so (thankfully) easy to use! It was built from the ground up clean, not a C-becomes-OO-in-C++ hack, or Ada nearly killed when it became ADAOOP - er, ah, ADA-OOPS :-P
RE: Catalina - BTW thanks for all your effort - a huge undertaking!
> Ada lives on in VHDL, which is very popular.
Leon, funny you mention that - I've just started a serious look at VHDL and its variants - felt like I was putting on my old, well-worn Ada shoes. :)
At the risk of continuing the thread drifting... SMALLTALK anyone?
(It is still alive and well --- if it didn't have such a heavy footprint, it might be cool to have a tightly coupled PC<->SMALLTALK<->PROP environment.)
- Howard
Reading through the comments on documentation takes me back to the days when DoD was trying to force Ada on the computer world. Programming in Ada was like being a pleb at VMI: "You will have discipline". Back then the Army's battlefield computer of the future would have to crack the 100 MHz barrier. The beltway bandits padded their bids for using Ada so much that Uncle Sugar had to back down. Strong data typing and better comments would have prevented the Mars disaster. With today's hardware speed and memory, Sergeant York would not have had to be canceled (Division Air Defense Gun). The Navy's R2D2 and variants thereof are much better because of the DIVAD failure.
As the next generation of developers adopt Propeller type (massively parallel) technology it is going to render obsolete a lot of patented and copyrighted bodies of work. AMD has seen the light and already is marketing their quad processor PC.
When you start with pseudo code, your own code generation and debugging is so much faster that you've been paid back for the extra effort even before the release of version 1.0.
You are right, massively parallel is going to change everything soon.
That's one reason I'm going to start playing with the Propeller... to try and get my mind around multiple processors running at once... I'm too used to interrupts and a single processor.
Here is something I sent to someone in an email... I was still on a buzz from getting to attend a conference of computer scientists and geeks in CA (got to shake some famous people's hands).
To keep up with Moore's law they will have to go to lower V+, multi-core, 3D layering, and optical inter-chip data handling.
I bet in 15 years the CPU will be a cube-like thing with cooling pores, sitting in a tiny liquid-cooling container on a mini motherboard, connected electrically only to V+ and ground. V+ will likely be close to 0.7 V, and there will be no bus wires on the board... it will be pretty much all optical.
A cubical 3D CPU package will need to have cooling fluid slowly pumped through an intricate network of pores or channels to keep cool... at least this will be silent and cheap, since the pump could be very tiny and simple. The fluid would form a continuous loop through the CPU and a heat radiator of some sort. The CPU would likely come as an integrated package with pump, coolant and radiator, ready to stick onto a motherboard.
CPU core voltage has been dropping steadily from 12 V to about 1.2 V today... 0.7 V is about the limit given today's technology... but that might change, and with every reduction comes a lowering of power use and generated heat... heat is the biggest problem we have in CPU design.
To reach their full potential, massively multi-core processors will need a layer of separation between the programmer and the CPU. The layer of separation will be software that reduces the incredible complexity to the point where the CPU can be considered by the programmer as the equivalent of a super-powerful RISC CPU with only a few dozen instructions. The layer would have to handle the complexities involved in making good use of so many cores. Little progress can be made until this separating layer is built... there simply are not enough man-hours available to deal directly with the complexities of massively multi-core designs. To the programmer it needs to be as simple as programming a single-core CPU, albeit one able to handle hundreds of billions or even trillions of operations per second.
Creating this layer of separation will be much more difficult than writing an ordinary compiler or OS, of course, and the programmer will need to keep in mind that a multitude of things can be happening at the same time within the processor.
When I first learned object oriented computing, I was very disappointed that I could not run my C++ program on multiple CPUs without designing it that way. To me it seemed natural that objects would be on separate processors so they could operate how you laid them out, not just get jumped to when needed. The Propeller is the closest thing I have seen to this kind of object oriented computing. Maybe SeaForth is another example, but since it is laid out in a grid, your object model has to be "gridded" as well (which may not be bad). And that 18-bit thing! What were they thinking!
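That "one object per processor" idea can at least be approximated on a PC with threads. A hedged C++ sketch (invented example; std::thread gives concurrency but not the Propeller's one-object-per-cog determinism): each object owns the thread that runs its loop, a little like a Spin object launched into its own cog with cognew.

```cpp
#include <atomic>
#include <thread>

// Toy sketch of "objects on separate processors": each Worker owns a
// thread that runs its own loop concurrently with every other Worker.
class Worker {
public:
    void start(int iterations) {
        thread_ = std::thread([this, iterations] {
            // this object's private "cog loop"
            for (int i = 0; i < iterations; ++i) ++count_;
        });
    }
    void join() { thread_.join(); }
    int count() const { return count_.load(); }
private:
    std::atomic<int> count_{0};
    std::thread thread_;
};
```

Two Workers started back to back really do run at the same time, which is the layout-the-objects-in-space model the Propeller makes literal.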
Maybe more people should try my approach: write the documentation to fulfill English class requirements! Of course, I still have to put some more in code documentation before I submit it, but by the end it will be very well documented.
Thanks, SRLM. It's refreshing to know that someone out there has shed the ancient shackles of limited memory allotments in embedded systems and has aspired to unite art and science with the epiphanies of technology. I'm looking forward to holding yours up as a shining example.
Is that what I did? Oh... I didn't realize...
Mike,
even professional developers - i.e. those who 'get paid for it and are chained to a desk' - don't often document their code well - unless it's an enforced policy. That said, I don't think it is a good idea to perpetuate the "oh that's just the way it is because we don't have time" attitude.··Once you have to maintain it - even your own code - it becomes quickly obvious·how important documentation is.· Sure, studying and tweeking other's code is really the *only* way to learn to substantially·program anything.· Yet·simple comments make·learning·faster and easier.
A code repository, of ALL places, needs good commenting and docs. I submit to you the PERL and Python communities as shining examples of how to do it right.
Mind you, I'm not talking necessarily about all-encompassing documents, rather about simple things that either 1) self document, or 2) have guiding comments, e.g.:
·GOOD - self documenting comment-like code AND comments --- just code and write each line (see bottom of post too, please) }}
{ BAD - ah, the old a,b,c, x,y,z and t vars - oh, wait, whats a "d" do? Darn, Doggy, Doo·doo --- oh, "Delay"·oh, I get it }
·{ UGLY· - CountDracula, eh ?·... must be the automatic blood draw·and test·rountine·}
{{ and, yes, I have seen code like that - one of·our overseas developers *and his code-reviewing manager* loved Mister Roger's Neighborhood --- he named his vars and routines after things in the show. When·we inherited·their 100,000 lines of C and C++ code, several people quit after looking at what they'd have to maintain.
The way I train programmers to code and comment is easy to do and implement:
Write the pseudo code as a comment before you code it ... as they tell you to do in English class, make a outline before you write.· This will actually improve your coding and --- you won't believe me until you do it some months --- it will make·coding *much* faster, and less error prone.
Leave the pseudo code there as a comment once the code is in.
}}
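To make the point concrete, here is a minimal C++ sketch of the "write the pseudo code first, then leave it in as comments" habit. The `average_adc` routine and its details are made up purely for illustration:

```cpp
// Pseudo code, written BEFORE the implementation and left in place:
//   for each sample in the buffer
//     accumulate the sum
//   average = sum / count   (integer division is fine: raw ADC counts)
int average_adc(const int* samples, int count)
{
    if (count <= 0) return 0;             // nothing to average
    long sum = 0;                         // long: headroom against overflow
    for (int i = 0; i < count; ++i)       // for each sample in the buffer
        sum += samples[i];                //   accumulate the sum
    return static_cast<int>(sum / count); // average = sum / count
}
```

The outline costs a minute up front; the maintainer (probably you, six months later) can check each line of code against the line of intent sitting right next to it.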
:EndRant
mov Browser, NewLocation
' thanks
; Howard
// in Florida
/* */
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
No matter how much you push the envelope, it'll still be stationery.
Yep, I agree. That's the way that I've been trying to teach beginning C++ programmers this quarter: think about your code before you write it, and at the very least make a picture on paper. The programming part is the easy part: it's designing a program that is hard.
Perhaps the systems integration team would have noticed the potentially deadly flaw if the offending code modules had had ONE comment? (*)
;
; These routines use SI units
;
A $125 million comment?
(*) This is speculation by way of example - but it was a failure of communications, the transmission of which always happens via hardware running code.
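For illustration only, a header comment plus one labeled conversion might have been all it took. A hypothetical C++ sketch - the function name, constant, and layout are invented here, not NASA's or Lockheed's actual code:

```cpp
// These routines use SI units: impulse in newton-seconds (N*s).
// Hypothetical converter for data arriving in pound-force seconds;
// 1 lbf*s = 4.448222 N*s.
constexpr double LBF_S_TO_N_S = 4.448222;

double impulse_si(double impulse_lbf_s)
{
    return impulse_lbf_s * LBF_S_TO_N_S;  // lbf*s -> N*s, stated in one place
}
```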
cheers,
Howard
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
make your variables/names THE comments. :)
Maybe we can rename this thread to "I Will Love The Propeller Even More and More with Each Passing Code Comment"? :)
- H
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
The engineering error was a NASA cover up...
No... seriously... I am in total agreement... this is an amazingly good motivational example for students to learn the value of commenting... I love this example...
All the talk about them having to come back at a future time to read the code means nothing to them... Future?? What is that?? Come back and visit something I did a month ago????? Why, dude??? I have better things to do than revisit ANCIENT history!!!!! Oh, what it is to be young again... but then I never thought that way even when I WAS young...
BUT... I am sure you all know this already... documentation is not just in CODE... it is EVEN MORE important when you do circuits... I had occasion to want to use a PCB I did 5 years ago and found myself looking for the schematics to figure out what is what... but even more than the schematics, I found myself going back to my LOG to find out why I did this or that...
When I design software or hardware, I keep a LOG, in addition to comments and pseudo code etc. This log has in it all the changes, chronicled, with comments for why this or that, what decision drove this to be that way rather than the other, and any errors found, why they occurred, and how they were resolved... etc., etc.
Just like when you sail a boat you keep a daily log, so you should when you program or design a PCB - or really in any design process... Comments, log, documentation, and even old and superseded designs and schematics.
I love the Mars example... very motivational...
Samuel
"I do wish the eeprom was on-chip and secure"
I too lament the fact that the EEPROM is external; there should be at least 128KB on the silicon, but that is not a deal breaker.
EEPROM on the chip is still not secure, though; there are still ways of getting at the data if it is worth much.
"I love not having to mess around with interrupts and constantly worrying if some new bit of code will mess up some timing critical old bit of code."
I created a program generator for the AVR that creates a skeletal C program containing all the interrupt code your program needs. It uses a set
of questions to build the code and gives you the max time you can waste in each interrupt routine before you will get into trouble. It is used a lot
here, a real time saver. I hope to write a few simple program generators for the Propeller...I love writing code generators and they can save a lot
of effort and are good learning tools if you also generate comments along with the code.
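A minimal C++ sketch of the idea behind such a generator - the vector name, the 80% budget rule, and the output format are all invented for illustration, not the actual tool:

```cpp
#include <string>

// Given an interrupt vector and its period, emit a commented ISR
// skeleton that states the programmer's time budget up front.
std::string emit_timer_isr(const std::string& vector, unsigned period_us)
{
    unsigned budget_us = period_us * 8 / 10;   // leave 20% headroom
    return "/* " + vector + ": fires every " + std::to_string(period_us) +
           " us; spend at most ~" + std::to_string(budget_us) + " us here. */\n"
           "ISR(" + vector + ")\n"
           "{\n"
           "    /* TODO: handler body */\n"
           "}\n";
}
```

Generating the comment along with the code means every skeleton starts life documented - the timing constraint is recorded where the next programmer will trip over it.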
SRLM said.. "Yep, I agree. That's the way that I've been trying to teach beginning C++ programmers this quarter:"
@SRLM: A serious question "What is the pseudo code for C++?"
Back in the day I worked in a team that coded in HEX because we had no assembler. That HEX was manually derived from ASM we had written. That ASM was derived from pseudo code (which looked like ALGOL) which we had designed and written first. All was well.
In later years pseudo code disappeared, as we then had high level languages: PL/M, Pascal, C etc. But at least we had data-flow diagrams, structure charts etc. for the overall design.
Now I'm finding C++ code harder to read and understand than a lot of assembler code I've worked on! And I find myself wishing there was some "pseudo code" or design left behind by the authors to help. Getting the Qt library to do anything complicated is proving to be an uphill struggle.
So SRLM, as teacher of C++, what is the "pseudo code" I'm looking for? Perhaps UML is the thing, but looking at those diagrams for anything complicated makes my head spin.
Do you have a recommended reading for the "design and documentation part" in a C++ project?
To keep this on topic: Spin's idea of "objects" does actually seem to help keep code under control. Witness the ease of use of stuff from OBEX.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
For me, the past is not over yet.
Don't forget that the C language was originally intended to be simply a 'platform independent' assembly language. That's why the language has such nice low-level features.
C++ is therefore a hybrid language that tries to bolt a very high level of abstraction ... onto assembly language.
The result works about as well as bolting a rocket engine onto a skateboard. Yes, it can get you from A to B - but wear a crash helmet and watch out for the corners!
Ross.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Catalina - a FREE C compiler for the Propeller - see Catalina
BTW, I've just finished my first quarter teaching C++ as part of a tutoring job (about 6-12 hours a week, depending on how hard the assignments are). I never realized how much work it takes just to prepare for a session.
I'm afraid I'm using "pseudo code" loosely. It's not "official" pseudo code (the language). Since program design isn't part of the course, I can't actually require them to use anything like real pseudo code. Mostly, the point that I hammered at was to write down all the steps of your program on paper before you step to the computer. My definition of pseudo code was the logical steps that make up a program, written in English (broken down into statements and expressions, with indentation to indicate what is tied to what). This worked for this course since it was introduction level (no classes or multithreading) and all procedural, so it's fairly straightforward to write out the steps.
Documentation was probably the worst part of the students' programming. Almost nobody had comments, and many fell into the trap of bad/duplicate/similar variable names. One guy even named his functions after whatever came into his head at the moment (Matrix, bricks, clinton, ...). Those were always the people who got lost: the fewer comments they put in while writing the code, the more help they needed to fix it. I heard "I'll put comments in later. Just show me how to fix the program" a bunch of times...
Seems to be this "object oriented" business causing the problem. In "normal" languages you can follow the execution of your code - through the ifs and elses, around the loops, occasionally taking time out in a function here and there. In object oriented code there tend to be thousands of little fragments of the total "action" you are trying to follow, spread around thousands of methods, hidden away in constructors/destructors, some in the header files, some buried deep in a tree of inheritance. You end up having to understand the whole code before you can understand any small part of it. Trying to draw a line of execution through this forest is an exercise in madness.
Object orientedness was added to Ada with the same result, but worse. Last year I found tutorial examples of object oriented programming in Ada, written by a professor in the States, where he called the language "absurd" because of the weird way he had to go about getting his inheritance example working. Turned out he had totally misunderstood how you should do polymorphism in Ada. Not his fault; it is convoluted.
SRLM: Now you've got me - what is ""official" pseudo code"? Never heard of such a thing. As I said, we used a pseudo code that "looked like ALGOL". That's because ALGOL is a very nice language for describing algorithms. Of course, as we never compiled it, it was not rigorously correct, but it was enough to express the ideas we were implementing.
To keep this on topic: I love the Prop too. Spin and PASM don't give me all these headaches I have to deal with professionally.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
[... soapbox.elevate(speaker => ross) ...]
What it really amounts to is that you can be a bad programmer in any language. You can program well in C++ - but most programmers don't - many of them do not even understand the language constructs they use every day.
Your Ada example is interesting - I love Ada (have I mentioned that before?) but it still gets a lot of bad press from people who never bothered to learn it properly - even when their job was supposedly to teach it. The early versions of Ada were as simple and orthogonal as Pascal (which itself originated as a teaching language based on Algol) but with all the low level expressive power of C - i.e. it incorporated machine level constructs that languages like Pascal deliberately tried to keep programmers away from. And Ada had an astonishingly simple but powerful multitasking kernel built right into the language. After years of fork(), setjmp(), longjump(), P(), V(), signal() etc - plus all the obscure message passing techniques and message processing loops that single-threaded programs have to resort to - Ada was a breath of fresh air.
Ada's biggest problem was that it so nearly made it. For a while it looked like putting a whole slew of software companies and software consultants out of business - at the most lucrative end of the software business. There were two serious attempts to nobble Ada - the first one didn't succeed, but the second one did. The first was based on how 'complex' Ada was to learn, and how resource-hungry it was to run. But Ada is a simpler language than C++ by a country mile - and it could run comfortably on machines that would still struggle to host a decent C++ development environment. I remember having a fully compliant Ada development environment on an 80286 PC - a PC that could barely run Windows 3.0 and would have struggled to run a C++ compiler.
Ada's final downfall (and you can speculate endlessly whether this was deliberate or accidental) was its ill-fated decision to jump on the 'object orientation' bandwagon. Even I would agree that object orientation in Ada is a bit of a shambles. It's not as idiotic as C++, but it's not particularly brilliant, either. But at least it's well documented and consistent, which C++ never was. I have literally dozens of C++ books littering my shelf, but only one Ada book - the Ada Language Reference Manual (LRM).
In summary, it's easy to be a bad programmer in C++, but you have to work a little harder to be a bad programmer in Ada.
[... soapbox.descend() ...]
I'm trying to think of a way to bring this back to the thread topic, but I can't - except to say that I wish there was an Ada compiler for the Prop (but now that I'm nearly finished with Catalina ....)
Ross.
P.S. And the 'official' pseudo code language for C++ has surely got to be UML - aka "Unusable Modelling Language".
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Leon
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Amateur radio callsign: G1HSM
Suzuki SV1000S motorcycle
For me, SPIN was my first and only exposure to an object oriented language and I know what you mean about the "fragmentation". However, such "fragmentation" would probably not be a problem if the objects were documented. My biggest gripe about items in the OBEX is that the lack of comments precludes a "black box" approach to using the objects, which is what I thought the philosophy of creating "objects" was supposed to be. There should be commentary that describes what the limitations of each object are, or at least tell us whether it returns an integer or float, etc. Most humorous are the "demos" that do nothing but call up the undocumented object and provide a serene moment of "Gee, that's really cool," which is soon followed by an infernal slog of "But how do I get this to work for me?"
Black boxes can be great so long as you know what the plugs are!
UML seems to be totally useless for describing what goes on in the average event-driven, object-oriented system, except at some very high level.
@ElectricAye: Spin may well have "objects", but I don't think many would agree that it is an "object oriented" language; those are generally expected to possess encapsulation, inheritance and polymorphism.
Spin has "encapsulation" but lacks the other two. It's those other two that cause a lot of the "fragmentation" I was referring to and make following the execution path through a C++ program "an exercise in madness".
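A tiny C++ example of the polymorphism side of that "fragmentation": at the call site you cannot tell which method body will actually run - that is decided by the dynamic type, defined somewhere else entirely. (Toy classes, made up for this example.)

```cpp
#include <string>

struct Device {
    virtual ~Device() = default;
    virtual std::string greet() const { return "generic device"; }
};

struct Propeller : Device {
    std::string greet() const override { return "eight cogs ready"; }
};

std::string talk_to(const Device& d)
{
    return d.greet();  // which greet() runs? not visible from here
}
```

Spin has no equivalent of this: a call on a child object always lands in exactly the method you can read in that object's source file.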
I have only used a few objects from OBEX, and perhaps they are lacking in documentation, but generally it seems a few minutes grokking the "humorous" demo code and checking for pin usage, clock settings, etc. is enough to get them going.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Leon
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
I was once berated by a professor here for saying that what I had was pseudo code. Apparently it's actually standardized and formalized. He said that this book in particular used "official" or real pseudo code.
I used an "official" pseudo code way back when writing software for a Collins 8400 computer that was running a 1401 emulator, which was running programs written in 1401 SPS. Official or not, I find outlining the program in pseudo code very helpful, and the pseudo code becomes the comments in the program which is a great help in documenting it.
Ross,
IMO, you're not really off track - our discussing Ada/C/C++ is appropriate because it helps us appreciate Spin and the Prop Tool IDE all that much more. The way the IDE lays out the color blocks for sections is a great help - and a clear nod to OOP concepts. And, although it might not be defined as a full-blown OO language (whatever that, like "pseudo code", is), Spin has very clean OO structuring. This is perhaps one of the reasons it's so (thankfully) easy to use! It was built from the ground up clean - not a C-becomes-OO hack like C++, or Ada nearly killed when it became ADA-OOP - er, ah, ADA-OOPS :-P
RE: Catalina - BTW thanks for all your effort - a huge undertaking!
> Ada lives on in VHDL, which is very popular.
Leon, funny you mention that - I've just started a serious look at VHDL and its variants - felt like I was putting on my old, well-worn Ada shoes. :)
At the risk of continuing the thread drifting... SMALLTALK anyone?
(It is still alive and well --- if it didn't have such a heavy footprint, it might be cool to have a tightly coupled PC<->SMALLTALK<->PROP environment.)
- Howard
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
As the next generation of developers adopts Propeller-type (massively parallel) technology, it is going to render obsolete a lot of patented and copyrighted bodies of work. AMD has seen the light and is already marketing their quad-processor PC.
When you start with pseudo code, your own code generation and debugging is so much faster that you've been paid back for the extra effort even before the release of version 1.0.
You are right, massively parallel is going to change everything soon.
That's one reason I'm going to start playing with the Propeller... to try and get my mind around multiple processors running at once... I'm too used to interrupts and a single processor.
Here is some musing I sent to someone in an email... I was still on a buzz from getting to attend a conference of computer scientists and geeks in CA (got to shake some famous people's hands).
To keep up with Moore's law they will have to go to lower V+, multi-core, 3D layering, and optical inter-chip data handling.
I bet in 15 years the CPU will be a cube-like thing with cooling pores, sitting in a tiny liquid-cooling container on a mini motherboard, connected electrically only to V+ and ground. V+ will likely be close to 0.7V, and there will be no bus wires on the board... it will be pretty much all optical.
A cubical 3d cpu package will need to have cooling fluid slowly pumped through an intricate network of pores or channels to keep cool...at least this will be silent and cheap since the pump could be very tiny and simple. The fluid would form a continuous loop through the cpu and a heat radiator of some sort. The cpu would likely come as an integrated package with pump, coolant and radiator ready to stick onto a motherboard.
CPU core voltage has been dropping steadily, from 12V to about 1.2V today... 0.7V is about the limit given today's technology, but that might change, and with every reduction comes a lowering of power use and generated heat... heat is the biggest problem we have in CPU design.
To reach their full potential, massively multi-core processors will need a layer of separation between the programmer and the CPU. The layer of separation will be software that reduces the incredible complexity to the point where the CPU can be considered by the programmer as the equivalent of a super powerful RISC CPU with only a few dozen instructions. The layer would have to handle the complexities involved in making good use of so many cores. Little progress can be made until this separating layer is built... there simply are not enough man-hours available to deal directly with the complexities of massively multi-core designs. To the programmer it needs to be as simple as programming a single-core CPU, albeit one able to handle hundreds of billions or even trillions of operations/second.
Creating this layer of separation will be much more difficult than writing an ordinary compiler or OS of course, and the programmer will need to keep in mind that a multitude of things can be happening at the same time within the processor.
Nvidia's CUDA platform is very significant. (Still can't get past the grid problem... which will remain for some time.)
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Today I got an announcement of the release of Ada for the Lego Mindstorms!!
libre.adacore.com/libre/tools/mindstorms/
Looks like the Prop is falling behind here and there is only one man who can save us.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔