Carl Hayes said...
I'm attempting to understand the workings of the tv.spin module provided with the Propeller editor. I hope to use it, but not without understanding it.
Early on, the following statement appears, written in the Spin language:
okay := cog := cognew(@entry, tvptr) + 1
I infer, but am not certain, that this sets both cog and okay to some value, but the rest of the statement baffles me. I can't figure out what is meant by an apparent attempt to use an executable statement as a value to be assigned.
Eh? Wozzat?
Break it down into expressions and operators, and evaluate each.
The ":=" operator assigns the value of the expression on the right to the expression on the left, and returns the value assigned.
expression1 is "okay := cog := cognew(@entry, tvptr) + 1"
expression2 is "okay"
expression3 is "cog := cognew(@entry, tvptr) + 1"
expression4 is "cog"
expression5 is "cognew(@entry, tvptr) + 1"
expression6 is "cognew(@entry, tvptr)"
expression7 is "@entry"
expression8 is "entry"
expression9 is "tvptr"
expression10 is "1"
Evaluate them from the bottom up. Notice that the assignment to "cog" is done before the assignment to "okay".
Summary: add 1 to the return value of cognew. This gives a result from 0 (failure, which is boolean false) to 8 (success; non-zero values are boolean true). Assign this value to "cog", then assign it to "okay".
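Put together as Spin code, a step-by-step equivalent looks like this. This is only a sketch: the VAR declaration, the shape of the start method, and the stub DAT section are assumptions modeled on typical Propeller driver objects, not copied from tv.spin; only the chained assignment itself comes from the original.

VAR
  long cog                              ' 0 = not running, 1..8 = cog ID + 1

PUB start(tvptr) : okay
  ' step-by-step equivalent of:  okay := cog := cognew(@entry, tvptr) + 1
  cog  := cognew(@entry, tvptr) + 1     ' cognew returns -1 (no free cog) through 7, so cog becomes 0..8
  okay := cog                           ' 0 is boolean false (launch failed); any nonzero value is true

DAT
              org     0
entry         jmp     #$                ' placeholder for the real TV driver code

The +1 offset lets a matching stop method treat cog == 0 as "not running" and call cogstop(cog - 1) otherwise, which is the usual pattern in these driver objects.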
Bill Drummond said...
I wish it were documented how the compiler figures things out.
I believe programmers either have PascalBrains or C_Brains: Pascal people wear out the middle of the keyboard first, and C people wear out the keys around the edge. Spin seems kinda bipolar to me.
Actually I suppose it would be interesting to see how this particular compiler figures things out, but it wouldn't help me much in programming. Compilers, even those handling exactly the same language, differ vastly in the amount of optimization they can do. Some even figure out when a statement can be moved out of a loop, and move it. Somehow I doubt the Spin compiler does that. Perhaps it does no optimization at all; most don't.
An optimization that is fairly easy to implement in a compiler is prefiguring the results of operations on constants. For example:
A := |< 1 can be compiled two ways: either the compiler can notice that this will always come out the same (because the operand is a constant) and precompute it, or it can put instructions in the compiled code (I want to call this an object module, which has been the standard term in data processing for eons, but the term object has been redefined as a term of dilettantism) so that the shift will be done each time the statement is encountered during execution.
A := |< B, on the other hand, cannot be precomputed by the compiler, because it doesn't know what the value of B will be, so it has to insert code to do the shift at runtime.
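In Spin terms, the two cases look like this; a minimal sketch with made-up names, where |< n is Spin's single-bit decode operator, equivalent to 1 << n:

PUB demo | a, b
  b := 5
  a := |< 1            ' operand is a literal, so the result (2) could in principle be precomputed
  a := |< b            ' operand is a variable, so the decode has to be done at run time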
I once spent considerable time, a year or so (at several kilobucks per week plus large expenses, so I didn't mind), reverse-engineering an IBM Cobol compiler. It was very intricate but not godawful clever, to my considerable surprise. Never a Cobol writer, I had to learn a little of it so I could test things. Terrible language, absolutely rotten. I hated it.
Not true that programmers have either PascalBrains or C_Brains. I have an AssemblyBrain, and before that it was a FortranBrain. Mainframes rock.
The Parallax compiler will evaluate that for you at compile time if you put it in a constant() construct.
Yes, of course. But then you can't use A as a variable elsewhere in the program. It's still a useful optimization, but I was speaking of optimizations done automatically by the compiler, not those done consciously by the programmer.
Look at it again:
  A := |< 3
or
  A := 1 << 3
Most compilers, probably including ours, will generate code equivalent to the following:
  fetch a constant 1
  shift it three places left
  store the result in A
The smartest compilers (assuming they had this operator at all, of course) would generate a little less code:
  fetch a constant 8
  store it in A
which is one instruction less. But it takes a very smart compiler to recognize automatically that the result will always be 8. It would surprise me if any compiler that runs on a PC (as the Spin compiler does) had that sort of optimization built in.
I said somewhere in an earlier post that such optimizations are fairly easy to build into a compiler. So they are, but that doesn't mean they aren't a lot of work for the dude who's writing it. Including stuff like that can easily double or treble the cost of creating and testing the compiler. It would be unreasonable to demand it or expect it. But it wouldn't amaze me if Spin had such abilities -- clearly its author is both brilliant and a bear for work. I'd hire him in a heartbeat if I were still hiring people like that -- but retirement is wonderful.
Actually, a good number of compilers can perform these optimizations automatically. The specific optimization you are talking about is Constant Folding (en.wikipedia.org/wiki/Constant_folding). The SPIN compiler unfortunately does not support automatic constant folding, so you have to enclose your stuff in constant() constructs. The compiler will then generate a single constant in place of the stuff inside the constant().
The Parallax compiler will evaluate that for you at compile time if you put it in a constant() construct.
Yes, of course. But then you can't use A as a variable elsewhere in the program. It's still a useful optimization, but I was speaking of optimizations done automatically by the compiler, not those done consciously by the programmer.
Look at it again:
A := |< 3
or
A := 1 << 3
I'm sorry, I was not as clear as I might have been. What I meant was
A := constant(|< 3)
...and the compiler will evaluate it for you at compile time. You can still use A elsewhere in your code.
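For example, a minimal sketch (the method and variable names are mine) of what that buys you:

PUB demo : a
  a := constant(|< 3)   ' the compiler folds |< 3 into the literal 8 at compile time
  a += 1                ' a is still an ordinary run-time variable; here it ends up as 9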
At least one of the alternative spin compilers supports constant folding.
Constant folding in a compiler is not really that difficult if the compile stack keeps track of the variable vs. constant nature of its elements. So when it collapses a constant-constant-operator RPN triad, it can perform the operation internally, pushing the constant result onto the compile stack, rather than emitting the code to do it.
That answers one thing I wondered idly about. Now I know the Spin compiler is a stack compiler and not, for example, an infix compiler. I've written parsers of both kinds. Stack parsers (mine converted everything, effectively, to RPN) are much more fun to write. My, that was 35 years ago. How time flies when you're having fun.
I wasn't writing a compiler at the time; it was a TSO command-line scientific calculator.
Harrison. said...
Actually, a good number of compilers can perform these optimizations automatically. The specific optimization you are talking about is Constant Folding (en.wikipedia.org/wiki/Constant_folding).
Thank you, Harrison. I hadn't known that term. Perhaps the term (but not the practice) is newer than my knowledge, what little there is of it, of compiler construction -- my knowledge of that is about a third of a century old, and it was never a major activity for me anyway.
But I'm in no doubt that, while not difficult, inclusion of optimizations like that is a lot of drudgery for the writer of compilers. No doubt at all -- and he doesn't even get many strokes for it, because it's mostly invisible to the user.
Mike Green said...
Carl,
You've come to the heart of the continuing argument between efficiency and simplicity and transparency. It's not as simple as you present. It may be very important to have several different ways to write A = A + 1 because they may superficially appear to be the same, but may, in fact, be different. I've worked on large, heavily managed programming projects and the typical management style is very rigid. It produces code that usually works and, with the right management, can be reliable, but is rarely innovative, usually inefficient. As has been shown with Lockheed's "Skunk Works", it is indeed possible to produce high quality, reliable, cutting edge stuff whether hardware or software. It's difficult, requires a small team of unusually talented technical and managerial talent that's highly motivated to produce quality products. I've been on such projects and they do work (including the production of accurate, detailed documentation). The costs are similar because of the larger infrastructure and lower efficiency of the "traditional" project management. On the other hand, there are higher risks involved with a "Skunk Works" type of structure. You may not be able to get the people you need. You may have meddling from higher up. The project itself may be intractable.
Indeed yes -- and being a mainframe guy, I lean toward transparency and especially maintainability. Efficiency is almost valueless. Who cares if the program runs ten minutes longer when it takes the printer an hour to print the 25,000 paychecks, and they don't get into the outgoing mail for another half day? But when the program breaks, everyone notices that the paychecks didn't show up at the loading dock. That's when transparency and maintainability shine.
Even more so in my own area, systems programming. When my stuff breaks, everyone's stuff is broken. If my JES2 exit (let's say it prints the date on the separator pages that appear at the start and end of each job's printed output) crashes at midnight because it was written in January and it wasn't ever tested in November, nobody gets their stuff until it's fixed. And if the guy who wrote it got promoted, or quit, in April, I'll be very glad I beat him over the head to demand clarity and simplicity in preference to cleverness. Note, of course, that elegance is best. Elegance = cleverness + maintainability + clarity.
Cleverness and efficiency are good, but remember what I have taught for years as Carl's First Law: any program that works now is better than any program that doesn't work yet; and Carl's Second Law: any program that isn't maintainable, or can be maintained only by its author, doesn't work. It is only pretending to work, until you let down your guard. Then, you're in deep yogurt.
Those, my friend, are lessons from the trenches.
Control and automation are a different arena (in the 1970s I did process control in a glass-bottle plant with an IBM 1800). Optimization for efficiency is often necessary and often absolutely essential, for the small computers you use may be barely adequate anyway. But even there, maintainability is a key asset. When the forehearth temperature starts to drop and the glass flowing to the bottle machines stiffens up, the shift foreman, who is not your boss, will (a) treat you like God, but also (b) want to kill you. You'd better be able to read the code and fix it, fast, under pressure.