
Ran into a great discussion involving assembly language, and want your thoughts.


Comments

  • Too_Many_Tools Posts: 765
    edited 2013-12-18 10:33
    Heater. wrote: »
    Too_Many_Tools,

    This may be your observation but it's almost certainly not true in general.

    It's also a bit of a put-down to all those really good programmers who use weakly or dynamically typed languages. Many gurus of Lisp, Scheme, even Forth could give you a lecture on that.

    I also observe that Ada, a very strongly typed language, has been resoundingly rejected by pretty much the entire programming community.

    potatohead

    Optimized tools? Yes. The question then is what we are wanting to optimize for: execution speed, size, programmer productivity, maximal cross-platform support, correctness/security, profit, etc.?

    LOL...sorry..what I meant to say is that "One observation I have made is that good designers want strong compilers that type strongly."

    In the past I have used assembly language and I have used Ada...and I can easily see the difference in productivity that a strongly typed language gives.

    I have also seen how strongly the emotions run with strongly typed compilers when those who are used to programming in assembly language use them.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 10:35
    rod1963 wrote: »
    Does inside the box still matter(to borrow a phrase from Steve Ciarcia), that should be the question.

    For most coders the answer would be no.

    It's just a black box that you code on.

    Inside the box only matters to designers, bit bangers and hackers who want to know what goes on. Your average code monkey doesn't need to know if the box is a PPC or x86 or one of the Apple hygiene products.

    Micro-controllers seem to be the only hold out. Even coding on the Arduino you still have to know something about hardware to interface to the various boards.

    Assembly? It really depends if you're writing a BIOS, writing virtual peripherals on the Prop, start-up/initialization code, run-times for compilers. Or simply the joy of it. I like writing assembly on Z-80's and 68k/ColdFire. I wouldn't touch it on an x86, ARM or MIPS.

    Programming languages? They are really a personal-preference thing for non-professionals. I personally like Free Pascal, Oberon and BASIC. Mbed's online C++ compiler for ARM micros is quite nice for hobbyists like myself.

    Well said.

    It should be a black box to the programmer...if the project design is done properly...and this comment is coming from a person who actively does hardware AND software.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 10:47
    RossH wrote: »


    Don't be obtuse, Heater.

    Ada could not be rejected from all quarters of the community because 95% of the programming community never knew much more about it than the name. Still don't, obviously.

    Ada was a specific language for a specific purpose, and was very popular in the community for which it was intended. Of which I was a part at the time.

    Nobody in their right minds would attempt to write a graphical operating system in Ada. Or a mobile phone app. Or a web site. Or a microwave oven controller.

    But if you wanted a nuclear missile launch system, or a radar tracking system, or an air traffic control system, or a spacecraft navigation system, or a submarine combat system, then Ada could not be beat.

    It was reliable, portable, and cheap - yes cheap. Sure, the compilers cost a lot initially, but the prices soon came down to where you could buy a fully validated compiler for the PC for a few hundred dollars - much less than you now pay for Micro$oft rubbish. And debugging and maintenance costs were massively reduced. And did you get that "validated" bit? That meant that code written for one compiler was guaranteed to run on any other - even if that compiler was written by a different company, and intended to run on different hardware. Unix, Windows, DOS, IBM Mainframe, various supercomputers, embedded chipsets, etc etc. No learning curve for programmers moving companies or hardware platforms.

    Hardware manufacturers hated it. They could no longer sell you their "proprietary" languages written for their "proprietary" hardware. Whole projects could be migrated from IBM to VAX to PC by a simple recompile. And often were, if it proved that the hardware was not up to the task, or better and cheaper hardware came along.

    Software manufacturers hated it. They could no longer count on lucrative "maintenance" contracts for software that was deliberately written in obscure languages so that no-one other than the original company could ever maintain or modify it.

    Training companies hated it. The entire Ada LRM was smaller than your average "Dummies Guide to <insert your favorite dingbat language here>" book - and about a 10th the size of the equivalent documentation set required to learn to use C++ or Java effectively.

    But many software engineers loved it - not least because having once learned it, their skills were applicable to just about every company in the industry. They could move at will anywhere they liked and be productive the day they arrived.

    And so Ada had to go.

    Ross.

    LOL...very well said...wish I had written it.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 10:51
    Martin_H wrote: »
    I've never even seen an Ada compiler, much less rejected it. I haven't even written "hello world" in Ada. Ada might be the perfect programming language and I'd never know it.

    The problem with computer languages is that they depend greatly on network effects. The C and C++ languages are not great and the anti-C/C++ camp have good arguments that I agree with. For example:

    Why can't the compiler emit the header file?

    Why am I constantly typing '!', '{', '}', '(', or ')'? This language is impossible to touch-type because of an over-reliance on characters accessed via the little fingers.

    There are syntax landmines like the assignment within a conditional (e.g. "if (foo = bar)" instead of "if (foo == bar)") which create hard to find bugs.

    The unchecked arrays may be fast, but enable buffer overflows which are the cause of many security flaws.
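
    To make those last two concrete, here is a minimal hypothetical sketch with both bugs in one toy program (it compiles cleanly; modern compilers need warnings enabled to flag the first one):

        /* Both C landmines in one place. */
        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            int foo = 0, bar = 1;

            /* "=" assigns where "==" was intended, so the condition is the
               assigned value 1 and the branch is always taken. */
            if (foo = bar)
                printf("always reached\n");

            /* Unchecked arrays: strcpy writes 26 letters plus a terminator
               into an 8-byte buffer, a textbook buffer overflow. */
            char buf[8];
            strcpy(buf, "abcdefghijklmnopqrstuvwxyz");

            return 0;
        }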

    I flat out hated the language for a year, but eventually gave in. So why did C/C++ become so influential? The answer's simple: Unix was written in C, Unix included a C compiler, and Unix was the OS of choice in most CS departments. So any company in the '80s looking to start new software development had a pool of talent that knew C, while no one knew Ada.

    tl;dr no one got fired for choosing C.

    Again..well said.

    When all you have are C programmers, every project then looks like it needs to be done in C.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 10:53
    Ale wrote: »
    What I find a bit ironic is that some advantages of Ada are needed but not available in C. The MISRA rules try to force programmers to sort of "behave" (strong types, for instance)... but that is a post-write problem! Instead of modifying the language (and the compilers!), something beneficial, third-party tools have to be used to make the code "conform". The rules at least shouldn't be contradictory...

    The new languages are flow-diagrams and graphical representations of state-machines... sad... (Rhapsody, I'm looking at you)

    I remember when I started writing C...and then had to use lint for checking.

    I was thinking..I didn't have to do this with Ada...it was already done for me.
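
    For those who never met it, a hedged sketch of the kind of C that compiled silently back then but made lint complain (modern compilers with warnings enabled now catch most of this):

        #include <stdio.h>

        int main(void)
        {
            long big = 1234567L;
            int unused = 7;        /* lint: variable set but never used  */

            printf("%d\n", big);   /* lint: '%d' expects int, got a long */
            return 0;
        }

    An Ada compiler rejects the equivalent mismatches outright; no separate checking pass required.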
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2013-12-18 11:04
    FWIW..hardware is a small cost of a total project expenditure
    Not when you're designing something that will be sold in the millions of units. Then shaving every penny from the bill of materials matters a lot, even if development costs increase as a result. Moreover, code maintainability will not mean much if the product's lifetime is measured in months or a couple of years.

    -Phil
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 11:04
    Heater. wrote: »
    RossH,

    Hmm...OK we are going to have to define what this "programming community" might mean.

    Of course there are many programming communities...

    During Ada's brief period of fame the defence industry was a substantial part of the programming population. There were a lot fewer computers and a lot fewer programmers then.

    Then there is the Windows world. No take-up there. How come Windows is not written in Ada, nor any of the millions of Windows applications?

    Then there is the Apple world...same story.

    Then the huge and sprawling conurbations of the Free and Open Source software world. How come Richard Stallman created a C compiler, not an Ada compiler? How come Linus did not write the Linux kernel in Ada? What about all the thousands of programs that support the Linux/Unix infrastructure? No take-up there either.

    Is there even one program in common use that is written in Ada?

    This all looks like a major rejection from all quarters of the "community".

    So Ada is now confined to small pockets of resistance hiding away in ghettos where normal people don't dare to go. It's still in use in an ever-decreasing number of industries where it is perceived as offering the possibility of more robust and reliable software. Which is of course an illusion.

    The ugly truth of programming is that our software tools are primitive compared to what is available to hardware developers.

    When a microprocessor/microcontroller is built, an assembler is the first tool made..because it is simple..crude but simple.

    The next is a C compiler...using the assembler.

    Again simple..but crude nevertheless.

    And that is as far as most companies go..since now they can say "we can sell this thing" without further expense.

    Strongly typed languages like Pascal and Ada take compilers that can cost hundreds of man-years to make..few companies are willing to make that investment.

    One big reason why Ada came out of government development and aerospace...where they have the money to build complex tools.

    I have had the opportunity to see some of the source code from companies like Microsoft and Apple..and it ain't pretty. Those companies are fighting an uphill battle in terms of software maintenance because of their language choices years ago. Ever wonder why a company will drop a product and not have backwards compatibility with its newer versions? Many times it is because they could not maintain the old product, given the complexity due to poor programming choices.
  • potatohead Posts: 10,261
    edited 2013-12-18 12:14
    The simple truth of that is cost pressure. Full refactors are rare. The programming choices often come down to legacy code, options at the time, and time to market.

    There is "should", and there is "can" and "necessary"; they are not the same things. I don't like it either.

    As for the black box thing, sometimes it makes great sense, other times it doesn't. And when that voodoo breaks?
  • jazzed Posts: 11,803
    edited 2013-12-18 12:31
    Again..well said.

    When all you have are C programmers, every project then looks like it needs to be done in C.

    LOL. When most programmers know a better language, we won't need C :)

    Python is pretty good but I keep on tripping on adding those silly ':' chars (tongue in cheek, Martin), and the fact that there is no type checking. Guess someone could make a good Python lint, but having to do two compile steps is annoying.
  • K2 Posts: 693
    edited 2013-12-18 12:36
    rod1963 wrote: »
    Assembly? It really depends if you're writing a BIOS, writing virtual peripherals on the Prop, start-up/initialization code, run-times for compilers. Or simply the joy of it. I like writing assembly on Z-80's and 68k/ColdFire. I wouldn't touch it on an x86, ARM or MIPS.

    The ARM instruction sets are fairly assembly-friendly. Of course there is hardly any occasion to program them that way because they were specifically designed with C compilers in mind, and run C efficiently.

    What complicates ARM assembly appears to be the predilection Roger/Sophie had/has for leaving all of his/her options open, both personally and professionally. As a result, a million things on the chip have to be configured by user code. CMSIS and its ilk save most of us from dealing with that.
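
    A hedged sketch of that configuration burden, using the standard Cortex-M SysTick timer (its register addresses are fixed by ARM; the 48 MHz clock is an assumed example value):

        #include <stdint.h>

        /* Memory-mapped SysTick registers, same addresses on every Cortex-M. */
        #define SYST_CSR (*(volatile uint32_t *)0xE000E010u) /* control/status */
        #define SYST_RVR (*(volatile uint32_t *)0xE000E014u) /* reload value   */
        #define SYST_CVR (*(volatile uint32_t *)0xE000E018u) /* current value  */

        #define CPU_HZ 48000000u /* assumed example core clock */

        /* The configure-it-all-yourself route: */
        static void systick_1ms_raw(void)
        {
            SYST_RVR = CPU_HZ / 1000u - 1u; /* reload for a 1 ms period */
            SYST_CVR = 0u;                  /* clear the current count  */
            SYST_CSR = 0x7u;                /* core clock, irq, enable  */
        }

        /* The CMSIS route hides the same registers behind one call:
               SysTick_Config(SystemCoreClock / 1000u);
           (both names come from the vendor-supplied CMSIS headers) */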
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 13:36
    K2 wrote: »
    The ARM instruction sets are fairly assembly-friendly. Of course there is hardly any occasion to program it that way because it was specifically designed with C compilers in mind, and runs C efficiently.

    What makes assembly difficult on the ARM appears to be the predilection Roger/Sophie had/has for leaving all of his/her options open, both personally and professionally. As a result, a million things on the chip have to be configured by user code. CMSIS and its ilk save most of us from dealing with that.

    All instruction sets are assembly friendly...that is why all chips have assemblers instead of making programmers program in machine code...like they did back in the day of REAL programmers. ;<)

    Like I said, the tools that programmers have are extremely primitive compared to hardware people.

    That is one very BIG reason why companies should require software in the most advanced form possible...anything else costs the company money in the end.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 13:43
    jazzed wrote: »
    LOL. When most programmers know a better language, we won't need C :)

    Python is pretty good but I keep on tripping on adding those silly ':' chars (tongue in cheek, Martin), and the fact that there is no type checking. Guess someone could make a good Python lint, but having to do two compile steps is annoying.

    Any time you see utilities like lint being used, it tells you that you have subpar software tooling.

    In the past I favored Pascal and Ada..because of the strong typing...and when forced to use assembly and C it was like using stones and bearskins to code. Strongly typed languages soon show who the really good programmers are...and why there is so much garbage code written. When your code compiles with no errors on an Ada compiler set to its strongest settings, on the first compile, you know that you have graduated to the upper levels of software coding.
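
    A small hypothetical taste of the difference: a C typedef is a mere alias, so mixing incompatible quantities compiles cleanly, whereas Ada's derived types ("type Celsius is new Integer;") reject the same assignment at compile time:

        /* typedefs create aliases, not new types, so C sees no error here. */
        typedef int celsius;
        typedef int fahrenheit;

        int main(void)
        {
            celsius room = 20;
            fahrenheit setpoint = room; /* accepted by every C compiler, and wrong */
            return (int)setpoint;
        }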

    Of course just because it compiles doesn't mean that the logic is right.

    That takes a tool that hasn't been written yet...;<)
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2013-12-18 14:00
    K2 wrote:
    The ARM instruction sets are fairly assembly-friendly.
    All instruction sets are assembly friendly...
    Not all instruction sets are assembly-programmer-friendly, however. A lot depends upon symmetry, orthogonality, and pipeline gotchas. Atmel's AVR chips, for example, are optimized for C. But the instruction set and allowable register usage are rife with gaps that trip up even experienced assembly programmers. By sharp contrast, the Propeller has the friendliest architecture I've ever experienced for assembly programming.

    -Phil
  • jazzed Posts: 11,803
    edited 2013-12-18 15:05
    Not all instruction sets are assembly-programmer-friendly, however. A lot depends upon symmetry, orthogonality, and pipeline gotchas. Atmel's AVR chips, for example, are optimized for C. But the instruction set and allowable register usage are rife with gaps that trip up even experienced assembly programmers. By sharp contrast, the Propeller has the friendliest architecture I've ever experienced for assembly programming.

    -Phil
    I like AVR assembly just fine except for SBRC and other skip instructions.
  • prof_braino Posts: 4,313
    edited 2013-12-18 16:33
    potatohead wrote: »
    The question was, "What 5 programming languages should every programmer know?"

    What do you guys say to that? I'm very curious to read your thoughts on the matter.

    FORTH for embedded, and whatever for when you're not going embedded. Depends on your OS and application.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 17:09
    Not all instruction sets are assembly-programmer-friendly, however. A lot depends upon symmetry, orthogonality, and pipeline gotchas. Atmel's AVR chips, for example, are optimized for C. But the instruction set and allowable register usage are rife with gaps that trip up even experienced assembly programmers. By sharp contrast, the Propeller has the friendliest architecture I've ever experienced for assembly programming.

    -Phil

    Phil...after the machine code is defined by the hardware, the assembler designer takes those opcodes and creates the assembly instructions. As you point out...if the hardware is not designed properly, then you see the symmetry/orthogonality and pipeline issues. The hardware should be a black box with minimal to no need to know what is inside...if properly designed. That proper design takes additional logic that many hardware designers refuse to commit.

    And yes in a former life I did design CPUs..some that you likely use in your daily life..so I have been on that side of the tracks.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2013-12-18 17:15
    It's not a question of "proper" or "improper" design. It's a question of what the processor is optimized for, and it's not always for assembly programming. Also pipeline issues are simply a consequence of having a pipeline in the first place. In many DSPs, you have to be aware of the pipeline and be cognizant that sometimes the instruction after an unconditional jump will still get executed. The Prop2 will be the same way. And that's not improper design.

    As to the "black box" assertion, no. Somebody has to know what's going on inside of it, so they can write compilers for it. Furthermore, even HLL users need to know something about the processor. For example, if it doesn't have hardware divide, you will want to be more careful about how often you do divisions, if performance is an issue.

    -Phil
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 17:37
    It's not a question of "proper" or "improper" design. It's a question of what the processor is optimized for, and it's not always for assembly programming. Also pipeline issues are simply a consequence of having a pipeline in the first place. In many DSPs, you have to be aware of the pipeline and be cognizant that sometimes the instruction after an unconditional jump will still get executed. The Prop2 will be the same way. And that's not improper design.

    As to the "black box" assertion, no. Somebody has to know what's going on inside of it, so they can write compilers for it. Furthermore, even HLL users need to know something about the processor. For example, if it doesn't have hardware divide, you will want to be more careful about how often you do divisions, if performance is an issue.

    -Phil

    I understand..but if a processor is truly optimized for software, any and all of these pitfalls (hardware guys call them "enhancements") are transparent to the software people.

    Remember..software IS the largest cost to any project..and everything in a project should be optimized to make the software realm easy = cheaper.

    Anything that doesn't translates into higher costs.

    And again I am a former hardware guy who went to the Dark (software) Side....and so I understand that silicon is cheap and software is expensive.

    The example of a missing hardware divide is an example of hardware system design failure...silicon is cheap enough to have it considering what it will cost in software for projects using the processor.
  • jazzed Posts: 11,803
    edited 2013-12-18 17:52
    The example of a missing hardware divide is an example of hardware system design failure...silicon is cheap enough to have it considering what it will cost in software for projects using the processor.

    Well, not necessarily a failure. But it's darn nice to have :)

    Now, my bets are on someone (Phil or others) wanting to have a huge argument about such things because you have invoked part of your résumé and they have nothing better to do.
  • Martin_H Posts: 4,051
    edited 2013-12-18 18:11
    I stopped programming in assembly professionally around 1990. This was around the time RISC processors became mainstream in the workstation market. After that C was used almost exclusively, and usually produced better code. I seem to recall this was because dual issue machines with instruction pipelines were just too hard to optimize by hand. Your code would work, but would blow the pipeline and performance would suffer. I imagine that with enough practice you could pull it off, but a C compiler is easier to deal with.
  • Too_Many_Tools Posts: 765
    edited 2013-12-18 18:24
    jazzed wrote: »
    Well, not necessarily a failure. But it's darn nice to have :)

    Now, my bets are on someone (Phil or others) wanting to have a huge argument about such things because you have invoked part of your resume' and they have nothing better to do.

    LOL...yes these discussions can take on a religious zeal. ;<)

    If one takes a look at the big picture, we are moving from a hardware-centric to a software-centric world.

    In the past hardware was extremely expensive..every transistor mattered..so the cost of hardware was the driver in project development.

    That is in the past..silicon is cheap...while software continues to become more expensive, with more and more lines of code written and maintained. It only follows that any company concerned with whole-of-life project costs would stress better hardware to support cheaper software production/maintenance.

    FWIW...I understand why the microcontroller world does lag in this trend..its goals are skewed by cost constraints different from the mainstream of computing.

    It is also why microcontroller software tools lag in sophistication compared to those of more plentiful microprocessors.

    As for resumes, I have been extremely fortunate to work on projects that have been wildly successful in terms of cutting edge successes and popularity.

    Do you know how you recognize pioneers in a field of science/engineering?

    By the arrows in their backs. ;<)
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2013-12-18 18:31
    Remember..software IS the largest cost to any project..
    That is utter nonsense. What percentage of the cost of the tens of millions of Mr. Coffee coffeemakers out there do you think was consumed by writing the software to run the machines? And I'll bet none of the controllers in those machines has a hardware divide. So I guess they must be defective, right?

    -Phil
  • rod1963 Posts: 752
    edited 2013-12-18 19:24
    Phil

    Does the VB (Ruby, JScript, etc.) coder need to know about hyperthreading or the architecture of the latest DSP extensions in the Intel CPU? No. The same with the Apple code monkey writing applets for the iPod. To them it's a magic box.

    Now certain specialties require this particular expertise such as writing low level device drivers, BIOS'es, compiler run-times, etc. But they don't comprise anywhere near the bulk of coders today. 20 years ago it was a different story.

    Micro-controllers are a whole 'nother kettle of fish. Resource constraints require a knowledge of the underlying hardware and use of assembler in some cases. Now some 32-bit RISC micros like the PPC and MIPS (PIC32) are not assembly-programmer friendly, as they are designed for HLL compilers to do all the grunt work; same with the ARM. There's no real advantage to coding an ARM in assembly with today's compilers.
  • rod1963 Posts: 752
    edited 2013-12-18 19:31
    Phil

    That's a strawman argument. Compare a coffee maker with an el-cheapo 4- or 8-bit micro to the time, money and effort it took Apple to build iOS and then test it, and you get a much different cost variance. It gets worse with safety-critical software used in the medical and aerospace fields. The processing hardware is relatively cheap compared to the enormous amount of time and money consumed by coders to generate a finished program that works without killing people or causing airplanes to spiral out of control.
  • kwinn Posts: 8,697
    edited 2013-12-18 19:35
    Not all instruction sets are assembly-programmer-friendly, however. A lot depends upon symmetry, orthogonality, and pipeline gotchas. Atmel's AVR chips, for example, are optimized for C. But the instruction set and allowable register usage are rife with gaps that trip up even experienced assembly programmers. By sharp contrast, the Propeller has the friendliest architecture I've ever experienced for assembly programming.
    -Phil

    I agree. The 6809 was also pretty good as far as those attributes went, at least in comparison to what was available at the time.
  • kwinn Posts: 8,697
    edited 2013-12-18 19:40
    FORTH for embedded, and whatever for when you're not going embedded. Depends on your OS and application.
    Could a single-cog 4-port serial driver capable of 4 x 115Kbaud be written for the Propeller in Forth? Don't get me wrong, I think Forth has its place, but there are some things you need assembly for.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2013-12-18 19:48
    rod1963 wrote:
    Does the VB(Ruby, Jscript, etc) coder need to know about hyperthreading or the architecture of the latest DSP extensions in the Intel CPU? No.
    Of course not. That was not my point. Too_Many_Tools said, "Remember..software IS the largest cost to any project.." [emphasis mine] A rebuttal to "any" requires but a single counterexample, which I provided.

    -Phil
  • Ale Posts: 2,363
    edited 2013-12-18 23:26
    Someone mentioned the 6809 :)... just for fun I'm finishing a Verilog implementation of it.. is that thing slow!.. 20 machine cycles for a software interrupt?... (mine is marginally faster, just because I want to use async memory...).
    I like assembly...
    Phil's counter-argument, the coffee maker, seems valid from what I see here.... actually I think more time is lost with inadequate tools, endless discussions about small, not-useful details, designs that are being constantly updated, i.e. moving targets/requirements, and so on...
  • Gadgetman Posts: 2,436
    edited 2013-12-18 23:47
    *Sigh*

    *Hums 'If I had a hammer'*

    There's a lot of opinions and standpoints here...

    Let me just add mine to the fray...

    THERE IS NO SINGLE CORRECT ANSWER!

    Yes, you can use C to program an MCU. Or the kernel of an OS.
    Yes, you can also do the same with assembler.
    Would you really want to do it?
    That depends entirely on the situation.

    Take my favorite CPU, the Z80.
    I've never seen a C or Pascal compiler that is REALLY capable of exploiting the alternate registers properly (it has a dual set of all the registers, from the accumulator to the H and L registers), or even of using the IX and IY registers. Just doesn't happen.
    (This may be because most of those compilers were written based on the 8080 and just 'updated' a bit.)
    Frankly, I don't think they even use the 'bit' commands or the DMA functions, either.
    If you want to really push this CPU, you have to use assembler.

    Is there a C compiler that correctly exploits the 'zero page' on a 6502?
    (LDA $13 may sound similar to LDA $0013, and even does the same thing, load the value found at address $0013 into the accumulator, but one of them uses one less memory read...)


    But those are relatively simple CPUs.
    Take a modern CPU with a pipeline, and everything goes to ...
    (I really, really hate messing about in assembler on anything with a pipeline.)

    Most compiled languages are stack-based...
    Which is handy-dandy if you have lots of RAM. Or if it doesn't matter too much that everything is continually PUSHed and POPped.
    In a very tight loop, it may be better to reserve one or more registers for the variables and keep them there (see the sketch below). Not every language or compiler has the optimisation rules to spot this, or can be overridden manually to do it.

    Of course, in a shared/multi-tasking/multi-threading environment, that kind of optimisation may actually be counter-productive.
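
    A minimal sketch of that manual override, using C's (now mostly vestigial) register hint on a hypothetical inner loop:

        #include <stddef.h>

        /* Ask the compiler to keep the hot variables in registers rather
           than PUSHing/POPping them through the stack on each iteration.
           "register" is only a hint, and modern optimisers usually do this
           anyway, but on old or simple compilers it mattered. */
        long sum(const int *data, size_t n)
        {
            register long total = 0;  /* accumulator stays out of memory */
            register size_t i;

            for (i = 0; i < n; i++)
                total += data[i];

            return total;
        }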
  • potatohead Posts: 10,261
    edited 2013-12-19 00:15
    The hardware should be a black box with minimal to no need to know what is inside...if properly designed..

    I have a problem with this statement. "Properly designed" means a hell of a lot of things. For one, we don't always know the optimal design of a given thing. Secondly, optimal depends on markets and people and those things shift and change in often arbitrary ways. Finally, for new things, it's often possible to make a good design early on, but those markets and people do not always select on that basis. In fact, "best" type tech almost never wins on that basis alone.

    What all of that means is there will absolutely be competing designs and where that's true, some understanding of what they do, how they do it and most importantly, why they do what they do is going to be necessary.

    It also means we don't make perfect things. You know, long ago I was confronted with the calculator. Got one given to me in 4th grade. For the 70's this was kind of a radical proposition. A primary concern was that the user of "the black box" would become dependent on the box and/or not develop the modes of thought, means, methods, and concepts needed to realize higher-order thoughts.

    Now I was actually the kind of little kid who not only loved technology, but I took those kinds of comments seriously. Why? Growing up with few means required that I do understand a lot of things my peers didn't have to understand. That I ended up with this spiffy machine where few of my peers did was kind of interesting. The better off ones just used them, never looking back. I actually did the math, then checked the calculator and checked myself. Found it did things very differently and it wasn't a perfect device. Now there was nothing in primary school it did incorrectly, but that's not the point.

    The point was I didn't know the HOW, but I did know something about the WHY and the WHAT. Intriguing problem. The HOW was something I was learning at the time, and there was one thing I knew for sure about HOW type learning and that was once you have it, you have it and many things are possible that were not before, which expand the WHY and WHAT possibilities.

    For a month or two the issue bubbled about and I heard lots of opinion from different people and came to the conclusion that it's simply not wise to trust black boxes.

    Never have since.

    I think it's a noble goal to reach for, but basic human realities paint a very different picture. We almost never get to start from first principles, for example. So very often, we start with what we are given or can find, buy, etc... and from there we work forward to whatever it is.

    This is why I think those who ask the sorts of questions many of us here do should take a look at assembly language. No need to reach mastery, but there is also no need to rely on voodoo blindly either. Sometimes doing that is warranted, even recommended. But sometimes it's just not.

    And here is the trouble: When it's not, what then?

    Those that peeked behind the voodoo curtain can often sort it all out. Those that didn't, need those that did to get it done, or they exist in a state of forced trust. Now this may be cynical, and arguably so, but where there is forced trust there is exploitation, abuse, etc... Ever actually buy things from an open market? Just one example. How much did you pay?

    If you absolutely had to have it, you paid as much as you could pay. If you could walk, you likely dealt around and paid a fair price. If you were doing it for other reasons of less importance, you might even get it really cheap. Just hold that thought.

    Trusting these things is a choice. Truth is, we need, can, and some of us should look under the hood just so we aren't in that position of forced trust. This is to keep honest people honest and out the nefarious ones, as they should be. Ideally, they get the message and join the ranks of the honest people trying to be honest and we are better for that. It is not important that everybody do that, just that they CAN do that, which means some of us WILL do that, and that is enough to make it all work.

    Do you know everything your tech does for you? I'm gonna link a classic I need to link here from time to time to make this point: http://cm.bell-labs.com/who/ken/trust.html

    Ken Thompson. Yeah, he gets it. And you can bet your Smile he looks under the hood too.

    So you don't know. Of course you can test for it, and you can choose to trust those who supply you, and you would likely know more and even enough to not worry about it. Solid case for that. I think it's a majority case a lot of the time, but it's not an absolute given. Can't be simply because of who we are as beings. Remember that thought. Now is the time to link it to this one.

    Are those tests all-inclusive? How can you even know? Just a coupla more thoughts. You can't really test for everything in a practical sense.

    Another one has to do with making better things. If we black box things, others don't learn from them. We've bootstrapped ourselves to this level of technical mastery by looking under the hood. We didn't get there by trusting things.

    So then, there are your goals, what you value, how you value it, etc... involved in this. That happy woman Heater mentioned doesn't care. She writes her COBOL and it's good and it works and she does it as a means to her life and her own ends. Nothing wrong with that. Nothing wrong with designing for that either.

    Some arguments were made here on cost and efficiency basis too. Those happen to be extremely highly idealized. I would frame it as tech fundamentalist on a different axis from my own fundamentalism. And I'm not using that term in a negative way here. Please don't take it in the highly charged political context it so often is. I'm just using it to make a point.

    Well, a few of them actually.

    When I read, "should", "must", etc... I look for the context, because the context gets at the intent and that can be very different. If one were to apply those ideas to these Propeller devices we like so much, they quite simply would not exist. Does that make sense? Of course not, and the intent behind them is very different from other kinds of things we see out there and all of that impacts whether or not we can, should and will trust the black boxes.

    Mix in legal and suddenly, you've got a black box that really can't be pried open. There is that forced trust again. It comes up a lot in life. For some of us, it might not come up much. Our means are sufficient to pay down the worries, our goals may be very well aligned with majority use cases, etc... and we can ignore the voodoo, trust it and go and get stuff done. Perfectly reasonable.

    However, when our goals aren't aligned in this way, be it personal or business or both, knowing something about the voodoo may in fact be worth everything! It may be needed to compete, stay out of trouble, enable expression despite those who would suppress it, and it often leads to bigger and better things, because no matter how hard we work at it, there is always somebody out there with a different body of experiences, way of thinking, overall smarts, means, and goals that just don't match up with those best case black box voodoo things.

    ***And I've called it voodoo ever since I learned what that means. Primary school basically. And it is, unless one can understand the theory of operation, and compare that to the observed operation and means and methods employed to realize said operation.

    I'm not gonna tell you that you are wrong for idealizing this to the point of the perfect box that nobody wants to or has to worry about. You aren't wrong for wanting to get there. I am going to tell you absolutely that we are just not there yet, and while that is true, it is wise for at least some of us to pick at things so that we keep the honest people honest and that we enable those who would think the kinds of thoughts that bring us even better technology too.

    That, in a nutshell, is why I like assembly language. I'll use voodoo, and do regularly. But I also like to have the skills needed to go digging really deep when I need to, or at least some idea of how that might be done. Why? Because down deep, I kind of worry that I might have to. And so far in my life experience, I've had to.

    One last thing. Our life experiences impact us. I know I think somewhat differently than many of my peers. That's OK. Their experiences take them different directions. So there aren't wrong answers here. Just ideas and minds and the idea that the sharing of them makes us better.