Becoming a Better Programmer

Comments

  • Cluso99 Posts: 13,916
    How about full AT command set, soft UART bit-banged, auto baud, and driving modem DSPs, all in 3.7KB. MC68705U3S. Did that in the mid 80s.
    My Prop boards: P8XBlade2, RamBlade, CpuBlade, TriBlade
    Prop OS (also see Sphinx, PropDos, PropCmd, Spinix)
    Website: www.clusos.com
    Prop Tools (Index) , Emulators (Index) , ZiCog (Z80)
  • I keep hearing this refrain about "poor programming" now a days.

    On the other hand we know that people like Google have invested a lot in creating efficient, optimized, programming environments. Like Clang/LLVM for example. There are some very smart and efficiency obsessed guys working on such things. The Chrome browser, for example, may use a ton of memory but it would be much bigger and slower if they had not put that effort in.

    So where does the "bloat" come from? Let's look at a simple example, the "Hello World" program:
    #include <stdio.h>

    int main(int argc, char* argv[])
    {
        printf("Hello World!\n");
        return 0;
    }
    What is wrong with that? It can probably be written in a few tens of bytes in assembler.

    Well, the "Hello World!" probably needs to be in Unicode. Which already requires a huge pile of "bloat" to handle.

    Then of course it needs internationalizing; that "Hello World!" should come out in whatever the user's language is. Another pile of bloat.

    Then of course nobody uses the command line anymore so that needs a GUI Window to open to display the message. Another pile of bloat.

    Well, with that GUI you need to support all kinds of fonts, anti-aliasing etc, etc. More bloat.

    Now, you might need this program to work on multiple platforms, Linux, Mac, Windows at least. Not to mention iOS and Android. That means using a cross platform GUI toolkit. Another pile of bloat.

    It's hopeless. But every byte of that bloat is required.

    That about sums it up.

    Last night I did a search of all bmp and jpg files on a particular system, and I was amazed at the results. Photos and images I have never seen before. Resources included in exe files are another great source of bloat.


    Novel Solutions - http://www.novelsolutionsonline.com/ - Machinery Design • - • Product Development
    "Necessity is the mother of invention." - Author unknown.

  • Heater. Posts: 21,121
    But is it right to call it bloat?

    Just now I have a bunch of apps open and 20 odd tabs in my browser. Images and graphics abound. It's all using 50% of the 8GB of RAM on this machine.

    Given that I like to have all that, and given that such features take space, and given that the whole deal cost half the price of my CP/M machine maxed out with WordStar in 1981, what's the problem?

    There is one thing I run here that I do call bloat. The IntelliJ IDE. That thing takes two minutes to start up and get itself into a usable state. Then consumes 25% of my RAM.

    God knows why. Except, well, it's written in Java.

  • Heater. Posts: 21,121
    edited May 5
    So there I was, tenaciously becoming a better programmer. Four or more days of intense hacking on some C++. Carefully formatting and refactoring everything nicely, checking through all the comments, etc. I have to deliver this source to a partner company soon so it had better look good.

    This was so intense that I forgot one thing... Get it pushed up to BitBucket. Or at least backed up somewhere.

    This morning I thought I'd better do that before my computer dies or whatever other disaster.

    But no, I had to do one more little thing first....

    POOF, I inadvertently deleted the whole directory!

    So today I had to start over, which was annoying but not so bad as the whole thing was still fresh in my mind.

    On the bright side, the new recreation is better than the original approach.

    Some well-known software engineer said "Be prepared to throw away the first version".







  • Tor Posts: 1,951
    edited May 4
    Fred Brooks (in his "The Mythical Man-Month: Essays on Software Engineering") said "plan to throw one away; you will, anyhow." True, that.

    He had more, e.g. Fred Brooks' law: "Adding manpower to a late software project makes it later."

  • Heater,

    Is there an Auto Save or Auto Backup option?
    I can't remember which CAD program it was, but it would save to a temporary file each time you made a change;
    it was neat because if the program crashed you wouldn't lose everything.
    I once had SolidWorks lock up and I lost hours of work.


    I remember using Lynx but I've never seen DOOM in text-mode.
  • @Heater
    Whatever the basic Dropbox subscription is, I have it. Delete a sync'd file on your system and you delete it from Dropbox...BUT Dropbox keeps a history where you can go back and retrieve previous saves. I don't know if there is a limit, probably depends on available storage space.

    Saved my posterior, a while ago.
    PropBASIC ROCKS!
  • Heater. Posts: 21,121
    Ah, it was Fred. Thanks. True dat.

    I almost had to buy a copy of "The Mythical Man-Month" for the bosses of every new company I worked for.

    Said bosses get a bit out of shape when I explain that their project proposal may well require 100,000 lines of code. Historically a software engineer produces about 10 lines of code per day, as Fred pointed out, so the whole project could take about 40 man-years.

    So no, I won't have a demo ready by next week.

    And, by the way, you should plan to throw that first version away!
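    The arithmetic behind that estimate, sketched out (the 250 working days per man-year is an assumed figure, not from the post):

```python
# Back-of-the-envelope check of the "40 man-years" figure above.
lines_of_code = 100_000
lines_per_day = 10             # Brooks's historical productivity figure
working_days_per_year = 250    # assumed value, not stated in the thread

days = lines_of_code / lines_per_day         # 10,000 days of effort
man_years = days / working_days_per_year
print(man_years)                             # 40.0
```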



  • WHAT ON EARTH???
    (After post-psychological-shock-deep-breaths)
    This looks like a pixelated 1st person shooter game that was video shopped into the command shell video.
  • Heater. Posts: 21,121
    AwesomeCronk,
    WHAT ON EARTH???

    It's the game "DOOM". Probably the first majorly successful 3D first person shooter and one of the most famous games in history.

    No need for it to be video shopped; there have been a few cases of people modifying DOOM to render its graphics as text instead of the usual pixels.

    Years ago I saw one that rendered to plain old black and white text characters. Can't find it now.
  • Heater,

    Did you play barney Doom or Castle Smufenstein?
  • Heater. Posts: 21,121
    No I didn't. I'm too old to not loathe and detest the Smurfs.

    I did once hack some resources in Quake to put the faces of a few of my friends on the walls in odd places. Said friends were awestruck to find themselves in the game when they came over to play.

    I was never much of a gamer. The last straw was when, after many hours of play, I scored 100,000, the highest possible score, in Starglider on my Atari ST520. At which point it ranked me as "Cheat". I was so pissed off I have pretty much never played any games since.

  • frank freedman Posts: 1,411
    edited May 6
    Cut and moved. Not sure how to delete.

    Ordnung ist das halbe Leben ("order is half of life")
    I gave up on that half long ago.........
  • Don M Posts: 1,572
    Heater. wrote: »
    RS_Jim,

    Assuming you already have a project started in some directory and assuming you have git installed on your machine I would proceed like this:

    1) Create a git repository out of your existing code.

    $ cd myProject
    $ git init

    That creates an empty repository in a hidden .git directory inside myProject; your files are untouched.

    2) Add your code to the git repository:

    $ git add thisFile otherFile thisDirectory otherDirectory ...

    Or you can just do:

    $ git add *

    That will add everything. But that may add a lot of binaries and other stuff which are not source but created from the source, which should perhaps not be in the repo.
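    One way to keep that generated stuff out of the repo (a sketch; the file names and ignore patterns here are invented, not from the post) is a .gitignore file, which "git add ." respects:

```shell
# Sketch: keeping build products out of a repo with .gitignore
# (file names and patterns are invented for illustration).
set -e
dir=$(mktemp -d); cd "$dir"
git init -q .
git config user.email you@example.com
git config user.name you

printf '*.o\nbuild/\n' > .gitignore   # ignore object files and a build dir

echo 'int main(void){return 0;}' > main.c
touch main.o                          # a build artifact

git add .                             # respects .gitignore, so main.o is skipped
git status --porcelain                # shows .gitignore and main.c staged, not main.o
```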

    At this point you could still be changing or creating new files and "git add" them as you like.

    3) At this point your code is not actually committed to the repo. You could hack it some more and "git add" it again. When you want to commit it as an actual version you have to use "git commit"

    $ git commit -m "My initial git commit"

    Now you have a version in the git repo. With a message logged describing what your changes were.

    4) You can view your log of changes with:

    $ git log

    5) You can view the status of your repo with:

    $ git status

    That will tell you what files you have changed or added that are not yet committed. And other useful stuff.

    After that, my day to day working is just, edit some source files, do a "git add" when they look good. Then "git commit". Job done.

    Then you might want to put your repo on github.com or bitbucket.org for safekeeping and possible sharing. I suggest just signing up for github and following their very clear instructions on how to create a new github repo and get your code into it.

    When you have a github or bitbucket repo up then you can do the following:

    $ git push

    Will push committed changes from your machine to the github repo.

    $ git pull

    Will pull changes from the github repo to your local machine.

    This push/pull thing is neat because you may find yourself hacking your code on different PCs here and there, as I often do. No problem: push changes from one machine, pull them to another.

    Then there is cloning, as in:

    $ git clone https://github.com/ucb-bar/riscv-sodor.git

    Which is good if your machine does not have the repo yet, it just fetches the repo you want.

    Personally, as a beginner I would advise ignoring all talk of branching, merging, rebasing etc. Why?

    Because, an important thing to realize is that the repo on your machine, and the one on github are exactly equivalent. There is no "master".

    That means that when you hack code on your machine and commit it, it is a branch from github or wherever it came from. Until you push your changes.

    If you find you have messed up your branch, you can always just delete the whole directory and "git clone" it back again.

    And, well, all that branching, merging, rebasing stuff is complex. I almost never need it.


    Thanks Heater. Your simple instructions just made it easier for me to understand git.

  • Heater. Posts: 21,121
    I'm glad that helps.

    Be warned that git has a billion other features. Which is great when you need them. But it does lead to a billion blogs, posts, etc. around the net that really confuse the issue for a beginner.

    For one guy hacking on a project it's quite enough to get by without all the branching, merging, rebasing talk.

    But here is a neat trick:

    Let's say you hack on file A to fix a bug X.

    Then you hack on file B to fix bug Y. These changes are not otherwise related.

    Now you want to commit those changes.

    A good idea is to do this part by part:

    Do a git add of file A and then commit with a comment like "Fixed bug X"

    Then do a git add of file B and commit with a comment like "Fixed bug Y"

    Now in your git log you have a clear record of what happened. As opposed to the confusion in the log that would happen if you added both files and committed with a comment like "Fixed bug X and Y".

    Or imagine you have changed a dozen things in a week and then do a single commit. Then your log entry would say "Fixed X, Y, Z, reformatted this and that, added features P, Q". Impossible to disentangle later if you need to.
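    As a concrete sketch of that habit (the file names fileA.c and fileB.c are invented), in a scratch repo:

```shell
# Sketch: one commit per unrelated fix, so "git log" stays readable.
set -e
dir=$(mktemp -d); cd "$dir"
git init -q .
git config user.email you@example.com
git config user.name you

echo 'fix for bug X' > fileA.c        # hypothetical file touched for bug X
echo 'fix for bug Y' > fileB.c        # hypothetical file touched for bug Y

git add fileA.c
git commit -q -m "Fixed bug X"

git add fileB.c
git commit -q -m "Fixed bug Y"

git log --oneline                     # two entries, one per fix
```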
  • Tor Posts: 1,951
    I nearly always use 'git add --patch', which lets me leave in a lot of printfs and debug and test code and whatnot, and still commit my *real* (and tested) changes.
  • Heater. Posts: 21,121
    Tor,

    I find what you are suggesting with "add --patch" somewhat disturbing. Because:

    1) The code you are committing is not actually the code you tested. It's not uncommon that bugs get masked by the introduction of printfs and other debug checks.

    2) If I were to push that to github or whatever from one place and then pull it to a machine in other place, as I often do when working at home, office, and elsewhere, then I don't have all that nice printf/debug/test code in place. Nor would anyone else collaborating on that code.

    3) If you have test harnesses, unit tests, or whatever in place then they should be part of the repository. Then the whole thing can be cloned anywhere and the tests run.

    Using "add --patch" as you suggest would not fit the way I work.

    Having said that, "add --patch" is very useful. Let's say that in one day you have made three changes to the same file to fix three bugs. Well, then you can add and commit the whole thing, in which case the commit comment should say "Fixed bugs A, B, C". But better is to add and commit the three changes (patches) one at a time, with appropriate commit comments. Then the commit history explains what has been done more clearly. And if need be the changes can be rolled back, one by one.

    But better yet is to remember to just "add" and "commit" for every little fix. Then you don't have to mess around with "add --patch".

  • cherry-pick is another great way to break up multiple commits into separate branches or pull changes into another branch.
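    For instance (branch and file names invented for illustration), picking just one of two commits from a work branch onto the original branch:

```shell
# Sketch: git cherry-pick moves a single commit between branches.
set -e
dir=$(mktemp -d); cd "$dir"
git init -q .
git config user.email you@example.com
git config user.name you

echo base > base.txt
git add base.txt
git commit -q -m "base"

git checkout -q -b work               # a branch with two unrelated fixes
echo 'fix X' > x.txt
git add x.txt
git commit -q -m "Fixed bug X"
echo 'fix Y' > y.txt
git add y.txt
git commit -q -m "Fixed bug Y"
fix_x=$(git log --format=%H --grep='Fixed bug X')

git checkout -q -                     # back to the original branch
git cherry-pick "$fix_x" >/dev/null   # bring over only the bug-X commit
```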
  • Heater. Posts: 21,121
    edited May 9
    Yes it is.

    My simple idea, when introducing beginners to git, is to not talk about branching and merging, add --patch, rebasing and so on. That just confuses everyone. Useful when you need it but most of the time, on a one man project or with a few collaborators on github etc, not needed.



  • Whit Posts: 3,975
    Heater. wrote: »
    Yes it is.

    My simple idea, when introducing beginners to git, is to not talk about branching and merging, add --patch, rebasing and so on. That just confuses everyone. Useful when you need it but most of the time, on a one man project or with a few collaborators on github etc, not needed.

    I am with you Heater - little bites at a time.

    Whit+

    "We keep moving forward, opening new doors, and doing new things, because we're curious and curiosity keeps leading us down new paths." - Walt Disney
  • Tor Posts: 1,951
    edited May 9
    Heater. wrote: »
    Tor,

    I find what you are suggesting with "add --patch" somewhat disturbing. Because:

    1) The code you are committing is not actually the code you tested. It's not uncommon that bugs get masked by the introduction of printfs and other debug checks.
    That is true. But: if I work on completely different functions at the same time, as I do, then it has a lot of merit to commit the changes that have been verified, e.g. by unit-testing a function, even though another completely unrelated function in the same source has a 'printf' in it.

    And, what's more, I always work on a development branch. My private one. What ends up in the master branch (or whatever the release branch is called) is always the tested, committed, nothing-else stuff.

    Besides, when I test and implement stuff I do it on my own repo, where I also rebase and rebase -i all the time, changing history but ending up with a very nice flow of incremental changes reflecting the development, which is very good to have when I get back to a project later (sometimes a decade later, which I have done). I just look through the commits and that deploys the project back into my mind in an understandable way.

    [..]
    I really recommend that you try to work with --patch in practice. Very few that do will go back to wholesale commits - there are many uses, depending on the individual. More than I've mentioned.

    (one is that when I edit a file, I often end up with unintentional whitespace changes - an extra space on an otherwise empty line, or two blank lines where I want a single one. Instead of trying to edit the file to remove any whitespace changes that weren't intentional, I use add --patch to add just the real diff. After that I end up with a file where 'git diff' only shows whitespace changes. At that point I can do 'git checkout -f' (or -f -- filename) and get the cleaned-up file back, without the unintentional whitespace changes. That actually helps a lot for work stuff - we always review changes by others, and the cleaner the commit is the better.)
    But better yet is to remember to just "add" and "commit" for every little fix.
    Well, yes, just don't end up with CVS - which really was just a way of doing snapshot backups. It wasn't version control.
    Then you don't have to mess around with "add --patch".
    All I can say is - try it. It's not the same thing at all.


  • Heater. Posts: 21,121
    Don't get me wrong, Tor, "add --patch" and many other git features are great.

    It's just that I like to keep things simple. Most of what I work on only has a master branch, it only moves forwards. Unless something gets hosed!

    I do agree about keeping the commit history "clean". I hate it when people have a commit comment that says "Fixed bug X" but when you look at the commit there is a ton of other, unrelated changes in there. Often just white space/formatting changes they sneaked in, or perhaps some refactoring. It confuses the issue when you look back over the history. That is why I like to make a lot of git adds and commits. Often many per day. One thing at a time.

    Except of course I do have many branches. The final product is in Github or BitBucket but I clone it and hack on things in my office. Then again at home. Or perhaps on my laptop whilst out and about traveling. Sometimes that means I end up conflicting with myself! But usually it is easy to get everything merged back into a whole.

    I don't usually have a need for a master branch and then other branches for major changes/experiments.


  • I am not at all sure what GIT has to do with being a better programmer.

    Sure, a source control and versioning tool can make life easier, but it has nothing to do with good coding.

    I can write good and bad code with a pencil and paper.

    First you need to understand the problem itself. Do not guess; find out the hard facts surrounding the problem. Every hour spent understanding the problem saves multiple hours of coding in the wrong direction.

    Second you need to understand the limitations of the hardware used to run your code. This will also set boundaries around the possibility of a solution.

    Third you need to relax and just think. This is where the experience of a programmer shows. Thinking through multiple solutions to ponder feasibility and the amount of work. Just sit there and think. Run some test code to verify assumptions about the problem.

    The next step is the actual coding. Some people, like Peter J., prefer to start at the bottom, coding needed subroutines and gluing them together later; others prefer to work from the top down, outlining the solution and filling it in further down. Normally one needs a mix of both.

    But usually, at that point, a programmer has the complete solution solved in his head and just needs to write it down. Still mostly independent of the language to use.

    Life is good, everything runs smoothly until it does not.

    Because even the best of the better programmers makes mistakes in steps one to three. Always. And management changes things in between. Also always.

    And at that point in development consistent and modular programming pays off. Since you avoided duplicates in your code :smile: the addition of a third button on some screens is an easy fix at just one point in the code. At least you think so...

    Actually here is where the fun begins while programming. Because the next idiot to fix that code next year is - hmm - yourself.

    And programs you write are like boomerang-children. Mine are hunting me down even 20+ years after I wrote them. Still running, somewhere, and in need of some TLC.

    Enjoy

    Mike
    I am just another Code Monkey.
    A determined coder can write COBOL programs in any language. -- Author unknown.
    Press any key to continue, any other key to quit

    The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this post are to be interpreted as described in RFC 2119.
  • msrobots wrote: »
    ...just need to write it down. Still independent of the language to use, mostly.

    In other words, avoid jumping into actual coding before understanding the problem.

    This appears to be very difficult to do; I'm guilty of not "writing it down, before". And the majority of my students don't understand the concept either. And usually, the "jumpers" have the most problems in terms of compile errors and logic-flow issues.

    Mastering this step can save a lot of time and frustration.

    DJ
    Well-written documentation requires no explanation.
  • Whit Posts: 3,975
    Thanks again to you all for your thoughtful comments and suggestions. I have learned a lot in this thread!
    Whit+

    "We keep moving forward, opening new doors, and doing new things, because we're curious and curiosity keeps leading us down new paths." - Walt Disney
  • Just like in writing, for something short and simple a formal outline isn't required, though it is highly recommended.
    I've done many writing assignments, essay questions, and small programs with the outline in my head.
    On large projects I've had to write things down, especially the portions that I needed to research or figure out.

    One trick I have for hard problems that I can't solve is I go off and do something else and sooner or later a solution will come to me.
    Be sure it's something that is fun or relaxing so your mind is free.
  • kwinn Posts: 7,976
    Genetix wrote: »
    Just like in writing, for something short and simple a formal outline isn't required, though it is highly recommended.
    I've done many writing assignments, essay questions, and small programs with the outline in my head.
    On large projects I've had to write things down, especially the portions that I needed to research or figure out.

    One trick I have for hard problems that I can't solve is I go off and do something else and sooner or later a solution will come to me.
    Be sure it's something that is fun or relaxing so your mind is free.

    That is something I always do when I am stumped. My favorite inspiration-inducing activity when working at one of the universities was to go for a coffee and some sightseeing in the cafeteria or on the patio. Nothing like it for getting the gray matter working.
    In science there is no authority. There is only experiment.
    Life is unpredictable. Eat dessert first.
  • Whit Posts: 3,975
    Whit+

    "We keep moving forward, opening new doors, and doing new things, because we're curious and curiosity keeps leading us down new paths." - Walt Disney
  • AwesomeCronk Posts: 316
    edited May 29
    Heater. wrote: »
    No I didn't. I'm too old to not loath and detest the Smurfs.
    I share your feelings towards them!

    I have never seen anyone run a game in a command shell before. Maybe what he typed was a predefined user command(?) that opened and ran it in the command prompt window. I tried to learn the command shell a while back, but couldn't make sense of it.
  • I like to just code on the screen. If I need to make changes, whiteout works wonders but then writing on it has some drawbacks. The pens don't like being held with the ball tip pointing up... seems to cause some ink flow problem. ;)
    My Prop boards: P8XBlade2, RamBlade, CpuBlade, TriBlade
    Prop OS (also see Sphinx, PropDos, PropCmd, Spinix)
    Website: www.clusos.com
    Prop Tools (Index) , Emulators (Index) , ZiCog (Z80)