
A good coverage software test suite for the whole Propeller1

overclocked Posts: 80
edited 2014-08-13 13:13 in Propeller 1
Has anyone seen or started writing one of those?

My thought was to cover the full instruction set for both HUB and COG operations, including the corner cases, and have the test itself report a problem when flags/registers/results aren't as expected. Maybe even test the full RAM/ROM space to be sure that everything is working.
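To make that concrete, here is a rough, untested sketch of what a single self-checking case could look like in PASM. The choice of ADD as the instruction under test, the register names, and the bare spin loops for pass/fail are only placeholders:

    DAT
                  org       0
    add_test      mov       r0, allones             ' load $FFFF_FFFF
                  add       r0, #1 wc, wz           ' expect result 0 with C=1 and Z=1
      if_nc       jmp       #fail                   ' carry flag not set -> error
      if_nz       jmp       #fail                   ' zero flag not set -> error
                  cmp       r0, #0 wz               ' also check the stored result itself
      if_nz       jmp       #fail
    pass          jmp       #pass                   ' all checks passed, spin here

    fail          jmp       #fail                   ' a real suite would report an error code here

    allones       long      $FFFF_FFFF
    r0            res       1

Each instruction would get a handful of such cases around its corner values (0, $FFFF_FFFF, the sign bit, and so on), each with its own error code.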

What do you guys think of this idea? Are there any takers for such a programming exercise? I think that I'll keep going with my Xilinx implementation, but it would be great to do this as a joint venture for future porting.

It would be good to have such a project set up for both simulation and real download. A real HW implementation could report a binary error value on LEDs (like the 8 LEDs connected from the beginning), or drive a full-blown VGA terminal showing an error code that corresponds to a specific test, or both. Maybe add the possibility to run the tests in an endless loop which halts on the first error.
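As a sketch of the reporting side (assuming the 8 LEDs sit on P16..P23 as on the Demo Board; the two sample tests, error codes and labels are invented for illustration), the harness could run each test subroutine, show the binary error code of the first failure on the LEDs and halt, and otherwise loop endlessly:

    DAT
                  org       0
    entry         mov       dira, led_mask          ' P16..P23 as outputs for the error LEDs
                  mov       outa, #0
                  mov       errcode, #0

                  call      #test_sub               ' each test loads errcode on failure
                  tjnz      errcode, #report
                  call      #test_shl
                  tjnz      errcode, #report
                  jmp       #entry                  ' everything passed: run again (soak/overclocking test)

    report        shl       errcode, #16            ' line the code up with the LED pins
                  mov       outa, errcode           ' show the binary error code of the first failure
    halt          jmp       #halt                   ' halt on the first error

    test_sub      mov       r0, #5
                  sub       r0, #5 wc, wz           ' 5-5: expect a zero result and no borrow
      if_nz       mov       errcode, #1             ' error code 1: Z flag wrong after SUB
      if_c        mov       errcode, #1             ' error code 1: C (borrow) flag wrong after SUB
    test_sub_ret  ret

    test_shl      mov       r0, #1
                  shl       r0, #31                 ' move the 1 into the top bit
                  cmp       r0, sign_bit wz
      if_nz       mov       errcode, #2             ' error code 2: SHL result wrong
    test_shl_ret  ret

    led_mask      long      $FF << 16
    sign_bit      long      $8000_0000
    errcode       res       1
    r0            res       1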

I think this could be useful both for overclocking/stability tests and as a reference test.

Actually, this doesn't in any way require the person doing it to have an FPGA board or anything. It should of course run both in emulators on PCs and on the real Propeller.

Comments

  • pik33 Posts: 2,366
    edited 2014-08-12 09:04
    I have a VGA terminal working, and this itself is a good test platform. A 2-cog VGA driver plus high-level Spin code runs without any problems, so it seems the Propeller is working.

    Of course a test program created especially to test an experimental VGA Propeller implementation will be useful. Something like the Acid800 test suite for the Atari 8-bit.
  • Heater. Posts: 21,230
    edited 2014-08-12 13:10
    I do like the idea.

    A basic instruction set test, a "sanity check" is always good.

    But, having sometimes been involved in CPU hardware testing, I have become convinced that it is basically impossible to test everything. Even thoroughly testing something as seemingly simple as RAM is impossible.

    There are just too many combinations of inputs, sequences of inputs, and timing issues; there is no way to test that things always work as expected.

    Yes, you can make sure you have tests that "cover" every statement in your source code. And that is a very good idea. It is only just the beginning though.
  • overclocked Posts: 80
    edited 2014-08-12 13:34
    Heater. wrote: »
    I do like the idea.

    A basic instruction set test, a "sanity check" is always good.

    But, having sometimes been involved in CPU hardware testing, I have become convinced that it is basically impossible to test everything. Even thoroughly testing something as seemingly simple as RAM is impossible.

    There are just too many combinations of inputs, sequences of inputs, and timing issues; there is no way to test that things always work as expected.

    Yes, you can make sure you have tests that "cover" every statement in your source code. And that is a very good idea. It is only just the beginning though.

    Absolutely! "Testing something is better than nothing" is a better way to put it. Great to hear that there are supporters for such an idea. As soon as I boot my first Propeller, I'll start looking at that. I hope that will be this coming weekend.
  • SRLM Posts: 5,045
    edited 2014-08-12 13:59
    I like the idea of unit testing. So far as I know, libpropeller has the only unit testing in Propeller land:

    https://github.com/libpropeller/libpropeller/tree/master/libpropeller/unity_tools

    This is in C++. For this I modified the Unity test framework (PC C) to run on the Propeller, and developed a specific board with specific peripherals to test on:

    [Attached image: unit_tester.JPG]

    Good luck.
  • prof_braino Posts: 4,313
    edited 2014-08-12 20:08
    overclocked wrote: »
    Has anyone seen or started writing one of those?

    I don't know if this is what you want, but here goes....

    The way we test propforth is we have it recompile itself. It exercises most of the Forth code as it builds itself from source. It starts with the slow, Forth-only versions, gradually rebuilds parts in assembler, and produces new kernels with the optimized code. Once the optimized dev kernel is complete, it builds the EEPROM and SD file system versions. It's all done with a series of scripts, and the output of each script is logged.

    Error checking is very simple. We just compare the results of THIS run with the results of the previous run. If anything is not the same, we look at it. Either we modified the source and expect a change, or we didn't modify the source and/or don't expect a particular change, and investigate further. The first time through we had to check everything; on subsequent passes, we only have to check changes. Finally, we compile from source, and use the result to compile from source again; we check that both results are identical. That way we know the kernel is at least stable.

    We also make additional scripts to check external hardware, like the Spinneret boards. And Sal has his own libraries of test automation for his custom boards. It's designed so it's easy for us to set up our own custom tests of whatever.

    Since it's all done with scripts, and interpreted, and logged, the entire process can be automated. The only task left for a human is to start the build automation process and examine any differences that pop up. In the case of propforth, errors don't occur anymore, as the automation is effective in identifying them (and Sal is a guru at determining root cause and effecting a proper change; this last part is of course the part that cannot be automated).

    Of course, this does not check every opcode of the Prop, just the ones used to create Forth (and the drivers for the custom hardware, etc.). But it should be fairly simple to write a script that exercises any given stray opcode or set of opcodes, make a test for it, and make it very easy to spot if something goes differently from expected.

    You could implement the same thing in the language of your choice, or just run the propforth build automation (you don't necessarily even have to know Forth, just have the automation point out differences for you).
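    For example, in PASM rather than a propforth script, a single stray-opcode check in that spirit might dump its outputs to hub RAM so that a host-side script can diff them against the previous run's log. This is only an illustration: the choice of REV as the opcode under test, and the buffer address handed over via PAR by a Spin launcher, are assumptions:

        DAT
                      org       0
        rev_dump      mov       addr, par               ' hub buffer address passed in by the Spin launcher
                      mov       count, #32
                      mov       value, #1
        :loop         mov       tmp, value
                      rev       tmp, #0                 ' reverse all 32 bits of the test pattern
                      wrlong    tmp, addr               ' log the result to hub RAM for the host to diff
                      add       addr, #4
                      shl       value, #1               ' next single-bit test pattern
                      djnz      count, #:loop
                      cogid     tmp
                      cogstop   tmp                     ' done: stop this cog

        addr          res       1
        count         res       1
        value         res       1
        tmp           res       1

    The host side just dumps those 32 longs after each run and compares them with the previous log, which is exactly the "anything different gets looked at" rule described above.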

    The intent is "minimum necessary and sufficient to completely address the issue"; there are few bells and whistles, but it's fast and very effective.


    EDIT: This was implemented after the test cycle had grown to over five weeks of manual testing. Now the build process takes under two hours, and the automated test scripts take about two and a half hours. The limiting factor is the speed of the Prop; the cycle time is pretty much the same even on a slow PC (unless the PC is extremely slow or running Windows and we have to lower the baud rate).
  • msrobots Posts: 3,709
    edited 2014-08-12 20:44
    @Prof_Braino,

    I usually groan when I read Forth somewhere in the forum, since I can't wrap my head around it. I tried and failed. Too much COBOL in my life. RPN does not work for me. Sad as it is.

    But for a test-case your proposal should be PERFECT.

    It should deliver the same results on P1 or FP1. Hopefully.

    Not in that game yet, missing time and FPGA...

    Mike
  • prof_braino Posts: 4,313
    edited 2014-08-13 13:13
    msrobots wrote: »
    But for a test-case your proposal should be PERFECT.
    Mike

    In that case, I will keep this in mind. We will look at packaging up something to do a "one-step build and test cycle". The result is code that, when executed, always produces the same result. We could remove things that ALWAYS change, like time stamps. Of course some things must change, like maybe the run number, but we might get something folks can use without having to actually do Forth.

    https://code.google.com/p/propforth/issues/detail?id=216