Parallax Propeller 2 Multicore Microcontroller Release Date Update

13 Comments

  • Publison Posts: 9,768
    edited October 21
    'You could use ViewPort'

    Is there a free version of ViewPort on the Parallax website?

    EDIT: Answered above.
    Infernal Machine

  • Publison

    Do you recommend ViewPort in any version?
    JUNIOR ENGINEER
  • I don't even know what a pocket protector is. It seems to be an American thing.

    But I'm all for brandishing soldering irons and slide rules!



  • Publison

    Do you recommend ViewPort in any version?

    Personally, I am never satisfied with Lite versions; I always end up upgrading after a while anyway. With Hanno's ViewPort I went straight for the Ultimate, since it included many enhancements I could use. The full price is not bad for what it does.

  • Pocket protectors were inserts for the shirt pocket of the ironed white shirt of any engineer, to protect the pocket from the pencils he had to show off. The more pencils, the better the engineer.

    Mike
    I am just another Code Monkey.

    A determined coder can write COBOL programs in any language. -- Author unknown.

    The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD", "SHOULD NOT", "RECOMMENDED", "MAY", and "OPTIONAL" in this post are to be interpreted as described in RFC 2119.
  • msrobots wrote: »
    Pocket protectors were inserts for the shirt pocket of the ironed white shirt of any engineer, to protect the pocket from the pencils he had to show off. The more pencils, the better the engineer.

    Mike

    They are also called "nerd buckets".
    Re-inventing the wheel is not a waste of time if, when you are done, you understand why it is round.
    Cool, CA, USA 95614
  • That is what I thought. Some mythology about "engineers". Probably created by the manufacturers of pocket protectors.

    I have worked with serious engineers for decades. Never did see a pocket protector.

    A quick Google image search for "engineer pocket protector" shows how fake most of those "engineers" are.

    Except this:

    [Image: Neil Armstrong quote, "I am, and ever will be, a white-socks, pocket-protector, nerdy engineer, born under the second law of thermodynamics."]

  • I used ViewPort in the beginning, since the P1 has no step-based debugging at all. But as soon as I moved on from just programming it, learning the languages, and playing with multiple cores to actually connecting something, I found it easier to use a serial terminal, a scope, or even just a beeper or an LED.

    Even in my daily work, far away from microcontrollers, I rarely need to use a debugger. It's there, and it's even nice in Visual Studio, but for most bugs a debugger is overkill and sometimes no help at all.

    As soon as you have a bug in a time-critical part of your program, you simply cannot single-step it anymore, or the timing is dead in the water. Take a TV or VGA video driver: you can't single-step it, because then the monitor doesn't work. The same goes for communication with another chip over SPI or I2C; the other chip may not support the very slow communication while you step through your code by hand.

    For real-time debugging the P1 is perfect. Run your code in one cog, run your debugger in another cog, and use buffered serial output in a third cog so that the debugger never has to wait for the output.

    Now you are running in real time, sharing some HUB locations to transfer values out of your buggy code so you can figure out what happens. Just wonderful: no interrupts needed, and nobody can step on anyone else's toes.

    @Chip's creations often step outside the bounds of derivative thinking; that is what drew me to the P1 and makes me so interested in the P2. Compared with all the other MCUs at the time the P1 came out, it was completely different.

    WHAT, 8 cores with 32 bits? WHAT, no interrupts? WHAT, no built-in peripheral drivers, everything software-driven? WHAT, it can drive TV and VGA by itself?

    There are a lot of those WHATs in the P2 as well: things that get done differently from the usual way, because @Chip thinks the usual way is not the best way to approach solutions.

    Sure, you can program the P1 in C, Spin, PASM, Basic, or Forth and be a master of one of those languages. Still, you will need to rethink the way you are used to programming. That is the beauty of @Chip's creations: they offer far more than the multicore on other systems. Just a little bit more thought in the details, everywhere you look.

    PASM is a very nice assembly language, and I have used many of them. PASM2 is slightly different, but when I read @Chip's or ozpropdev's code snippets I already know that I will like it very much.

    Enjoy!

    Mike
  • msrobots,

    As much as we like to disagree with each other, I find that I could have written your post above myself!

    I never understood the endless calls for debuggers around here. Debuggers are for the weak-minded who don't understand the code they have written.

    Of course, often I don't understand the code I have written myself, especially after not having looked at it for some time. But in my experience the hard bugs come down to some weird time-dependent interaction between multiple processes, or even just interrupt handlers, which a debugger cannot help you find and which it often masks.

    On the other end of the spectrum, debuggers are pretty useless at finding memory leaks and buffer overruns in large scale programs.

    I wonder how the typical COBOL programmer would get along if COBOL had the capability of parallel processing? :)


  • Wait, multiple jobs on the same computer at the same time? Interesting concept; we should put that in the COBOL2021 standard.

    Mike
  • I stopped using debuggers during development several decades ago. It's pointless if you understand your code, as Heater said. If you don't understand the code, then the remedy is to rewrite it so that you understand it, not to use a debugger to try to figure out what it does.
    I fire up gdb now and then for exactly one reason: when somebody presents me with an executable and a core dump and asks me to find out where it crashed. Gdb shows the stack trace.

    For memory leaks, use Valgrind or something similar if you're on a supported platform.
  • The_Master Posts: 78
    edited October 22
    Mike Green wrote: »

    There are real tradeoffs between die size and cost. A given package will only hold a die up to a specific size; beyond that, the packaging costs go way up. Stacked dies can be used, but again the cost goes way up.

    Has Parallax ever considered doing any of their packaging or testing in-house? I would think they'd have the capability of doing this. It would get some manufacturing costs down, and then they could do all sorts of neat things, like putting the Prop and flash in the same package.

    I am the Master, and technology my slave.
  • Is it just me... or does microcontrolleruser have the writing style and overall gestalt of humanoido?

    About Shannon... notice that Shannon did not define "meaning."

    As far as I can figure it: without meaning, no information is received regardless of how efficiently the transmitted message is encoded.

    I think Shannon put a major hole in the Second Law. Can a discipline exist without its fundamental laws?

    What is order now? It can't be what it was then:)


  • Let’s say you have a codebase written by multiple people. Some variable in memory is getting trashed. If you have a debugger supporting flexible hardware watchpoints you wouldn’t use it to help track down the offender?
  • rjo__
    Is it just me... or does microcontrolleruser have a writing style and the overall gestalt of humanoido?
    Interesting idea. My suspicion is that microcontrolleruser is actually some kind of AI algorithm running on Humanoido's million Propeller parallel Big Brain computer :)

    My apologies, microcontrolleruser, if you are actually a human. It's hard to tell nowadays, and I don't like to be discriminatory.
    About Shannon... notice that Shannon did not define "meaning."
    Of course not. Mathematicians, scientists, etc. do not talk about "meaning". How can they? There is no rigorous definition of it, no way to measure it, no way to reason about it.
    As far as I can figure it: without meaning, no information is received regardless of how efficiently the transmitted message is encoded.
    Also interesting. What you are describing is the difference between "data" and "information", as my English teacher might have taught us in school in the 1960s. Or my stats teacher in tech school years later.

    This gets a bit philosophical. Your reliance on "meaning" to define "information" is very suspect.

    For example: if I write a message on paper, put the paper in a bottle, and throw it into the sea, then according to your definition there is no information in there, because nobody has read it and extracted "meaning". I claim there is information in my bottle even if nobody ever reads it and gets "meaning" out of it.

    But what happens if you do find my message in a bottle on the beach one day? Perhaps I wrote that message in Finnish or some other language you don't understand. For you there is no "meaning" and therefore no information. I claim this is not so.

    What we find is that "information" has a ton of definitions today, depending on who you talk to. Check the Wikipedia entry on "information".
    I think Shannon put a major hole in the Second Law. Can a discipline exist without its fundamental laws?
    No. The laws of thermodynamics still stand up.

    In a former life, studying physics, I learned how the laws of thermodynamics can be derived from statistical mechanics, that is to say, from Shannon-style "bits". It was kind of shocking for me at the time.

    Nowadays, ideas of "information" are more and more a part of physics.


  • but...

    What is thermodynamic order? Until you deal successfully with that, the Second Law is rubbish.

    IIRC... perfect order constrains the data to a single partition...no?

    Life is treated as an outlier: unexpected but not prohibited. What if life is predestined by the makeup of the elements?

    Where is the concept of thermodynamic order then?

    Information is what makes life possible... by posing information in terms of negative entropy, Shannon pretty much ended the debate in my mind. He didn't resolve it, but he put the issue in stark terms that could not be missed.

    How neat is that?

  • Local meaning (M) during a period, dt, is the local change of information, dH, resulting from a message, H(m):

    M = (dH/H(m))/dt
  • Sorry, I don't get your point. What has "life", however you define that, got to do with thermodynamics?


  • Heater, there is a popular misconception that life, by its very existence, defies the laws of thermodynamics because it gets more complex over time. This misconception occurs because people forget about the enormous energy output of the Sun, and the fact that the Earth is the exact opposite of the closed thermodynamic system that the Second Law specifically addresses.

  • Heater. Posts: 19,813
    edited October 24
    localroger,

    I know, I was fishing for exactly that argument that "life by its very existence defies the laws of thermodynamics because it gets more complex over time".

    Now that you have knocked that down for me there are more pressing problems:

    Like where does all that entropy go when it falls into a black hole?

    I mean, if my closed system happens to include a black hole then all the entropy in my system will eventually end up in that black hole. Never to be recovered.

    Seems Hawking and Susskind had a decades-long debate about this.

  • rjo__

    Did I just see you write down a mathematical definition of "meaning"?

    Be serious.

  • I have stated it a bit differently, as what I call the Time-Variant Function. The Universe is, at a particular moment, in a state; let us call it U(t). The Universe will be, at a future time, in a different state, calculated by whatever enigmatic gnomes oversee this, creating U(t+1). This future universe becomes the new universe for the next iteration of the Time-Variant Function.

    It turns out that the Universe with its massive data storage and the time-variant function combine to create the exact conditions for chaos theory to wreak its havoc.
  • Cluso99 Posts: 12,961
    edited October 24
    Perhaps you should pass this enlightenment on to Professor Hawking ;)

    The detail is in "U".
    My Prop boards: P8XBlade2, RamBlade, CpuBlade, TriBlade
    Prop OS (also see Sphinx, PropDos, PropCmd, Spinix)
    Website: www.clusos.com
    Prop Tools (Index) , Emulators (Index) , ZiCog (Z80)
  • Your U(t) sounds like good old Newtonian mechanics to me.

    The idea of enigmatic gnomes overseeing all this does not sit well with me. If there are such gnomes then I can include them in a new definition of U. Call it U'. Then I can argue that the state of this new bigger universe is a similar function of time, U'(t).

    But by that reasoning I need "meta gnomes" calculating what my gnomes and U combined (U') do.

    But then I can include those meta gnomes in U' and have a new bigger universe, call it U''. The state of that bigger universe is the function of time U''(t).

    But then I need meta-meta gnomes to calculate that.

    You see where this is getting us... nowhere.

    Anyway, what is this t of which you speak? Time is what we measure with clocks. Clocks are part of the universe. That is to say, U creates t all by itself; t is a function of U.

    So now we have U(t) = U(t(U)), and we can keep expanding: U(t(U(t(U(t(U(t...)))))))

    As you say, the exact conditions for chaos!

    :)

  • microcontrolleruser Posts: 546
    edited October 24
    Okay Parallax.

    There's a Walmart gift card in this for you if you finish this Propeller 2.

    You can go get some 'cold beverages'.

    Then go out to your Sacramento river pontoon boat and celebrate.

    As long as Parallax has been up there they probably have a couple pontoon boats.

    EDIT: Okay, if you're the Northern California healthy types, we can make it an REI gift card.

    Go get some kayaking, rock-climbing, or cross-country skiing stuff.
  • KeithE wrote: »
    Let’s say you have a codebase written by multiple people. Some variable in memory is getting trashed. If you have a debugger supporting flexible hardware watchpoints you wouldn’t use it to help track down the offender?
    Of course I do; I do it every day. Everyone I know does it. And I have no idea what the people here are talking about. Maybe they are just too simple-minded to use a debugger.

    οἶδα οὐκ εἰδώς ("I know that I do not know")
  • Mike Green Posts: 22,577
    edited October 24
    "too simple minded to use a debugger"

    Far from it. Every such tool has a cost for implementation. If you're going to have a symbolic debugger, where are you going to store the symbol table? How about the table that stores the correspondences between code address and source line? If you're dealing with a PC or something of that scale, you're going to have a hard disk (or equivalent) with a file system and gigabytes (or even terabytes) of storage, so a symbolic debugger makes sense. If you're dealing with a microcontroller like the Propeller 1, you're going to have 32K, maybe 64K if you're lucky ... without a file system. The Propeller 2 will typically have a megabyte of flash, maybe more, so it makes more sense there, but is not a certainty.

    As others have mentioned before, debugging real-time code or I/O drivers is different from what you may be used to. A debugger changes timing and that may be where the problem is. If I'm explicitly adding debug code, I can choose where it goes so it shouldn't (no guarantees) affect the timing I'm concerned about. That's not what a generalized debugger does.
  • The majority of microcontroller users use development systems on external hosts; target constraints don't apply to the development system in that case. Some people are nostalgic for self-hosting, but the Propeller is too small to do a complete job. For example, how does it do source control? How do I get to an issue tracker?

    Some debuggers have real-time instruction or data trace, which doesn't have to impact the target timing. The same goes for hardware watchpoints, until they trigger. No?
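    When no hardware watchpoint is available, the idea can be approximated in software. A minimal C sketch (all names hypothetical, not from any real debugger API): snapshot the watched bytes at a known-good point, then test for changes at places you choose, so the cost lands only where you put the checks:

```c
#include <string.h>

/* Software stand-in for a data watchpoint: remembers up to 64 bytes. */
typedef struct {
    const void   *addr;
    size_t        len;
    unsigned char snap[64];
} watch_t;

/* Arm the watch: snapshot the current contents of addr..addr+len. */
void watch_arm(watch_t *w, const void *addr, size_t len)
{
    w->addr = addr;
    w->len  = (len <= sizeof w->snap) ? len : sizeof w->snap;
    memcpy(w->snap, addr, w->len);
}

/* Returns nonzero if the watched bytes changed since watch_arm(). */
int watch_tripped(const watch_t *w)
{
    return memcmp(w->snap, w->addr, w->len) != 0;
}
```

    Unlike a real hardware watchpoint this only detects the trashing after the fact, at the next check, but it needs no debugger support at all.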
  • I am "too simple minded to use a debugger". I just use printfs to do my debugging. Of course, that doesn't work if I only have the binary to work with; in that case I find somebody who knows how to use a debugger. :)

    I don't think the microcontroller needs much debugging support on it; most of the work is done on the host. The micro just needs to support a debug link to the host and respond to commands to set breakpoints and peek/poke memory. This was implemented in PropGCC to support the GDB debugger in LMM mode. However, I'm not sure whether this capability was fully implemented, or whether it even works at this point.
  • I'm familiar with source control systems and hardware watchpoints. The Propeller 1 was not designed for tracing or watchpoints and, from the descriptions of the design and implementation process, would probably never have seen the light of day in anything like the form it did if those had been requirements. The amount of complexity introduced by the multiple cores and the hub, along with the limited ROM available for built-in functionality ...

    The Propeller 2 is another design and implementation process. It does have some support for hardware-assisted debugging, but it will rely on the same techniques used with the Propeller 1, where one or more cogs act as an interface to the "outside world" for the debug functions. This is in keeping with the Propeller's use of software-implemented I/O.