GCC / Eclipse and Propeller 2 - seeking developers - Page 15 — Parallax Forums

GCC / Eclipse and Propeller 2 - seeking developers


Comments

  • Ken GraceyKen Gracey Posts: 7,401
    edited 2011-05-31 14:52
    RossH wrote: »
    Hmmm. Not sure about the "crazy bit", but I do question the point of having a debate like this if Parallax are not actually going to participate.

    Are we intended to uncover their actual intentions and requirements purely by chance? This may be fun for us enthusiasts - but it seems an unusual way to conduct a project specifically intended to lift their professional profile.

    Ross.

    We're here and reading, discussing this internally for the moment with plans to participate in furthering the discussion. Please give us a bit of time to formalize our approach to move this to the next step. It's the top priority for me to sort out right now, with the help of several of you and our internal team (Chip, Jeff, Kwabena).

    Thanks,

    Ken Gracey
  • RossHRossH Posts: 5,519
    edited 2011-05-31 16:12
    Thanks, Ken. Glad to know someone is at least reading our ramblings!

    Let the crazy debate continue!

    Ross.
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 16:57
    potatohead wrote: »
    Until then, anyone want to debate the merits of gcc vs llvm? :)

    As long as performance is the same, the path of least resistance regarding GNU/GCC would be fine. I only want a respected industry standard tool chain that helps Parallax meet and exceed its revenue goals.
  • RossHRossH Posts: 5,519
    edited 2011-05-31 17:42
    jazzed wrote: »
    As long as performance is the same, the path of least resistance regarding GNU/GCC would be fine. I only want a respected industry standard tool chain that helps Parallax meet and exceed its revenue goals.

    "Industry standard tool chain"? I'm afraid there ain't no such animal - there are only standards for the languages themselves. The tools that implement them vary widely between the various compiler vendors.

    IAR uses a different set of commands to ICC, which are different to PCC, which are different to GCC, which are different to CLANG, which are different to ... etc etc.

    If a tool chain implements the language correctly, does it matter whether the command you need to use to invoke the compiler is "gcc" or "iccarm"? Most developers rarely use the individual commands of any of the tool chains they use - they would generally either use an IDE or a makefile.

    Let's not get ourselves confused between requirements that can easily be addressed by an appropriate choice of IDE with those that should be addressed by an appropriate choice of tool (or for that matter with those that should be addressed by an appropriate choice of language!).

    Ross.
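
    RossH's point that a makefile (or an IDE) hides the vendor-specific command can be sketched in a few lines. This is only an illustration: the compiler name, flags, and file names are placeholders, not a real Propeller build.

    ```make
    # Illustrative only: compiler name and flags are placeholders.
    # Switching tool chains means changing one variable, not the whole build.
    CC     = gcc            # could equally be iccarm, clang, etc.
    CFLAGS = -Os -Wall

    blink.elf: blink.c
	$(CC) $(CFLAGS) -o $@ blink.c
    ```

    The rest of the build system, and the developer's workflow, never mentions which compiler is underneath.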
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 17:45
    RossH wrote: »
    "Industry standard tool chain"?
    One that the industry actually uses.
  • RossHRossH Posts: 5,519
    edited 2011-05-31 17:58
    jazzed wrote: »
    One that the industry actually uses.

    Nice circular argument, jazzed :smile:

    Ross.
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 18:06
    RossH wrote: »
    Nice circular argument, jazzed :smile:

    Ross.

    You pointed out a few that industry players actually use. Let's pick one of them.
  • RossHRossH Posts: 5,519
    edited 2011-05-31 18:21
    jazzed wrote: »
    You pointed out a few that industry players actually use. Let's pick one of them.

    Sure - but I hope Parallax's customers don't apply the same logic as you ...
    Marketing Guy: "Hey, techie - we need to choose a microcontroller for our new product. What should we use?"
    Technical Guy: "Well, there's this really cool new processor called the Propeller 2 ..."
    Marketing Guy: "Is it industry standard?"
    Technical Guy: "No - but it's got eight cores and it's really easy to program and ..."
    Marketing Guy: "Forget it! We need this product out before Christmas!"
    Technical Guy: "But .. but ... but ... it uses GCC and everything ..."
    Marketing Guy: "I said forget it! We'll just use an ARM like all our competitors do!"
    Ross.
  • potatoheadpotatohead Posts: 10,261
    edited 2011-05-31 18:24
    I know I wasn't going to contribute for a while. But... (kinda hooked on this one)

    I have to ask:

    What if it works LIKE some "industry standard" tool? What is that worth, compared to actually being the tool?

    --> Seeing what Ross just wrote, which beat me to it, I must also ask:

    Say it is the tool. Do we make it operate in the same messy way, or do we get the work done so the work flow / user experience on our stuff "with that tool" is seriously improved over the status quo?


    Heh... Technical Guy: It HAS THIS GREAT DEVELOPMENT TOOL!

    -or- GET IT DONE IN HALF THE TIME?
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 18:29
    potatohead wrote: »
    Say it is the tool. Do we make it operate in the same messy way, or do we get the work done so the work flow / user experience on our stuff "with that tool" is seriously improved over the status quo?
    It should work at minimum the way people expect it to work. If you want to add special sauce, that's fine, but a company should not have to call Parallax and ask them why they can't define their own memory spaces.

    The GNU/GCC tool chain provides everything necessary for a software engineering organization to do its job. Provide that, and I'll say you've succeeded. Anything less is not worth the trouble for a software engineering organization.
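
    jazzed's example of "defining your own memory spaces" is exactly what a GNU linker script's MEMORY command provides. A minimal sketch; the region names, origins, and lengths here are invented, not a real Propeller memory map:

    ```ld
    /* Illustrative fragment only: regions and addresses are made up. */
    MEMORY
    {
      cog (rwx) : ORIGIN = 0x00000000, LENGTH = 2K
      hub (rwx) : ORIGIN = 0x00000000, LENGTH = 32K
    }

    SECTIONS
    {
      .text : { *(.text*) } > hub
      .data : { *(.data*) } > hub
    }
    ```

    With a GNU-style linker, a customer edits a script like this themselves rather than phoning the vendor.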
  • potatoheadpotatohead Posts: 10,261
    edited 2011-05-31 18:33
    There is a difference between "expect to work" and "improved".

    A "seriously improved" work flow is highly likely to contain elements that are "not expected". If they are worth it, that's OK. Or are we saying that isn't OK, and it should be a straight-up clone of something bog-standard out there?

    And the expectation that they can hack things up also comes with the expectation that they did, in fact, hack things up. What I mean is: are we planning on objects, etc. all following some bog-standard tool chain, or is the idea to use the standard tool chain 'the Parallax way'?

    I'm asking, because I want to see where people think the value is.
  • David BetzDavid Betz Posts: 14,516
    edited 2011-05-31 18:41
    potatohead wrote: »
    There is a difference between "expect to work" and "improved".

    A "seriously improved" work flow is highly likely to contain elements that are "not expected". If they are worth it, that's OK. Or are we saying that isn't OK, and it should be a straight-up clone of something bog-standard out there?

    And the expectation that they can hack things up also comes with the expectation that they did, in fact, hack things up. What I mean is: are we planning on objects, etc. all following some bog-standard tool chain, or is the idea to use the standard tool chain 'the Parallax way'?

    I'm asking, because I want to see where people think the value is.

    How about if we are a little more clear what we are talking about here. What exactly is the "expected workflow" and what is the "seriously improved workflow"? How is the seriously improved workflow better than the expected workflow? I'd like to see specific examples of what needs to be done and how it is handled by each workflow and some explanation of why the improved workflow is better.
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 18:43
    Software Engineers know how to do things. If you limit them in some way, you'll just make them mad.

    There are Software Engineers in the hobby world of course, but there is a difference between something you really enjoy and something that must be done because it's on the critical path and must work well to protect the leadership roles of companies that have huge stakes in their software and processes.

    Everything else is just fluff.
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 18:51
    RossH wrote: »
    Sure - but I hope Parallax's customers don't apply the same logic as you ...
    Marketing Guy: "Hey, techie - we need to choose a microcontroller for our new product. What should we use?"
    Technical Guy: "Well, there's this really cool new processor called the Propeller 2 ..."
    Marketing Guy: "Is it industry standard?"
    Technical Guy: "No - but it's got eight cores and it's really easy to program and ..."
    Marketing Guy: "Forget it! We need this product out before Christmas!"
    Technical Guy: "But .. but ... but ... it uses GCC and everything ..."
    Marketing Guy: "I said forget it! We'll just use an ARM like all our competitors do!"
    Ross.

    I hate to say it, but that's probably what will happen in most cases anyway. I guess we should all go buy iPads and be silly happily ever after playing some balloon popping game.

    The cases where ARM or some other mainstream processor or MCU is not the preferred choice will require a solid story; otherwise it's a waste of time.

    Whatever Ken and Parallax decide is fine with me. I will back them 100%. I hope everyone else will too.
  • Phil Pilgrim (PhiPi)Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2011-05-31 18:57
    There is, of course, a fuzzy line between serving a customer's best interests and pandering to his preconceived expectations. (I guess this is why bait-and-switch scams continue to be successful.) I'm not sure that pandering is the best marketing strategy in the long run. Sometimes you just have to bludgeon the client with what's best for them and hope that, in their stupor, they can be led to a rational decision. This has short-term consequences, of course, but the upside is that it also winnows out the more troublesome clients. The idea is to win hearts and minds -- loyal customers -- not to drag the bottom of the sea for anything that can't escape the seine.

    The best marketing strategy is to sell the benefits: "Your time to market will be cut in half with the Prop" -- not the features: "Lookit: we have GCC and .NET, just like the big guys!" Parallax has always been successful by selling the benefits, because there's beef in that bun, by golly -- lots of it. I hope that approach never changes.

    -Phil
  • potatoheadpotatohead Posts: 10,261
    edited 2011-05-31 18:59
    @David: "seriously improved" might be, say, the number of actions required to go from code edit to the Propeller running said code, or it could be how the standard objects get set up, or board support, or any number of things.

    What I am wondering about personally, is whether or not just providing a "standard" process, that is literally a clone of some other one is worth doing, as opposed to say, doing something that really works better than that, tuned for the task at hand.

    @Jazzed: So let's say whatever gets shipped as "standard" doesn't align well with your particular method of building code, but the basics needed to do it "your" way were provided, but not shipped as standard. Would that be the same as a limit, or not?
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 19:06
    potatohead wrote: »
    @Jazzed: So let's say whatever gets shipped as "standard" doesn't align well with your particular method of building code, but the basics needed to do it "your" way were provided, but not shipped as standard. Would that be the same as a limit, or not?
    I don't understand your limit question.

    The organizations that I've worked in had certain quality requirements. Either the product fit, or it didn't. If it didn't fit, it was bypassed.

    I've already been through calling someone up to add a feature because the methodology of their tool-chain didn't work. I was on hold for essentially five months because of it. That will never happen again.

    Dealing with Spin/PASM is a pleasure because I can do practically anything with it. There are limits to the Propeller, but there are ways around them now, pretty or not. Propeller 2 will be a much better experience, I'm sure.
  • potatoheadpotatohead Posts: 10,261
    edited 2011-05-31 19:22
    Let's say the product fits, but in order to "fit", some configuration is required. One example would be making a basic subset of the total feature set the standard configuration. Advanced users would be free to set up a considerably more complex environment.

    Or, another case is the standard configuration of the tools varies considerably from the way a given organization would prefer to see them. Organizations, even people, vary very significantly in how things get done, and what makes sense.

    Same scenario. Some configuration is required to get it to the configuration that makes sense, so how big of a barrier is that?


    (and that's my last one)
  • jazzedjazzed Posts: 11,803
    edited 2011-05-31 19:32
    potatohead wrote: »
    Let's say the product fits, but in order to "fit", some configuration is required. One example would be making a basic subset of the total feature set the standard configuration. Advanced users would be free to set up a considerably more complex environment.
    The GNU/GCC and professional tool chain providers' models mentioned by Ross offer this freedom. Other models do not.
    potatohead wrote: »
    Or, another case is the standard configuration of the tools varies considerably from the way a given organization would prefer to see them. Organizations, even people, vary very significantly in how things get done, and what makes sense.

    Same scenario. Some configuration is required to get it to the configuration that makes sense, so how big of a barrier is that?
    Deviations depend on how compelling the technological advantage is. Propeller 2 may offer some technological advantage to make clumsy tool-chain problems worth ignoring for some.
  • David BetzDavid Betz Posts: 14,516
    edited 2011-05-31 19:42
    potatohead wrote: »
    What I am wondering about personally, is whether or not just providing a "standard" process, that is literally a clone of some other one is worth doing, as opposed to say, doing something that really works better than that, tuned for the task at hand.
    I'd be happy to adopt something that was better but I haven't seen any concrete examples of how an alternate toolchain will actually improve on the more traditional one. I'm willing to be shown the error of my ways though. Just provide some details about this "better way".
  • potatoheadpotatohead Posts: 10,261
    edited 2011-05-31 20:35
    @Jazzed: Thanks. Got the answer I was looking for.

    @David: Yes, exactly. Where I was headed with that was the idea that the Prop *is* different enough that it may warrant what I can characterize as "structured deviations" from "the standard" to expose superior technology, or depending on one's world view, highly differentiated technology.

    I suggest those concrete examples are not obvious yet, apart from some that potentially exist in Catalina, simply because it's at a state that would warrant that consideration. (It works today.)

    And that was my concern up-thread. It occurred to me that just seeking the standard isn't demonstrated to be an optimal path, given how Props work. I was indirectly musing about that in asking what I did.

    Then I would also submit that each "structured variation" from "the standard" needs to be associated with a key value in the chip that would warrant said variance.

    Secondly, what "the standard" is can and does vary, and have we set out to actually assess what that is? This topic stuck in my head today. In my niche, we have two products. One I would characterize as "managed". It's streamlined, and it works the way it works, but it does it very, very well. Those users who have adopted it see a very high value in that product focus. It keeps them out of trouble, and productive. We have another product that is considerably more messy. It can work multiple ways, leaving it not all that well managed. Users who have adopted that one see very high value doing it "their way", or "the right / best way".

    Both sets of users, and their common use cases overlap considerably! They do many of the exact same things. I suspect a similar dynamic is in play for the potential users of this project.
  • Kevin WoodKevin Wood Posts: 1,266
    edited 2011-05-31 20:42
    Having a debate about standard vs. improved workflows is kind of pointless when the tools aren't even there to begin with. On a very basic level, people want to type their code in C/C++, hit the compile button, and have their code compiled & loaded to their device. That's the minimum that needs to work, and just doing that will require quite a bit of work.

    As for GCC & Eclipse, I think Parallax is looking at them for the same reason everyone else does - so they don't need to start from scratch just to re-invent the wheel. I think that they would be open to any reasonable alternative that provided the same benefits. It seems that right now, there's LLVM & GCC as potential compilers, and Eclipse & Netbeans as potential IDEs.
  • RossHRossH Posts: 5,519
    edited 2011-06-01 02:19
    David Betz wrote: »
    I'd be happy to adopt something that was better but I haven't seen any concrete examples of how an alternate toolchain will actually improve on the more traditional one. I'm willing to be shown the error of my ways though. Just provide some details about this "better way".

    Hi David,

    Do you mean better than GCC? Well, there are several professional toolsets (like IAR) that are better for various reasons - more on those in a moment. But first, what about the Parallax Propeller Tool? This is not a traditional tool (no linker, no separate assembler - in fact, no command line at all!) yet it was probably as big a factor in the success of the Prop I as the chip itself - precisely because it allowed people to start programming a very complex chip without first having to memorize hundreds of obscure command-line options which tools such as GCC require (don't believe me? - try gcc --help and gcc --target-help!). It also meant they didn't have to worry about funny object formats or the obscure utilities required to deal with them. And why should they? - in an embedded environment, typically none of that stuff ends up on the chip itself anyway - these things are purely and simply artefacts of the toolset you use to generate what ends up on the chip.

    (As an aside, I think it's a shame that Parallax may lose the simplicity that was one of the original joys of programming the Propeller - but that's not my main point here).

    Going back to the professional toolsets - the difference between those tools (for which you pay $$$) and 'free' tools like GCC is that professional tools are expected to pay for themselves by demonstrating productivity benefits. If they can't do that, then they don't deserve to be in business (and won't be for very long!).

    What's the common factor between these two types of tools? Productivity! In the argument about the merits of various toolsets, let's not forget that this is the real requirement - as jazzed said earlier in this thread, everything else is just fluff.

    Ross.
  • Heater.Heater. Posts: 21,230
    edited 2011-06-01 03:11
    RossH,
    I think it's a shame that Parallax may lose the simplicity that was one of the original joys of programming the Propeller.

    I sincerely hope that does not happen. You are right, the Propeller in its current market space is all about simplicity of use. From the chip architecture, the instruction set, the PASM and Spin languages, all the way up to the Propeller Tool and on through the OBEX.

    No matter what happens with the "professionals" and C users I hope that the Prop II continues in that tradition.
  • RossHRossH Posts: 5,519
    edited 2011-06-01 03:36
    Heater. wrote: »
    No matter what happens with the "professionals" and C users I hope that the Prop II continues in that tradition.

    But what's wrong with trying to make tools that are easy for both professionals and hobbyists to use?

    Where is it written that professional tools have to be complex and obscure?

    Surely that new-fangled interweb thingy has put an end to the days when there was job security to be gained from being the only one on the team that could debug makefiles or write linker scripts?

    Ross.
  • Dr_AculaDr_Acula Posts: 5,484
    edited 2011-06-01 03:44
    I agree with Kevin
    Having a debate about standard vs. improved workflows is kind of pointless when the tools aren't even there to begin with. On a very basic level, people want to type their code in C/C++, hit the compile button, and have their code compiled & loaded to their device. That's the minimum that needs to work, and just doing that will require quite a bit of work.

    None of this exists for the prop II, but it all exists for the prop I. Maybe we can port it over?

    See attached. The first screenshot is Basic. Down the left are all the "new" options ranging from miniature to XMM memory models. Near the bottom left are some "one click buttons" like compile/download to ram, or to eeprom.

    Along the tabs are four supported languages - Spin, Pasm, C and Basic.

    This is an open source IDE and there is nothing copyrighted about it - all the ideas/layout etc. can be copied by anyone. Maybe it can be ported over to a program with the 'look and feel' of the proptool?

    Behind the scenes the program shells out to various command line programs, eg homespun for spin and pasm, catalina for C and BCX Basic for Basic. Any other languages can easily be added if they exist as a command line program.

    But there are also complicated things behind the scenes as well, eg a huge amount of code to massage the output of BCX from C99 into C89. That could be in a command line program as well though - at the end of the day that part is just a program that inputs text and outputs text.

    I don't know what language the proptool was written in, but I can't see how it would be particularly difficult to add various language tabs, and each of those tabs shells out to a specific command line compiler.

    Like Kevin says, I think it needs to be simple. In screenshot 1, to get a led flashing, you need to click two buttons. One is the 'new miniature' button, and one is the "=> ram" button. Keep things really simple!
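
    The shell-out design Dr_Acula describes can be sketched roughly as follows. The compiler names (homespun, catalina, bcx) come from his post; the extension mapping, function names, and the idea of passing only the source file are hypothetical, purely to show the dispatch pattern.

    ```python
    import subprocess

    # Hypothetical sketch of the "shell out per language tab" idea:
    # the IDE maps each source-file extension to a command-line compiler.
    # Compiler names are from Dr_Acula's post; the mapping itself is invented.
    COMPILERS = {
        ".spin": ["homespun"],
        ".c":    ["catalina"],
        ".bas":  ["bcx"],
    }

    def build_command(source_file):
        """Return the argv list the IDE would run for this source file."""
        ext = source_file[source_file.rfind("."):]
        try:
            return COMPILERS[ext] + [source_file]
        except KeyError:
            raise ValueError(f"no compiler registered for {ext!r}")

    def compile_source(source_file):
        # The IDE's "compile" button boils down to one subprocess call.
        return subprocess.run(build_command(source_file), capture_output=True)
    ```

    Adding another language tab is then just another dictionary entry, which matches Dr_Acula's observation that any command-line compiler can be slotted in.
    
    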
    (Two screenshots attached, 1024 x 768.)
  • David BetzDavid Betz Posts: 14,516
    edited 2011-06-01 03:53
    RossH wrote: »
    But first, what about the Parallax Propeller Tool? This is not a traditional tool (no linker, no separate assembler - in fact, no command line at all!) yet it was probably as big a factor in the success of the Prop I as the chip itself - precisely because it allowed people to start programming a very complex chip without first having to memorize hundreds of obscure command-line options which tools such as GCC require (don't believe me? - try gcc --help and gcc --target-help!). It also meant they didn't have to worry about funny object formats or the obscure utilities required to deal with them. And why should they? - in an embedded environment, typically none of that stuff ends up on the chip itself anyway - these things are purely and simply artefacts of the toolset you use to generate what ends up on the chip.
    I don't think that providing an easy-to-use GUI tool should necessarily mean forcing everyone to use that tool even for large commercial projects. There is no reason why the Propeller Tool can't continue to exist as it is now but be implemented as a shell around a more traditional (and much more powerful) toolchain. This is what the Arduino does. Their GUI is not a development tool unto itself with simpler language syntax and file formats. It is a shell around far more powerful tools that allow those tools to be used more easily by beginners. You can have it both ways but you need the foundation that a good set of command line tools provides.

    You mention "funny object formats". I don't see anything funny about either ELF or COFF. They are very expressive formats that provide the information needed to link and debug large programs. Other than being binary formats, I see nothing "funny" about them. What I do find funny is the Parallax object format, which has no official definition and whose structure must be reverse engineered by dumping the files generated by the Propeller Tool.
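
    David's point that ELF is well specified can be illustrated: the identification bytes at the start of every ELF file are publicly documented, so decoding them requires no reverse engineering. A minimal sketch, covering only the magic, class, and endianness fields:

    ```python
    # Minimal sketch: the ELF e_ident fields are publicly specified,
    # so any tool can inspect an object file's header directly.
    ELF_MAGIC = b"\x7fELF"

    def describe_elf_ident(header: bytes) -> str:
        """Decode class and endianness from the first bytes of an ELF file."""
        if header[:4] != ELF_MAGIC:
            raise ValueError("not an ELF file")
        elf_class = {1: "32-bit", 2: "64-bit"}[header[4]]            # EI_CLASS
        elf_data = {1: "little-endian", 2: "big-endian"}[header[5]]  # EI_DATA
        return f"{elf_class} {elf_data} ELF"
    ```

    Contrast this with a format whose layout must be guessed at from hex dumps, which is exactly the complaint about the Propeller Tool's output.
    
    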

    People have also mentioned "fitting the unique Propeller work flow", but I think that is just a way to excuse the limited nature of the original Propeller tools, which were not written to support commercial developers but to be easy for beginners. The fact that the Propeller has no separate assembler and linker has nothing to do with its unique architecture; it is merely due to choices made during the development of the initial tool set. Let's attempt to address the deficiencies of the original tools rather than claim them as some sort of new way of doing things that is superior to what has worked well for decades.
  • David BetzDavid Betz Posts: 14,516
    edited 2011-06-01 03:58
    RossH wrote: »
    But what's wrong with trying to make tools that are easy for both professionals and hobbyists to use?

    Where is it written that professional tools have to be complex and obscure?

    Surely that new-fangled interweb thingy has put an end to the days when there was job security to be gained from being the only one on the team that could debug makefiles or write linker scripts?

    Ross.

    There are a number of alternatives to Make that are easier to use. If we want something simpler we don't have to build it ourselves. I think one thing we're ignoring here is that building powerful tools is a big job. Yes it would be nice to have something less obscure than GCC but how long will it take to design and implement it? My guess is that Parallax has limited resources for this project and the best way to leverage those resources is to build on something that already has the capability they need.
  • RossHRossH Posts: 5,519
    edited 2011-06-01 04:27
    David Betz wrote: »
    There are a number of alternatives to Make that are easier to use. If we want something simpler we don't have to build it ourselves. I think one thing we're ignoring here is that building powerful tools is a big job. Yes it would be nice to have something less obscure than GCC but how long will it take to design and implement it? My guess is that Parallax has limited resources for this project and the best way to leverage those resources is to build on something that already has the capability they need.

    Well, of course it depends on what Parallax actually wants - which is not yet clear despite 20+ pages of discussion.

    But if Parallax just wants a C compiler with a snazzy IDE (which includes context-sensitive editing, project files and a built-in 'make' equivalent) - and with graphical source-level line debugger support - then we can have it in a very short timeframe - i.e. Catalina + BlackCat + Code::Blocks. Cost to Parallax? Minimal!

    Or (as Dr_A has ably just demonstrated) if you want C + SPIN + PASM + BASIC with a snazzy IDE then you can have that too - probably in a longer timeframe, but one that would make a great community project and which could draw on the skills of many existing forum members. Cost to Parallax? Minimal!

    I agree that if you want C++ you are probably going to have to go the GCC or Clang route - but the timeframe will be even longer - and the cost to Parallax? Unknown - but likely to be quite substantial!

    Don't get me wrong - I have no problem with Parallax selecting the GCC/Clang route (especially as the other options will probably be done anyway). But I'm a long way from being convinced that this is the most sensible option - at least not unless Parallax's primary goal is to compete with the Arduino. But you know what? I think that contest is one that Parallax is very likely to lose!

    Ross.
  • LeonLeon Posts: 7,620
    edited 2011-06-01 04:50
    According to Ken's first post, developers who expressed an interest were to be invited to a meeting with Chip a week later, to discuss the various options, and a small team would be put together. Has that happened?