Phil, the P2 will be finished at some point, but it may take an extra round of silicon to fix the bugs that may get introduced with the new features. In the meantime, we'll be able to play around with the P2 from the first round of silicon. This will be obsoleted by the second round of silicon, which may end up being obsoleted by the third round of silicon. Eventually, there will be a stable version of the P2.
How about if only the bugs are fixed in rev B and all of these new features wait until P3? I'm beginning to wonder if it's even worth testing the rev A chips since so much is being changed in rev B.
I think the next version will be okay and we won't need to make another.
Ha! I'll believe it when I see it. At this point, I have zero confidence that that will ever happen. It's been 13 years! And we still don't have silicon that's finished and ready for production.
Frankly, I don't care that much about the P2, per se. It's overly complex and will only appeal to a very narrow niche market. But, more importantly, its development has been a total drag on Parallax and its ability to sell and support the really good stuff in their product line, including the P1. It's time for Chip to either put up or shut up. No more tweaks!
I understand Phil's point. At first I thought this was going to be an ECO sort of spin to produce an A1: minimal edits to save costs, reduce risk, and pull in the schedule - for example, if wafers with base layers were already available and waiting for metal only. But it sounds like it's an all-layer B0 spin at this point.
There is a lot that needs testing and that does not change from rev A to rev B.
There is still analog characterization to do, smart pin modes have only been superficially tested, and the connection to various memories has hardly been touched...
To have a useful regression test suite, I think it is important that rev A code can rebuild and run on rev B.
As soon as people started talking about HDMI, it's like the floodgates opened for tons of changes.
Most disappointing are various instruction set related things that should have been worked out in the FPGA days..... There was plenty of time to mull all that stuff over, but now it seems like everyone wants their one or two tweaks for whatever corner cases they have in mind. Counters that won't roll over in 2000 years. How hard would it really be to accumulate the 32-bit counter in software in a cog register once every few seconds? When there are 8 cogs running at 180 MHz+?
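For what it's worth, the software accumulation being suggested here really is a handful of operations. A minimal sketch in Python (everything below is illustrative, not P2 API; on the chip this would be a few PASM instructions in a cog), showing how a wrapping 32-bit reading is extended to 64 bits so long as it is sampled more often than the wrap period - 2^32 cycles at an assumed 300 MHz is roughly 14 seconds:

```python
# Sketch: extend a wrapping 32-bit hardware counter to 64 bits in software.
# The only requirement is that extend() is called at least once per wrap
# period. Names here are illustrative, not a real P2 interface.

WRAP_BITS = 32

class Counter64:
    def __init__(self):
        self.high = 0   # software-accumulated upper 32 bits
        self.last = 0   # previous raw 32-bit reading

    def extend(self, raw32):
        """Fold a raw 32-bit counter reading into a running 64-bit count."""
        if raw32 < self.last:       # reading went backwards -> counter wrapped
            self.high += 1
        self.last = raw32
        return (self.high << WRAP_BITS) | raw32
```

Called from any loop that runs every few seconds, this costs a compare, an occasional increment, and a store per sample.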
I asked about some kind of a test coverage matrix or summary to coordinate what has been tested so that people can coordinate coverage, but 0 responses to that thread.
I was planning to get an eval board and test, but now it seems like there may be so many differences from the final chips that any testing is of limited usefulness. All the compiler makers will have to maintain two codepaths to handle the instruction changes...
There's usually not much involved in tweaking to bring stuff up to date. Heck, ozprop just brought his VGA logic analyzer from 2016 back from extinction.
I would think that the kind of instructions compilers need would be even less 'fluxical'.
Again, what examples exist from other domains/micros/methodologies? The stuff we're testing on analog pins at the moment -- where would this fit into a modern testing framework?
Given over 90 percent is unchanged, and incompletely tested?
Yes.
And the reason we've got tweaks in the Rev B is due to people actually using Rev A more than they did the FPGA. It all can be done at speed.
Frankly, the fact that we were surprised at being able to run at 250 MHz warranted the addition of HDMI. No brainer. And that's in the realm of things that worked great. No reason to believe it won't.
Pushing Rev A has shaken out a couple performance tweaks.
The edge cases are a bit more dubious, but so far, not expensive.
Lastly, the idea of a robust test matrix makes sense. But that also implies the scope is known and well defined. That simply is not true.
For better or worse, the design approach has been to propose, attempt to use, refine, and then debug.
Most of the chip has run that cycle. That's worth having a test suite for, and a lot of code has been written. The few shiny features that have obviously not actually run through that cycle are being run through it now.
Phil,
I think you are being a bit harsh. (getting kind of usual for you around here, sad to say)
Also, I think the P2 will be used in a lot of things that the P1 couldn't do very well alone.
Most everyone has gone through bouts of frustration.
Yeah, I agree, but what about you Chip? Right in the thick of it for all these years and still patiently putting up with us, such a cranky lot! I don't think I've even heard a harsh word from you ever, despite the circumstances.
Phil - you've kept the faith all these years and now seeing is believing. We have silicon that works, and not just to spec, but far beyond! Yeah, it's a lot more than the next iteration of P1, more like a P4 or P5 but it has all the beauty of a Propeller. This is the time to rejoice with Chip and "the family" over his newborn and if Chip has some extra plans, so be it, he is entitled, and we get to join in too.
Chip - is there a plan in place to package up more chips if we want to use this version in early production "as is" and we pony up the money for it?
Peter, I suppose we could approach ON Semi about packaging the other five wafers, but we'd have to have a real need, like someone wanting to buy them. We'll see how it goes.
I certainly would not throw them away, keep them as insurance for possible Rev B hiccups...
I haven't been frustrated as much as I've wondered over these years if all this effort will still be relevant. Now, I think it's going to be timely, actually. The micro/computer world has run off in one direction for so long that it's become alienated from its roots. I think the P2 will be something unique that goes into the void between micros and FPGAs, but also overlaps them nicely. And it will be FUN to design with - a total playground for invention. That's where it's really going to kick. It seems to me like everything has evolved to become very un-fun over the years and something new is needed, but the prevailing paradigm has caused snow-blindness. If we can get P2, sound money, and good government, we'll be set.
Yes, this P2 is proving to be not only a joy to program (yeah, I know there are lots of unusual features), but extremely powerful. Way better than we had hoped for.
Yes, there are a few tweaks being done. They are because we are using the P2 now, and we are discovering a few things that would be nice if tweaked. Hopefully they all work as designed and there are no gotchas.
We do have working silicon. There is time for pausing to verify the silicon works as expected, and for some tweaks too, before the silicon respin is done.
Just last night I got VGA working. Then I tweaked it to display 1920x1080 (FHD) at 180 MHz. There was not enough hub RAM for a full-blown screen buffer, so I restricted the visible section to fit. There are a few timing problems, as it doesn't render fast enough for the streamer to take the data. But hey, that will be more fun to tweak!
When I finally hit the sack, I couldn't sleep, thinking about what I can do with all this newfound power.
It seems to me like everything has evolved to become very un-fun over the years and something new is needed, but the prevailing paradigm has caused snow-blindness.
But is that viewpoint because you haven't had the time to buy a few things and play? Have you bought an Arduino and a few shields, installed the IDE, and made something? Millions have and Arduino has flourished. What about a Pi? It was launched as an educational tool but has seen sales way beyond expectations. And then there are chips like the ESP8266; never has it been so easy to get a chip online.
And now we have Arduino-compatible boards with an FPGA on them, with the FPGA software rolled right into the IDE, often with a traditional microcontroller sitting alongside it. Put your intensive IO inside the FPGA and let the uC deal with the overall logical decisions.
In the world of 32-bits you can get started for around $10 for an STM32 board and programmer, with free software.
And, contrary to every prediction, the 8-bit market keeps on innovating with new chips launched on a weekly basis.
Move away from the mainstream manufacturers into the area occupied by the outliers - the area the P2 will have to compete in - and there is RISC-V, which a number of manufacturers are working on, with chips already being released. There's even talk of phone manufacturers picking up RISC-V, which will mean cheaper chips for all.
In the decade since work on the P2 started a lot has changed and I'm not sure the P2 has kept up.
What many students do not learn easily is how a CPU actually works. RISC-V will be impossible for them to comprehend. Without that knowledge the best they'll ever do is drop it into a FPGA as a block.
I feel the CPU functional basics should be taught much earlier than university level. Things like Blockly are a good start for giving a taste of logic and control, and an excellent substitute for BASIC. But there is a chasm between using Blockly/BASIC and understanding a CPU.
The Prop has complete hardware access, clearly documented, and integral assembly coding. This is a big part of making this learning easier. No virtualisation nor NDA'd hardware nor libraries/APIs nor fancy languages/toolkits to obfuscate the exercise.
The only other part needed is the clear literature to step through the basics that describe how each piece of machine code is acquired and acted on. It's just complex enough to keep the uninitiated out.
In the decade since work on the P2 started a lot has changed and I'm not sure the P2 has kept up.
I think a serious limit of the P2 architecture is the 20-bit address space: with the addition of HDMI and all the other video goodies, you'd expect the address space to fit at least a single full-resolution frame buffer. Except that it barely does (640x480x3 bytes). When P2 development began, even 1 MB of RAM in a microcontroller would have probably seemed ridiculous, but here we are.
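The arithmetic behind "barely does" is easy to check. A quick sketch (sizes only, nothing P2-specific):

```python
# Frame-buffer size vs. available RAM - back-of-envelope arithmetic only.
def framebuffer_bytes(width, height, bytes_per_pixel):
    return width * height * bytes_per_pixel

ONE_MB  = 1 << 20        # 1,048,576 bytes: a full 20-bit byte address space
HUB_RAM = 512 * 1024     # P2 hub RAM: 512 KB

vga_24bpp = framebuffer_bytes(640, 480, 3)    # 921,600 bytes: fits in 1 MB
fhd_8bpp  = framebuffer_bytes(1920, 1080, 1)  # 2,073,600 bytes: doesn't fit at all
```

So a 24-bit VGA buffer squeezes into the 1 MB address space with about 12% to spare, while even an 8-bit-per-pixel 1080p buffer is roughly four times the hub RAM.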
I think P2 still has a lot of potential, even after all these years...
The ability to do 1080p is going to be key. That's the new standard.
We should be able to interface directly with digital camera modules.
All kinds of I/O pins free. Real ADC and DACs...
I almost forgot USB full speed. That's a critical element for success...
For a microcontroller, that's pretty good...
1 MB RAM would have been better, but 512 kB is still pretty good.
I'm looking forward to seeing what can be done when combined with HyperFlash and HyperRam...
... An additional 16 or even 8 bits would have been more than adequate for most things...
Not really: if you want a useful time-since-reset, you do not want it to wrap within any sensible time. Another 8 bits gives an up-time that wraps in about an hour!
Even 16 bits only nudges you out to about 10 days. Both would need additional software and some time-manager cog allocated.
As my numbers indicated, you can decrease from 64b, but not by very much (~ 60 bits).
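The wrap times being traded off here follow from a single division. A sketch assuming a 300 MHz system clock (the clock is an assumption, so treat the figures as order-of-magnitude):

```python
# Wrap period of an n-bit free-running counter at clock frequency f_hz.
def wrap_seconds(bits, f_hz):
    return (1 << bits) / f_hz

F = 300_000_000                     # assumed system clock: 300 MHz

t32 = wrap_seconds(32, F)           # ~14 seconds
t40 = wrap_seconds(40, F)           # ~1 hour (the "+8 bits" case)
t48 = wrap_seconds(48, F)           # ~11 days (the "+16 bits" case)
t64 = wrap_seconds(64, F)           # ~1900 years - effectively never wraps
```

Which is the point being argued on both sides: 40 or 48 bits still wrap within an operational lifetime and so still need software help, while 64 bits never do.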
Which is why I closed with "I suspect either one would not be all that much simpler than the 32 bit version. Perhaps "better to have it and not need it..." applies in this case.". Once you have to go to a second register you might as well take full advantage of what having all 32 bits available provides.
My comments on the upper bits are really just to highlight that they will always read zero, but that is not to say that you wouldn't implement the full 32 bits, because it is easier to do so than to do less. What does the internal bus read with floating bits? (Rhetorical.)
Some say it is easy enough to add an interrupt routine to handle the overflow, but the thing is that you HAVE TO do that when it is so simple in hardware to add 32 bits. Besides, the interrupt may mess up some prickly code that just wants to read CT high without any fuss.
And it will be FUN to design with - a total playground for invention.
So you are saying that necessity may be the mother of invention but fun is its playground?
Unless we show others how much fun it is, though, they are never going to know. But I'm afraid the current generation is not the same as the one that saw in the microcomputer age. My very bright young friend Jack seems more interested in JavaScript or C# or some SQL running on a Pi than in actually getting into the guts of it. There just isn't the necessity for them to do that when the libraries and tools are already there. So there is no "invention", and they miss out on the real fun and satisfaction of doing something far beyond what was envisioned or thought possible, and of learning, in the process, what they can't be taught. This generation's electronic skills are best exemplified (in a sad way) by the Arduino Fritzing mags that I see in the newsagent now. I tell you, we are a dying breed.
But the P2 is a new breed, and Parallax is in a position to educate a new generation with it. Sadly, though, the emphasis is also on what is easy and popular, which removes any necessity to explore and try something different. Plug-in breadboards and plug-in code, just like everyone else.
I was at the Propeller Expo when Prop 2 chips were shipping. A fatal flaw was found. I thought: fix the flaw and we will have a Chipmas. How many years ago was that? How many Arduinos and Pis have been sold since then? How many people have moved to those platforms? Can we Propeller 2 fans have the discipline to not ask for new features until a production chip is listed in the Parallax store?
I have to think that, amid the noise, there are people emerging who want to get into the guts of things. The "Maker" radiation swamps out many signals, but it also must stimulate people to want to learn and do more.
To me, loading up a multi-GB IDE so that I can play with modern stuff that I'm not welcome to understand seems utterly slavish. Totally NOT interested in that kind of thing.
To me, loading up a multi-GB IDE so that I can play with modern stuff...
I've just looked. The current Arduino IDE, including the C compiler, is a 103 MB download.
That may not seem like much when compared to a lot of software these days, but compare it to the initial assemblers, debuggers, monitors, compilers, and OS for the 8 bitters and it's huge. For many things, particularly learning something new, small and simple is much better.
I tried to use the Arduino to talk to a cellular modem and a serial port, just asking it to relay between a software serial and a hardware UART. I couldn't get the thing to do that; it wouldn't keep up with both demands.
That would have been trivial on a Propeller, and less exasperating too.
Multicore without a complex or big OS. Concurrency and parallelism are simple and robust. That, and being able to assemble code bodies easily will prove attractive.
I just got done exploring a bunch of functional programming material in an attempt to better understand our friend Red and his project. Interesting stuff.
They go to great lengths to avoid state and complexity.
A big part of that is avoiding state. Another big part of that is compartmentalizing code. Being able to change or swap functions and know the program as a whole will still work.
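That swap-a-function-and-nothing-breaks property is easy to show in miniature. A toy Python sketch (the stage names are made up):

```python
# Pure stages share no state, so any one can be replaced without the
# others knowing or caring.
def drop_missing(samples):
    return [s for s in samples if s is not None]

def double(samples):
    return [2 * s for s in samples]

def halve(samples):
    return [s // 2 for s in samples]

def pipeline(samples, stages):
    # Feed each stage's output to the next; no hidden state anywhere.
    for stage in stages:
        samples = stage(samples)
    return samples

# Swapping `double` for `halve` cannot break `drop_missing`.
assert pipeline([4, None, 6], [drop_missing, double]) == [8, 12]
assert pipeline([4, None, 6], [drop_missing, halve]) == [2, 3]
```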
P2 has those things, built right in.
It is relevant today. In many ways. It just needs to get done.