That is the point of my statement. C was written for the needs of its time, but that doesn't mean it was built with forward thinking, as C and its relatives have too closed an architecture.
They are also memory hungry and are hard to implement on new CPU architectures.
As for why they are still used on new CPUs: first, as Chip said, there is so much code that people think they can simply move from one CPU to another. Second, professional programmers' near-religious belief that a language can solve all the problems they have with programming.
But no matter what language you use, if you are not a good programmer the language can't help you program your CPU/system. In other words, it can't program itself.
I'm having a hard time following your point but I think we're going to have to agree to disagree.
For more fun, I ran the simulation under ideal-case conditions (which aren't likely ever to happen, since you can't keep the junction temperatures at 0°C, but this case must be considered when doing hold-time checks). The ring oscillator ran at 1,776 MHz. That would translate to 361 MHz!
Don't be too sure
Active cooling is used by the overclocking zealots (but, of course, that dwarfs the chip price).
I share Sapieha's sentiments that C has been an often-misguiding force in the evolution of computing.
I suppose it made fine sense for large-memory, 'academically'-ideal systems of the time, but it has been shoehorned into everything else since, and has been dissuasive to the advent of computing architectures to which it would not be amenable. I suppose an ARM chip is what it is because of C. I understand that there are mountains of C code and many reasons to support C, but C's hegemony does not inspire me, personally, and I really hope that someday we will have better systems that get created because, finally, the inspiration came to throw off the old shackles that C has subtly placed on computing for the past 30 years. When this happens, I think we will all know it, because computing will suddenly be fresh again and something to get excited about.
Hard to tell how much we disagree on this topic, but I can't help but think that C was initially developed for the PDP-7 (I know I wrote PDP-11 earlier in the thread, but I was mistaken), which used 18-bit words, with 4K words of base memory, expandable to 64K words. That may have seemed like a lot in the '60s and early '70s, but by any modern standard it is not much memory at all!
If C is responsible for things like CPUs having word lengths based on 8 bits, I consider that alone to be a great legacy!
In some ways I see some of the annoying and confusing aspects of C as the result of its needing to deal with CPU design problems that no longer exist. Very similar to Sendmail in the Unix world: designed to solve a problem that no longer exists! It is important to remember that C was designed not as a high-level language but as a high-level assembler. That fact becomes clear when you write C code that doesn't use any library functions.
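To make "high-level assembler" concrete, here's a minimal sketch of legal, freestanding C: no headers, no library calls at all (the register address and names are made up for illustration, not from any real MCU):

  /* Freestanding C: no #includes, no library. LED_PORT is a
     hypothetical memory-mapped I/O register. */
  #define LED_PORT (*(volatile unsigned int *)0x40020014u)

  void blink_forever(void)
  {
      for (;;) {
          LED_PORT ^= 1u;                    /* toggle bit 0 of the port */
          for (volatile int i = 0; i < 50000; i++)
              ;                              /* crude busy-wait delay */
      }
  }

Everything here compiles down to loads, stores, and branches you could map to assembly by eye, which is exactly the "portable assembler" role.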
For the record, I'm not trying to sell the language or convince anyone to use it. I got involved in this discussion when I read a post claiming that the design of a programming language, developed in the late '60s during the era of COBOL and FORTRAN, and now being used in ways and on types of hardware that couldn't even have been conceived of at the time, was short-sighted.
Hehe, some impressive numbers buried in that link...
I was thinking more of Peltier coolers than liquefied gases...
["Late last month, AMD said that it managed to hit 7 GHz in "extreme overclocking tests," more than doubling the original 3.2 GHz clock speeds enjoyed right out of the box. Apparently the CPU can be safely overclocked at home with speeds up to 3.8 GHz; anything greater will need "exotic cooling materials." AMD's product manager Brent Barry was noted saying that liquid nitrogen and liquid helium are best suited for high-overclocking environments, the former bringing the temperature down to about -140 degrees and the latter to around -240 degrees. Unfortunately, both solutions are somewhat dangerous to use. "This is fairly insane, science experiment stuff," Davis said."]
If we use AMD's indicators, then 200 MHz typical on a 160 MHz spec'd Prop should be doable, with moderate attention to cooling.
Compiled HLLs like C on the Prop I and Prop II are really nothing more than an accommodation for users who either don't want or don't have the time to learn PASM. So I don't think the chip should be designed for the convenience of compiler writers, if it entails compromise on the fronts that really matter.
-Phil
Accommodation for a set of users that is at least 100x larger than the current set of Propeller users. It makes a LOT of business sense for Parallax to accommodate them... also, the "fronts that really matter" differ from person to person.
No, but you could argue that both Spin and C++ are a step up from C. I think the biggest feature that Spin is missing is the ability to pass objects as parameters to functions.
Amen. It's the one thing I banged into all the time on bigger projects. Yeah, you can do without it, but I really missed it.
What does it mean to pass objects as parameters to functions? I suppose it's something other than "someroutine(someobject.someotherroutine)", as that can already be done.
Accommodation got the computer industry gigabyte OSes and development suites. Not to mention OSes no longer understandable by a single person, as Jack Crenshaw said.
Funny, a gigabyte OS was considered an obscene joke years back; today it's a reality. Code bloat is god.
We went from C or Pascal development suites that fit on two floppies to ones distributed on multiple CDs, because of bloat, incompetent professional coders, and language fads.
I watched my FPGA development tools inflate like Mr. Creosote and grow from less than a gig to over 3 gigs. I'm just waiting for them to issue a release that uses an entire HDD. I say that as a half joke, because software shows no signs of slimming, just growing like the Blob.
What does it mean to pass objects as parameters to functions? I suppose it's something other than "someroutine(someobject.someotherroutine)", as that can already be done.
The SPIN equivalent:

  OBJ
    foo : "myobjname"

  PUB Main | x, y, z
    result := bar(foo, x)

  PUB bar(object baz, x)    ' the "object" parameter is proposed syntax
PHP doesn't have a notion of different object types; every object is just of type Object, so there is no static typing going on. SPIN could be extended to support this model. I think the compiler could do all the magic and let the runtime remain the same.
In essence, SPIN with object passing would allow local aliases to global object instances. It would need to be implemented as pass-by-reference. It actually seems really trivial to implement.
A companion to this would be local instantiation of objects in methods. This way you can have a local, anonymous object instance that does something useful. This could take a little more effort. From a compiler standpoint, you could include as many anonymous global object instances as you need for simultaneous usage (multiple COGs in the same routine). This way local instances would actually be aliases to global, anonymous (randomized object names that won't clash) instances. The main downside, as is inherent in SPIN, is that memory/code allocation is done statically, so you can't reclaim or dynamically allocate space.
Passing objects by reference would also somewhat make up for the inherent lack of structures in SPIN, although it might be a complicated solution to an easy problem. Structures are handy, but in SPIN they would merely be aliases to memory locations; again, something that the compiler can handle and the runtime need not care about.
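For what it's worth, here's a tiny C model (hypothetical names, C used only to show the mechanics) of why object references and structure aliases are cheap for a compiler: both boil down to passing a single pointer to the instance's state.

  #include <stdio.h>

  /* Model of an "object": its VAR state gathered into a struct. */
  typedef struct {
      int scl_pin;
      int sda_pin;
  } I2CState;

  /* A "method" is a function taking the instance pointer first. */
  static void i2c_describe(I2CState *self)
  {
      printf("I2C on pins %d/%d\n", self->scl_pin, self->sda_pin);
  }

  /* Passing the object to another routine is just passing its address. */
  static void use_bus(I2CState *bus)
  {
      i2c_describe(bus);   /* local alias to the caller's instance */
  }

  int main(void)
  {
      I2CState bus = { 28, 29 };   /* one statically allocated instance */
      use_bus(&bus);               /* pass-by-reference: an alias, not a copy */
      return 0;
  }

The runtime never needs to know; the compiler just resolves the alias to an address, which is why this seems feasible for SPIN.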
The next step in Objects would be inheritance. PHP supports single inheritance, which is seen as a limiting factor by some, however it makes the compiler simpler and code development is forced to work within the limitations. This is the difference between encapsulation and multiple-inheritance in C++ IIRC.
Going further and deeper is polymorphism, but since SPIN isn't a typed language, this hardly matters.
I think the last language construct that would be useful to sort out is the String notion that keeps coming up. I've seen grumblings from time to time about the lack of native string literal handling in SPIN. I haven't bothered to understand the underlying issue.
What does it mean to pass objects as parameters to functions? I suppose it's something other than "someroutine(someobject.someotherroutine)", as that can already be done.
No more of this:
  ObjTypeB ObjB;
  ObjTypeA ObjA;

  ObjB.DoSomethingUseful(ObjA);

  // ... Meanwhile, in class ObjTypeB:
  void ObjTypeB::DoSomethingUseful(ObjTypeA& var)
  {
      var.DoThis();
      var.DoThat();
      var.variable = 1;
  }
There are lots of places where it makes life easier to be able to pass in a class and access its member functions to manipulate it, vs. a pointer to an object's internal data and hoping you don't forget later on that the order and size of the variables was really important.
As an engineer (and not a real programmer, despite the tens of thousands of lines of code I've written), I love C for the simple reason that it gives me assembly-like control of the CPU, yet with the convenience and coding efficiency of an HLL. When something else comes along that does this better, or with an enhanced fun factor, I'll jump on it. But until then, I simply don't understand the motivation behind knocking C.
Here's a more concrete example from what I'm working on right now: building the low-level stuff for my robot.
In IMUTest.cpp:
  int main(void)
  {
      ...
      I2C i2cBus(26, 27, 400000);  // Create an I2C bus running on a cog.
      result = i2cBus.GetCog();
      ...
      // Create an accelerometer device and give it a reference to the I2C bus.
      MiniIMU9Accel accelA(i2cBus, MIMU_ACCEL_ADR);
      result = accelA.IsOnline();
      accelA.Enable(1);
      accelA.SetDataRate(1);
      accelA.SetFullScale(1);
      ...
      while (1)
      {
          acc_x = accelA.ReadChannel(0);
          acc_y = accelA.ReadChannel(1);
          acc_z = accelA.ReadChannel(2);
          printf("Accelerometer: X %d Y %d Z %d\n", acc_x, acc_y, acc_z);
          pauseMSec(250);
      }
      return 0;
  }
And in MiniIMU9Accel.cpp I can use that I2C bus object and all its member functions to do the reads and writes. Any other device that I code can also use that bus, passed to it when it's created.
  MiniIMU9Accel::MiniIMU9Accel(I2C& bus, uint8_t addr)
  {
      m_bus = &bus;
      m_addr = addr;
      SetByteOrder(1);
      SetUpdateMode(1);
      SetFilter(0, 0);
      SetDataRate(0);
      SetFullScale(0);
  }

  int MiniIMU9Accel::WriteReg(uint8_t reg, uint8_t value)
  {
      return m_bus->TXByte(m_addr, reg, 1, value);
  }

  uint8_t MiniIMU9Accel::ReadReg(uint8_t reg)
  {
      return m_bus->RXByte(m_addr, reg, 1);
  }

  int16_t MiniIMU9Accel::ReadChannel(uint8_t chan)
  {
      int16_t chanVal = 0;
      uint8_t reg = A_OUT_X_L + (chan << 1);
      reg |= A_AUTO_INC_M;
      chanVal = m_bus->RXWord(m_addr, reg, 1);  // Stored pointer to that bus object used here
      chanVal >>= 4;                            // to read the data from the accelerometer.
      return chanVal;
  }
In SPIN I either have each object creating its own personal internal copy of I2C and all the VAR variables, or I have to make I2C a singleton and rename the source file for each object to access the methods and the shared DAT data. In this particular case that could still be workable, but there are many times when I need objects to operate on other objects that they do not contain, and having to do that via pointers to the data is just not really fun. It's not that the work can't be done without these things, but they are things that I really do miss when I switch back to SPIN.
To refresh, Chip said, "I think these last two things (final die artwork and ROM bit pattern) will come together at about the same time and within three weeks we'll have final GDS2 data to send off to the fab. Then, three weeks later we'll see if it works...That is the state of things. I'm really pleased with how things are turning out. I just hope it works on the first try."
First off, thanks so much for the very welcome update. Next, does that "within three weeks" for the GDS2 (GDSII?) data refer to three weeks from the blog post date, or three weeks after the "final die artwork and ROM bit pattern" come together (if there's any difference between the "final die artwork" and the "final GDS2 data")? The ever-hopeful side of me hopes that they refer to the same blessed event.
Either way, it sounds like great progress has been and is being made. And, no doubt, we are all (really) pleased that Chip is really pleased. That personal comment was a nice touch on Chip's part and amplifies the message about the progress. The cake may not be in the oven just yet, but it sounds like the batter is just about ready to pour. Someday, we look forward to an "And Chip said that it was good" comment after the P2 creation is complete/tested (then Chip can rest); apologies to Genesis.
I'm not fully certain that GDS2 = GDSII, but Wikipedia states, "[The] GDSII stream format, common acronym GDSII, is a database file format which is the de facto industry standard for data exchange of integrated circuit or IC layout artwork." Link: http://en.wikipedia.org/wiki/GDSII
I share Sapieha's sentiments that C has been an often-misguiding force in the evolution of computing.
I suppose it made fine sense for large-memory, 'academically'-ideal systems of the time, but it has been shoehorned into everything else since, and has been dissuasive to the advent of computing architectures to which it would not be amenable. I suppose an ARM chip is what it is because of C. I understand that there are mountains of C code and many reasons to support C, but C's hegemony does not inspire me, personally, and I really hope that someday we will have better systems that get created because, finally, the inspiration came to throw off the old shackles that C has subtly placed on computing for the past 30 years. When this happens, I think we will all know it, because computing will suddenly be fresh again and something to get excited about.
My thoughts too ... well, actually three, including Phil.
What does it mean to pass objects as parameters to functions? I suppose it's something other than "someroutine(someobject.someotherroutine)", as that can already be done.
I was going to respond to this but it looks like a whole lot of people have done that already. One thing that I always try to do when designing a language is to make every data type a "first class" type. That means that values of that type can be assigned to variables and passed to functions as arguments. This simple rule gives languages a lot of expressive power. I first ran into this concept when reading the excellent book "Structure and Interpretation of Computer Programs" by Gerry Sussman and Hal Abelson of MIT, but I'm sure it predates that book.
In the case of Spin, I would make it possible to pass a reference to an object as a parameter to a method of another object. I would also make it possible to pass a function as a parameter to another function, although this is somewhat more involved in an object-oriented language than in a functional language. In the case of Spin, I'd be happy to just have objects passed as parameters as a start. I suppose one problem with adding this to Spin is that it would require that method arguments have types. At the moment I think all arguments are just 32-bit integers.
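As an aside, even plain C manages a limited form of this with function pointers. A minimal, self-contained sketch (apply, twice, and square are just illustrative names, not anything from the thread):

  #include <stdio.h>

  /* apply() receives a function as an argument via a function pointer. */
  static int apply(int (*f)(int), int x)
  {
      return f(x);
  }

  static int twice(int x)  { return 2 * x; }
  static int square(int x) { return x * x; }

  int main(void)
  {
      printf("%d\n", apply(twice, 21));   /* prints 42 */
      printf("%d\n", apply(square, 7));   /* prints 49 */
      return 0;
  }

In an object-oriented language the equivalent also needs a bound "this" pointer carried along with the function, which is the extra complication mentioned above.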
Yes, add types! Make them optional so that people who hate types don't have to use them. Using types allows the compiler to perform type checking, which would assist novices in learning the language.
...C has been an often-misguiding force in the evolution of computing.
My gut says that I disagree, but in such esteemed company I can't be sure without further clarification. So could each of you offer at least one feature of modern computers that has resulted from the widespread use of C, together with why that is somehow a bad and misguided thing? Particularly Chip, who said "...old shackles that C has subtly placed on computing for the past 30 years." I'm curious to know what those shackles are.
@Chip,
I suppose it [ C ] made fine sense for large-memory, 'academically'-ideal systems of the time,
As far as I know, the roots of C were developed on DEC's PDP-7. That machine had only 18-bit words and memory from 4K words up to an expensive maximum of 64K. Far from a large system, I would say. Perhaps due to this, C has been used in small systems ever since, all the way down to PIC MCUs and Propeller COGs (thank God).
I'm not sure there was anything 'academically' ideal about the hardware or software around C. The hardware was being built down to a price, as always. The C language and Unix were a quicker and simpler alternative to the huge, failed Multics OS project.
What I see coming out of academia are things like ALGOL, then Pascal, then Ada. All very formalized, rigorous, and rejected by the mainstream. Also Lisp, which no one outside academia comprehends :)
...When this happens, I think we will all know it, because computing will suddenly be fresh again and something to get excited about.
I think many of us here had that feeling already when you gave us the Propeller:) A big thank you for that.
What I see coming out of academia are things like ALGOL, then Pascal, then Ada. All very formalized, rigorous, and rejected by the mainstream. Also Lisp, which no one outside academia comprehends :)
Ummm... Those are not recent products of academia. Might be worth your while to look again. :-)
I have wondered whether something like Erlang might be a good choice for programming the Propeller. I'd like to try implementing a subset of it on jazzed's tetraprop board.
I think many of us here had that feeling already when you gave us the Propeller:) A big thank you for that.
PropForth is a sweet match up for jazzed's TetraProp!
I keep wanting to play with Forth but somehow never seem to find time. I think it is an interesting language. I've thought that since I built one to run on the DEC PDP-10 ages ago. I even have a numbered copy of the BYTE Magazine Forth issue cover art by Robert Tinney.
I don't see why people are bashing C. It's not evil or anything.
I write C code. It works. I'm happy. End of story.
The results of my program are what I care about. As long as it isn't too painful to get those results, then life is good.
---
So... if you are bashing C and are a veteran C programmer, then I accept that. If you are not a veteran C programmer, then I say you should try it out. I think the "don't knock it before you try it" line applies here.
I'll admit, C has a lot of hacks in it (macros...). But you can just Google anything you want to know about C. An answer to your question exists! There is tons of information available about how everything works.
---
Okay, now as for building C code, I can see where people may become angry. Makefiles are not fun. But you don't need to write makefiles anymore. You can write your build system in SCons or CMake. Both of these build systems make it easy to build your code.
---
I don't see any point in the bashing C argument.
Thanks,
If you want to talk about language growth: based on job ads, the top languages, in order, are:
Ruby (on Rails)
Java
Python
PHP
C++
C
The interesting trend is that the list (almost) follows the evolution of the respective languages. As soon as a language gets mature enough to reach widespread adoption, another "faster" or "better" (in some way) language comes up.
I personally couldn't grok Ruby when I saw it. I'm told Ruby was designed to be a "natural" language; perhaps my brain's insistence on order and a procedural approach is what kept me from understanding it.
I suspect Java is skewed in the matchup because Android uses it so exclusively, and there is a high penetration of Android in the marketplace.
Another bit to note: the languages used aren't always appropriate for the task at hand. Acceptance is driven by fads, non-traditional approaches to software development, and the pure need to get some idea to market as fast as possible, because the "cloud" will magically scale their POS and they won't have to think about true architecture.
Given discussions with Chip about SPIN, I think he met his objectives in SPIN. He needed a language to bridge the gap between BASIC, modern languages, and multi-core architecture.
All languages go through evolution; SPIN is somewhat overdue because the interpreter is hard-coded in ROM on a bunch of chips. That being a limitation, there is still a lot that can be done on the compiler end to achieve some significant growth. Roy's port may facilitate that; C/C++ as an HLL for compiler development is actually a good thing. This is a place where it makes sense and is beneficial for the ability to easily make radical changes.
Getting back to a thought I had:
Interpreted languages have made a real showing in overall usage. I don't think this is a bad thing necessarily; I think it bodes well for the future of the Prop and SPIN. I think this trend validates Chip's initial decision to go with an interpreted language instead of compiled-to-ASM. The architecture does lend itself well to interpreted languages, and the efficiency is a definite plus. With the P2 I think efficiency will improve significantly, much more than the given 8x, due to architecture improvements. SPIN just needs a little remodeling to fix some of the idiosyncratic elements and to add in some necessary improvements.
I share Sapieha's sentiments that C has been an often-misguiding force in the evolution of computing.
I've been thinking about this statement some more and I'm not sure it is really complete. It may be true that languages like C have held back evolution in computing, but in that same category are all of the procedural languages, including Pascal, Ada, and even Spin. What C has done that may not be as true of the others is that it has prevented some compiler optimizations that may have been possible without all of the aliasing issues introduced by pointers, but this is a problem with compiler design, not CPU design. Please give me an example of something in C that has impeded advances in CPU architecture that was not also true of any procedural (von Neumann) language.
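To make the aliasing point concrete, here is the classic textbook illustration (a sketch, not anyone's posted code): without C99's "restrict", the compiler must assume the two pointers might refer to overlapping memory, so it cannot reorder or vectorize the loop freely.

  /* The compiler must assume out and in may overlap, so it cannot
     safely keep in[i] in a register across stores to out[i]. */
  void scale(float *out, const float *in, int n, float k)
  {
      for (int i = 0; i < n; i++)
          out[i] = in[i] * k;
  }

  /* "restrict" promises no overlap, freeing the optimizer; this is
     the guarantee Fortran compilers always had for arrays. */
  void scale_r(float *restrict out, const float *restrict in,
               int n, float k)
  {
      for (int i = 0; i < n; i++)
          out[i] = in[i] * k;
  }

Note that this is squarely a compiler problem, not a CPU-architecture problem, which is the distinction being drawn here.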
So could each of you offer at least one feature of modern computers that has resulted from the widespread use of C, together with why that is somehow a bad and misguided thing?
A perfect example is the architecture of the Atmel AVR family. It was designed from the ground up to be optimally efficient for compiled C, and I suppose it is.
From the Atmel website:
"To make the AVR instruction set as efficient as possible, the Atmel team behind the AVR CPU invited compiler experts from IAR Systems to co-develop the first AVR C compiler. Following extensive refinement, the AVR architecture became optimized for C-code execution, with bottlenecks completely eliminated during the construction phase. That is why the AVR has become synonymous with small code size, high performance, and low power consumption."
But that's what I mean by the C tail wagging the dog. As a consequence, the AVR has a highly non-orthogonal programming model and can be a pain to program in assembly, due to multiple exceptions and holes relating to what operations you're allowed to perform where. Contrast that with the Propeller, which is very orthogonal and regular and a joy to program in assembly.
-Phil
Even though they claim that the AVR was designed with C in mind, it's hard to see that from looking at the instruction set. It doesn't look very well tailored for C to me. An orthogonal instruction set seems like it would be better for good code generation. Of course, I'm not an optimizing compiler expert so I may be missing something.
That's the thing: C is conceptually very similar to many other single-threaded, block-structured languages. Think ALGOL, Pascal, Ada, Coral, PL/M, etc.
That's a good point about optimizations. I always imagined that a language's syntax/semantics must have a big effect on the possibilities for the compiler to optimize things. Surely, if you disallow GOTO, the compiler has more chance of analysing control flow and optimizing accordingly. Similarly for pointers, as you say.
I suspect, though, that C was born in an era when compiler know-how was yet to develop, so perhaps the idea was to allow shortcuts in the source code instead of hoping your compiler writer had the smarts to optimize for you.
SPIN just needs a little remodeling to fix some of the idiosyncratic elements and to add in some necessary improvements.
i.e., add a few more features that are already in C++. Whether an interpreter is used or not is irrelevant. All of the language implementations on the Prop use a virtual machine except for pure PASM. Even PropGCC COGC programs and PropBasic rely on helper routines to do multiply, divide, etc.
I appreciate the effort that went into developing the Spin language and the Spin VM. However, in retrospect it would have been better to adopt an existing language and implement a subset of it.
I suspect, though, that C was born in an era when compiler know-how was yet to develop, so perhaps the idea was to allow shortcuts in the source code instead of hoping your compiler writer had the smarts to optimize for you.
That is certainly true of the auto-increment, decrement, and assignment operators. Those are really unnecessary now that compiler optimization techniques have improved. However, those constructs are often easier to understand than the more verbose ones. I'd say that "++foo" says "increment foo" to me better than "foo = foo + 1". But those shorthands are not needed anymore for speed. I'm sure GCC compiles the same code for both expressions, and so would any other modern compiler.
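A quick way to convince yourself, assuming you have GCC handy, is to compile both spellings and compare the assembly:

  /* Two spellings of the same increment. */
  int inc_short(int foo) { return ++foo; }

  int inc_long(int foo)
  {
      foo = foo + 1;
      return foo;
  }

Compile each with "gcc -O2 -S" and diff the output: any modern optimizer emits identical code for the two functions, so the shorthand buys readability, not speed.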
Even though they claim that the AVR was designed with C in mind, it's hard to see that from looking at the instruction set. It doesn't look very well tailored for C to me.
I believe it was an iterative process:
1. Start with an architecture.
2. Write a C compiler for it.
3. Compile a bunch of C programs.
4. Analyze the compiled machine code to see which instructions and register/memory accesses got used the most.
5. Refine the architecture to make sure the most common things get done the quickest and with the smallest footprint.
6. Go back to #2.