Regarding the AVR libc..........
These days I am struggling with <stdint.h> and it is a rather interesting library. I guess this is a new C++ standard library, and it defines a lot of types that require a new typedef to compile. I can even see where it might be useful with the Propeller, where EEPROM and Hub RAM are 8 bits wide and cog RAM is 32 bits. But it is one of those slippery C++ details that I need to figure out how to resolve for porting. Do I add the typedefs as they stand, or do I have to modify them for Propeller usage? Or should I just use char and int?
+++++++++++++++++
I suspect that AVR just wants people to learn THEIR way and not buy other processors for fear of a steep learning curve. Arduino seems to have inherited their quirks by adoption of their libc.
Mutual respect and adhering to standards for 'the common good' hurt the bottom line -- your customers have the means to comparison shop and you have to work harder when the customer is better informed. These days, the pop psychology bookshelves are full of titles about how to make people impulsive and not think things through. Media has followed suit with being less informative and more dramatic.
And yet, in some cases non-standard is useful.......
IT IS IMPORTANT to deal with the 32KB code space on the Propeller. If I need to conserve space, I am setting aside use of print and scan functions for the most part and avoiding floating-point. I suspect trying to run a VGA with keyboard and floating-point is very tight.
I DO APPRECIATE that the BasicStamp made me so aware that just about anything that needs to be done can get done in integer maths in a faster, simpler manner. The conversion to floating point is really just a human interface feature and is often not needed at all.
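A minimal sketch of that scaled-integer approach in C (the sensor reading and calibration factor below are made up for illustration): do all the work in tenths of a degree and only split out the decimal point at the human interface.

#include <stdio.h>

int main(void)
{
    int raw    = 412;             /* made-up sensor reading, in counts                    */
    int tenths = (raw * 5) / 2;   /* made-up calibration: 2.5 tenths of a degree per count */

    /* the "floating point" only exists at the human interface: */
    printf("%d.%d degrees\n", tenths / 10, tenths % 10);
    return 0;
}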
It is the world we live in today. Why bother with loyal customers when you can make them dependent and force loyalty. In a programmable world, train the customer to do it your way --- Tall, Grande, and Venti rather than Medium, Large, and Extra-Large.
I feel that the Propeller 2, with 512KB of Hub RAM and 16 cogs, will be much more friendly to ANSI C in its entirety.
scanf is one function I have never needed.. there's not a single scanf in any of the C code I have written over the last 25 years or so. And I write C every day..
printf is another story of course, difficult to be without (and its variants sprintf, snprintf, vsprintf etc.)
-Tor
I can't say that I use scanf very often myself but it is one of the larger functions in the ANSI library.
It is a relief to know that someone with years of experience doesn't use it at all. I have been trying to figure out how to avoid it, so my intuition must be somewhat right.
Scanf() certainly threw me for a loop on the Propeller. The implication is that it scans a whole line of text from the input buffer and parses out the various formats. It certainly seemed likely to be HubRAM- and cog-intensive if used to accept floating-point input.
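For what it's worth, the usual way around scanf() on a small target is to read a whole line and convert only the fields you need; a minimal sketch (buffer size chosen arbitrarily):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char line[32];   /* small fixed-size input buffer */
    int  value = 0;

    if (fgets(line, sizeof line, stdin) != NULL)   /* read one line of input        */
        value = (int)strtol(line, NULL, 10);       /* convert just the integer we want */

    printf("%d\n", value);
    return 0;
}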
I suspect the main reason Chip dislikes C is because ANSI-C requires things that are really unnecessary (there are other reasons of course). There is no printf() equivalent in Parallax Spin code for example, and all character IO is based on simple functions like bin(), dec(), hex(), str(), tx()/out(), rx(), and rxcheck().
Every C programming class I took promoted printf(), scanf(), and floating point usage. Clearly, when you have enough resources, that all works just fine.
However, the "un-ANSI" alternatives Parallax provides in SimpleIDE libraries are, in their words, "microcontroller sized." Arduino provides similar alternatives. Propeller GCC supports ANSI C/C++ standards of course, but at a code size cost ... the simplest printf and scanf will actually be smaller than the alternatives, but size grows with complexity.
Sorry I have not been on your tail recently, you seem to be in a bit of a muddle so I thought I'd jump in whilst I have access to the internet for a little while:) Connectivity is very poor out here in the forest.
<stdint.h> is not a C++ thing. It is standard C. Well, the C99 standard.
It is very useful because in C types are not fully defined. An "int" for example is a signed integer, but how big is it? The C standards leave that as implementation dependent, could be 16 bits on an old 16 bit machine or 32 on newer architectures. What should it be on an 8 bit micro like the Z80 or AVRs or PICs etc?
Same with "char", it's big enough to hold a character. But is it signed or not? Makes a difference sometimes.
So, historically people have made their own definitions of these types when they really want to be sure things are the right size and shape. C99 just formalizes that.
No, you don't have to write any typedefs to use the types defined in <stdint.h>. For sure you don't want to go overriding those definitions!
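For illustration, here is roughly what <stdint.h> buys you compared with the home-grown typedefs people used to write (the names and values below are only examples):

#include <stdint.h>

/* the old do-it-yourself approach -- and hope the sizes are right on this target: */
typedef unsigned char u8;
typedef signed short  s16;

/* the C99 way -- sizes are guaranteed, whatever the compiler and target: */
uint8_t     ee_byte  = 0xFF;        /* exactly 8 bits, unsigned                          */
int16_t     adc_val  = -1234;       /* exactly 16 bits, signed                           */
uint32_t    cog_word = 0xDEADBEEF;  /* exactly 32 bits -- the natural cog RAM word size  */
int_fast8_t counter  = 0;           /* at least 8 bits, whichever size is fastest here   */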
Regarding the AVR libc.......I suspect that AVR just wants people to learn THEIR way...
Never mind what ATMEL wants; a quick Google would have shown you that avr-libc is an open source project created by people who have nothing to do with ATMEL. It's there to support compiling C for AVRs with GCC.
Yes, C will be a much nicer proposition on the PII.
I remember a quiz where I wrote a program using scanf and printf only to have the teacher write a note saying that the microcontroller didn't have those 2 functions.
Printf and scanf are what we were taught to use in C and I had never even heard of getchar and putchar till we used that microcontroller.
I guess there needs to be an addendum to all C instructors to teach the use of getchar and putchar for those people who will end up using C on a microcontroller.
And as mentioned, the size of variables is a whole other animal.
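To make that concrete: getchar and putchar by themselves are enough for a simple console, and a minimal echo loop is all it takes to see them in action.

#include <stdio.h>

int main(void)
{
    int c;
    while ((c = getchar()) != EOF)   /* read a character ...          */
        putchar(c);                  /* ... and echo it straight back */
    return 0;
}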
I think the biggest problem is that both printf and scanf have huge numbers of options even in ANSI C and it may be difficult to create a compliant version that is small enough to use in LMM or CMM modes on the Propeller. I imagine the same is true of the Arduino since it only has 32k of code space. I realize that it is possible to implement these functions in 32k of code but there may be more to the user's program than just printf and scanf. The extra space taken by full implementations may, and often does, make the difference between a program fitting in hub memory or not.
For LMM maybe - but when using CMM, a program with a full ANSI standard implementation of both printf() and scanf() fits easily in Hub RAM.
That, of course, depends heavily on how big the program itself is.
I hope I'm not alone in wondering about the value of functions like printf and scanf on a microcontroller!! I understand it's part of the standard, but come on. I think when it comes to some functions we may want to consider a couple of options. The first is that the propgcc "standard libraries" will be a subset of what's required by ANSI C (as opposed to the language itself); or include those functions in libraries that are so awful that no one in their right mind would use them, and then provide functions more appropriate to a microcontroller!!
This brings up an interesting question: do all of the functions in #included files get loaded, or only the ones that are called? If the latter, it would seem that calls to printf could be decomposed at compile time, with the compiler outputting a sequence of lower-level calls just to the formatters (e.g. string, integer) actually used, rather than to some omnibus printf function. As long as it works like printf, there's no reason to care that a function by that name is never invoked.
For example, for printf("The temperature is %2.2d degrees.\n", temp); the compiler would generate:
putstr("The temperature is ");
putdec(temp, 2, 2);
putstr(" degrees.\n");
This would only work, of course, if the formatting strings are quoted constants (e.g. "%2.2d\n") and not variables. But that's the usual case. In fact, I don't believe I've ever used a variable formatting string in all the Perl printfs I've called.
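To sketch what those lower-level formatters might look like (putstr and putdec are the hypothetical names from the example above; this toy putdec handles only non-negative values and ignores the field width):

#include <stdio.h>

static void putstr(const char *s)
{
    while (*s)
        putchar(*s++);
}

/* print a non-negative integer with at least 'digits' digits (cf. "%2.2d");
   buf is sized for 32-bit values */
static void putdec(int val, int width, int digits)
{
    char buf[12];
    int  i = 0;

    (void)width;                          /* field width ignored in this sketch */
    do {
        buf[i++] = (char)('0' + val % 10);
        val /= 10;
    } while (val > 0 || i < digits);

    while (i > 0)
        putchar(buf[--i]);
}

With putstr() and putdec() defined along those lines, the three calls above would produce the same output as the original printf() for this example.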
#include doesn't load anything, unless there is C code in there. Which is really bad practice (ignore C++ for now, that's not C).
#include only provides declarations. What gets loaded comes from libraries during linking, and how much gets loaded depends on the intelligence of the linker. A smart one will only load functions actually used by the code. Well, with static linking. With dynamic linking (common on desktops) nothing gets loaded into your executable during linking, except hooks for calling functions in dynamically loaded, shared libraries.
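A tiny illustration of that split (file and function names hypothetical): the header carries only a declaration, and the definition only ends up in the final image if the linker decides something needs it.

/* blink.h -- declaration only; #including this adds no code to anything */
void blink(int pin, int times);

/* blink.c -- the definition; compiled to blink.o, perhaps archived into libblink.a,
   and pulled into the executable only if something actually calls blink() */
void blink(int pin, int times)
{
    /* toggle the pin 'times' times ... */
    (void)pin;
    (void)times;
}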
The PropellerGCC printf() uses only what is required based on the parameters being used.
That is, printf("Hi\n") produces essentially the same code as puts("Hi"). Once you start adding things like format specifiers ala printf("Hi %s\n", "Phil"), you get bigger code. Then it grows again when floating point is added because the math library is required.
That "code growth" is what troubled Andy. He wanted a "deterministic size" function that provides everything all at once.
That is how print() evolved. The resulting 32 bit double print() function is about 5KB where the fully-bloated 64 bit double enabled ANSI C printf() function is about 17KB. Recently I added a printi() function that does everything that print() does except for floating point. printi() is about 1KB. Sizes from default CMM mode builds.
This brings up an interesting question: do all of the functions in #included files get loaded, or only the ones that are called? If the latter, it would seem that calls to printf could be decomposed at compile time, with the compiler outputting a sequence of lower-level calls just to the formatters (e.g. string, integer) actually used, rather than to some omnibus printf function. As long as it works like printf, there's no reason to care that a function by that name is never invoked.
This would only work, of course, if the formatting strings are quoted constants (e.g. "%2.2d\n") and not variables. But that's the usual case. In fact, I don't believe I've ever used a variable formatting string in all the Perl printfs I've called.
Tor, I could have phrased my question better: do all of the functions prototyped in #included files get loaded ... ? So your answer about the linker brings up the next question: how smart is the linker provided with SimpleIDE?
-Phil
Edit: I think Steve has indirectly answered this question.
That is, printf("Hi\n") produces essentially the same code as puts("Hi"). Once you start adding things like format specifiers ala printf("Hi %s\n", "Phil"), you get bigger code.
This would seem to imply that the compiler does do the decomposition, then, and that an actual function named printf is never called. Right?
GCC uses a linker that can prune dead code in two ways. The linker is useful for lots of things, this is just one of them.
1) Libraries are an archive of objects. The linker only chooses the objects it needs to fulfill the program.
2) Dead code (many functions in one file/object) can be eliminated with linker options.
Simple Libraries generally use option 1 unless the developer doesn't have time or some other reason.
For other libraries like the arduino thingy, we use option 2.
Option 2 can be enabled in SimpleIDE with a check box in the Project Manager -> Compiler tab -> Enable Pruning.
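For reference, option 2 is typically GCC's section-based garbage collection: compile with -ffunction-sections (and usually -fdata-sections) so every function gets its own section, then link with -Wl,--gc-sections so the linker can discard any section nothing references. The exact invocation depends on the toolchain, but it is roughly: propeller-elf-gcc -Os -ffunction-sections -fdata-sections -Wl,--gc-sections -o demo.elf demo.c, with demo.c standing in for your own sources.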
In general header files only contain definitions, function prototypes etc. As mentioned above it is frowned upon to put actual code in header files.
So, headers do not generate any code. The linker never sees what is in there. And it has no effect on what is linked in.
The linker only sees actual compiled code, from your .c files. And it tries to resolve those names and link code to them. Perhaps from other code in your program or from any libraries it has available.
SimpleIDE has nothing much to do with this. It's only a glorified editor. The GCC compiler and other tools, assembler, linker, etc are doing all the work.
The question then is, if I call one little function from a huge library that contains thousands of functions do I get just the function I want pulled into my executable or the whole damn thing?
That rather depends on how the library has been put together and how smart the linker is and is still a mystery to me.
printf is an interesting special case because in general it is not known at compile time what that format string is wanting to print or how it should be formatted. Seems that modern compilers do actually "compile" that format string if it is a literal and do the right thing. Which is a bit odd really if you think about it because that format string syntax is not actually the C language but rather a different language.
Certainly I have seen the compiler use puts() instead of the printf() I wrote for simple string printing. There is some smart optimization going on there.
As for the question of "should we have printf and friends in a micro-controller"? Well why not? A modern micro-controller has heaps of code and data space. If printf is using a few percent of that then it's probably very useful.
Heck, modern micro-controllers run JavaScript and Python.
That's up to the compiler, linker, and the way the library is defined. I haven't looked at it lately.
I derived my inference from your prior statement and the following: If the compiler did simply output a call to printf, then all of the myriad functions that printf might call would have to be linked in, since there would be no way for the linker to know which ones got used and which ones did not. Since the load images grow, depending upon which formatters are used, it would seem that printf is just a pseudo-function that the compiler deals with up front.
Tor, I could have phrased my question better: do all of the functions prototyped in #included files get loaded ... ?
Prototypes don't load anything, they just make the compiler's job easier. You can even leave the header files out, the executable will be the same, if the compiler can figure out enough without them. Needs you to not enable the compiler's strict prototype checking though.
You can even leave the header files out, the executable will be the same,
I suspect this is not true. Surely all function parameters and returns are assumed to be int if they are not specified in a prototype. So passing floats, for example, would result in casts to ints and the generation of different code along with some wonky results.
This would seem to imply that the compiler does do the decomposition, then, and that an actual function named printf is never called. Right?
-Phil
I think the only optimization that the compiler makes is to replace calls to printf where the only argument is a string containing no % arguments with puts. I don't think it does anything further than that. The pruning of the handling of floating point numbers happens if you don't include -lm in your linker command line. I don't think the compiler parses the % arguments in the format string to decide whether to include the floating point conversion functions.
Also, I think it is actually possible that printf will generate less code than lots of individual calls to functions like str(), dec(), hex(), etc. That may not outweigh the size of printf itself though unless there are a lot of complex calls to it. If you think about it, printf has to internally have all of those individual functions. It just adds a layer on top that parses the format string to decide which low-level formatting function to call.
Yes, I kept my answer short because I'm writing on Android and this forum software makes that very painful.
It's surprising how many functions continue to work correctly in the absence of prototypes though. Particularly on 32-bit systems (more things, e.g. long, map into the space of the standard 'int'). But sometimes not. However, the functions loaded do not depend on whether header files are included or not, except of course when header files use preprocessor macros to replace words/names in your code.
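The classic "sometimes not" case, under the old C89 implicit-declaration rules, is calling something that does not actually return an int; a sketch (newer compilers will at least warn about the implicit declaration):

#include <stdio.h>
/* <math.h> deliberately not included */

int main(void)
{
    /* with no prototype in scope, C89 assumes sqrt() returns int, so the
       double it actually returns gets misinterpreted -- garbage, not 1.414 */
    double r = sqrt(2.0);
    printf("%f\n", r);
    return 0;
}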
SimpleIDE has nothing much to do with this. It's only a glorified editor. The GCC compiler and other tools, assembler, linker, etc are doing all the work.
Normally it is true that the IDE would have nothing to do with it. However, I was asked to eliminate the use of make files, so the build process is really managed by SimpleIDE. I've considered generating make files, but that's an extra test item. The only thing I don't really like about the process at the moment is that there are no file-time dependency checks.
So, the #include headers provide a clue as to what needs to be part of the program. Whenever "Auto Include Simple Libraries" is enabled, SimpleIDE will search the Learn Library for matching headers and include the appropriate libraries.
The build status has a record of everything it takes to build a program except for the pre-compiled libraries that come with the SimpleIDE Learn Library folder.
Also, I think it is actually possible that printf will generate less code than lots of individual calls to functions like str(), dec(), hex(), etc. That may not outweigh the size of printf itself though unless there are a lot of complex calls to it. If you think about it, printf has to internally have all of those individual functions. It just adds a layer on top that parse the format string to decide which low-level formatting function to call.
David, in my experience many calls to the separate output functions are required before using printf() starts saving code space. Using the stdio file system tends to reduce that advantage though.
I derived my inference from your prior statement and the following: If the compiler did simply output a call to printf, then all of the myriad functions that printf might call would have to be linked in, since there would be no way for the linker to know which ones got used and which ones did not.
Exactly, that's why using the little output functions takes up a relatively small code space.
It also occurs to me that using the separate output functions instead of calling printf will let the program run faster, since printf would need to parse the format string every time it's called.
Let's compare these things. (Syntax purposefully wrong -- abbreviated for side-by-side comparison only.)

Parallax Serial Terminal.spin | SimpleIDE Learn Library | Arduino Library | ANSI C/C++ (GCC, LCC, etc.)
------------------------------|-------------------------|-----------------|-----------------------------
Bin()    | putBin(val), writeBin(dev,val)   | Serial.print(val,BIN) | No direct equivalent.
BinIn    | getBin(val), readBin(dev,val)    | No direct equivalent? | No direct equivalent.
Dec()    | putDec(val), writeDec(dev,val)   | Serial.print(val,DEC) | printf("%d",val), fprintf(dev,"%d",val)
DecIn()  | getDec(val), readDec(dev,val)    | Serial.parseInt()     | scanf("%d",&val), fscanf(dev,"%d",&val)
Hex()    | putHex(val), writeHex(dev,val)   | Serial.print(val,HEX) | printf("%x",val), fprintf(dev,"%x",val)
HexIn()  | getHex(), readHex(dev)           | No direct equivalent? | scanf("%x",&val), fscanf(dev,"%x",&val)
Str()    | putStr(str), writeStr(dev,str)   | Serial.print(str)     | printf(str), fprintf(dev, str)
StrIn()  | getStr(), readStr(dev)           | Serial.readBytes()    | gets(), fgets(), fread()
Char()   | putChar(ch), writeChar(dev,ch)   | Serial.write(ch)      | putchar(), fputc(dev)
CharIn() | getChar(), readChar(dev)         | Serial.read()         | getchar(), fgetc(dev)
RxCheck() | device dependent ... fdserial_rxcheck() | Serial.available() | No direct equivalent.
No direct equivalent. | putFloat(), writeFloat(dev)     | Serial.println(val)  | printf("%f", val), fprintf(dev,"%f",val)
No direct equivalent. | putLine(str), writeLine(dev,str) | Serial.println(str) | puts(str)
No direct equivalent. | print(), printi(), printf(), dprint(dev), dprinti(dev), fprintf(dev) | No direct equivalent. | printf(), fprintf(dev)
No direct equivalent. | scan(), scani(), scanf(), dscan(dev), dscani(dev), fscanf()          | No direct equivalent. | scanf(), fscanf(dev)
Not required. | Not required. | Serial.flush() | flush(), fflush(dev)
This makes about as much sense as a non-smoker refusing to buy a car because it comes equipped with an ashtray.
Ross.
LOL. Maybe some day he will reveal all.
-Phil
But Phil, it's the establishment ;-)
-Phil