DIP8 Cortex-M0+ from NXP
Leon
NXP has just announced the LPC800 - a 32-bit Cortex-M0+ in a DIP8 package:
http://www.nxp.com/news/press-releases/2012/11/nxp-revolutionizes-simplicity-with-lpc800.html
I should be getting some samples in two or three weeks.
Comments
Looking forward to your reports back.
With respect to the LPC800 family, though, I wonder when NXP will get around to adding an ADC.
What apps do you plan to squeeze into that?
Prop2 SD card bootloader and power manager
Sorry, couldn't resist.
The PICs and AVRs it will be competing with have far less flash - the PIC12F675 only has 1024 words. ARM C code is very efficient.
I'd guess it would be possible, but it really depends on how hobbyist-friendly the tool sets for these ARM chips are, and how difficult these particular ARM chips are to configure. But if it takes 2-4 hours just to illuminate an LED, it ain't worth it.
For me, I'll stay with a PICAXE if I need an 8-pin micro.
A similar program for the new LPC800 will be somewhat simpler.
Good example. I'm starting to realize why the Arduino is so popular and why we love the Propeller so much.
By the way, how long is delay()?
Written like that, it has no obvious meaning. If I port that code to some other machine, what is delay() there?
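One way to give delay() a defined meaning is to write it against the core clock yourself. A rough sketch in C, assuming a 12 MHz core clock and a guessed cycles-per-loop figure (the real number depends on the compiler and flash wait states; a hardware timer is the portable answer):

    #include <stdint.h>

    #define CORE_HZ         12000000u  /* assumed 12 MHz core clock        */
    #define CYCLES_PER_LOOP 4u         /* rough guess for the loop below   */

    /* Busy-wait for approximately 'ms' milliseconds. */
    static void delay_ms(uint32_t ms)
    {
        volatile uint32_t n = ms * (CORE_HZ / 1000u / CYCLES_PER_LOOP);
        while (n--)
            ;  /* each pass burns roughly CYCLES_PER_LOOP cycles */
    }

Written that way, at least the question "how long is it?" has an answer you can point at.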
My question is: in general, can you execute code from an EEPROM or some type of NVRAM? I'm really interested in 8-pin chips for tasks like power management, or as an intermediate chip to simplify reading sensors for a host chip like the Propeller. I don't know much about AVR, and I'm thinking the skills I'd acquire using these instead of ATtinys would be more beneficial in the long run, but 16 KB is fairly limited if code can't be stored somewhere else. I've also been looking into 8051/52s, but like I said, if I have to learn a whole new platform anyway, this may be the more beneficial one.
Still, do you know if code can be run from an EEPROM? Are these using Thumb to compact instructions?
A subset of Thumb 2, to be more specific. The original Thumb instruction set was 16-bit, and ARM7TDMI processors allowed you the choice of either ARM (32-bit) or Thumb instructions. With the Cortex family, they got rid of this choice and instead enhanced the Thumb instruction set with certain 32-bit instructions. M3 chips get the full Thumb 2 set, and M0 processors get a subset of Thumb 2.
Still, none of this should be viewed as a liability. Thumb 2 is a great compromise between execution speed and code compactness. I've been quite amazed at how much program can fit into the limited code space of an ARM Cortex M chip.
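To make the compactness concrete, here's a trivial function and the sort of code a Cortex-M0 compiler typically emits for it (exact output varies by compiler and flags):

    /* A trivial function... */
    int add_one(int x)
    {
        return x + 1;
    }

    /* ...and its typical Cortex-M0 output: two 16-bit Thumb
       instructions, 4 bytes total, versus 8 bytes for the same
       pair in the 32-bit ARM encoding an ARM7 would use:
           add_one:
               adds r0, r0, #1
               bx   lr
    */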
Perhaps it should also be pointed out that the M0 is miraculous in how much processor you get for so few gates and so little power consumption. Still, it is noticeably slower than the M3 family: in very rough terms the M0 is half as fast, after accounting for clock-rate differences, according to some comparisons I made a couple of years ago.
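As for the EEPROM half of the question: a Cortex-M part can't fetch instructions from a serial EEPROM directly, but the usual workaround is to copy a position-independent code blob into RAM and branch to it. A rough sketch, where eeprom_read() is a hypothetical placeholder for whatever driver actually talks to the part:

    #include <stdint.h>

    /* Hypothetical driver: read 'n' bytes from the external EEPROM. */
    extern void eeprom_read(uint32_t ee_addr, void *dst, uint32_t n);

    static uint8_t code_buf[256] __attribute__((aligned(4)));

    void run_from_eeprom(uint32_t ee_addr, uint32_t len)
    {
        eeprom_read(ee_addr, code_buf, len);  /* copy the blob into RAM */
        /* Bit 0 set keeps the branch in Thumb state (mandatory on Cortex-M). */
        void (*entry)(void) = (void (*)(void))((uint32_t)code_buf | 1u);
        entry();
    }

The blob has to be built as position-independent code (or linked for the buffer's address), and on parts this small you'd normally keep data, not code, in the EEPROM.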
Exactly - what were they thinking, releasing a new micro with no ADC?!
The small 8-bit alternatives leave this for dead.
You can get an ADC, wide Vcc, and smaller packages.
DIP8 is a gimmick: who is going to volume-produce using DIP8?
Why make a 32-bit controller with 16-bit timers?
The Nuvoton Mini51 is a much better pitch into 8-bit space: the family is similar in price, but includes an ADC, two 32-bit timers, and wide Vcc.
See
http://www.nuvoton.com/NuvotonMOSS/Community/ProductInfo.aspx?tp_GUID=5dbf7d7a-b6df-4fe1-91c9-063449500ce7
and
http://www.coocox.org/Cookie.html
It's like having a Harley-Davidson fitted with a Ferrari engine.
I have a dev board with an STM32F107, and all its timers are 16-bit too. I suppose the prescalers stand in for the extra width of 32-bit timers. These timers generally pack a lot of functionality; maybe 32 bits would take "too much" chip space.
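To put numbers on that: a 16-bit prescaler in front of a 16-bit counter covers most of the range a 32-bit timer would, at the cost of granularity. A generic sketch of the arithmetic (the PSC/ARR names are borrowed from ST's registers; the math itself is vendor-neutral):

    #include <stdint.h>

    /* Pick a prescaler/reload pair for a period in microseconds at
       timer clock f_hz. Returns 0 on success, -1 if out of range. */
    int timer_config(uint32_t f_hz, uint32_t period_us,
                     uint16_t *psc, uint16_t *arr)
    {
        uint64_t ticks = (uint64_t)f_hz * period_us / 1000000u;
        uint32_t pre   = (uint32_t)((ticks + 65535u) / 65536u); /* ceil */
        if (pre == 0u || pre > 65536u)
            return -1;
        *psc = (uint16_t)(pre - 1u);          /* divide the clock by pre */
        *arr = (uint16_t)(ticks / pre - 1u);  /* then count the rest     */
        return 0;
    }

At 72 MHz a one-second period needs a prescale of 1099, so the resolution drops to about 15 us - that's the granularity trade the prescaler makes.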
Some TI parts have 32/64-bit timers, but really, one of the reasons you move to more bits is to leave saturation and granularity problems behind.
If the CPU made the move, it seems skewed to leave some peripherals at 16 bits.
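The granularity point is real, but the classic 16-bit wraparound problem at least has a clean fix: with unsigned arithmetic the difference is correct modulo 2^16, so a narrow timer limits the longest interval you can measure, not the correctness. Assuming a hypothetical timer_count() that returns the free-running counter:

    #include <stdint.h>

    extern uint16_t timer_count(void);  /* hypothetical free-running counter */

    void measure(void)
    {
        uint16_t start = timer_count();
        /* ... do the work being timed ... */
        uint16_t elapsed = (uint16_t)(timer_count() - start);
        /* valid for any interval up to 65535 ticks, even across wraparound */
        (void)elapsed;
    }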