Need easiest to code 1-pin data transmission protocol
HollyMinkowski
Posts: 1,398
I am building a uC controlled device that will be connected
to other uC projects by a 3 wire cable. The 3 wires are + 4-7V,
GND and the line from the single pin that would send data by
simply being set high/low in some pattern.
I want something that is easy to code on any uC. PIC, AVR,
ARM, Prop... whatever the user happens to have. I want it
to be easy to understand so even a beginner could write a
simple subroutine and by sending a byte value to that subroutine
have it transmitted to my project. I need something that can
be sent at wildly varying speeds and still work.... 1 byte/sec
up to a mediocre to moderate speed of perhaps 100,000 bytes/sec.
In effect, I need something an idiot could code that would send data
to my project using one pin. All the smarts required to adjust for
different data speed would be contained in my code.
I could code so that several protocols would work, my code would
detect the protocol after a few bytes are sent and having stored
the data pulses in a buffer would go back to the start and decode
from there after the protocol has been determined.
I thought I would pose this question here since it might spare me
from re-inventing the wheel. I am aware of the common schemes
for doing this sort of thing, I will have those available as well but
I need a super-simple method to supplement those.
Data need only flow one way, from the user's uC to the uC in my
project.
(My device is powered from the 4-7V from the other project. It has
no power of its own.)
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
justasm.blogspot.com/
Comments
Any takers?
Maybe I should clarify this more
I need a data comm protocol that is so simple
that just a quick read of the specifics will allow
even the most novice of programmers to code
something that will work to send data to my project.
My first thought was to make it very tolerant of the
transmission speed and handle that on my end.
I'd like the data to be super simple to decode when displayed
by a scope.
My first thought was to break up a byte into 4 groups
of 2 bits like this xx|xx|xx|xx and since there will
only be 4 distinct characters needed to send each 2 bit
section the protocol would only need 4 characters and
2 pauses of different length to form a data stream. This
could be sent by a super simple asm routine just by
sending a byte value or pointer to a string of bytes at
an address in memory. The asm code would mainly be
just simple shifting to isolate the 4 groups and send them
along with a short space between groups and a longer space
at the end of a string of 4.
The 4 chars needed would represent 00, 01, 10 or 11,
so you could even use . .. ... .... to represent them. Not
very elegant looking...but very easy to understand and code.
And the data would be plain to see by looking at the data
stream.
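Something like this rough sketch (in C rather than asm, just for readability) is what I have in mind for the sender. set_pin() and delay_us() are placeholders for whatever the user's uC provides, and the timings are arbitrary since my end would measure them rather than assume them:

#define PULSE_US      100   /* width of one pulse; arbitrary, the receiver measures it */
#define GROUP_GAP_US  300   /* short space between 2-bit groups */
#define BYTE_GAP_US   900   /* longer space marking the end of a byte */

extern void set_pin(int level);    /* user-supplied: drive the data pin high/low */
extern void delay_us(unsigned n);  /* user-supplied: busy-wait for n microseconds */

static void send_group(unsigned two_bits)
{
    unsigned pulses = two_bits + 1;          /* 00 -> 1 pulse ... 11 -> 4 pulses */
    while (pulses--) {
        set_pin(1); delay_us(PULSE_US);
        set_pin(0); delay_us(PULSE_US);
    }
    delay_us(GROUP_GAP_US);                  /* short space between groups */
}

void send_byte(unsigned char b)
{
    for (int shift = 6; shift >= 0; shift -= 2)   /* four 2-bit groups, MSB group first */
        send_group((b >> shift) & 0x03);
    delay_us(BYTE_GAP_US);                        /* long space ends the byte */
}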
I was hoping that someone had done all this already so I
would not have to think too hard...LoL I was hoping someone
here would think of something a lot better than this.
There are several elegant but complicated protocols for
something like this, but I need something extremely simple.
I'm sacrificing speed and elegance for simplicity for the end users.
(Of course, I'm the idiot that would code the speed tolerant routine in my project)
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
justasm.blogspot.com/
Post Edited (HollyMinkowski) : 7/23/2010 10:01:53 PM GMT
I haven't come up with anything simpler than what is already in common use...the ideas I have come up with
so far are perhaps even more obfuscated ..LoL
So maybe I should just use something like I2C and just write the code to
handle it for as many uCs as I can code for and be done with it. I was hoping
to come up with something even simpler though in hopes that casual coders could
easily modify and tweak the routines.
Maybe 1-wire serial is about as simple as it can get, so just use that
protocol and write several demo routines for it and hope for the best.
I guess that people have been working at creating simple protocols
for data that use just gnd and a single data line forever and I'm unlikely
to best what has already been created.
At least coders won't have to worry about data collisions since this
will be all one direction from a single host to the slave device.
I guess I could also use 2 wires instead of three and just leave the
data line high to power the external device, pulling it down to gnd potential for
short signaling periods instead of raising it to +V. A small cap
would hold enough charge so the external device would not reset.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
justasm.blogspot.com/
Post Edited (HollyMinkowski) : 7/24/2010 10:05:57 AM GMT
Dallas Semi's 1-wire protocol is not exactly simple, as it is full of exceptions made for individual devices. And it seems that if you don't use their driver chips - as well as their slave products - results are often less than expected.
Serial code is all about shift left and shift right in assembly language. For an 8-bit microcontroller, one-byte packets are optimal. I doubt two-bit serial would be worth the effort.
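For instance, a plain bit-banged 8N1 transmit is little more than a shift in a loop. A rough C sketch, where set_pin() and delay_us() stand in for whatever the target uC provides and BIT_US sets the baud rate (about 104 us for 9600 baud):

#define BIT_US 104                 /* ~9600 baud; pick to suit */

extern void set_pin(int level);
extern void delay_us(unsigned n);

void uart_send_byte(unsigned char b)
{
    set_pin(0); delay_us(BIT_US);          /* start bit */
    for (int i = 0; i < 8; i++) {
        set_pin(b & 1); delay_us(BIT_US);  /* data bits, LSB first */
        b >>= 1;                           /* the shift at the heart of it */
    }
    set_pin(1); delay_us(BIT_US);          /* stop bit */
}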
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Ain't gadgetry a wonderful thing?
aka G. Herzog [ 黃鶴 ] in Taiwan
Post Edited (Loopy Byteloose) : 7/24/2010 10:11:55 AM GMT
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Leon Heller
Amateur radio callsign: G1HSM
Your receiver detects state changes -- 1 to 0 or 0 to 1. The beginning or ending state doesn't matter. You are only timing and detecting changes in state. You have a basic baud rate which the receiver does not initially know.
You begin each byte (or packet or whatever) with a 3 or 4 baud start. The receiver uses this to establish the baud rate and you then send however many bits are appropriate using 1 and 2 baud transition timings to indicate 0 and 1 (or whatever). If the timing falls outside of tolerances for a bit, you abort and wait for a greater than 2 baud stability before attempting to do more decoding.
This should be fairly simple to code and tolerant of wide differences in (initial) timing. Certain kinds of noise would give it a big problem but it doesn't sound like you're too worried about that.
It has the disadvantage that the byte transmit rate will vary depending on how many zeroes and ones there are in the signal, but the advantage that a receiver which starts listening in the middle of a message can eventually synchronize without needing to know where it is.
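A rough C sketch of the receiving side, assuming a hypothetical wait_for_edge() that blocks until the data line changes state and returns the elapsed ticks since the previous edge. It assumes the 3-baud start variant and that we catch the start of a byte; MSB-first bit order is an arbitrary choice:

extern unsigned long wait_for_edge(void);   /* platform-supplied: ticks since previous edge */

int receive_byte(unsigned char *out)
{
    wait_for_edge();                        /* sync to an edge (assumed to begin the start interval) */
    unsigned long start = wait_for_edge();  /* the 3-baud start interval */
    unsigned long baud  = start / 3;        /* derive the bit period from it */
    unsigned char b = 0;

    for (int i = 0; i < 8; i++) {
        unsigned long t = wait_for_edge();
        if (t < (3 * baud) / 2)             /* roughly 1 baud -> 0 */
            b = (unsigned char)(b << 1);
        else if (t < (5 * baud) / 2)        /* roughly 2 baud -> 1 */
            b = (unsigned char)((b << 1) | 1);
        else
            return -1;                      /* out of tolerance: abort and wait for idle */
    }
    *out = b;
    return 0;
}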
I like the idea of a two wire cable instead of a three wire, sending both
power and data...but there is a problem. What if the user is trying to send
data from an older uC that can only deliver a few mA from its I/O pin? It would
not work in that case unless something like a PN2222 was used as a switch.
That makes it very complex for a newbie; they might not be able to figure
out how to use a transistor to deliver more power from an I/O pin...so better
to remain with the three wire cable and connect directly to the user's gnd and
+V rails.
Once I decide on the primary protocol and get it to work I will then add a few
more protocols and let my software discern which is being sent to it. I figure if
I store all the pin changes to a buffer at the start I can't miss any data, since
once the speed/protocol is discovered I can jump back and decode from the first
recorded pin change. I wonder how hard it will be to enable handling of several
different data protocols, and just how many a 160 MIPS Propeller could handle.
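The capture side could be as simple as an edge interrupt that timestamps every pin change into a buffer; once the protocol/speed is recognised, decoding just walks the buffer from entry 0. A sketch, with read_timer() and the interrupt hookup left as platform-specific placeholders:

#define EDGE_BUF_SIZE 1024

static volatile unsigned long edge_time[EDGE_BUF_SIZE];  /* timestamp of each pin change */
static volatile unsigned      edge_count;

extern unsigned long read_timer(void);   /* free-running timer, platform-specific */

void pin_change_isr(void)                /* hook this to the data pin's edge interrupt */
{
    if (edge_count < EDGE_BUF_SIZE)
        edge_time[edge_count++] = read_timer();
}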
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
justasm.blogspot.com/
Post Edited (HollyMinkowski) : 7/25/2010 3:25:33 AM GMT
So any uC that can pull down to 0 volts will work fine.
Of course the big advantage with 2 wire is that you no longer have to rely on accurate crystal controlled timing. The minus is that a master has to decide the clock rate and slaves must follow. The I2C standard can have the slave slow down the clock, but Philips owns the concept. I suppose that the Master could inquire about preferred speed by polling slave identities and then adjust to the slowest.
I have done several comparisons of SPI, I2C, and other non-standard setups (like the temperature/humidity sensor that Parallax sells). Pretty much everyone in the industry has grabbed up the options that can be owned. SPI can easily share a TX/RX pin if the Master and Slave know that it is doing so. I'd stay with SPI if you want newbies to have something simple to comprehend.
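If it helps to picture it, a master-transmit-only SPI-style send (clock + data, mode 0, MSB first) is only a few lines. In this sketch, set_clk(), set_dat() and delay_us() are placeholders for the user's own pin and timing routines:

extern void set_clk(int level);
extern void set_dat(int level);
extern void delay_us(unsigned n);

void spi_send_byte(unsigned char b)
{
    for (int i = 7; i >= 0; i--) {
        set_dat((b >> i) & 1);    /* put the bit on the data line */
        delay_us(5);
        set_clk(1);               /* slave samples on the rising edge */
        delay_us(5);
        set_clk(0);               /* clock back low, ready for the next bit */
    }
}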
One of the weirdest two-wire serial standards is the bus that keyboards and mice use. In one direction it is 11 bits and in the other it is 12 bits. I suspect this was a typical IBM ploy to make hackers work harder, like EBCDIC alpha-numeric encoding rather than ASCII.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Ain't gadgetry a wonderful thing?
aka G. Herzog [ 黃鶴 ] in Taiwan
Post Edited (Loopy Byteloose) : 7/25/2010 7:11:09 AM GMT
One thing you have not mentioned is how much supply current your downstream device requires and how far away it will be. Both of these things will have a major impact on the options available for driving a 2- or 3-conductor cable. Once the drive method has been determined, you will probably want to provide a small interface board for the host end to keep things drop-dead simple for the user.
-Phil
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Jon McPhalen
Hollywood, CA
To expand - I don't think you can do much better than serial protocols. 8 bits of data plus a start and stop bit = 10 bits. A 1.5-bit delay between each byte. I suspect the people who worked out serial protocols many years ago thought about this a lot. I can't see any way to improve on this if the data line is digital. Of course, if the data line is analog, that is a different story.
Most micros now have serial built in. Arduino, Picaxe etc. I guess if you felt really keen you could send the baud rate in old-school morse code at the beginning of a packet. Then even humans could understand it!
The 'universal' packet might start off with a series of 0x55 bytes and you can sync to that. Though I can see problems with some micros that don't let you get inside the serial code - e.g. Picaxe.
I think fax machines send some bytes at the beginning to negotiate the fastest link possible over the phone line. Something similar to this? Send a series of packets starting at 110 baud and work up to 115k or higher. The Rx expects this so it increments its counter each time. See how high you can go till it fails. Then back off a couple of steps and send at that rate.
Or you could mindlessly send every packet at all the baud rates. One is going to get through and be the correct rate. Is 'time' precious in the network?
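A minimal autobaud sketch for the 0x55 idea above: while the sender streams 0x55 bytes (which appear on the line as alternating bits, each exactly one bit period wide), the receiver measures the shortest low pulse to recover the bit time. pulse_in_low() is a placeholder for whatever pulse-width measurement the receiving uC offers:

extern unsigned long pulse_in_low(void);   /* width of the next low pulse, in timer ticks */

unsigned long detect_bit_time(int samples)
{
    unsigned long best = ~0UL;
    for (int i = 0; i < samples; i++) {
        unsigned long w = pulse_in_low();
        if (w < best)
            best = w;                      /* shortest pulse seen = one bit period */
    }
    return best;                           /* ticks per bit; baud = ticks_per_second / best */
}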
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
www.smarthome.viviti.com/propeller
Morse Code.
The 'driver' is dead simple.
The 'receiver' or monitor is already built into your head
if you are 'debugging' locally or a 'decoder' isn't THAT hard.
5 bits long, baud rate 1Hz to 100KHz (guess on automated)
The 'decoder' would be an interesting project for you to develop.
Can be audio, light, direct to skin.
Quite a Universal protocol, with TONS of source material.
There are also early NRZ tape protocols that were pretty
dead simple, TSC comes to mind, they are faster than Morse,
but amount to about six lines of assembly code for the driver.
Receivers take a bit more effort, as the timing must be derived
from the bit stream.
I've been using this for years, bringing up uC's of all flavors.
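For what it's worth, a toy Morse 'driver' in C really is tiny. In this sketch each symbol is a string of '.' and '-' sent as short/long pulses with the usual 1:3 timing; set_pin() and delay_ms() are placeholders for the target uC, and the speed is arbitrary:

#define DOT_MS 50                       /* one Morse unit; speed is arbitrary */

extern void set_pin(int level);
extern void delay_ms(unsigned n);

void send_morse(const char *pattern)    /* e.g. send_morse("...-") for 'V' */
{
    for (; *pattern; pattern++) {
        set_pin(1);
        delay_ms(*pattern == '-' ? 3 * DOT_MS : DOT_MS);
        set_pin(0);
        delay_ms(DOT_MS);               /* one-unit gap between elements */
    }
    delay_ms(2 * DOT_MS);               /* pad out to a 3-unit gap between characters */
}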
jr
Tip #8 offers a way to provide power, as well as serial. The slave ends up with a 0.7 V drop in supply.
Morse Code? Everyone had to know that for the rank of First Class in the Boy Scouts. I started early with ham radio, but never really found it that interesting to listen to CQ, CQ. And none of the neighbors wanted me to have a transmitter that interfered with their TV watching.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
Ain't gadgetry a wonderful thing?
aka G. Herzog [ 黃鶴 ] in Taiwan
Post Edited (Loopy Byteloose) : 7/26/2010 2:58:07 PM GMT
...You may be right about SPI
Now that would be unusual, I see no reason why it would not work though.
Perhaps I could add this as an extra protocol.
I actually could do this, and no, this does not require lightning speed so there is time.
I'd say not more than 20-30 mA, distance not over 10 ft.
Good ideas from you, as always.
Very good idea!
I'm thinking all of this over very carefully; I don't want to make a mistake with this.
I may add a way to update the protocols in case I think of something better after I
let this out into the wild...it should be straightforward to alter the protocols by updating
the EEPROM data. Especially so since I have already added the ability to upload macros to
the EEPROM.
▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔▔
justasm.blogspot.com/