Terminal Speed - a plea
g3cwi
Posts: 262
Dear all
I really wish that developers would standardise on a particular serial terminal speed. I have been downloading a few things from the OBEX this afternoon and every one was set to a different speed (for no obvious reason). On the plus side, they did all use the same pins.
It would be great to have some sort of agreed standard.
Cheers
Richard
Comments
Now it's easy.
Anyway, back to today...
115,200 is quite common but relies upon short cable lengths. Many RS232 drivers will not perform at these speeds anyway, so this is usually reserved for short TTL communications.
9,600 and 19,200 are possibly the next most common, as they will usually work over shortish distances.
However, it would be better if BAUD were declared as a constant in objects, making it easier to change than when it is buried in the start call.
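Something like this minimal sketch (the pin numbers here are just the usual programming port, and FullDuplexSerial is the familiar OBEX object - adjust to suit your own setup):

CON
  RX_PIN = 31                 ' serial receive pin (programming port here)
  TX_PIN = 30                 ' serial transmit pin
  BAUD   = 115_200            ' change the speed in one place

OBJ
  serial : "FullDuplexSerial"

PUB main
  serial.start(RX_PIN, TX_PIN, 0, BAUD)   ' mode 0 = normal, non-inverted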
BTW, just for the uninformed, baud is a bit rate. I often see "baud rate" used - this is a no-no, as rate is already implied in baud.
Heck, I cannot get anyone to agree on naming the Prop pins so we can have a separate hardware definition file. Even I don't always use the same pin names.
If we want to be rigorous, we should remember that baud refers to symbols per second, not bits per second. In a binary transmission medium these are the same, but otherwise they are not: with four signal levels, for example, each symbol carries log2(4) = 2 bits, so 1,200 baud would give 2,400 bits per second.
Not to be too critical but:
Nit pick mode ON.
A baud is technically "Symbol units per second" or sometimes "Information units per second".
http://en.wikipedia.org/wiki/Baud
Yes, many times this is the same as "Bits per second" where the symbol contains just 1 bit.
However, this is not always the case.
As an example, if I remember correctly, a 56Kb/S dial-up modem (V.90) operates at 8,000 baud.
In that case each symbol can represent up to 7 bits, or 128 states.
And there were some that crammed even more bits into each symbol.
Anyway, for most of us the two are the same.
However, to be correct, one would generally not want to standardize on the term "baud" but instead use "bits per second" (b/S).
Nit pick mode OFF.
Duane J
You forgot a few of the other fun signals like DTR, DCD, and DSR, along with the equipment/connector-type designations DTE and DCE, which made connecting two pieces of equipment such a hair-pulling, head-scratching joy. SO MUCH EASIER NOW.
Over long lines I run RS485 or RS422, and over short lines I run "TTL", as it is usually called. The only time I use RS232 drivers is when I have to in a commercial product for standards compliance - not that it needs these drivers, as they don't make the signal better, just worse IMO. The loaded output level from a modern driver is more like +/-6V, along with ripple from its internal up-converter, slew-rate-limited edges, and a high output impedance of over 300 ohms. How far is RS232 usually run? About a meter or two, if that! So why bother? If I need to interface to RS232 in non-commercial or non-compliant products, I just run straight from 3.3V or even 5V levels and use a 100K current-limit resistor on the inputs.
Considering that the majority of serial interfaces are not "RS232", let's open up the "default" speed to at least 115.2k baud. Trouble is, everyone uses "FullDuplexSerial" from the OBEX, and last time I checked that object needed a bit of work, especially the bit timing at high speeds. Also, I tend to define the baud rate as a constant, which is then used in the start function (wherever that may be in the code).
Spin can't move bytes around fast enough to need more than about 115,200 bps (I think the limit is just barely above that). Yes, I read that you're using Forth (I hope you're feeling better), and I don't think you were serious about using 230.4k. I only recently learned about this limit in Spin, and I think it's worth pointing out again.
I personally use 57,600 since some of the four port objects have a hard time at higher rates when all four lines are used.
The SimpleSerial object can't go much faster than 9600 baud.
I'm just glad the Prop doesn't have a limited number of possible rates like some uCs do. I've interfaced the Prop with a Spektrum tx unit. It uses a non-standard rate and no stop bit, and the Prop didn't have any trouble with it. Some of the forum posts about this (in German) had complaints about their favourite uC not being able to use the necessary rate.
I cannot confirm the original definition of baud as being bits per second or symbols per second. Just because the wiki says so does not make it correct. Baud was defined well before anyone knew that more than one bit could be carried per symbol. AFAIK baud was used for TTY on the ASR33s, which were in fact not RS232 but current loop (20mA IIRC). In fact, IIRC, it was first applied to the teletypewriter (for telegrams), and this used, again IIRC, a 5-bit Baudot code.
My work/study began in 1970 with OTC (the overseas arm of what is our Telecom today). They had lots of teletypes, and paper tape punches and readers were used as repeaters to prevent the signals from being distorted - yes, a punch followed by a reader, with a loop bin for the paper tape to be held in! Lots of free confetti!!!
In the 70's, mainframes used synchronous RS232 to communicate, sometimes over 50m, although IIRC RS232 was only supposed to work to 15m. The clocking was provided by the DCE on pins 15 & 17 of the DB25. Mainframes were DTE and modems (leased lines - no dialup back then) were DCE. In order for mainframes to communicate directly, some mainframes output a clock on pin 24. Then, with a crossover cable, the clock could be provided by one mainframe to both mainframes on pins 15 & 17 of each.
Of course, some mainframes used EBCDIC and some used ASCII. A mate and I made quite a bit of $ building converters that were used to convert ASCII to/from EBCDIC, and we followed that with a card for the Apple //e to communicate with mainframes - we sold this to Apple USA.
Modems started with 300 baud acoustic couplers in the early 80's. Then, in Oz and Europe, came 1200 baud with a 75 baud back channel. I cannot quite recall where multiple bits per symbol first appeared - whether it was at 1200 FDX or 9600 FDX.
Originally, RS232 defined a male DB25 as DTE (Data Terminal Equipment) and a female DB25 as DCE (Data Communication Equipment, such as a modem). Apple and the manufacturers of the glass teletypes started the confusion over the DB25 by using a female connector for the DTE. This was followed by the IBM PC using the female DB25 as a (Centronics) printer port, which had nothing to do with RS232 at all.
[/history]
The Go/Forth CSP channel experiments have also been running at 230400 between the PC and the Prop.
Or did you just mean we weren't serious because we're running Forth?
Feeling much better, and yes, 230400 is correct. One cog and two pins allow 32 simultaneous channels and about 2000 characters per second. This is just a temporary setup, doing synchronous transfer over an asynchronous serial port. Once we move to Ethernet, the synchronous transfer should go much faster.
http://code.google.com/p/propforth/wiki/PF521BETAthroughput
PM me if you want to try the beta, or just wait till the regular public release in a couple weeks when we have the test automation in place.
Back to the original question:
Hi Richard. The "standard" for serial is "as fast as I need to get this done". While some device can only go so fast, most of the time we want to go as fast as possible. To eliminate data transfer bottlenecks, we generally set it to "maximum reliable" to start and tweak it from there. It is weird to get used to at first, but you may appreciate the flexibility when faced with time critical functions on varying peripherals.
In my case the new "fast as possible" for asynchronous serial over two wires is 230400 baud which we just nailed down a couple weeks ago. We are getting a max of about 2000 characters per second for the physical connection, and in the test the physical connection is carrying up to 32 separate channels at the same time. The ballpark looks about the equivalent of 57600 per channel is what we can expect if the PC is not overloaded.
The tests are in preparation of synchronous serial over ethernet which will end up being substantially faster, using 100Mbit ethernet we should get faster rates and /or more channels.
As a note, lots of stuff these days starts at 9600 default, and you can adjust from there. While it is faster than you can type (usually), often we just want to go faster and faster.
Above 9600, the sky is the limit. 115200 is a common next step, but above that, 3MBd and even 12MBd are common PC-link-capable async speeds, and the baud granularity is also much better than it used to be.
Most of the better link devices now have fractional baud settings, and will swallow a 32-bit baud parameter and give the nearest supported fraction.
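As a rough sketch of what such a device does internally (the 3MHz base clock and 1/8-step divisor below are loosely modelled on FTDI-style parts, not taken from any particular datasheet):

CON
  BASE = 3_000_000                          ' assumed base clock of the link device

PUB nearestbaud(want) : actual | div8
  ' Round the ideal divisor to the nearest 1/8 step, then
  ' report the baud rate the device would actually produce.
  div8   := (BASE * 8 + want / 2) / want    ' divisor in eighths, rounded
  actual := BASE * 8 / div8

Asking for 115_200 with this model returns 115_384 - within a fraction of a percent, which is the whole point of fractional baud generators.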
Expect a Prop II to be able to match almost anything a FT2232H can throw at it.
If you are crafting a serial module (TX only) that can optionally generate a CLOCK, with a data-length choice of 9, that module can be used to create a newly specified UFm I2C link, where the spec's maximum speed is 5MBd. (UFm uses CMOS pins, not open drain, and is unidirectional.)
If you want to replace/swallow an 8051 Mode 0 link, that sync mode (master only, half duplex, 2 pins) starts at 1MHz, and can get to 12MHz on common parts.
I'll second that. It all depends on cable length. Back in the olden days, when one had a computer with a serial port and a "serial terminal", I found that with a 3 metre cable 9600 baud was super reliable, but 19200 would occasionally lose characters. Changing to a shielded cable made it work better. My dad and I cabled the house with some serial cable, but from one end of the house to the other, with unshielded wires, we had to drop the baud rate further, to about 2400.
115200 is a fairly standard baud rate for the Propeller, as it is the download rate, so maybe that could be a "standard"? But it works partly because most people use the Prop Plug, so the data travels over USB most of the way (which is shielded) and the final serial run is maybe only an inch, from the FT232 chip to the Propeller chip.
If you were communicating between two Propeller boards from one side of the house to the other, you would have to drop the baud rate. So objects in the OBEX need the flexibility to run at different rates.
I wonder if it would be possible to work out the baud rate automatically? Grab some data and look at the waveform. Some bytes will contain runs of zeros or runs of ones, but the smallest pulse width seen is probably the bit time. Two devices communicating could auto-negotiate the highest rate possible, like fax machines do.
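A minimal Spin sketch of the idea (my own, not from any OBEX object): time a handful of low pulses on the receive pin and keep the narrowest as the candidate bit time. It assumes the far end is sending characters that contain isolated 0 bits, and Spin's loop overhead limits it to modest rates - in PASM the same approach works much faster.

PUB autobaud(pin) : bittime | t0, shortest
  shortest := posx
  repeat 16
    waitpeq(|< pin, |< pin, 0)   ' wait for the line to idle high
    waitpne(|< pin, |< pin, 0)   ' falling edge: a low pulse begins
    t0 := cnt
    waitpeq(|< pin, |< pin, 0)   ' rising edge: the pulse ends
    shortest <#= cnt - t0        ' keep the minimum width, in clocks
  bittime := shortest            ' baud = clkfreq / bittime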
That depends on the cable drivers & cable choice.
ProfiBUS, and some RS485/RS422 variants can go to 12MBd and above, which is well above what a Prop 1 can deliver.
one example :
http://www.exar.com/connectivity/transceiver/rs485/xr5488e
Of course, 'house wiring' that needed those speeds, would be a rare combination, but that could easily occur on a factory floor.
I found that some displays from 4D Systems use an autobaud feature. If I remember correctly, they use "U" as the code character because it is binary %01010101, or something like that.
Enjoy!
Mike
I use the propplug however I don't connect directly to my props.
I use a small adapter, 25mil square 4 pin header, and use simple 4 wire ribbon cable about 10 feet long.
My props are all made on plug boards. I always use 115.2Kb/S. Works just fine.
Since it's not shielded it is occasionally susceptible to static-discharge bit errors, but I can live with that as these are clearly prototypes. Worse in the winter, and never right now.
Duane J
I have an RTX2000 (http://en.wikipedia.org/wiki/RTX2010) that has autobaud. It's always a puzzle every couple of years, because I forget about the autobaud and have to work out how to get it going. We type in a character, "i" or "." (period), and it works out the bit timings and sets the proper rate, from 300 to 9600 I think. It's really simple and cool, but a pain in the butt because it's so smart and I'm used to "not so smart". I always thought this would be great to include in devices, with no default baud rate information at all, just to mess with people's heads!
My wife won't let me string stuff from one end of the house to another, but we do the serial tests across the lab with the shortest and longest cable available, just to see if there's a difference; there hasn't been, so far.
My 1-bit keyboard code uses the space character to time the keyboard baud rate, because I don't have access to the clock pin.
The biggest issue with autobauding is that the other end has to send characters for the rate to be determined. It is easy enough to do in software.
Sometimes, with autobaud on a received character, a simple assumption can be made: the shortest low pulse (with glitch rejection) is the bit time, as long as the other high and low pulses are multiples of it.
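A sketch of that check (a hypothetical helper of my own): reject anything under half a bit as a glitch, and accept a measured pulse only if its width is close to a whole multiple of the candidate bit time.

PUB validwidth(bittime, width) : ok | n
  if width < bittime / 2
    return false                              ' glitch - too narrow to be a bit
  n  := (width + bittime / 2) / bittime       ' nearest whole multiple
  ok := ||(width - n * bittime) < bittime / 4 ' within a quarter bit?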
The baud/baud rate/bits per second dilemma has been bugging me ever since I knew there were such things as serial links (circa 1974)
Admittedly a quick google around does not clarify things as there are many poor descriptions of the situation, some contradictory or even self contradictory.
Clearly the concept of "bits per second" did not really come to the fore before we had "bits", and that is a term coined by John Tukey in the mid-1940s and first seen in print in an article by Shannon (1948).
Not necessarily.
The first telegraph dates from 1839, by Cooke and Wheatstone. It could indicate one of 20 characters of the alphabet in each symbol period (not sure how they coped with the missing characters). Mind you, it did use 5 signal wires to do so.
http://www.makingthemodernworld.org.uk/icons_of_invention/technology/1820-1880/IC.017/
Later telegraphs used Morse code over a single wire.
In the absence of information theory it was natural to think of line speeds in words per minute or, later, in symbols per second - baud.
But then we have the original proposal for the definition of baud, which was a unit of telegraph signalling speed set at one pulse per second, named after Émile Baudot. It was proposed at the International Telegraph Conference in November 1926.
If a single character, e.g. "A" - regardless of how many bits are used for that character plus any start, stop, or parity bits - is sent once every month at a bit rate of 19,200 bits per second, then this is 19,200 baud, even if the throughput is one character per month.
I had to laugh at the notion of RS232 standards - the only standard thing about it is that the plastic of the 9-pin D-type will be a molten mess before the comms come good :-)
Actually, the DB25 was properly defined. The typical crossover cable was there to remove the DCE, hence to connect mainframe to mainframe. The standard cross, IIRC, was 2 to 3 (TXD to RXD) each way, 4+5 to 8 (RTS+CTS to DCD) at each end, and 6 to 20 (DSR to DTR) each way. If it was synchronous, then 24 (XCLK) on one end only went to 15+17 (TXC+RXC) on both ends. I carried these pre-made in a 12" length.
I guess to really answer your thread: we don't see the need for anything beyond having BAUD defined as a constant in the program. There seem to be 2 or 3 standard rates in use, in order of preference:
115,200
9,600
19,200
Use 115,200 where you can (for immediate local connections). At least 8,N,1 is mostly what is being used.