
Terminal Speed - a plea

g3cwi Posts: 262
edited 2012-06-04 20:19 in Propeller 1
Dear all

I really wish that developers would standardise on a particular serial terminal speed. I have been downloading a few things from the OBEX this afternoon and every one was set to a different speed (for no obvious reason). On the plus side, they did all use the same pins.

It would be great to have some sort of agreed standard.

Cheers

Richard

Comments

  • Heater. Posts: 21,230
    edited 2012-06-02 13:04
    Bah! These are serial links we are talking about. In the past we had to worry about baud rate, number of data bits, number of stop bits, flow control or not, RTS/CTS, XON, XOFF, etc., etc.
    Now it's easy.
  • JonnyMac Posts: 9,195
    edited 2012-06-02 13:06
    It would be easier to get everyone to agree on the best flavor of ice cream! That said, IMHO good coding practice means using a named constant for things like baud rate so that it's easily changed. The pins thing was a no-brainer, as the Propeller is programmed through a serial port on pins 31 (RX) and 30 (TX).
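    A minimal Spin sketch of that idea (the clock settings assume a 5 MHz crystal, and the pin and baud values are only placeholders):

      CON
        _clkmode = xtal1 + pll16x      ' assumes a 5 MHz crystal -> 80 MHz system clock
        _xinfreq = 5_000_000

        BAUD   = 115_200               ' terminal speed lives in one place
        RX_PIN = 31                    ' programming/terminal RX
        TX_PIN = 30                    ' programming/terminal TX

      OBJ
        term : "FullDuplexSerial"

      PUB Main
        term.start(RX_PIN, TX_PIN, %0000, BAUD)
        term.str(string("hello", 13))

    Change BAUD once and every call that uses it follows.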
  • Cluso99 Posts: 18,069
    edited 2012-06-02 14:49
    As Heater said, I am grateful for the standardisation of 8N1 (8 bits, no parity, 1 stop bit). On top of that there was flow control, because UARTs originally had no buffers (and before micros, the send and receive UARTs were in separate chips - 28 pins each). Then of course there was synchronous comms, which was another story again. And of course there was ASCII and EBCDIC. Sometimes these character sets were not even common across a supplier's minis/mainframes - I know because I designed and built converters for ICL, DEC, IBM and WANG.

    Anyway, back to today...
    115,200 is quite common but relies upon short cable lengths. Many RS232 drivers will not perform at these speeds anyway, so this is usually reserved for short TTL communications.
    9,600 and 19,200 are possibly the next most common as they usually will work over shortish distances.

    However, it would be better if BAUD were declared as a constant in objects, making it easier to change than when it is buried in the start call.
    BTW, just for the uninformed: baud is a bit rate. Often I see "baudrate" used - this is a no-no, as rate is implied in baud.

    Heck, I cannot get anyone to agree on naming the prop pins so we can have a separate hardware definition file. Even I don't always use the same pin names :(
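    For what it's worth, such a file needn't be much. A sketch of one (file name, pin names and values are purely illustrative):

      '' HardwareDefs.spin - one place for board pin names and the terminal speed
      CON
        TERM_RX   = 31
        TERM_TX   = 30
        TERM_BAUD = 115_200
        SD_DO     = 0                  ' example peripheral pins
        SD_CLK    = 1
        SD_DI     = 2
        SD_CS     = 3

      PUB Dummy                        ' Spin requires at least one PUB per object

    Any object can then declare OBJ hw : "HardwareDefs" and refer to hw#TERM_BAUD or hw#TERM_TX, so a board change means editing one file.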
  • Heater. Posts: 21,230
    edited 2012-06-02 15:00
    Not uninformed, just sloppy. :) Indeed, "rate" is implied by "baud".
    If we want to be rigorous we should remember that baud refers to symbols per second not bits per second. In a binary transmission medium these are the same but otherwise they are not.
  • Duane C. Johnson Posts: 955
    edited 2012-06-02 15:22
    Hi Cluso99:
    Not to be too critical but:

    Nit pick mode ON.

    A baud is technically "Symbol units per second" or sometimes "Information units per second".
    http://en.wikipedia.org/wiki/Baud
    Yes, many times this is the same as "Bits per second" where the symbol contains just 1 bit.
    However, this is not always the case.
    As an example, if I remember correctly, a 9.6Kb/S dial-up modem (V.32) operates at 2.4Kbaud.
    In this case each symbol carries 4 bits.
    And there were some that crammed even more bits into each symbol.
    Anyway, for most of us the two are the same.
    However, to be correct, one would generally not standardize on the term "baud" but instead use "bits per second".
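    Put as a formula (the V.32 numbers above are just one illustration):

      bit rate [bit/s] = baud [symbols/s] * bits per symbol
      e.g. 2,400 baud * 4 bits/symbol = 9,600 bit/s

    For plain two-level TTL serial there is 1 bit per symbol, so baud and bit/s coincide.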

    Nit pick mode OFF.

    Duane J
  • kwinn Posts: 8,697
    edited 2012-06-02 15:25
    Heater. wrote: »
    Bah! These are serial links we are talking about. In the past we had to worry about baud rate, number of data bits, number of stop bits, flow control or not, RTS/CTS, XON, XOFF, etc., etc.
    Now it's easy.

    You forgot a few of the other fun signals like DTR, DCD, and DSR, along with the equipment/connector type designations DTE and DCE that made connecting two pieces of equipment such a hair-pulling, head-scratching joy. SO MUCH EASIER NOW.
  • Peter Jakacki Posts: 10,193
    edited 2012-06-02 17:40
    I've been using 230.4k baud with PropForth over Bluetooth, I can send a whole text file to it without flow control and it compiles on the fly, why don't we standardize on that! :) There's a lot of old RS232/422/485 equipment out in industry running at the nice safe speed of 9600 baud, which might have been fine for the '80s, but we wouldn't want that, would we? The limiting factor with serial interfaces is usually the physical driver, which in many cases is the prehistoric RS232 "standard" (joke).

    Over long lines I run RS485 or RS422, and over short lines I run "TTL" as it is usually referred to. The only time I use RS232 drivers is when I have to in a commercial product for standards compliance - not that it needs these drivers, as they don't make the signal better, just worse IMO. The loaded level out of a modern driver is more like +/-6V, along with ripple from its internal up-converter, slew-rate-limited edges, and a high source resistance (ESR) of over 300 ohms. How far is RS232 usually run? About a meter or two, if that! So why bother. If I need to interface to RS232 in non-commercial or non-compliant products I just run straight from 3.3V or even 5V levels and use a 100K current-limit resistor on the inputs.

    Considering that the majority of serial interfaces are not "RS232", let's open up the "default" speed to at least 115.2k baud. Trouble is, everyone uses "FullDuplexSerial" from the OBEX, and last time I checked that object needed a bit of work, especially the bit timing at high speeds. I also tend to define the baud rate as a constant, and that constant is then used in the start function (wherever that may be in the code).
  • Duane Degn Posts: 10,588
    edited 2012-06-02 18:03
    I've been using 230.4k baud with PropForth over Bluetooth, I can send a whole text file to it without flow control and it compiles on the fly, why don't we standardize on that!

    Spin can't move bytes around fast enough to need more than about 115,200 bps (I think the limit is just barely above this). Yes, I read you were using Forth (I hope you're feeling better) and I don't think you were serious about using 230.4k. I just recently learned about this limit with Spin and I think it's worth pointing out again.

    I personally use 57,600 since some of the four port objects have a hard time at higher rates when all four lines are used.

    The SimpleSerial object can't go much faster than 9600 baud.

    I'm just glad the Prop doesn't have a limited number of possible rates like some uCs. I've interfaced the Prop with a Spektrum TX unit. It uses a non-standard rate and no stop bit; the Prop didn't have any trouble with this. Some of the forum posts about this (in German) had complaints about their favorite uC not being able to use the necessary rate.
  • Cluso99 Posts: 18,069
    edited 2012-06-02 18:30
    [history]
    I cannot confirm the original definition of baud as being bits per second or symbols per second. Just because the wiki says so does not make it correct. BAUD was defined well before anyone knew that more than one symbol could be sent in parallel. AFAIK baud was used for TTY on the ASR33s, which were in fact not RS232 but current loop (20mA IIRC). In fact IIRC it was applied to the teletypewriter (for telegrams), and this used, again IIRC, a 5-bit Baudot code.

    My work/study began in 1970 with OTC (the overseas section of our Telecom today). They had lots of teletypes, and paper tape punches and readers were used as repeaters to prevent the signals from being distorted - yes, a punch followed by a reader, with a loop bin for the paper tape to be held! Lots of free confetti!!! In the 70's mainframes used synchronous RS232 to communicate, sometimes over 50m, although IIRC RS232 was only supposed to work to 15m. The clocking was provided by the DCE on pins 15 & 17 of the DB25. Mainframes were DTE and modems (leased lines - no dialup back then) were DCE. In order for mainframes to communicate directly, some mainframes output a clock on pin 24. Then, with a crossover cable, clocks could be provided by one mainframe to both mainframes on pins 15 & 17 of each. Of course, some mainframes used EBCDIC and some used ASCII. A mate and I made quite a bit of $ building converters that were used to convert ASCII to/from EBCDIC, and then followed with a card in the Apple //e to communicate with mainframes - we sold this to Apple USA.

    Modems started with 300 baud acoustic couplers in the early 80's. Then in Oz and Europe came 1200 baud with a 75 baud back channel. I cannot quite recall where multiple bits per symbol first appeared - whether it was 1200 FDX or 9600 FDX.

    Originally, RS232 defined a male DB25 as a DTE (Data Terminal Equipment) and a female DB25 as a DCE (Data Communication Equipment, such as a modem). Apple and the manufacturers of the glass teletypes started the confusion over the DB25 by using a female as the DTE. This was followed by the IBM PC using a female DB25 as a (Centronics) printer port, which had nothing to do with RS232 at all.
    [/history]
  • mindrobots Posts: 6,506
    edited 2012-06-02 18:56
    Of course we're serious about 230400 in Forth. My pp-usb/tetraprop stack has been running 230400 between the 5 Propellers and out to the PC for several weeks now. It seems very stable and gulps down big chunks of source code without hiccups.

    The Go/Forth CSP channel experiments have also been running 230400 between the PC and the prop.

    Or did you just mean we weren't serious because we're running Forth? :lol:
  • prof_braino Posts: 4,313
    edited 2012-06-02 22:20
    Duane Degn wrote: »
    you were using Forth (I hope you're feeling better) and I don't think you were serious about using 230.4k.

    Feeling much better, and 230400 is correct. One cog and two pins allow 32 simultaneous channels and about 2000 characters per second. This is just a temporary setup, doing synchronous serial transfer over an asynchronous serial port. Once we move to Ethernet, the synchronous transfer should go much faster.

    http://code.google.com/p/propforth/wiki/PF521BETAthroughput

    PM me if you want to try the beta, or just wait till the regular public release in a couple weeks when we have the test automation in place.
  • prof_braino Posts: 4,313
    edited 2012-06-03 10:46
    g3cwi wrote: »
    I really wish that developers would standardise on a particular serial terminal speed. I have been downloading a few things from the OBEX this afternoon and every one was set to a different speed (for no obvious reason). On the plus side, they did all use the same pins. It would be great to have some sort of agreed standard.

    Back to the original question:

    Hi Richard. The "standard" for serial is "as fast as I need to get this done". While some devices can only go so fast, most of the time we want to go as fast as possible. To eliminate data transfer bottlenecks, we generally set it to "maximum reliable" to start and tweak it from there. It is weird to get used to at first, but you may appreciate the flexibility when faced with time-critical functions on varying peripherals.

    In my case the new "fast as possible" for asynchronous serial over two wires is 230400 baud, which we just nailed down a couple of weeks ago. We are getting a maximum of about 2000 characters per second for the physical connection, and in the test the physical connection is carrying up to 32 separate channels at the same time. In the ballpark, the equivalent of about 57600 per channel is what we can expect if the PC is not overloaded.

    The tests are in preparation for synchronous serial over Ethernet, which will end up being substantially faster; using 100Mbit Ethernet we should get higher rates and/or more channels.

    As a note, lots of stuff these days starts at 9600 default, and you can adjust from there. While it is faster than you can type (usually), often we just want to go faster and faster.
  • jmg Posts: 15,183
    edited 2012-06-03 16:21
    Certainly 9600 is a widespread common base, and can give an important 'establish a pulse' starting point, even if your system never intends to run at that.

    Above 9600, the sky is the limit. 115200 is a common next step, but above that 3MBd and even 12MBd are common PC-link-capable async speeds, and the baud granularity is also much better than it used to be.
    Most of the better link devices now have fractional baud settings, and will swallow a 32-bit baud parameter and give the nearest supported fraction.
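    The same granularity question shows up at the Prop end, where a software UART's bit time is an integer number of system clocks. A rough Spin sketch of the error that introduces (purely illustrative, assuming simple clkfreq/baud timing as FullDuplexSerial-style drivers use):

      PUB BaudError(requested) : permille | ticks, actual
        ticks    := clkfreq / requested                       ' system clocks per bit, rounded down
        actual   := clkfreq / ticks                           ' the baud that divisor really gives
        permille := (actual - requested) * 1_000 / requested  ' error in parts per thousand

    At 80 MHz the error is negligible at 115200 (694 clocks per bit), but a request for 3_000_000 comes back about 25 per mille (2.5% fast), which is where fractional dividers start to matter.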

    Expect a Prop II to be able to match almost anything an FT2232H can throw at it.

    If you are crafting a serial module (TX only) that can optionally generate a clock, with a data-length choice of 9 bits, that module can be used to create a newly specified UFm I2C link, where the spec's maximum speed is 5MBd. (UFm uses CMOS pins, not open drain, and is unidirectional.)

    If you want to replace/swallow an 8051 Mode 0 link, that sync mode (master only, half duplex, 2 pins) starts at 1MHz, and can get to 12MHz on common parts.
  • Dr_Acula Posts: 5,484
    edited 2012-06-03 16:42
    re prof_braino
    Hi Richard. The "standard" for serial is "as fast as I need to get this done".

    I'll second that. It all depends on cable length. Back in the olden days, when one had a computer with a serial port and a "serial terminal", I found that with a 3 metre cable 9600 baud was super reliable, but 19200 would occasionally lose characters. Changing the cable to a shielded one made it work better. My dad and I cabled the house with some serial cable, but from one end of the house to the other, with unshielded wires, we had to drop the baud rate further, to about 2400.

    115200 is a fairly standard baud rate for the Propeller, as this is the download rate, so maybe this could be a "standard"? But it works partly because most people use the Prop Plug, so the data goes via USB most of the way (which is shielded) and the final bit that is actually serial is maybe only an inch, from the FT232 chip to the Propeller chip.

    If you were communicating between two propeller boards from one side of the house to the other, you would have to drop the baud rate. So objects in the Obex need the flexibility to be able to run at different rates.

    I wonder if it would be possible to work out the baud rate automatically? Grab some data and look at the waveform. Some bytes will contain runs of zeros or ones, but if one took the smallest pulse width, that is probably one bit time. Two devices communicating could then auto-negotiate the highest rate possible, like fax machines do.
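    A rough Spin sketch of that idea (pin number and sample count are just placeholders, and Spin's interpreter overhead limits this to modest rates):

      CON
        RX_PIN = 31                                  ' assumed RX pin

      PUB DetectBaud : baud | t, shortest
        '' Time a number of low pulses on RX and treat the shortest as one bit.
        '' Ask the far end to send characters with isolated 0 bits,
        '' e.g. "U" ($55 = %0101_0101), while this runs.
        shortest := posx
        repeat 16
          waitpeq(|< RX_PIN, |< RX_PIN, 0)           ' wait for the line to idle high
          waitpeq(0, |< RX_PIN, 0)                   ' falling edge: a low pulse starts
          t := cnt
          waitpeq(|< RX_PIN, |< RX_PIN, 0)           ' rising edge: the low pulse ends
          t := cnt - t
          shortest <#= t                             ' keep the shortest pulse seen
        baud := clkfreq / shortest                   ' clocks per second / clocks per bit

    The negotiation part is harder, but measuring the rate of whatever arrives is straightforward.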
  • jmg Posts: 15,183
    edited 2012-06-03 16:50
    Dr_Acula wrote: »
    If you were communicating between two propeller boards from one side of the house to the other, you would have to drop the baud rate. So objects in the Obex need the flexibility to be able to run at different rates.

    That depends on the cable drivers & cable choice.

    ProfiBUS, and some RS485/RS422 variants can go to 12MBd and above, which is well above what a Prop 1 can deliver.

    one example :
    http://www.exar.com/connectivity/transceiver/rs485/xr5488e

    Of course, 'house wiring' that needed those speeds would be a rare combination, but that could easily occur on a factory floor.
  • msrobots Posts: 3,709
    edited 2012-06-03 17:01
    @DR_Acula,

    I found that some displays from 4D Systems used an autobaud feature. If I remember correctly, they use "U" as the sync character because it is binary 01010101 or so.

    Enjoy!

    Mike
  • Duane C. Johnson Posts: 955
    edited 2012-06-03 17:04
    Hi Dr_Acula;

    I use the Prop Plug; however, I don't connect directly to my Props.
    I use a small adapter, a 25-mil square 4-pin header, and a simple 4-wire ribbon cable about 10 feet long.
    My props are all made on plug boards. I always use 115.2Kb/S. Works just fine.

    Since it's not shielded it is occasionally susceptible to static-discharge bit errors, but I can live with
    that as they are clearly prototypes. Worse in the winter, and never right now.

    Duane J
  • prof_braino Posts: 4,313
    edited 2012-06-03 17:25
    msrobots wrote: »
    ... used an autobaud feature. If I remember correctly, they use "U" as the sync character because it is binary 01010101 or so.

    I have an RTX2000 (http://en.wikipedia.org/wiki/RTX2010) that has autobaud. It's always a puzzle every couple of years because I forget about the autobaud and have to work out how to get it going. We type in a character, i or . (period), and it works out the bit timing and sets the proper rate, from 300 to 9600 I think. It's really simple and cool, but a pain in the butt because it's so smart and I'm used to "not so smart". I always thought this would be great to include in devices, and not include any default baud rate information, just to mess with people's heads! :)

    My wife won't let me string stuff from one end of the house to another, but we do the serial tests across the lab with the shortest and longest cable available, just to see if there's a difference; there hasn't been, so far.
  • Cluso99 Posts: 18,069
    edited 2012-06-03 20:22
    The "AT" used in modems was able to work out the baud, number of bits, and parity from those 2 characters, in either case. The start bit is alone and is used to time the baud.
    My 1-bit keyboard code uses the space character on the keyboard to time the keyboard baud because I don't have access to the clock pin.
    The biggest issue for autobauding is that the other end has to send characters for it to be determined. It is easy enough to do in software.
  • Peter Jakacki Posts: 10,193
    edited 2012-06-04 01:23
    The problem with autobaud is that you can't transmit anything until the correct autobaud character has been received. The AT command itself is used as an autobaud sequence to sync from, etc. However, in my ARM Forth and others I used an ANSI enquiry sequence at various baud rates until the proper response was received, so this autobaud worked without having to type in characters, but it did require an ANSI-compliant terminal emulator such as TeraTerm. Just hook up the terminal at any baud rate, boot the Forth, and the welcome messages would magically appear.

    Sometimes, with autobaud on a received character, a simple assumption can be made: the shortest low pulse (with glitch rejection) is the bit time, as long as the other high and low pulses are multiples of it.
  • Heater. Posts: 21,230
    edited 2012-06-04 04:35
    Cluso,

    I cannot confirm the original definition of baud as being bits per second or symbols per second. Just because the wiki says so does not make it correct.


    The baud/baud rate/bits per second dilemma has been bugging me ever since I knew there were such things as serial links (circa 1974).


    Admittedly a quick Google around does not clarify things, as there are many poor descriptions of the situation, some contradictory or even self-contradictory.


    Clearly the concept of "bits per second" did not really come to the fore before we had "bits", a term coined by John Tukey in the mid-1940s and first seen in print in an article by Shannon (1948).

    BAUD was defined well before anyone knew that more than one symbol could be sent in parallel.


    Not necessarily.

    The first telegraph dates from 1839, by Cooke and Wheatstone. It could indicate one of 20 characters of the alphabet in each symbol period.
    (Not sure how they coped with the missing characters.) Mind you, it did use 5 signal wires to do so.
    http://www.makingthemodernworld.org.uk/icons_of_invention/technology/1820-1880/IC.017/


    Later telegraphs used Morse code over a single wire.


    In the absence of information theory it was natural to think of line speeds in words per minute or, later, symbols per second - baud.

    But then we have the original proposal for the definition of baud, which was originally a unit of telegraph signalling speed, set at one pulse per second. It was proposed at the November, 1926
  • Toby Seckshund Posts: 2,027
    edited 2012-06-04 12:53
    My simplistic (usual) thinking on this is :-

    If a single character, i.e. "A", regardless of how many bits are used for that character and added for start, stop or parity, is sent once every month at a bit rate of 19,200 bits per second, then this is 19,200 baud, even if the throughput is one character per month.
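    Put as numbers (assuming 8N1 framing): one frame is 10 bit times, so at 19,200 baud a character occupies 10 / 19,200 s, or about 0.52 ms, on the wire. The line could therefore carry up to 1,920 characters per second, but it is still a 19,200 baud link whether you use that capacity or send one character a month.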

    I had to laugh at the notion of RS232 standards - the only standard thing about it is that the plastic of the 9-pin D-type will be a molten mess before the comms come good :-)
  • kwinn Posts: 8,697
    edited 2012-06-04 17:08
    Toby, believe it or not, going to a 9 pin connector was a big improvement as far as standards went when compared to the 25 pin connector. This was mainly due to having far fewer pins, but having IBM set a de facto standard also helped a lot. I got so frustrated with the number of adapter cables I needed at one point that I made a 9-pin-to-9-pin cable and 9-pin-to-25-pin adapters to go from that cable to each specific piece of equipment I needed to connect to. That made for a lot of adapters, but they took up much less space than the equivalent number of cables.
  • Cluso99 Posts: 18,069
    edited 2012-06-04 18:41
    I have a set of 6" (150mm) colored wires terminated with m-m, m-f and f-f RS232 pins. This made working out the required cable easy. I now use these on my Prop because they will plug into PCB holes and the females will plug onto pin stakes - although a little sloppy, you can compress the females.
    Actually, the DB25 was properly defined. The typical crossover was to remove the DCE, and hence it was used to connect mainframe to mainframe. The standard cross IIRC was 2 to 3 (TXD to RXD) each way, 4+5 to 8 (RTS+CTS to DCD) each way, and 6 to 20 (DSR to DTR) each way. If it was synchronous, then 24 (XCLK) on one end only went to 15+17 (TXC+RXC) on both ends. I carried these premade in a 12" length.
  • Cluso99 Posts: 18,069
    edited 2012-06-04 20:19
    g3cwi: Richard, I am feeling rather guilty. We have hijacked your thread.

    I guess, to really answer your thread, we don't see the need beyond having the BAUD defined as a constant in the program. There seem to be 2 or 3 standard rates being used, in order of preference:

    115,200
    9,600
    19,200

    Use 115,200 where you can (for immediate local connections). At least 8,N,1 is mostly being used.