USB seems to be PITA - Page 2 — Parallax Forums

USB seems to be PITA


Comments

  • Heater. Posts: 21,230
    Yes, except the FT23x is a really bad thing to test against.

    As far as I can tell it needs some funky, non-standards-compliant (or at least not standardized) driver to make it work. I.e., it does not work on Windows without installing the FTDI driver first.

    FTDI are not above tweaking their driver to make your device and/or system not work when it discovers you are not a real FTDI device. See the FTDI scandals:

    Current scandal:
    http://www.eevblog.com/forum/microcontrollers/ftdi-gate-2-0/

    Previous scandal:
    http://www.eevblog.com/forum/reviews/ftdi-driver-kills-fake-ftdi-ft232/

    Basically the last thing you want to do is make your device FTDI compatible.

    But anyway, I thought all that stuff was a layer above the basic signalling/CRC etc. required to get USB working. Surely these higher layers are to be done in software?



  • Peter Jakacki Posts: 10,193
    edited 2016-03-17 21:51
    I believe that silicon trumps planned features. The P2 just had to be a P1 with more memory and more I/O, and yet it is already way, way more than that, and many, many years later it still isn't enough, it seems. I'd rather have silicon, any silicon. SERIOUSLY
  • But could it be that years of wrangling USB made Fred Dart make the decisions he does?

    Anyway, the thing about starting with the FT232 is that everyone here has one and can see the USB bits on a scope. Also the data stream is 'lean': you can send a single async character across and see how it's represented, etc.

    Not that we need to, now. Let's test and fix up the issues with the v7y release and head towards silicon.
  • Heater. Posts: 21,230
    Peter,
    I'd rather have silicon, any silicon. SERIOUSLY
    Totally yep.

    Been waiting for a Prop with more than 32 I/O pins and a bit more RAM now for ten years!

    Ten frikken years!




  • cgracey Posts: 14,133
    Heater. wrote: »
    Peter,
    I'd rather have silicon, any silicon. SERIOUSLY
    Totally yep.

    Been waiting for a Prop with more than 32 I/O pins and a bit more RAM now for ten years!

    Ten frikken years!





    That's seventy dog years, which is even worse.
  • Rayman Posts: 13,906
    edited 2016-03-17 22:49
    A one-month delay in a 10-year cycle to add USB seems worth it to me.
    I'm assuming Chip can get this done in a month.

    With USB, P2 becomes a real System On a Chip.
    This will save people a lot of cost and time...

    Also, I'd point out that USB is not a new P2 feature addition. It was meant to be there from the beginning...

    That all said, maybe at 200 MHz we could bit-bang this well enough to get keyboard and mouse HID working...
  • jmg Posts: 15,148
    Heater. wrote: »
    Yes, except the FT23x is a really bad thing to test against.
    ..
    But anyway, I thought all that stuff was a layer above the basic signalling/CRC etc. required to get USB working. Surely these higher layers are to be done in software?

    Yes, the actual device used to get an active USB bus does not matter; I simply chose the FT23x because it is already there on the FPGA board.

    The point you seem to have missed in the FTDI reflex is that it will be easier to test by clipping onto a working USB link than to bring up a P2 USB link from scratch.
    You then have known traffic and outcomes to work with.
    Even a connected but 'doing nothing at all' USB device has signals to test.
  • cgracey wrote: »
    Heater. wrote: »
    Peter,
    I'd rather have silicon, any silicon. SERIOUSLY
    Totally yep.

    Been waiting for a Prop with more than 32 I/O pins and a bit more RAM now for ten years!

    Ten frikken years!





    That's seventy dog years, which is even worse.

    And, for you, it's been 20 bear years! :lol:
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2016-03-18 02:58
    To me, the tragedy of waiting is the slow but inexorable draining of interest. Gratification delayed is gratification denied. I suspect that the few forumistas who remain in rapt suspense for the P2 are but a fraction of those who might have been enthusiasts if things had come together more quickly. I have to admit, I'm no longer one of them. Getting things done on time is not just a matter of dollars and cents, it's a matter of hearts and minds. And if you lose the latter, you will lose the former. Maybe, if the P2 ever comes to fruition, I'll give it some attention. But it's a total waste of my time at this juncture to weigh in on technical matters relating to its development -- or even to care about them. I can't. It just needs to get wrapped up: warts, missing features, and all, before any more potential enthusiasts abandon ship like I have.

    -Phil
  • Roy Eltham
    I don't think anyone here really understands how much is involved in having a full working USB setup. It's not just some simple protocol on top of special pin modes.
    It's going to take months to properly build and test a USB stack that's fully compliant. I don't think they can afford to wait that long before sending off to have the silicon made.

    I really think we should just back away from USB for this rev, and spend the time to do it right in the next one. Rushing in something we think might work that we can't possibly test in time seems bad.
  • jmg Posts: 15,148
    Roy Eltham wrote: »
    I don't think anyone here really understands how much is involved in having a full working USB setup. It's not just some simple protocol on top of special pin modes.
    It's going to take months to properly build and test a USB stack that's fully compliant. I don't think they can afford to wait that long before sending off to have the silicon made.

    I really think we should just back away from USB for this rev, and spend the time to do it right in the next one. Rushing in something we think might work that we can't possibly test in time seems bad.

    That argument would stand up if P2 were a ROM part, but it is not.
    (No one has asked for ROM-based USB boot, for example.)

    The bits that are going into silicon are not the software stack you mention, but only the NRZI and bit stuff/destuff.
    Those do not take 'months to properly build and test'.
    You can either decode a USB stream, with correct CRC, or you cannot.
    Likewise on Transmit.
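    For anyone wanting to sanity-check that against a captured trace before silicon, those pieces are small enough to model in a few lines of C. The sketch below is purely illustrative (the names and framing assumptions are mine, not the P2 logic): NRZI decode, bit de-stuffing, and the USB CRC5/CRC16 registers.

        /* Illustrative sketch only, not the P2 logic: the low-level pieces
         * listed above, modelled in plain C for checking captured traffic. */
        #include <stdint.h>
        #include <stdio.h>

        /* NRZI: a '1' is "no transition" on the line, a '0' is "a transition". */
        static int nrzi_decode(int prev_level, int level)
        {
            return (level == prev_level) ? 1 : 0;
        }

        /* Bit de-stuffing: after six consecutive 1s the sender inserts a 0,
         * which the receiver drops. Returns 1 if this bit is the stuffed bit
         * (a real receiver would flag an error if it were a 1). */
        static int destuffed(int bit, int *ones_run)
        {
            if (*ones_run == 6) {            /* this should be the stuffed 0 */
                *ones_run = 0;
                return 1;
            }
            *ones_run = bit ? *ones_run + 1 : 0;
            return 0;
        }

        /* USB CRC5 (poly x^5 + x^2 + 1) over the 11 token bits, LSB first.
         * The bus carries the complement of this register; running the
         * received CRC bits through the same register leaves the residual
         * 0b01100 for a good token (USB 2.0 spec, section 8.3.5). */
        static uint8_t crc5(uint16_t bits, int nbits)
        {
            uint8_t crc = 0x1F;
            for (int i = 0; i < nbits; i++) {
                int fb = ((bits >> i) & 1) ^ ((crc >> 4) & 1);
                crc = (uint8_t)((crc << 1) & 0x1F);
                if (fb)
                    crc ^= 0x05;
            }
            return crc;
        }

        /* USB CRC16 (poly x^16 + x^15 + x^2 + 1) over the data payload,
         * each byte LSB first; the residual for a good packet is 0x800D. */
        static uint16_t crc16(const uint8_t *data, int len)
        {
            uint16_t crc = 0xFFFF;
            for (int i = 0; i < len; i++) {
                for (int j = 0; j < 8; j++) {
                    int fb = ((data[i] >> j) & 1) ^ ((crc >> 15) & 1);
                    crc = (uint16_t)(crc << 1);
                    if (fb)
                        crc ^= 0x8005;
                }
            }
            return crc;
        }

        int main(void)
        {
            /* CRC demo with made-up example values: addr 0x15, endp 0xE */
            uint16_t token = (uint16_t)(0x15 | (0xE << 7));
            uint8_t payload[4] = { 0x00, 0x01, 0x02, 0x03 };
            printf("CRC5  register: 0x%02X\n", crc5(token, 11));
            printf("CRC16 register: 0x%04X\n", crc16(payload, 4));

            /* de-stuff demo: 8 raw bits with one stuffed 0 yield 7 data bits */
            int raw[8] = { 1, 1, 1, 1, 1, 1, 0, 1 };   /* 0 after six 1s is stuffed */
            int ones_run = 0, kept = 0, prev = 1;
            for (int i = 0; i < 8; i++) {
                int level = raw[i] ? prev : !prev;      /* NRZI-encode for the demo */
                int bit = nrzi_decode(prev, level);
                prev = level;
                if (!destuffed(bit, &ones_run))
                    kept++;
            }
            printf("kept %d of 8 raw bits after de-stuffing\n", kept);
            return 0;
        }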

  • Ten years; what's another couple of years? I'm just not sure what user base will be left by then, considering how many users have ceased posting or caring.

  • Hey, I'm good either way. I think this attempt is worth it. Should it turn out too complex, etc... then it's not worth it.

    I'm not asking for the feature. Just to be clear.

    What we have done is excellent.
  • whicker
    Chip: It would be wise to finish the current low-level USB-capable differential signaling, and then move on to trying to stuff this logic sausage into its casing.

    A better attitude would also help. I disagree: USB didn't get to where it is by being complete garbage and a conspiracy; there has to be an underlying elegance to find. You're all probably just not used to it, and it feels foreign.

    Since this is the first stab at it, it's okay for the implementation to be somewhat ugly. Okay, I get it: you and a lot of people here seem to love RS232-style simplicity. But think of why and how something as overly simple as RS232 fails in the real world: it has no consistent and uniform way to report what kind of device is at the other end; it will accept noise or a loose connector as data, since it has no inherent error checking; it isn't particularly fast in real-world situations (that you can get megabits per second on your desk in a carefully controlled environment means little); it really doesn't like floating or unplugged situations; and, most importantly, it can't even supply electrical power(!!!) to the device on the other end.
  • cgracey Posts: 14,133
    whicker wrote: »
    Chip: It would be wise to finish the current low-level USB-capable differential signaling, and then move on to trying to stuff this logic sausage into its casing.

    A better attitude would also help. I disagree: USB didn't get to where it is by being complete garbage and a conspiracy; there has to be an underlying elegance to find. You're all probably just not used to it, and it feels foreign.

    Since this is the first stab at it, it's okay for the implementation to be somewhat ugly. Okay, I get it: you and a lot of people here seem to love RS232-style simplicity. But think of why and how something as overly simple as RS232 fails in the real world: it has no consistent and uniform way to report what kind of device is at the other end; it will accept noise or a loose connector as data, since it has no inherent error checking; it isn't particularly fast in real-world situations (that you can get megabits per second on your desk in a carefully controlled environment means little); it really doesn't like floating or unplugged situations; and, most importantly, it can't even supply electrical power(!!!) to the device on the other end.

    I hear you, especially about the attitude. A bad attitude can make something that is difficult nearly impossible. That's been my problem.

    I finally realized today that, to be able to design this, I have to separate the transmitter completely from the receiver and run two separate state machines that have only the NCO baud generator in common. Supposing I needed a single bit counter and a single ones counter for bit stuffing was making it all very hard to think about.

    This means that whatever you send also gets received, but that is not a problem, as there's only single-level byte buffering. It actually has some benefits, too: when you stop loading bytes for it to send and it automatically does an EOP (end of packet), you will be alerted when the EOP is sensed and then know that the transmission has completed. I hope to have the transmitter working tonight. The receiver I've already designed once, so that will be no problem. Not realizing earlier that these things needed separating was the big hang-up.
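    For what it's worth, a rough software analogue of that split (the names, widths, and the simplified byte-plus-EOP framing here are assumptions for illustration, not the actual design): one NCO phase accumulator produces the baud ticks, and the transmit and receive machines advance on those ticks while sharing no other state, so the receiver naturally sees its own transmission and can flag the EOP back to the sender.

        /* Structural sketch only (not the P2 Verilog): one NCO baud generator
         * shared by otherwise independent TX and RX machines, with TX traffic
         * looped back into RX and an EOP flag marking transmit completion. */
        #include <stdint.h>
        #include <stdio.h>

        typedef struct {
            uint32_t phase, step;             /* step = baud * 2^32 / sysclk */
        } nco_t;

        static int nco_tick(nco_t *n)         /* carry out = one bit period */
        {
            uint32_t prev = n->phase;
            n->phase += n->step;
            return n->phase < prev;
        }

        /* transmitter: its only link to the receiver is the shared NCO */
        typedef struct { int busy, bits; uint16_t shifter; } tx_t;

        static void tx_load(tx_t *t, uint8_t byte)
        {
            t->shifter = byte;
            t->bits = 10;                     /* 8 data bits + 2 low "EOP" bits */
            t->busy = 1;
        }

        static int tx_step(tx_t *t)           /* returns the line level */
        {
            if (!t->busy)
                return 1;                     /* idle line */
            int bit = (t->bits > 2) ? (t->shifter & 1) : 0;
            t->shifter >>= 1;
            if (--t->bits == 0)
                t->busy = 0;
            return bit;
        }

        /* receiver: samples the same line, so it also sees our own TX */
        typedef struct { int bits, low_run, got_eop; uint8_t byte; } rx_t;

        static void rx_step(rx_t *r, int line)
        {
            if (r->bits < 8) {                                     /* data bits first */
                r->byte = (uint8_t)((r->byte >> 1) | (line << 7)); /* LSB-first shift */
                r->bits++;
                return;
            }
            r->low_run = line ? 0 : r->low_run + 1;
            if (r->low_run == 2)
                r->got_eop = 1;                                    /* EOP sensed */
        }

        int main(void)
        {
            nco_t nco = { 0, 0x40000000u };   /* one baud tick every 4 clocks */
            tx_t tx = { 0, 0, 0 };
            rx_t rx = { 0, 0, 0, 0 };

            tx_load(&tx, 0xA5);
            for (int clk = 0; clk < 200 && !rx.got_eop; clk++) {
                if (!nco_tick(&nco))
                    continue;                 /* wait for the shared baud tick */
                int line = tx_step(&tx);      /* TX drives the bus...          */
                rx_step(&rx, line);           /* ...and RX sees the same bus   */
            }
            printf("received 0x%02X, EOP seen: %d\n", rx.byte, rx.got_eop);
            return 0;
        }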
  • whicker wrote: »
    Chip: It would be wise to finish the current low-level USB-capable differential signaling, and then move on to trying to stuff this logic sausage into its casing.

    A better attitude would also help. I disagree: USB didn't get to where it is by being complete garbage and a conspiracy; there has to be an underlying elegance to find. You're all probably just not used to it, and it feels foreign.

    Since this is the first stab at it, it's okay for the implementation to be somewhat ugly. Okay, I get it: you and a lot of people here seem to love RS232-style simplicity. But think of why and how something as overly simple as RS232 fails in the real world: it has no consistent and uniform way to report what kind of device is at the other end; it will accept noise or a loose connector as data, since it has no inherent error checking; it isn't particularly fast in real-world situations (that you can get megabits per second on your desk in a carefully controlled environment means little); it really doesn't like floating or unplugged situations; and, most importantly, it can't even supply electrical power(!!!) to the device on the other end.

    There is nothing elegant about USB, and don't try to compare this to "RS232", as that is a voltage-signaling standard that has nothing to do with formats or protocols. It is perfectly possible to have a balanced half-duplex bus like USB and not be tied into the multi-corporate/committee conspiracy way of thinking. I use balanced half-duplex buses all the time without the silicon and software overhead of USB, easily at multi-megabit rates, using common micros and with protocols that allow "PnP".

    USB "support" for P2 is fine if it can just be wrapped up and not dragged out with "just a bit more" and the endless testing that would ensue. P2 in silicon will be an old, genetically enhanced cat, but still an old cat, as it never got to be "born" as a kitten that we could play with as it grew up.

  • Dave Hein wrote: »
    Let's give Chip a little more time to see if he can pull it off instead of posting discouraging comments.

    Discouraging to whom? Surely not Chip.

  • Heater. wrote: »
    Yes, except the FT23x is a really bad thing to test against.
    FTDI are not above tweaking their driver to make your device and/or system not work when it discovers you are not a real FTDI device. See the FTDI scandals:

    A remote coworker bought some counterfeit USB-to-TTL async serial converters and they didn't work properly under Linux. If you plugged one cable into a computer it seemed to work, but additional cables would cause failures. I believe this was because every chip returned the same ID (e.g. /dev/serial/by-id/ would only have a single entry). If the Linux driver had complained that these were counterfeit chips it would have saved some time, or if he had plugged them into a Windows machine and detected that they were counterfeit it would have saved time. I was under the impression that he had purchased FTDI TTL-232R-3V3-2MM cables from Mouser, but this was not the case. I eventually got one and cracked it open, and it was using the 1412-G date-code parts, which you can find referenced on the internet. But I would agree - don't pretend that you are an FTDI device if you are not.
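    For anyone hitting the same symptom, one quick Linux check is to count the entries under /dev/serial/by-id/ with several adapters plugged in; clones that all report the same ID collapse to a single entry. A small illustrative check, nothing more:

        /* List /dev/serial/by-id/: with several adapters attached there
         * should be several entries; identical clone IDs collapse to one.
         * Linux/udev only, illustrative sketch. */
        #include <dirent.h>
        #include <stdio.h>

        int main(void)
        {
            DIR *d = opendir("/dev/serial/by-id");
            if (!d) {
                perror("/dev/serial/by-id");
                return 1;
            }

            int count = 0;
            struct dirent *e;
            while ((e = readdir(d)) != NULL) {
                if (e->d_name[0] == '.')
                    continue;                 /* skip . and .. */
                printf("%s\n", e->d_name);
                count++;
            }
            closedir(d);
            printf("%d serial-by-id entr%s\n", count, count == 1 ? "y" : "ies");
            return 0;
        }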
  • KeithE Posts: 957
    edited 2016-03-18 16:35
    jmg wrote:

    You can either decode a USB stream, with correct CRC, or you cannot.
    Likewise on Transmit.

    One question is: how do you effectively test this? Chip doesn't believe in simulation, so as far as I know he can't have someone sell him or write him a test bench to check this. Therefore it has to be done on an FPGA in someone's lab, correct? Is it easy to verify this in isolation under such conditions? I haven't looked at USB at such a low level since the dotcom days.
  • Hi everyone,

    As the point seems well made, and Chip seems to have responded with vigorous development of said feature, I hope no one minds us closing this thread so as not to distract from said development.

    Thank you all.
This discussion has been closed.