
Expanding My Horizon In Digital I/O - CNC/General Interest

idbruce Posts: 6,197
edited 2019-01-17 10:47 in General Discussion
I am unsure where to begin this discussion, so please forgive me if this post just sort of rambles from point to point.

I suppose I will start by giving you a brief background. In the late nineties, I was working for an older gentleman who liked to take gambles and invest his money, and he wanted to start a business with me. Of course this was during the dot-com bubble, so we decided to build a website. Since that time, programming has been an important part of my life; I have dabbled in some languages and become fairly proficient in others, including HTML, JavaScript, Perl, C++, C, MFC, BASIC, and Spin, with the last two introducing me to digital I/O. Digital I/O led me to stepper motors, and stepper motors led me to building machinery. I have always designed things, ever since I was a young man, but adding motors to my designs, well, that was something new and exciting.

Getting more directly to the point of this thread, digital I/O is very important to equipment operation, regardless of whether the design is embedded or computer driven. For many years, home-built CNCs relied on the parallel port of the common PC, but these days parallel ports are becoming obsolete. Additionally, they offered very limited I/O resources, which is the reason I now rely on the Propeller chip to run my machines. However, with my knowledge of C++, C, and MFC, I have always been interested in machine control through a PC, but then you are looking at software like Mach 3, etc.

In addition to digital I/O, timing is also very important to equipment operation. Considering the speed of PCs these days, I knew it had to be possible to get high-accuracy timing from a computer; I just did not know how to get that functionality, so I would occasionally research the subject, as well as how to get more I/O from a PC. I was already aware of the SetTimer function in Windows, but I do not believe it would be adequate to run machinery.

Since I am getting tired and you are most likely getting bored, I will sum it up......

I found this information today....

For high accuracy timing, the functions listed on this page should do the trick: https://docs.microsoft.com/en-us/windows-hardware/drivers/kernel/counters

EDIT: I am still searching for better timing functions.
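
For anyone wanting to try this from user mode rather than a kernel driver, the same counter is exposed through QueryPerformanceCounter. A minimal sketch (Win32 C++; note this only reads time stamps, it does not make anything happen on time):

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        LARGE_INTEGER freq, t0, t1;
        QueryPerformanceFrequency(&freq);   // counts per second, fixed at boot

        QueryPerformanceCounter(&t0);
        Sleep(10);                          // stand-in for the work being timed
        QueryPerformanceCounter(&t1);

        double us = (t1.QuadPart - t0.QuadPart) * 1000000.0 / freq.QuadPart;
        printf("elapsed: %.1f us\n", us);   // sub-microsecond resolution
        return 0;
    }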

For digital I/O, visit this page, but more specifically, go to heading 5. Bidirectional I/O circuit: https://contec.com/support/basic-knowledge/daq-control/digital-io/

I could be wrong, but I believe with this information, some awesome things can be done, like running GRBL on a PC instead of on an Arduino :)

Comments

  • The_Master Posts: 200
    edited 2019-01-17 11:53
    What's your point?
    Are you not already using a propeller for your CNC?
    And now you want to use PC directly so you can use C/C++? (huh?)
    But you can use C++ on a micro.
    But you don't want that because that means Arduino? (huh?)

    Are you just having problems doing basic firmware? Because your project only needs 1 MIPS and 512 bytes of code.

    I know you're using stepper motors. Just curious, are you also using aluminum extrusions for your frame? (or angle stock or whatever?)

    PC latency has gotten bad and will probably only get worse.
  • idbruce Posts: 6,197
    edited 2019-01-17 13:59
    What's your point?
    Getting more directly to the point of this thread, digital I/O is very important to equipment operation, regardless of whether the design is embedded or computer driven.
    Are you not already using a propeller for your CNC?

    Yes, I am using Propellers to control various pieces of equipment.
    And now you want to use PC directly so you can use C/C++? (huh?)
    But you can use C++ on a micro.
    But you don't want that because that means Arduino? (huh?)

    Are you just having problems doing basic firmware? Because your project only needs 1 MIPS and 512 bytes code.

    To me, using a PC for digital I/O would open up a whole new world of possibilities, without the limitations that are intrinsic to embedded designs. Simply another possibility or solution for future projects.
    Just curious, are you also using aluminum extrusions for your frame?

    I believe most aluminum is extruded. However, I use a lot of angle, bar stock, tubing, etc..., as well as backyard castings.
  • jmg Posts: 15,173
    idbruce wrote: »
    I found this information today....

    For high accuracy timing, the functions listed on this page should do the trick: https://docs.microsoft.com/en-us/windows-hardware/drivers/kernel/counters
    EDIT: I am still searching for better timing functions.

    Yes and no.
    That merely captures the time at that line of code; it does not guarantee any known real-time performance for your program.
    There have also been mutterings about some of these timers being (possibly) exploitable for timing attacks, so vendors have moved to deliberately cripple them and reduce their precision...
    The bigger and more complicated the operating system, the less likely you are to get real-time performance.

    idbruce wrote: »
    For digital I/O, visit this page, but more specifically, go to heading 5. Bidirectional I/O circuit: https://contec.com/support/basic-knowledge/daq-control/digital-io/

    I could be wrong, but I believe with this information, some awesome things can be done, like running GRBL on a PC instead of on an Arduino :)

    If you want to run hard real time at the edge, life will be easier with an MCU that is designed for hard real time, using the PC for the more relaxed command-flow stuff (see the sketch at the end of this post).
    What timing granularity does GRBL need, and does it need to read any feedback, or is it output (stepper-type) control only?


    If you want to play with PC and I/O you need good raw backbone speed, so some choices could be
    * Drop "CY7C68013A module" into eBay, and you will find lots of low-cost HS-USB-to-GPIO pathways.
    That part has good web support and an inbuilt microcontroller for configuration.
    * The FT2232H series has FIFO interfaces, but fewer general-purpose I/Os, and is also HS-USB.
    * The FT602Q is SuperSpeed USB, for even higher-speed I/O.



    Moving down the spectrum, the FT232H/FT2232H have 12 MBd UARTs that can talk to an external MCU.
    The well-priced FT4222H-D has a Quad-SPI port that is faster (40~50 Mbps sustained) and could be well suited to a P2 or even P1 design (but it seems to have stock issues right now).
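
    As a rough illustration of that hard-real-time split (the segment format below is hypothetical, not taken from any product): the PC pre-computes timed step segments and streams them down, and the remote unit replays them against its own clock, so PC-side jitter only has to stay smaller than the buffer depth, not smaller than one step period.

        #include <cstdint>
        #include <vector>

        // Hypothetical wire format for a buffered motion stream.
        struct StepSegment {
            uint32_t period_us;  // time between step pulses
            uint16_t step_mask;  // which axes pulse (one bit per axis)
            uint16_t dir_mask;   // direction bit per axis
            uint32_t count;      // number of pulses in this segment
        };

        // Queue a constant-rate move; the remote MCU owns the timing.
        void queueMove(std::vector<StepSegment>& out, uint32_t rate_hz,
                       uint16_t axes, uint16_t dirs, uint32_t steps)
        {
            out.push_back({1000000u / rate_hz, axes, dirs, steps});
        }

    This is roughly what GRBL's own stepper segment buffer does on-chip; the only change is that the queue crosses a USB link.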
  • idbruce Posts: 6,197
    edited 2019-01-17 21:19
    jmg

    To be perfectly honest, I am not that well versed in the requirements of GRBL, but I do know that it is probably the most well-liked and best-supported open-source code for CNC machines.

    As far as I know, GRBL is strictly for steppers.
    That merely captures the time at that line of code; it does not guarantee any known real-time performance for your program.

    Yeah, after I made my initial post, I went searching around in the vicinity of what I linked to, and I think I may have found some better functions for this type of application. One way or the other, I do believe the required timing functions can be found within the Windows Driver Kit, which could provide a fair degree of accuracy and resolution.

    Although I have never used Mach 3 (https://machsupport.com/software/mach3/), there are many people that do, and Mach 3 uses a PC for machine control.

    One of the key benefits of using a PC as a machine controller is that it has all kinds of memory available, and a person can create some awesome programs to go with machinery, such as Mach 3 for example. I don't know if it has changed at this point, but there was a time when Mach 3 strictly used the parallel port. Nowadays, I am sure there must be more options.

    Keep in mind that I am just guessing, but I would assume that just about any common PC should have enough timing precision to run most CNC machines, but I could be wrong.

    I suppose my first goal would be to find the benchmarks of GRBL running on the various Arduinos. Secondly, I should do a little more research to find out if the Windows Driver Kit contains functions that can actually be used to surpass the usefulness of the Arduinos, and thirdly, figure out exactly what hardware I need to get digital I/O from those functions.

    Keep in mind that, in the past, I felt very comfortable writing programs for Windows. Is it possible?
  • jmg Posts: 15,173
    idbruce wrote: »
    Keep in mind that I am just guessing, but I would assume that just about any common PC should have enough timing precision to run most CNC machines, but I could be wrong.
    That guess would be wrong. The PC hardware is fine; it is the Windows operating system, and the huge variance among PCs, that makes hard real time a lottery.

    idbruce wrote: »
    To be perfectly honest, I am not that well versed in the requirements of GRBL, but I do know that it is probably the most well-liked and best-supported open-source code for CNC machines.

    As far as I know, GRBL is strictly for steppers.
    That's my understanding too, so that means you need to generate multiple pulse trains to well-matched precision.
    1~10 ms is not going to be good enough, but 1 µs sounds like a better ballpark.

    Google suggests the Arduino manages 30 kHz step rates and 115200 baud speeds, i.e. somewhat modest. (At 30 kHz the step period is only about 33 µs, so even a single 1 ms OS stall would swallow dozens of steps.)
    idbruce wrote: »
    Although I have never used Mach 3 (https://machsupport.com/software/mach3/), there are many people that do, and Mach 3 uses a PC for machine control.

    One of the key benefits of using a PC as a machine controller is that it has all kinds of memory available, and a person can create some awesome programs to go with machinery, such as Mach 3 for example. I don't know if it has changed at this point, but there was a time when Mach 3 strictly used the parallel port. Nowadays, I am sure there must be more options.
    The FAQ there says
    "An external motion device is a piece of hardware that is a replacement for the parallel port. It enables a PC running Mach3/Mach4 to control outputs and read inputs. They typically communicate with the PC via an Ethernet or USB connection (but are not limited to those two means of communication). In order to control a machine using an external motion device, the developer of the hardware must write a plugin for that specific device, so no standard USB-to-parallel port adapters will work."


    If you want to shift the software to be 99% PC-side operation, you need to carefully split the tasks.
    You need a nimble remote unit to do what the PC does poorly, freeing the PC to focus on what it does well.
    Certainly, some programmable logic helps greatly. That could be an MCU, a P1/P2, or an FPGA.

    PC USB UARTs actually work quite well: with their buffers, they can manage gap-less streaming at up to 12 MBd (one 10-bit frame roughly every 833 ns).
    We have done test benches using a UART as a DAC (via pulse-density modulation) and the handshake lines as stimulus.

    The USB parts I mentioned above can manage the data flows needed just fine (HS-USB parts spec around 40 MB/s, FS-USB maybe 1 MB/s), but they cannot time-stamp the data; they can, however, be read at a fixed cadence.
    It is then the PC side's job to keep the data available (to avoid underflow), as in the sketch below.
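
    A sketch of that PC-side job over FTDI's D2XX API (FT_GetStatus and FT_Write are real D2XX calls; fillPattern() is an assumed stand-in for whatever generates the next chunk of step data):

        #include <windows.h>
        #include <ftd2xx.h>   // FTDI D2XX driver header

        DWORD fillPattern(unsigned char* buf, DWORD max);  // assumed data source

        // Keep the device's TX queue topped up so the fixed-cadence
        // output side never runs dry.
        void streamLoop(FT_HANDLE h)
        {
            unsigned char buf[4096];
            DWORD rx, tx, ev, written;
            for (;;) {
                FT_GetStatus(h, &rx, &tx, &ev);  // bytes pending each way
                if (tx < 2048) {                 // below half full: refill
                    DWORD n = fillPattern(buf, sizeof(buf));
                    FT_Write(h, buf, n, &written);
                } else {
                    Sleep(1);                    // coarse PC timing is fine here
                }
            }
        }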

    idbruce wrote: »
    I suppose my first goal would be to find the benchmarks of GRBL running on the various Arduinos. Secondly, I should do a little more research to find out if the Windows Driver Kit contains functions that can actually be used to surpass the usefulness of the Arduinos, and thirdly, figure out exactly what hardware I need to get digital I/O from those functions.
    Keep in mind that, in the past, I felt very comfortable writing programs for Windows. Is it possible?

    Yes, I would say it is possible.

    A good PC-to-digital-I/O device to start with would be the FT2232H, used in MPSSE mode.
    That can stream byte-wide and serial info, and the user can set the clock rate from the PC end, for coarse control of the bandwidth.
    In that mode, no remote MCU is needed. Fairly simple logic can expand the 8-bit FIFO to, say, 64 I/O using nibble-indexing,
    i.e. you split the 8 bits into a nibble of data and a nibble of selector.
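
    From the PC side, that packing is trivial; a sketch (the address map is whatever your external logic decodes, so the names here are illustrative):

        #include <cstdint>

        // One FIFO byte = 4-bit latch selector + 4 bits of data,
        // so 16 selectable latches x 4 bits = 64 outputs from one 8-bit bus.
        inline uint8_t packNibble(uint8_t selector, uint8_t data)
        {
            return uint8_t(((selector & 0x0F) << 4) | (data & 0x0F));
        }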

    To increment all 8 axes (32 bits, or half the I/O) at the 30 kHz step rate the Arduino specs, TCK can be a fairly modest 240 kHz.
    TCK can be set from the PC side anywhere from 6 MHz down to 91.553 Hz (12 MHz/((N+1)*2), with N a 16-bit divisor); for anything slower than 91.553 Hz, you just repeat the same data pattern.


    Or, the 245 slave-FIFO mode has Ready/Valid flags, allowing a remote MCU/FPGA to stream data.
  • jmg

    Thank you for all the good information and the time you put forth in your replies. It is greatly appreciated.
    A good PC-to-digital-I/O device to start with would be the FT2232H, used in MPSSE mode.

    There is no way that I will ever understand that datasheet in my lifetime :)

    Is there anything that would be much simpler?


    This page suggests that high accuracy and precision are obtainable from the Windows OS: https://docs.microsoft.com/en-us/windows/desktop/SysInfo/acquiring-high-resolution-time-stamps
  • jmg Posts: 15,173
    edited 2019-01-18 06:30
    idbruce wrote: »
    There is no way that I will ever understand that datasheet in my lifetime :)
    Did you read this?
    https://www.ftdichip.com/Support/Documents/AppNotes/AN_108_Command_Processor_for_MPSSE_and_MCU_Host_Bus_Emulation_Modes.pdf

    It's been a while since I did MPSSE code, but there are a lot of examples out on the web, and I'd suggest finding working code that uses the
    Set clk divisor (0x86, ValueL, ValueH) command, and then playing with changing that.
    The simplest code would space updates evenly as that clock speed ramps, but there is mention of a return clock (RTCK) in the FT2232H that would allow, e.g., 8~16 rapid reads, then an MCU/PLD delay.
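
    For reference, the setup sequence from AN_108 over D2XX looks roughly like this (untested sketch; the opcodes are from the app note, and the figures assume the divide-by-5 12 MHz base clock):

        #include <windows.h>
        #include <cstdint>
        #include <ftd2xx.h>

        // Put an FT2232H channel into MPSSE mode and set TCK.
        // With divide-by-5 on, TCK = 12 MHz / ((1 + divisor) * 2),
        // so divisor = 24 gives the 240 kHz mentioned above.
        FT_STATUS setupMpsseClock(FT_HANDLE h, uint16_t divisor)
        {
            FT_SetBitMode(h, 0x00, 0x00);            // reset the mode machine
            FT_SetBitMode(h, 0x00, 0x02);            // 0x02 = MPSSE mode
            unsigned char cmd[] = {
                0x8B,                                // enable divide-by-5
                0x86,                                // "set clk divisor" opcode
                (unsigned char)(divisor & 0xFF),     // ValueL
                (unsigned char)(divisor >> 8)        // ValueH
            };
            DWORD written;
            return FT_Write(h, cmd, sizeof(cmd), &written);
        }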

    I looked quickly at the smallest SPLD, the ATF16V8BQL: it can fit 8 outputs, with up to 14 devices connected to the FT2232H bus, and I allocated 2 addresses as global SHR/SHL to raise the IQ.
    In use, you would load patterns into all active steppers, and nulls into the paused ones, and then issue a whole series of SHR/SHL.
    A minor change could use 2 bits per stepper, to give a quadrature U/D output, trading away half-stepping for 4 motors per PLD (max of 56 motors).

    idbruce wrote: »
    Is there anything that would be much simpler?
    Hmm, any fast USB link is going to take some work. You could also look at the CY7C68013A, to see if that looks any better. There are good examples around using it as a logic analyser
    (wide capture, but going the opposite direction to what you need, though the ideas will be broadly portable).

    Google finds this
    https://www-user.tu-chemnitz.de/~heha/basteln/PC/USB2LPT/ul-12.en.htm
    which seems to be a CY7C68013A emulating a parallel port, and may also be a useful reference,
    and this
    http://www.cypress.com/documentation/application-notes/an61345-designing-ez-usb-fx2lp-slave-fifo-interface
    though that's more FPGA-as-boss, talking to the CY7C68013A.

    The CY7C68013A has a 16-bit FIFO mode, which has some appeal, and boards are very cheap on eBay...

    idbruce wrote: »
    This page suggests that high accuracy and precision are obtainable from the Windows OS: https://docs.microsoft.com/en-us/windows/desktop/SysInfo/acquiring-high-resolution-time-stamps


    That's using a hardware register to capture time, which is not quite the same as being able to define points in time.
    If you run code using those HW captures, you find that you mostly get sensible dTs, but sometimes the OS steals a chunk of time and the capture jumps.
    I've used them to good effect, for benchmarks and even for baud-rate sanity checks, but you do need to accept that some readings are outliers, and discard those.
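
    A quick way to see those outliers for yourself (minimal Win32 sketch):

        #include <windows.h>
        #include <cstdio>

        int main()
        {
            LARGE_INTEGER f, a, b;
            QueryPerformanceFrequency(&f);
            double worst = 0;
            for (int i = 0; i < 1000; ++i) {
                QueryPerformanceCounter(&a);
                Sleep(1);                        // ask for 1 ms
                QueryPerformanceCounter(&b);
                double ms = (b.QuadPart - a.QuadPart) * 1000.0 / f.QuadPart;
                if (ms > worst) worst = ms;      // the outliers show up here
            }
            printf("worst Sleep(1): %.2f ms\n", worst);
            return 0;
        }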
  • I am slowly but surely coming to the realization that Windows does not give access to high precision timers :(
  • kwinn Posts: 8,697
    Windows is a multitasking operating system designed for data processing, not hard real-time use. There is no guarantee that any specific real-time code can be executed at a specific time, because the OS may be executing a higher-priority interrupt service routine at that moment. The only mention of using Windows for real time that I have seen was an article discussing dedicating one core of a multi-core CPU to running a real-time OS. That was a year or two back, and I have not seen any further mention of it.
  • kwinn
    There is no guarantee that any specific real-time code can be executed at a specific time, because the OS may be executing a higher-priority interrupt service routine at that moment.

    As a programmer, I can specify priority, but I cannot specify the resolution of a clock trigger or timer, and that is my main problem. There is a Windows function that allows me to define a custom timer, but the best resolution is 1 ms, and in my opinion, that just won't cut the mustard.

    There are high accuracy time stamps, but that would require a lot of work and unnecessary programming.
    It's not going to be free, but there are real-time extensions for Windows.
    This is a whole other market segment than hobbyist.

    I may keep banging the drum, but in industrial computing on plant floors, PCs have always been present, despite mockery and ridicule.
    Real time is not something that Microsoft cared about (as long as the sound and video in Media Player didn't skip), so it was left to third parties.
    There's no secret to be unlocked in the Windows .dlls; it's simply not present.

    So middleware companies have since long ago filled the gap.
    Even the big factory automation players don't bother writing low-level hypervisor code, they also just license it and package it up to sell their components that they do manufacture.


    EC-Win is probably your best bet for a small operation:
    EtherCAT is an Ethernet fieldbus that can talk to many digital and analog I/O racks and drives. Very high performance.

    EtherCAT masters can run on the commodity Ethernet chips found in most PCs, meaning no custom hardware on the PC side.
    They say the library integrates into Microsoft Visual Studio.
    http://www.acontis.com/eng/products/ethercat/ethercat-echtzeit-plattform-fuer-windows/index.php

    Then on the other side you buy commodity I/O cards and servo drives. Or, if you must build your own, there are EtherCAT slave chips available, like the LAN9252.
    https://www.digikey.com/product-detail/en/microchip-technology/LAN9252-ML/LAN9252-ML-ND/5252338


    Here are some other more general links:
    http://www.acontis.com/eng/products/windows-real-time-hypervisor/index.php

    https://www.intervalzero.com/products/rtx64-rtx/overview/
    Hard real-time is possible with Windows... I used VtoolsD and had an external hardware interrupt generator.

  • But VxD drivers died with Windows Me.

    Nowadays a hypervisor does the timing critical stuff, and has a communication channel up to the OS.
    You can configure the hypervisor to not even tell Windows about the core, leaving it entirely for your realtime stuff.
  • idbruce wrote: »

    As a programmer, I can specify priority, but I cannot specify the resolution of a clock trigger or timer, and that is my main problem. There is a Windows function that allows me to define a custom timer, but the best resolution is 1 ms, and in my opinion, that just won't cut the mustard.

    Back in the 8088 days I was building custom PC controller cards to drive standard hobby servos with at least 5 µs resolution, from a program written in QuickBasic 4.5.

    I can't imagine that today's computers wouldn't give you better than 1 ms resolution, when I was getting 5 µs in 1995.

  • Beau
    Back in the 8088 days I was building custom PC controller cards to drive standard hobby servos with at least 5 µs resolution, from a program written in QuickBasic 4.5.

    I can't imagine that today's computers wouldn't give you better than 1 ms resolution, when I was getting 5 µs in 1995.

    The standard timer for Visual C++ is set with the SetTimer function, in which the interval must be specified in milliseconds. The interval must also be no less than the defined USER_TIMER_MINIMUM and no greater than the defined USER_TIMER_MAXIMUM.
    Source: https://docs.microsoft.com/en-us/windows/desktop/api/winuser/nf-winuser-settimer
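
    For completeness, the WM_TIMER path being described looks like this (minimal sketch; note that requests below USER_TIMER_MINIMUM are silently raised to 10 ms):

        #include <windows.h>
        #include <cstdio>

        // WM_TIMER callbacks arrive through the message queue, so the real
        // period is "uElapse, eventually" - fine for UI, not for steppers.
        VOID CALLBACK onTimer(HWND, UINT, UINT_PTR, DWORD tickMs)
        {
            printf("tick at %lu ms\n", (unsigned long)tickMs);
        }

        int main()
        {
            SetTimer(nullptr, 0, 1, onTimer);  // asks for 1 ms; clamped to 10 ms
            MSG msg;
            while (GetMessage(&msg, nullptr, 0, 0) > 0)
                DispatchMessage(&msg);         // routes WM_TIMER to onTimer
            return 0;
        }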

    Of course, that simply will not do... So I started researching the references applicable to the Windows Driver Kit.

    Last night I found a reference for creating a custom timer, but I did not bookmark it, because the shortest interval that could be set was 1 ms.

    However, I do agree that there must be something I am overlooking, or some technique for achieving smaller time frames.
  • potatohead Posts: 10,261
    edited 2019-01-20 06:22
    However I do agree that there must be something I am overlooking or there is some technique to achieving smaller time frames.

    Good luck. The Windows kernel does not really support what you want to do. The reason is pretty simple: doing so runs contrary to what most people need Windows to do.

    And that is running a lot of applications and services, and managing all of that to provide a good user or server experience.

    Task switching is expensive. That fact is what limits your timing on Windows. If it allowed the precision you are looking for, the rest of the environment would either stop, because it would not get a big enough slice of time to complete a necessary task, or slow to a crawl as context-switch time grows to consume more of the overall time.

    But, as others have said here, Windows can very reasonably talk to something that can do those things.

    Just buffer those comms and good solutions can be made.

    One exception may be embedded Windows. But you would have to write drivers and configure it exactly to spec. What you end up with will not be generally useful on the Windows you run today on your PC.


  • jmg Posts: 15,173
    idbruce wrote: »

    That uses QueryPerformanceCounter(), which was already mentioned.
    Even assuming you can get your software to hit its time marks, you still have to get the information out of the PC and over to the remote unit.
    That means USB or similar (or EtherCAT?), and the FS-USB frame is 1 ms.
    HS-USB is 125 µs, so that's a little better, but you are still best off letting the remote hardware manage the time slots, with Windows feeding it.

    USB UARTs are actually quite good at delivering a precise cadence: they have large buffers and FIFOs, and the baud rate sets how fast they empty.
    The FIFO-mode versions can also be clocked at defined rates.
    The cheapest USB devices lock to the frame rate (so they vary with your PC clock, and their RC oscillators jitter), but the crystal-based ones have quite good local time, and a TCXO can always be used.
  • jmg

    Yeah... I believe it has been a waste of research time. It is just hard for me to believe that a little uC can run a machine better than a PC :)

    Well... I expanded my knowledge a little, but as for my horizon, not so much :)
  • jmg Posts: 15,173
    idbruce wrote: »
    jmg

    Yeah... I believe it has been a waste of research time. It is just hard for me to believe that a little uC can run a machine better than a PC :)

    Well... I expanded my knowledge a little, but as for my horizon, not so much :)

    It's not really a waste of time if it helps you define how to share the load.
    The PC is still fine for path interpolation, acceleration planning, etc.; it's just not great at meeting microsecond waypoints or holding precise frequencies.
    So you split the resource between a PC and some remote HW.

    That's how the universal device programmers do it: they have a remote hardware head that handles the timing, voltage, and current stuff, while the programming algorithms run on the PC.
  • jmg

    I sincerely appreciate all of your input!

    Without any doubt, this is a subject that interests me greatly, however, it has also greatly distracted me from my current machine building project.

    For the time being, I will have to put this on the back burner, but I will definitely revisit this subject in the future.

    As I said, I appreciate your input and I will try to learn something from the information that you shared.

    Thank You!
  • Mickster

    Thanks for the video. I found that very interesting and I will have to investigate that further.
  • T Chap Posts: 4,223
    edited 2019-01-20 16:43
    You can download Mach3 and have it installed in about 5 minutes. Assign some pins, hook up a scope, and see what kind of timing it achieves. That might help you understand how others achieve tight timing.
  • T Chap
    You can download Mach3 and have it installed in about 5 minutes. Assign some pins, hook up a scope, and see what kind of timing it achieves. That might help you understand how others achieve tight timing.

    I agree. I need to investigate Mach 3 in depth to gain a better understanding. From jmg's earlier comments, it sounded like more than a PC was involved.
  • T Chap Posts: 4,223
    edited 2019-01-20 17:33
    I used Mach3 for 5 years with steppers and no encoders, then WinCNC for the last 5 years, also with no encoders. Rock solid with simple printer-port connections. Never once did a timing concern enter my mind. It is not a big task to toggle pins on LPT1 and see what kind of speeds and accuracy you get. Instead of timers (in your case with a 1 ms minimum), you can put in code similar to repeating NOPs to set pulse lengths; see the sketch below.
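
    In C terms, the repeating-NOPs idea amounts to a calibrated spin on the performance counter, e.g. (minimal sketch; setPin() is a hypothetical stand-in, since raw LPT access on modern Windows needs a helper driver such as inpout32):

        #include <windows.h>

        // Busy-wait on QueryPerformanceCounter instead of a timer object:
        // burns a core, but gives microsecond-class pulse widths as long
        // as the scheduler does not preempt mid-pulse.
        void spinWaitUs(double us)
        {
            LARGE_INTEGER f, t0, t;
            QueryPerformanceFrequency(&f);
            QueryPerformanceCounter(&t0);
            LONGLONG target = t0.QuadPart
                            + (LONGLONG)(us * f.QuadPart / 1000000.0);
            do {
                QueryPerformanceCounter(&t);
            } while (t.QuadPart < target);
        }

        // Usage: setPin(true); spinWaitUs(5.0); setPin(false);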
    The nice thing about Mach 3 and other similar programs is that you can find an old PC for $25 anywhere, even with Win7, etc., that will perform well as a dedicated controller. It's so cheap to set up.
  • Very Nice Setup!

    Those last parts that I commented on, were they made with this CNC?
  • T Chap Posts: 4,223
    edited 2019-01-20 18:43
    Well, no, those were laser only, to save time. I could cut aluminum, but I don't like to set up coolant, so it's slow. Laser is 10 minutes for those parts. I make aluminum parts all the time on this machine; here is a recent example. Works fine, but I'm more set up for Delrin.
  • T Chap

    Do you have a laser CNC also?

    If so, you are set up nicely :)