
interface to a PCI graphics card?


Comments

  • max72max72 Posts: 1,155
    edited 2012-04-21 13:58
    It has been done, also with the PlayStation 3, using a dedicated OS.

    Moreover Nvidia has specific cards for number crunching.
    http://en.wikipedia.org/wiki/Nvidia_Tesla
    A possible bottleneck: the processing power lies in the parallel computing capability, so to exploit it you have to be able to feed the card a steady stream of data.
    Another possibility would be to use a PC to interface both with the prop and the card.
    In this case you have the libraries and the interface on a PC, which would "bridge" data between prop and graphics card.
    Massimo
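
    To make the "bridge" idea concrete, a PC-side program might look something like the sketch below, assuming the Prop streams samples over a USB-serial link. The device path, baud rate, batch size and the crunch_on_gpu() placeholder are illustrative assumptions, not a real CUDA binding.

        /* Sketch: PC-side bridge between the Prop (USB serial) and a GPU library.
         * crunch_on_gpu() is a placeholder for a real CUDA/OpenCL dispatch. */
        #include <fcntl.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <termios.h>
        #include <unistd.h>

        #define BATCH 4096                        /* samples per GPU dispatch */

        static void crunch_on_gpu(const int32_t *in, size_t n)
        {
            (void)in; (void)n;                    /* stand-in for a kernel launch */
        }

        int main(void)
        {
            int fd = open("/dev/ttyUSB0", O_RDONLY | O_NOCTTY);  /* Prop's port (assumed) */
            if (fd < 0) { perror("open"); return 1; }

            struct termios tio;
            tcgetattr(fd, &tio);
            cfmakeraw(&tio);                      /* raw bytes, no line discipline */
            cfsetispeed(&tio, B115200);           /* must match the Prop's baud rate */
            tcsetattr(fd, TCSANOW, &tio);

            int32_t batch[BATCH];
            size_t have = 0;                      /* bytes collected so far */
            for (;;) {
                ssize_t n = read(fd, (uint8_t *)batch + have, sizeof batch - have);
                if (n <= 0) break;
                have += (size_t)n;
                if (have == sizeof batch) {       /* full batch: hand it to the GPU side */
                    crunch_on_gpu(batch, BATCH);
                    have = 0;
                }
            }
            close(fd);
            return 0;
        }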
  • Duane DegnDuane Degn Posts: 10,588
    edited 2012-04-21 16:58
    max72 wrote: »
    Another possibility would be to use a PC to interface both with the prop and the card.
    In this case you have the libraries and the interface on a PC, which would "bridge" data between prop and graphics card.

    I think using a PC would be cheating. I also think not using a PC would be insane.

    I think it would take a lot of bit banging to get a Prop to talk with a video card. It would have to be a project done for the fun of it since I'm sure it's not a very practical way of crunching numbers.
  • Pharseid380Pharseid380 Posts: 26
    edited 2012-04-21 17:52
    I think the TI VLIW DSPs have a version that does 64-bit integer math, as well as 32-bit integer and floating point versions. And multicore versions of various types. And maybe versions where you can break the carry chain and do SIMD operations. So quite an assortment.

    phar
  • prof_brainoprof_braino Posts: 4,313
    edited 2012-04-21 19:05
    Duane Degn wrote: »
    I think using a PC would be cheating. I also think not using a PC would be insane.

    I think it would take a lot of bit banging to get a Prop to talk with a video card. It would have to be a project done for the fun of it since I'm sure it's not a very practical way of crunching numbers.

    Argh! It's not "talking to a video card"! It's "talking to a chip that crunches numbers"! And no, it's NOT a practical way of crunching numbers, unless you're doing it like the guys in the link in post 5, in which case it's cheaper and gives better performance than the actual university supercomputer, if we take them at their word.
  • Duane DegnDuane Degn Posts: 10,588
    edited 2012-04-21 19:14
    It's not "talking to a video card"! It's "talking to a chip that crunches numbers"!

    Sorry, my bad. How about "talking to the number crunching chips on a video card"?
    And no, it's NOT a practical way of crunching numbers,

    And, yes, I realize you weren't implying it was a practical way. I'm all for doing impractical things if it's fun and/or educational. But this is way beyond what I'd try to do (not that you were suggesting I do it). I'm guessing those who could do it (if it can be done) are probably doing other sorts of fun things.

    I did try (unsuccessfully) to get the Prop to communicate with a memory chip on an old PC DIMM card. I plan to give this another try some time.

    For a more practical way to add a coprocessor to the Prop, one could use a floating-point chip. Cessnapilot has written an object (I think more than one) for the Micromega uM-FPU64 chip.
  • prof_brainoprof_braino Posts: 4,313
    edited 2012-04-21 20:59
    Duane Degn wrote: »
    ... add a coprocessor to the Prop, one could use a floating point chip.

    Yeah, that's a good way too. Nick Lordi has already posted Forth code for a floating-point coprocessor (which normally would be my first choice). But that wasn't what I was looking for, for various reasons.

    I was thinking more of the "steady stream of data" as max72 mentions in post 32, to come from several sensors, like a GPS, an accelerometer, a gyro, a range finder, etc. All the input streams would come from separate cogs to a custom single-purpose app on the cruncher, which converts the telemetry to tomography (not the right word, having a brain fart at the moment).

    I'm thinking how to generate an on-the-fly environmental model as the robot (quad copter or ATV) traverses the environment at a fairly high speed.
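
    As a rough sketch of that "telemetry to environment model" step, one range-finder reading taken at a known pose could be folded into a coarse occupancy grid as below. The grid size, cell size and pose struct are made up for illustration; the actual sensor fusion filter is out of scope here.

        /* Sketch: fold one range reading, taken at a known pose, into a coarse
         * occupancy grid.  Grid and cell sizes are arbitrary. */
        #include <math.h>
        #include <stdint.h>

        #define GRID_W 256
        #define GRID_H 256
        #define CELL_M 0.25f                       /* metres per cell */

        static uint8_t grid[GRID_H][GRID_W];       /* 0 = unknown, 255 = occupied */

        struct pose {                              /* output of the GPS/gyro/accel fusion */
            float x, y;                            /* metres, world frame */
            float heading;                         /* radians */
        };

        /* Mark the cell hit by a range reading taken at 'bearing' radians
         * relative to the robot's heading. */
        void mark_obstacle(const struct pose *p, float range_m, float bearing)
        {
            float a  = p->heading + bearing;
            float ox = p->x + range_m * cosf(a);   /* obstacle position, world frame */
            float oy = p->y + range_m * sinf(a);

            int cx = (int)(ox / CELL_M) + GRID_W / 2;  /* grid centred on the start point */
            int cy = (int)(oy / CELL_M) + GRID_H / 2;
            if (cx >= 0 && cx < GRID_W && cy >= 0 && cy < GRID_H)
                grid[cy][cx] = 255;
        }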
  • rod1963rod1963 Posts: 752
    edited 2012-04-22 10:14
    Ahh, sensor fusion combined with real-time graphic visualizations. You would need a lot of horsepower and serious math skills for this. You've got three choices for that.

    1) Go with a seriously muscular DSP and an expensive tool set.
    2) Get a Beagle board, which has all the numerical and graphical goodies you need in one little package. Just interface the Prop via I2C (a minimal read-loop sketch follows after this list), write the software and you're ready to rock and roll. It's debatable, though, whether even that has the MIPS to do what you want. You'd need to do some tests in this department to get an idea of what you can pull off. As a side note: one DoD project I worked on (a terrain-following missile with unique tracking capabilities) did a lot of real-time sensor fusion, and even with its on-board 1 GHz Alpha processor and custom logic it was hard pressed to handle speeds of more than 200 knots. And the PCs that did the number crunching via MathCad were the fastest money could buy at the time.
    3) Get a PC with a CUDA card and interface it via wireless to the Prop, which is sitting in a quad copter or ATV. This would be the best and easiest choice.
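
    For option 2, the Beagle side of the I2C link could be as simple as the sketch below; the bus number, slave address and record size are assumptions, and the Prop would need a matching I2C-slave object on its end.

        /* Sketch: BeagleBoard/BeagleBone polling the Prop as an I2C slave.
         * Bus number, slave address and record size are assumptions. */
        #include <fcntl.h>
        #include <linux/i2c-dev.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <sys/ioctl.h>
        #include <unistd.h>

        #define PROP_ADDR 0x42                     /* hypothetical 7-bit slave address */
        #define REC_SIZE  16                       /* one sensor record, bytes */

        int main(void)
        {
            int fd = open("/dev/i2c-2", O_RDWR);   /* bus wired to the Prop (assumed) */
            if (fd < 0) { perror("open"); return 1; }
            if (ioctl(fd, I2C_SLAVE, PROP_ADDR) < 0) { perror("ioctl"); return 1; }

            uint8_t rec[REC_SIZE];
            for (;;) {
                if (read(fd, rec, sizeof rec) != sizeof rec)   /* one record per read */
                    break;
                /* ...hand 'rec' to the number-crunching code... */
            }
            close(fd);
            return 0;
        }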
  • Duane DegnDuane Degn Posts: 10,588
    edited 2012-04-22 10:21
    rod1963 wrote: »
    Ahh sensor fusion combined with real time graphic visualizations.

    The way I read Prof's post, I don't think he's after a graphical visualization. He just wants the robot to know where it is in relation to its surroundings. I don't think he needs the environment displayed on a screen.
  • prof_brainoprof_braino Posts: 4,313
    edited 2012-04-22 10:54
    Duane Degn wrote: »
    The way I read Prof's post, I don't think he's after a graphical visualization. He just wants the robot to know where it is in relation to its surroundings. I don't think he needs the environment displayed on a screen.

    Exactly! At most, stream (some portion of) the pre-processed origin-vector-distance records to a workstation, and let some college kids do the visualization software from the comfort of a dorm room. The bot only cares that it stays between the obstacles.

    And rod1963 is correct, the prop + (something) mobile platform will interface to a more powerful remote machine which does the actual heavy lifting.
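
    One possible, purely illustrative wire layout for those origin-vector-distance records, if they were streamed as fixed-size binary packets; field names and sizes here are assumptions:

        /* Sketch: one possible fixed-size record format (GCC packed attribute). */
        #include <stdint.h>

        struct __attribute__((packed)) ovd_record {
            int32_t  origin_x_mm;                  /* robot position at time of reading */
            int32_t  origin_y_mm;
            int16_t  bearing_cdeg;                 /* heading + sensor angle, centi-degrees */
            uint16_t distance_mm;                  /* range-finder reading */
            uint32_t timestamp_ms;                 /* for ordering on the workstation side */
        };                                         /* 16 bytes per record on the wire */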
  • rod1963rod1963 Posts: 752
    edited 2012-04-22 11:54
    Well, that simplifies things a lot :smile:

    Off the top of my head, a BeagleBone would fit the bill quite nicely. It's relatively low priced ($89.00), small, with serious CPU power, plus it runs Linux. It would make a great combo with an I/O Boss chip like the Prop or a Prop II (when it comes out).

    FWIW, one possible configuration would be: Prop + sensors (on the robotic vehicle) transmits data to the BeagleBone for number crunching, then the results are displayed by a 2nd Prop on a monitor. I think, minus the sensors and robot, this could be done for under the Professor's $200.00 price range.