Big update for DE2-115 and DE0-Nano users w/add-on boards

Comments

  • jmg Posts: 15,175
    edited 2013-10-14 20:27
    Is this new FPGA board a candidate? Just $49.*

    Seems to have a chunk more memory and a slight jump in LEs over the Nano, and runs a Cyclone V (5CEFA2F23C8N), not a Cyclone IV.

    http://www10.edacafe.com/nbc/articles/1/1226556/Altera-Extends-Low-Cost-Portfolio-with-Development-Kits-Starting-$49

    * I see Verical shows 22 in stock, at $42.03, orderable 1+
  • rjo__ Posts: 2,114
    edited 2013-10-15 05:06
    And from Terasic... cyclone v, 77k LE, $179
    https://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=167&No=830

    HDMI connector, 4Gb DDR2 x32, 4Mb SRAM x16
    Arduino header
  • rjo__ Posts: 2,114
    edited 2013-10-15 05:21
    OR... for just a little more ($279): https://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=167&No=816&PartNo=2

    which gives you multiport DDR3 with DMA and 110K LE for 6 Cogs!!!
  • jmg Posts: 15,175
    edited 2013-10-15 11:42
    Yes, for full system testing large FPGAs will always be useful, but the price curve quickly slashes the number of users.

    That is why a new price point at the bottom end is more important; it means many more users can run up P2 emulation
    (provided a build can target that board).
  • potatohead Posts: 10,261
    edited 2013-10-20 08:18
    Are we still waiting for Chip to make the latest round of changes, or has a new FPGA configuration been released?
  • rjo__ Posts: 2,114
    edited 2013-10-20 09:25
    IthinkWeBWaitn:)

    In the old days, at some point during their training general surgeons were required to remain sleepless one night in three, just to make sure they could do it during times of duress. They could function, but their personalities took quite a beating.

    Expect Chip to be in a prickly state when he finally re-surfaces.

    Rich
  • Cluso99 Posts: 18,069
    edited 2013-10-20 15:38
    rjo__ wrote: »
    IthinkWeBWaitn:)

    In the old days, at some point during their training general surgeons were required to remain sleepless one night in three, just to make sure they could do it during times of duress. They could function, but their personalities took quite a beating.

    Expect Chip to be in a prickly state when he finally re-surfaces.

    Rich
    Never seen (well, heard) Chip in a prickly state even after the 36-hour binges. He's one cool potatoe :)

    He certainly has a big job on his plate... new FPGA, a major change to pnut.exe, fixing the ROM monitor, and then documenting the new instructions. What a job!

    I am getting the new instruction set into an Excel spreadsheet and will post when I am done with the extra columns. I really like the new format.
  • potatohead Posts: 10,261
    edited 2013-10-20 15:47
    Ok then. No worries. Glad to see you using your board, RJO.

    I think that will be a good spreadsheet to look at. I like the format too. IMHO, this whole round of test, change, etc... is intriguing. Apparently, lots of progress was made. We will vet that, and ideally the real P2 comes out running good.

    Most of what we've seen so far falls under the category of, "If we must, then let's make it worth it" kinds of things.
  • Roy Eltham Posts: 3,000
    edited 2013-10-20 16:57
    Chip has a lot to do to get the new instruction layout implemented in the FPGA, pnut, bootrom/monitor, and spin interpreter.
    It'll probably be a bit before we get an update.

    (now watch Chip prove me wrong and post later today :P)
  • potatohead Posts: 10,261
    edited 2013-10-20 20:49
    Yeah, be careful what you wish for, Roy!

    No worries here. It's a huge job. I asked because he has surprised us before.
  • JRetSapDoog Posts: 954
    edited 2013-10-21 11:19
    Hmm...I meant absolutely no disrespect with the picture I posted here and am attempting to remove. I'm one of the "Propeller Faithful" (or at least a "wannabe" in terms of skills). And I often laugh at the clever comments or jokes witty people post here, and I only hoped to evoke laughter with that pic (as I know I would have laughed upon seeing it from someone else). I just happen to love the movie "Forrest Gump" and couldn't resist posting that particular scene, as Chip certainly deserves a rest after all his hard effort. However, given that the pic apparently can be taken in a way that I most certainly didn't intend, I will try to take it down. Apologies for the distraction. And regarding the development time involved with the P2, it is indeed a "labor of love," and I'm glad for all the love it has received and is receiving, as all the diligent effort will result in something special, magical. It has been said, "Find a job that you love and you'll never have to work another day in your life." Well, that is perhaps a bit of an overstatement, as we all know Chip is working extremely hard (and on all cylinders), but I'm pretty sure he loves the work and probably wouldn't have it any other way.
  • rjo__ Posts: 2,114
    edited 2013-10-21 12:15
    That's just wrong. What is most important is that Chip really loves what he produces. If that means another couple of months to get it the way he wants it, that's the way it should be. And lots of what Chip is doing came right from this forum.

    While we are on the subject... I've been wondering... why isn't there (or maybe there is) a simple way to trigger input from an external source using multiple triggers and clocks? Right now, if we want to produce a wide range of video signals, we set a couple of registers, point at some data and forget it (I don't understand it yet, but I know it is possible). Why can't we do that to acquire images from a wide range of video sources... or can we?
  • jmg Posts: 15,175
    edited 2013-10-21 12:43
    rjo__ wrote: »
    While we are on the subject... I've been wondering... why isn't there (or maybe there is) a simple way to trigger input from an external source using multiple triggers and clocks? Right now, if we want to produce a wide range of video signals, we set a couple of registers, point at some data and forget it (I don't understand it yet, but I know it is possible). Why can't we do that to acquire images from a wide range of video sources... or can we?

    The new P2 counter capture, and even the final serializer form, are still 'in definition', but the capabilities signaled do seem to be improving.

    It is a good idea to have HW manage the bit-level stuff, and let SW & threads manage bytes/words.
  • Yanomani Posts: 1,524
    edited 2013-10-21 13:10
    rjo__ wrote: »

    While we are on the subject... I've been wondering... why isn't there (or maybe there is) a simple way to trigger input from an external source using multiple triggers and clocks? Right now, if we want to produce a wide range of video signals, we set a couple of registers, point at some data and forget it (I don't understand it yet, but I know it is possible). Why can't we do that to acquire images from a wide range of video sources... or can we?
    jmg wrote: »
    The new P2 counter capture, and even the final serializer form, are still 'in definition', but the capabilities signaled do seem to be improving.

    It is a good idea to have HW manage the bit-level stuff, and let SW & threads manage bytes/words.

    rjo__ , jmg

    I'm possibly missing something very obvious, but...just in case

    Although the Propeller 2's 160 MHz master clock will give us very good timing resolution, wouldn't it be better if we could enjoy some kind of external PLL dot clock recovery?
    Or are there any other exact means of doing this, to totally avoid jittering pixels and/or color shifts, as in the case of grabbing some composite video sources?

    Yanomani
  • rjo__ Posts: 2,114
    edited 2013-10-21 16:11
    Yanomani,

    I love interdisciplinary discussions, because both parties have to establish a common language... that sort of makes sense to both.

    So, it is important that you don't confuse me with an engineer :)

    What seems totally reasonable to a simple mind often makes no sense to an engineer.

    That being said, to me it isn't necessary to capture every bit of information... just that information that is known to be good.
    In the same way that our hardware limits what we can do with a particular video output... I would expect limits to what could be done with any particular video source acting as an input. To me, this is not a reason not to do it.

    There are many variations of

    "capture every nth x-bit pixel (on pini-pinj) on one of the edges of a signal on pinx when piny has a particular state and transfer that to a variable hub location."

    The problem I see is that for every 100 people who want to generate a video signal, there might be one or two who want to input a video signal.
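
    For illustration only, here is how the gating/decimation logic of that "every nth x-bit pixel" idea might look in plain C. The pin activity is simulated with arrays, and none of the names or widths come from real P2 hardware or Parallax code; it is just a sketch of the capture rule.

        /* Model of the idea quoted above: "capture every Nth x-bit pixel on
           one edge of a clock pin, when a gate pin has a particular state,
           and transfer it to a hub location".  Pins are simulated as arrays
           of samples; the names and widths are made up for illustration. */
        #include <stdint.h>
        #include <stdio.h>

        #define SAMPLES 16
        #define N       2                     /* keep every Nth qualified pixel */

        int main(void)
        {
            const int     pixclk[SAMPLES] = {0,1,0,1,0,1,0,1,0,1,0,1,0,1,0,1};
            const int     gate[SAMPLES]   = {1,1,1,1,0,0,1,1,1,1,1,1,0,0,1,1};
            const uint8_t bus[SAMPLES]    = {10,11,12,13,14,15,16,17,
                                             18,19,20,21,22,23,24,25};

            uint8_t hub[SAMPLES];               /* stand-in for hub RAM        */
            int     stored = 0, nth = 0, prev = 0;

            for (int t = 0; t < SAMPLES; t++) {
                int rising = pixclk[t] && !prev;        /* edge on "pinx"      */
                prev = pixclk[t];
                if (rising && gate[t] && ++nth == N) {  /* "piny" must be high */
                    nth = 0;
                    hub[stored++] = bus[t];             /* transfer to "hub"   */
                }
            }
            for (int i = 0; i < stored; i++)
                printf("hub[%d] = %u\n", i, hub[i]);
            return 0;
        }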

    Rich
  • Yanomani Posts: 1,524
    edited 2013-10-21 23:04
    rjo__

    No worries here!:thumb:

    I love this kind of conversation, not because I'm an engineer, since I'm not, but because it can lead us to a higher level of knowledge.
    It's very good for me to know how much information you may want to extract from such a complex signal.
    Because the end result must match the best, and also the worst, of all the criteria used by us human beings to judge anything: expectation and satisfaction.
    And you have just hit the point, telling us how far you want to go to match your personal expectations.
    Nineteen years ago, I was hired to design some video signal grabber circuits (PAL-M) to be used in some kind of new stock exchange building.
    The background image was a kind of networked broadcast, synchronously overlaid with some computer-generated, side-scrolling, white-colored character messaging content.
    The original specs, written by someone, somewhere, somehow, stated a few basic goals to be satisfied by the resulting solution.
    Among all the technical bloat, they clearly stated, in capitals:
    - The end result was meant to be displayed on 21" high-res (90° neck angle) color tube monitors, placed not less than 2.5 meters away in the vertical plane and 2 meters in the horizontal plane from the attendees' point of view.
    - The fixture system was planned to set them 4.5" apart, in a 30" center-to-center horizontal chain.
    Well, to be short, after three months of design, parts procurement, special inductors crafted by TOKO to satisfy National or Rohm (I simply can't remember) chip requirements, four-layer circuit boards, special cables, SMPS, metal enclosures, blah, blah, blah, and literally hundreds and hundreds of hours of testing and adjustments, we finally (and proudly) packed two cargo vans for a sixty-mile journey to install and test the whole thing.
    We were so confident of a successful two-day trip that we brought only an extra pair of t-shirts, trousers and underwear.
    When we entered the building (shiny and new as the Morning Star), we almost froze, astonished, at the very first sight: some promotional marketing genius had bought about sixty wide-angle (120° at the neck) 29" monitors, with the worst deflection circuits we had ever seen in our short and miserable lives.
    There are no pots of gold at the end of the rainbow.
    In fact, there is not even enough gold in the whole world to fill a little spoon for each and every one of the literally millions of rainbow effects displayed by those unfaithful and astigmatic monitors.
    You could have earned the Nobel Prize in Physics if you could find a single white dot inside those fuzzy colored characters.

    Do you want more grease on the staircase steps? :innocent:

    The original spacing was maintained by some contractor's assembly crew preceding us; there was just a half-inch span between the monitors, and the leaky ones caused lateral distortions in their neighbours.
    We spent a full week carefully matching and adjusting each monitor/deflection coil pair, and wasted almost all the mu-metal shielding material we could find in the whole state trying to minimize the field effects.
    A tired five-soldier team returned home, immersed in their own sweat, almost cross-eyed and with several injuries to their neck muscles.
    A lot savvier team, for sure! :nerd:
    Since then, without any prior warning, I became allergic to frame grabbers in general.

    Yanomani
  • rjo__ Posts: 2,114
    edited 2013-10-31 19:37
    Yanomani,

    I have studied your post and have decided not to corner the market on mu metal. Instead I am concentrating my efforts on what exactly constitutes a burst to SDRAM…my errant plan is accepting whatever the pins have to offer and dutifully recording them for further analysis.

    Of course, I have no idea with what minimum frequency I need to stimulate my SDRAM or for how long I might engage it so.

    I do love the data sheets… not that they ever answer my questions:)

    Rich
  • Yanomani Posts: 1,524
    edited 2013-10-31 20:54
    rjo__ wrote: »
    Yanomani,

    I have studied your post and have decided not to corner the market on mu metal. Instead I am concentrating my efforts on what exactly constitutes a burst to SDRAM…my errant plan is accepting whatever the pins have to offer and dutifully recording them for further analysis.

    Of course, I have no idea with what minimum frequency I need to stimulate my SDRAM or for how long I might engage it so.

    I do love the data sheets… not that they ever answer my questions:)

    Rich

    rjo__

    Perhaps the intent of my post got totally lost within the reminiscences of my own previous experience.
    I'm sorry for this!

    I will try again, with a better focus in mind.

    If you have a clue about the original dot clock frequency of the video you intend to grab, then you can weigh its value against the maximum clock frequency available on the processor's side.

    The classic solution indicates a 1:2 relationship, and since it is video data being collected, it's usually synced to the horizontal rate.

    You can start by estimating the image data timing versus the Hsync (front porch + sync pulse + back porch) period.
    If that sum gives you enough time to send all the command sequences needed for the RAM chip to be ready to accept write data when the video data really begins, and there is also enough spare time afterwards to close the sequence in an orderly way, set the parameters to grab the next line, and loop back to the grabbing sequence, then you may have reached your minimum needed clock spec.
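
    Just as a back-of-the-envelope sketch of that budget check, in plain C: every figure below (line timings, command overheads) is an illustrative placeholder, not a number from Chip's SDRAM driver.

        /* Back-of-envelope line budget: does the horizontal blanking interval
           leave enough clocks to open and close a RAM write burst around the
           active pixels?  Every figure here is an illustrative placeholder. */
        #include <stdio.h>

        int main(void)
        {
            const double sys_clk_hz = 80e6;     /* current FPGA emulation          */
            const double line_s     = 63.5e-6;  /* ~NTSC/PAL-M line time (assumed) */
            const double active_s   = 52.0e-6;  /* visible portion (assumed)       */
            const int    open_clks  = 20;       /* activate/setup commands (guess) */
            const int    close_clks = 10;       /* precharge/teardown (guess)      */

            double blank_clks = (line_s - active_s) * sys_clk_hz;
            printf("blanking clocks available: %.0f\n", blank_clks);
            printf("clocks needed for commands: %d\n", open_clks + close_clks);
            printf("budget %s\n", blank_clks >= open_clks + close_clks ?
                                  "fits" : "does not fit");
            return 0;
        }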

    Sure, you should also decide where to stop, based on total lines or total frames. It's a matter of how much RAM you can rely on to do the job.
    There are also concerns about whether you are grabbing interlaced or non-interlaced signals, so you can properly assemble the final results.

    If your input signal is from a Bayer-pattern camera, you should arrange the grabbed data accordingly too. Extra processing will follow before presenting the results on a monitor.

    Hope it can be helpful, in some way.

    Yanomani
  • rjo__ Posts: 2,114
    edited 2013-11-02 19:44
    Yanomani,

    You are perfect.

    Exactly what I had in mind. Nyquist had his moments:)

    Chip's SDRAM driver is really simple… and really complicated at the same time.
    Whenever Chip delivers the next FPGA image, my immediate problem will be to figure out how to
    stream constant data to it, using an intermittent clock and figure out what the data rate can actually be.
    Anything you could do with his driver would be most helpful.



    Rich
  • Ariba Posts: 2,690
    edited 2013-11-02 20:39
    rjo__ wrote: »
    Yanomani,

    You are perfect.

    Exactly what I had in mind. Nyquist had his moments:)

    Chip's SDRAM driver is really simple… and really complicated at the same time.
    Whenever Chip delivers the next FPGA image, my immediate problem will be to figure out how to
    stream constant data to it, using an intermittent clock and figure out what the data rate can actually be.
    Anything you could do with his driver would be most helpful.

    Rich

    The clock for the SDRAM is always the system clock frequency. That is 80 MHz for the current FPGA emulation and about 160 MHz later.
    You cannot freely choose this frequency; the whole SDRAM interface is based on the 1:1 relation between the SDRAM clock and the P2 instruction clock.
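
    A small illustration of what that fixed relation implies for sampling: the dot clock you can capture is bounded by the system clock, roughly sys_clk/2 by the usual 1:2 rule of thumb mentioned above. The 13.5 MHz figure is only a BT.601-style example, not anything specified for the P2.

        /* What the fixed 1:1 SDRAM/system clock implies for a video source.
           All figures are illustrative. */
        #include <stdio.h>

        int main(void)
        {
            const double sys_clk[2] = { 80e6, 160e6 };  /* FPGA now, chip later */
            const double dot_clk    = 13.5e6;           /* example source       */

            for (int i = 0; i < 2; i++)
                printf("sys %3.0f MHz -> max dot clock ~%.1f MHz (%s for a %.1f MHz source)\n",
                       sys_clk[i] / 1e6, sys_clk[i] / 2e6,
                       dot_clk <= sys_clk[i] / 2 ? "ok" : "too fast",
                       dot_clk / 1e6);
            return 0;
        }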

    Andy
  • Yanomani Posts: 1,524
    edited 2013-11-02 21:24
    Ariba wrote: »
    The clock for the SDRAM is always the system clock frequency. That is 80 MHz for the current FPGA emulation and about 160 MHz later.
    You cannot freely choose this frequency; the whole SDRAM interface is based on the 1:1 relation between the SDRAM clock and the P2 instruction clock.

    Andy

    Ariba

    That was the reason I suggested, in post #375, using PLL methods to recover the original dot clock, or a suitable sub-multiple, and injecting that recovered clock as the Propeller's external reference.
    One can also use a fixed reference, but this will severely limit the solution's usefulness to video sources whose dot clock is a multiple of the fixed source.
    It will also suffer from jitter, due to the lack of phase and thermal drift compensation.

    Yanomani

    P.S. I just figured that this can be a bit of a tricky solution, because the clock recovery circuit must have a minimum free-running frequency to avoid stalling the whole processor in the absence of a video feed.
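
    For example, here is the multiplication such a recovery PLL would need from Hsync, in plain C. The line rate and target dot clock are just NTSC/PAL-M style figures, not P2 requirements.

        /* Illustrative only: the multiplication a recovery PLL would need to
           regenerate a dot clock from incoming Hsync. */
        #include <stdio.h>

        int main(void)
        {
            const double h_rate_hz  = 15734.0;   /* ~525-line system line rate */
            const double dot_clk_hz = 13.5e6;    /* desired sampling clock     */

            printf("PLL multiplier from Hsync: ~%.0f\n", dot_clk_hz / h_rate_hz);
            /* A divided-down copy of this clock could feed the Propeller's
               external reference; a free-running floor keeps the processor
               clocked when the video feed disappears. */
            return 0;
        }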