
How come so many 5 star threads here?


Comments

  • Heater. Posts: 21,230
    edited 2015-04-10 12:17
    davidsaunders,
    You do not know what system they are running, or if they are using a WebBrowser that supports JavaScript, or if they have an OpenGL implementation to use with WebGL.
    Strangely enough, we do know. Nearly 50% of the world's population is now online. Because that growth has been exponential, most of them came online recently, which means most of them have new hardware and operating systems that can handle it.

    When I started this project a few years back I was told I was crazy, "nobody has web-sockets" they said, "nobody has webgl", "HTML 5 is too new", "It won't work in IE". I smiled and refused to use FLASH.
    ...if we could get rid of the local FS access client side stuff in the variant used in HTTP clients (aka web browsers).
    What do you mean? JavaScript in web browsers does not have access to your local client side file system without user interaction.
    ...I highly doubt that an interface like what you speak of would be useful on the lowest common denominator.
    The lowest common denominator is any machine capable of running a recent Chrome, Firefox, Opera or IE browser. That covers most of the human race that has internet access. Who cares about the 0.00001% desperately trying to use the Amigas and Archimedes class machines?

    QTVR does not do what I want. I want real-time 3D data visualization.
    ...what is the difference between writing the program and writing the program??
    I'm all for writing any program only once. But if I write in C/C++/Pascal/Java/C#...whatever I have to build and test that on every platform that users may have: Linux, Windows, Mac, in all their myriad of different versions.

    Then the potential users have to download and install the resulting binaries. You know what a pain that can be.

    On the other hand I can just send them JS in whatever browser and it works! Amazing.
    JS is close, if it were available on all systems separate from an HTML interpreter and separate from a web browser.
    Please don't tell me you have not heard of node.js. Run JS on your machine, from the command line, no browser required.
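    For example, a trivial script run straight from the command line (the file name here is just an example):

      // hello.js - plain JavaScript, no browser, no HTML
      var os = require('os');                      // built-in Node module
      console.log('Hello from ' + os.platform()); // prints e.g. "Hello from linux"

    Save that as hello.js and run "node hello.js" in a terminal; the same file runs unchanged on Linux, Windows and Mac.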
  • davidsaunders Posts: 1,559
    edited 2015-04-11 17:09
    Heater. wrote: »
    davidsaunders,

    Strangely enough, we do know. Nearly 50% of the world's population is now online. Because that growth has been exponential, most of them came online recently, which means most of them have new hardware and operating systems that can handle it.
    I think you overestimate that. While those still using the good systems are probably less than 3%, I doubt the figure is as low as you think.

    Also, if you look at the rate of growth in personal computers, and at how many people have not upgraded in the last decade, I think you will find that a large chunk of those "newer" computers are 300MHz to 1GHz PowerPC-based Macintoshes running pre-OS X versions of Mac OS, or 400MHz to 1GHz Intel or AMD boxes, likely still running Win2K through Win2K3.
    When I started this project a few years back I was told I was crazy, "nobody has web-sockets" they said, "nobody has webgl", "HTML 5 is too new", "It won't work in IE". I smiled and refused to use FLASH.
    Flash is not the only way, pre-HTML5. There were sites that did the same thing in pure JS as far back as HTML 4.0 (mid-to-late 1990s). And it was fast enough on 100MIPS systems, so you could have done the same with even better effect.
    What do you mean? JavaScript in web browsers does not have access to your local client side file system without user interaction.

    The lowest common denominator is any machine capable of running a recent Chrome, Firefox, Opera or IE browser. That covers most of the human race that has internet access. Who cares about the 0.00001% desperately trying to use the Amigas and Archimedes class machines?
    I think it is a higher percentage than that, especially among those that would be interested in such an application. Maybe if you look at the world population it is somewhere around 0.001% (around 30,000 people still using systems of that era).

    More than that if you are counting RISC OS systems as Archimedes class. There are millions of BeagleBoards, Raspberry Pis, etc. out there, and roughly 4% of those are running RISC OS (some estimates higher, some lower).

    Then there is the fact that the Atari ST/TT/Falcon-compatible systems are making a notable comeback, with multiple clones, some based around the midrange ColdFire CPUs and some based around custom CPUs in FPGAs.
    QTVR does not do what I want. I want real-time 3D data visualization.

    I'm all for writing any program only once. But if I write in C/C++/Pascal/Java/C#...whatever I have to build and test that on every platform that users may have: Linux, Windows, Mac, in all their myriad of different versions.

    Then the potential users have to download and install the resulting binaries. You know what a pain that can be.

    On the other hand I can just send them JS in whatever browser and it works! Amazing.

    Please don't tell me you have not heard of node.js. Run JS on your machine, from the command line, no browser required.
    I have heard of Node.js; it runs on most modern systems. Not all.
  • evanh Posts: 16,041
    edited 2015-04-11 21:28
    About a year ago I noticed that my bank had, for the first time, started offering an alternative scriptless internet banking option. I'm all for eliminating unnecessary scripting, since I find most scripts implement wasteful, burdensome features, but I do find the bank's behaviour a bit odd really: they've had this fancy scripted two-factor login since day one, but now there appears to be a way to bypass it.
  • davidsaunders Posts: 1,559
    edited 2015-04-12 04:53
    evanh wrote: »
    About a year ago I noticed that my bank had, for the first time, started offering an alternative scriptless internet banking option. I'm all for eliminating unnecessary scripting, since I find most scripts implement wasteful, burdensome features, but I do find the bank's behaviour a bit odd really: they've had this fancy scripted two-factor login since day one, but now there appears to be a way to bypass it.
    Sounds like your bank needs to work on the server-side software. If done correctly, it is fairly simple to keep people from bypassing such things in pure HTML. That is one system that is likely better without any scripting.
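    Roughly what I mean, as a minimal Node.js sketch (the route names, cookie name and credential check are placeholders, not any real bank's system): the login check happens on the server for every request, so a plain HTML client cannot skip it.

      // Minimal sketch: plain Node.js, no framework.
      var http = require('http');
      var crypto = require('crypto');

      var sessions = {};                                   // valid session tokens, in memory

      http.createServer(function (req, res) {
        var m = (req.headers.cookie || '').match(/session=([0-9a-f]+)/);
        var loggedIn = m && sessions[m[1]];

        if (req.method === 'POST' && req.url === '/login') {
          // ...verify the posted credentials here (omitted)...
          var token = crypto.randomBytes(16).toString('hex');
          sessions[token] = true;
          res.writeHead(302, { 'Set-Cookie': 'session=' + token, 'Location': '/account' });
          return res.end();
        }

        if (!loggedIn) {                                   // every other page needs a valid session
          res.writeHead(302, { 'Location': '/login.html' });
          return res.end();
        }

        res.writeHead(200, { 'Content-Type': 'text/html' });
        res.end('<h1>Account</h1>');
      }).listen(8080);

    Whether the login form itself uses any JavaScript makes no difference; the server decides.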
  • Heater. Posts: 21,230
    edited 2015-04-13 06:09
    davidsaunders,

    I do understand your appreciation of old and obsolete systems. Atari, Amiga, Archimedes, RISC OS, etc. It's nostalgia for simpler times. I also think it's an important part of history that should be preserved. Hence my little tribute to Gary Kildall and CP/M with a Z80 emulator for the Propeller.

    As my sig. here used to say: "For me, the past...is not over yet."

    Now these may be 1% or 3% or whatever of systems actually in productive use out there. It makes no sense to limit one's developments to machines and systems that have so little usage and severely limit what one can do. It also makes no sense to spend a lot of time creating code that cannot be shared and used by as many people as possible.
    Flash is not the only way, pre-HTML5. There were sites that did the same thing in pure JS as far back as HTML 4.0 (mid-to-late 1990s).
    Are you sure about that? How?

    I had the misfortune to be involved in developing a web store in the late 1990's and that's not how I remember it. There was no canvas, there was no SVG or webgl. There was no way to redraw a page in response to incoming data in any kind of real-time manner. Everything had to be done by a total page reload and re-render, like when hitting that submit button.

    Not until the mid 90's did we get the iframe that could load content asynchronously. And then only in IE.

    At the very end of the 90's we got the XMLHTTP request. Still, actually drawing anything was not possible. Hence the widespread use of FLASH.

    None of this got going until 2005 or so, with Google's use of it in Gmail and apps and the writings of Jesse James Garrett.
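    That XMLHTTP style, once it finally arrived, looked roughly like this (a minimal sketch; the URL and the element id are placeholders):

      // Poll the server and update one element, no page reload.
      function poll() {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/latest-data');               // placeholder URL
        xhr.onload = function () {
          if (xhr.status === 200) {
            document.getElementById('readout').textContent = xhr.responseText;
          }
        };
        xhr.send();
      }
      setInterval(poll, 1000);                         // re-fetch once a second

    Poll like that and you can update one element without touching the rest of the page.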
  • davidsaunders Posts: 1,559
    edited 2015-04-13 17:25
    Heater. wrote: »
    davidsaunders,

    I do understand your appreciation of old and obsolete systems. Atari, Amiga, Archimedes, RISC OS, etc. It's nostalgia for simpler times. I also think it's an important part of history that should be preserved. Hence my little tribute to Gary Kildall and CP/M with a Z80 emulator for the Propeller.

    As my sig. here used to say: "For me, the past...is not over yet."

    Now these may be 1% or 3% or whatever of systems actually in productive use out there. It makes no sense to limit one's developments to machines and systems that have so little usage and severely limit what one can do. It also makes no sense to spend a lot of time creating code that cannot be shared and used by as many people as possible.

    Are you sure about that? How?

    I had the misfortune to be involved in developing a web store in the late 1990's and that's not how I remember it. There was no canvas, there was no SVG or webgl. There was no way to redraw a page in response to incoming data in any kind of real-time manner. Everything had to be done by a total page reload and re-render, like when hitting that submit button.
    I do not know how they did it back then, but I do know that there were some great JS coders back then. Back then you did not see anything like that on commercial sites; it was all on the sites of the hackers that created it.

    Though dynamically modifiable HTML through JS was very much a reality back then, without having to re-render the entire page. JS back then used HTML as its output to change the layout of the page, even dynamically. What you are doing with the newer extensions is the same thing the good old hackers were doing in pure JS to show off back in the late 1990's (something along the lines of the sketch at the end of this post).

    Not until the mid 90's did we get the iframe that could load content asynchronously. And then only in IE.

    At the very end of the 90's we got the XMLHTTP request. Still, actually drawing anything was not possible. Hence the widespread use of FLASH.

    None of this got going until 2005 or so, with Google's use of it in Gmail and apps and the writings of Jesse James Garrett.
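    A sketch of the kind of thing I mean - plain JS rewriting part of the page in place, with no reload (written with today's getElementById for brevity; period code would have poked at document.all or document.layers instead, and the "ticker" element id is just a placeholder):

      var count = 0;
      setInterval(function () {
        count = count + 1;
        document.getElementById('ticker').innerHTML =
          '<b>Updated ' + count + ' times without reloading</b>';
      }, 1000);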
  • davidsaunders Posts: 1,559
    edited 2015-04-13 17:37
    @Heater:
    I use the good systems not just out of nostalgia, but just as much because they are good.

    Atari TOS with GEM is well written, not bloated, has a simple and usable API, and works very well. There is no reason that newer types of applications could not be implemented (heck, there is a port of Firefox 3.6 to TOS, as well as a port of Thunderbird [though they are slow, like they always are]).

    Atari TOS with GEM is better than DR-DOS with the x86 version of GEM. This is largely due to the fact that the Atari coders had to weed out a lot of redundant code in order to get the first version to fit into a 96KB ROM, and this made debugging a lot easier.

    Then there is RISC OS. To be real I like RISC OS because it is the best OS commonly available for the ARM.

    With most common modern OSes there is just too much bloat; so many coders think that the CPU is so fast that they do not need to worry too much about optimizing for speed, and that there is so much RAM that size does not matter. These are wrong views: what good is a faster system with more RAM if you cannot do more with it?

    ___________________________________________________

    A big part of why I want the Propeller 2 to become a reality so badly is that I have designed a 3-cog 68020 emulator for the Prop 2 that relies on the quad-word hub access to maintain speed (calculated to be around 20MIPS on a 100MIPS Propeller 2). That leaves 13 other cogs to emulate the rest of the Atari Falcon HW (including the 56K DSP). Now, with some external RAM (256MB SDRAM takes one cog), that would be my ideal computer.
  • Heater. Posts: 21,230
    edited 2015-04-13 19:40
    davidsaunders,

    Seems we are quibbling over details of the timeline. JS did not even exist until about 1996. Dynamically changing the HTML is nice, but not really useful until you can get data streamed into the browser. That brings us to the end of the 90's, as I said.

    Unless you can point to an example I'm not convinced that what skilful hackers were doing then is at all similar to what one can do with SVG, canvas, webgl, websockets etc today.

    I sometimes miss my old Atari ST520. Though I can't imagine it being much use today. No memory protection, no multi-core support, a clunky old DOS like file system.

    Did they open source GEM? How else do we know how well written it is?

    Do you have a link to FireFox on the Atari? I'd love to see it.
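    Back on the earlier point: the sort of real-time drawing that canvas makes trivial today looks something like this (a minimal sketch; it assumes a <canvas id="plot" width="300" height="100"> element on the page):

      var ctx = document.getElementById('plot').getContext('2d');
      var t = 0;
      (function frame() {
        t += 0.05;
        ctx.clearRect(0, 0, 300, 100);                      // wipe the canvas
        ctx.fillRect(10, 40, 140 + 130 * Math.sin(t), 20);  // redraw a moving bar
        requestAnimationFrame(frame);                       // roughly 60 times a second
      })();

    Nothing in the old DHTML bag of tricks gave you a drawing surface like that.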
  • jmg Posts: 15,175
    edited 2015-04-13 21:44

    A big part of why I want the Propeller 2 to become a reality so badly is that I have designed a 3-cog 68020 emulator for the Prop 2 that relies on the quad-word hub access to maintain speed (calculated to be around 20MIPS on a 100MIPS Propeller 2). .....
    Sounds like a great 'stress test' for a P2 pre-release.
  • mark Posts: 252
    edited 2015-04-13 22:29
    I think modern web browser capabilities are great. Very functional, with excellent cross-platform compatibility. I personally welcome the push to get more software and services to be run out of them. I'm also glad the days of requiring all sorts of browser plug-ins are coming to an end, and hopefully flash follows suit. My biggest gripe is websites stuffed with superfluous features that add nothing. No, your website's navigation menu SHOULD NOT be made with flash. Nor does it need snazzy features in a desperate attempt to make it look cool. That said, in Chrome, I have flash set to "click to enable", so it won't be running flash ads or auto-play videos unless I specifically click on them. My other gripe is the sheer size of sites nowadays. When I used to tether using my cellphone, I was pretty shocked that the average site would download about 3-5MB worth of content per page. Completely ridiculous for most of them.

    Saying that it's acceptable to use dedicated software for all the various things we take for granted in browsers such as FTP, video playing, etc. is equivalent to saying we should do the same for all other software. Ok, I just got to some point in a game where it should play a pre-rendered cut-scene. Hold on, let me exit the game, open the video player, and play the appropriate cutscene video, then when I'm done, re-launch the game. Sounds ridiculous, right? That's because it is. I hate software bloat as much as the next guy, but I'm not sure I'd really want to go back to the tedious way things were.
  • davidsaunders Posts: 1,559
    edited 2015-04-14 03:05
    Heater. wrote: »
    davidsaunders,

    Seems we are quibbling over details of the timeline. JS did not even exist until about 1996. Dynamically changing the HTML is nice, but not really useful until you can get data streamed into the browser. That brings us to the end of the 90's, as I said.

    Unless you can point to an example I'm not convinced that what skilful hackers were doing then is at all similar to what one can do with SVG, canvas, webgl, websockets etc today.

    I sometimes miss my old Atari ST520. Though I can't imagine it being much use today. No memory protection, no multi-core support, a clunky old DOS like file system.

    Did they open source GEM? How else do we know how well written it is?

    Do you have a link to FireFox on the Atari? I'd love to see it.
    A good part of the source for TOS 4.04 was leaked, and the same with a good part of the source of TOS 4.92. And MiNT (recursively, "MiNT is Now TOS") has always been open source; MiNT is the protected-memory, preemptive-multitasking kernel for TOS.

    Any Atari with a 68030 or better that is running either MiNT or Magic does have memory protection.

    And the Digital Research branch of GEM was open sourced in 1998.

    I am searching for the Atari port of Firefox now.
  • davidsaunders Posts: 1,559
    edited 2015-04-14 03:09
    Heater. wrote: »
    davidsaunders,

    Seems we are quibbling over details of the timeline. JS did not even exist until about 1996. Dynamically changing the HTML is nice, but not really useful until you can get data streamed into the browser. That brings us to the end of the 90's, as I said.

    Unless you can point to an example I'm not convinced that what skilful hackers were doing then is at all similar to what one can do with SVG, canvas, webgl, websockets etc today.

    I sometimes miss my old Atari ST520. Though I can't imagine it being much use today. No memory protection, no multi-core support, a clunky old DOS like file system.

    Did they open source GEM? How else do we know how well written it is?

    Do you have a link to FireFox on the Atari? I'd love to see it.
    Also, I do not think that anyone uses the FAT filesystem on the Atari with TOS anymore; I think that more modern filesystems are the norm, like MiNT-FS, MINIX FS, and ext2/3.
  • davidsaunders Posts: 1,559
    edited 2015-04-14 03:30
    @Heater:
    I am not finding the Firefox for Atari TOS at the moment, though when I run across it again I will try to remember to get a link to you.

    Everyone seems to prefer either HighWire or Netsurf on Atari TOS (and I agree with using those two).

    Also, an Atari 520ST is not very usable anymore (8MHz 68000 CPU, 512KB RAM, no SCSI or IDE (only ACSI)). Most use either an Atari TT with at least 32MB RAM (much more is possible, up to 256MB without tricks, more with some tricks), an Atari Falcon with at least 64MB RAM and an accelerator board, a FireBee (ColdFire-based Atari Falcon clone), a Milan 040 or 060 (TT clone with a 68040 or 68060 CPU), or another of the higher-end Ataris or clones. There are ST-only applications that are worth keeping a 1040 STE or a Mega STE around for, though generally the higher-end systems are better for everyday use.

    You see, the Atari has grown way past the ST, and the current clones are great systems. Everyone is waiting for a better CPU to be available, though that is likely tied to better FPGAs being available at a lower price (we are currently stuck with a 400MIPS ColdFire as the best for an Atari clone).
  • mindrobots Posts: 6,506
    edited 2015-04-14 11:44
    Chromebook -> Espruino IDE (Chrome App) -> Espruino

    Sweet! :D
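    For anyone who hasn't seen it: the whole Espruino workflow is JavaScript, so a blinker sent straight from the IDE looks like this (LED1 is the on-board LED on the official boards; other hardware may need a different pin):

      var on = false;
      setInterval(function () {
        on = !on;
        digitalWrite(LED1, on);   // toggle the on-board LED twice a second
      }, 500);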
  • ersmith Posts: 6,068
    edited 2015-04-14 17:16
    Atari TOS with GEM is better than DR-DOS with the x86 version of GEM. This is largely due to the fact that the Atari coders had to weed out a lot of redundant code in order to get the first version to fit into a 96KB ROM, and this made debugging a lot easier.
    Atari GEM got a bit messy towards the end, but then all software does get messy, given time :). The guys who ported GEM to the Atari had to do it in an incredibly short period of time. It was before my time, but most of them were still at Atari when I arrived there. There were some very good coders on the TOS team.
    A big part of why I want the Propeller 2 to become a reality so badly is that I have designed a 3-cog 68020 emulator for the Prop 2 that relies on the quad-word hub access to maintain speed (calculated to be around 20MIPS on a 100MIPS Propeller 2). That leaves 13 other cogs to emulate the rest of the Atari Falcon HW (including the 56K DSP). Now, with some external RAM (256MB SDRAM takes one cog), that would be my ideal computer.

    That sounds very cool. Make sure you put MiNT on it. I'd be pretty tickled to see MiNT running on a Propeller (even if indirectly). Actually, a native Propeller 2 version would probably be feasible; it might be a nice project. It would somehow be appropriate to have MiNT compiled with PropGCC, given that much of the PropGCC library comes from the MiNT C Library.

    Eric Smith
  • davidsaunders Posts: 1,559
    edited 2015-04-14 17:29
    ersmith wrote: »
    Atari GEM got a bit messy towards the end, but then all software does get messy, given time :). The guys who ported GEM to the Atari had to do it in an incredibly short period of time. It was before my time, but most of them were still at Atari when I arrived there. There were some very good coders on the TOS team.
    Yeah, I guess even Atari TOS began to fall victim to the too-many-cooks-in-the-kitchen syndrome as it got on to the 4.xx versions, though it is still a lot better than most competitors.

    I was not aware that you worked for Atari. I have to ask the question (please forgive me): What ever happened to the TOS 5 project beyond TOS 4.92?
    That sounds very cool. Make sure you put MiNT on it. I'd be pretty tickled to see MiNT running on a Propeller (even if indirectly). Actually, a native Propeller 2 version would probably be feasible; it might be a nice project. It would somehow be appropriate to have MiNT compiled with PropGCC, given that much of the PropGCC library comes from the MiNT C Library.

    Eric Smith
    It would take a lot of work to port MiNT, BIOS, XBIOS, TOS, GDOS, VDI, AES, the VDI driver, Desktop.app, XControl and the standard applications. Though I do believe it would be worth the effort; count me in when I can buy a Prop 2 for under $15. A native Prop 2 version may be just what the doctor ordered; add to that a 68K emulator that hands off system TRAP calls to the native Prop-based system and we would have something truly great.
  • jmg Posts: 15,175
    edited 2015-04-14 17:35
    ..we are currently stuck with a 400MIPS ColdFire as the best for an Atari clone.

    If there is a 400MIPS ColdFire, what is the appeal of a 20MIPS emulated MPU on a Prop 2?
  • ersmith Posts: 6,068
    edited 2015-04-14 17:44
    I was not aware that you worked for Atari.
    Check out the MultiTOS copyright screen sometime :) (Or even just the boot messages from MiNT.) There's a good reason I re-used so much of the MiNT C library for PropGCC...
    I have to ask the question (please forgive me): What ever happened to the TOS 5 project beyond TOS 4.92?
    All of the computer related stuff got shut down, and the programmers shifted onto the Jaguar game console. So really there was nothing beyond TOS 4.92. TOS 5 would have been the OS for a next generation Falcon, but that project didn't work out and the management decided that there was more money in game consoles than in computers.

    Regards,
    Eric
  • davidsaunders Posts: 1,559
    edited 2015-04-14 18:56
    jmg wrote: »
    If there is a 400MIPS ColdFire, what is the appeal of a 20MIPS emulated MPU on a Prop 2?
    Simple: I cannot afford a FireBee (the 400MIPS ColdFire Atari Falcon clone), and all my Atari HW is long dead :(. So I have to run emulation on a 68K Macintosh at this time (using an older version of MagicMAC).

    And even with a 50MHz 68030 (overclocked on the TT) the real thing only runs at about 14MIPS. OK, I know that 100MIPS is possible with a 68060 at 66MHz, though it is never achieved with real-world applications (closer to 50MIPS tops in the real world).

    With a 3-cog, 20MIPS emulated 68020 it would be possible to have multiple emulated CPUs while leaving enough cogs to emulate the other HW (I think that 4 emulated 68020s would be possible, using the remaining 4 cogs to emulate the rest of the HW). And with MiNT, and a little work to correctly support multiple CPUs, that could be a big advantage.

    Though now that ersmith has made the suggestion, I am thinking that a native Propeller 2 version of the OS as a whole would be even better, with a 68020 emulator thrown in to support the traditional applications and newer stuff written to run natively on the Prop 2. Kind of like the way that Mac OS implemented support for 68K code on the PowerPC-based computers (though better, as it would not be running on the same core as the native software).
  • porcupine Posts: 80
    edited 2015-04-14 20:01
    Holy smokes, it's Eric R Smith of MiNT fame. Between your name and David Betz of XLisp fame, that's two names I encountered a lot as a teen in my early days on the Internet. Man, I love this forum and this community. :-)

    I have been collecting Atari machines lately. I'd love to hear your stories from your Atari days, Eric.
  • potatohead Posts: 10,261
    edited 2015-04-14 21:38
    This dialog makes me happy. :)

    Atari 800XL owner and user here. I flirted with the ST machines, and really liked what I did do, but ended up dragged into MSDOS land and manufacturing... :( Didn't get happy again, until I was running Moto chips on SGI IRIS boxes.
  • cruXible Posts: 78
    edited 2015-04-14 23:37
    potatohead wrote: »
    This dialog makes me happy. :)

    Atari 800XL owner and user here. I flirted with the ST machines, and really liked what I did do, but ended up dragged into MSDOS land and manufacturing... :( Didn't get happy again, until I was running Moto chips on SGI IRIS boxes.

    I have the 600XL, 800XL and the 1040ST. I just need to plug one in and it is happy times. I think I lost all of my ST floppies, though.
  • koehler Posts: 598
    edited 2015-04-15 02:07
    Ha, had the old Atari 800 back in '80/81. Think it was $800 or close.

    But I saw the light, and went with the Amiga after that.
  • porcupine Posts: 80
    edited 2015-04-15 05:36
    Back in the day I went from VIC-20 to an expanded Atari 520STFM, and lived with that til my parents gave me a 486DX 50 as a graduation present. I don't have that 520 any more, but I have since collected a Falcon, a 520STM, two 1040STFs, a Mega ST, a Mega STe, and now a TT coming in the mail. Plus I have the MiST FPGA emulation box. Oh and I have an Atari 800XL and a 130XE (which needs replacement DRAM). I now see what great machines those 8-bits were.

    I've never owned an Amiga, but I've played with hardware emulation of it on the MiST, and it's pretty nice -- the games are much more responsive than the ST equivalents.

    A lot of missing or undocumented pieces of Atari ST era history have come together over the last few years: the dadhacker blog series was neat, there have been some interesting discussions on the Atari Museum Facebook pages, the Digital Research IP has been open sourced (GPL) (but not, unfortunately, the Atari branches), and that code has been integrated into EmuTOS, which is a full open-source reimplementation of the ST's operating system based on a combination of the DR pieces and new stuff.

    I wouldn't call the Atari's OS combination elegant, but it's not bad for the era. Basically a sort of DOS/CP/M hybrid ported to the 68k: pretty primitive, with no device driver model, no TTY redirection, no pipes or other 'Unixy' features, and certainly no threads or multitasking. Eric's work on top of that was really great, and it has continued to be something of a quite nice OS in FreeMiNT plus the various GEM reimplementations on both classic and 'modern' hardware (FireBee on ColdFire).

    GEM itself was nicely architected, but IMHO incomplete. I'd love to see or do an interview with Lee Jay Lorenzen, who apparently was one of the key designers of GEM at DR, and who had come from Xerox where he had worked on the Star. That's a pretty nice pedigree! I'd love to hear his conception of what he was aiming for in the design and what working in that era and environment was like. He went on to do Ventura Publisher. The AES in GEM is nicely designed, obviously with an eye to an object-oriented, multitasking system, but in the real world I think it was incomplete -- it lacked an underlying OS kernel to back the message-passing, multitasking API it provided, and all the code was non-reentrant... and as a GUI toolkit itself it was very low-level and difficult to write real-life, rich, Mac-like GUIs in. It needed a nicer kernel below it and a higher-level toolkit above it.
  • Sapieha Posts: 2,964
    edited 2015-04-15 06:02
    Hi porcupine.

    Is that emulation open source, and if so, do you have a link to that code?

    porcupine wrote: »
    Plus I have the MiST FPGA emulation box..
  • Heater. Posts: 21,230
    edited 2015-04-15 06:51
    porcupine,
    Holy smokes, it's Eric R Smith of MiNT fame. Between your name and David Betz of XLisp fame, that's two names I encountered a lot as a teen in my early days on the Internet. Man, I love this forum and this community. :-)
    This is a great place, isn't it?

    This answers my question as to why there are so many 5 star threads here. We have so many stars appearing :)
  • porcupine Posts: 80
    edited 2015-04-15 07:32
    Sapieha wrote: »
    Hi porcupine.

    Is that emulation open source, and if so, do you have a link to that code?

    The device itself is of course $$, and IMHO underspec'd, as it's built around a Cyclone III and has VGA output when it would be nice if it had DVI or HDMI.

    All the source is open.


    https://code.google.com/p/mist-board/
  • davidsaunders Posts: 1,559
    edited 2015-04-15 13:59
    cruXible wrote: »
    I have the 600XL, 800XL and the 1040ST. I just need to plug one in and it is happy times. I think I lost all of my ST floppies, though.
    Just use a PC: format the DS/DD disks on the Atari ST, download the software that is available for free (a lot of the ST-class software is), and use the PC to write the files to disk.

    The reason for formatting on the ST is the boot sector: it is slightly different, though close enough for the PC to work with.
  • davidsaunders Posts: 1,559
    edited 2015-04-15 17:54
    Missed this comment earlier.

    So you are the Eric Smith. Good to know.

    As a test run do you think it may be possible to port a subset of the MiNT kernel to the Propeller 1?
  • porcupine Posts: 80
    edited 2015-04-15 18:01
    Fitting it in 32k hub ram would be ... a challenge :-)

    I don't know why you'd want a full OS on the prop?