
8-Bit 16-Bit 32-Bit What's the difference?

124 Comments

  • Heater. Posts: 21,230
    edited 2013-02-06 01:03
    CP/M was everywhere in business for a while and lived on a surprisingly long time after the launch of the IBM PC and PC DOS.

    Secretaries all over were using WordStar, and business types had enough databases and spreadsheets to keep them happy. It was a standard across a lot of famous machines like the Osborne, NorthStar Horizon, and Intertec Superbrain. There were even Z80 add-on cards to run CP/M on your Apple II, converting it from a toy to a tool. After IBM stepped in with the PC, many were still buying 8-bit machines from the likes of Amstrad for a long time.

    CP/M was what IBM wanted on their PC. Or at least a 16-bit version of it. Well, as you know, for whatever reason they ended up with MS-DOS. MS-DOS took the standardization further than CP/M; there were a billion different floppy formats in the CP/M world, for example.

    The Amstrads were amazing, with colour and sound and games. Have a Google search for "amstrad cp/m games"; they were selling into the 1990s.
  • evanh Posts: 15,192
    edited 2013-02-06 01:45
    The clones are the key.

    The interesting part is why no particular CP/M hardware clone? Was it just the same situation as the rest? I guess so; the clones couldn't take hold until there was a design that was effectively public domain, and presumably that never happened for CP/M.
  • Tor Posts: 2,010
    edited 2013-02-06 02:03
    Gadgetman wrote: »
    Back in the 80s, soon after VisiCalc was announced, a lot of businesspeople entered computer stores and said "I want VisiCalc". They didn't ask for a Macintosh, they asked for VisiCalc.

    It could have been lime green with polkadots, it could have filled a room. What the computer VisiCalc ran on didn't matter.
    VisiCalc was arguably the first "killer application" in the microcomputer world. The first version was for the Apple II (in 1979, two years before the IBM PC), and Apple II sales exploded. According to one source:
    "The success of VisiCalc turned Apple into a successful company, selling tens of thousands of the pricey 32 KB Apple IIs to businesses that wanted them only for the spreadsheet."
    (The 32KB ones were very expensive; I have an Apple II price list for the different variants glued to my office door... but VisiCalc needed that much memory.)
    I guess it's not too far a stretch to imagine that the success of VisiCalc, and how it made the microcomputer a business tool (with clones like SuperCalc appearing), was what made IBM sit up and take notice. VisiCalc, designed and written by Dan Bricklin and Bob Frankston, was arguably what really changed microcomputers from a niche concept to a mass product.

    As for the regrettable choice by IBM of basing the PC on the 8086 platform (in the 8088 variant)... bad, bad, bad. Everyone who watched the market at the time wished for the 68K. But (and it's truly hard to find real evidence for this these days - I have looked, and I can't find any of the articles I read back in the day) it was basically accidental. The first prototype was based on a design an IBM employee had made at home, for fun, and that one was Intel-based (probably 8085). It was brought into the lab and became the basis for the first internal prototype. IBM was in a real hurry. If that first home-built variant had been Motorola-based, things might have looked different (no segmented memory.. hooray!), but we won't know for sure, of course, because then other concerns would have come in. IBM wanted a second source for the CPU, for example. But even Intel didn't have one at the time - AFAIK they had to scramble and make AMD a second source after IBM approached them.
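
    (Regarding that "segmented memory.. hooray!" aside, for anyone who never met it: a minimal Python sketch of the documented 8086/8088 real-mode address calculation, illustrative only.)

        # 8086/8088 real mode: a 16-bit segment register is shifted left
        # four bits and added to a 16-bit offset, giving a 20-bit physical
        # address (the chip only has 20 address pins).
        def physical_address(segment, offset):
            return ((segment << 4) + offset) & 0xFFFFF

        # Many different segment:offset pairs alias the same byte, which is
        # part of what made the scheme such a headache for programmers:
        assert physical_address(0x1000, 0x0010) == physical_address(0x1001, 0x0000)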

    -Tor
  • Heater. Posts: 21,230
    edited 2013-02-06 02:33
    Evanh,
    The interesting part is why no particular CP/M hardware clone?
    Good question.
    As I recall, in the 8-bit days there were a lot of players all scrambling to grab the market. There was a lot of "innovation" going on, everyone trying to get ahead with more of this, faster that, smaller the other. There were different CPUs, different disks, different video, different operating systems. Things were changing fast and there was no time to think about a standard. And, more importantly, no one was big or influential enough to set a standard.

    CP/M calmed all that down a bit as a standard OS of sorts. But the hardware underneath was constantly moving.

    There was an open standard, of course, from the very beginning. The Altair came with a bus system like the PC's: the S100 bus. However, it only took root in engineering circles where people could knock up hardware to expand their machines via S100 cards. It was probably too big and expensive to make it into the office.

    Then IBM put the hammer down with the PC, its ISA bus, and PC-DOS, and we entered a long and boring era of PC clones, MS-DOS, and so on.
  • evanh Posts: 15,192
    edited 2013-02-06 03:34
    The PC didn't become ISA until 1985. Prior to that it was just the clones copying the XT and AT.

    Couldn't just clone the Altair; that would have been, and prolly was, blocked legally.

    The S100 was only the 8080 on a connector. Not a useful blueprint for the clones.
  • Gadgetman Posts: 2,436
    edited 2013-02-06 03:44
    Eh...

    The PC's expansion bus wasn't labelled 'ISA-bus' until 1988, but it was present in every PC built from 1981.
    (It didn't get that name until 'The gang of nine' introduced the EISA-bus and they felt the need to rename the original bus)

    At first it was an 8-bit bus; then, in 1984, it went to 16-bit with the introduction of the AT-class PCs powered by 80286 CPUs.
  • evanh Posts: 15,192
    edited 2013-02-06 03:48
    Tor wrote: »
    IBM wanted a second source for the CPU, for example. But even Intel didn't have one at the time - AFAIK they had to scramble and make AMD a second source after IBM approached them.
    You could argue Intel have carefully maintained that - keeping AMD pinned in a constant prone position while at the same time being able to say "Here is a second source of binary-compatible CPUs". Bonus: they also get to stave off anti-trust probes.

    The Athlon64 gave Intel a quick bit of excitement, but the marketing machine rolled into action, preventing any serious erosion of sales until the Core2 arrived.

    EDIT: Hehe, yeah, that's right, Dell was told who's boss in the PC hardware market too with that one. Apple launched first with the Core2. Dell had to wait in line, I suspect because they had tried publicly warning Intel the delays were unacceptable. I think that stung pretty bad.
  • Heater. Posts: 21,230
    edited 2013-02-06 05:04
    The IBM PC always had an ISA bus, even if it was not called that originally.

    Perhaps a clone of an Altair was not on, but similar machines, the IMSAI for example, using the same S100 bus were around a lot. You're right, S100 was basically the bus control signals of the 8080/8085-class CPUs, but it was also used in Z80 systems. Perhaps those were not as compatible as one would like, but the Z80 took over, so that was a good standard. Don't forget the IBM PC bus (ISA) was also pretty much the same signals coming out of the 8088 and its bus support chips. So basically nearly as crappy as S100 was.

    As Wikipedia says:

    "The S-100 bus was the first industry standard expansion bus for the microcomputer industry. S-100 computers, consisting of processor and peripheral cards, were produced by a number of manufacturers"

    S100 was indeed targeted at "clones"; it was even an IEEE standard (IEEE-696).
  • evanh Posts: 15,192
    edited 2013-02-06 05:25
    The ISA standard formalised a little more than just changing the bus's name from XT/AT to ISA. There were some fixes made. At any rate, the name ISA didn't exist before that. After all, the PC was certainly no standard.

    The S100 was targeted at compatibles (and a pretty vague interpretation I'd say too), not clones. A rather significant difference.

    Cloning involved just copying an existing computer design, circuit for circuit, part for part. If there weren't schematics that covered everything, then they were stuck. There are no specs to work from; it's just a straight copy, without any care for whether it works or not.

    Interestingly, in typical Wikipedia fashion, having a read of it just now, the S100 bus description of two unidirectional 8-bit data paths is not that of the 8080. So, assuming that description is vaguely correct, the S100 had nothing at all to do with the 8080 or any other CPU I know of.
  • Heater. Posts: 21,230
    edited 2013-02-06 05:41
    Yes the "standard" got tightened up and called ISA. I would say the PC and it's was very much a standard, not in official meaning originally, that's the thing everyone set about cloning after all. It was a re-run of what happened when the Altair bus became the Altair/IMSAI bus became...became S100.

    That's an interesting idea about clones. Thing was apart from the BIOS there wasn't anything much to clone. The IBM PC was pretty much just the circuitry you would see in the data books for Intel's 8088/86 processor and support chips at the time.
  • evanh Posts: 15,192
    edited 2013-02-06 06:05
    Heater. wrote: »
    Yes the "standard" got tightened up and called ISA. I would say the PC and it's was very much a standard, not in official meaning originally, that's the thing everyone set about cloning after all.
    A standard is a formal technical and political entity. It's either an actual agreed standard or it's not. If the agreements are not done, then it's just a name without a standard. So the PC was not a standard in any sense until those agreements were signed off on. In the case of ISA, it didn't even exist as a name until the agreement was made.

    The IBM PC was pretty much just the circuitry you would see in the data books for Intel's 8088/86 processor and support chips at the time.
    Of course, that's true of many designs. That's why a legal basis is needed to hold control.

    No matter the simplicity, it's still a details thing. Details that are easy to get wrong. The clone makers didn't care for working out details when they could just clone.
  • ctwardell Posts: 1,716
    edited 2013-02-06 06:23
    zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz....

    Edit: The content is interesting, the back and forth over minutiae not so much.

    C.W.
  • tritonium Posts: 540
    edited 2013-02-06 06:32
    CP/M
    Ah those were the days.
    I remember dreaming of having a computer with CP/M, but they were SO expensive.
    I was building hobbyist computers. Memory was expensive - hobbyists tended to use static memory, as the cheaper dynamic stuff was too complicated to keep refreshed - and I wanted to add a floppy disc, but I needed an operating system.
    Eventually I gained access to a CP/M machine, and on the system disc was a copy of a skeletal CP/M (i.e. a CP/M that was NOT written for that particular machine, but with generic code which the programmer tweaked to suit their own hardware).
    And that is what was happening. There was a LOT of value in locking in the user to all the other stuff a user needs - software, floppy disks and so on (no hard discs yet!). The different sellers of kit made sure that their CP/M was NOT compatible with anyone else's. They did it by, for instance, making the sector interleaving different, and in other ways, so that if you 'borrowed' or purchased a different manufacturer's stuff it would not work (sound familiar?).
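
    (To illustrate: sector interleaving is just a per-track mapping from logical to physical sector numbers, skewed so a slow CPU has time to get ready before the next sector flies past the head. A rough Python sketch - not any vendor's actual format - of how a skew table is built, and why discs written with different skews were mutually unreadable:)

        # Build a logical-to-physical sector map for one track. A machine
        # whose BIOS assumes a different skew reads the sectors in the
        # wrong order, so the filesystem comes out as gibberish.
        def interleave_table(sectors_per_track, skew):
            table, physical = [], 0
            for _ in range(sectors_per_track):
                while physical in table:          # slot taken, slide to next free
                    physical = (physical + 1) % sectors_per_track
                table.append(physical)
                physical = (physical + skew) % sectors_per_track
            return table

        print(interleave_table(26, 6))  # 26 sectors/track, skew 6: the classic 8" CP/M layout
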
    Anyway, armed with this disc I was able to write my own OS - happy days.
    Remember Amstrad? (Alan Sugar)
    He brought out a machine that was (in the beginning) just a word processor: it ran CP/M, was dirt cheap, didn't look like a toy, and the printer's electronics were in the PC case (as I recall). Identified a market need - sold millions.
    Anyway, when the PC arrived you bought software and peripherals for the PC, not the BRAND. You were no longer locked in, and cloning made stuff affordable.
    Thankfully!
    Gosh I've just re-read the title of the thread - how did this happen!
    Dave H
  • kwinn Posts: 8,697
    edited 2013-02-06 07:42
    Heater. wrote: »
    Evanh,

    Good question.
    As I recall, in the 8-bit days there were a lot of players all scrambling to grab the market. There was a lot of "innovation" going on, everyone trying to get ahead with more of this, faster that, smaller the other. There were different CPUs, different disks, different video, different operating systems. Things were changing fast and there was no time to think about a standard. And, more importantly, no one was big or influential enough to set a standard.

    CP/M calmed all that down a bit as a standard OS of sorts. But the hardware underneath was constantly moving.

    There was an open standard, of course, from the very beginning. The Altair came with a bus system like the PC's: the S100 bus. However, it only took root in engineering circles where people could knock up hardware to expand their machines via S100 cards. It was probably too big and expensive to make it into the office.

    Then IBM put the hammer down with the PC, its ISA bus, and PC-DOS, and we entered a long and boring era of PC clones, MS-DOS, and so on.

    The S100 bus certainly made it into offices in the area I was working in at the time. For every unit I installed in a lab, school, or engineering-related location, I must have installed 5 or 6 in legal, accounting, and other business offices. These were not bare-bones machines either. In most cases they were Z80 CPU, 64K RAM, dual 8-inch floppies, and a Data Products D55 daisy-wheel printer. In a few cases they had a 5 or 10 Meg hard drive.
  • kwinn Posts: 8,697
    edited 2013-02-06 07:43
    ctwardell wrote: »
    zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz....

    Edit: The content is interesting, the back and forth over minutiae not so much.

    C.W.

    Ditto.
  • potatohead Posts: 10,254
    edited 2013-02-06 07:50
    @Heater, I almost went back and edited it out. Was time to go to bed. I think it's funny too.

    Those clones popped up very soon after IBM released the PC. "It's compatible, but cheaper" was all a lot of people needed to hear.

    The weight of IBM + standard mattered. And on that basis, tons of businesses bought the machines and the clones.

    I do wonder about a no-clones policy. At the time, lack of standardization hobbled CP/M. "Home computing" was never really a business consideration, apart from the Apple ][. That left a big gap out there. People were computing, but they wanted more.

    IMHO, a big part of "more" was "who" more than it was "what", so long as "what" could get the job done. Once IBM entered the fray, computers were positioned differently, and IBM meant business -- big business -- and that means a lot. All I know is I saw tons of people who understood the PC wasn't a technical accomplishment of any kind (and it really wasn't) jump on it because it was "the standard", and it didn't take long for software to center on that.

    One huge point in favor of the PC, in the eyes of many who adopted it rapidly, was that IBM wasn't going anywhere, and that meant any investment made was going to endure for a long while. In the CP/M days, you could get a machine and then be stuck with some machine that Bob's Trustworthy Computers built, only to find out Bob was gone... Apple kept trying to kill the ][ to promote a more elegant computer. Had they not done that, perhaps things would have been way different. Apple did have a huge grip on software at that time. Nobody else even came close! And it ran CP/M.

    Apple didn't carry the weight IBM did, though. And that weight was important. Unlike Bob's Computers, IBM really wasn't going anywhere soon. And unlike one of Bob's machines, a PC is just a PC.

    That's how I see it anyway. I've never seen the better tech get a leg up in computing, other than for specific niches. I've also seen people spend amazing amounts of money to leverage software expertise and investments. I've seen competing products, particularly software ones, where the technically superior one doesn't get the massive share that an inferior one gets due to a name or marketing.

    And I believe the core reason for that is that the vast majority of people who can really pay for computers have one basic metric: "can I get X done?" They almost never say, "I need a computer with non-segmented memory." They will ask for graphics, sound, input devices, ports and such, depending.

    However it evolved, the core things people valued were not technical specs, but more broad and general things. Am I seeing the benefit of scale? Can I get business done? Does it work with X? Will my supplier be around next year, or does that matter when I have lots of suppliers to choose from? Etc...

    In response to this thread, I did go and ask a lot of people who were there and who bought PCs. Most gave answers in line with the above. More than a few said one word: "Lotus".

    I am intrigued by the observation that IBM used the Intel reference design. Did Moto have a similar design, by chance? Probably they did. Was it as general as the Intel one was? (Slots, BIOS, etc.?) This may have influenced IBM considerably, and I have very little data. IBM would not have worried so much about technical superiority as they did about a strong alignment with how they saw the PC doing things.

    And I will wait a good long time for that example, evanh. Here's why: I really want to be wrong about it. I do not believe I am, simply because I've challenged it repeatedly over the course of 15 years or so. One of the people I work for and with (it has varied over the years) said this, and I balked then, and we've had a running conversation for that period of time. It has been very interesting and enlightening, and I've never once put a dent in that observation; namely, "things are worth what people will pay for them."

    That observation and its implications have led to doing very good business over an extended period of time, and that tends to be something very hard to refute, particularly when watching others who don't get it fail.

    So please do. It's not ego with me at all in that sense. If you actually do have a solid rebuttal to that, I have a great new conversation starter I can't wait to drop on the table! :) Let's read it. All ears. Nail me, man. I'm asking for it. Nice.
  • Heater. Posts: 21,230
    edited 2013-02-06 11:29
    potatohead,
    I am intrigued by the observation that IBM used the Intel reference design.
    Well, I'm not sure I'd go so far as saying Intel had a "reference design".
    But Intel had their CPU and a bunch of support chips: interrupt controllers, bus arbiters, UARTs, timers, etc. In their manuals at the time were nice clear schematics indicating how to use them.
    There were also the Intellec machines that they made for developers using all that stuff, for which the circuits were published.
    Pretty much anybody could have come up with an IBM PC-like design from that info.
    In fact a bunch of us young guys back in Blighty were working on such ideas as a hobby project.
  • potatohead Posts: 10,254
    edited 2013-02-06 11:42
    Ok then. Moto had that stuff too.

    I can only conclude Intel showed more value and less risk to IBM. Seems to me IBM would have gone shopping and had to select a vendor. The vendor who maximized those two things would be in the lead for selection.
  • ctwardell Posts: 1,716
    edited 2013-02-06 12:03
  • potatohead Posts: 10,254
    edited 2013-02-06 14:41
    Interesting bit there about a full 16-bit machine undermining their other computing products. So Big Blue was really being conservative about it all.
  • evanh Posts: 15,192
    edited 2013-02-06 15:06
    A clarification is needed: compatible is completely different from clone. I wasn't aware there was confusion on this earlier. In fact, I'd kind of forgotten about compatibles, and true blues for that matter, because they were so rare around here.

    A compatible is something that is designed and has its own infrastructure and service support. Often mechanically quite different. It's a totally different cost structure.

    And it involves licensing. IBM would have been licensing the right to make something compatible.

    Clones didn't have any of that. They were dirt cheap for the amount of hardware used to build them.

    Also, licensing doesn't automatically imply any sort of standard.
  • evanh Posts: 15,192
    edited 2013-02-06 15:37
    Gadgetman wrote: »
    The PC's expansion bus wasn't labelled 'ISA-bus' until 1988, but it was present in every PC built from 1981.
    (It didn't get that name until 'The gang of nine' introduced the EISA-bus and they felt the need to rename the original bus)

    Hehe, yeah, it does make sense. Wikipedia has conflicting info again, but the line on the EISA page, where it dates ISA at 1988, actually has a reason behind it.
  • evanh Posts: 15,192
    edited 2013-02-06 16:58
    tritonium wrote: »
    Eventually I gained access to a CP/M machine, and on the system disc was a copy of a skeletal CP/M (i.e. a CP/M that was NOT written for that particular machine, but with generic code which the programmer tweaked to suit their own hardware).
    And that is what was happening. There was a LOT of value in locking in the user to all the other stuff a user needs ...
    I suspect that was the norm, all right, with no single public-domain blueprint for the clones to latch onto. Ditto for S100. The PC was the first (and only) in that respect. Which possibly comes back to IBM letting it happen.
  • localroger Posts: 3,451
    edited 2013-02-06 17:03
    potatohead wrote: »
    I can only conclude Intel showed more value and less risk to IBM. Seems to me IBM would have gone shopping and had to select a vendor.

    It's really very simple. IBM was not going to use a processor family being used by one of their main competitors. Apple was using Motorola, and Tandy was using Zilog. That left Intel.

    At the time, Intel was trying to play a long game to leapfrog the Z80, which had stolen the 8080's thunder by being completely object-code compatible, easier to use, and far superior. The 8085 was out, but it was aimed at the embedded market and was still inferior to the Z80.

    At the time, x86 was a mess. It was ahead of its time, but as a result it was expensive, slow, and used RAM very inefficiently. It was designed to be "source code compatible" with the 8080; you couldn't run 8080 object code on it, but you could run 8080 assembly language through the x86 assembler and get code that would run (which is how a lot of CP/M apps got ported to the PC in the early days).
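
    (A toy illustration of what "source compatible, not object compatible" meant: the opcodes differ entirely, but each 8080 register had a designated 8086 home - Intel's published mapping put A in AL, B/C in CH/CL, D/E in DH/DL, and H/L in BH/BL - so a translator could rewrite instructions one for one. The Python snippet below is just a sketch of that idea, not any real tool's output.)

        # A few 8080 instructions and plausible 8086 equivalents under
        # Intel's 8080-to-8086 register mapping (HL-as-pointer becomes BX).
        translation = {
            "MOV A,B":   "MOV AL,CH",
            "MVI A,42":  "MOV AL,42",
            "ADD C":     "ADD AL,CL",
            "LXI H,100": "MOV BX,100",
            "MOV A,M":   "MOV AL,[BX]",
        }
        for i8080, i8086 in translation.items():
            print(f"{i8080:12} -> {i8086}")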

    But it was what was left after eliminating Zilog and Motorola, so IBM used it. In a move many of us in the day regarded as fraud, they advertised it as "16-bit" even though they used the 8088, the version with an 8-bit path to memory. Well, it did have the same 16-bit ALU as the 8086, but it ran slow as dog poop compared to a Z80 at the same clock. It was only commercially successful because IBM. But there it was.

    And it turned out Intel had really good reasons for doing all that wasteful separation of function; it may have made the 8088 slower than dog poop compared to a Z80, but you could take the same object code that ran on that 8088 and run it on an 8086, then an 80286, then an 80386, then an 80486, then a Pentium, and it will even run on your spiffy 2 GHz modern machine. And not in emulation -- the CPU just runs it. Every major upgrade of other 8-bit micros meant rewriting and recompiling all the software (including Apple's move to the 68000 family). Zilog pretty much fell out of the game and into the embedded world after the TRS-80, Coleco Adam, and Amstrad PCW lines sank beneath the waves. Because of the volume they were selling, Intel had the cash to do first-class R&D and build new fab facilities every few years as the tech improved. Motorola ended up a full generation behind them by the mid-1990s, which is why Apple was forced to move the Mac to Intel (which meant rewriting and recompiling all the software, but Apple had the clout with their customers to make them go through the pain of upgrading).
  • Heater. Posts: 21,230
    edited 2013-02-06 22:49
    localroger,

    I had some, shall we call it "fun", with that Intel 8-to-16-bit compatibility. We were working with 8085-based embedded systems, and I found that I could replace the 8085 with a little piggyback board I lashed up holding an 8088 and one TTL-something chip. The bus signals were very similar, so it just worked.

    Using the Intel tools, one had to run the 8080 ASM code through the conv86 tool to get 8086 ASM, which could then be assembled. Amazingly, our code worked on that Frankenstein board I had made.

    However, it was almost exactly half as fast as the original 8080 code!

    Turned out that the 8080 and 8086 treated some flag bits a bit differently, and a lot of extra instructions had been introduced by conv86 to make sure the flags were set correctly. Using a conv86 option that allowed less strict flag handling gave you a more or less one-to-one instruction translation.

    At that point our code got back to the original 8080 speed.

    After that we laid out a proper 8088 board, with FPU no less, upped the clock speed, and started making use of 16-bit maths in the applications. Then we were flying!
  • Heater. Posts: 21,230
    edited 2013-02-06 22:57
    localroger,

    Yes, I always thought Apple was amazing for the platform swaps they made. Don't forget they did it twice: from 68xxx to PowerPC to Intel.
    It was a huge technical feat, what with all that rebuilding of OS and apps.
    The most amazing part being that they managed to pull all the Mac software houses and customers along with them.
  • potatohead Posts: 10,254
    edited 2013-02-07 00:19
    Both times I was sure it wasn't really going to happen, but it did anyway. One of these days I need to ask people who did it why it made sense.
  • Heater. Posts: 21,230
    edited 2013-02-07 00:47
    "Why it made sense"

    Legend has it that the 68xxx line was falling behind in performance. The PowerPC was pretty hot at the time, so they had little choice.
    As luck would have it for them, the PPC did not work out either, so Intel it was.
    All of which was good practice, I guess, as they could easily go ARM.

    Of course the other amazing thing is they changed operating system as well along the way.

    Still, after all that, I have never used an Apple product.
  • Tor Posts: 2,010
    edited 2013-02-07 02:37
    localroger wrote: »
    It's really very simple. IBM was not going to use a processor family being used by one of their main competitors. Apple was using Motorola, and Tandy was using Zilog. That left Intel.
    It sounds like a likely hypothesis, but there is no evidence at all that this was what happened. Remember also that the first Motorola Apple (the Lisa) came on the market in January 1983, much later than the 1981 IBM PC. IBM had started scrambling in 1979 when they saw Apple II sales booming in the business market (ref. the previously mentioned VisiCalc). As was mentioned by another poster, the m68k wasn't really market-ready in 1979, although I still maintain my earlier remark about IBM's prototype being based on a home-made computer by one of their employees. I _will_ find that article some day... actually there were several, and an interview with the IBM employee in question.

    -Tor
  • potatohead Posts: 10,254
    edited 2013-02-07 02:49
    I didn't use the intermediate ones either. I picked up Mac OS when it became a UNIX. Prior to that, I had very little reason to run it.

    What Apple did makes sense, at least moving off the Moto 68K. Really, I'm wondering what users thought of all that. I do know emulation layers got them through for the most part, but a lot of them ended up with updated applications. Many users regularly update and support their vendors - much more so than on the PC, though there are plenty of PC users that do it too. I just think the percentages are much higher with Mac users.

    Just kind of curious to hear what they have to say. I know I ran IRIX machines and some Linux to avoid Microsoft at times. Would again too, but right now it makes little sense. Win 8 has me thinking about all of it again, and for the first time I'm getting some queries as to options. Apple is able to drag their users along through some really brutal stuff. I'm not sure Microsoft has the same pull.

    Seems to me, going ARM on the computers wouldn't make a lot of sense. Who knows though? If the chips continue to catch up, all bets are off.

    @Localroger: " It was only commercially successful because IBM. But there it was." Yeah, there it was... And I really do think "IBM" had way more to do with that whole thing than anybody really wants to recognize.

    @Heater & LR: Interesting posts. I too objected to it being hobbled with the 8-bit bus. It ran very sloooow. At that time, 6502 / Z80 / 6809 computers easily beat it.

    In general, I jumped on the PC for manufacturing. Once we got past the first couple of iterations, 286 and above, CAD / CAM really became possible, and that's where I started. Ran some Xenix too - Word Perfect multi-user edition to replace some old CP/M stuff, the whole works on WYSE terminals at 9600 baud, except for a couple of shorter-distance ones that would do 19,200. Real CAD / CAM was happening on UNIX and, interestingly, the Mac on 68K. The PC really entered that market with the Pentium computers, quick enough to do solid modeling, and that was the start of the end of the IRIX era for me.

    Right now, I have the entire exterior of a small jet loaded on an i7 machine, and it weighs in at about 5 GB. The nVidia Quadro 1K can spin that model around rather nicely too. For all the warts getting here, I can't really complain about a PC laptop these days. They are insanely fast. I've not clocked the better Macs, but I'll bet they're near this performance - a bit slower, but with far better power management and, of course, a nice GUI and UNIX under the hood.

    IMHO, we've come to a point where it makes sense to just manufacture fast computers... Even cheap-o consumer-grade laptops can do a lot of this stuff, if the graphics are there. Gamer cards differ very little from "Pro" systems - mostly software support for visualization / OGL, etc.

    This whole mobile / ARM business is mostly about going backward now: "all day" type computers that can do lots of stuff, and/or that really narrow down choices. It's kind of bizarre. All my earlier comments about that "Apple ARM" type machine sort of make sense, but not quite. Going backward, limiting choices, makes some sense, but only so much. Are we really going to fragment people that way? Interesting times.