Complexity of modern computers. Can we even hope to understand?

Ravenkallen Posts: 1,057
edited 2010-09-22 03:09 in General Discussion
I have had this question tumbling around in my head for a while. With the advent of computers so complex that we can't even fully understand all the hardware/software issues, can a single human (with only small-scale-integration ICs and transistors/passives) actually make a computer OR even a microcontroller? On Wikipedia I saw a circuit (that took up four large breadboards) that was a very small computer made from logic gates/registers... This thread is for any discussion about the creation/complexity of the "Ultimate" DIY computer. Can it be done? Or is mankind already out of touch with our machines?

Comments

  • chuckz
    edited 2010-09-17 17:37
    Please share the link to the DIY computer. Please do.
  • Franklin Posts: 4,747
    edited 2010-09-17 18:23
    It all depends on your definition of computer.
  • Martin_H Posts: 4,051
    edited 2010-09-17 18:25
    Yes, it can be done. There's a homebrew CPU webring, and some of the machines on it are amazing.

    Dr. Harry Porter's relay computer and the presentation materials on his web site http://web.cecs.pdx.edu/~harry/Relay/ can help you understand how to build a computer from discrete logic components.

    Once you make the jump to integrated circuits, the Magic-1 computer at http://www.homebrewcpu.com/ takes things further. It is built from 74-series logic chips and is roughly equal to a minicomputer from the late 1960s.

    Moving into the 1970s, the 6502 has only about 4,000 transistors in it, and I've heard of people building their own compatible CPUs using 7400-series chips.
  • Ravenkallen Posts: 1,057
    edited 2010-09-17 19:57
    @Martin_H... Wow, that relay computer is amazing. At least we will have one computer that will survive a nuclear holocaust, haha... But it said it used modern chips for its RAM, so I guess only part of it would survive...


    @chuckz... I have tried to find the Wikipedia page, but I have had no luck. I will keep trying, though...
  • kwinn Posts: 8,697
    edited 2010-09-17 21:55
    Ravenkallen, it is possible for an individual to build a computer using 74xx (TTL or CMOS) series logic. Most of the minicomputers were built that way. A simple 8-bit computer could be built on a single board of approximately 8"x10" using DIP chips.
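
    To make that concrete, here is a minimal C sketch of the fetch-decode-execute loop that such a board implements in hardware. The three-instruction accumulator machine and its opcodes are invented for illustration; they don't correspond to any of the machines mentioned in this thread.

        #include <stdint.h>
        #include <stdio.h>

        /* Toy 8-bit accumulator machine; the opcodes are hypothetical. */
        enum { HLT = 0, LDI = 1, ADD = 2, JNZ = 3 };

        int main(void) {
            uint8_t mem[256] = {
                LDI, 5,     /* acc = 5                           */
                ADD, 0xFF,  /* acc += -1 (two's complement)      */
                JNZ, 2,     /* if acc != 0, jump back to the ADD */
                HLT, 0,
            };
            uint8_t pc = 0, acc = 0;

            for (;;) {                   /* the fetch-decode-execute loop */
                uint8_t op  = mem[pc];   /* fetch opcode and operand      */
                uint8_t arg = mem[pc + 1];
                pc += 2;
                switch (op) {            /* decode and execute            */
                case LDI: acc = arg;                  break;
                case ADD: acc = (uint8_t)(acc + arg); break;
                case JNZ: if (acc) pc = arg;          break;
                default:  printf("halted, acc=%u\n", acc); return 0;
                }
            }
        }

    A 74xx build is, at bottom, a hardware version of that loop: a counter chip for pc, a register for acc, and gating that decodes op and steers the data.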
  • Dr_Acula Posts: 5,484
    edited 2010-09-17 23:16
    I can get my head around TTL and also around simpler microprocessors like the 8080 and Z80. For me, CP/M is probably the last operating system that it is truly possible to understand completely, down to being able to hack every single line of code.

    DOS was modeled on CP/M, and Windows was originally a program that ran on top of DOS.

    I got lost around 1990, but I think it is possible to understand a computer from before then, especially one running as an emulation on the Propeller.
  • Martin_H Posts: 4,051
    edited 2010-09-18 06:34
    @Ravenkallen, I've also heard of people building their own RAM using ferrite cores. So the relay computer plus some ferrite core RAM would be a 100% pure pre-semiconductor computer.

    There's a saying in computer science that everything was invented in the 1950s and everything since has been elaboration. So the abstractions used in modern computers are identical to the abstractions in something like the relay computer, but the details are radically different. I agree with the sentiment that 8-bit CPUs and their run-time monitors were the last generation of technology that one person could fully understand in every detail.

    However, a team of people can cooperatively understand modern computers and OSes, with each person being a subject-matter expert in one area. So it's not as if their function is a complete mystery.
  • pharseid Posts: 192
    edited 2010-09-18 15:47
    I built a simple SIMD computer out of PLDs and discrete logic for a class years ago. Given that it took one semester and it was me doing the building, you can see it wasn't a major undertaking. Software was more of a problem; I literally programmed it in machine language. I think the tradeoff is this: if you design a simple computer, software is the problem, while implementing an existing architecture is probably going to be a lot harder but gives you access to a wealth of software.

    Unless you specifically want to build with discrete components, an FPGA is the way to go today. MIT Press published some great books on the early RISC and VLIW CPUs, and that philosophy of a simple control unit and single-cycle execution is probably the best approach. If you need complex instructions (and I think this is an interesting area for some special-purpose computers), use the VLIW approach and let software sort out the complexity.
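
    As a toy illustration of that VLIW philosophy, here is a short C sketch. The two-slot instruction format and encodings are invented for this example, not taken from any real VLIW machine: each long instruction word carries two independent operation slots, and the "compiler" -- here, whoever wrote the program array -- has already guaranteed the slots don't conflict, so the hardware needs no interlocks.

        #include <stdint.h>
        #include <stdio.h>

        /* Hypothetical two-slot VLIW: both slots execute in the same cycle. */
        typedef struct { uint8_t op, dst, src; } Slot;
        typedef struct { Slot alu0, alu1; } LongWord;

        enum { NOP = 0, ADD = 1, SUB = 2 };

        static void exec_slot(int32_t r[], Slot s) {
            switch (s.op) {
            case ADD: r[s.dst] += r[s.src]; break;
            case SUB: r[s.dst] -= r[s.src]; break;
            default:  break;                 /* NOP: the slot idles */
            }
        }

        int main(void) {
            int32_t r[4] = { 10, 3, 7, 1 };
            /* Scheduling is the programmer's job: the two slots in each
               word must touch different registers. */
            LongWord program[] = {
                { {ADD, 0, 1}, {SUB, 2, 3} },  /* r0 += r1 and r2 -= r3, one cycle */
                { {ADD, 0, 2}, {NOP, 0, 0} },  /* nothing to pair with, so a NOP   */
            };
            for (unsigned pc = 0; pc < 2; pc++) {  /* one wide word per "cycle" */
                exec_slot(r, program[pc].alu0);
                exec_slot(r, program[pc].alu1);
            }
            printf("r0=%d r2=%d\n", r[0], r[2]);   /* prints r0=19 r2=6 */
            return 0;
        }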

    -phar
  • localroger Posts: 3,452
    edited 2010-09-18 19:12
    @Martin, the problem with core RAM is that you need semiconductors to amplify the sense-signal returns, so it's not nuclear-bomb-proof. It is nonvolatile, though, which is something amazing compared to what we take for granted nowadays -- ZERO boot time, as in microseconds.
  • Martin_H Posts: 4,051
    edited 2010-09-19 13:07
    @localroger, I will plead ignorance of the technology used in early core-memory systems. But core memory was invented in the late 1940s and became practical in 1951, while I thought the transistor wasn't invented until 1954. So I thought there was a version that used vacuum tubes in the signal amplifier?
  • Leon Posts: 7,620
    edited 2010-09-19 13:20
    The point-contact transistor was in fact invented in 1948, although it was some time before they could be manufactured reliably. My older brother worked for GEC around the time I started at grammar school, and I remember him giving me one of their books about the transistors they were making at the time; it must have been about 1953. I wanted some, but they were difficult to buy and very expensive:

    http://homepages.nildram.co.uk/~wylie/GEC/GEC.htm
  • localroger Posts: 3,452
    edited 2010-09-19 15:58
    Ah yes, Martin, you're quite right. I think there were a couple of early core computers built with tube sense circuits, and they'd have been pulse-proof. Most of the really early hollow-state computers used things like delay lines, magnetic drums, and CRTs for storage, though. Computer builders were among the first adopters of both transistors and ICs because their wares were so expensive and the resulting economies so useful.
  • LoopyByteloose Posts: 12,537
    edited 2010-09-20 09:11
    Getting back to the main question: can we ever hope to understand today's or future computers?

    I'd say yes, but don't start with Windows or OS X or any commercial OS. Linux offers a tremendous amount of resources, and its open-source code lets you identify exactly what is going on.

    Sure, you may think that Windows is running 90% of the PCs in the world, and you may even be right. But Linux is running 75% of the servers, and Unix is running many that are not Linux. So Linux and Unix are doing the heavy lifting while OS X and Windows milk cash from the rather poorly informed consumer.

    Try O'Reilly publications for good texts. And read some of the Unix classics to get a better foundation in how Linux evolved.
  • RiJoRi Posts: 157
    edited 2010-09-21 13:52
    ... can a single human (with only small-scale-integration ICs and transistors/passives) actually make a computer OR even a microcontroller? ... This thread is for any discussion about the creation/complexity of the "Ultimate" DIY computer. Can it be done? Or is mankind already out of touch with our machines?

    Basically, it depends on what you want your computer to do, and how fast. Home-brew computers did not really take off until the 8080 came out. I took an Electronics Technician with Microcomputers course, and we built our own 4-bit CPU using SSI & MSI -- the MSI being the ALU. Clocking was done via a debounced switch, and the instructions and data were fed in via more switches. Yes, it did work, very slowly.
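
    For the curious, here is a rough C model of what that kind of MSI ALU does: two 4-bit inputs, a carry-in, and select lines that pick the function. The select encoding below is invented for illustration (it is not the real 74181 truth table).

        #include <stdint.h>
        #include <stdio.h>

        typedef struct { uint8_t f; int cout; } AluOut;  /* 4-bit result + carry-out */

        /* sel picks the function; operands are masked to 4 bits. */
        AluOut alu4(uint8_t sel, uint8_t a, uint8_t b, int cin) {
            unsigned r;
            a &= 0xF; b &= 0xF;
            switch (sel & 0x3) {
            case 0:  r = a + b + cin;          break;  /* add with carry     */
            case 1:  r = a + (~b & 0xF) + cin; break;  /* A - B when cin = 1 */
            case 2:  r = a & b;                break;  /* logical AND        */
            default: r = a ^ b;                break;  /* logical XOR        */
            }
            AluOut out = { (uint8_t)(r & 0xF), (int)((r >> 4) & 1) };
            return out;
        }

        int main(void) {
            AluOut o = alu4(0, 0x9, 0x8, 0);  /* 9 + 8 = 17 -> f=0x1, cout=1 */
            printf("f=%X cout=%d\n", o.f, o.cout);
            return 0;
        }

    Wire toggle switches to a, b, sel, and cin, and LEDs to f and cout, and you have essentially the switch-and-ALU trainer described above.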

    Another thing to consider is that reliability drops as the number of vias increases, which is one of the reasons SoCs (systems on a chip) and surface-mount devices are more popular.

    "... is mankind already out of touch with our machines?" If by "out of touch" you mean if we are beyond the point of one person completely understanding the hardware and software of modern computers, as well as the interactions thereof, then my answer is "Yes." Of course, how long has it been since barbers did tooth extractions? Specialization -- despite R.A. Heinlein -- is here to stay.

    I've worked with the T/S 1000, R/S CoCo2 & 3, Rockwell AIM-65, R/S Model 100, and R/S Model 4. I've a C=64 and Apple ][e awaiting my exploration. They are fun to play with, but I would certainly not try to use them to access the Internet. As I said, it depends on what you want your computer to do...

    --Rich
  • Martin_H Posts: 4,051
    edited 2010-09-21 18:36
    @Loopy Byteloose, Linux is a good OS and I like it quite a lot. I just wish hardware support was a little more predictable.

    I have an MSI Wind u100 with Windows XP Home and Linux boot partitions on it. It boots and runs much faster with Linux, and the Linux UI was optimized for the netbook form factor, unlike XP.
  • Peter KG6LSE Posts: 1,383
    edited 2010-09-21 21:37
    Martin_H,
    I see "7" on netbooks and I gag. Why anyone would run THAT on an Atom chip is beyond me...
    Mind you, there is an older lady here at college who has an ARM-based $80 netbook running Win CE. It's kinda cute.
  • LoopyByteloose Posts: 12,537
    edited 2010-09-22 03:09
    Martin_H wrote: »
    @Loopy Byteloose, Linux is a good OS and I like it quite a lot. I just wish hardware support was a little more predictable.

    I have an MSI Wind u100 with Windows XP Home and Linux boot partitions on it. It boots and runs much faster with Linux, and the Linux UI was optimized for the netbook form factor, unlike XP.

    It took a bit of effort to get started, but now everything I have is dual-boot. Almost everything I do with microcontrollers requires that I have a Windows OS.

    The primary problem with Linux hardware support is that Apple and M$ encourage hardware developers to shut Linux out of leading-edge consumer products. The manufacturer sees more revenue, as Apple and M$ have the 'spenders' of the computer world, and both assist in the hardware-driver development that is critical to successful market entry. Sadly, both Apple and M$ would prefer to shut out in-depth learners as well. And THAT is my main point. Similar problems exist with some specialized software applications, as they want to cash in on their development.

    You can easily study Linux on your own, in clear and precise language, or you can buy courses and certifications from Apple and M$ that generally repackage and obscure the obvious.

    The really interesting feature of Linux for the learner is that the development history is not proprietary and convoluted -- one is allowed to study any aspect of it without buying expensive software. Many of the books are in public libraries, as they define modern computing. It also performs much better on hardware that M$ might consider obsolete. (Another plus for the learner.) For instance, my XP computer with 512 MB of DRAM runs at a snail's pace in XP Professional SP3, but zips along quite nicely in Ubuntu 10.04.

    Aside from using Linux on my PCs (XP, Vista, and Windows 7), I have a very tiny version of Linux loaded on my WiFi router. It doesn't have the GUI of Windows, but it is a very useful teaching example for computer networking. Everything is there: the kernel, the file systems, and so on.

    BTW, Windows 7 Starter works fine on my Toshiba NB250, which is an Intel Atom machine (but it takes 10x the space that Linux requires).