Complexity of modern computers. Can we even hope to understand?
Ravenkallen
Posts: 1,057
I have had this question tumbling around in my head for a while. With the advent of computers so complex that we can't even fully understand all the hardware/software issues, can a single human (with only small-scale integration ICs and transistors/passives) actually make a computer, or even a microcontroller? On Wikipedia I saw a circuit (that took up four large breadboards) that was a very small computer made from logic gates/registers... This thread is for any discussion about the creation/complexity of the "Ultimate" DIY computer. Can it be done? Or is mankind already out of touch with our machines?
Comments
Dr Harry Porter's relay computer and presentation materials on his web site http://web.cecs.pdx.edu/~harry/Relay/ can help you understand how to build a computer from discrete logic components.
Once you make the jump to integrated circuits, the Magic-1 computer at http://www.homebrewcpu.com/ takes things further. It is built from 74-series logic chips and is roughly equal to a minicomputer from the late 1960s.
Moving into the 1970s, the 6502 has only about 4,000 transistors in it, and I've heard of people building their own compatible CPUs using 7400-series chips.
@chuckz... I have tried to find the Wikipedia page, but I have had no luck. I will keep trying, though...
DOS was modeled on CP/M, and Windows was originally a program that ran on top of DOS.
I got lost about 1990, but I think it is possible to understand a computer from before then, especially one running as an emulation on the Propeller.
There's a saying in computer science that everything was invented in the 1950s and everything since has been elaboration. So the abstractions used in modern computers are identical to the abstractions in something like the relay computer. But the details are radically different. I agree with the sentiment that the 8-bit CPUs and runtime monitors were the last generation of technology where it was possible for one person to fully understand all the details.
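To make that concrete, here is a minimal sketch in Python of the fetch-decode-execute loop that the relay computer, the Magic-1, the 6502, and a modern CPU all share. The four-instruction set, opcodes, and memory layout are invented for illustration, not taken from any real machine:

# A toy accumulator machine: the same fetch-decode-execute loop
# used by everything from relay computers to modern CPUs.
# Opcodes and layout are invented for illustration.

LOAD, ADD, STORE, HALT = 0, 1, 2, 3

def run(memory):
    acc = 0          # accumulator register
    pc = 0           # program counter
    while True:
        opcode, operand = memory[pc]      # fetch
        pc += 1
        if opcode == LOAD:                # decode + execute
            acc = memory[operand]
        elif opcode == ADD:
            acc += memory[operand]
        elif opcode == STORE:
            memory[operand] = acc
        elif opcode == HALT:
            return memory

# Program: memory[12] = memory[10] + memory[11]
mem = [(LOAD, 10), (ADD, 11), (STORE, 12), (HALT, 0),
       0, 0, 0, 0, 0, 0,     # padding
       2, 3, 0]              # data at addresses 10-12
print(run(mem)[12])          # prints 5

Pipelines, caches, and out-of-order execution are all elaborations on that one loop.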
However, a team of people can cooperatively understand modern computers and OSes, with each person being a subject matter expert in one area. So it's not as though their function is a complete mystery.
Unless you specifically want to build with discrete components, an FPGA is the way to go today. MIT Press published some great books on the early RISC and VLIW CPUs; that philosophy of a simple control unit and single-cycle execution is probably the best approach. If you need complex instructions (and I think this is an interesting area for some special-purpose computer), use the VLIW approach and let software sort out the complexity, as in the sketch below.
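As a rough illustration of letting software sort out the complexity, here is a toy two-slot VLIW sketch in Python. The bundle format, register file, and operation names are invented for this example; the point is just that each wide instruction is a pair of independent operations the compiler has already scheduled, so the hardware needs no clever control logic:

# Toy two-slot VLIW: each "wide instruction" bundles two independent
# operations that the compiler has already scheduled together.
# Slot format (invented for illustration): (op, dest, src1, src2) or None.

def execute_bundle(regs, bundle):
    ops = {"add": lambda a, b: a + b,
           "sub": lambda a, b: a - b,
           "mul": lambda a, b: a * b}
    results = []
    for slot in bundle:
        if slot is not None:
            op, dest, s1, s2 = slot
            results.append((dest, ops[op](regs[s1], regs[s2])))
    for dest, value in results:   # write back after all slots computed
        regs[dest] = value

regs = [0, 2, 3, 4, 0, 0, 0, 0]   # r0..r7
program = [
    (("add", 4, 1, 2), ("mul", 5, 2, 3)),   # r4=r1+r2 and r5=r2*r3 in parallel
    (("sub", 6, 5, 4), None),               # second slot is a no-op
]
for bundle in program:
    execute_bundle(regs, bundle)
print(regs[4], regs[5], regs[6])             # prints 5 12 7

Reading every operand before any write-back is what lets both slots behave as if they issued in the same clock cycle.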
-phar
http://homepages.nildram.co.uk/~wylie/GEC/GEC.htm
I'd say yes, but don't start with Windows or OS X or any commercial OS. Linux offers a tremendous amount of resources, and the open source code lets you identify exactly what is going on.
Sure, you may think that Windows is running 90% of the PCs in the world, and you may even be right. But Linux is running 75% of the servers, and Unix is running many that are not Linux. So Linux and Unix are doing the heavy lifting while OS X and Windows milk cash from the rather poorly informed consumer.
Try O'Reilly for good texts, and read some of the Unix classics to get a better foundation in how Linux evolved.
Basically, it depends on what you want your computer to do, and how fast. Home-brew computers did not really take off until the 8080 came out. I took an Electronics Technician with Microcomputers course, and we built our own 4-bit CPU using SSI & MSI -- the MSI being the ALU. Clocking was done via a debounced switch, and the instructions and data were fed in via more switches. Yes, it did work -- very slowly.
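For anyone curious what one of those MSI ALU packages actually did, here is a rough 4-bit ALU sketch in Python. The function codes are invented for illustration (this is not the real 74181 function table); the point is that every result is masked to 4 bits and the carry-out falls out of the addition:

# Sketch of a 4-bit ALU of the kind found in one MSI package.
# Function codes are invented, not a real part's truth table.

MASK = 0b1111   # everything lives in 4 bits

def alu(func, a, b, carry_in=0):
    a, b = a & MASK, b & MASK
    if func == "ADD":
        total = a + b + carry_in
        return total & MASK, (total >> 4) & 1   # result, carry out
    elif func == "AND":
        return a & b, 0
    elif func == "OR":
        return a | b, 0
    elif func == "XOR":
        return a ^ b, 0
    elif func == "NOT_A":
        return ~a & MASK, 0
    raise ValueError(func)

# 9 + 8 overflows 4 bits: result 1, carry out 1
print(alu("ADD", 0b1001, 0b1000))   # prints (1, 1)
print(alu("XOR", 0b1010, 0b0110))   # prints (12, 0)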
Another thing to consider is that reliability drops as the number of vias increases, which is one of the reasons SOCs (Systems on a Chip) and surface-mount devices are more popular.
"... is mankind already out of touch with our machines?" If by "out of touch" you mean if we are beyond the point of one person completely understanding the hardware and software of modern computers, as well as the interactions thereof, then my answer is "Yes." Of course, how long has it been since barbers did tooth extractions? Specialization -- despite R.A. Heinlein -- is here to stay.
I've worked with the T/S 1000, R/S CoCo2 & 3, Rockwell AIM-65, R/S Model 100, and R/S Model 4. I've a C=64 and Apple ][e awaiting my exploration. They are fun to play with, but I would certainly not try to use them to access the Internet. As I said, it depends on what you want your computer to do...
--Rich
I have an MSI Wind u100 with Windows XP Home and Linux boot partitions on it. It boots and runs much faster with Linux, and the Linux UI was optimized for the netbook form factor, unlike XP.
I see "7" on netbooks and I gag. who would THAT on a Atom chip is beyond me ..
Mind you, there is an older lady here at college who has an ARM-based $80 netbook running Win CE. It's kinda cute.
It took a bit of effort to get started, but now everything I have is dual-boot. Almost everything I do with microcontrollers requires that I have a Windows OS.
The primary problem with Linux hardware support is that Apple and M$ encourage hardware developers to shut out Linux from leading-edge consumer products. The manufacturer sees more revenue, as Apple and M$ have the 'spenders' of the computer world, and both assist in the hardware driver development that is critical to successful market entry. Sadly, both Apple and M$ would prefer to shut out in-depth learners as well. And THAT is my main point. Similar problems exist with some specialized software applications, as the vendors want to cash in on their development.
You can easily study Linux on your own, in clear and precise language, or you can buy courses and certifications from Apple and M$ that generally repackage and obscure the obvious.
The really interesting feature of Linux for the learner is that the development history is not proprietary and convoluted -- one is allowed to study any aspect of it without buying expensive software. Many of the books are in public libraries, as they define modern computing. It also performs much better on hardware that M$ might consider obsolete. (Another plus for the learner.) For instance, my XP computer with 512 MB of DRAM runs at a snail's pace in XP Professional SP3, but zips along quite nicely in Ubuntu 10.04.
Aside from using Linux on my PCs (XP, Vista, and Windows 7), I have a very tiny version of Linux loaded on my wifi router. It doesn't have the GUI of Windows, but it is also a very useful teaching example for computer networking. Everything is there -- the kernel, the file systems, and so on.
BTW, Windows 7 Starter works fine on my Toshiba NB250, which is an Intel Atom (but it takes 10x the space that Linux requires).