The state of the world and computers, etc.
MarkCrCo
Posts: 89
This is my rant about computers and the state of the world. You'll probably figure out that I've been around a long time by the references I use.
First is the user community. Most users have gained their knowledge (if you can call it that) from gaming. They don't use the computer as a tool but are constantly trying to beat it. They look for loopholes and cheats to make the computer do what they want instead of figuring out what the software was actually designed to do and using it properly. The other group of users are the button pushers. They hit F5, press Enter three times, Tab once, then type X, and cross their fingers and hope the computer does what they want it to. They have no clue what any of those steps were designed to do; it has just always worked before, and they hope it will work again. This is the only thing that could actually lead to the HAL 9000 scenario (look it up if you don't understand).
I think that BASIC (Beginner's All-purpose Symbolic Instruction Code) should be renamed ASIC so as to remove the stigma associated with the B. I think everyone should be required to learn reading, writing, arithmetic, and ASIC. That way they would really understand computers.
ASIC should be the language of choice for applications, and the flavors of C should be used only for system programming. ASIC is more self-documenting than any other language. COBOL (COmmon Business-Oriented Language) used to run a close second, but it was stupid that it actually used decimal for calculations and that weird EBCDIC that IBM came up with. COBOL died with Y2K for the most part, as it should have. It caused the Y2K problems and should have been executed for them. C is not self-documenting at all, so the documentation is left to the programmer. That means either there won't be any documentation or it will be documented with techno-babble that won't be understood by most people. I think if programs were more open and accessible to the general population, computers would be more helpful and efficient tools.
I think the operating system should be primarily Windows. Windows is the best compromise of systems. I know it's not the fastest or most efficient, but overall it's the best OS for the masses. Windows 8 needs to stay as a specialized offshoot for those devices needing little or no keyboard input (cell phones and tablets, primarily). Microsoft should not force it on laptops and desktops; those need to stick with Windows 7. UNIX and the Apple OSes need to stay out there just for technical development, but final products for the general public need to be Windows.
OK, there is my rant. These opinions are solely those of the author (me) and are not intended to offend anyone. I just put it out there for constructive discussion.
Comments
I cannot begin to correct all your false assertions or disagree with your misguided conclusions. There are just too many of them.
I'll take your Windows 8 suggestion one step further... users shouldn't force Windows 8 off of its DVD.
In recent months we have been hearing a lot about how programming should be taught to kids in school. I'm sure you have heard of the "Hour of Code":
http://csedweek.org/
http://code.org/learn
and these amazing initiatives backed by such luminaries as Bill Gates, Zuckerberg, etc.
Well, three decades or more ago the computer hit the normal human being in the form of C64s, Sinclairs, Ataris, and so on. At least in England at the time there was a huge push for "computer literacy", i.e. teaching kids programming and computer science so as to prepare them for the "information age" that was obviously coming.
What the f*** happened?
It all went dead when MS and Apple and co. pushed word processors and spreadsheets and closed-source proprietary software into schools, homes, and businesses everywhere.
Grrr...
Luckily we have things like Parallax, the Arduino and the Raspberry Pi Foundation to counter all of that.
The one exception was the introduction of what MarkCrCo might call Visual ASIC, which as of version 4 had a decent (in fact rather impressive) compiler and could run on legacy 16-bit Windows 3.1 systems as well as the newer post-Win95 systems and the 32-bit NT-based business systems. In an era when "hello world" took three pages of code in most visual GUI environments, VB was a refreshing exception.
But hardware hacking got nothing but more difficult and expensive as non-chain hobby electronics shops closed up, Radio Shack turned into a cell phone company, and legacy parallel and serial ports got hard to find. The move to NT-cored Windows for the consumer with XP was the death knell for PC-based hardware hacking because, even if you had a legacy port, you could no longer bit-bang it from the application layer, and generations of DOS-era legacy software became unworkable.
Fortunately a few companies like Parallax stepped up, and then, when the Internet got mature enough to drive things, the big retailers like DigiKey and Mouser replaced the old hobby and HAM shops (and in most ways were better, with same-day shipping); more companies like Sparkfun stepped up to make breakout boards for those hard-to-use SMT parts and other goodies, and of course the Arduino thing happened.
The difference between today and 1985 is that back then you hacked on consumer-level hardware. Today that stuff is out of reach, but we have interfaces that let us use it to operate what we do hack: the Propellers, the Basic (Asic?) Stamps, and other toys that would have been consumer-grade computers in 1985, are still adequate for lots of useful stuff, and are much more approachable than the overcomplicated and under-reliable (often for reasons nobody can articulate) laptops and phones that are today's typical consumer products.
Did someone say CP/M?
Ah yes, well a few years back someone suggested running CP/M as an emulation on a Propeller. More recently other boffins have got it running as an emulation on an FPGA. I think there are other boffins who have got the Propeller running as an emulation on an FPGA. Soon I'll probably be an emulation...
Nifty little OS before DOS came along...
H is right before I
A is right before B
L is right before M
so HAL was one better than IBM across the board.
What code did you try with the WS2812B Fun Board? With just one LED you should be able to power it from the QuickStart's Vin or Vdd line when powered from USB. The WS2812B can be powered with 3.3V and IMO, the LEDs work better this way.
The moment I read your first post in this thread I started smiling and thought to myself: hell, @Heater will chime in here...
I agree with you that ASIC as common knowledge for all users would be wonderful.
I disagree with your statements about COBOL. I have to. It is a wonderful language with almost no need to write comments. Well-written COBOL reads like clear English text. Rear Admiral Grace Hopper was quite a person and invented much of the stuff all other languages are using now. Simple things like the idea that there could be a program that writes programs: the first compiler.
So I am absolutely sure that YOU created that Y2K problem, not COBOL. In fact, there was no actual Y2K problem at all. I 'fixed' tons of COBOL source at that time, and in reality there was not much to fix at all. Most programs were already doing things right anyway. Quite a reading exercise. Not much typing involved.
On the other hand I think you are right that COBOL is not the mainstream language to teach to the world.
@Heater would say that JavaScript is today's (B)ASIC. It runs almost everywhere in a browser and even on some microcontrollers. I dug deep into JavaScript for some time, but am not as impressed as @Heater is. Different goals, different languages.
I think the state of the world is the point here.
Shortly before I left school, the physics teacher was able to get an electronic calculator! Not just one, a whole set of ten. It was a sensation. Those red glowing 7-segment displays. And it was able to add, subtract, divide, and multiply. WOW. One of them even had a printer built in, printing each line of the calculations.
Up to that time everybody had to learn to calculate numbers by themselves. Now nobody does. Even I am using a calculator. But I am still able to do this with pen and paper, or just in my head. I learned it at a time when there was no other way to do it. It was important. Nowadays people do not understand what they are doing or why; they just 'hit F5, press Enter three times, Tab once, then type X, and cross their fingers and hope the computer does what they want it to...'
I still know how to use a slide rule. In theory. Haven't done it for a while. But I still have one. Somewhere.
I'm not sure about the Windows part of your rant, though.
Isn't it already as you describe? Windows runs for the masses, Mac for the people who want to care less about their systems, and Linux for those who want to tinker around more with their computers?
tl;dr:
Nowadays computers are just consumer devices. Having or even programming them is nothing special for most people. Not a goal to achieve or a skill to learn.
Sad!
Mike
Which isn't true for a lot of other people out there - it's routine to see someone in a shop using a calculator to figure out how much 10 items of something priced at 25 (say, $25) adds up to.
Edit: Actually I thought this was that other thread.. that 'senior' discussion. Didn't notice that this one was older.
So, to reply a little bit to the OP's post.. to me, BASIC is not good at all. Not good. Not good. Except if you replace it with a variant like BBC BASIC, which was structured. Writing BASIC (which I did a lot of, back then) teaches you bad habits, and you run into issues with designing your program all the time. BASIC doesn't give me any good feeling at all. Well, Windows doesn't either. Give me any structured language instead, and any operating system where you can put together modules to do what you want. Something you can tinker with. Like the two quite different minicomputer operating systems I used to work with, and *nix-like systems. And oh, CP/M was nice back then too. Now I'm a bit too used to a different working discipline.
-Tor
What stigma is that?
I found a really weird correlation between the awfulness of a programming language and the first letter of its name. For example, all the worst programming languages have names that begin with a "P": Perl, Prolog, PL/M, Pascal, PHP, Postscript, Processing.
The "P" and the "B" are very close in pronunciation so clearly BASIC is awful as well.
But otherwise what is the stigma against "B"? Without "B" we would not have wonderful words like 'utt, 'ollocks and 'eer.
Addendum: Python seems to be a bit of an outsider to my observed correlation, being quite a fine language. The only awfulness of Python is its performance.
Without P we would not have Please or Poppycock.
I respectfully disagree with the first part of this statement. This language was intended for business applications such as payroll, accounting, budgeting, etc. For these applications decimal calculations are absolutely necessary to avoid floating-point rounding errors with fractional values.
And BTW, COBOL could use floating point. I believe it was declared COMPUTATIONAL-1 or COMPUTATIONAL-2, or some such term.
The second part of the statement is also incorrect. COBOL did not "use" EBCDIC; it used EBCDIC or ASCII or whatever the underlying OS/hardware used. I programmed for many years in COBOL on a Data General machine with ASCII.
And for those of you who say that the IBM System/360 only used EBCDIC, that's not true: some models had a setting that allowed the machine to use ASCII.
Finally, please do not blame COBOL for Y2K. It had far more to do with "unit record equipment" and punched cards. When you've only got 80 columns to put an entire record into... something's gotta go.
Just my opinions.
Walter
I was a mainframe programmer/analyst from 1973 till 1994, using COBOL. You would probably be surprised how much COBOL is still being used, especially in financial and health-insurance systems. Huge volumes of daily transactions are one of the considerations, and many companies can't really justify rewriting all of their systems.
There was a Y2K problem, but it was not due to COBOL; it was due to systems designed around a two-digit year (trying to save storage and shorten record sizes when data storage was limited and costly).
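For anyone who never ran into one of those record layouts, here is a minimal sketch, in Python and purely illustrative (not from any real system), of how two-digit-year arithmetic breaks at the rollover, plus the "windowing" trick many of the fixes used. The pivot year of 50 is just an assumed convention.

def years_elapsed(start_yy, end_yy):
    # Naive subtraction on two-digit years, as many old record layouts did.
    return end_yy - start_yy

print(years_elapsed(65, 99))   # 34  -- a policy from '65, checked in '99: fine
print(years_elapsed(65, 0))    # -65 -- the same check in '00: the Y2K failure

def expand_year(yy, pivot=50):
    # Windowing: assume 50-99 means 19xx and 00-49 means 20xx.
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(0) - expand_year(65))   # 35 -- correct again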
Structured COBOL programs (versus "spaghetti code") were effective and could be self-documenting to some degree.
I moved into developing PC systems and then into web development at the last place I worked before retiring. When I got there, technical services wanted to use COBOL for PC programming under OS/2.
I considered that a big mistake and convinced my boss to go with MS Visual Studio. We developed VB6 systems using MS Access and MS SQL Server relational databases. Our success in that area killed any more talk of COBOL on the PC, and of course OS/2 faded away.
Decimal, stupid?
In most modern programming languages on most modern computers we have problems like "0.1 + 0.2 == 0.3" evaluating to false. Why? Because 0.1 + 0.2 = 0.30000000000000004. Why? Because they use binary floating-point numbers under the hood, which cannot represent our decimal numbers exactly.
COBOL, being targeted at counting money, had to have user-friendly numbers. Hence decimal representations.
Every user of Excel and JavaScript has complained about issues like the above at some point, so arguably all computers should use decimal floating-point hardware.
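To make that concrete, here is a quick Python illustration (Python, like Excel and JavaScript, uses IEEE 754 binary doubles for its ordinary floats), next to the decimal arithmetic that COBOL-style money handling relies on:

from decimal import Decimal

# Binary floating point cannot represent 0.1, 0.2 or 0.3 exactly,
# so the sum picks up a tiny error.
print(0.1 + 0.2 == 0.3)    # False
print(0.1 + 0.2)           # 0.30000000000000004

# Decimal arithmetic keeps the values exact.
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True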
It makes good sense to have a language such as COBOL that is dedicated to monetary computations that keep track of every penny and display funds in a format that has long been accepted universally by custom.
But if someone desires to get closer to the machine, Forth on the Propeller is delightful. It tends to lack floating point and focuses on signed and unsigned integers. Decimal numbers are displayed, but they tend to be whole numbers unless you specifically program for smaller units.
While that may all be tedious, it is educational. It really helps learners comprehend that the machine is working in binary and nothing else; decimals only come into play when a human desires to input or output in a convenient format.
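For what it's worth, the same integer-only idea can be sketched in a few lines of Python (the function names are my own, just to illustrate the principle, not Propeller Forth code): keep money as whole cents inside the program and only put the decimal point in at output time.

def dollars_to_cents(text):
    # Parse a string like "19.99" into 1999 whole cents; no floating point anywhere.
    dollars, _, cents = text.partition(".")
    return int(dollars) * 100 + int((cents + "00")[:2])

def cents_to_dollars(cents):
    # Insert the decimal point only when formatting for a human.
    return "{}.{:02d}".format(cents // 100, cents % 100)

total = dollars_to_cents("19.99") + dollars_to_cents("0.01")
print(cents_to_dollars(total))   # 20.00 -- exact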
-Phil
Phil Pilgrim would become hil ilgrim,
Parallax would be arallax
And our dear Propeller would be the ropeller.
In sum, let's stick with the basics: Assembler, Basic, and C.
The scope of those three is ample for a good introduction to computing. (Skip the ++, ##, and other such flourishes).
++++++
Mandating language generalizations just never seems to work.
Just consider the fact that Life is a four-letter word.
Now worried,
-Tor
Would this forum exist without it?
Oh no, a P and a B!
Lovely observation. Nevertheless, a computer language is no more a true language than an honest thief is truly honest. We really do ourselves a disservice when we get caught up in this nonsense.
Face the reality that in the history of computers there was a period of rapid expansion where many, many greedy fools tried to launch their own language to create a franchise that would bring them enormous wealth. Only a few prevailed, and only a few will endure.
It is all similar to breeding cockroaches. You start out with one female that lays a million or so eggs, and you merely get a few thousand... which are more than enough.
What is a "true language", and why isn't a computer programming language one of them? I have a feeling people like Noam Chomsky might have a different view. What do you base that assertion on?
Programming languages have been sprouting like weeds ever since FORTRAN. If we look at the timeline of programming languages on Wikipedia http://en.wikipedia.org/wiki/Timeline_of_programming_languages we can first note that:
a) There are an awful lot of programming languages.
b) The rate of creation and "mutation" of programming languages is about constant over time. And that's only the ones notable enough to make it to Wikipedia.
c) The majority of programming languages were not created by "greedy fools", although many of the best-known ones were, obviously, because they were the ones pushed and marketed.
d) The rate of programming language creation shows no sign of slowing.
Interestingly, we can get an idea of the rate of language creation per decade from the wiki page:
1940s: 9
1950s: 48
1960s: 52
1970s: 56
1980s: 58
1990s: 60
2000s: 44
2010s: 12
Seems the 1990s were a high spot; they brought us many famous and still-popular languages like JavaScript, Haskell, Python, Java, Ada 95, and Delphi.
If we look at a timeline of programming languages like the one at the top of this page, http://www.levenez.com/lang/, or even just a small part of it, like the Forth Family Tree as seen on this page, http://www.complang.tuwien.ac.at/forth/family-tree/, we see a horribly tangled mess of related languages and dialects evolving through time. Starts to look much the same as those "true languages", does it not?