
The state of the world and computers, etc.

MarkCrCo Posts: 89
edited 2015-01-26 02:10 in General Discussion
This is my rant about computers and the state of the world. You'll probably figure out that I've been around a long time by the references I use.

First is the user community. Most users have gained their knowledge (if you can call it that) from gaming. They don't use the computer as a tool but are constantly trying to beat it. They look for loopholes and cheats to try and make the computer do what they want instead of figuring out what the software was actually designed to do and using it properly. The other group of users are the button pushers. They hit F5, press Enter three times, Tab once, then type X, cross their fingers, and hope the computer does what they want it to. They have no clue what any of those steps was designed to do; it has just always worked before and they hope it will work again. This is the only thing that could actually lead to the HAL 9000 scenario (look here if you don't understand).

I think that BASIC (Beginner's All-purpose Symbolic Instruction Code) should be renamed ASIC so as to remove the stigma associated with the B. I think everyone should be required to learn reading, writing, arithmetic, and ASIC. That way they would really understand computers.

ASIC should be the language of choice for applications, and the flavors of C should be used only for system programming. ASIC is more self-documenting than any other language. COBOL (COmmon Business Oriented Language) used to run a close second, but it was stupid that it actually used decimal for calculations and that weird EBCDIC that IBM came up with. COBOL died with Y2K for the most part, as it should have. It caused the Y2K problems and so should have been executed for it. C is not self-documenting at all, therefore the documentation is left to the programmer. That means either there won't be any documentation or it will be documented with techno-babble that won't be understood by most people. I think if programs were more open and accessible to the general population, computers would be more helpful and efficient tools.

I think the operating system should primarily be WINDOWS. Windows is the best compromise of systems. I know it's not the fastest or most efficient, but overall it's the best OS for the masses. Windows 8 needs to stay as a specialized offshoot for those devices needing little or no keyboard input (cell phones and tablets primarily). Microsoft should not force it on laptops and desktops. Those need to stick with WIN 7. UNIX and the Apple OSes need to stay out there just for technical development, but final products for the general public need to be Windows.

OK, there is my rant. These opinions are solely those of the author (me) and are not intended to offend anyone. I just put it out there for constructive discussion.

Comments

  • Heater. Posts: 21,230
    edited 2014-05-12 13:31
    Well, good grief, I have been around for far too long also. BASIC was a dream when I was a kid.

    I cannot begin to correct all your false assertions or disagree with your misguided conclusions. There are just too many of them.

    :)
  • mindrobots Posts: 6,506
    edited 2014-05-12 13:39
    Wow! The world would be a boring place if we didn't all have opinions! :lol:

    I'll take your Windows 8 suggestion one step further.....users shouldn't force Windows 8 off of its DVD. :smile:
  • erco Posts: 20,254
    edited 2014-05-12 13:47
    I like turtles and Windows 8!
  • MarkCrCo Posts: 89
    edited 2014-05-12 13:53
    Go for it, Heater!!! That's why I started this: for discussion. OK, the first languages I learned were WATFOR and WATFIV (versions of FORTRAN), APL, and ASSEMBLER. The first computer I used took up a whole building, so we communicated with it by writing programs which we either keypunched onto cards or paper tape and then sent to the other building. Once a day we had access to this amazing technology where we could actually communicate with the computer in real time with, of all things, the telephone. Then we could play tic-tac-toe with the computer. Oh, and it was about a decade later when they came up with the turtle programming language. Turtle graphics made for a great Etch A Sketch of a language.
  • Heater. Posts: 21,230
    edited 2014-05-12 14:08
    On the other hand...MarkCrCo has the germ of a point there.

    In recent months we have been hearing a lot about how programming should be a thing taught to kids in school. I'm sure you have heard of "Hour of code"
    http://csedweek.org/
    http://code.org/learn
    and these amazing initiatives by such luminaries as Bill Gates and Zuckerberg etc.

    Well, three decades or more ago the computer hit the normal human being in the form of C64's, Sinclairs, Ataris, etc., etc. At least in England at the time there was a huge push for "computer literacy", i.e. teaching kids programming and comp. sci. so as to prepare them for the "information age" that was obviously coming.

    What the f*** happened?

    It all went dead when MS and Apple and co. pushed word processors, spreadsheets, and closed-source proprietary software into schools, homes, and businesses everywhere.

    Grrr...

    Luckily we have things like Parallax, the Arduino, and the Raspberry Pi Foundation to counter all of that.
  • localroger Posts: 3,451
    edited 2014-05-12 18:10
    The 1990's were a real desert for both electronics and programming as hobbies. It was getting very hard to get enough information about how PCs worked to make a professional-looking program without a lot of expensive tools, there were no simple computers to be had, and many components started coming out only in surface-mount form factors for which it was almost impossible to find documentation.

    The one exception was the introduction of what MarkCrCo might call Visual ASIC, which as of version 4 had a decent (and in fact rather impressive) compiler, as well as the ability to run on legacy 16-bit Windows 3.1 systems, the newer post-Win95 systems, and the 32-bit NT-based business systems. In an era when "hello world" took three pages of code in most visual GUI environments, VB was a refreshing exception.

    But hardware hacking got nothing but more difficult and expensive as non-chain hobby electronics shops closed up, Radio Shack turned into a cell phone company, and legacy parallel and serial ports got hard to find. The move to NT-cored Windows for the consumer with XP was the death knell for PC-based hardware hacking because, even if you had a legacy port, you could no longer bit-bang it from the application layer, and generations of DOS-era legacy software became unworkable.

    Fortunately a few companies like Parallax stepped up, and then, when the Internet got mature enough to drive things, big retailers like DigiKey and Mouser replaced the old hobby HAM shops (and in most ways improved on them, with same-day shipping); more companies like Sparkfun stepped up to make breakout boards for those hard-to-use SMT parts and other goodies, and of course the Arduino thing happened.

    The difference between today and 1985 is that back then you hacked on consumer-level hardware; today that stuff is out of reach, but we have interfaces that let us use it to operate what we do hack: the Propellers, the Basic (Asic?) Stamps, and other toys that would have been consumer-grade computers in 1985. They are still adequate for lots of useful stuff, and much more approachable than the overcomplicated and under-reliable (often for reasons nobody can articulate) laptops and phones which are today's typical consumer products.
  • MarkCrCo Posts: 89
    edited 2014-05-12 19:25
    Ah, the early 80's, I remember them well (sort of). That was my transition from those big monster computers to smaller ones. At work I had a portable computer called a KAYPRO running CPM for an operating system. (Oops, the memories are fading; I can't remember what CPM stood for.) It must have weighed about 50 pounds and was the size of a carry-on suitcase. My home computer had Microsoft's very first attempt at an operating system, RS-DOS. See if you know what my home computer was, for a bit of trivia.
  • Dr_Acula Posts: 5,484
    edited 2014-05-12 19:36
    At work I had a portable computer called a KAYPRO running CPM for an operating system. (Oops, the memories are fading; I can't remember what CPM stood for.)

    Did someone say CP/M?

    Ah yes, well, a few years back someone suggested running CP/M as an emulation on a Propeller. More recently other boffins have got it running as an emulation on an FPGA. I think there are other boffins who have got the Propeller running as an emulation on an FPGA. Soon I'll probably be an emulation...
  • MarkCrCo Posts: 89
    edited 2014-05-12 19:59
    I have also decided that all locations should be stated relative to Parallax HQ!!!!! Thanks, Dr_Acula.
  • Sapphire Posts: 496
    edited 2014-05-12 22:47
    CP/M = Control Program for Microcomputers

    Nifty little OS before DOS came along...
  • whiteoxe Posts: 794
    edited 2014-05-12 23:23
    I have thought about what would be the best way to teach programming and an understanding of how RAM and EPROM work, variables, etc. I think the BASIC Stamp or PICAXE would be the best way to go; I know I learnt a lot more about using memory from the PICAXE than from the Visual Basic and COBOL courses I completed.
  • MrBi11 Posts: 117
    edited 2014-05-17 19:08
    H.A.L. reference:

    H is right before I
    A is right before B
    L is right before M

    so HAL was one better than IBM across the board.
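
    In Python, for instance, the shift is a one-liner (just an illustration of the letter arithmetic, nothing to do with the film):

        # Shift each letter of "HAL" one step forward in the alphabet
        print(''.join(chr(ord(c) + 1) for c in "HAL"))  # prints IBM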
  • MarkCrCo Posts: 89
    edited 2015-01-19 21:41
    OK, no one has guessed what RS-DOS was. Hmm.... I finally decided to try a new microcontroller. The store down the street is having a clearance sale (they didn't admit they are closing though). I got a Propeller QuickStart for $3.99. Now what do I do with it? I got that little free RGB LED a month ago and tried to make it work, but no luck so far. I spent more for a 40-pin ribbon cable than I did for the controller. That way I could play with it on a breadboard. I'll have to learn Spin until I can write an ASIC-to-Spin translator sometime in the future. I used to translate ASIC to ASSEMBLER back in 1986. I even wrote a magazine article about how to do that. The funny thing is that I was re-reading that article and in it I happened to mention that the program would stop working on January 1, 2000. I wonder if I was the one who woke people up to the Y2K problem???
  • Duane Degn Posts: 10,588
    edited 2015-01-19 23:02
    MarkCrCo wrote: »
    I got that little free RGB LED a month ago and tried to make it work, but no luck so far.

    What code did you try with the WS2812B Fun Board? With just one LED you should be able to power it from the QuickStart's Vin or Vdd line when powered from USB. The WS2812B can be powered with 3.3V and IMO, the LEDs work better this way.
  • msrobots Posts: 3,704
    edited 2015-01-19 23:06
    @MarkCrCo,

    The moment I read your first post here on this thread I started smiling, thinking to myself - hell, @Heater will chime in here...

    I agree with you that ASIC as common knowledge for all users would be wonderful.

    I disagree with your statements about COBOL. I have to. It is a wonderful language with almost no need to write comments. Well-written COBOL is readable as clear English text. Rear Admiral Grace Hopper was quite a person and invented most of the stuff all other languages are using now - simple things like the idea that there could be a program writing programs: the first compiler.

    So I am absolutely sure that YOU created that Y2K problem and not COBOL. In fact, there was no actual Y2K problem at all. I 'fixed' tons of COBOL source at that time, and in reality there was not much to fix at all. Most programs were already doing things right anyway. Quite a reading exercise. Not much typing involved.

    On the other hand I think you are right that COBOL is not the mainstream language to teach to the world.

    @Heater would say that JavaScript is today's (B)ASIC. It runs almost everywhere in a browser and even on some microcontrollers. I dug deep into JavaScript for some time, but am not as impressed as @Heater is. Different goals - different languages.

    I think the state of the world is the point here.

    Shortly before I left school, the physics teacher was able to get an electronic calculator! Not just one, a whole set of ten. It was a sensation. Those red glowing 7-segment digits. And it was able to add, subtract, divide, and multiply. WOW. One of them even had a printer built in, printing each line of the transactions.

    Up to that time everybody had to learn to calculate numbers by themselves. Now nobody does. Even I am using a calculator. But I am still able to do this with pen and paper. Or just in my head. I learned it at a time when there was no other way to do it. It was important. Nowadays people do not understand what they are doing or why; they just 'hit F5, press Enter three times, Tab once, then type X and cross their fingers and hope the computer does what they want it to...'

    I still know how to use a slide rule. In theory. Haven't done it for a while. But I still have one. Somewhere.

    Not sure about the WINDOWS part of your rant?

    Isn't it already as you describe? Windows XX runs for the masses, Mac for the people who want to care less about their systems, and Linux for those who want to tinker around more with their computers?

    tl;dr;

    Nowadays computers are just consumer devices. Having or even programming them is nothing special for most people. Not a goal to achieve or a skill to learn.

    Sad!

    Mike
  • Tor Posts: 2,010
    edited 2015-01-20 00:32
    Calculators... the same year I started my electronics education, TI came out with the TI-30, way cheaper than most of the alternatives at that time. It made quite a difference - suddenly calculator usage exploded. (Some time later I got exposed to HP calculators and switched to RPN, but that's another story - I still have my TI-30 somewhere.) But the thing is that in school we didn't have much use for a calculator in the mathematics lessons, because those weren't really about adding, subtracting, or multiplying numbers. We were free to use the calculator as much as we wanted for tests, but we still learned mathematics.

    Which isn't true for a lot of other people out there - it's routine to see someone in a shop using a calculator to figure out how much 10 items of something priced at 25 (say, $25) adds up to.

    Edit: Actually I thought this was that other thread... that 'senior' discussion. Didn't notice that this one was older.
    So, to reply a little bit to the OP's post... to me, BASIC is not good at all. Not good. Not good. Except if you replace it with a variant like BBC BASIC, which was structured. Writing BASIC (which I did a lot of, back then) teaches you bad habits, and you run into issues with designing your program all the time. BASIC doesn't give me any good feeling at all. Well, Windows doesn't either. Give me any structured language instead, and any operating system where you can put together modules to do what you want. Something you can tinker with. Like the two quite different minicomputer operating systems I used to work with, and *nix-like systems. And oh, CP/M was nice back then too. Now I'm a bit too used to a different working discipline.

    -Tor
  • Heater. Posts: 21,230
    edited 2015-01-20 01:30
    I'm still wondering about "BASIC ... should be renamed ASIC so as to remove the stigma associated with the B."

    What stigma is that?

    I have found a really weird correlation between the awfulness of a programming language and the first letter of its name. For example, all the worst programming languages have a name that begins with a "P": Perl, Prolog, PL/M, Pascal, PHP, Postscript, Processing.

    The "P" and the "B" are very close in pronunciation so clearly BASIC is awful as well.

    But otherwise what is the stigma against "B"? Without "B" we would not have wonderful words like 'utt, 'ollocks and 'eer.

    Addendum: Python seems to be a bit of an outlier in my observed correlation, being quite a fine language. The only awfulness of Python is its performance.
  • LoopyByteloose Posts: 12,537
    edited 2015-01-20 07:18
    Perhaps Python should be renamed 'ython. I am sure Monty would understand ;-]

    Without P we would not have Please or Poppycock.
  • wmosscrop Posts: 406
    edited 2015-01-20 07:29
    MarkCrCo wrote: »
    COBOL (COmmon Business Oriented Language) used to run a close second, but it was stupid that it actually used decimal for calculations and that weird EBCDIC that IBM came up with.

    I respectfully disagree with the first part of this statement. This language was intended for business applications such as payroll, accounting, budgeting, etc. For these applications, decimal calculations are absolutely necessary to avoid floating-point rounding errors with fractional values.
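
    To make that concrete, here is a rough Python sketch (not COBOL, and not anything from a real payroll system - just the same idea using Python's decimal module):

        from decimal import Decimal

        # Summing a hundred $0.10 charges in binary floating point drifts a little:
        print(sum(0.1 for _ in range(100)) == 10.0)                          # False
        # The same sum done in decimal arithmetic stays exact to the penny:
        print(sum(Decimal("0.10") for _ in range(100)) == Decimal("10.00"))  # True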

    And BTW, COBOL could use floating point. I believe it was called "computational" or some such term.

    The second part of the statement is also incorrect. COBOL did not "use" EBCDIC; it used EBCDIC or ASCII or whatever the underlying OS/hardware used. I programmed for many years on a Data General in COBOL with ASCII.

    And for those of you that say that the IBM/360 only used EBCDIC, that's not true. Some models had a setting that allowed the machine to use ASCII.

    Finally, please do not blame COBOL for Y2K. It had far more to do with "unit record equipment" and punched cards. When you've only got 80 columns to put an entire record into... something's gotta go.

    Just my opinions.

    Walter
  • User Name Posts: 1,451
    edited 2015-01-20 07:54
    I'd have to agree that the B-word was unfortunate and has dogged that language right from the beginning. Still, I don't agree with C being reserved only for system programming. I left BASIC for C many years ago for very good reasons and have never looked back. Even though BASIC has improved, it is still much too cumbersome for the bit-twiddling and data manipulation I need. (It would be akin to swimming in a rain slicker.) If it weren't for C I'd be forced to revert to assembly for everything. And you can't argue that that would make everything more legible. ;)
  • Ron Czapala Posts: 2,418
    edited 2015-01-20 08:44
    msrobots wrote: »
    I disagree with your statements about COBOL. I have to. It is a wonderful language with almost no need to write comments. Well-written COBOL is readable as clear English text. Rear Admiral Grace Hopper was quite a person and invented most of the stuff all other languages are using now - simple things like the idea that there could be a program writing programs: the first compiler.

    So I am absolutely sure that YOU created that Y2K problem and not COBOL. In fact, there was no actual Y2K problem at all. I 'fixed' tons of COBOL source at that time, and in reality there was not much to fix at all. Most programs were already doing things right anyway. Quite a reading exercise. Not much typing involved.

    On the other hand I think you are right that COBOL is not the mainstream language to teach to the world.

    ...

    Nowadays computers are just consumer devices. Having or even programming them is nothing special for most people. Not a goal to achieve or a skill to learn.

    Sad!

    Mike

    I was a mainframe programmer/analyst from 1973 till 1994 using COBOL. You would probably be surprised how much COBOL is still being used - especially in financial and health insurance systems. Huge volumes of daily transactions are one of the considerations, and many companies can't really justify rewriting all of their systems.

    There was a Y2K problem, but it was not due to COBOL - it was due to systems designed around a two-digit year (trying to save storage and shorten record sizes when data storage was limited and costly).
    Structured COBOL programs (versus "spaghetti code") were effective and could be self-documenting to some degree.

    I moved into developing PC systems and then into web development at the last place I worked before retiring. When I got there, technical services wanted to use COBOL for PC programming under OS/2.

    I considered that a big mistake and convinced my boss to go with MS Visual Studio. We developed VB6 systems using MS Access and MS SQL Server relational databases. Our success in that area killed any more talk of COBOL on the PC, and of course OS/2 faded away.
  • Heater. Posts: 21,230
    edited 2015-01-20 08:47
    MarkCrCo,

    Decimal, stupid?

    In most modern programming languages on most modern computers we have problems like "0.1 + 0.2 == 0.3" evaluating to false. Why? Because 0.1 + 0.2 = 0.30000000000000004. Why? Because they use binary floating point numbers under the hood, which cannot represent our decimal fractions exactly.

    COBOL, being targeted at counting money, had to have user-friendly numbers. Hence decimal representations.

    Every user of Excel and JavaScript has complained about issues like the above at some point, so arguably all computers should use decimal floating point hardware.
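
    For anyone who wants to see it in action, a quick Python sketch of the same point (the decimal module is one software route to user-friendly numbers, no special hardware required):

        from decimal import Decimal

        print(0.1 + 0.2)                                           # 0.30000000000000004
        print(0.1 + 0.2 == 0.3)                                    # False
        print(Decimal("0.1") + Decimal("0.2"))                     # 0.3
        print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))   # True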
  • LoopyByteloose Posts: 12,537
    edited 2015-01-20 19:09
    Decimal notation is really a human interface convention.

    It makes good sense to have a language such as COBOL that is dedicated to monetary computations, keeps track of every penny, and displays funds in a format that has long been universally accepted by custom.

    But if someone desires to get closer to the machine, Forth on the Propeller is delightful. It tends to lack floating point and focuses on signed and unsigned integers. Decimal numbers are displayed, but they tend to be whole numbers unless you specifically program for smaller.

    While that may all be tedious, it is educational. It really helps learners to comprehend that the machine is working in binary and nothing else. Decimals only come into play when a human desires to input or output in a convenient format.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2015-01-20 19:52
    heater wrote:
    For example, all the worst programming languages have a name that begins with a "P": Perl, ...
    Now, Heater, you're just trying to start something, aren't you? After all, without the "P", Perl borders on freaking royalty! And I happen to quite like Postscript. It's like Forth, but without the stack management headaches. I designed a lot of cool graphics for customers with raw PS before WYSIWYG drawing programs came into vogue. I also rather like my own initials, although my middle name, which starts with "C", throws the monogram into some disrepute. :)

    -Phil
  • kwinn Posts: 8,697
    edited 2015-01-20 20:36
    If erl borders on royalty without the P then ostscript must border on cheesy.
  • LoopyByteloose Posts: 12,537
    edited 2015-01-21 08:05
    Gosh, I didn't mean that everything should remove the first P.

    Phil Pilgrim would become hil ilgrim,
    Parallax would be arallax
    And our dear Propeller would be the ropeller.

    In sum, let's stick with the basics.... Assembler, Basic, and C

    The scope of those three is ample for a good introduction to computing. (Skip the ++, ##, and other such flourishes).


    ++++++
    Mandating language generalizations just never seems to work.

    Just consider the fact that Life is a four letter word.
  • Tor Posts: 2,010
    edited 2015-01-21 20:24
    But but... first came BCPL. Then, inspired by BCPL, came B. Which transformed into C. (And was side-tracked into C++, which clearly was a mistake, so forget that one.) The next transformation would be the P language, before ending up with the perfect L language. What will happen to P? Should it be called ' ' then? Or just '..'?

    Now worried,
    -Tor
  • No Static At All Posts: 9
    edited 2015-01-22 07:55
    And what about PBASIC?

    Would this forum exist without it?

    Oh no, a P and a B!
  • LoopyByteloose Posts: 12,537
    edited 2015-01-22 08:33
    And what about PBASIC?

    Would this forum exist without it?

    Oh no, a P and a B!

    Lovely observation. Nevertheless, a computer language is no more a true language than an honest thief is truly honest. We really do ourselves a disservice when we get caught up in this nonsense.

    Face the reality that in the history of computers, there was a period of rapid expansion when many, many greedy fools tried to launch their own language to create a franchise that would bring them enormous wealth. Only a few prevailed, and only a few will endure.

    It is all similar to breeding cockroaches. You start out with one female that lays a million or so eggs, and you merely get a few thousand... which are more than enough.
  • Heater. Posts: 21,230
    edited 2015-01-23 01:40
    Loopy,

    What is a "true language" and why isn't a computer programming language one of them? I have a feeling people like Noam Chomsky might have a different view.
    ...in the history of computers, there was a period of rapid expansion when many, many greedy fools...
    What do you base that assertion on?

    Programming languages have been sprouting like weeds ever since FORTRAN. If we look at the timeline of programming languages on Wikipedia, http://en.wikipedia.org/wiki/Timeline_of_programming_languages we can first note that:

    a) There are an awful lot of programming languages.
    b) The rate of creation and "mutation" of programming languages is about constant over time. And that's only the ones notable enough to make it to Wikipedia.
    c) The majority of programming languages were not created by "greedy fools", although many of the most well known were, obviously, because they were the ones pushed and marketed.
    d) The rate of programming language creation shows no sign of slowing.

    Interestingly, we can get an idea of the rate of language creation per decade from the wiki page:

    1940's 9
    1950's 48
    1960's 52
    1970's 56
    1980's 58
    1990's 60
    2000's 44
    2010's 12

    Seems the 1990's were a high spot; they brought us many famous and still-popular languages like JavaScript, Haskell, Python, Java, Ada95, and Delphi.
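
    For the curious, here is a minimal Python sketch of how such a per-decade tally can be pulled together from (year, name) pairs - using only a small illustrative subset of languages, not the full Wikipedia timeline:

        from collections import Counter

        # A handful of well-known languages and their approximate first-appearance years
        languages = [
            (1957, "FORTRAN"), (1958, "LISP"), (1959, "COBOL"),
            (1964, "BASIC"), (1970, "Forth"), (1972, "C"),
            (1987, "Perl"), (1991, "Python"), (1995, "Java"),
            (1995, "JavaScript"), (2009, "Go"),
        ]

        # Bucket each year into its decade and count
        per_decade = Counter((year // 10) * 10 for year, _ in languages)
        for decade in sorted(per_decade):
            print(f"{decade}s: {per_decade[decade]}")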

    If we look at a timeline of programming languages like the one at the top of this page: http://www.levenez.com/lang/ or even just a small part of it, like the Forth Family Tree as seen on this page: http://www.complang.tuwien.ac.at/forth/family-tree/ we see a horrible tangled mess of related languages and dialects evolving through time. Starts to look the same as those "true languages", does it not?