How come so many 5 star threads here?
Heater.
Posts: 21,230
How come nearly all the threads on the current first page of the Propeller 2 Multicore Microcontroller sub-forum have earned 5 stars, regardless of how pointless they are?
I don't mind. I just think it's a bit odd.
Comments
Never used them myself. Not sure you can even do it in the state the forums are in.
Or is the forum itself starring things it likes?
I just checked, and this one has it. Look at the top of the thread, upper right, below the thread title. Should be there in your browser. And if it is, you can use it to rate the thread and that's where the stars come from.
Everything P2 was five stars because we want the thing actualized! Yesterday.
So it does. And it works. This junk is now a 5 star thread!
I normally don't pay any attention to this kind of stupid social media fluff.
Personally I think all these threads should be minus 5 stars. We want the chip out. Ergo we don't want Chip to be distracted by flashy threads.
Oh, poo, actually only 3 stars!
Doesn't appear to be there without scripting. The mass of stars appear for me though.
I'm crushed.
Get NetSurf or Dillo and the concern goes away.
I'm posting this from its bigger brother, links2, which does at least show avatars and images on the page.
And here is how things look in a "real browser", the forum in links2 style:
It's actually hard to get links2 to use its horizontal scroll bar. It just keeps folding everything up into a narrower and narrower width as you shrink the window, until it hits the width of the Parallax logo.
So it's a horizontal scale, not a scroll. OK. Fine.
So it's a window-size scroll bar. That's OK. Fine, too. Whatever.
Yeah, borked for sure.
Somebody needs to make an April Fools Desktop Window Manager. Redefine all the widgets. Somebody somewhere will absolutely love it, and forever be ashamed to admit it to anyone else, secretly running it when nobody is around.
On a serious note, does it animate the gif?
It does have a -g option for graphical mode. Which is rude and crude but does display images. No it does not animate gifs.
It actually makes visiting slashdot a dozen times faster!
All in all I think you might like it.
Mind you, there is nothing wrong with ECMA-262. It's a wonderful language.
The problem is the browser environment that it normally lives in.
For example:
If I point my browser at web site A, which I may know and trust a little bit, how come the browser is then quite happy to fetch all kinds of stuff from websites B, C, D... that I did not ask for?
Which will include fetching further ECMA-262 code, which in turn can be laden with malintent.
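To make that concrete, here is a minimal sketch of how the cascade happens; every host name below is made up:

<!-- A hypothetical page served from site-a.example. Merely loading it
     makes the browser fetch and run code from hosts you never asked for. -->
<script src="https://site-b.example/analytics.js"></script>
<script>
  // Worse, a fetched script can itself inject yet more script tags,
  // so code from site-d.example now runs with no action from the user.
  var s = document.createElement('script');
  s.src = 'https://site-d.example/more-code.js';
  document.head.appendChild(s);
</script>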
Who ever thought that was a good idea?
Netscape? And MS copied it.
That's the reason why one of the first browsers was called MOSAIC. The basic idea was to collect information from different sources and show it together. Linking resources, not copying them, seemed a good plan at that time.
And an a-href tag looks quite similar to an img tag or a javascript tag; they all use a URL, a Uniform Resource Locator. That still sounds like locating some resource in the World Wide Web, not on one website.
So it is implicit by design. It was the goal to be able to do that, right from the beginning of the web at CERN in Switzerland. Yes, Switzerland. Al Gore did not do it. The Swiss did.
HTML was designed to do that.
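Just compare the tags; a sketch with a made-up host name, where only the first one waits for the user:

<a href="https://other-site.example/page.html">fetched only when clicked</a>
<img src="https://other-site.example/picture.png">         <!-- fetched automatically -->
<script src="https://other-site.example/code.js"></script> <!-- fetched AND executed automatically -->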
Enjoy!
Mike
I was thinking about that. You are right, it all starts at CERN with Tim Berners-Lee.
Tim suggested this idea of hyperlinks in documents, linking across files and across servers. He was thinking big, so it's the World Wide Web.
So far so good. There is the link in your document. It's displayed in blue and/or underlined. You can click it if you want. Then the new resource is fetched from whatever site it points to.
Well, it's one thing to have the convenience of links to further information embedded in whatever you are reading. It's a totally different thing to have whatever those links point to downloaded without your asking. It's not clear to me that the idea was that things got fetched automatically. You had to ask for them by clicking the link. Here is an original page from the first web site: http://info.cern.ch/hypertext/WWW/TheProject.html
As you say, it was Marc Andreessen, of Mosaic and later Netscape, who introduced the image tag. Is that where the rot started? It makes sense to have the image downloaded and displayed in a document automatically, and it makes sense that the image could be a local file or on a remote server.
Later comes a killer: Netscape introduces the EMBED tag. Enter FLASH... And of course, Netscape gives us the glorious JavaScript. Not only do these now download without any user request, but they also execute code!
So it was a gradual slide as the user lost control of his machine and it was given to advertisers and malware authors.
Edit: Not so gradual I guess, this all developed very quickly, no one had time to think (or care) about the security nightmare and mess they were making.
Though I preferred the internet how it was in the 1980's. That is, the protocols we used most were GOPHER, ftp, Telnet, smtp, pop, etc. There was no WWW yet, no HTTP, and in many ways it was better.
Though we must live with that which has become popular, unfortunately. Why people think the WWW is so great I do not understand. There is nothing done with "web pages" that cannot be done better with other protocols, and without the junk of modern HTML and scripting. A board (like these forums) was done through telnet (or nntp); to view pictures you explicitly downloaded them; to view video (yes, there were videos online back then) you downloaded and watched the video.
Why do we need HTTP? We do not. It was created to attempt to make things easier, and for a time it worked, but it has long since failed.
Personally I think web technology is a steaming pile of, well, you know. Which I love to complain about. That tangled spaghetti soup of HTTP, HTML, XML, CSS, JS, template languages, PHP, frameworks, the browser DOM API, embedded SQL, etc, etc is a nightmare. I mean, how many different languages is it sensible to mix up in a single source file?
I had the misfortune to be employed building web sites with PHP and MySQL back in the late 1990's. I quickly decided it was horrible and vowed never to go near web development again.
Recently I have been getting back into web stuff for fun and profit. Lots of things have changed. And it's an even more tangled spaghetti soup.
BUT, to your proposition. If we were to scrap all that and use your "other protocols" what would they be and how would they be better?
We need some kind of protocol on the wire. We could swap HTTP for FTP or Telnet or whatever, but how does that make anything better? (See the sketch at the end of this post.)
We need some kind of format for the delivered data. We could swap HTML for a different markup, say TeX. What would you suggest? How would that help?
We need some way to run code in a page. We could swap JS for some other language. Which would you suggest? How would that help?
It seems this complexity and mess is in the nature of the problem and what we have is not such a bad solution.
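On the wire-protocol point: HTTP is already about as simple as it gets, just lines of text over a TCP socket, much like driving Telnet by hand. A sketch in JavaScript under Node (assuming its standard net module), fetching that original CERN page linked above:

var net = require('net');
var sock = net.connect(80, 'info.cern.ch', function () {
  // An HTTP/1.0 request: two lines of plain text and a blank line.
  sock.write('GET /hypertext/WWW/TheProject.html HTTP/1.0\r\n' +
             'Host: info.cern.ch\r\n\r\n');
});
sock.on('data', function (chunk) { process.stdout.write(chunk); });

Swapping that for FTP or Telnet would change the verbs on the wire, not the nature of the problem.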
No no. Get rid of the markup; use the text formatting that used to be common through things like telnet, with menu based navigation. For linking to other stuff, the protocol is there with FTP and GOPHER; do not bother with anything new, it is all already there.
For files use FTP and GOPHER. Etc, etc.
In other words, use the internet as it was before HTTP.
But that's what I'm asking?
What text formatting would you suggest? There have been many over the course of computer history, going back to things like RUNOFF and troff. Which would you use? How would it be better? As a minimum it needs to support hyperlinks.
I'm going to have a hard time delivering my 3D real-time data visualizations using FTP and GOPHER.
What about videos in documents?
Menu based navigation is just not going to cut it.
VT100 formatted text is common for Telnet, and most telnet clients also support PC extended-ASCII text characters.
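That formatting is nothing more than escape sequences in the text stream. A sketch in JavaScript under Node, just to show the idea (the sequences themselves are standard VT100/ANSI):

var ESC = '\x1b';
console.log(ESC + '[2J' + ESC + '[1;1H');            // clear screen, cursor home
console.log(ESC + '[1;34m' + 'Menu' + ESC + '[0m');  // bold blue heading, then reset
console.log(ESC + '[7m 1. Threads ' + ESC + '[0m');  // reverse-video menu item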
For 3D over the net, use a streaming data protocol with custom client software (or use one of the standard protocols for streaming 3D from the 1990's).
Though if it is data visualization, would that not make more sense as a program on the system it is being viewed on, rather than doing it over the net?
We have had diagrams, drawings and pictures in books for centuries. It would be great if they could also move, like in Harry Potter. Now we put documents on screens driven by computers, so why not have the pictures in there? Well, we do. And why not make them move if need be? Having to divert one's attention from whatever one is reading to go and find and download a vid and then fire up a player for it is not what people want.
I love plain text and ASCII as much as anyone. Have a look at my laboratory here: http://the.linuxd.org/lab/ But seriously, this is 2015; we have nice screens that can support fancy fonts and images and video.
Run my data visualizations as a program on the system where they are being viewed... No.
Thing is, the data is in one place, or at least it is collected in one place. The data is changing all the time. We want people to get a 2D or 3D view of that data in as near real-time as it happens. We have to stream it over the net to those people, wherever they are.
Now, we don't know what operating system they are using or what machine they have. They don't want to be installing special software to view our stuff. We don't want to have to create and maintain versions of the same program to run on Linux, Mac, Windows and anywhere else (BSD, Android, iOS, Jolla, ChromeOS, Firefox OS, etc, etc) now and forever into the future.
How would you solve this problem?
One early approach was Java: a language that I loathe, which has no reason to exist, does not solve the problem in general, and is luckily going out of fashion.
The current approach is a web page, JavaScript and WebGL. Works a treat.
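The skeleton of it really is tiny; everything else is buffers and shaders on top. A minimal sketch:

<canvas id="view" width="640" height="480"></canvas>
<script>
  var gl = document.getElementById('view').getContext('webgl');
  if (!gl) {
    document.body.textContent = 'No WebGL here.';  // graceful fallback
  } else {
    gl.clearColor(0.0, 0.0, 0.2, 1.0);  // dark blue background
    gl.clear(gl.COLOR_BUFFER_BIT);      // one frame; a real visualization
                                        // loops with requestAnimationFrame
  }
</script>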
You do not know what system they are running, or whether they are using a web browser that supports JavaScript, or whether they have an OpenGL implementation to use with WebGL.
To be realistic, I like HTTP, and the way it was up till HTML 4.01. JavaScript can be useful, if we could get rid of the local FS access client-side stuff in the variant used in HTTP clients (aka web browsers).
Assuming the availability of JS in all cases, and the availability of WebGL in all cases (OpenGL is available for 16MHz systems, as are web browsers that support JS and HTML 4), I highly doubt that an interface like what you speak of would be useful on the lowest common denominator, the lowest common denominator being the lowest-end system that can do well with HTML 4, WebGL and JS.
Though you could have a self-refreshing 3D model using one of the common plugins embedded in the page (like QTVR; I have seen implementations of QTVR that do well on 12 MIPS integer-only processors, without acceleration of any type). You see, QTVR has even been ported to the Atari TT/Falcon; that is what I would call the lowest common denominator.
Though without HTTP another solution would be needed. And what you're doing already is opening a separate program on the client system (written in JS and interpreted) to view the rendering, so what is the difference between writing the program and writing the program?? The trouble I see is that there is not a good platform-independent environment available for ALL of the possible systems people will be using. JS is close, if only it were available on all systems separate from an HTML interpreter and separate from a web browser.
Video in documents is awesome! I can build material that people can learn from in compelling ways, easy and lean. Doing this takes very little time, and when we set those things aside in a link, etc., consumption of them drops dramatically. Why author things people won't use?
One could make the case for video and audio not being necessary, and for those of us who grew up on text, I would agree. Many of us can work either way, and we often do. However, people who didn't grow up with this work very, very differently. Either we want to be relevant or not.
Re: 3D
Protocols, data schema, etc... from the 90's have a whole pile of painful problems. Requiring a client installation of some sort is often not permitted due to engineering and security / secrecy requirements in place today.
Current trends are to host CAD and data manage it in ways very similar to how we do code. Doing this means having a channel from the server and various data sets through to the user interacting with it all. The 90's was a very clunky time and only a few big companies ever managed to get efficiencies from those protocols, schemas, install software, etc...
Today, I can build a model on anything, upload it, share that with a friend, and we can actually work on the model together, concurrently, as well as perform basic tasks, such as markup without having much. It even works on my phone!
1+ GHz systems are the norm for these tasks, and the "lowest common" denominator. Under 100 MHz is a relic. A fun relic, but not viable, practical, nor even possible for the vast majority of 3D interaction use cases.
I agree that what we have is not a bad solution. The internet has made a huge improvement in how we gather and share information, shop, communicate, and learn. That doesn't mean it is problem free or that the components that make up the web don't need to be improved.
The components may not be perfect, but after using a well done HTML service manual I am convinced that that is the way to publish manuals, text books, or any other documents aimed at presenting technical information in text format.
Perhaps more could be done on the browser end to reduce some of the problems.
On a related note, those evil tracking scripts need to be eliminated. Again it's the advertising industry to blame; they should be aiming for subject-based ads rather than profile-based ads. Ditch the tracking altogether.
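One lever that already exists on the browser end is Content-Security-Policy, which lets a page refuse third-party scripts outright. A sketch as a meta tag, with a made-up host standing in for the one CDN you do trust:

<meta http-equiv="Content-Security-Policy"
      content="script-src 'self'; img-src 'self' https://trusted-cdn.example">

Of course that only helps when the site author wants it; on the visitor's side it is script blockers or nothing.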