X is pretty much redundant nowadays.
In what way would X, over the network, be of any use for accessing this forum? Or Gmail? Or my bank? Or a million other services we get over the web nowadays?
X is not even an efficient way to do what it does. Too many round trips over the network. Just listen to the people who actually work on X; they can tell you all the gory details.
No, what we need, and what we have, is damn fast processors, even on phones, that can render graphics directly to the screen, perhaps with the help of a GPU. Only the higher-level information needs to travel over the network link.
For example:
https://www.babylonjs.com/demos/v8/
How are you going to deliver that to an arbitrary number of users using X?
Well, I can expand on my graphics-as-a-service comment.
The simple beauty here happens once a lean, network-transparent protocol gets baked in.
On localhost, it can be optimized to near direct-access speeds. With smart coding, say pushing display lists and other structures before rendering, performance can be on par for all but the most demanding games.
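Roughly what I mean, sketched in Python. The wire format and the render service here are entirely made up for illustration; the point is that the heavy structure crosses the link once, and each frame after that is just a short reference:

    import json, socket

    # Hypothetical protocol: newline-delimited JSON commands sent to a
    # hypothetical render service. DEFINE uploads a display list once;
    # CALL replays it by id, so per-frame traffic stays tiny.
    def send(sock, msg):
        sock.sendall((json.dumps(msg) + "\n").encode())

    sock = socket.create_connection(("localhost", 7000))  # assumed service

    # The heavy geometry crosses the wire exactly once...
    send(sock, {"cmd": "DEFINE", "id": "gear",
                "ops": [["moveto", 0, 0], ["lineto", 10, 0], ["lineto", 10, 10]]})

    # ...then each frame is a reference plus a transform.
    for angle in range(0, 360, 5):
        send(sock, {"cmd": "CALL", "id": "gear", "rotate": angle})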
Direct access does make some sense, generally when compute and/or RAM are at a premium. Dynamic generation does squeeze the max out of things, and it makes sense at small scale, like with our Propeller chips, though Baggers did produce a nice graphics engine, as have others, that operates over a wire.
For most things, it's not needed at all.
The other benefit comes from being able to drive the graphics from multiple machines, users, and/or processes.
Say I want to use a more robust UI, input and display, with my cellphone. Mine actually does that, but it's a screen scraper. It works, but it's clunky.
When graphics are a service, everything changes. And as far as I'm concerned, WebGL, JS, and friends contain all the pieces.
Most any app would have a command-line component and a graphical one. Automation happens through the command line: STDIN, STDOUT, and friends. Graphics happen in a similar fashion, just a stream piped to a service. Anything talks to anything.
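Something like this, to make it concrete. The one-JSON-object-per-line stream and the render-service name are invented, not any existing protocol; any structured stream would do:

    import json, sys

    # The app knows nothing about screens. It just writes drawing
    # commands to STDOUT, one JSON object per line; a local renderer,
    # a remote service, or a plain log file can consume the stream.
    def draw(op, **params):
        sys.stdout.write(json.dumps({"op": op, **params}) + "\n")

    draw("clear", color="black")
    draw("text", x=10, y=20, s="hello from the pipe")
    draw("line", x1=0, y1=0, x2=100, y2=100)

Then composition is just plumbing: myapp | render-service --session alice, with both of those names being hypothetical.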
What we have now is this odd hodgepodge of things that require a lot of specific code, and an OS fairly tightly coupled to its graphics.
Look at CAD in the cloud. Today, decent experiences are possible. It's X with a smarter protocol underneath. Those experiences approach what was being done via GLX in the 90s.
Had the service idea taken root, we would be in a much more advanced place today. I place the blame on the more basic, kludge-like graphics stacks, with their single-user, single-machine assumptions baked in. Efficient, but not well aligned with where we are going.
One thing X was crappy at was display host changes. It could be done, but it was risky, and only well-behaved apps would manage the transition. The "web" protocols we are building on stand a much better chance of handling things like that.
Seems you agree with me.
How dull
Heater, that demo was trivial under X with GLX extensions. I used to show stuff like that off all the time.
At one point, I set up a compute server delivering that level and more to some 30 users. One copy of the software, one copy of the data, many sessions.
Today, the fact that I can view it on my phone means all the pieces are there. It just needs to be standardized a bit more and packaged as a service. And it does work for an arbitrary number of users: I just viewed it, as could have others.
Like I said, X is being reinvented.
Take the Linux-on-Windows case: run the program on Linux, render in a browser on Windows, right?
I think decoupling all of this from the browser is key. That's probably the biggest potential disagreement.
Done that way, an OS would come up command-line: text, serial, whatever. Graphics would be optional, either local or remote, and almost nothing should care.
Right now, a browser running on a mess of graphics junk can deliver the goods. And apps written that way get us a lot of the way there.
Most of the good stuff still needs a local system and all the stuff that comes along for the ride. So we screen scrape, and we don't get the benefits we could be getting.
Ideally, I should be able to grab anything, point my app, or my user session, at it, and just go. We aren't there yet. I think we should be, and maybe we will be.
We can run a lot of browsers, tabs, etc., and get close, again, for those things written that way.
..that demo was trivial under X with GLX extensions. I used to show stuff like that off all the time.
I'm not sure you see what is happening there.
Yes, perhaps it's "trivial" to pump out 3D graphics over X. And it's great if you can scale that to 30 or 300 users.
In the X world, that 3D model would have to be running on the server, with only the graphics commands pumped out to the users. But wait: every user can pan and zoom at will. That means the server has to figure out the view and the rendering for every user independently. It does not scale.
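Some rough arithmetic shows why. Every number below is an assumption for illustration, and real systems compress, but the linear growth per user is the killer:

    # Back-of-envelope only, with invented numbers. Whether the server
    # re-renders a view per user or ships finished frames, the cost
    # grows linearly with the number of users.
    users = 30
    fps = 30
    w, h = 1280, 720
    bpp = 3   # 24-bit color, uncompressed

    bytes_per_sec = users * fps * w * h * bpp
    print(bytes_per_sec / 1e9, "GB/s uncompressed")   # about 2.5 GB/s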
Let's think of a simpler example: DOOM. Networked DOOM was a huge hit. For the first time, dozens of players could join in a 3D shoot-'em-up. Well, sort of 3D. Anyway, the point is that it did not rely on shunting graphics over the network. That would never have worked. No, DOOM only needs to communicate the game state, a manageable amount of network traffic. The players' PCs render that state as needed.
We like to be able to connect machines together.
The question is: at what level of abstraction should that communication happen?
We had computers and teletypes exchanging characters at 300 baud.
IBM extended that idea to smart terminals that submitted forms and got screens back.
Then we had faster computers and networks exchanging detailed graphics commands over X.
Then we got the web, where we exchange just the pertinent, high-level data over the net and let the browsers render it how they like.
Is X being reinvented?
No, it's an ongoing adaptation and trade-off between network speed, processing power, and what is possible at the time.
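To make the DOOM point concrete, here's a toy sketch in Python. The packet layout is invented for illustration, it is not DOOM's actual protocol, but it shows the shape of the idea: broadcast a few bytes of game state per tick and let every machine render its own view locally.

    import socket, struct

    # Toy illustration, not DOOM's real protocol: each tick, a player
    # broadcasts a tiny state packet (id, x, y, angle). Pixels never
    # cross the wire; every machine renders the shared state itself.
    STATE_FMT = "!Bhhh"   # player id, x, y, angle: 7 bytes per update
    PORT = 5029           # arbitrary port, chosen for this sketch

    def pack_state(pid, x, y, angle):
        return struct.pack(STATE_FMT, pid, x, y, angle)

    def unpack_state(data):
        return struct.unpack(STATE_FMT, data)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    sock.sendto(pack_state(1, 120, -40, 90), ("255.255.255.255", PORT))

Compare that 7-byte update with shipping rendered frames, or even a stream of drawing commands, to every player.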
I use X every day. I couldn't survive without it. I have a headless Linux box that serves as my iptables firewall, backup machine, and kmail client. Plus there's a Perl cron program that checks and displays Parallax's stock of the items I sell them every hour, so I can jump on any shortages.
All of this is accessed via ssh with XQuartz on my MacBook. X may be an old system, but it hasn't outworn its usefulness.
-Phil
I'm with the gentleman from the state that sponsors heavy rainstorms nearly every day. I run a box that's also headless. It happens to be a SPARC system running Solaris 10 from March of 2005. It handles my useful files, and it sometimes serves as my bridge for remotely accessing my also-running Linux system. I say "also" because that system is currently posing as my development station for the BASIC Stamp.
The big problem with X is that its heritage is sometimes a big disruptor: being developed to provide graphics for UNIX is why the system is still lousy with problematic libraries.
Have you tried VNC on your Mac? It works great here, including on Linux when I have it running, and on a Raspberry Pi as well.
I also use X11 every day, in its original incarnation. I'm not using a web interface to the other computer; I log in and start a (remote) session, whether it's emacs or xemacs or one of a ton of x-something tools. I always work on local and remote networked computers, and I use X11 to get a seamless experience.
If the other end is too remote, and the application I run needs too much back-and-forth (where latency kills the performance), I resort to running it in a VNC session. Which is X11 too, of course.
There are only a few things I use a web interface for, although that's an important part: information gathering.
The one part of X11 I don't use anymore is where you run the majority of the X session (mostly the window manager, actually) on a separate computer, to offload your thin workstation. That was essential back when there wasn't enough RAM around. I'm not sure anyone even remembers this. I took a quick glance at Wikipedia, and that article didn't really seem to describe it.
And that's largely what happens here. Now, about that Wikipedia page: look at the Talk tab and add your thoughts to it, and perhaps even add your own content to bolster it.
For example, with the PDP-11 pages: I brought them to the attention of a list I belong to, and of a gentleman I know who owns scads of PDP-11s, all running, housed in his museum. He then looked at the article and updated something on it regarding the gadget running there undressed.
I used to update and comment on Wikipedia. Now I almost never do. There's a rigidity, combined with ignorance, which very often blocks attempts to fix Wikipedia pages. I mostly don't bother anymore.
I did not understand Tor's post at all. On the one hand we have:
I also use X11 every day, in its original incarnation. I'm not using a web interface to the other computer; I log in and start a (remote) session, whether it's emacs or xemacs or one of a ton of x-something tools. I always work on local and remote networked computers, and I use X11 to get a seamless experience.
On the other hand, in the same post, we have:
The one part of X11 I don't use anymore is where you run the majority of the X session (mostly the window manager, actually) on a separate computer ...
This is totally contradictory.
Now, I know how useful X can be. Even to the point that I once compiled a Windows program against libwine so that it could run on Linux, and then had it display itself, using X, on a different machine than the one it was running on. I earn geek creds sometimes.
But my thesis is that the percentage of times people today use X to communicate over a network with a remote machine is indistinguishable from zero.
Rather, what we all do is interact with remote machines all day long using HTTP and HTML. The web, that is.
In short, if X magically could not operate over the network tomorrow, if Linux and other desktops could only work with screens directly attached to the machine they run on, nobody would notice or care.
It's really simple. A majority of systems run headless. In my case, here, it's that SPARC box. And that also applies to Raspberry Pi systems in general. They contain everything needed to run what makes up X, but rarely is a screen attached to the Raspberry Pi devices I have here. And that SPARC routinely ran that way.
Also, the IBM S/390 and System z machines that run Linux as a guest of the regularly installed operating system are run headless, for obvious reasons. The big blue fellow has neither a keyboard, nor a directly connected video system, nor even a mouse. The people who manage them connect using SSH, and even VNC. Some do so from a system running Linux directly; others use a Mac.
This is an amazingly simplified example. There are scads of more complex solutions out there.
In short, if X magically could not operate over the network tomorrow, if Linux and other desktops could only work with screens directly attached to the machine they run on, nobody would notice or care.
I would. I'd have to repair to the dungeon home of my headless Linux box just to do email. My only regret is that kmail isn't thread-safe, so I can only have a session running via X11 on one machine, in this case my MacBook.
Yep. That is right. Most of the computers in the world are "headless": they have no directly attached display, keyboard, mouse, etc.
And most of those machines are not communicated with using X. I maintain that a vanishingly small number of them are.
Most of my career has involved programming and managing headless machines. It used to be microcontrollers and embedded systems in remote locations. Now it also involves servers in the Google, Amazon, or Azure "clouds".
Meanwhile, for fun, I have Raspberry Pis doing this and that around the place. Headless. No X there either.
In short, as I said, the network capability of X is never used today.
Of course, this does not explain why there is an audience of Cybermen watching Phil at work. And in the rain as well.
However, pretty much nobody uses X to display their desktop, or even a single application, over a network.
If X lost its ability to work over a network tomorrow, almost nobody would know it, and less than nobody would care.
I have no idea what VNC has to do with this discussion. It's a totally different thing.
And please stop it with the random statements about Phil and the Cybermen and whatever.
VNC is Virtual Network Computing, originally a product of the Olivetti Research Laboratory in Britain. The lab was eventually acquired by AT&T, and the VNC developers now ship it themselves as RealVNC.
Actually, they report numbers in the thousands. I'm one. And I've met a few hundred who use it to talk to their System z Linux systems.
As for the Cybermen, since Phil didn't react, okay.
I can use the VNC protocol to display the screen of my Android phone on Windows. Neither of which knows anything about X.
I've used both X and VNC -- at least the version of VNC that transmits screen grabs. X, by comparison, is much more responsive.
-Phil
However.. if X lost its ability to work over a network tomorrow, almost nobody would know it, and less than nobody would care.
I guess that makes me less than nobody, but I'm not going to belabor the point. X serves a distinct and useful purpose for me over my local network, and I'd be at a disadvantage without that capacity.
-Phil
Rather, what we all do is interact with remote machines all day long using HTTP and HTML. The web, that is.
What? But that's just web browsing. There's nothing else in HTML and HTTP. Forums, finding information, netbanking. No actual *work*.
These days I use networked computers more than ever. Lots of it is ssh, and yes, a lot of the remote X11 access I do is tunneled over ssh, due to today's X servers defaulting to "-nolisten tcp", but the X client/server mechanism is the same.
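For anyone who hasn't poked at the plumbing: the helper below is only an illustration, but the port arithmetic is the standard X convention, and it shows why the ssh tunnel sidesteps the no-TCP default:

    import re

    # X convention: DISPLAY "host:d" (optionally ".screen") maps to
    # TCP port 6000 + d. With ssh X forwarding, sshd sets DISPLAY to
    # something like "localhost:10.0" and carries port 6010 back
    # through the encrypted channel, so the real X server never has
    # to listen on the network at all.
    def x_endpoint(display):
        m = re.match(r"^([^:]*):(\d+)(\.\d+)?$", display)
        if not m:
            raise ValueError("unrecognized DISPLAY: %r" % display)
        return (m.group(1) or "localhost", 6000 + int(m.group(2)))

    print(x_endpoint(":0"))              # ('localhost', 6000), local server
    print(x_endpoint("localhost:10.0"))  # ('localhost', 6010), ssh tunnel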
When I work remotely I do my work in a GUI editor, not in a web browser. Of course not.
Btw, did you never run your X11 window manager, say, olvwm, on a different computer from the one you were sitting in front of? We all did, if our workstation (or Linux box, back in the early nineties) did not have enough resources. That is rare today (although I'm reconsidering now, with all the bloat eating all my 16GB of RAM), but the rest of X11 is as useful as ever.
Don't feel so bad. I'm less than nobody too. It's hard to be somebody when there are 7 billion bodies on the planet. I do see the utility of X. It's just that the percentage of users of X over a network today rivals the number of CP/M users!
What? But that's just web browsing. There's nothing else in HTML and HTTP. Forums, finding information, netbanking. No actual *work*.
I fail to see the difference there. With X, the *work* happens on computer A, and the display and interaction are on computer B. So it is with HTML/HTTP.
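That parallel is easy to sketch with nothing but Python's standard library. The *work* (here, just a counter) lives on machine A; any browser on machine B is the display and the interaction:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    # The "work" (a counter) lives on this machine. Any browser on
    # any other machine is the display-and-interaction side. No X.
    count = 0

    class App(BaseHTTPRequestHandler):
        def do_GET(self):
            global count
            count += 1
            body = ("<h1>Visits: %d</h1>" % count).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    HTTPServer(("0.0.0.0", 8000), App).serve_forever()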
These days I use networked computers more than ever. Lots of it is ssh...
Same here. All the machines I am actually paid to work on are remote: either remote embedded systems or the remote servers in the "cloud" that those embedded systems work with. This is very common nowadays. How do people work with these remote systems? They use the command line via ssh. They use web browser interfaces. I have never seen anyone using X for this kind of thing.
When I work remotely I do my work in a GUI editor, not in a web browser. Of course not.
Currently I do all my work in a GUI editor, Microsoft's Visual Studio Code. I do that locally. Code gets pushed to a git repository and eventually gets deployed to the remote machines, often from the git repository. Rarely do I edit stuff directly on the remote machines; if I do, that is what vim is for!
Oddly enough, Visual Studio Code is actually built on a web browser engine....
Btw, did you never run your X11 window manager, say, olvwm, on a different computer from the one you were sitting in front of? We all did, if our workstation (or Linux box, back in the early nineties) did not have enough resources. That is rare today (although I'm reconsidering now, with all the bloat eating all my 16GB of RAM), but the rest of X11 is as useful as ever.
Yes. Recently I had to arrange for software on machine A to display on a machine B far away, whilst I'm on machine C, even further away. Of course, A and B communicate with HTTP/HTML. X is only the local display. No need for X over the network.
I don't say that X is not useful. Only that nobody uses it like that anymore.
Oddly enough I found myself hanging out with a bunch of chip designers in Mountain View this summer. Sadly the discussions did not include such tools.
I guess it's historical. Today, anyone designing a system with heavyweight processing on remote servers would provide a web interface to view the results. Waveform viewing, for example.
I worked at a couple of sites where you are not able to access the internet at all, simply for security reasons. Sure, you could run HTTP over the local LAN, but why should you restrict yourself to that?
There are quite a lot of other world-spanning networks not connected to the internet. That's one reason you don't find much COBOL on the internet: companies using COBOL tend to disallow code sharing, or discussions about code, from leaving the internal net.
But things are changing even there; one reason is the new Z-series from IBM.
I personally haven't used X for decades, maybe because of my mostly-Windows habit.
Enjoy!
Mike
Although that does not really help security much, as Stuxnet demonstrated.
And I'm not sure why HTTP would be a restriction for machines communicating over a local LAN.