
What is the best way of using a computer remotely?

W9GFO Posts: 4,010
edited 2014-08-06 07:10 in General Discussion
I spend quite a bit of time in my shop, which is about 100 ft from the house. I would like to be able to run Solidworks in my shop, but it is installed on the computer in my house (installing Solidworks on the network is not an option). The obvious solution is to log in remotely via some program. Previous experience tells me that this is laggy and frustrating. It would be great if I could just have a monitor, keyboard and mouse in the shop connected directly to the computer, so that another computer is not needed. If I ran cables, the total length would end up being about 150 ft.

I have hundreds of feet of Cat6 cable at my disposal; could that carry a VGA or HDMI signal? Is there a better way to log in remotely that has no lag?

Comments

  • ctwardell Posts: 1,716
    edited 2014-08-03 21:51
  • msrobots Posts: 3,709
    edited 2014-08-03 22:35
    Assuming you are using Windows, I would recommend the Remote Desktop application.

    I mostly use it over sometimes-slow Internet connections, and it does lag then. But used on a local network it is quite fast, with no lag at all.

    What I like most about it is the seamless integration of local drives, printers and the clipboard (you may need to enable these options before logging in).
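
    Those options can also be saved in a .rdp file so they stick between sessions. A minimal sketch (the hostname "house-pc" is a placeholder for your own machine):

        full address:s:house-pc
        redirectclipboard:i:1
        redirectdrives:i:1
        redirectprinters:i:1

    Double-click the file, or pass it to mstsc on the command line, and it connects with your local drives, printers and clipboard mapped through.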

    Enjoy!

    Mike
  • LoopyByteloose Posts: 12,537
    edited 2014-08-03 23:56
    If you build the project around your available CAT5-6 cable, some latency is likely, but you can measure how much and manage it.

    Another alternative is fiber optic cable. It is simply faster.
    http://www.fiberopticsstore.com/
    http://www.lanshack.com/fiber-optic-tutorial-network.aspx

    Fiber optic cable itself is not expensive, perhaps 30 cents per meter, but the converters at each end can cost a bit.
  • Heater. Posts: 21,230
    edited 2014-08-04 00:15
    Fibre optic is great. However the limit on bandwidth and latency is determined by the machines, hardware and software, on both ends. Adding those fibre interfaces adds cost of course. I very much doubt it would make any noticeable difference compared to the ethernet connection in this application. Use what is cheapest.
  • potatohead Posts: 10,261
    edited 2014-08-04 00:49
    Personally, I would recommend you try TightVNC. If both machines are on the network, and the network is fast, you can have a decent experience.

    Run your Solidworks computer at the lowest resolution you find useful. For a 100T connection, I would choose a 1024- or 1280-pixel-wide display at most. On the client end, TightVNC does allow display scaling, which can match up the two resolutions should they differ; the cost is a display that is no longer pixel-for-pixel, which in many cases is still very useful. Ideally, run the Solidworks computer at a slightly lower resolution than your shop computer for a pixel-perfect display.

    If you can, set up a 1000T network connection. It's worth it. Don't use wireless. It's not worth it.

    Run Solidworks full screen, and disable the fancy graphic effects, essentially going for a solid color background. The gradients, shadows and other nice things contribute to how much information needs to be compressed, sent, and displayed on the other end.

    You may find assigning the "high" priority to the VNC Server process helpful. On some machines, this helps things a little.
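
    As a sketch, that can be done from an elevated command prompt (tvnserver.exe is TightVNC's usual service process name; adjust if yours differs):

        wmic process where name="tvnserver.exe" call setpriority "high priority"

    Task Manager's right-click menu on the process does the same thing interactively.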

    On the VNC client machine, you can set the compression high and JPEG high, and this will improve lag a lot if the shop machine is fast. If not, roll back on the compression some. The JPEG compression does leave a few artifacts around some screen elements, but it's fast, and that helps with solid models. It ships this way by default, but make sure you enable the cursor dot and "allow remote machine to manage cursor" so that you can click ahead of the lag. The little dot is where the remote machine thinks the cursor is, and that's where your click will register, whether or not the cursor draw has caught up on your end.

    This takes a little bit of practice, but once you get used to looking at the dot, you will find you click in the right places at the right time more often than not. So the latency may be there at times, but you can click anyway, and VNC will often just skip that frame or interaction and move on to the next one. Nifty, if you can deal.

    Personally, I don't like RDP (Remote Desktop) for this application because you don't get the benefit of your GPU that way; it forces Solidworks, and other CAD systems, to do software rendering, which buries the CPU on that end and leaves less time for screen compression.

    Of the various VNC incarnations out there, TightVNC does provide a low-latency display, and it gives you a lot of options on the client side to manage what it does and how it does it. The biggest thing you can do on the Solidworks end is maximize the application and keep visual effects to their useful minimums.

    Another good speed-up is to disable the Aero Desktop theme, and go for "Windows 7 Classic", which will have a more square, less polished look to it, and it will not have many of the animations and other treats normally associated with a Win 7 desktop. All of this adds up, and with Solidworks, the real task is drawing sketches, selecting geometry entities and evaluating the shaded model. There is drafting too, and that generally runs well. It's a much simpler display.
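
    A quick way to try that out, as a sketch (Win 7, elevated prompt; stopping the Desktop Window Manager service kills Aero until you start it again):

        net stop uxsms
        net start uxsms

    Picking the Classic theme under Personalization does the same thing permanently.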

    A drastic speed-up is to limit color to 8 bits. With solid models, this is crappy. However, when doing drafting, or looking at wireframe representations, it's often quite serviceable. It's faster. That's all.

    Since this is a freebie, it won't take you but 15 minutes to set it up and give it a go. I would give yourself 30 minutes to adjust to the dot cursor, and you are likely to find it useful, if not productive.

    One thing you might find helpful is that the keyboard arrow keys do navigate around the solid model. It's a coarse rotation, but it's the same amount for each press. Early in Solidworks history, keyboard keys were the primary rotation device. A version or two into it, they moved to the dynamic mouse rotation and/or the "Space Ball" or "3D mouse" type input used by many today. VNC isn't going to work with a 3D mouse, but it will work with the dynamic mouse rotation. Tap the arrow keys around a little, and you may find that very useful.

    If there is some lag you find unacceptable, the difference is this: with the mouse, you end up "drunk", meaning you rotate, overshoot, wait, rotate back, overshoot, then settle in. With the keyboard keys, your mind will mentally map out the few keystrokes you need to transform your model view. It's more coarse, but you get an idea of what will happen, and there are fewer view transitions for VNC to communicate to you, meaning the whole thing works better overall.

    I've personally run some big models this way. It's not the same experience as you get locally, but it's more than adequate to get things done. The big hurdle is getting past needing to always see it before you interact with it. Once you've run CAD this way a little while, you get to know what it will do, and you click there anyway, and just keep moving.

    The JPEG compression is a mixed bag. Some people just don't play well with the small artifacts it produces. Try it with that option turned off. You can toggle those things on the client side, once you've got the VNC server setup on the Solidworks computer.
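
    If you happen to be using the Unix build of the TightVNC viewer, those client-side knobs are command-line flags, which makes the experiment quick (the host name is a placeholder):

        vncviewer -encodings tight -compresslevel 9 -quality 6 house-pc:0

    Lower -quality values mean more JPEG artifacts but less data on the wire; the Windows viewer exposes the same settings in its Options dialog.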

    The lag and frustration can be managed down. Invest a little time and you may find it's not such a worry. In my past, I've run just about every MCAD system there is, and I've had to do it remote more than a few times for licensing reasons.

    Is your Solidworks a home activation? If not, you might consider activating a laptop, or just moving your activation to a shop computer or a laptop. That's not hard.

    Having it on a laptop is generally fine for most models today. And the nice thing about those is being able to take them into the shop. You could add a monitor, Bluetooth or USB keyboard and mouse, and keep the laptop in a box. Some poking around in Windows will yield the settings to make an external monitor primary and/or disable sleep/shutdown when the lid is closed. From there, just tuck it out of the way.
  • W9GFO Posts: 4,010
    edited 2014-08-04 12:10
    The primary reason I want to use the main computer remotely is so that I don't have to accept lower resolution/lower performance. I already have a laptop installation, which works okay, just not great.

    The KVM extenders look promising. One thing that worries me is the video quality. When they claim that it is "satisfying", well, that's not very encouraging to me. Most of them claim to carry the VGA signal about twice as far as I will need. If the video quality does not suffer, then this is the way I think I would go.
  • NWCCTV Posts: 3,629
    edited 2014-08-04 22:02
    Although I have never used the Linksys KVM extenders, I have used higher-end brands and they do work pretty well.
  • Beau Schwabe Posts: 6,566
    edited 2014-08-04 23:09
    VNC all the way ... When I worked for National Semiconductor, I used VNC in Atlanta to do layout on a machine in California on a regular basis. This was mainly to keep the database local to the project, which was in California. Between Chip and myself, we have also used VNC on numerous occasions at Parallax to drive the floor planning, convey ideas, etc. ... What I really like about VNC is that it is also cross platform... much of our work spans across Windows and Linux. VNC allows us to make that bridge.
  • Peter KG6LSE Posts: 1,383
    edited 2014-08-05 00:07
    Beau Schwabe wrote: »
    VNC all the way ... When I worked for National Semiconductor, I used VNC in Atlanta to do layout on a machine in California on a regular basis. This was mainly to keep the database local to the project, which was in California. Between Chip and myself, we have also used VNC on numerous occasions at Parallax to drive the floor planning, convey ideas, etc. ... What I really like about VNC is that it is also cross platform... much of our work spans across Windows and Linux. VNC allows us to make that bridge.

    Same here. VNC is ideal as I am also cross-platform.
  • Martin Hodge Posts: 1,246
    edited 2014-08-05 00:22
    I've been blown away by Chrome Remote Desktop over LAN. It works much better than the Microsoft solution in my experience.
  • potatohead Posts: 10,261
    edited 2014-08-05 09:05
    And that's powered by VNC too. I'm gonna set it up on a couple of machines. Didn't know about it.
  • Phil Pilgrim (PhiPi) Posts: 23,514
    edited 2014-08-05 09:52
    My experiences with VNC have all been pretty disappointing, due to latency issues. X Windows over IP is a lot snappier, but then the app itself has to be able to run under X for that to be a solution.
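
    For anyone who wants to try the X route, a minimal sketch, assuming an X server is running on the local machine and the app is X-capable (host and program names are placeholders):

        ssh -X -C user@house-pc the_application

    The remote application's window shows up on the local display; -C adds compression, which helps on slower links.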

    -Phil
  • Heater. Posts: 21,230
    edited 2014-08-05 10:29
    Which is quite remarkable, Phil. Even the long-time X Windows developers will say that X is horribly latency-bound and totally unoptimized for the one thing it is supposed to do well, network transparency. In fact, proposed optimizations for X will have it using techniques like those of VNC!

    I have not tried X over a network, or VNC, for years, so I don't know how they compare nowadays.
  • potatohead Posts: 10,261
    edited 2014-08-05 11:01
    I continue to think those developers have some basic misunderstanding. Or they simply recognize a mess and are taking a best fit step forward. Ideally, it's the latter.

    In the 90's, I ran this application: http://en.wikipedia.org/wiki/I-DEAS over the X Window System on IRIX, PCs, and Suns, all the time. In fact, I had my training room set up this way. One dual-CPU Origin-class server, several GB of RAM, and various machines, usually Indy-class computers, but sometimes Windows computers running Exceed X server software, and it flat out rocked hard. I did this so I could have ONE copy of the software, running multi-user on that server, all data managed, etc., with the users only seeing what they needed to, which was the application and its data manager. No accidentally deleted files, etc. They didn't even have permissions to view the files. The application, running SUID, had the permissions, and that's it.

    I could fire up a session running local, compute, everything on the machine, and one remoted from another machine. The only material difference was time to send tessellation data over the wire on larger datasets. Basic GUI interaction was not something an ordinary user could discriminate well enough to matter.

    That was on 100T ethernet, BTW.

    Here's a screen shot of a larger model that would work reasonably over X:
    [attached image: boom.inside.small.gif]
    Smaller models were absolutely no problem. One that size would have some latency on redraw operations, as the tessellation data would need to move over the 100T connection.

    Here's the GUI:
    [attached image: SDRC-2.jpg]
    Sorry, I could not find a bigger image online. That GUI was very highly interactive. Passing the cursor over the model would highlight entities for selection dynamically, and this worked over X just fine.

    The combination of software was MOTIF, X, OpenGL, and the GLX extensions. At one point, I had 30 users connected to a larger Origin class machine, each running this program, and I did that over 1000T ethernet, with pretty much no issues.
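
    (Aside: on a Linux box today, the quick probe for whether that GLX/OpenGL path is alive is still:

        glxinfo | grep "direct rendering"

    which reports whether OpenGL is talking directly to the GPU.)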

    These users would employ a 3D mouse to dynamically interact with the models as well. That thing worked over the wire too. No worries. BTW, administration was a doddle; I could do it over dialup, and did. Nothing since has even come close to how potent, lean, and efficient that setup was. Those guys have two people walking around doing admin today, when I cost them a few bucks a few times per month. X drove all of that, and it was powerful, cheap, lean.

    Oh, and here's the killer: In the 90's, I could remote that application over X and put it side by side with the same application running native on a PC, running Win 2K, and have users prefer the Irix Interactive Desktop, running entirely over the wire. Yes, that good. X can totally perform. There are other matters driving this change, IMHO.

    In addition, I used many text editors, GL visualization applications of all kinds for life sciences, mechanical engineering, finite element analysis, detailing and drafting. All over X. Latency was moderate at worst, and the biggest issue would be slower redraws on larger data sets on slower networks. (10T ethernet mostly)

    Personally, I think application development moved away from the techniques needed to properly employ X, and the move to VNC is all about getting around that issue. And if so, OK. That's maybe a reasonable path, given how many application developers really don't understand X at all, and given how all the other GUI platforms we have are single user affairs.

    Poorly written applications make lots of calls to the GUI, don't employ things like backing store, request lots of redraw events, etc... Yes, those may not perform well on X where network latency and throughput are an issue, and depending on what they've done, might not work well even on a really good network!

    One other component to this is how things get made in the first place. SGI IRIX featured a lot of command line tools which worked just great on any text interface you could name. When a GUI was desired, a user could launch that, and it would connect to the command line tool, which would then stream data needed for the GUI to operate, sort of like --verbose works on many things today.

    The beauty of this was giving users lots of options. If I was on a dialup, 2.5K or so per second, I would just ask for the command line. If I were on a DSL or better connection, ask for the GUI, whatever.

    Over the last 10 years or so, we sort of lost track of how UNIX, X and networks work together. It's this, I believe, that drives the reworking of X and the move to VNC-type approaches. Not that X was such a problem. Frankly, UNIX with X, on IRIX in particular, was the very best GUI computing experience I ever had. We've got spiffy stuff today, but absolutely nothing out there even comes close to how good that environment really was.

    I'll go and scan a print I made of my SGI desktop one of these days. In the 90's I was doing video conferencing, instant messaging, running applications from all over the place, on all sorts of machines and not missing a beat! X ruled and I would very regularly give people a demo and they would walk away stunned!

    Linux, for a while, participated nicely in all of that. nVidia, MESA, and some work from SGI brought the X extensions to Linux, and those computers would participate on par with the IRIX ones, minus the better scheduling in IRIX at the time. Linux has since caught up and improved on that OS, but if I factor that out, the whole experience was pretty solid.

    What I did notice as I headed into the '00's was more and more applications not actually performing over X, and with that, the complaints going up. The thing was, X didn't really change much through that time. Go back today to the older versions of things that existed in the 90's and you would see the same performance on an X11 display system, and if it's 3D, on one with the "GLX" extensions, working nearly unchanged!

    So a lot of this today is driven by application development, toolkits, and all sorts of stuff that never incorporated the means and methods needed to properly employ X. Or at least, that's how I see it, given the nuts-good experiences I had.

    Re: VNC vs. applications running over X being more snappy.

    Here is what Phil is getting at. An application running over the wire on the X Window System will see local cursor, local highlighting, etc. This happens at the monitor frame rate and, in the case of mouse movement and GUI interaction, at the system scheduling rate. For reference, a Sun was 30 Hz, which is a little slow for mouse movement; feels a bit like working drunk. IRIX was 60 Hz, as was Linux, etc.

    If that interaction happens at speeds over about 50 Hz, the user will see "snappy", and here's the subtle bit. If you move the latency to areas outside that basic user interaction (say the application takes a moment longer to redraw a window or process a request, but the GUI elements perform), the user will claim that application is "faster", "more snappy", or "real time". They will say this regardless of whether the actual time to perform the entire task is longer or shorter.

    With VNC, all that latency is moved to the user interaction. Thus the little "dot cursor", which is the only thing operating in real time (50 Hz+) for the user. Now, the subtle bit: the application on VNC will operate at its full speed. If we could find an application that ran under X to test with, we could set it up side by side and test this.

    (which I have done in the past, BTW)

    What we will find is the perception of "fast" comes down to how real time the actual GUI interactions are, not the overall speed the task itself gets done.

    Say that application were to draw a complicated dataset on the screen, like CAD does. The VNC one will complete quicker, because the local display, local GPU, means the data streams over a local bus, and that's quicker than the same data streaming over the wire. The X one will exhibit some delay on an operation like that, but it will demonstrate almost no delay at all for most other interactions.

    Users will say the application with the most precise GUI interaction is faster every time, even though the actual cycle time on a higher latency display, such as VNC, is actually shorter.

    The original developers of X optimized for this case, and application developers could write to it, providing users with great experiences over a wide range of network conditions. We lost that somehow, and today "screen scraping" is being used all over the place where before, it was piped through the standard pipes and over the network.

    With the loss of pipes and streamed data, we are left with virtual implementations of GUI screens in RAM, or actually reading a frame buffer to get the results of a GPU, then compressing that, stuffing that whole mess over the wire, and rendering it out to people. I believe that can get low latency, but then the benefit of the OS, pipes, networks and a whole lot of other stuff is lost. In particular, the ability to write an application that works command line, and that can have a GUI wrapper made for it is gone too.

    Oh well. We shall see what the next incarnation brings us. Maybe I'll like it. :)

    But there is a very high bar to get over. One that I believe a lot of people didn't even experience, leading to the mess we've got today, IMHO.

    Aw heck, here's one more small rant...

    Let's talk multi-user for just a moment. On a UNIX system running X, it's not necessary to have graphics on it at all. Users connect via SSH or something and just ask for the application. Given their environment is set up, X sees the display they have, and it just works. No fuss. In particular, launching that application for a remote user isn't any different from launching it locally as that user.

    When it runs local, it connects via localhost to the X server running there. That set of graphics resources is setup once, and shared by all users running all applications. So on the local machine I log in as "doug" and fire my stuff up. Then I log in as "other" and fire that stuff up, and "heater" jumps on from an SSH and fires it up for his display.

    The memory footprint for the application doesn't change. The only difference is where the display is, and efficient network stacks really don't impose a penalty for a local connection. 127.0.0.1 kind of thing.
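
    As a sketch, that session looks something like this (machine and user names are made up, and the X server on "shopbox" must be willing to accept the connection):

        ssh heater@bigbox
        export DISPLAY=shopbox:0    # send windows to the X server on my desk
        the_application &

    Or let SSH do the plumbing with "ssh -X heater@bigbox the_application".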

    Now, let's talk Microsoft.

    Today, we can use the "run as" function to run an application on our display as another user. This works, and it works well, but it doesn't work through any named pipes or simple redirection. So it's only good for that computer and that user, but it does share the benefit X did, in that the memory footprint for the application doesn't change. So far, so good.

    Now, RDP, Terminal Server, and other kinds of things actually require a GUI set up in RAM! Start up a few users, and now you've got their frame buffers and all sorts of other garbage in there along with the application! Do that with X, and it's just a different set of pipes.

    To me, that's a huge mess of a kludge that is requiring other kludges to get past. Ugly.

    Now, there is an upside to this VNC business, and that is being able to share a GUI with somebody else! Doing that on X is hard. Back in the X days, I would fire up the application as the user, remote it to me, and duplicate a problem, or do something as them as needed. That isn't optimal at all. With VNC, I could share the session with them, and we both could run the application. Much improved, and I like this.

    Perhaps being able to do that is going to balance this stuff out. I very regularly share desktops with people, either for training, or support, or just collaboration. We both run it, and talk on the phone, whatever. Honestly, that's kind of awesome. Maybe worth the trade for X as it was.

    One other bright spot here with the VNC-type approach is the ability to share a GPU, or have a pool of them serving users. With X, that's hard. Only the user local to an X server gets the benefit of a GPU, though an application can take advantage of it from anywhere. Maybe this isn't optimal.

    Instead, if we do incorporate the VNC ideas into a network GUI, we could have a few GPU's running from queues, rendering scenes for users, serving a great many of them potentially in the same way we can have CPU's serving a bunch of them. I have high hopes for this, if they get it right. It could be as transformational as X was initially.

    What I'm really hoping is they balance these things, we get good local interactivity as we did with X, but we get to share a GUI and run multi-user over the wire too. I'm critical of the current development, but only because I really believe in multi-user, networked computing. And it's been such a freaking mess, but for X. The stuff I got done with UNIX + X is nothing short of amazing compared to what I see most of the time today.

    Use every machine in the building to render a movie quicker without ever leaving my desk or disturbing those users beyond a little slowdown? Check. Heck, I even changed their systems in a few cases and they never knew. ( I literally asked the IRIX machine for a process map, looked at what they were doing, identified the dependent pieces, then proceeded to remove system software, install rendering software, render, then put it all back while they were running whatever it was on their box. Amazing, and on one of those, I ran out of room in the middle of it, was able to request another software removal, then continue with my rendering software installation and administration, no reboots, no disrupting the user, not even any starting over!)

    Serve up an application to a pile of users, without needing to worry about their system software or dependencies other than having X on the box, with admin hours numbering in the single digits? Check. They can run whatever they want? Check. Linux, Mac, Sun, HP, SGI...

    Spread an application across multiple displays? Check. (we do this fairly easily today, try it in the 90's and try it with GPU assist. That was messy without X, and it crashed a lot.)

    Setup a "multi-head" machine, complete with graphics, keyboard, mouse for several users to share and operate on concurrently? Check.

    All that was due to X, and most of it isn't on the radar or even in the scope of experiences for most computer users and administrators today.


    Fingers crossed!
  • max72 Posts: 1,155
    edited 2014-08-06 07:10
    I use VNC too, but with my ADSL (very slow upload) running a 3D modeler is very difficult.
    Massimo