Real-time Ajax/streaming Browser Performance
Phil Pilgrim (PhiPi)
I've recently been developing some software that uses a local Perl-based server and a browser window as the GUI. I've tested it on IE 11, Firefox 32, and Chrome 38 (all Windows 7, 64-bit). Here are my results:
Firefox: works 100% with snappy, real-time performance.
Chrome: Works 100%, but dog slow. Unusable.
IE: Does not work at all. (This may be my fault, since IE implements Ajax stuff a little differently, although I thought I accommodated it correctly.)
-Phil
Comments
I do like the approach: leveraging the browser as the way to get a cross-platform GUI, that is. Great. The easiest way to get something that works on Windows, Mac, iOS, Chromebook, Android. No messing with building multiple executables and installation packages. Wonderful.
I'm curious as to what it is you are finding "...dog slow. Unusable" with Chrome?
In my adventures into browser-based systems over recent years we have built web apps that seem to run just as well in Firefox and Chrome. We make use of WebSockets to get real-time data feeds, and we use WebGL to create real-time visualizations from that data. There is of course a bucket load of JavaScript running in the browser to do this, plus all the other usual web-browser UI stuff.
I don't recall anything becoming "dog slow" in Chrome. We don't do any benchmarking or take any trouble to optimize for different browsers. Generally things just work, and no show-stopping performance differences have jumped out at us.
Except on Windows, of course, where nothing we created worked until recently: IE only got WebSocket support in IE 10, and WebGL finally got support in IE 11. Our attitude since we started has been simply to say that use of a modern browser is required. Nobody ever complained about that.
So, really, I'm curious, where are you seeing the slow down?
I was curious about how this goes without the luxury of WebSockets and WebGL, so I made a little test: use XMLHttpRequest to fetch a small text file from localhost and then draw a semicircle in a canvas element. The semicircle is rotated a little after every response is received. This fetches and redraws 100 times per second and is as smooth as silk!
So how much data are you fetching and what are you drawing that might cause this issue?
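The little test described above might be reconstructed roughly like this; the element id, file name, and function names are assumptions, not the actual code:

```javascript
// Rough sketch of the test described above: fetch a small text file,
// then redraw a semicircle, rotated a little, on every response.
let angle = 0;

function drawSemicircle(ctx, cx, cy, r, a) {
  ctx.clearRect(cx - r - 2, cy - r - 2, 2 * r + 4, 2 * r + 4);
  ctx.beginPath();
  ctx.arc(cx, cy, r, a, a + Math.PI);  // half a circle starting at angle a
  ctx.stroke();
}

function tick(ctx) {
  const xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      angle += 0.05;                   // rotate a little per response
      drawSemicircle(ctx, 100, 100, 80, angle);
      tick(ctx);                       // immediately fetch again
    }
  };
  xhr.open('GET', '/ping.txt');        // hypothetical small file on localhost
  xhr.send();
}

// In the browser: tick(document.getElementById('scope').getContext('2d'));
```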
Edit: Having said all that, I notice Chrome is using over twice the percentage CPU load to do this that Firefox does.
Other edit: I do recall Chrome being horribly jerky with one animation where Firefox was not. It seemed to be a problem with an old graphics card and Linux driver on one machine, though, as it worked fine elsewhere.
Pardon me if you already tried disabling hardware acceleration in Chrome; I've had a few NVIDIA adapters that would cause Chrome to choke when it was enabled.
chrome://settings > advanced > system > use hardware accel
@phil,
Thought I'd give that a quick try on Windows. I've only got access to IE 8 on XP here, so of course it does not work, no matter what advice off the net I try. There is some talk about having to enable XMLHttpRequest in some security settings, but I have given up messing with that.
Do you have a cut down piece of code that shows the Chrome problem we could play with?
Thanks for offering to take a look. The app is a simulated oscilloscope/logic analyzer. Here's the part of the JavaScript that retrieves and draws the traces on the canvas object denoted by context:
The trace data is encoded in six bits of each byte, with "0" added to avoid control characters. A 1 bit is "high"; 0, "low". There are two traces that get displayed.
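Phil's snippet isn't reproduced here, but a decoder consistent with that description might look like the following sketch; the function name and the bit order are assumptions, one plausible reading of the format:

```javascript
// Hypothetical decoder for the encoding described: each received byte
// carries six trace samples, with the ASCII code of "0" (48) added to
// keep the value out of the control-character range.
function decodeTrace(str) {
  const bits = [];
  for (let i = 0; i < str.length; i++) {
    const b = str.charCodeAt(i) - 48;  // undo the "0" offset
    for (let j = 5; j >= 0; j--) {     // assume most significant sample first
      bits.push((b >> j) & 1);         // 1 = high, 0 = low
    }
  }
  return bits;
}
```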
-Phil
A couple of potential slowdowns stand out in that code:
Having:
xmlhttp.onreadystatechange=function() {....
inside your timeout means that a new function object has to be created every time around, and the old one destroyed. That can be heavy on the garbage collector. Better to have a stand-alone named function that you pass to setTimeout.
Speaking of which, it's probably better to use setInterval() for regular events like this. What you have there has to create a new timeout every time.
This:
setTimeout("loadXMLDoc('trace')", 33);
puzzles me. I did not even know you could pass a string like that to setTimeout(). All that string parsing can be a slowdown.
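Put together, the two suggestions (a named handler, passed to setInterval directly) might look like this sketch; the names here are hypothetical:

```javascript
// Sketch of the suggested pattern: the handler function is created once
// and passed to setInterval directly, instead of building a string or a
// fresh closure on every timeout.
function makePoller(fetchFn, intervalMs) {
  function poll() {          // one named function, created once
    fetchFn('trace');        // e.g. loadXMLDoc('trace') in the post's terms
  }
  const id = setInterval(poll, intervalMs);
  return function stop() { clearInterval(id); };
}

// Usage: const stop = makePoller(loadXMLDoc, 33);  // ~30 requests/second
```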
I have knocked up this code, which redraws at almost 60 frames per second whilst fetching 160 points of data in a JSON array 30 times per second.
The data it is downloading is just JSON-formatted text like so:
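The actual payload isn't shown above, but a JSON array of data points, as described, might look like this hypothetical example, decoded in a single call:

```javascript
// Hypothetical payload shape: a flat JSON array of sample values,
// decoded at "native" speed by the JS engine's own parser.
const body = '[0, 0, 1, 1, 1, 0, 0, 1, 1, 0]';  // what the server might send
const samples = JSON.parse(body);                // one number per data point
```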
BTW, I was using setInterval, but things tend to seize up when the browser can't keep up. That's why I switched to setTimeout, since everything can then run at the browser's own pace.
-Phil
Just use raw text. JSON format is good because the decoding all happens at "native" speed in the JS engine.
I rearranged my code a bit and did away with the animation loop. Now it just redraws when it gets data from the XHR request: 1280 values drawn on a nice green trace at almost exactly 30 frames per second. No stuttering anywhere.
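The redraw-on-data idea can be sketched like this; the canvas wiring and function names are assumptions, not the actual code from the post:

```javascript
// Map each sample (0..1) to a canvas coordinate, then stroke the trace
// only when a response arrives; no separate animation loop is needed.
function tracePoints(values, width, height) {
  const dx = width / (values.length - 1);
  return values.map((v, i) => ({ x: i * dx, y: height - v * height }));
}

function drawTrace(ctx, values, width, height) {
  const pts = tracePoints(values, width, height);
  ctx.clearRect(0, 0, width, height);
  ctx.beginPath();
  ctx.moveTo(pts[0].x, pts[0].y);
  for (let i = 1; i < pts.length; i++) ctx.lineTo(pts[i].x, pts[i].y);
  ctx.stroke();
}

// In the XHR callback, something like:
//   drawTrace(context, JSON.parse(xhr.responseText), 1280, 256);
```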
I tried the JSON approach:
Still no joy with Chrome, and now even Firefox hiccups occasionally.
Gordon,
Good point. I'll try it on my ancient XP laptop.
-Phil
SRLM, interesting idea about using SVG.
-Phil
Addendum: Running SP1 on laptop. Opera installer complained and quit. Firefox installer simply croaked.
The heavy XP users are non-US, so it depends on your target market. Asia (especially China) is still heavily invested in XP. There are some businesses that continue to use XP and have a support channel for it through Microsoft that regular folk don't.
Students are going to use a mobile device. I'm having to go through all my sites, and those I handle for others, and update for mobile. What a drag. If I see the word "responsive" one more time I'm gonna puke.
I changed my HTML like so: And then I draw the trace by changing the "d" attribute like so: This is a much smaller, neater-looking piece of code, and it would be even more so if the D3 library were used, but it adds another 6 to 10% to my CPU load as best I can tell.
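The elided markup and update code aren't shown, but the technique of redrawing by rewriting the path's "d" attribute can be sketched like this; the element id and function name are assumed:

```javascript
// Build an SVG path string from the samples: "M" moves to the first
// point, "L" draws a line segment to each subsequent one.
function pathData(values, width, height) {
  const dx = width / (values.length - 1);
  return values
    .map((v, i) => `${i === 0 ? 'M' : 'L'}${(i * dx).toFixed(1)},${(height - v * height).toFixed(1)}`)
    .join(' ');
}

// In the browser, assuming <path id="trace" .../> inside an <svg>:
//   document.getElementById('trace').setAttribute('d', pathData(samples, 1280, 256));
```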
I have never had any luck getting speed out of SVG.
The following code fetches and draws 1280 points of a scope trace using the ivank.js WebGL library. It pegs at a constant 60 frames per second!
ivank.js is great. Very easy to use. You can easily use B
IvanK is very nice but it does tend to consume a lot of CPU time. As far as I can tell it is re-rendering the scene on every frame refresh, 60fps here, even if no changes have been made. I have yet to find a way to control that.
Moving a bit closer to the metal we have THREE.js, a full 3D library that can render with WebGL (or good old canvas as a fallback). It's also good for 2D drawing: simply use an orthographic projection.
The following will draw the 1280-point scope trace consistently at 30fps whilst using a lot less CPU time than ivank.js: about 20% of my 2GHz Intel PC. That's OK, I have other cores free :) The thing is, it only renders when you tell it to, so things are a lot more relaxed.
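Heater's actual THREE.js code isn't reproduced here; a sketch of the approach, with the THREE wiring left as comments since the exact calls depend on the library version, might be:

```javascript
// Pack the trace samples into the flat (x, y, z) position array that a
// THREE.BufferGeometry expects, assuming an orthographic camera set up
// in pixel units so the scene is effectively 2D.
function toPositions(values, width, height) {
  const pos = new Float32Array(values.length * 3);
  const dx = width / (values.length - 1);
  for (let i = 0; i < values.length; i++) {
    pos[3 * i] = i * dx;                  // x
    pos[3 * i + 1] = values[i] * height;  // y
    pos[3 * i + 2] = 0;                   // z: flat scene
  }
  return pos;
}

// Assumed browser wiring (not verified against a particular THREE version):
//   const geom = new THREE.BufferGeometry();
//   geom.setAttribute('position',
//     new THREE.BufferAttribute(toPositions(samples, 1280, 256), 3));
//   scene.add(new THREE.Line(geom, new THREE.LineBasicMaterial({ color: 0x00ff00 })));
//   renderer.render(scene, camera);      // called only when new data arrives
```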
THREE.js opens up possibilities for going crazy with special effects in your oscilloscope.
It's been an interesting little diversion here. I took a quick look at most of these techniques a couple of years back when we had a little real-time animation job that needed doing in a hurry. I did not have a lot of time to play with them, and I knew absolutely nothing about browser APIs, CSS, JS, etc. at the time, so it was all quite bewildering. This was all compounded by having to create the back-end server data-streaming software at the same time... we were building an "Internet of Things" before the term was popularized.
We had a contractor working in parallel with me using FLASH to do the same job because nobody believed it was possible with HTML5, CSS and JS. As it turned out I got my hacky JS solution up and running in time for a trade expo and the FLASH version was never finished!
This seeded a new idea in the company: we stopped developing traditional applications. A trend that has taken off everywhere else, despite Gordon's suggestion that these might be "...merely a test bench in preparation for writing a GUI in a more traditional framework." Nobody wants to mess with that old-fashioned application idea any more.
Anyway, that's all history; today's little investigations have shown me how things have moved on in capabilities and performance in the various solutions.
Now, are you hinting that your performance issues are actually down to that Perl code in your server?
I like to do that kind of server stuff in JavaScript using the node.js engine. Dead easy to use. Very good performance, even compared to native C++! It's nice to get into the groove and be able to use the same language at the server and at the browser end.
Web browsers were made to browse the Web. There's no getting around that. Dollars to donuts, on some critical platform these approaches will fail you when stressed, because they are applied well outside their use case. So developers end up either shelving the project or accepting second best and admitting the software won't run for potential customers. (Hopefully neither will befall you, Phil; I'm just trying to suggest more testing across platforms, including mobile, before you invest much more effort in this.)
At least look at something like CEF, which is Chrome without the heavy overhead of being a full Web browser. These and others are among today's "traditional" frameworks for cross-platform application development. Your argument is based on old assumptions.
Thanks for the heads up on CEF. Interesting. It had floated past me before but I paid no attention. Might be useful.
Your post is full of contradictions though:
"Web browsers were made to browse the Web....", but your suggestion is to bolt a web browser to my applications so as to be able to make use of HTML, CSS, JavaScript in my app.
"...CEF, which is Chrome without the heavy overhead of being a full Web browser", but CEF is a full web browser, OK, barring the controls that a browser has. libcef is still 70 megabytes of browser engine. The dev kit download is one gigabyte!
"today's 'traditional' frameworks"... I know things move fast nowadays, but to have a thing become a tradition the day it's invented is pushing it a bit.
I would have thought "traditional" was Qt or GTK or the Windows API or whatever they have on Mac. Or, God forbid, Java.
"to cross-platform application development", not really: there is no CEF for ARM, or Android, or iOS as far as I can find. On top of that the app developer still has to make builds for Windows, Mac, Linux, whatever comes down the pipe. It is telling that I don't see CEF for Chromebooks anywhere (or is that CEF API built into Chromebooks?)
"Your argument is based on old assumptions."...perhaps...like two or three years old:) When we started out people noted we were very brave (stupid) to be experimenting with such cutting edge stuff.
The killer for me is that adopting something like CEF locks you into a Google solution. If you can get by with browser technology and web standards you are not locked in.
Anyway, as usual it all depends on what you want to do.
If you have a big pile of native code that could use a GUI built with HTML/CSS/JavaScript then bolting CEF on there may be a good solution. Looking around I see articles about game developers adopting it, together with the Unreal engine for example.
If you want people to view data without having to mess around installing anything, without having to have the right machine architecture, actually without needing to own the machine they are using at all, then getting it into the browser is the best way at the moment. You must admit that the browser is a common denominator for nearly every human on the planet.
Out of interest, who is using CEF? Any nice examples out there?
CEF is not Chrome, a general purpose browser. It's an API built over the underlying engine in Chrome. You could bypass CEF and develop directly in Chromium, and some people do. In any case, while it can be used to surf the Web, saying it's just a Web browser or Chrome in disguise is missing the point. Otherwise, why would anyone even bother with it?
Just to remind you, I didn't "recommend" CEF at all. I merely suggested looking into it as a framework rather than rely on whatever browser a user has. I'd personally not use HTML/CSS/JavaScript gunk to code any raster or vector graphics application (no knocks to you Phil, I do understand what you're trying to accomplish here). However, that is me, and my unique set of experiences as a developer of graphics apps. YMMV is well at work here.
I'm a BIG proponent of researching the use case for an application, researching the typical user base, then finding a solution (or solutions) to match that. I've seen a lot of Web-based app projects abandoned or released (often to bad reviews) with too many compromises. The hype seldom lives up to its promise. We've had this discussion before; simply saying "X with Y, added to Z works" is missing the point. The world has other hardware and software than you do. Since there's no way to write one piece of code that works everywhere, it really comes down to first determining who will use it the most, then catering to that.
There are no perfect cross-platform development toolkits, but to me, relying on a random browser running on a random OS provides about the lowest confidence of delivering functionality that I can think of. That said, I'm sure Phil's tool will run well on many machines.