Windows 10 telemetry not actually controllable? - Page 2 — Parallax Forums

Windows 10 telemetry not actually controllable?


Comments

  • evanhevanh Posts: 15,916
    I'll point out that scripting has changed the Web from being a pull-system into a push-system. When it comes to plain web pages, the push features offered to the end user are just bling; the job can be done without them.

    You do get the odd exception of non-webpage usefulness like Google Maps and Docs, so I will admit there is a use case for something like Javascript ... just not a web browser one. That was, of course, the original intent of compiled Java. Maybe phone apps will bring some sanity back to the Web :) ... or maybe they'll squash it instead. :(
  • Heater.Heater. Posts: 21,230
    evanh,

    I totally agree with you regarding push technology becoming useless bling in good old fashioned web pages. And that Javascript, or any programming language, should not be required there. We should not lose web pages. They are documents. They are searchable and indexable. They are accessible to the blind and partially sighted. They are to some extent translatable to different languages. They are archivable and printable.

    I do take a slightly different view of push technology and "web applications". That is a whole other ball game. We are no longer talking documents but dynamic interactive applications.

    I see this as the same as creating a GUI app with Win32 or the Qt GUI libraries or whatever they do on Mac. I can have all manner of widgets: buttons, sliders, knobs, dials, check boxes. I can draw graphs and charts and whatever with 2D graphics. I can create accelerated 3D data views or games. I can play audio and video and so on and so on. All connected to live data streams and updated in real time. Today almost anything that can be done with a native, platform-specific compiled application can be done in the browser. The browser is my GUI.

    Why is this a good thing? It's cross platform, one does not have to build a Windows version, a Mac version, a Linux version, a whatever comes down the pipe version. It's available to anyone with a newish browser. It does not require any installing. Keeping users software updated is trivial. Excellent.

    I would say there is more than the odd exception where all this is useful and even in use today. In fact I seem to spend most of my working day using such web applications. For example:

    The dashboard for Google cloud services. Where we can provision compute instances, monitor their performance, log into them and talk via a virtual terminal. And so on. All this needs to be live and interactive.

    The dashboard for resin.io. Like the above, but this time we are deploying software to remote embedded systems. Again I can pull up a terminal in the browser dashboard and talk to them. Monitor their performance in real time. resin also has a very neat chat system; if I have a problem with resin I type it into a chat window and soon enough one of the resin developers will be there discussing it.

    Microsoft online. Yeah gack. But it's outlook in the web and I don't need to be running windows to use it!

    Then there is our actual product which involves a lot of real time data visualization.

    Yes, all this was the original intent of Java. It had its chance. It failed miserably. Too big. Too slow. Not flexible enough. A horrible language anyway. Being proprietary and non-standardized did not help it.

    If you want compiled code in the browser with all its static type checking and so on, you can already do that. We have demonstrated the open source Spin compiler, written in C++, compiled to JavaScript and run in the browser. Surprisingly, it's almost as fast as compiling it to native binaries! You can also compile Pascal and an increasing selection of other languages to JS for the browser. JS is the assembler of the web.

    All of this is still in early days with HTML5 of course. It is improving everyday. Even Microsoft is pushing in this direction as far as I can tell from an MS developer day I attended a few weeks back.
  • evanhevanh Posts: 15,916
    Heater. wrote: »
    If you want compiled code in the browser ...
    Allowing any form of arbitrary code in the browser is the problem. It breeds the pushiness because it's expected to always be accessible rather than just for those special uses.
  • @Heater.,

    yes, you listed a lot of 'features(?)' you can use to pretend your application is more secure than other web applications.

    But basically it isn't. Because Tim's idea of sharing stuff was not meant to be secure in the first place. All the other stuff are kludges to get around this. And - honestly - none of them will really work.

    I am actually not even sure if they should work.

    But this is just me,

    Mike
  • Heater.Heater. Posts: 21,230
    edited 2016-02-11 10:45
    evanh
    Allowing any form of arbitrary code in the browser is the problem. It breeds the pushiness because it's expected to be always be accessible rather than just for those special uses.

    I agree with you but I don't think this is a problem unique to the web browser. All useful systems have it. Let me try to explain why...

    When I interact with my computer it is running some software that listens to my keyboard and mouse and whatever input devices, audio, video etc. It also has some software to communicate over the network to the rest of the worlds computers.

    Being a computer it has this magical property of being programmable. I can create new software for it to run and change its behaviour. I can run software that others have written. I can fetch that software from those people over the network. Magic!

    Of course such a system has potential problems. How do I know what that software I get over the net does exactly? Does it spy on me? Does it read data off my local storage and send it to others? Does it corrupt my data? Or do other bad things?

    Basically I need to be able to trust the creators of that software and the locations from which I download it.

    I can of course take steps to protect myself by running said software in a sandbox. Have isolated, protected memory regions in my computer. Have different permissions assigned to different programs and so on.

    Now, you might recognize what I'm describing above as a typical computer running a typical operating system. Say Windows or Mac or Linux. You may recognize the problems I describe as the ocean of viruses, trojans and other malware we have been suffering from for decades and still do.

    BUT I will posit that the web browser environment is no different. It's an "operating system", albeit a very virtualized one running on a platform. It's programmable. It is subject to all the same pros and cons of being a programmable computer.

    I will further posit that the web browser does a very good job of being secure. Better or at least as good as those operating systems. For example: Code running in a tab in my browser cannot access my files willy nilly. Code running in one tab cannot see or mess with code running in any other tab. And so on.

    It has the same problems as the traditional operating system. Where did that code come from? Can I trust the authors? Can I trust the site it's fetched from? And so on.

    So yes, arbitrary code is the problem. But as we have seen with native executable OS-level code and virtualized browser-based code, it's the same problem either way.

    I think the fact that we have a thread about Win 10 spying on its users demonstrates my argument rather well. Win 10 and its applications are "arbitrary code" downloaded from the net and doing bad stuff. Worse than, or at least the same problem as, any JS in the browser.



  • Heater.Heater. Posts: 21,230
    msrobots,
    yes, you listed a lot of 'features(?)' you can use to pretend your application is more secure than other web applications.

    But basically it isn't.
    You are not helping my paranoia with that :)

    What I'm fishing for when I talk about my secure webserver / web app and point people to its source code is suggestions on why it is not secure. Where are the holes? What have I missed?

    I agree; in a way I'm pretending to be secure. I can never be sure that I am secure. But that is true of all systems that communicate over open networks.
    Tim's idea of sharing stuff was not meant to be secure in the first place.
    I'm sure you are right. Tim developed the idea at CERN for sharing scientific data. I don't think security was on his radar.
    All the other stuff are kludges to get around this. And - honestly - none of them will really work.
    Well, yes, but... As I said above, this is true of any system that communicates over open networks. As the Germans discovered when the Brits cracked their Enigma and Lorenz encryption techniques.

    Let's take an example. Say I want to communicate with some embedded system over the internet. I can do that using UDP or TCP/IP. Fine. I want it to be secure, so I create some encryption system, or I use one that is readily available. Is it secure? Who knows? The recent Heartbleed fiasco shows we cannot be sure.

    I could design and write the entire system myself. Is it secure? Probably less so because I'm no security expert and I'm bound to get something wrong.

    In that respect a web server/browser is far more secure than many other approaches.

    Now, if you have any suggestions as to how I can pretend to be more secure in my experimental web app, I'd love to hear them.


  • evanhevanh Posts: 15,916
    It's the websites pushing that I'm concerned with. Removing scripting from browsers would solve it.

    Actually, I would go as far as to say M$'s recent behaviour was triggered because of the current web browsing situation.

  • Heater.Heater. Posts: 21,230
    edited 2016-02-11 13:45
    evanh,
    It's the websites pushing that I'm concerned with.
    I agree, it's a concern to me as well. But it's no different than using a raw OS and downloading stuff from here and there, like we did in the good old virus ridden days. Heck until very recently Oracle installed malware with every Java download.

    Websites are a problem. The other day it was reported how one could sign up as a seller on eBay, put some script tags in your seller page and product listings, and that JS would run when a user looked at your shop. Boom, your JS owns that user's account! eBay could of course filter out such JS injection easily, but said "We have no plans to fix this". WTF?!

    It's not just JS execution that is the problem, mind you. For example, this very website gets your browser to visit gravatar.com with your user ID. (Yes, I know it's hashed, but so what?) It does this even if you set an avatar and don't need Gravatar. Ergo parallax.com allows strangers to track you. Until recently parallax.com fetched stuff from Facebook. Why? Nobody knows. But FB is famed for tracking even non-FB members.

    It's all down to who you trust. As it always has been, in the browser or not.
    Removing scripting from browsers would solve it.
    Yes, it would certainly solve a lot of it. For this reason I think JS should be disabled for any site that is not somebody I trust. That is disabled by default.

    Similarly nothing should be fetched from any URL that I did not ask for. By default a browser should not visit any link I did not point it at or otherwise allow.

    By default a browser on a secure (https://) page should not be fetching anything from insecure resources (http://).

    It's this kind of wrong default behaviour that is turned off by most of the security measures I listed above.

    The defaults are nuts.


    On the other hand (There is always another hand). When it's my server and my web page. I want to be able to connect them together.
    I want to be able to do interactive stuff. I want JS. I also want that to be as secure as possible over the net.

    I could of course do this without a browser. Replace web page with some native GUI app I write. Replace HTTP(S) with whatever protocol and encryption I like. Would it be any more secure? Historically such apps have not had a good record. The browser though has everything we need.
  • potatoheadpotatohead Posts: 10,261
    edited 2016-02-11 17:49
    Either we have open computing, Turing complete, or we have secure, trusted systems.

    Pick one. No joke.
  • Heater.Heater. Posts: 21,230
    I do believe you are right. Secure or useful. Or some trade off between the two.

    My simple thesis, in the face of arguments about the insecurity of browsers, the web and JS, is that actually browsers are doing a pretty good job.

    Decades of experience have shown that building operating systems and native applications with similar use cases in mind (user interaction, remote access, multiple users, shared resources, security, etc.) has proven a dismal failure, with no end of security vulnerabilities in all manner of programs and devices over the years.





  • It's all risk versus reward. If the machine will do anything we want, it will do that for anyone. The physics of it do not include a permission model.
  • Well @Heater, let's start with HTTPS.

    It basically ensures that the communication between your browser and your server gets encrypted. Sadly, HTTPS does not prevent your browser from reading data from other servers, even unencrypted ones. Some browsers warn you about this with a message box, but not all do.

    The main problem is that browsers will happily fetch information from multiple servers. Inherited by design: one of the first browsers was called Mosaic for that reason.

    The second main problem is that programmers are lazy. Even you use node.js and what-do-you-call-it, 'bootstrap'?, for building your site.

    So you do not really know what code is running and from where. Sure, you could use Wireshark to monitor the requests for a while, but that still can't prove that the code you are using is not sending out some information on every odd Friday of every second month or so.

    As soon as you allow a user to enter some text to be displayed by your site, like usernames, profiles, all that 'nice'(?) stuff, you can be hit by unwanted scripts being injected into the web pages your server delivers.

    Like the problem eBay is facing and has decided to simply ignore, because they simply cannot do anything against it without removing 'needed' features.

    For example, one system I have been working on since 1996 is a bidding system for contractors trying to get a job. Sure, all of them are members and have usernames/passwords/access rights, you name it. But they are construction companies, and the guy doing the bidding saves his access code on his laptop with some 'freebie' password manager, and the whole thing is already compromised.

    Because his drunken son, daughter, wife, girlfriend or co-worker can just log in and do serious damage to his bids. And this happens way more often than I had anticipated.

    Sure, you can set up a server with node to listen on just one port, put a firewall there and claim you are secure. But you aren't. The user just hits F12 in Chrome or IE and can change any CSS, making stuff visible that you intended to be hidden, in any page of your server he can access. He can see the source and run it in debug mode. All the JS you are using. All the CSS, all the HTML.

    Your paranoia is sadly correct. And I did not even take into account what is possible with DNS spoofing or other infrastructure attacks. Just the usual webserver/browser/user.

    Enjoy!

    Mike
  • A system I use actually runs something to generate a unique machine ID, and that is combined with username and password.

    If any of that fails, you email for a one time code, good for like 5 minutes. It's a PITA due to browser updates, etc.. I end up using the code nearly all the time.
  • I know there must be a joke about 'making Cortana blush being its own reward' in here somewhere...
  • Heater.Heater. Posts: 21,230
    edited 2016-02-13 00:53
    msrobots,

    Ah, great, now we get down to the meat of the security issues.
    Sadly, HTTPS does not prevent your browser from reading data from other servers, even unencrypted ones
    For this we can set a "Strict-Transport-Security" header in the response from the server.

    The HTTP Strict Transport Security feature lets a web site inform the browser that it should never load the site using HTTP and should automatically convert all attempts to access the site using HTTP to HTTPS requests instead.
    The main problem is that browsers will happily fetch information from multiple servers.
    For this we can set a "Content-Security-Policy" header in the response from the server.

    CSP makes it possible for server administrators to reduce or eliminate the vectors by which XSS can occur by specifying the domains that the browser should consider to be valid sources of executable scripts. A CSP-compatible browser will then only execute scripts loaded in source files received from those whitelisted domains, ignoring all other scripts.
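    A sketch of such a policy (the directive values are illustrative; 'self' restricts each resource type to the page's own origin):

```javascript
// Sketch: a Content-Security-Policy that only allows scripts and
// other resources from the serving origin, so injected or
// third-party script tags are ignored by the browser.
const CSP_VALUE = [
  "default-src 'self'",     // fallback: fetch nothing from third parties
  "script-src 'self'",      // scripts only from our own server
  "frame-ancestors 'none'", // refuse to be embedded (clickjacking)
].join("; ");

function addCsp(res) {
  res.setHeader("Content-Security-Policy", CSP_VALUE);
}
// usage: call addCsp(res) on every response the server sends.
```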
    Even you use node.js and what-do-you-call-it, 'bootstrap'?, for building your site... So you do not really know what code is running and from where. Sure, you could use Wireshark to monitor the requests for a while, but that still can't prove that the code you are using is not sending out some information on every odd Friday of every second month or so.

    A very good point.

    Firstly my propanel experiment does not pull code from any CDNs or other websites. Only from my server. Of course I put those libraries onto my server from somewhere so how do I know what I have got is not back-doored and malicious? In principle I could review it all but that is never going to happen. I could just not use any code that I did not write myself, but that is not going to happen either. I have to trust that my upstream sources are good.

    Note also that the headers above won't allow the browser to fetch code from anywhere but my server.

    BUT... this problem is exactly the same as if I were to build propanel as a Windows, Mac or Linux GUI application. How would I know that all those libraries I include are not malicious? I don't.
    As soon as you allow a user to enter some text to be displayed by your site, like usernames, profiles, all that 'nice'(?) stuff, you can be hit by unwanted scripts being injected into the web pages your server delivers.
    Another very good point. And it worries me. And I have yet to get down to looking into all the issues of input validation and "sanitizing".

    I'm sure this is a solvable problem. I'm also sure it requires great care and attention. Have to get back to you on that one.

    So far my defences are: 1) The only textual user input I have is user names and passwords etc. 2) I off load all issues of dealing with that to the stormpath authentication service. Yeah, yeah, I know how can I trust stormpath?

    Again, this is not a problem limited to web applications. SQL injection can be done from native apps as well.
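    The standard defence is the same in both worlds: parameterized queries. A sketch (the db object here is hypothetical, standing in for any SQL driver that supports placeholders):

```javascript
// Sketch: user input travels as a bound parameter, never spliced into
// the SQL text, so input like  x' OR '1'='1  stays inert data.
function findUser(db, username) {
  // BAD:  "SELECT * FROM users WHERE name = '" + username + "'"
  // would let the input above rewrite the query.
  return db.query("SELECT * FROM users WHERE name = ?", [username]);
}
```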
    Like the problem eBay is facing and has decided to simply ignore, because they simply cannot do anything against it without removing 'needed' features.
    The ebay thing boggles my mind. They are inviting people to inject malicious code.

    My defence against that would be to validate and sanitize inputs. Good practice in any program. And NOT to invite people to provide code of their own.
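    A minimal sketch of the escaping part (a real app would lean on a templating engine's auto-escaping, but the principle is this):

```javascript
// Sketch: encode user-supplied text before interpolating it into a
// page, so an injected <script> tag renders as inert text rather
// than executing.
function escapeHtml(text) {
  return String(text)
    .replace(/&/g, "&amp;")   // must be first, or it re-escapes the rest
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;")
    .replace(/'/g, "&#39;");
}
```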

    Also note that the security features mentioned above prevent any code that does get injected from doing anything. Along with a few others dealing with cookies and security tokens we have not come to yet.

    There is actually a way to allow user input of JS and have it run safely "sandboxed" within your JS app. It's not trivial but I have seen it described and demonstrated. I'm not going there.
    saving his access code on his laptop with some 'freebie' password manager and the whole thing is already compromised.
    Yep. BUT this is an issue common to any system that allows such users and logins. It is not a web app or JS problem.
    Sure, you can set up a server with node to listen on just one port, put a firewall there and claim you are secure. But you aren't. The user just hits F12 in Chrome or IE and can change any CSS, making stuff visible that you intended to be hidden, in any page of your server he can access. He can see the source and run it in debug mode. All the JS you are using. All the CSS, all the HTML.
    Firstly I would hope that there is nothing in the delivered web page that the user should not have. Relying on obfuscation or visibly hiding stuff would be silly.

    There should not be any problem allowing the user to read, and use, the JS and CSS in the page; that should not let him do anything meaningful. After all, many secure systems run on open source software. For that we need to get into a discussion of "httpOnly cookies" and web tokens, which propanel uses.
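    As a taste of that discussion, here is a sketch of what such a session cookie looks like (the attribute choices are illustrative): HttpOnly keeps document.cookie from reading it, so even an injected script cannot exfiltrate the session token, and Secure keeps it off plain HTTP.

```javascript
// Sketch: build a Set-Cookie value for an httpOnly session cookie.
// The browser returns it with each request, but page JavaScript
// cannot read it, which is the property that protects the token.
function sessionCookie(token) {
  return `session=${token}; HttpOnly; Secure; SameSite=Strict; Path=/`;
}
// usage: res.setHeader("Set-Cookie", sessionCookie(signedToken));
```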
    Your paranoia is sadly correct.
    I know.

    That's why I like to have this discussion. To find out what we can do. How far can we get with securing the thing before it becomes unusable.

    I look forward to hearing about the dns-spoofing or other infrastructure attacks. Again they sound like general problems with any networked system. Not specific to a browser and JS application.

    In the mean time I'm working on getting Propanel actually hooked up to a Propeller so that people have something to hack :)

  • rjo__rjo__ Posts: 2,114
    With all of this going on... is there any wonder that Windows 10 tries to monitor everything?
    How else is MS going to offer a "secure" operating system for their customers and fix what needs to be fixed before it destroys everyone's computers?

    "ahh yes a Windows 10 computer just failed in Kabul... let's look at that."

  • The volume and type of tracking, intrusion and data mining that Windows 10 does, and the infrastructure Microsoft has set up to support it, seems to go far beyond what could possibly be imagined as necessary for a secure personal computer. It's doubtful that Microsoft is even capable of providing a secure operating system in the first place.
  • rjo__rjo__ Posts: 2,114
    Let's test that.
  • rjo__rjo__ Posts: 2,114
    To follow through I will need a volunteer... if anyone is interested, PM me. Then I will try to set something up with MS public relations.