All your Intel processors slowing down by 30% soon!
Heater.
Seems there is a major security bug in Intel processors that is getting a software workaround in Windows, Linux and Mac very soon.
https://www.theregister.co.uk/2018/01/02/intel_cpu_design_flaw/
Looks like the workaround causes a big performance hit. Like 30%!
AMD not affected.
Makes me giggle. Back in the day, when I worked for Northern Telecom, our team got a huge thick document from Intel, under NDA, detailing all the known bugs in the then-new 286 processor. Most of them were holes in its memory protections. Seems the tradition continues.
Comments
Apparently Intel's CEO sold off as many shares as he was allowed to, recently.
AMD up almost 7% as they don't have this particular bug.
This is kind of huge if you are a Google, Facebook, Amazon, Microsoft kind of cloud service provider and suddenly find you need 20% or 30% more data center to serve the same capacity. Not to mention the electric bills...
And what if consumers get into a class action over having been sold 30% less performance than they were offered?
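Quick arithmetic for scale: if throughput really does drop 30%, holding capacity constant takes roughly 1/0.7 ≈ 1.43 times the hardware, so closer to 43% more machines (and power), not 30%.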
That's just plain wrong. Insider trading. The CEO should def take a bath on this one. Of course it could all be a carefully choreographed scam, if the stock drops, he buys a LOT, then a workaround is found in a week and the stock pops up higher than ever.
Looks like old BK has been on a selling spree since October, cashing in over $60 million in stock and stock options. He may be looking to leave, or rather may have to leave, soon.
From the report, this is a design flaw in their processors that has existed for a decade and was known by internal security folks, who were to keep silent until a fix was established. Intel cannot fix this on their own, so it requires a software patch as a workaround. That sucks.
"Recent reports that these exploits are caused by a “bug” or a “flaw” and are unique to Intel products are incorrect. Based on the analysis to date, many types of computing devices — with many different vendors’ processors and operating systems — are susceptible to these exploits.
Intel is committed to product and customer security and is working closely with many other technology companies, including AMD, ARM Holdings and several operating system vendors, to develop an industry-wide approach to resolve this issue promptly and constructively. Intel has begun providing software and firmware updates to mitigate these exploits. Contrary to some reports, any performance impacts are workload-dependent, and, for the average computer user, should not be significant and will be mitigated over time."
Nothing to see here, move along. :swear: :swear: :swear: :swear:
Intel is famous for declaring that things are not bugs. And/or that they have little impact. See the F00F bug, the FDIV bug, etc.
Back in 1980 or so I was amused to find a bug in the Intel 8259 interrupt controller chip referred to as a "feature". Actually in the datasheet. Complete with example code required to work around it.
This "it's not a bug, it's a feature" line has been a running joke for many years. Not sure if it started with that 8259 datasheet though.
Admittedly other processors may suffer from versions of this particular exploit. There are similar architectures involved. We shall see. But that does not make it less of a flaw in my eyes.
If you want to make a more informed judgement on the situation try watching this presentation by one of the exploit developers:
How serious is this issue? How long will it take them to fix?
Kicking myself for not seeing the news that the Intel CEO sold stock on Dec 19; we could have bought more.
21 other architectures also suffer, as the link you gave said all 22 tested suffer from this effect.
Interesting that precision is a problem here...
Sounds like a 'fix' is already in the pipeline, which is why this is going wider-public now.
Serious ? I guess that depends on your level of paranoia, but if it has motivated Microsoft and Linux to move, I'd say someone, somewhere, is quite concerned...
Operating systems need a kernel patch to work around it.
I read that Microsoft is planning on pushing an update on January 9th.
See video above to judge how serious you think it might be.
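For the Linux side, once the patched kernels start shipping there should be a way to see whether the fix is actually active on your box. A minimal sketch in C, assuming a kernel new enough to expose the sysfs "vulnerabilities" files (that interface arrives alongside the patches, so its presence is an assumption, not something from the articles above):

/* Sketch: report the kernel's Meltdown mitigation status on Linux.
 * Assumes a kernel new enough to expose the sysfs "vulnerabilities"
 * files; on older kernels the file simply won't exist. */
#include <stdio.h>

int main(void)
{
    const char *path = "/sys/devices/system/cpu/vulnerabilities/meltdown";
    char line[256];
    FILE *f = fopen(path, "r");

    if (!f) {
        printf("%s not found (kernel predates the interface, or not Linux)\n", path);
        return 1;
    }
    if (fgets(line, sizeof line, f))
        printf("meltdown: %s", line);   /* e.g. "Mitigation: PTI" or "Vulnerable" */
    fclose(f);
    return 0;
}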
What will Amazon etc do? Buy 30% more Intel chips or look at AMD/Nvidia/other?
The video talks about browser code, so the focus seems to be consumer desktops.
I doubt that server farms are going to be too impacted, as they are not running unknown JavaScript.
It is not JUST Intel, either. This from ARM:
"I can confirm that Arm have been working together with Intel and AMD to address a side-channel analysis method which exploits speculative execution techniques used in certain high-end processors, including some of our Cortex-A processors," says ARM public relations director Phil Hughes. "This method requires malware running locally and could result in data being accessed from privileged memory."
Ah, yeah. I guess it does affect some ARM processors, such as the A series, so it might affect phones, tablets and such.
https://axios.com/massive-chip-flaw-not-limited-to-intel-2522178225.html
This might also explain why Intel's reply to the reports only stated that it is not just an Intel issue and that other processors are affected as well.
Firstly, this is nothing specific to JavaScript or JavaScript in the browser. Everything you see in that video could be done in C++ or a ton of other languages.
Server farms certainly do run a lot of unknown JavaScript, in the form of node.js servers. They also run a lot of unknown code in every other language. I'm talking here about the cloud servers you or I can rent from Amazon, Google, Microsoft and many other such cloud providers. All of this infrastructure is now as vulnerable as our PCs running a browser.
Certainly the JS in the browser is the focus for the public at large.
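To make that concrete, the whole family of attacks rests on one mundane primitive: you can tell from a load's latency whether the data was already in the cache. A minimal sketch in C, assuming an x86-64 machine with GCC or Clang; it is not an exploit, it just shows the measurement step:

/* Minimal sketch of the cache-timing primitive these exploits rely on:
 * time a load and decide from the latency whether the line was already
 * cached. Assumes x86-64 with GCC or Clang and a CPU with rdtscp/clflush. */
#include <stdint.h>
#include <stdio.h>
#include <x86intrin.h>              /* _mm_clflush, _mm_mfence, _mm_lfence, __rdtscp */

static uint64_t time_load(volatile uint8_t *p)
{
    unsigned aux;
    _mm_mfence();
    uint64_t start = __rdtscp(&aux);
    (void)*p;                       /* the load being timed */
    uint64_t end = __rdtscp(&aux);
    _mm_lfence();
    return end - start;
}

int main(void)
{
    static uint8_t probe[64];

    probe[0] = 1;                               /* touch it: now cached */
    uint64_t hit = time_load(&probe[0]);

    _mm_clflush(&probe[0]);                     /* evict it from the cache */
    _mm_mfence();
    uint64_t miss = time_load(&probe[0]);

    printf("cached: %llu cycles, flushed: %llu cycles\n",
           (unsigned long long)hit, (unsigned long long)miss);
    return 0;
}

The flushed load comes back a lot slower than the cached one, and that latency difference is the only "output channel" these exploits need.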
scenario...
Thieves have found that they can turn the car on by shoving a screwdriver into the ignition lock and forcing it to rotate to engage the engine. Did the car manufacturers have to install a gadget to protect the ignition lock from having a screwdriver inserted? No! There were no remedies, no speculation about class actions, etc.
Isn't this the same thing? Without thieves, this bug isn't a bug because clearly it works as designed. It is only a bug when a thief forces the CPU to run malicious code specifically designed to break the CPU in a way that can be used to steal something.
I think it's high time that the law caught up with reality and prosecuted these villains who write viruses of any kind. But there is no way this will happen, because the lawyers make a living scamming the honest people who get caught up in legal messes.
Analogies are always flawed and analogies between cars and computers are especially terrible but let's go with this one.
So the car maker finds out that there is a dead easy way to get into your car and drive it away. Without breaking anything. Then after your next service you find they have thoughtfully fixed that issue. But in the process knocked your top speed down by 25% and doubled your 0 to 60 time.
I think car owners would have a case to complain about here. Well, there are thieves and there always will be.
Given that Intel has known about this possibility for many years and let it slide I think they have some responsibility. Personally I don't think writing any kind of code should be criminalized. It's just code. No different from anything you or I write.
There are no thieves involved in the current fiasco. The exploit has been demonstrated by security researchers. Academics at a respected university. Should they now be arrested for creating such code?
Such a law would halt all such security research. Except by the bad guys, of course. It would make the work of penetration testers impossible. We would be in a far worse situation than we are now.
Now, of course if one uses such code for criminal purposes, theft, damage, ransom, whatever then one should expect to be punished for those crimes. As normal.
Finding the bad guys that do that sort of thing is next to impossible. And expensive. Better to not give them the chance in the first place. Fix the damn bug!
At what level of snooping should Intel (& AMD and ARM and Apple and Samsung....) stop?
If someone monitors Icc and can deduce some internal actions from that, should Intel modify the silicon again?
All this so that my program running on a machine cannot access the data of your program running on the same machine. An important feature for multi-user systems, sandboxing untrusted code, running virtual machines, security systems, etc, etc.
So, when it is found that there is a hole in all that protection mechanism, it's certainly a flaw that needs to be addressed. I'd call it a "bug". You may not. Makes no odds what we call it. Given that this possibility has been known about for many years I think "bug" is appropriate.
Obviously I have no idea what a silicon level fix would be. Not being a processor designer myself. But clearly whatever workaround the MS, Apple, Linux and other guys are working on to paper over this problem in software could just as well be done in silicon.
I don't think monitoring Icc is of relevance here. That is not something that can be done from software. And it requires physical access to the machine.
"On vulnerable systems, Meltdown allows user programs to read from private and sensitive kernel address spaces, including kernel-sharing sandboxes like Docker or Xen in paravirtualization mode. And when you've stolen the keys to the kingdom, such as cryptographic secrets, you'll probably find you can indeed corrupt, modify or delete data..."
Sounds like VW and their emissions fix.
For decades computers have been following Moore's Law. Which most people translate from being about transistors per chip to speed.
It has been suggested for a few years now that Moore's Law has come to an end, or is at least approaching it. Processor speeds have not been rising as fast recently.
Well...if the dire predictions of significant processor slow down due to implementing the work around for this bug are true, then Moore's law will have ended with a big thud. The first year ever that processors have gotten slower!
Moore made his observation about the doubling of integration density of chips in 1965. Before there was an Intel, before anyone even put RAM on a chip.
Actually I think it's amazing. The first integrated circuits had two transistors, soon they got to 4 then 8.... Even with these few, tiny, numbers Moore drew a plot of progress over time on a log scale, stuck a straight line through it and could see where all this was heading already.
With that insight he could set out raising money and planning for the integration density he could see coming down the pipe. Hence Intel.