Yes, I can accept a shorter expiry time, make my certificates on a 64-bit machine, or install OpenSSL v1.x.y, where the bug was fixed a year or so ago. As I'm only learning what is what and trying stuff out just now, it's not so important.
The thing was that this bug hit me after fighting with OpenSSL configuration and usage, not to mention tweaking the code that's going to use SSL/TLS, for nearly three days. I was nearly throwing monitors and computers out of the window :)
Well, the nastiest bugs lead us all down a few blind alleys and force us to redo work several times over. I really wanted to be helpful, not just bash a particular OS out of personal resentment.
I installed Opera on my notebook and hope that its up-to-date security system will resolve my wifi hotspot problem. I have been Googling and fiddling with the problem for well over a month now. Of course, Opera may be just another blind alley, but it seems the issue is related to SSL/TLS support in browsers: Firefox and some older versions of Chrome are not fully compliant with the latest versions of TLS.
Opera claims it is. So are the most recent Internet Explorer and the latest Chrome. Since those work in my W7 Starter, this might be the reason the servers keep dumping me in Linux.
Was there a Y2K hoax? ... On a smaller scale, I have seen (back in the eighties) December->January transitions with time-handling bugs that were literally life-threatening when they happened.
Maybe I only have a limited grasp of the whole picture. The 16-bit rollover for my system was going to happen on my watch in 1987(?); the boss said "fix it". I ended up reserving a second 16-bit word address for the overflow, so the rollover won't happen again for centuries. The rollover came and went, we got no phone calls, so it must have worked. A couple of weeks later, we heard that SEVERAL large financial institutions (one of which may or may not have the letters T, R and W in its name) had suffered losses for ignoring this error. When the rollover occurred, the uncorrected clocks went back to Jan 1, 1900 (or whatever) and they lost MILLIONS of US dollars per day till they fixed it a couple of weeks later. For some reason, the story only ran on local LA news, for a short while.
The same short-sighted individuals responsible for getting burned by not fixing the 16-bit rollover became the "experts" who warned of the Y2K "problem". However, they seemed to fail to grasp the difference between a rollover of the internal value (a major issue) and the external display of the value (minor/cosmetic). While there was mention that some critical systems existed that calculated the date from the external representation rather than the internal one (rather insane), I never found an example. (The only issue was more like exporting the data to an external spreadsheet, where a two-digit year wouldn't sort correctly after 1999. Big deal: export four digits, done.)
Knowing this, and that midnight in Chicago occurs close to last, I was able to get my tickets to Asia for 1/10th the usual cost, and had practically the whole plane to myself. My recollection of Y2K is sleeping across 5 seats in a 747, and being paid double time to be "on call" in case the world ended. Best vacation ever.
I never did hear of any Y2K issues that were not cosmetic display issues. But then again, there was all that leftover champagne that was on close-out until June that year; it may have affected my memory. We had champagne and cheeseburgers every Friday that summer. So yeah, Y2K seemed a bunch of hype and nonsense both before and after.
Well, I know for a fact that there were tons of real applications which stored the year in two digits, but that was only one of the problems. There was also, just to take one, the problem of systems reading values from users or from files. I would say that the majority of pre-2000 applications used two digits for that, and even if the year was stored in full internally there would be something like 'year = input + 1900'. And I've seen 'printf("19%02d", tm_year);' as well (an easy error to make, because the C runtime struct tm has a field 'tm_year' which is defined as years since 1900). And so on and so forth. Lots of little subtle bugs.
-Tor
But that's consumer applications, yes? Where the app itself is not expected to last more than a couple of years. Changing the value of the data from 1991 to 91 is just asking for trouble; I don't think I saw this in any "critical" applications, where an error would cause the collapse of modern civilization, as advertised for Y2K.
Consumer? I'm not sure I follow. Those were applications, and I'm not aware of any application that's expected to last only a couple of years; I've never heard of such a thing. No, these were part of the majority of software: something used for some purpose by some industry, data provider, imagery processor and the like. Of course nothing that would collapse modern civilization (as I mentioned in my first posting on this, the "take to the hills" stuff was of course rubbish). The point is that these bugs were important. Lots of things would have failed and stopped working, causing interruptions and costing money. They were real, they had to be fixed, and they were for the most part fixed. One thing I know about in detail would be weather forecasts: I personally fixed some bugs that would have dumped the quality of weather forecasts back to pre-1990 levels if they hadn't been fixed before the year 2000. Would they destroy civilization? No. But it would have been at least inconvenient, and worst case some ships would have ended up in a dangerous situation because of it. But we'll never know, fortunately, because the bugs were fixed.
It seems to be impossible to talk about Y2K bugs without including the word "COBOL". It seems there are billions of lines of COBOL in use around the world. I've read that the number of transactions COBOL handles every day rivals the number of Google searches, and so on.
So, I have always wondered: where do they keep all the COBOL programmers?
After hanging around the industry for decades, I had never met one. Perhaps I should not expect to, as my work has been far away from financial institutions and such.
Finally, this year I have one as a neighbor. They do exist!
They may be fine programmers and all, but they don't seem to be geeks. You don't see them showing up on GitHub proudly displaying their talents.
Actually, the guy that lives next door to me is a COBOL programmer. For some reason I've never seen him smile, but maybe that's because he has me for a neighbor.
Anyone else spotted such creatures? Where? When?
We have them where I work, but they're in another division. They all seem to be about 60 years old, even their junior programmers. Maybe they're born that age.
Here in Taiwan, the year is 102, not 2013. All banking is done in years since the founding of the Republic of China, so the Y2K bug was not a potential problem within that system. Think about it: there are other nations with other calendars... not everyone is locked into the Christian era.
Could Taiwan have a Y2039 problem? Since the current year is 102, why use a 32-bit integer to store it when you can get by with 8 bits? In 2039, the year will be 128 in Taiwan, which translates to -128 in a signed byte. Of course, you could use an unsigned byte, but that only puts the problem off for another 128 years.
According to the Y1C article "As generally speaking only governmental offices used the official system, Y1C computer bug impact on the private sector was minimal."
http://en.wikipedia.org/wiki/Transport_Layer_Security
http://en.wikipedia.org/wiki/Time_formatting_and_storage_bugs
http://en.wikipedia.org/wiki/Y1C_Problem
https://www.youtube.com/watch?v=QJQ691PTKsA&list=UUoxcjq-8xIDTYp3uz647V5A&index=25