The Y2K Scare, Ten Years After
I'm working with Canadian journalist Dan Gardner on his upcoming book about expert predictions, and why they succeed or fail. I was wondering if you could verify that this article was indeed written by you: http://www.kunstler.com/mags_y2k.html.
Also, Mr. Gardner wonders if you would care to comment on why you think those particular predictions failed to come to pass, as well as any other insights you have into the world of predictions. We would greatly appreciate hearing back from you, with any comments you may have on the subject.
Thanks so much,
Research Assistant to Dan Gardner
Yes, I wrote that about Y2K.
I will venture to comment -- which, I warn you, may seem convoluted.
What we've been seeing the past decade might be understood as a kind of meta event, of which the Y2K episode was an early chapter. Overall, this meta event has been about systemic socioeconomic collapse as a result of over-investments in hypercomplexity. This is essentially the argument made by Joseph Tainter in his seminal book The Collapse of Complex Societies (which I had not read at the time of Y2K, but which informs a lot of my later writing).
We are seeing now, ten years later, a full playing-out of these trends in the collapse of banking and capital finance, and consequently in the real economy of ordinary business activity and households. The immediate causes of this fiasco were the hypercomplex frauds and swindles run by Wall Street in the form of abstruse securities and derivatives, and the creation of excessive credit (debt, leverage) that went with it -- and the response of government to the crisis. Peak Oil, so called -- more generally global resource scarcity -- also played a part in this as it foreclosed further conventional economic "growth," which in turn has left us unable to service our massive debt, or run a revolving credit economy.
Y2K was, in my opinion, an early apprehension of the dangers of growing hypercomplexity. Obviously, this anxiety, shall we say, came to focus on the computerization of complex systems such as electric power grids, banks, water purification, air traffic control, et cetera. Remember, in the late 1990s, computerization was increasing at a phenomenal rate of both power and application to vital activities. It came on very quickly and it began to dawn on people paying attention that there might be dangerous unintended consequences in turning over control of so many vital activities to complex machines and their algorithms.
Beyond this dimension of the event, there was a psycho-social realm that manifested eventually in the Y2K Scare. The Internet was then a suddenly new thing, and it rapidly created a "community" of linked intelligences and sensibilities which included those people I mentioned above -- the ones who were paying attention to increasing hypercomplexity, and who were concerned about it. Because this was a realm of human psychology, a "narrative" was constructed around a seemingly small but very real feature of the situation: the date programming in legacy computer systems. These were systems that were designed and installed fairly early in the computer revolution. They were therefore somewhat crude. The systems were bought by the municipal water agencies, the power plants, banks, government agencies, et cetera (they were expensive). To save lines of programming in limited memory, dates were condensed to the last two digits of a given year -- 91, 92, etc. It was widely feared that the computers would not recognize the turning of the year 2000, and might not function at the "rollover" of that date. It was believed that this embedded problem would cause many vital computer systems to crash -- and create tremendous disorder in everything from banking to air traffic control, with linked, "cascading" failures.
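The two-digit shortcut described above, and the "windowing" technique many remediation teams used to fix it, can be sketched in a few lines of Python. This is a hypothetical illustration of the class of bug, not code from any actual legacy system:

```python
# Illustrative sketch of the Y2K two-digit year bug (hypothetical,
# not taken from any real legacy system).

def age_in_years(birth_yy: int, current_yy: int) -> int:
    """Naive age calculation as a legacy system might have done it,
    using only the last two digits of each year."""
    return current_yy - birth_yy

# Through 1999 the shortcut works:
assert age_in_years(65, 99) == 34   # born 1965, computed in 1999

# At the rollover, the year 2000 stored as 00 makes the result negative:
print(age_in_years(65, 0))   # -65, instead of 35

# One common remediation, "windowing": interpret two-digit years below
# a chosen pivot as 20xx and the rest as 19xx (pivot value is arbitrary
# here; real systems picked one suited to their data).
PIVOT = 30

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    """Expand a two-digit year to four digits using a pivot window."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_year(99) == 1999
assert expand_year(0) == 2000
assert expand_year(0) - expand_year(65) == 35   # correct age at rollover
```

Windowing was a stopgap rather than a cure: it merely moves the ambiguity out past the pivot year, which is why the more thorough fixes involved reprogramming fields to four digits or replacing the hardware outright, as described below.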
The new "community" of the Internet spread the word about this problem and concern quickly turned to anxiety and terror. The fact that so many early members of the Internet "community" were themselves people who happened to be computer-savvy lent validity to the Y2K narrative. A lot of the people who were chattering on the Y2K "bulletin boards" were themselves programmers, many of them quite sophisticated and informed... many of these actually working on the Y2K problem at the very places where there was a fear of failure. In short, these were people who had to be taken seriously. I did take them seriously. As the "rollover" date approached, I took the position that we were in for a lot of trouble.
The trouble, it turned out, was averted. This is a part of the story usually overlooked by those who mock the Y2K episode. Billions of dollars were spent, and scores of thousands of man-hours were dedicated, to mitigating this problem. Programmers went into these old legacy systems and either successfully reprogrammed them or changed out the hardware altogether -- note, this period coincided with the tech boom of the late 1990s precisely because so much new computer equipment was sold. In any case, there were no "cascading" failures of the kind that had been most feared. Lots of systems did fail, but not a critical mass of the largest and most critical ones. The Y2K incident passed into history as a joke.
I don't think it was a joke. I regard it still as a legitimate potential catastrophe that was averted. The longer-lasting consequence of it was that it alerted thinking people to the problems associated with the larger issue of over-investment in hyper-complexity. This has become the over-arching "narrative" of the period we are now living through, with all its vicissitudes, and it was what prompted me to write The Long Emergency, which was published in 2005, and World Made By Hand, published in 2008.
Email Comments From Readers
From Pierre Cloutier (email@example.com)
Dear Mr. Kunstler -
I am a 60-year old computer programmer, and I wish to express my strongest support for your position on the Year 2000 problem.
The warnings and alarms that were sounded served to galvanize the technical community. The catastrophic failure of computer systems was avoided thanks to the diligent efforts of unsung heroes who were able to patch computer systems in time for the millennium rollover. (I'm not one of them; I switched to 4-digit years long ago.)
I have documented my own personal experience:
Would that Peak Oil were as simple a problem...
From Daniel Carleton (firstname.lastname@example.org)
I know you've garnered more than your fair share of flak on the subject. However, I'd like to take a moment to give you a glimpse into the trenches of that time. Back in those days I was a nuclear power plant operator. I had left one plant in Florida for another in New York in January 1999. Both plants had set up task forces to investigate and identify potential Y2K trouble spots. To the best of my knowledge there were not that many. Power plants, as well as the electric industry as a whole, are not as sophisticated as Hollywood action movies would lead the American public to believe.

So as the final seconds of 1999 ticked by, there I was, not at a party, but seated in a breakroom watching the clock. It was not my crew's night to work, but the company paid for an extra shift to be there 'just in case'. As the clock rolled over midnight we made a comprehensive tour of the facility and found nothing amiss. After the tour I called my former coworkers in Florida and heard the same story: extra bodies were in place 'just in case'. Over the years I've heard similar tales from my peers at other utilities. Would an entire industry have paid millions of dollars for overtime and task forces on something that wasn't a threat?

People would ask me in the months and weeks leading up to Y2K what our response would be if worse came to worst. My response was always the same: 'I have a pocket full of keys. I have been trained and drilled relentlessly for when worse comes to worst. There is not a single system that I cannot take to manual and render safe.' Of course, all of that training would come in handy three and a half years later during the blackout, but that's another story.
From Diana Haugh (email@example.com)
If anything, I think you understated your case.
The federal agency I work for spent a small fortune on upgrading technology for Y2K. The one system we missed did crash on January 1, 2000. (We just didn't report it, for face-saving reasons.)
I can only imagine what would have happened to our agency's ability to function had we not prepared by upgrading, replacing and strengthening everything we could identify.
The fact that Y2K did not bring society to its knees doesn't impeach the validity of it or any other warning. Rather, it shows that an approaching crisis can be mitigated by wide acceptance of the warning coupled with vigorous action. How sad that a crisis successfully prevented by a global response and dedication of vast resources is used as an excuse to ignore the next impending disaster.
Bravo on your Y2K piece. It's right on target.
But you neglected one very important aspect of the situation back in '99, which was front and center to those of us who were in the trenches trying to figure out WTF could happen and how much in the way of time and resources to throw at it: It was that most large private institutions (banks, corporations, utilities, etc.) were *legally constrained* not to report the status of either their potential Y2K exposure, **or even of their successes in remediation, lest an implied warranty be made**. This left us in the dark, unable to make rational decisions.
One wonders what we're not being told during THIS stage of the hypercomplexity game.
BTW, I think Ed Yourdon's 'Time Bomb 2000' is still a classic -- probably the first critical study of the potential for single-point failure of the infrastructure on which our entire civilization depends... or should I say "... from which it hangs by a thread?"
SF Bay Area