Year 2000 problem

"Y2K" redirects here. For other uses, see Y2K (disambiguation).
An electronic sign displaying the year incorrectly as 1900 on 3 January 2000 in France

The Year 2000 problem is also known as the Y2K problem, the Millennium bug, the Y2K bug, or simply Y2K. Problems arose because many people, including programmers, represented the four-digit year with only its final two digits, making the year 2000 indistinguishable from 1900. The assumption that a twentieth-century date was always intended caused various date-related errors.

In 1997, the British Standards Institute (BSI) developed a standard, DISC PD2000-1,[1] which defines "Year 2000 Conformity requirements" as four rules:

  1. No valid date will cause any interruption in operations.
  2. Calculation of durations between, or the sequencing of, pairs of dates will be correct whether or not the dates fall in different centuries.
  3. In all interfaces and in all storage, the century must be unambiguous, either specified, or calculable by algorithm.
  4. Year 2000 must be recognized as a leap year.

It identifies two problems that may exist in many computer programs.

Firstly, the practice of representing the year with two digits became problematic because of logical errors arising upon "rollover" from x99 to x00. This caused some date-related processing to operate incorrectly for dates and times on and after 1 January 2000, and on other critical dates that were billed "event horizons". Without corrective action, long-working systems would break down when the "... 97, 98, 99, 00 ..." ascending numbering assumption suddenly became invalid.

Secondly, some programmers had misunderstood the Gregorian calendar rule for century years and assumed the year 2000 would not be a leap year. In fact, years divisible by 100 are not leap years unless they are also divisible by 400; thus the year 2000 was a leap year.

Companies and organizations worldwide checked, fixed, and upgraded their computer systems.

The number of computer failures that occurred when the clocks rolled over into 2000, in spite of remedial work, is not known; among the reasons is the reluctance of organisations to report problems.[2]

Background

Y2K is a numeronym and was the common abbreviation for the year 2000 software problem. The abbreviation combines the letter Y for "year", and k for the SI unit prefix kilo meaning 1000; hence, 2K signifies 2000. It was also named the Millennium Bug because it was associated with the popular (rather than literal) roll-over of the millennium, even though the problem could have occurred at the end of any ordinary century.

The Year 2000 problem was the subject of the early book, Computers in Crisis by Jerome and Marilyn Murray (Petrocelli, 1984; reissued by McGraw-Hill under the title The Year 2000 Computing Crisis in 1996). The first recorded mention of the Year 2000 Problem on a Usenet newsgroup occurred Friday, 18 January 1985, by Usenet poster Spencer Bolles.[3]

The acronym Y2K has been attributed to David Eddy, a Massachusetts programmer,[4] in an e-mail sent on 12 June 1995. He later said, "People were calling it CDC (Century Date Change), FADL (Faulty Date Logic) and other names."

The problem started because, on both mainframe computers and later personal computers, storage was expensive, ranging from as little as US$10 per kilobyte to, in many cases, US$100 per kilobyte or more.[5] It was therefore very important for programmers to minimise storage usage. Since programs could simply prefix "19" to the year of a date, most programs internally used, or stored on disk or tape, data files in which the date was a six-digit field in the form MMDDYY: two digits each for the month, day, and year. As space on disk and tape was also expensive, this saved money by reducing the size of stored data files and databases.

Many computer programs stored years with only two decimal digits; for example, 1980 was stored as 80. Some such programs could not distinguish between the year 2000 and the year 1900. Other programs tried to represent the year 2000 as 19100. These faults could cause complete failures and could make date comparisons produce incorrect results. Some embedded systems, making use of similar date logic, were expected to fail and cause utilities and other crucial infrastructure to fail.
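
The following sketch, written in Python purely for illustration (the affected systems were typically COBOL, RPG, or assembler programs), shows the two failure modes just described, assuming a hypothetical MMDDYY record layout and a years-since-1900 counter:

    # Hypothetical illustration of two-digit year handling, not taken from any
    # real system.

    def parse_mmddyy(record):
        """Interpret a six-digit MMDDYY field, assuming the 20th century."""
        month, day, yy = int(record[0:2]), int(record[2:4]), int(record[4:6])
        return 1900 + yy, month, day        # the fateful "19" assumption

    def display_year(years_since_1900):
        """Naive display logic: prefix "19" to a years-since-1900 counter."""
        return "19" + str(years_since_1900)

    print(parse_mmddyy("010100"))   # (1900, 1, 1): 1 January 2000 read as 1900
    print(display_year(100))        # "19100" instead of "2000"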

Some warnings of what would happen if nothing was done were particularly dire:

"The Y2K problem is the electronic equivalent of the El Niño and there will be nasty surprises around the globe."

John Hamre, United States Deputy Secretary of Defense[6]

Special committees were set up by governments to monitor remedial work and contingency planning, particularly by crucial infrastructures such as telecommunications, utilities and the like, to ensure that the most critical services had fixed their own problems and were prepared for problems with others. While some commentators and experts argued that the coverage of the problem largely amounted to scaremongering,[7] it was only the safe passing of the main "event horizon" itself, 1 January 2000, that fully quelled public fears. Some experts who argued that scaremongering was occurring, such as Ross Anderson, Professor of Security Engineering at the University of Cambridge Computer Laboratory, have since claimed that, despite sending out hundreds of press releases about research results suggesting the problem was not likely to be as serious as some had suggested, they were largely ignored by the media.[7]

Programming problem

The practice of using two-digit dates for convenience predates computers, but was never a problem until stored dates were used in calculations.

The need for bit conservation

"I'm one of the culprits who created this problem. I used to write those programs back in the 1960s and 1970s, and was proud of the fact that I was able to squeeze a few elements of space out of my program by not having to put a 19 before the year. Back then, it was very important. We used to spend a lot of time running through various mathematical exercises before we started to write our programs so that they could be very clearly delimited with respect to space and the use of capacity. It never entered our minds that those programs would have lasted for more than a few years. As a consequence, they are very poorly documented. If I were to go back and look at some of the programs I wrote 30 years ago, I would have one terribly difficult time working my way through step-by-step."

Alan Greenspan, 1998[8]

In the first half of the 20th century, well before the computer era, business data processing was done using unit record equipment and punched cards, most commonly the 80-column variety employed by IBM, which dominated the industry. Many tricks were used to squeeze needed data into fixed-field 80-character records. Saving two digits for every date field was significant in this effort.

In the 1960s, computer memory and mass storage were scarce and expensive. Early core memory cost one dollar per bit. Popular commercial computers, such as the IBM 1401, shipped with as little as 2 kilobytes of memory. Programs often mimicked card processing techniques. Commercial programming languages of the time, such as COBOL and RPG, processed numbers in their character representations. Over time the punched cards were converted to magnetic tape and then disk files, but the structure of the data usually changed very little. Data was still input using punched cards until the mid-1970s. Machine architectures, programming languages and application designs were evolving rapidly. Neither managers nor programmers of that time expected their programs to remain in use for many decades. The realization that databases were a new type of program with different characteristics had not yet come.

There were exceptions, of course. The first person known to publicly address this issue was Bob Bemer, who had noticed it in 1958 as a result of work on genealogical software. He spent the next twenty years trying to make programmers, IBM, the government of the United States and the ISO aware of the problem, with little result. This included the recommendation that the COBOL PICTURE clause should be used to specify four digit years for dates.[9] Despite magazine articles on the subject from 1970 onward, the majority of programmers and managers only started recognizing Y2K as a looming problem in the mid-1990s, but even then, inertia and complacency caused it to be mostly unresolved until the last few years of the decade. In 1989, Erik Naggum was instrumental in ensuring that Internet mail used four digit representations of years by including a strong recommendation to this effect in the Internet host requirements document RFC 1123.[10]

Saving space on stored dates persisted into the Unix era, with most systems storing dates in a single 32-bit word, typically as a count of seconds elapsed since some fixed date.

Resulting bugs from date programming

Webpage screenshots showing the JavaScript .getYear() method problem, which depicts the so-called Year 2000 problem.
An Apple Lisa does not accept the date

Storage of a combined date and time within a fixed binary field is often considered a solution, but the possibility for software to misinterpret dates remains because such date and time representations must be relative to some known origin. Rollover of such systems is still a problem but can happen at varying dates and can fail in various ways. Examples are described below.

Date bugs similar to Y2K

4 January 1975

This date overflowed the 12-bit field that had been used in the DECsystem-10 operating systems. There were numerous problems and crashes related to this bug while an alternative format was developed.[15]

9 September 1999

Even before 1 January 2000 arrived, there were also some worries about 9 September 1999 (albeit less than those generated by Y2K). Because this date could also be written in the numeric format 9/9/99, it could have conflicted with the date value 9999, frequently used to specify an unknown date. It was thus possible that database programs might act on records containing unknown dates on that day. Data-entry operators commonly entered 9999 into required fields for an unknown future date (e.g., a termination date for cable television or telephone service) in order to process computer forms using CICS software.[16] Somewhat similar to this is the end-of-file code 9999 used in older programming languages. While fears arose that some programs might unexpectedly terminate on that date, the bug was more likely to confuse computer operators than machines.
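
A minimal sketch of the sentinel-value collision described above; the field layout and sentinel convention are hypothetical and shown only to illustrate how a real date could be mistaken for an "unknown date" marker:

    # Hypothetical four-digit date field using "9999" as an "unknown date" marker.

    UNKNOWN = "9999"                 # sentinel entered for "no date on record"

    def is_unknown(date_field):
        return date_field == UNKNOWN

    # On 9 September 1999, a date written as M/D/YY without separators becomes:
    real_date = "9999"
    print(is_unknown(real_date))     # True: the real date looks like the sentinel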

Leap years

Main article: Zeller's congruence

Normally, a year is a leap year if it is evenly divisible by four. A year divisible by 100, however, is not a leap year in the Gregorian calendar unless it is also divisible by 400. For example, 1600 was a leap year, but 1700, 1800 and 1900 were not. Some programs may have relied on the oversimplified rule that any year divisible by four is a leap year. This method works for the year 2000 (because it is a leap year), and will not become a problem until 2100, by which time older legacy programs will likely have long since been replaced. Other programs contained incorrect leap year logic, assuming for instance that no year divisible by 100 could be a leap year. An assessment of this leap year problem, including a number of real-life code fragments, appeared in 1998.[17] For information on why century years are treated differently, see Gregorian calendar.
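
The correct rule and the two faulty simplifications described above can be contrasted in a short sketch (Python is used here only for illustration):

    # Correct Gregorian rule versus the two faulty simplifications.

    def is_leap_gregorian(year):
        """Divisible by 4, except century years that are not divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    def is_leap_naive(year):
        """Oversimplified: any year divisible by 4. Right for 2000, wrong for 2100."""
        return year % 4 == 0

    def is_leap_no_centuries(year):
        """Faulty: assumes no year divisible by 100 is a leap year. Wrong for 2000."""
        return year % 4 == 0 and year % 100 != 0

    for y in (1900, 2000, 2100):
        print(y, is_leap_gregorian(y), is_leap_naive(y), is_leap_no_centuries(y))
    # 1900: False True  False
    # 2000: True  True  False
    # 2100: False True  False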

Year 2010 problem

Some systems had problems once the year rolled over to 2010. This was dubbed by some in the media as the "Y2K+10" or "Y2.01K" problem.[18]

The main source of problems was confusion between hexadecimal number encoding and binary-coded decimal encodings of numbers. Both hexadecimal and BCD encode the numbers 0–9 as 0x0–0x9. But BCD encodes the number 10 as 0x10, whereas hexadecimal encodes the number 10 as 0x0A; 0x10 interpreted as a hexadecimal encoding represents the number 16.
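
A small sketch of the misinterpretation (the byte value and decoding functions are illustrative):

    # The two-digit year 10, stored as the BCD byte 0x10, decodes differently
    # depending on whether the byte is read as BCD or as a plain binary number.

    def bcd_decode(byte):
        """Correct for BCD: each four-bit nibble holds one decimal digit."""
        return (byte >> 4) * 10 + (byte & 0x0F)

    def binary_decode(byte):
        """Incorrect for BCD data: treats the byte as an ordinary binary value."""
        return byte

    year_byte = 0x10                        # BCD encoding of the year "10"
    print(2000 + bcd_decode(year_byte))     # 2010, the intended year
    print(2000 + binary_decode(year_byte))  # 2016, as reported by affected software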

For example, because the SMS protocol uses BCD for dates, some mobile phone software incorrectly reported dates of SMSes as 2016 instead of 2010. Windows Mobile was the first software reported to have been affected by this glitch; in some cases WM6 changed the date of any incoming SMS message sent after 1 January 2010 from the year "2010" to "2016".[19][20]

Other systems affected include EFTPOS terminals,[21] and the PlayStation 3 (except the Slim model).[22]

The most important occurrences of such a glitch were in Germany, where upwards of 20 million bank cards became unusable, and at Citibank Belgium, whose digipass customer identification chips failed.[23]

Year 2038 problem

Main article: Year 2038 problem

The original Unix time datatype (time_t) stores a date and time as a signed long integer (a 32-bit integer on 32-bit systems) representing the number of seconds since 1 January 1970. During and after 2038, this number will exceed 2³¹ − 1, the largest number representable by a signed long integer on 32-bit systems, causing the Year 2038 problem (also known as the Unix Millennium bug or Y2K38). As a long integer on 64-bit systems uses 64 bits, the problem does not realistically exist on 64-bit systems that use the LP64 model.
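
A short sketch of the arithmetic (Python is used for illustration; time_t itself is a C type):

    # The largest value of a signed 32-bit time_t and the moment it represents.

    from datetime import datetime, timedelta, timezone

    EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)
    MAX_32BIT_SIGNED = 2**31 - 1

    print(EPOCH + timedelta(seconds=MAX_32BIT_SIGNED))
    # 2038-01-19 03:14:07+00:00, the last representable second

    # One second later the counter wraps to -2**31, which such a system would
    # interpret as a date in December 1901.
    print(EPOCH + timedelta(seconds=-2**31))
    # 1901-12-13 20:45:52+00:00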

Programming solutions

Several very different approaches were used to solve the Year 2000 problem in legacy systems. Three of them follow:

Date expansion
Two-digit years were expanded to include the century (becoming four-digit years) in programs, files, and databases. This was considered the "purest" solution, resulting in unambiguous dates that are permanent and easy to maintain. However, this method was costly, requiring massive testing and conversion efforts, and usually affecting entire systems.
Date re-partitioning
In legacy databases whose size could not be economically changed, six-digit year/month/day codes were converted to three-digit years (with 1999 represented as 099 and 2001 represented as 101, etc.) and three-digit days (ordinal date in year). Only input and output instructions for the date fields had to be modified, but most other date operations and whole record operations required no change. This delays the eventual roll-over problem to the end of the year 2899. (A sketch of this encoding appears after the windowing entry below.)
Windowing
Two-digit years were retained, and programs determined the century value only when needed for particular functions, such as date comparisons and calculations. (The century "window" refers to the 100-year period to which a date belongs.) This technique, which required installing small patches of code into programs, was simpler to test and implement than date expansion, thus much less costly. While not a permanent solution, windowing fixes were usually designed to work for several decades. This was thought acceptable, as older legacy systems tend to eventually get replaced by newer technology.[24]
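
The last two approaches can be illustrated with a short sketch; the pivot value and field layouts here are hypothetical choices for demonstration, not taken from any particular remediation project:

    # Windowing: keep two-digit years in storage and infer the century only when
    # needed, using a pivot year (50 is an illustrative choice).
    PIVOT = 50

    def expand_year(yy):
        """Map a stored two-digit year into a 100-year window around the pivot."""
        return 2000 + yy if yy < PIVOT else 1900 + yy

    print(expand_year(99), expand_year(0), expand_year(49), expand_year(50))
    # 1999 2000 2049 1950

    # Date re-partitioning: re-use a six-digit field as a three-digit year
    # (years since 1900) plus a three-digit ordinal day of the year.
    from datetime import date, timedelta

    def repartition(d):
        """Encode a date as YYYDDD."""
        return "%03d%03d" % (d.year - 1900, d.timetuple().tm_yday)

    def decode(field):
        """Decode a YYYDDD field back into a calendar date."""
        return date(1900 + int(field[:3]), 1, 1) + timedelta(days=int(field[3:]) - 1)

    print(repartition(date(1999, 12, 31)))  # "099365"
    print(repartition(date(2001, 2, 1)))    # "101032"
    print(decode("101032"))                 # 2001-02-01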

Documented errors

Before 2000

On 1 January 2000

When 1 January 2000 arrived, the problems that occurred were generally regarded as minor. Consequences did not always appear precisely at midnight: some programs were not active at that moment, and their faults only showed up when they were later invoked. Not all recorded problems were directly caused by Y2K programming errors, as minor technological glitches occur on a regular basis. Some caused erroneous results, some caused machines to stop working, some caused date errors, and two caused malfunctions.

Reported problems include:

On 31 December 2000 or 1 January 2001

Some software did not correctly recognise 2000 as a leap year, and so worked on the basis of the year having 365 days. On the last day of 2000 (day 366) these systems exhibited various errors. These were generally minor, apart from reports of some Norwegian trains that were delayed until their clocks were put back by a month.[31]

Government responses

Bulgaria

Although only two digits are allocated for the birth year in the Bulgarian national identification number, the year 1900 problem and subsequently the Y2K problem were addressed by the use of unused values above 12 in the month range. For all persons born before 1900, the month is stored as the calendar month plus 20, and for all persons born after 1999, the month is stored as the calendar month plus 40.[32]
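
The Bulgarian convention described above can be decoded with a few lines of code; this sketch (in Python, for illustration only) assumes that "before 1900" in practice means the nineteenth century:

    # Recover the full birth year and calendar month from a two-digit year and an
    # offset-coded month, following the convention described above.

    def decode_birth(yy, coded_month):
        if coded_month > 40:                 # born 2000-2099: stored as month + 40
            return 2000 + yy, coded_month - 40
        if coded_month > 20:                 # born before 1900: stored as month + 20
            return 1800 + yy, coded_month - 20
        return 1900 + yy, coded_month        # born 1900-1999: stored unchanged

    print(decode_birth(5, 43))    # (2005, 3): March 2005
    print(decode_birth(95, 12))   # (1995, 12): December 1995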

Netherlands

The Dutch Government promoted Y2K Information Sharing and Analysis Centers (ISACs) to share readiness between industries, without threat of antitrust violations or liability based on information shared.

Norway and Finland

Norway and Finland changed their national identification number, to indicate the century in which a person was born. In both countries, the birth year was historically indicated by two digits only. This numbering system had already given rise to a similar problem, the "Year 1900 problem", which arose due to problems distinguishing between people born in the 20th and 19th centuries. Y2K fears drew attention to an older issue, while prompting a solution to a new problem. In Finland, the problem was solved by replacing the hyphen ("-") in the number with the letter "A" for people born in the 21st century. In Norway, the range of the individual numbers following the birth date was altered from 0–499 to 500–999.

Uganda

The Ugandan government responded to the Y2K threat by setting up a Y2K Task Force.[33] In August 1999 an independent international assessment by the World Bank International Y2K Cooperation Centre found that Uganda's website was in the top category as "highly informative". This put Uganda in the "top 20" out of 107 national governments, and on a par with the United States, United Kingdom, Canada, Australia and Japan, and ahead of Germany, Italy, Austria and Switzerland, which were rated as only "somewhat informative". The report said that "Countries which disclose more Y2k information will be more likely to maintain public confidence in their own countries and in the international markets."[34]

United States

In 1998, the United States government responded to the Y2K threat by passing the Year 2000 Information and Readiness Disclosure Act, by working with private sector counterparts in order to ensure readiness, and by creating internal continuity of operations plans in the event of problems. The effort was coordinated out of the White House by the President's Council on Year 2000 Conversion, headed by John Koskinen.[35] The White House effort was conducted in coordination with the then-independent Federal Emergency Management Agency (FEMA), and an interim Critical Infrastructure Protection Group, then in the Department of Justice, now in Homeland Security.

The U.S. Government followed a three-part approach to the problem: (1) Outreach and Advocacy, (2) Monitoring and Assessment, and (3) Contingency Planning and Regulation.[36]

The logo created by The President's Council on the Year 2000 Conversion, for use on Y2K.gov

A feature of U.S. Government outreach was Y2K websites including Y2K.GOV. Presently, many U.S. Government agencies have taken down their Y2K websites. Some of these documents may be available through National Archives and Records Administration[37] or the Wayback Machine.

Each federal agency had its own Y2K task force which worked with its private sector counterparts. The FCC had the FCC Year 2000 Task Force.[36][38]

Most industries had contingency plans that relied upon the Internet for backup communications. However, as no federal agency had clear authority with regard to the Internet at this time (it had passed from the U.S. Department of Defense to the U.S. National Science Foundation and then to the U.S. Department of Commerce), no agency was assessing the readiness of the Internet itself. Therefore, on 30 July 1999, the White House held the White House Internet Y2K Roundtable.[39]

United Kingdom

The British government made regular assessments of the progress made by different sectors of business towards becoming Y2K-compliant and there was wide reporting of sectors which were laggards. Companies and institutions were classified according to a traffic light scheme ranging from green "no problems" to red "grave doubts whether the work can be finished in time". Many organisations finished far ahead of the deadline.

International cooperation

The International Y2K Cooperation Center (IY2KCC) was established at the behest of national Y2K coordinators from over 120 countries when they met at the First Global Meeting of National Y2K Coordinators at the United Nations in December 1998. IY2KCC established an office in Washington, D.C. in March 1999. Funding was provided by the World Bank, and Bruce W. McConnell was appointed as director.

IY2KCC's mission was to "promote increased strategic cooperation and action among governments, peoples, and the private sector to minimize adverse Y2K effects on the global society and economy." Activities of IY2KCC were conducted in six areas:

IY2KCC closed down in March 2000.[40]

Private sector response

The Y2K issue was a major topic of discussion in the late 1990s and as such showed up in most popular media. A number of "Y2K disaster" books were published such as Deadline Y2K by Mark Joseph. Movies such as Y2K: Year to Kill capitalized on the currency of Y2K, as did numerous TV shows, comic strips, and computer games.

Cost

The total cost of the work done in preparation for Y2K is estimated at over US$300 billion ($412 billion today, once inflation is taken into account [43]).[44] IDC calculated that the U.S. spent an estimated $134 billion ($184 billion) preparing for Y2K, and another $13 billion ($18 billion) fixing problems in 2000 and 2001. Worldwide, $308 billion ($423 billion) was estimated to have been spent on Y2K remediation.[45] There are two ways to view the events of 2000 from the perspective of its aftermath:

Supporting view

This view holds that the vast majority of problems had been fixed correctly, and the money was well spent. The situation was essentially one of preemptive alarm. Those who hold this view claim that the lack of problems at the date change reflects the completeness of the project, and that many computer applications would not have continued to function into the 21st century without correction or remediation.

Opposing view

Others have asserted that there were no, or very few, critical problems to begin with. They also asserted that there would have been only a few minor mistakes and that a "fix on failure" approach would have been the most efficient and cost-effective way to solve these problems as they occurred.

See also

References

  1. BSI Standard, on year 2000
  2. Carrington, Damian (4 January 2000). "Was Y2K bug a boost?". BBC News. Archived from the original on 22 April 2004. Retrieved 19 September 2009.
  3. Spencer Bolles. "Computer bugs in the year 2000". Newsgroup: net.bugs. Usenet: 820@reed.UUCP.
  4. American RadioWorks Y2K Notebook – Problems: The Surprising Legacy of Y2K. Retrieved on 22 April 2007.
  5. A web search on images for "computer memory ads 1975" returns ads showing pricing for 8K of memory at $990 and 64K of memory at $1495.
  6. Looking at the Y2K bug, portal on CNN.com Archived 7 February 2006 at the Wayback Machine.
  7. Presenter: Stephen Fry (2009-10-03). "In the beginning was the nerd". Archive on 4. BBC Radio 4.
  8. Testimony by Alan Greenspan, ex-Chairman of the Federal Reserve before the Senate Banking Committee, 25 February 1998, ISBN 978-0-16-057997-4
  9. "Key computer coding creator dies". The Washington Post. 25 June 2004. Retrieved 25 September 2011.
  10. "Requirements for Internet Hosts – Application and Support". tools.ietf.org.
  11. "Microsoft Knowledge Base article 214326". Support.microsoft.com. 8 July 2011. Retrieved 25 September 2011.
  12. "JavaScript Reference Javascript 1.2". Sun Microsystems. Retrieved 7 June 2009.
  13. "JavaScript Reference Javascript 1.3". Sun. Retrieved 7 June 2009.
  14. "Millennium Bug - Television Tropes & Idioms". Tvtropes.org. Retrieved 11 June 2013.
  15. "The Risks Digest Volume 4: Issue 45". The Risks Digest.
  16. Stockton, J.R., "Critical and Significant Dates" Merlyn
  17. A. van Deursen, "The Leap Year Problem" The Year/2000 Journal 2(4):65–70, July/August 1998
  18. "Bank of Queensland hit by "Y2.01k" glitch". 4 January 2010.
  19. "Windows Mobile glitch dates 2010 texts 2016". 5 January 2010.
  20. "Windows Mobile phones suffer Y2K+10 bug". 4 January 2010.
  21. "Bank of Queensland vs Y2K – an update". 4 January 2010.
  22. "Error: 8001050F Takes Down PlayStation Network".
  23. "2010 Bug in Germany". 6 January 2010.
  24. "The Case for Windowing: Techniques That Buy 60 Years", article by Raymond B. Howard, Year/2000 Journal, Mar/Apr 1998.
  25. Millennium bug hits retailers, from BBC News, 29 December 1999
  26. Martin Wainwright (13 September 2001). "NHS faces huge damages bill after millennium bug error". The Guardian (UK). Retrieved 25 September 2011. The health service is facing big compensation claims after admitting yesterday that failure to spot a millennium bug computer error led to incorrect Down's syndrome test results being sent to 154 pregnant women. ...
  27. Y2K bug fails to bite, from BBC News, 1 January 2000
  28. Computer problems hit three nuclear plants in Japan, report by Martyn Williams of CNN, 3 January 2000. Archived 7 December 2004 at the Wayback Machine.
  29. "Minor bug problems arise". BBC News. British Broadcasting Corporation. Retrieved 4 December 2015.
  30. Preparation pays off; world reports only tiny Y2K glitches at the Wayback Machine, report by Marsha Walton and Miles O'Brien of CNN, 1 January 2000
  31. The last bite of the bug, report from BBC News, 5 January 2001
  32. Iliana V. Kohler, Jordan Kaltchev, Mariana Dimova. "Integrated Information System for Demographic Statistics 'ESGRAON-TDS' in Bulgaria" (PDF). 6 Article 12. Demographic Research: 325–354.
  33. "Uganda National Y2k Task Force End-June 1999 Public Position Statement". 30 June 1999. Retrieved 11 January 2012.
  34. "Y2K Center urges more information on Y2K readiness". 3 August 1999. Retrieved 11 January 2012.
  35. "White House shifts Y2K focus to states, CNN (Feb. 23, 1999)". CNN. 23 February 1999. Retrieved 25 September 2011.
  36. FCC Y2K Communications Sector Report (March 1999) copy available at WUTC PDF (1.66 MB)
  37. See President Clinton: Addressing the Y2K Problem, White House, 19 Oct. 1998
  38. "Federal Communications Commission Spearheads Oversight of the U.S. Communications Industries' Y2K Preparedness, Wiley, Rein & Fielding Fall 1999". Opengroup.org. Archived from the original on 9 October 2008. Retrieved 25 September 2011.
  39. Basic Internet Structures Expected to be Y2K Ready, Telecom News, NCS (1999 Issue 2) PDF (799 KB)
  40. "Finding Aids at The University of Minnesota".
  41. "quetek.com". quetek.com. Retrieved 25 September 2011.
  42. Internet Year 2000 Campaigned archived at Cybertelecom
  43. "Consumer Price Index (estimate) 1800–". Federal Reserve Bank of Minneapolis. Retrieved November 10, 2015.
  44. Y2K: Overhyped and oversold?, report from BBC News, 6 January 2000
  45. Robert L. Mitchell (28 December 2009). "Y2K: The good, the bad and the crazy". ComputerWorld.
  46. James Christie (12 January 2015), Y2K – why I know it was a real problem, Claro Testing Blog (accessed 12 January 2015)
  47. Y2K readiness helped New York after 9/11, article by Lois Slavin of MIT News, 20 November 2002
  48. "Finance & Development, March 2002 - September 11 and the U.S. Payment System". Finance and Development - F&D.
  49. Y2K readiness helped NYC on 9/11, article by Rae Zimmerman of MIT News, 19 November 2002
  50. Dutton, Denis (31 December 2009), "It’s Always the End of the World as We Know It", The New York Times
  51. Smith, R. Jeffrey (4 January 2000), "Italy Swatted the Y2K Bug", The Washington Post
  52. White House: Schools lag in Y2K readiness: President's Council sounds alarm over K-12 districts' preparations so far, article by Jonathan Levine of eSchool News, 1 September 1999
  53. Hoover, Kent (9 January 2000), "Most small businesses win their Y2K gamble", Puget Sound Business Journal
  54. Lights out? Y2K appears safe, article by Elizabeth Weise of USA Today, 14 February 1999
  55. John Quiggin, (2 September 1999), Y2K bug may never bite, 'Australian Financial Review' (from The Internet Archive accessed 29 December 2009)

External links
