The Year 2000 problem, also known as the Y2K problem, the Millennium bug, the Y2K bug, or simply Y2K, arose because programmers had represented the four-digit year with only its final two digits. Y2K is a numeronym and was the common abbreviation for the year 2000 software problem. The term identifies two problems that may exist in many computer programs.
Firstly, the practice of representing the year with two digits became problematic, with logical errors arising upon "rollover" from x99 to x00.
This caused some date-related processing to operate incorrectly for dates and times on and after 1 January 2000, and on other critical dates which were billed "event horizons".
Without corrective action, long-working systems would break down when the "... 97, 98, 99, 00 ..." ascending numbering assumption suddenly became invalid.
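A minimal sketch of this first problem, assuming a hypothetical record layout that stores only the last two digits of the year and implicitly treats every value as 19xx, as many legacy systems did:

```python
def years_elapsed_two_digit(start_yy: int, end_yy: int) -> int:
    """Duration in years computed from two-digit year fields,
    with both values implicitly interpreted as 19xx."""
    return (1900 + end_yy) - (1900 + start_yy)

# Works while both dates fall in the same century...
print(years_elapsed_two_digit(65, 99))   # 34  (1965 -> 1999)

# ...but breaks at the rollover: the year 2000 is stored as "00"
# and read back as 1900, so the computed duration goes negative.
print(years_elapsed_two_digit(65, 0))    # -65 (1965 -> "1900")
```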
Secondly, some programmers had misunderstood the Gregorian calendar rule that years exactly divisible by 100 are not leap years unless they are also divisible by 400, and assumed the year 2000 would not be a leap year.
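A sketch of this second problem: the full Gregorian rule compared with the truncated rule behind the misunderstanding, which wrongly classifies 2000 as a common year.

```python
def is_leap_gregorian(year: int) -> bool:
    """Full Gregorian rule: divisible by 4, except century years,
    except century years divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def is_leap_truncated(year: int) -> bool:
    """Incomplete rule that omits the 400-year exception."""
    return year % 4 == 0 and year % 100 != 0

print(is_leap_gregorian(2000))  # True  -- 2000 is a leap year
print(is_leap_truncated(2000))  # False -- 29 February 2000 would be rejected
```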
For a system to handle the transition correctly, the century must be unambiguous in all interfaces and in all storage, either specified or calculable by algorithm, and calculation of durations between, or of the sequence of, pairs of dates must be correct whether or not the dates fall in different centuries.
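One common way to make the century "calculable by algorithm" when only two digits are stored is windowing, where a pivot value decides whether a two-digit year belongs to the 1900s or the 2000s. The pivot of 50 below is an illustrative assumption, not a standard:

```python
PIVOT = 50  # assumed cutoff: 00-49 -> 2000s, 50-99 -> 1900s

def expand_two_digit_year(yy: int, pivot: int = PIVOT) -> int:
    """Map a two-digit year to an unambiguous four-digit year
    using a fixed window around the pivot."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_two_digit_year(99))  # 1999
print(expand_two_digit_year(0))   # 2000
print(expand_two_digit_year(3))   # 2003

# With unambiguous four-digit years, durations and ordering across
# the century boundary come out correctly:
print(expand_two_digit_year(3) - expand_two_digit_year(99))  # 4
```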
Companies and organizations worldwide checked, fixed, and upgraded their computer systems to address the problem. The number of computer failures that occurred when the clocks rolled over into 2000 in spite of remedial work is not known; among other reasons, organisations were often reluctant to report problems.