What's the Problem?

nferris@govexec.com

The year 2000 problem stems from a common practice during the early days of computing. Programmers customarily used two-digit notations for years, such as "98," and instructed the system to assume that the first two digits were "19." This obviously causes errors when next-century dates are entered.

The best fix is to change all two-digit years to four-digit years, but shortcuts are possible. For example, some agencies are using a "sliding window" approach that assumes all dates before a certain year, say "50," are 21st century dates, and the rest are from the 1900s. Such quick fixes won't last long into the next century, but they'll solve the problem for now.
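The sliding-window shortcut can be sketched in a few lines. This is a minimal illustration, not any agency's actual code; the pivot value of 50 is assumed from the example above, and real systems chose their own cutoffs.

```python
# Sliding-window ("pivot year") fix for two-digit years, as described above.
# Two-digit years below the pivot are treated as 20xx; the rest as 19xx.
PIVOT = 50  # assumed cutoff for illustration

def expand_year(two_digit_year: int) -> int:
    """Map a two-digit year (0-99) to a four-digit year using a fixed pivot."""
    if not 0 <= two_digit_year <= 99:
        raise ValueError("expected a two-digit year (0-99)")
    if two_digit_year < PIVOT:
        return 2000 + two_digit_year
    return 1900 + two_digit_year

# expand_year(98) -> 1998, while expand_year(5) -> 2005
```

Note the limitation the article mentions: once real dates approach the pivot (here, the year 2050), the window must slide again or the same ambiguity returns.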

One reason the problem is complex is that today's systems typically communicate with many other systems, and alterations must be compatible. Also, dates are important in the internal functions of computers, not just in applications software such as the programs that compute whether you're eligible for Social Security benefits. And there's been concern recently about the nation's telephone networks, which also transmit data.

Shockingly, computer hardware and software that won't operate after Dec. 31, 1999, is still on the market. Federal procurement rules require vendors to provide only year 2000-compliant products to agencies, but there are reports of problems nonetheless because of interoperability issues.

Systems that aren't year 2000-ready may stop running altogether on Jan. 1, 2000, if not sooner. Perhaps more disturbing, they may keep operating but produce erroneous data.

Opinion is divided as to whether the nation should brace for a catastrophic breakdown of phones, cash registers, banks and office systems. Many pooh-pooh any such suggestion, while others worry that the interconnections among our public and private infrastructures have left us vulnerable to an international disaster that will strike hardest in wealthy nations with sophisticated systems.
