Client Servers Empower PC Users

Information Technology

nferris@govexec.com

When the Naval Sea Logistics Center was ordered to move its Ships Maintenance and Material Management (3-M) system off an Amdahl mainframe in 1995, the idea was to save money.

It worked. 3-M system operations are costing the center at least $500,000 less this year than they would have without the shift, and the avoided costs will grow in the coming years. What's more, the system does a better job for the center's customers throughout the Navy and the Pentagon.

More than 1,000 people can tap into 3-M, which holds ship blueprints, parts inventories and maintenance records. They use a simple but effective Windows program the center developed, called OARS (Open Architectural Retrieval System). With free OARS software on their PCs and a link to the Internet or the Navy's networks, they dig out information they need in their choice of formats.

Contrast this with the old version of the 3-M system. Users could call into the Amdahl mainframe and order printouts, but their choices were limited and it helped to know database systems, OARS program manager Cathy Carpinello says. It was clumsy enough that outsiders would call the people at the center and ask for a report to be pulled for them rather than deal with the system themselves. And, Carpinello says, "the operating costs were doubling."

The 3-M system now resides on a midsize Hewlett-Packard 9000 computer at the center's offices in Mechanicsburg, Pa. The information is stored in an Oracle7 database. Together, this hardware and software make up the "server" in the client-server system, and the users' PCs are the clients. Software on the PCs does much of the grunt work; the server coordinates the activity of multiple users and handles central storage of shared information.
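
To make that division of labor concrete, here is a minimal sketch in Python. Everything in it is invented for illustration, the hull numbers, parts and port included; it is not the Navy's OARS software, only the general pattern of a server that owns the shared data and a client PC that formats it for the user:

    # Hypothetical client-server sketch. The server owns the shared
    # records; the client PC does the "grunt work" of presentation.
    import json
    import socket
    import threading
    import time

    RECORDS = {  # shared data, stored centrally on the server
        "hull-101": {"part": "pump, seawater", "on_hand": 4},
        "hull-102": {"part": "valve, relief", "on_hand": 12},
    }

    def serve(host="127.0.0.1", port=5050):
        srv = socket.create_server((host, port))
        while True:
            conn, _ = srv.accept()
            with conn:
                key = conn.recv(1024).decode().strip()
                # the server only looks up and ships the raw data
                conn.sendall(json.dumps(RECORDS.get(key, {})).encode())

    def query(key, host="127.0.0.1", port=5050):
        with socket.create_connection((host, port)) as conn:
            conn.sendall(key.encode())
            data = json.loads(conn.recv(4096).decode())
        # the client turns raw data into the user's chosen format
        if not data:
            return f"no record for {key}"
        return f"{key}: {data['part']} ({data['on_hand']} on hand)"

    if __name__ == "__main__":
        threading.Thread(target=serve, daemon=True).start()
        time.sleep(0.2)  # give the server a moment to start listening
        print(query("hull-101"))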

Just a couple of years ago, client-server systems were regarded as risky. The technology has since stabilized, but it has turned out to be more costly than expected, and the Naval Sea Logistics Center's cost-saving results are not universal.

The theory was that a $25,000 computer, the server, could do the work of a mainframe costing $1 million, if smart terminals, or PCs, carried some of the computing burden. That was true. But keeping all the hardware and software at work has proved expensive. For every dollar they spend to buy hardware and software, corporations are spending another $4 to keep their PCs in tune and online, according to executives at Sun Microsystems Inc.

One reason is that maintaining PCs scattered around the nation, as is common in federal agencies, is expensive. Another is that PC users are prone to modify their systems without permission. Also, most large organizations have a mix of PCs bought at different times from different vendors and running different versions of the software. All the products are supposed to be compatible, but in practice they're not quite.

The flexibility that client-server advocates boast of has turned out to be a two-edged sword when it comes to maintaining and controlling system operations. Nonetheless, the conventional wisdom now views client-server as the norm for most new systems. In this view, any extra costs associated with client-server architectures are offset by greater functionality, such as the data accessibility the Naval Sea Logistics Center is enjoying.

Zipora Brown, a vice president of American Management Systems Inc., says client-server systems can improve productivity, work flow and output speed, agency services, and management reporting. There's a price to be paid, she agrees, but "net, it's a gain for the agencies."

The much-discussed Year 2000 problem is another reason agencies are increasingly likely to move off mainframes. Legacy mainframe software often is a mess: a patched-together collection of modules whose original identities have become clouded in the mists of time. The system runs, but no one working with it truly understands its innards. Rather than dissect the old system and patch it yet again to update all its date codes, it may make sense to start over with a clean slate.
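
The date-code problem itself is simple to show. Here is a hypothetical Python illustration of the underlying flaw: legacy systems typically stored years as two digits, so arithmetic that crosses the century boundary goes wrong.

    # Hypothetical illustration of the Year 2000 "date code" problem:
    # a legacy system that stores years as two digits.
    def years_between(yy_start, yy_end):
        # legacy two-digit arithmetic, fine within one century
        return yy_end - yy_start

    print(years_between(95, 97))  # prints 2, as expected
    print(years_between(99, 0))   # prints -99 once "00" means 2000

    # The fix is trivial in any one place; the trouble is that codes
    # like these are buried throughout millions of lines of patched,
    # poorly understood modules.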

Federal agencies have a further incentive to move from mainframes to client-server. Client-server systems generally consist of more standard hardware and software components than mainframe systems. This standardization complies with federal mandates intended to keep agencies from becoming locked into a single systems supplier. Standardization tends to increase vendor competition by leveling the playing field.

Federal agencies also are under orders to consolidate and streamline their mainframe computing operations. The Office of Management and Budget is pressing agencies to reduce the overhead costs they incur by maintaining many data centers. As they inventory their mainframe systems and plan for consolidation, agencies are taking a new look at the costs and benefits of these "legacy" systems.

Now that client-server architecture is established, however, a new challenger has come along. Called network computing, it almost amounts to a resurrection of the dumb terminal. In network computing, an inexpensive and low-powered PC (often called a "thin client") pulls both software and data from servers as needed.

That's the theory, anyway. There are few installations of this kind at work today outside hardware and software companies, although such installations are expected to take off later this year as more products come on the market and buyer hesitation runs its course. Market researchers at the Yankee Group report that two-thirds of large corporations are likely to buy some network computers by the end of 1998.

The chief advantage of network computing is that system maintenance and management can all be done at the server. The client machines would be virtually interchangeable, unlike the personalized PCs in use today. Once an employee signs onto the system, he or she gets access to the approved software and data via the network, rather than from the PC.
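
As a sketch of why those client machines are interchangeable, consider this hypothetical Python fragment. The user names and applications are invented; the point is that the session is assembled entirely from state held on the server, so any blank client on the network produces the same desktop:

    # Hypothetical thin-client sign-on: all state lives on the server,
    # so the client machine itself holds nothing worth administering.
    PROFILES = {  # maintained in one place, by the administrators
        "jsmith": {"apps": ["parts-lookup", "work-orders"],
                   "home_screen": "maintenance"},
    }

    def sign_on(user):
        """Return everything a blank client needs to become this user's desktop."""
        profile = PROFILES.get(user)
        if profile is None:
            raise PermissionError(f"unknown user: {user}")
        return profile

    # Any thin client, anywhere on the network, yields the same session:
    session = sign_on("jsmith")
    print("approved apps:", ", ".join(session["apps"]))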

Sun Microsystems, a company leading the charge toward network computing, says it costs almost $12,000 a year to maintain a PC on a network. Sun is promising to cut that sum to about $2,500 with its network computers. However, its figures do not account for the cost of heavy-duty new networks, servers and server software that no doubt also will be needed.

The network computing push has divided the computer industry into two major camps. In one camp, Sun, Oracle Corp. and their allies are pushing a family of products that merge client-server technologies with the Internet. Among the other members of this alliance are IBM Corp. and Netscape Communications Corp.

The other camp is led by Microsoft Corp. Network computing fervor has forced Microsoft to make concessions, such as a stripped-down NetPC model and a set of PC management tools to cut the costs of system administration. But Microsoft is sticking with the fundamental PC system that made the company the world's top software maker. Its allies include the PC manufacturers and other well-known companies such as Hewlett-Packard Co. and Digital Equipment Corp., although each of the latter also has a foot in the network computing camp.

Microsoft's network software system, Windows NT, competes with the Unix software at the heart of the network computing concept. After years of being dismissed as not up to the task of supporting systems like the Navy's 3-M, Windows NT is being taken seriously. By next year, according to market researchers at International Data Corp., NT will overtake both Unix and Novell NetWare as the most-often-installed server software. Other companies are developing more products to work with NT than they have in the past.

Many think Microsoft is in the catbird seat because in the end, PC users won't want to give up the computing power they now enjoy. "Companies will not dump their huge installed base of PCs," says Tom Rhinelander, a market analyst with Forrester Research. But he predicts network computers will have a place in the computing landscape. Their role, he says, will be to replace the millions of dumb terminals still in use.

Even the leaders of the network computing movement seem to share his belief to some extent. For example, Oracle Corp. marketers say the first large-scale users of the new systems will be focused operations such as airline reservations, insurance claims processing and high-volume telephone sales where functions are limited. This means many of today's remaining mainframe systems could be candidates for the new computer systems.

This spring's buzzword in computer marketing circles has been "total cost of ownership." Each vendor argues that its products are the most cost-effective and the easiest to manage. ("Zero administration" is Sun's phrase, for example.) It's a concept federal managers are familiar with by a different name: life-cycle costs.

Whether any 1997 analysis of life-cycle costs will still be valid in mid-1998 is open to question. The technology is changing more rapidly than ever, and in the case of network computing with thin clients, there is virtually no experience on which to base an estimate. It's clear that client-server systems are not major cost-cutters. The same most likely will prove true of network computing.

Purchase and operating costs should shrink as the industry takes customer complaints to heart. But don't count on getting much more systems muscle by spending slimmed-down IT budgets on thin clients.

Nancy Ferris is a Washington-based freelance writer specializing in technology.
