Behind the News

Analysis of Industry Events

New Meaning for Fat and Thin

For years, the IS department has stood on the sidelines of desktop computing. Supported by a succession of faster processors and cheaper memory and storage, desktop applications have become ever more powerful. By keeping most processing power, applications and data on their individual machines and off host systems, desktop users loosened their dependence on IS. Even when companies replaced their mainframes with client/server architectures, users often retained much of the power on their clients, relegating the server to storing files.

But if catchphrases are prognosticators, things may be changing. Many IS publications and managers are looking forward to a new architecture called "fat server/thin client." While this distribution can be achieved in a number of ways, the general scenario is to place the user interface, which may be proprietary client software or merely a Web browser, on the client. Most or all of the application, along with any database it accesses, is maintained on the server or shared between database and application servers in a three-tier configuration.
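As a rough illustration of the split, and not a description of any particular vendor's product, consider a hotel reservation lookup in this model. The server address, function and field names below are hypothetical; the point is that the client holds only the interface, while the application logic and the database query run on the server.

    # Minimal sketch of a thin client in a fat server/thin client setup.
    # All names here are hypothetical; the business logic and the database
    # live on the application server, not on the desktop.
    import json
    import urllib.request

    APP_SERVER = "http://appserver.example.com/reservations"  # hypothetical server

    def lookup_reservation(confirmation_number):
        """Thin client: collect input, send it to the server, return the result."""
        req = urllib.request.Request(
            APP_SERVER,
            data=json.dumps({"confirmation": confirmation_number}).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)  # the reservation logic already ran on the server

Because the logic and data never leave the server, upgrading the application means touching one machine rather than thousands of desktops.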

The fat server/thin client architecture is garnering interest because user organizations want some central management of their systems. Often they want IS to take over this task. And a fat server may be the most reasonable way for IS to administer the tangled web of systems purchased over the last five years out of user departmental budgets.

"Departments brought in their software and hardware, and at a certain point they woke up to the fact that they're not very good at taking care of them. Now they're asking IS to help," says Judith Hurwitz, president of the Hurwitz Group, a consulting firm in Newton, MA.

Peter Burris, vice president and director of open computing and server strategies for the Meta Group in Burlingame, CA, has a similar view. "Much of the software being purchased or developed today is by departmental groups. But now business managers want to get out of the IT business," he says.

Downsizing the Desktop

If IS is to manage all those clients, a few important changes may have to be made. No longer will all applications be situated entirely on the desktop. Instead, each application will be evaluated separately to determine where it can run most efficiently and economically.

Many observers speculate that the best way to manage large, distributed networks is to centralize at least some of the control. "The advantages of this fat server architecture all relate to the degree to which complexity on the client can compromise quality and cost," says Burris. Keeping applications in one central location makes administration, software support and data integrity and protection less complicated. For example, IS can be assured that everyone's using the same version of software and that no one reconfigures the software to run in a nonstandard way.
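As a small, hypothetical example of what that central control buys, a server that owns the application can simply refuse any client that has drifted from the sanctioned release:

    # Hypothetical sketch: the server enforces a single approved client version,
    # something that is hard to guarantee when every desktop manages itself.
    APPROVED_VERSION = "3.2.1"

    def accept_client(reported_version):
        """Reject any client that is not running the IS-approved release."""
        if reported_version != APPROVED_VERSION:
            raise ConnectionRefusedError(
                f"Client {reported_version} rejected; approved release is {APPROVED_VERSION}"
            )
        return True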

Fat server/thin client architecture also allows companies to downsize the types of platforms their users need. No longer will everyone require the latest, fastest processor. "If a company is running a hotel reservation system, it has no need to give everyone Windows or fancy workstations," says Hurwitz. Of course, some users still need robust desktop machines. "A fat client makes sense if what I'm doing requires a lot of intelligence on the front end," she says. "But certainly not everyone who has the latest Pentium today is doing work complex enough to make full use of their machines." Hurwitz estimates that most users rarely need more than 10 to 20 percent of the processing power available to them on their PCs.

Even PeopleSoft of Pleasanton, CA, a software supplier known for the fat client architecture of its enterprise solutions, is considering backing off a bit. "Five percent of our users use 95 percent of the power of our applications, but 95 percent of our users use only five percent of the power. Those people may not need fat clients," says Stan Sweete, manager of product strategies. Order entry clerks, for example, will continue to use PeopleSoft's Windows fat client application. But those who only occasionally need to locate a customer record or update their own human resources information will be able to access PeopleSoft via an intranet browser, the thinnest client application available today. While PeopleSoft won't have intranet capability with version 6, due out this winter, it should be available in version 7. "Our customers are asking for it," says Sweete.

Avoid Crash Diets

Despite the advantages of the fat server/thin client architecture, experts warn of the dangers of going too far too fast. "A lot of us suffered through decentralized computing and the anarchy which resulted, but we don't want to go back to the mainframe-and-dumb-terminal architecture either," says Christine Comaford, CEO of C2 Ventures, a consulting company in Mill Valley, CA.

In other words, while you're putting some clients on a diet, be careful not to starve them. "If the server goes down, you don't want 10,000 people twiddling their thumbs. And people still want to work remotely and unplugged. You can't take that away from all of them," Comaford says.

Many consultants suggest that while IS may be justified in relieving the desktop of some of its fat, other applications, notably office productivity packages, are virtually sacrosanct. "Our research shows that people want to keep their traditional PC applications. They're used to them, and they don't want to switch and learn anything that's not Windows-based," says Jim Johnson, founder and chairman of the Standish Group International in Dennis, MA.

But for applications that can be moved off the client, a growing number of tools can help. For example, products like the Dynasty Development Environment from Dynasty Technologies of Naperville, IL, and Forte from Forte Software of Oakland, CA, allow you to partition applications between servers and clients dynamically, adjusting the split on the fly, by trial and error, through a point-and-click interface.
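Neither Dynasty nor Forte necessarily works the way the sketch below does; it is only meant to illustrate the partitioning idea, with invented task names, in which the same application logic can be assigned to either tier and the assignment changed without rewriting the calling code.

    # Illustrative only; invented names, not Dynasty's or Forte's actual mechanism.
    # A partition map decides which tier runs each piece of application logic.
    PARTITION = {"validate_order": "client", "price_order": "server"}

    LOCAL_TASKS = {"validate_order": lambda order: bool(order)}  # runs on the desktop

    def send_to_server(task_name, args):
        """Placeholder for the remote call (RPC, message queue, HTTP and so on)."""
        raise NotImplementedError("remote execution stub")

    def run(task_name, *args):
        """Dispatch a task to whichever tier the partition map assigns it."""
        if PARTITION.get(task_name) == "client":
            return LOCAL_TASKS[task_name](*args)
        return send_to_server(task_name, args)

Repartitioning then amounts to editing the map, which is, loosely, the kind of decision the point-and-click tools surface to the designer.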

While fat server/thin client seems all the rage, IS has to keep a watchful eye on this, like other trends. Many people remember the early failures of those who jumped off the mainframe to client/server architectures. Burris sees some similarities between that mistake and moving too quickly to a fat server. "Fat clients are the first legacy systems on client/server," he says. "The decision to move off a legacy system should never be taken lightly. If the current system is working, even if it is fat client, don't necessarily throw it out."

--Larry Stevens


Off-the-Shelf Government

Imagine yourself at the controls of a state-of-the-art fighter aircraft. Not one of those flight simulator packages that you use at your desk, but the real thing. You're coming in on the final approach to an aircraft carrier. The weather is bad, and it's all you can do to keep the small runway lined up in your sights. You're a few hundred feet off the water. Another five seconds to touchdown. Then, without warning, your panel goes blank and in your heads-up display are the words fatal error . . . program aborted . . . rebooting system. For the rest of your life--about four more seconds--you'd probably wish that your software vendor had done a better job of beta testing the software.

Is this an unrealistic scenario? For the moment, yes. No one expects the U.S. military to procure commercial off-the-shelf (COTS) software for situations such as this. However, in response to taxpayer demands that our dollars be spent more wisely, federal agencies are pushing the use of COTS products as a way to gain greater cost-effectiveness for software systems. This is particularly true of the military. However, as the above scenario suggests, major problems exist in implementing this strategy.

No one would be happy to learn that a government agency spent a ton of money (think of paper currency in large denominations) to develop a spreadsheet or a word processor. The need for a special version of such a product is hard to imagine. But what if the input to that spreadsheet was status information from a shipboard fire-control system? And how do you integrate the custom-written fire-control software with the COTS spreadsheet? The use of a COTS spreadsheet isn't necessarily a problem-free solution.
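One plausible, if simplified, integration path is to have the custom system export its status in a file format the spreadsheet already understands; the field names below are invented for illustration.

    # Hypothetical glue code: write fire-control status records to a CSV file
    # that an off-the-shelf spreadsheet can import. Field names are invented.
    import csv

    def export_status(readings, path="status.csv"):
        """Dump status records in a format a COTS spreadsheet can open."""
        with open(path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["station", "status", "timestamp"])
            writer.writeheader()
            writer.writerows(readings)

    export_status([{"station": "mount-1", "status": "ready", "timestamp": "12:00:00"}])

Even a bridge this trivial raises the questions that follow: who documents it, who maintains it when the spreadsheet's import format changes, and who certifies it for shipboard use.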

All systems must be maintained somehow. For systems that are in aircraft or ships, this requires a trained technician who can troubleshoot any malfunctions. This technician must have documentation in order to isolate and repair the problem. Just recall the various formats for the user manuals you see with commercial software. How is that technician going to wade through various formats, terminology and even bindings to repair the fault in a timely--not to mention emergency--fashion? Furthermore, since almost all COTS software is sold as binary executables, how can the technician (user) even attempt to fix a bug? You can't easily download a new version to a submerged submarine or an aircraft at 40,000 feet.

Does One Size Fit All?

In January of this year, a conference was held to address the use of COTS software in systems integration. Hosted by the Software Engineering Institute of Pittsburgh, PA, and Microelectronics and Computer Technology Corp. of Austin, TX, this conference brought together industry and government representatives who are faced with this dilemma. They raised various issues regarding COTS products and their applicability in high-reliability usage.

To think about this topic, one first has to define the phrase "high reliability." Certainly the pilot landing on an aircraft carrier has a higher reliability need than an executive who needs an inventory report for an upcoming meeting. Yet most commercial software is targeted toward the business user, and testing tends to be less than thorough. Unfortunately, many vendors leave the final testing to the customer. They don't want to, and cannot afford to, test exhaustively enough to get a 100 percent verifiably tested product. Usually they do an adequate level of testing and expect to repair bugs in future releases. After all, how serious is it to reboot your system from time to time? Our executive goes to get another cup of coffee. Our pilot isn't so lucky.

Another claim made by advocates of wider COTS use is that it will decrease development time. The assumption is that if 25 percent of your software was purchased rather than developed, your expenses should be significantly lower because of the development time you've saved. But one has to spend time integrating these packages into a workable system. Interfaces often are not well documented and probably don't adhere to any formal standards. Major commercial systems sometimes have life cycles measured in months between major revisions. Major weapons systems, on the other hand, have life cycles measured in years. The integration issue becomes a never-ending cycle, eating into the budget.

Another argument in favor of using COTS products is the belief that they tend to implement open systems standards, thereby reducing development and integration time. Realistically, many commercial products implement more than the basic standard. Many vendors include standard interfaces to meet a mandatory requirement but also add their own proprietary features as a "value-adding" enhancement. These proprietary features may not integrate easily with other products.

Also, if you've ever read a standard, you realize that the specification doesn't describe how to implement it. For example, I was involved in organizing a trade show in Vancouver, BC, Canada. One of the highlights of this show was to have an X.400 electronic mail system running among four different Unix boxes. All the boxes were in the same room and were connected with an Ethernet TCP/IP link. We had the X.400 experts from each of the four vendors. All four vendors had electronic mail packages that conformed to the ISO X.400 standard. Yet it took many hours before all four systems were interoperating. Why? Because one vendor gave a key configuration file a name of its own choosing and placed it in a directory that didn't match the name and location another vendor used. Permission bits had to be set in an ad hoc manner, and so forth. In short, the integration issues involving so-called standard products can be surprisingly complex.
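To make the point concrete, here is a hypothetical sketch of the kind of glue that ends up being written in such a situation; none of the paths or vendor labels below are the real ones from that show.

    # Hypothetical illustration: four "standard" X.400 implementations, four
    # different names and locations for what is logically the same configuration
    # file, each needing its own ad hoc permission bits. No path here is real.
    import os
    import stat

    CONFIG_LOCATIONS = {
        "vendor_a": "/etc/x400/mta.conf",
        "vendor_b": "/usr/lib/x400/config/mta.cfg",
        "vendor_c": "/opt/mail/x400rc",
        "vendor_d": "/var/x400/site.params",
    }

    def fix_permissions(vendor):
        """Apply the permission bits this particular package happened to require."""
        os.chmod(CONFIG_LOCATIONS[vendor], stat.S_IRUSR | stat.S_IWUSR | stat.S_IRGRP)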

For this article, I spoke to several people in the military who didn't want to be identified, partly because of the bureaucracy involved in gaining permission to be quoted in a publication but also because they're not sure what they think about the COTS issue. Contrary to what you read in the papers, these people are honestly trying to develop systems that include more COTS products and cost less money. The problem is that the huge dollar savings promised for COTS aren't materializing now, and they may never materialize. Nevertheless, these people are being asked (actually mandated) to integrate as much COTS software as possible.

For the reasons stated above, such integration is not likely to shorten development time or reduce costs--at least not to the extent that some promoters might have us believe. High-level officers are touting the military's move to COTS products and predicting more reliable systems at less cost to the taxpayer. Unfortunately, until they can go down to a retail software store and pick up a torpedo-firing package for a Pentium system, it's not going to happen. Here's hoping the ejection seats work in real time.

--Gary Donnelly