Case Study: AirTouch Communications

Forcing Value Downward


The Seattle office of AirTouch Communications (formerly US West New Vector) was once primarily a mainframe shop but had begun its transition to a Unix-based client/server environment. By early 1995, AirTouch Seattle was ready for the next stage in that evolution: moving functionality further down in the system, to midrange servers and desktop computers. Before embarking on this stage, however, says Dave Beal, leader of an applications development team, AirTouch felt it necessary to seek outside help in setting up performance metrics to guide the team. "We didn't have a lot of basis to understand what would happen if we moved applications down to PCs and midlevel servers," Beal says.

Over several weeks, Beal worked with Terry Hodges, a founder of Thinc Corp., a client/server consultancy. Together they created a training program and a set of metrics and standards to guide AirTouch's in-house developers as they build specific client/server applications in the future.

"Hodges supplied the basis to understand the new development environment," Beal says. "He took our in-house programming standards, which were legacies of the mainframe environment, and expanded and updated them to provide the metrics for our client/server development."

According to Hodges, the theory behind his work at AirTouch was that the best time to establish client/server metrics is at the start of a project. "A lot of companies are trying to measure the effectiveness of applications coming out of a client/server environment," he says. "That implies establishing clearly up front what the goals of the project are."

Hodges says that it is hard to measure the effectiveness of the various client/server software architecture options, particularly when a development team is influenced by a mainframe psychology. "What many people don't understand is that there are at least five different forms of client/server software architectures," he says. At one extreme is the architecture that places almost all functionality on the server. At the other extreme is the architecture in which most of the functionality resides on the client. In between are options that apportion functions between client and server in varying degrees, dictated by whatever split of processes makes the most sense in the given application.

"Client/server is essentially a process distribution problem more than a data distribution problem," Hodges says. "What you find is that people coming from a mainframe environment will load all the processes onto the server. You also find people who want to load processes onto the client." When these unbalanced architectures bog down, technology managers or company executives may blame the entire client/server paradigm rather than their particular implementation.

In his work with AirTouch, Hodges taught Beal's developers how to apportion client/server processes in ways that made sense for each application. The result was a book of metrics and standards to guide future applications development. The actual consulting took only a few weeks, spread over several months. Hodges estimates that his efforts cost AirTouch less than one percent of the cost of the development projects it would undertake using those metrics and standards. Of course, AirTouch had already selected much of its client/server hardware and software, so his job was confined to teaching process distribution.

"But even if a company contracted for a complete client/server metrics program to guide a reorganization, the cost would get applied to all subsequent development," Hodges says. "What's a few hundred thousand dollars if you spread it over millions of dollars of development costs over several years?"