Case Study: Millipore Corp.

Self-Guided Assessment Yields Comparison

Millipore Corp., a manufacturer of filtration systems for pharmaceutical and laboratory applications based in Bedford, MA, is an established open systems shop. Yet even after using the MOSES group guidelines to architect a successful client/server environment for Millipore's $600 million worldwide business, Ron Hawkins, IS director, wondered whether the company was getting the best performance from its 47-person North American information technology group.

"We wanted a set of checkpoints we could take to management and say this is where we are relative to other companies of similar size," says Hawkins, who oversees a client/server system with 800 end users. "The metrics would also tell me whether I'm making progress or regressing on key areas like quality, operational security and the like."

Hawkins took advantage of a unique metrics program administered by the computer assurances division of Coopers & Lybrand. Cushing Anderson of C&L, who worked with Hawkins on the project, describes how the assessment is done.

The process begins with the BenchStation, a fully configured multimedia PC that is brought onto the client's site. The BenchStation runs a multimedia software routine that guides IS directors through a self-assessment process, asking them questions that cover staffing and budget levels, as well as operational issues like frequency of backups, instances of downtime and security practices. Answers are given on a scale of one to five.

The program also records the answers. Using the computer to administer the self-assessment questionnaire has both practical and methodological benefits, Anderson says. On the practical side it reduces the cost of the process (Coopers & Lybrand charges $3,000 to $5,000 for the metrics assessment). The use of a computer-driven program also eliminates the drudgery of filling out forms and makes sure every respondent hears exactly the same instructions.

When respondents finish answering the questions--a process that usually takes about three hours--the BenchStation sends the data via modem to the C&L metrics database in Boston. There, each set of responses is matched against previous responses from the several hundred companies that have already participated in the metrics program. Within two weeks, clients receive a report that shows how their company stacks up against other respondents in four areas: IT spending, IT staffing, application metrics and technical metrics. The report also shows how each client firm measures up against the best-in-class client/server implementations for companies of comparable size or similar industries.
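The article does not describe how C&L computes the comparison, but the basic idea--ranking a client's self-assessment scores against a database of prior responses--can be sketched in a few lines. The category names, sample scores and percentile-rank method below are all hypothetical illustrations, not C&L's actual methodology.

```python
# Illustrative sketch only: percentile-ranking a client's 1-to-5
# self-assessment scores against hypothetical prior responses.
# All data and category names here are invented for illustration.

def percentile_rank(value, benchmark):
    """Return the percentage of benchmark responses at or below `value`."""
    at_or_below = sum(1 for v in benchmark if v <= value)
    return 100.0 * at_or_below / len(benchmark)

# Hypothetical average scores from prior participating companies
benchmark_db = {
    "IT spending":         [2.1, 3.0, 3.4, 2.8, 4.1, 3.7, 2.5],
    "IT staffing":         [3.2, 2.9, 3.8, 4.0, 2.6, 3.1, 3.5],
    "application metrics": [2.4, 3.3, 2.9, 3.6, 4.2, 3.0, 2.7],
    "technical metrics":   [3.1, 3.9, 2.8, 3.4, 2.2, 3.6, 4.0],
}

# Hypothetical client responses in the report's four areas
client_scores = {"IT spending": 3.5, "IT staffing": 3.0,
                 "application metrics": 3.8, "technical metrics": 2.9}

report = {area: percentile_rank(score, benchmark_db[area])
          for area, score in client_scores.items()}

for area, pct in report.items():
    print(f"{area}: at or above {pct:.0f}% of respondents")
```

A real program would of course weight questions, segment comparisons by company size and industry (as the report does), and flag best-in-class gaps, but the ranking step reduces to this kind of lookup against the accumulated response database.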

Because the metrics ratings are based on self-assessments, checks have been put in place to make sure responses are not inflated, according to Nancy Corbett, director of the metrics assessment program for C&L. They begin with the coordinator, who brings the BenchStation into the client site and oversees the process. The coordinator files a private comment to Corbett's office noting any discrepancies between the company's known practices and the responses given to the BenchStation. If a respondent did lie during the self-assessment, Coopers & Lybrand reserves the right not to put the responses into the benchmark database, where they would distort the results. That has not yet happened, Corbett says. "Fudging on the survey is like cheating at golf. You only fool yourself."

For Hawkins, the information he gleaned from the metrics assessment has served as a yardstick of past performance and a baseline for future efforts. "It really helps that we did this with full management support, so it's not like we were worried about getting a bad report card," Hawkins says. "For me, it was a quick, cheap, painless way to learn where we were strong, where we need improvement and how far we have to go to achieve best-in-class."