Standards & Technology

A Look Behind the Scenes

The Rise and Fall and Rise of Standardization



The cyclical nature of standards efforts in the IT industry provides a lesson worth learning and remembering.

By Carl Cargill

In April, I had a conversation on the past, present and future of standardization with Gary Robinson, director of standards at Sun Microsystems. I've known Gary for nearly two decades and worked for him when he was manager of corporate standards at DEC. Gary was writing a paper, and I was looking it over before he sent it to whichever journal was publishing it. One of his contentions was that standardization is cyclical and that it was playing true to form now. Needless to say, I was intrigued.

When I asked Gary about the statement, he said that it had come up in a conversation he'd had with Joseph DeBlasi, executive director of the Association for Computing Machinery (ACM) and former director of standardization at IBM. Gary and Joe had been discussing the changes in the industry over the last 50 years (that is, since the discipline started) and had noticed the rise and fall in the popularity of standardization. Their contention--with which I agree--is that standardization is always occurring; it's just that, sometimes, it commands more publicity than at other times.

This started me thinking about what is happening in the industry now, and it provided the basis of a paper on the status of open systems that I presented at UniForum New Zealand in May. I'd like to summarize that paper here and then, from personal experience, provide a bit of insight into what it means for the industry.

Over the last 20 years, the information technology industry has displayed a consistent pattern of moving from less open to more open solutions. The way has not been easy, and the industry tends to forget why something was done.

For example, the idea of second sources--so popular in the mid-1980s--seems now to apply only to chips and boards. The idea of second sources for all components of a system seems to have fallen by the wayside. Yet those who remember the plug-compatible battles of the 1970s understand that the ideas that drove second sources were a response to the nearly monopolistic pricing and market position that IBM seemed to hold. This was the era in which the Brooks Act established the Federal Information Processing Standards (FIPS) program, administered by the National Institute of Standards and Technology (NIST), which provided for equity in procurements of disk drives. It was also the time when the industry rallied to prevent the federal government from mandating the DEC PDP-11 architecture as a standard. (Even DEC fought that one, especially because it was about to introduce the VAX architecture.)

Similar attempts to include--or preclude--any architecture or other proprietary solution are doomed to fail for the same reason that these earlier attempts did. Fundamentally, the market isn't dumb. (I can almost hear the chorus of protests now: "We never said the market was dumb. It's just that what they bought is so inferior to our solution. We're sure that if they only understood . . . .") In a presentation I made to the Motorola standards group, I said that the market will punish "closed" systems that keep other vendors from playing with them. Silicon Valley offers a few examples of this; Apple Computer is a primary one. Apple remained special unto itself for a long time, and it did so out of a sort of pride. Microsoft took an entirely different course: it retained a proprietary hold on certain of its assets while inviting anyone to build on other parts. Now Microsoft continues to succeed, while Apple has fallen on hard times.

Proprietary Versus Closed
The key thing to remember is that the market does not object to "proprietary" solutions. It objects to "unopen" (or closed) systems: those that prevent participation by others in the computing construct under consideration. Returning to the earlier example, if the construct was mainframes, the concept was plug compatibility, which eventually succeeded. The market rewarded behaviors that emphasized sharing of hardware; the companies that succeeded were those that adopted the standards necessary to provide some way to have heterogeneous hardware systems. Standards and proprietary systems intermingled for a long time, but their legacy was that peripherals could be used across a wide selection of computers.

The starting model assumed sharing of hardware, and the standardization that drove this model proceeded apace. The SCSI disk interface started at this time, and every successful player in the hardware industry embraces, in one form or another, SCSI and the RS-232-D interface.

But the big effort in this era was the movement to the sharing of data. In the minicomputer construct, the participatory scheme was to have been networked information sharing; within the standardization world, this implied the Open Systems Interconnection (OSI) model. Needless to say, that model failed around 1990, at about the same time that the minicomputer market (which had revolved around Route 128 in Boston) began to fade. (Remember Wang, Prime and Apollo, and when DEC had to rent the QE2 as a floating hotel because Boston didn't have enough hotel space for its DECWorld event?) The "open" issue then turned to focus on the sharing of data and interoperability.

In the workstation arena, the sharing model was the Unix operating system and the TCP/IP communications package that usually came with it. The idea behind the model was the nonexclusionary nature of Unix--that is, anyone could be (mostly) Unix, and Unix systems shared a certain commonality. The PC model, on the other hand, became one of "here's a way to share data": not applications, necessarily, but the data from the applications. The catch with the first was that there were no standards to enforce the commonality of Unix or the portability of applications; the catch with the second was that data couldn't be shared unless the applications were exactly the same and running on the same type of operating system.

Both models have succeeded. The Unix market has finally managed to accept a common model (a unified Unix), and Microsoft has come to dominate the desktop with its "open" model, which says that essentially anyone can build an application using the published Windows interfaces. Standardization--formal or de facto--has caught up with this market and is being put in place to ensure that the foundation is firm. The market now understands how to use and share data.

The Market Moves On
While this fight holds center stage for the vendors, the market has moved on. Today it is focusing on the Internet and the World Wide Web. The buzz about the "information superhighway" began nearly five years ago, in the run-up to the 1992 presidential campaign. The market has disengaged from the earlier activities--even the Department of Justice is no longer pursuing Microsoft--and is trying to make sense of the new opportunities and to understand what is being offered by this new capability. This is a period of sorting out, as the market looks for the "open" solution.

Major standardization efforts will migrate to this arena. New organizations will be created, and new structures will deal with the problems that surface. Some will fail--from lack of will, lack of ability or lack of skill. But many will succeed, and the winning solution will be the one that earns the belief of both the user and technical communities. The workstation/PC stage gave users the ability to seek the solutions they needed within a defined construct; the current revolution removes many of those limiting constructs. The solutions that succeed--and, by inference, the companies that back them--will be those that are inclusive, not exclusive.

I started out with the statement that standardization is cyclical. The nature of a cycle is the repetition of the same actions within a different environment. The standardization that occurred in the 1960s was the same as that which occurred in the 1980s: both were aimed at opening up a market--making it more "open" not for the sake of a technology but so that more people could use the results of that technology and become more productive. The workstation/PC era of standardization is drawing to a close. And the cycle is starting again.

Carl Cargill is a standards strategist at Netscape Communications in Mountain View, CA. He can be reached at carl@netscape.com.