Will the Internet Stay Open?

By Don Dugdale



[Sidebar]: Internet Standards Organizations
[Chart]: Standards Evolution
The rush of vendors into the Internet market is raising worries about future interoperability.

Can a system initiated by the U.S. military and built largely by the government and academia succeed as a worldwide commercial communications infrastructure--and remain open? That's the question facing the IT world and the Internet community today. As Internet traffic volume grows exponentially, as the variety and complexity of data increase, and as dozens of vendors jump into the arena with all sorts of business and technical schemes, the standards that govern interoperability on the Internet are getting their severest test.

As of now, although the system is a bit frazzled, it is surviving. And there's optimism that it can remain open and workable in the future--if there's a continued focus on sharing technology and developing standards.

Some observers liken the advent of the Internet as a vehicle for universal communications to the development and spread of Unix. Both involved diverse efforts to build on the same framework, focusing on interoperability, portability and sharing of technology.

"That's why Unix took off," says Michael Tilson, chief information officer for the Santa Cruz Operation in Santa Cruz, CA, and current president of UniForum. "The formal standards came later. To tell the truth, the formal standards just prevented some bad things from happening. They weren't the reason it took off."

Of course, the Internet's development, not tied to a particular operating system, has occurred much faster. "This industry is moving 10 times faster than Unix ever moved," says Corinne Moore, associate executive director of CommerceNet, a consortium of Internet-related companies in Mountain View, CA.

Despite the unprecedented demands being placed on the Net, most observers believe that its history of openness and technology-sharing will ensure an open future. "Interoperability is so ingrained in the whole philosophy that I don't think anybody's going to be so bold as to step out on their own" with a proprietary solution that doesn't mesh with the rest, says Marshall Behling, business development manager for VeriSign, a Mountain View firm founded in 1995 that verifies and authenticates the identity and credit status of online purchasers.

Fear of a Coup

However, others remain wary that the vision of great wealth to be gained from Internet software will tempt some company to try a Microsoft-style takeover in this arena. "If there is a company that is not committed to open standards, the same thing could happen that happened in the PC world, where one company with its proprietary standards sort of owns all the important standards in that space," says Frank Chen, security products manager for Netscape Communications in Mountain View, which, with Microsoft, is frequently mentioned as being capable of such a coup. Netscape insists that nothing of the sort is on its agenda. "We are absolutely committed to building our company around trying not to have one company, whether it's us or some other company, own proprietary standards," Chen says.

Netscape appears to have become the dominant vendor of software for the World Wide Web with its Netscape Navigator browser--employed by an estimated 70 percent of Web users--and Web server and Hypertext Markup Language (HTML) authoring programs that employ the same technology. However, the company insists that its strategy of sharing the specifications it develops means that any de facto standards it produces will be accessible to anyone. Specifically, Netscape has developed extensions to the HTML standard used to write and interpret Web pages--extensions designed to add value to its products. "We just want to be proactive and get necessary extensions out into the marketplace and at the same time aggressively pursue making those industry-wide standards," Chen says. "But in the interim, when they're not industry-wide standards, we publish them and make all the technical details available to developers who want to build products."

So far the feeling in the Internet community seems to be that Netscape's aggressive pursuit of market leadership on the Web has not hurt the openness of the Internet but rather has accelerated the standards process and hastened the emergence of de facto standards. "It isn't hurting the standards effort so long as there are people who need to use those features," says Pierre Wolff, director of marketing for First Virtual, an Internet payment system provider in San Diego. "The standards are necessary for there to be extensions."

"The mental stance on all of this is interoperability and intercommunication," Tilson of SCO says. "A lot of people understand the power of sharing technology."

As for Microsoft's role in the Internet world, the feeling is one of guarded optimism. An attempt by Microsoft and Visa to establish their own payment protocol for the Web appears to have withered since Visa and MasterCard announced agreement in February on a common standard, whose development will involve both Microsoft and Netscape. In December, Microsoft appeared to defer to an emerging standard by agreeing to license Java, the Internet programming language, from Sun Microsystems; Netscape licenses Java as well. Microsoft also employs Netscape's Secure Sockets Layer (SSL) security protocol, which helps make SSL a de facto standard.

Overall there's a degree of uncertainty, fueled by the youth of the market, its potential for sudden growth and the need for companies to stay flexible in a world of standards that are certain to change. Tilson describes the climate as a competitive brawl, from which standards probably will emerge to maintain interoperability. "I think we're in for five years of chaos and most likely longer," he says. "Everyone is going to try for proprietary advantage. The market dynamics could shift suddenly and chaotically. Netscape's ahead now, but the barriers to entry in this area are small. And some big whales that control important things like financial transactions might want to have proprietary advantage."

The Heart of the Net

While startup firms and industry leaders alike vie over high-level standards, the Internet infrastructure also evolves. Internet standards are governed by several organizations, none of which has absolute control and none of whose members wields decisive power. Essentially, no one is in charge. Instead, the standards-setting process is a technological equivalent of the New England town meeting. Although a number of bodies are involved in Internet standards-setting (see "Internet Standards Organizations"), the Internet Engineering Task Force (IETF) and its working groups are the focus of most standards discussions, and within them many thorny technical issues are hashed out. The IETF meets three times a year, almost anyone with the time and the interest can contribute, and results are obtained by rough consensus.

Although standards bodies tend to be dominated by vendor representatives out to protect their companies' interests, the IETF has many members from universities. However, as the Internet has emerged from academia into the commercial marketplace, the number of vendors is increasing. IETF meetings, attended by fewer than 300 participants six years ago, now draw 1,000 or more. Predictably, current opinion in the vanguard of Internet commerce is that the IETF isn't moving fast enough in its standards work to keep up with related technology.

"I think the IETF will wind up having to increase its pace," says Richard Doherty, director of the Envisioneering Group, a technology assessment and market research firm in Seaford, NY. Doherty, who has participated in IETF meetings, gives the IETF credit for trying to maintain open standards. "While they have stayed in a democratic, United Nations mode, there's an effort to make the process work. If anybody tried to do something too proprietary or that may disappoint too many people, it would get worked out in discussion." He thinks that more frequent working group meetings may be the answer to the problem of speeding up the process.

Although the IETF's work runs the gamut of Internet-related standards, one of its most discussed recent efforts is the next generation of the Internet networking protocol, IPv6, also known as IPng (Internet Protocol next generation). The current version, IPv4, was designed for a relatively small network of engineers and scientists, primarily for file transfer. IPv6 will update the basic protocol so the Internet can grow into a global multimedia network. Testing of the proposal is under way, but some wonder how much longer it will take as the process gets bogged down by ever-increasing vendor participation.

"With the IP environment, you have massive quantities of hardware in place, so you've got to worry about backward compatibility," says Lawrence Backman, vice president of advanced development for FTP Software, a provider of TCP/IP networking for PCs in Andover, MA. "If someone makes a mistake there and you break every old IPv4 thing, you're in trouble."

IPv6 is designed to solve the problem of too few IP addresses and to increase routing efficiency, considered the two most significant limitations of the current Internet. It will expand IP addresses from the current four bytes to 16, vastly multiplying the number of possible addresses. The routing and addressing improvements will be needed soon, Doherty says. "When we get to four times or 10 times the traffic we're seeing today [in late 1995] on the Internet--and that's likely to happen by spring--people will start to notice that everything's getting through, but it takes longer. It takes longer because there are more subtle twists and value-added open standards coming into play."
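
The arithmetic behind that expansion is worth spelling out. The sketch below is a minimal illustration in Python, using its standard ipaddress module (a present-day convenience, not anything the IETF used); the two addresses shown are reserved documentation examples:

    import ipaddress

    # A 4-byte (32-bit) IPv4 address and a 16-byte (128-bit) IPv6 address.
    v4 = ipaddress.ip_address("192.0.2.1")
    v6 = ipaddress.ip_address("2001:db8::1")

    print(v4.max_prefixlen, 2 ** v4.max_prefixlen)
    # 32 bits -> 4,294,967,296 possible addresses

    print(v6.max_prefixlen, 2 ** v6.max_prefixlen)
    # 128 bits -> roughly 3.4 x 10^38 possible addresses

Quadrupling the length of the address does not quadruple the address space; it raises it to the fourth power, which is why 16-byte addresses are expected to outlast any foreseeable growth.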

Although the IETF process isn't broken, it is showing strain, admits John Klensin, senior data architect for MCI in Reston, VA, and the IETF's area director for applications. But he can't envision the IETF moving away from openness in its standards-setting. "If you look at the alternatives out there, virtually all of them are worse," Klensin says. "But I'm looking for ways to make things better. I am not convinced that the way the IETF is doing business now would be workable if the number of active participants extended to 2,000 or 5,000 people."

We may feel confident about the dedication of those working on the underpinnings of the Net. But the runaway growth of value-adding applications is sure to put pressure on the framework, and business and the public are far less patient (and cautious) than professors and scientists. Despite some growing pains, however, the standards base seems reliable.

What Do Killer Apps Kill?

Of most concern for future interoperability are the user-focused applications: World Wide Web browsers and servers, e-mail, and security and financial transaction protocols. These are the areas in flux, where the greatest profits could be made and where the opportunity for proprietary lock-in lies. "There is the possibility for balkanization [fragmentation]," says First Virtual's Wolff. "That would work contrary to the good of the whole. The bottom line is that openness is important in order to keep growing."

As Netscape and others have forged ahead with their advanced Web tools, the question of complete interoperability between browsers and servers has come to the fore. In a perfect world, all browsers would interact with all servers in the same way, using the same HTML specification. But software developers, seeking competitive advantage, add features requiring HTML extensions; in Netscape's case, these include tables, frames and font manipulation in its Navigator browser and its server software. Webmasters place those features on their Web sites, but they're visible only to Netscape browsers. Interoperability with other browsers is still possible, but those browsers can't see the added features. The difference is relatively slight now but could become much more pronounced as technology advances.
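
The reason extensions don't break other browsers lies in a forgiving HTML convention: a browser silently ignores tags it doesn't recognize and still renders the text between them. The following Python sketch is a toy renderer, not any real browser's code; uppercasing stands in for actual font styling:

    import re

    def render(html: str, supports_font_tag: bool) -> str:
        # A browser that understands the <font> extension applies it.
        if supports_font_tag:
            html = re.sub(r"<font[^>]*>(.*?)</font>",
                          lambda m: m.group(1).upper(),
                          html, flags=re.IGNORECASE | re.DOTALL)
        # Every browser drops any remaining tags it does not recognize
        # but keeps the text between them -- graceful degradation.
        return re.sub(r"</?[a-zA-Z][^>]*>", "", html)

    page = '<p>Plain and <font size="+2">emphasized</font> text.</p>'
    print(render(page, supports_font_tag=True))   # Plain and EMPHASIZED text.
    print(render(page, supports_font_tag=False))  # Plain and emphasized text.

Both users see a readable page; only the Netscape user sees the intended styling. That is the "relatively slight" difference that widens as extensions grow more ambitious.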

Entrepreneurs believe that this is healthy for the industry, allowing the marketplace to sort out the better products. "Mosaic was the de facto standard and now Netscape is," says Stratton Sclavos, president and CEO of VeriSign. "Nothing that Netscape is doing prevents anyone else from coming into the marketplace with something better." He argues that the key to openness is that product specifications not be hidden. "The whole model is that you not only announce it, but at the same time you make it freely available and let people comment on it." In addition, new technology should be backward compatible, so, for example, Web pages employing Java applets can be read by browsers that lack Java capability.

Java itself has become a prime example of the Internet development model. Originally developed by Sun as an object-oriented language for use in set-top boxes, Java started simmering when Sun created its HotJava technology for running small applications, or applets, within Web pages. With Sun's decision to license the technology and its adoption by the key Internet players, Java appears ready to become a de facto standard this year. "Java browsers will have a successful market, but I don't think anybody is going to dictate a certain set of features that only their browser can use, and therefore everybody is locked in," says Sclavos.

In addition, Netscape and Sun have developed JavaScript, a cross-platform object scripting language for application development. The endorsement of JavaScript by 28 companies seems to assure that it will become a standard as well.

Although the development of product extensions can decrease accessibility for some users of older systems, Chen of Netscape sees a healthy tension in the different roles of standards bodies and vendors. "It would be great if somebody could just dictate from the IETF that it's going to be a certain way. The trouble is that that's not a standards body's charter. Their charter is to make sure that technical specs are complete and accurate and serve a technical purpose. A product company's job is to meet customers' needs."

This situation could mean that the IETF will be considering whether to endorse standards after they are already in wide use. Klensin, the IETF area director, believes that is not necessarily bad. "If they [technology innovators] dominate the marketplace and keep the technology for themselves, then in that particular area, the IETF is irrelevant," he says. "But if their goal is to get it into as many systems and environments as possible, the IETF process has a tendency to refine and technically improve the proposals."

Delivering the Mail

Electronic mail is another application area where the need for interoperability and standards development is crucial. E-mail on the Internet works because messages of different formats can be transferred and translated through gateways, filters and protocols like the Post Office Protocol (POP) and Simple Mail Transfer Protocol (SMTP). But the widely varying formats can cause problems for companies like First Virtual, which depends on e-mail for its Internet payment system to work. For example, some gateways truncate the subject header, which contains information vital to First Virtual. "We have over 1,000 filters to address different variants of what are considered standard e-mail systems," Wolff says. "It's amazing that something as simple as that, which one would think is a done deal, is actually still evolving."
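
What such a filter looks like can be sketched simply. First Virtual has not published its filters, so the rules and the "FV-ID:" marker below are invented for illustration; the point is the flavor of normalizing headers that different gateways fold, prefix or truncate:

    # Hypothetical gateway filters: rejoin folded headers, strip reply
    # prefixes, then check that the transaction marker survived transit.
    def normalize_subject(raw_subject: str) -> str:
        subject = " ".join(raw_subject.split())   # rejoin folded lines
        for prefix in ("Re:", "RE:", "Fwd:"):
            if subject.startswith(prefix):
                subject = subject[len(prefix):].lstrip()
        return subject

    def extract_transaction_id(subject: str):
        marker = "FV-ID:"                         # invented marker
        if marker not in subject:
            return None   # truncated or foreign format; needs a new filter
        return subject.split(marker, 1)[1].strip() or None

    print(extract_transaction_id(normalize_subject("Re: Payment FV-ID: 12345")))
    # -> '12345'

The truncating gateway is the hard case: if the marker was cut off in transit, the message must be set aside rather than guessed at, which is how a system accumulates a thousand special cases.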

In addition, secure e-mail is now a pressing need for many businesses. RSA Data Security of Redwood City, CA, and its spin-off, VeriSign, have been actively promoting the Secure/Multipurpose Internet Mail Extensions (S/MIME) protocol, which is designed to add security to e-mail in MIME format. Authentication and privacy would be provided using encryption and digital signatures. Interoperability testing has been conducted and compliant products were expected to be announced early in 1996.
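
The shape of an S/MIME message is straightforward: the original MIME body travels alongside a detached digital signature in a two-part multipart/signed container. The sketch below builds that shape with Python's standard email package; the signature bytes are a placeholder, since real signing requires a certificate and a cryptographic library:

    from email.mime.application import MIMEApplication
    from email.mime.multipart import MIMEMultipart
    from email.mime.text import MIMEText

    body = MIMEText("The quarterly order is attached.")

    # Placeholder bytes stand in for a real PKCS#7 detached signature.
    signature = MIMEApplication(b"<signature bytes>",
                                _subtype="pkcs7-signature",
                                name="smime.p7s")

    message = MIMEMultipart("signed",
                            protocol="application/pkcs7-signature",
                            micalg="sha1")
    message.attach(body)       # part 1: the signed content
    message.attach(signature)  # part 2: the detached signature over part 1
    print(message.as_string())

Because the signature rides in a separate part, a mail reader that knows nothing about S/MIME can still display the message text; only verification is lost.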

Open Yet Secure

Protocols for security on the Internet have been at the center of some of the strongest controversies of recent months. One continuing battle apparently was settled peacefully last spring when competing companies agreed to resolve their differences over two rival protocols: SSL, promoted by Netscape, and the Secure Hypertext Transfer Protocol (S-HTTP), backed by Enterprise Integration Technologies (EIT) of Menlo Park, CA, and CommerceNet. The parties involved announced that both SSL and S-HTTP would be supported and unified in a toolkit for application developers. EIT and RSA formed Terisa Systems in Mountain View to provide the toolkit, which will employ SSL at a lower, broader level and S-HTTP at a higher level. "They work synergistically," claims Kurt Stammberger, RSA's director of technology marketing. "You're going to see more browsers and servers supporting both protocols." Open Market of Cambridge, MA, also offers such a toolkit.
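
The division of labor is easy to see in code. SSL wraps the transport socket, so the application above it speaks plain HTTP; S-HTTP, by contrast, secures individual messages. This minimal sketch shows the SSL side using Python's standard ssl module, which today negotiates TLS, SSL's standardized descendant, though the layering idea is the same:

    import socket
    import ssl

    context = ssl.create_default_context()

    with socket.create_connection(("example.com", 443)) as raw_sock:
        # Wrapping the socket is the entire integration: everything sent
        # afterward is encrypted and authenticated transparently, while
        # the application-level request remains ordinary HTTP.
        with context.wrap_socket(raw_sock, server_hostname="example.com") as tls:
            tls.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
            print(tls.recv(4096).decode("latin-1"))

That transparency is why SSL suits the "lower, broader" role in the Terisa toolkit: it can secure any protocol that runs over a socket, while S-HTTP can express per-message policies that a blanket encrypted pipe cannot.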

The clash between Visa/Microsoft and MasterCard over Web payment protocols has overshadowed a more basic question: How will electronic commerce function on an Internet with multiple, sometimes competing and inconsistent protocols? "The most important thing we're lacking is a universal way of doing secure transactions that really works," Tilson says.

EIT, now part of the Internet commerce division of VeriFone, Inc., has been focusing on developing a total payment infrastructure so that companies like First Virtual, Open Market and CyberCash of Reston, VA, could interoperate via Netscape browsers and servers. EIT/VeriFone is working in several areas--with CommerceNet and the World Wide Web Consortium (W3C), with the American National Standards Institute (ANSI) and with the credit card companies--to establish a single solution. The result may be a single negotiation-protocol standard that lets software settle which protocol and payment method to use. "Let's say Visa and MasterCard don't work together," says Mohammad Khan, director of marketing for EIT/VeriFone. "At least they should be able to decide on a system for transmission. Our goal is to make that as simple as possible for the consumer and the merchant."
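
A negotiation standard of the kind Khan describes could be as simple in outline as the following sketch. The protocol identifiers and the preference rule are invented; the point is only that merchant and consumer software find common ground automatically before any payment data moves:

    # Invented protocol identifiers; a real standard would register these.
    MERCHANT_PREFERENCES = ["s-http/card", "ssl/card", "email/first-virtual"]

    def negotiate(consumer_supports):
        # Take the first protocol in the merchant's preference order that
        # the consumer's software also supports; None means no common ground.
        for protocol in MERCHANT_PREFERENCES:
            if protocol in consumer_supports:
                return protocol
        return None

    print(negotiate(["ssl/card", "cybercash"]))  # -> 'ssl/card'
    print(negotiate(["digicash"]))               # -> None

Even if Visa and MasterCard never converge on one payment scheme, a shared negotiation step like this would spare the consumer from ever seeing the disagreement.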

Tilson also accepts that multiple protocols will survive at some level. "It's clear that people are going to do different things [with protocols]," he says. "Eventually the market will settle on a couple that are highly standardized. Standards have to allow for security protocol negotiation, because that creates a framework for advancement."

Despite the climate of cooperation, the talk of technology sharing and the Internet's history of democratic resolution of standards issues, few observers discount the possibility for an attempted proprietary coup in a lucrative arena such as Internet commerce. "Anyone who can exert monopoly control can make a lot of money," Tilson says. "Anybody who wants to be immensely wealthy would like to be another Microsoft." However, the prevailing opinion is that the Internet has a good chance of maintaining a level of interoperability unprecedented in the history of computing. Many already see that as a significant victory.

Don Dugdale is a technology writer based in San Jose, CA. He can be reached at dugdale@netgate.net.