Technologists Predict Pervasive Change
Network will be the basis for faster
By Don Dugdale
Just dealing with the pace of technological change may be the biggest challenge in the era of pervasive networked computing, according to a panel of chief technology officers speaking at SCO Forum96 last month. The advent of the Internet, intranets and widespread access to information has created dangers along with opportunities, the panel members said, implying that success in IT management will depend on how well one can manage the resulting changes.
Panel members included Rob Gingell, vice president and Sun Fellow with SunSoft (and a UniForum director); Martin Haeberli, director of technology with Netscape Communications; Todd Reece, general manager of Hewlett-Packard's networked computing division; and Glenn Ricart, chief technology officer with Novell.
Among competitive businesses, one change that has occurred is that competition is now based more on access to technology and less on the products themselves, Gingell said. For example, Nike and Reebok may not compete on shoes. "They're competing on things like market demographics, pricing models and manufacturing issues, which are all information problems and have nothing really to do with the mechanics of building sneakers. The winners in these marketplaces are likely to be the people that use technologies better than their competitors use them."
Another result, he said, is that information systems can't be frozen in one configuration anymore, resulting in continuous tension. "This is going to create a lot of pressure on the processes by which technology is delivered," Gingell said. "If the network really is the computer, and if the network is perpetually changing by virtue of the fact that every day there's a new machine or a new application or a new operating system, the idea that you can freeze the configuration is now bogus. The victory will go to the people who are able to absorb component technologies without having to revalidate all of their systems. Change is your friend," Gingell concluded. "You're going to have to create software that gets deployed without having to be constructed in the system test environment that we've come to use. You're going to have to get more software and component computing pieces from different places that you absorb more rapidly."
The Other Side of Change
For Reece, rapid change means that the industry has to consider the users' need for stability more than it ever has. "The key challenge we've got ahead of us is the balance between speed and stability," he said. "What we tend to lose sight of is that computing technology has become so pervasive and so much a foundation element in the way people run their businesses, that we need to make sure we're providing that stability and that infrastructure." He said there's a great need for "a class of very simple information appliances that users can plug in and use to get access to information. And it shouldn't take technologists like ourselves or hordes of IT people behind the scenes to hold this system together. We need to aim our innovation toward that model of pervasive computing."
But if the Internet and the Web are harbingers of this computing model, they are not perfect, he warned. "A lot of cracks around the edges are beginning to show right now," Reece said. "We haven't hit the stability level that we really need to aim for if this stuff is to become pervasive. We need to aim for perfection. People aren't going to base their businesses on an accounting system that's imperfect [but] the Internet is as close as we can come today. We aren't that far off, and we now need to figure out new ways of introducing new technologies out there."
According to Ricart, the Internet can't die, because it has been adopted by too many people. "The Internet is actually past the point on the adoption curve whereby it could die," he said. "Once it's been adopted by 10 percent of the population, it really can't go back, because the curve has to keep going." Ricart's future computing model involves moving from the client/server idea to the client/network model. "We have to move from a network of servers to service provided by the network," he said. "What that means is that we're no longer dependent on any one server." He added that, to move forward faster, software will have to be distributed in smaller, object-based releases and pieces instead of major revisions.
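Ricart's point about the adoption curve describes the classic S-shaped (logistic) diffusion model, in which growth accelerates once adoption passes the early-adopter threshold rather than reversing. As a minimal sketch of that idea (the midpoint, rate, and 10 percent threshold here are illustrative assumptions, not figures from the panel):

```python
import math

def logistic_adoption(t, midpoint=0.0, rate=1.0):
    """Fraction of the population that has adopted by time t,
    following a logistic (S-shaped) diffusion curve."""
    return 1.0 / (1.0 + math.exp(-rate * (t - midpoint)))

# Time at which adoption reaches 10 percent (with midpoint=0, rate=1):
# solve 1/(1 + e^-t) = 0.10  =>  t = -ln(9)
t10 = -math.log(9.0)

# Past that point the curve keeps climbing toward saturation,
# which is the sense in which adoption "can't go back."
for t in (t10, t10 + 1.0, t10 + 2.0):
    print(round(logistic_adoption(t), 3))
```

Running the sketch shows adoption roughly doubling over each of the next two time steps after the 10 percent mark, illustrating why growth at that stage is still accelerating.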
Ubiquitous connectivity is Haeberli's prediction for the future. "It is said perhaps not often enough that the Internet changes everything," he said. "At one extreme you'll see things like global satellite systems being widely leveraged. You'll see Internet connections off a podium like this and certainly in every hotel room and every conference room you go into." He also predicted that the Java programming language will become pervasive. "The good news is that Java is portable across a wide range of platforms, and the bad news is that there are still some devils or gods in the details. You find that things like filename lengths become critical issues that constrain whether a piece of Java code will run on a given platform. Also, Java implementations on a wide range of platforms aren't up to snuff yet. So we in the industry should continue to push that forward to give software developers and innovators a context for rapid innovation."
Change will not come without hiccups, the panelists warned. Both Gingell and Ricart predicted a "major computing disaster" by the year 2000, perhaps due to a security breach. And Haeberli forecast "earthquakes" as more new technology is rapidly accepted on the leading edge while old technology remains in use in many sectors. "Internet technology creates a kind of meta-infrastructure for change, where we can have the earthquakes," Haeberli said. "But the center of mass is actually moving slowly. I guarantee that there are still paper tape punches and readers being used to control machine tools in this country, even though paper tape has been obsolete for one or two decades."