Guest Commentary
Two From Australia

 

Publisher's Note: AUUG, UniForum's very active affiliate in Australia, publishes AUUGN: The Journal of AUUG. This excellent periodical carries features, reviews and opinion pieces of interest to the entire open systems industry. UniNews online is pleased to publish a pair of articles by Frank Crawford, Site Manager of ANSTO and a member of the AUUG Executive Committee. His first offering gives us an important counterview of NT Mania, and his second a thoughtful discourse on computers as "magic" - what we don't understand about computing. Please let Frank hear directly from you with your thoughts and opinions. RRS

 

 

Has NT's Time Come?

by Frank Crawford

Microsoft's publications regularly claim that this is the year NT will take off, as if repeating the statement will make it happen. In the last few months, however, there has been some evidence that NT may in fact be starting to be accepted, though not exactly in the way Microsoft would like.

In the last six months the number of people actively studying NT, finding design flaws, bugs and security issues has increased dramatically. Making it worse for Microsoft, this concentrated attention has started to filter through to the non-technical people involved in decision making, countering many of Microsoft's own press releases.

In fact, NT is starting to suffer from exactly the same public pressures that most Unix systems have faced for years (pressures Microsoft has even used as points against Unix).

Things started quietly late last year, when it was noted that by connecting to NT's Web server (IIS) and typing a specific string, you could cause the IIS server to crash. Nothing too dramatic, and it turned out to be fixed in the next release of IIS; however, not everyone was ready to upgrade to the new version, or even knew it was necessary.

Over the Christmas period, things began to get worse. Firstly, it was discovered that connecting to NT's RPC server (an essential service) and typing random characters would cause the system load to approach 100% and lock out all network access to that machine. The fix was either to stop and restart the process controlling the RPC service (rpcss) or to reboot the system; either way, most active network services would be dropped.

Shortly after this, it was discovered that the RPC service was not the only one susceptible to such attacks, and that the bug was present in both NT version 3.5 and NT version 4.0. To its credit, Microsoft acted fairly quickly and had a fix out within a week; however, the fix was only for Intel-based NT version 4 systems with the current service pack installed (SP2), and only for the original RPC problem.

After these problems surfaced, a few users started investigating other such shortcomings. Recently it has been discovered that the DNS server (i.e. the program that handles Internet name-to-address conversion) can easily be killed by sending it a response to a request it never made.

Now, while it may seem that all these problems are unlikely to occur in real life in a protected environment, on the Internet they are commonplace. All of them can be (and are) used by malicious users to mount what are known as "Denial of Service" attacks against systems. For a system used within an intranet the likelihood of such an attack is small, but for any system connected directly to the Internet, it is a certainty.

If the existence of these problems were not bad enough, they all seem to exhibit a common feature, and one which raises even more concerns. They all appear to stem from poor data and error checking by the programs involved and, by extension, probably many others. This is something that is taught from day one in most programming courses and is essential in any program that has security implications. Unfortunately, it appears that many of the programmers at Microsoft have forgotten it.
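
To make the point concrete, the sketch below shows the kind of defensive checking involved. It is purely illustrative C (the names, limits and messages are invented for this article, not taken from any real server): the request is checked for sensible length and content before anything else is done with it, so garbage input is rejected rather than passed on to code that assumes it is well formed.

    /* Illustrative only: a hypothetical request handler showing the
     * "check your input first" discipline the article describes. */
    #include <stdio.h>
    #include <string.h>
    #include <ctype.h>

    #define MAX_REQUEST 512

    /* Returns 0 if the request is acceptable, -1 if it is rejected. */
    static int handle_request(const char *buf, size_t len)
    {
        /* Check the length before doing anything else. */
        if (buf == NULL || len == 0 || len > MAX_REQUEST)
            return -1;

        /* Reject random binary garbage instead of trying to parse it. */
        for (size_t i = 0; i < len; i++) {
            if (!isprint((unsigned char)buf[i]) && !isspace((unsigned char)buf[i]))
                return -1;
        }

        /* Only now is it safe to act on the contents. */
        printf("handling request: %.*s\n", (int)len, buf);
        return 0;
    }

    int main(void)
    {
        const char good[] = "GET /index.html";
        const char bad[]  = { 'x', 0x01, 0x02, 0x7f };

        printf("good request: %s\n",
               handle_request(good, strlen(good)) == 0 ? "accepted" : "rejected");
        printf("bad request:  %s\n",
               handle_request(bad, sizeof(bad)) == 0 ? "accepted" : "rejected");
        return 0;
    }

None of this is difficult; the point is simply that a service exposed to the network cannot assume its callers are well behaved.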

Taking this back a step further, the problem may be a cultural one at Microsoft rather than just poor programming. Most of the code developed by Microsoft is not available for public inspection, is developed under extremely tight deadlines, and little effort is put into working with those outside a very tight circle.

If you contrast this with the Unix world, you can see why this culture is a problem. Most new features within Unix have been developed through the free availability of source code: first through the original Bell Laboratories versions, then through BSD, and now through Linux, FreeBSD and other public developments. The availability of source has two effects: firstly, the original programmers are conscious that others will be seeing their work and are unlikely to take short cuts; and secondly, even if they do miss something, a later developer is very likely to pick it up. It is interesting that while many of the commercial Unix vendors make their own modifications, the vast majority of the code base still comes from the original freely available source.

The pressure of deadlines is an obvious cause of errors, and while it is present in the commercial Unix marketplace, so much of the development is still done in the public domain that it is not as much of an issue. At the same time, because there are a number of different Unix vendors and developers, while one may be under the pressure of a deadline at any one time, others will not be, and hence have some time to notice problems and correct them.

The limited exposure that Microsoft products get before release is a further cause for concern. As any developer will tell you, the users of a program do not do what you expect. When the process is tightly controlled, it is unlikely that anyone will do one of those stupid things that happen in the real world. The code will work correctly when given correct data, but its response to incorrect data is unlikely to be heavily tested. Again, the wide distribution of code at a very early stage in the Unix world quickly fixes that.

So, while NT may have started to come of age, like any child it has also started to encounter unexpected problems. While the basic design appears to be sound, the implementation leaves a lot to be desired, and there are likely to be many growing pains ahead.

 

 

Computer Magic

by Frank Crawford

Everyone involved in computer support knows of people who just have to walk by a computer for it to play up. They also know people who just seem to have everything go right for them, such as only ever having to install Windows 95 once (or maybe never). At the same time anyone who is seriously involved in computer support has to firmly believe in Murphy's Law, and often take unusual actions to mitigate it.

Now, while those not seriously involved may think that the statements above shouldn't be taken seriously, in modern computing there are, in fact, good reasons to follow them.

Today's computers are by far the most complex pieces of equipment that people encounter, and not just in their hardware components. The average PC has all the equipment of a stereo, telephone, typewriter/printer, fax machine, copier and television, as well as "specialist" items such as a hard disk, network card and mouse. On top of this, it can change its personality at the touch of a button: from an educational toy for a pre-schooler to a Formula 1 racing car simulation, from a challenging chess opponent to a way of chatting with friends, and even to a tool for solving complex scientific problems that are hard to describe, let alone understand.

Yet, at the same time, these machines are being deployed into every corner of the country with the assumption that anyone can use them. Built into this widespread distribution is the further assumption that people understand what a computer is doing. Unfortunately, this is totally false. It may have been possible to understand all the steps involved in a computation in the days when computers ran only a single job at a time, containing all the instructions for every activity, but today, with multitasking, hardware and software interrupts, multiple devices and even multiple CPUs, at any tick of the clock your computer may suddenly run off to undertake a different and unexpected activity.

All these different activities imply an exponential number of possible interactions, and it is these that make any computer unpredictable. In fact, today, much of the work by computer support staff involves experimentation to see how a system reacts in a given situation. The good support people have a feel for it, just like the old bushman has a feel for the country.

At one time this experimentation was only needed for high-end systems; unfortunately, it is now becoming more and more common right across the computer spectrum. Despite the open nature of most Unix systems, security analysts often have to experiment to see the effects of different security policies and options. Even worse, NT security people are unsure which settings in the operating system are significant and which are irrelevant. To make matters more difficult, the NT documentation does not even fully list what all the security options are!

The problems faced by these groups are no different from those faced by scientists trying to explain the world we live in; unfortunately, the world being explored here is man-made, yet we still can't comprehend it.

When you take a step back and look at how this affects what the average user sees, it is not surprising that unusual problems occur, and that some people are just naturally better at adjusting or correcting them than others. Some people will instinctively take the correct approach, others will blindly follow some preconceived ideas.

Ultimately, it comes down to the simple fact that anything to do with a computer was developed by a person. Unarguably, that person was not able to comprehend all the interactions that might affect their work, and as such may have left out an important detail. On the positive side, you should always assume that this was unintentional and, in general, only a small deviation from the truth (however, the lower down the problem lies, the bigger the effect can be by the time it reaches you).

In other words, what doesn't work was not aimed at you; it was only caused by the developer being human. At the same time, understanding and correcting the problem may take intuition and "magic" rather than a totally logical process.

 

AUUG is Australia's largest and most active open systems user group. AUUG provides a wide range of services for its members; for more information, please contact our secretariat on 1 800 625 655 (phone) or (02) 332 4066 (fax), email us at auug@auug.org.au, or see our Web page, http://www.auug.org.au/auug/.

About the Author:
Frank Crawford is Site Systems Manager with ANSTO and can be reached at frank@ansto.gov.au

 

 
