The new Unix alters NT's orbit

The re-emergence of Unix threatens to modify the future direction of NT

By Nicholas Petreley

Summary
Unix on Intel; early results of NT vs. Unix; the ability to offer NT services on a more reliable Unix server; colleges turning out a legacy of Unix-loving IT professionals -- these reasons (and more) make Unix a well-honed carving tool for shaping the direction of NT. But at what cost? (2,800 words)

The Next 10 Minutes is an analysis series dedicated to exploring the various external and internal market forces Microsoft is allowing to shape the architectural development of Windows NT, some of which are pushing NT in contradictory and conflicting directions. In the end, we hope to get a better idea of what we can expect NT to be (and not to be -- that is, indeed, the question) in the future.



Last month, we looked at how Microsoft is driven to protect its monopoly on the desktop at all costs, even if it means its ultimate design goals for NT must change to meet every threat. In particular, we focused on Microsoft's reaction to the threat of network-centric computing.

This month, we look at how the resurgence of interest in Unix is affecting the development of Windows NT and related Microsoft software. But in order to understand why Unix is a thorn in Microsoft's side, we have to understand why Unix is only now becoming a threat after years of failing to break into the PC market. In a nutshell, here are the history and the determining factors:

Unix had initial problems competing with other PC operating systems:

  • Unix started out too big and unfriendly for the PC
  • Wintel has emerged as the only "safe" business choice

Pundits have incorrectly predicted that Windows NT must overtake Unix because:

  • Unix is largely associated with expensive, high-margin hardware
  • Moore's Law suggests Microsoft will eventually match Unix performance with NT on cheap hardware

(Moore's Law states that the number of transistors on integrated circuits doubles roughly every 18 to 24 months, thus advancing the performance of computer hardware at a breakneck pace as prices drop.)

Pundits flip and Unix flops
When it comes to predicting the future, Unix is truly the most enigmatic of all operating systems. In the early '80s, pundits predicted Unix would eventually rule the PC. Yet Unix couldn't make a dent.

AT&T took a stab at bringing Unix to the desktop by putting a version of its Unix in an Olivetti box running a 68000 series chip (if memory serves, it had a whopping 20 megabyte drive). It could barely get out of its own way.

Microsoft co-developed Xenix. It sold like ice cubes in the Arctic.

Unix had a limited PC market, almost entirely server-centric. SCO made money on Unix -- and some of that money even went to Microsoft. (Microsoft owns 11 percent of SCO, but Microsoft got the better deal in the long run: it collected money on each unit of SCO Unix sold, thanks to a bit of code in SCO Unix that made SCO somewhat compatible with Xenix. The arrangement ended in 1997.)

By the '90s, pundits had reversed their predictions. The success of Unix was confined to expensive high-margin hardware. So the question was no longer whether Unix would invade the PC market. The question was whether the Windows PC would invade the Unix market.

It is superficially logical that the Windows PC must eventually overtake the high-end Unix market. As Moore's Law continues to increase PC processing power at ever-reducing prices, the assumption is that Windows NT will eventually be able to match the performance of high-priced Unix RISC workstations and servers at a much lower price.

This reasoning is flawed in several respects, which we will explain later. But first, we need to examine how Microsoft formed its plans based on the assumption that Windows NT would have to compete with Unix one way or another -- whether on the PC or as a Wintel vs. RISC/Unix battle.

Waging the war
Microsoft fights its wars on paper long before it addresses the technology. So Microsoft prepared for its battle against Unix with a two-pronged attack in the press. The first approach was to position Windows NT as a better Unix than Unix.

Vendors such as Sun, IBM, DEC, SCO, and HP modified Unix to differentiate their products. This splintered Unix to a degree, though not quite as much as is usually perceived. Necessity being the mother of invention, programmers have created development tools that help them work around the differences between Unix flavors. As a result, there is a large body of software based on source code that will automatically configure itself to compile on most Unix platforms, including Intel-based Unix.
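
The ubiquitous configure script is the clearest example of such self-adapting source code. As a sketch (the install location shown is merely the common default, not any particular package's requirement), building a portable Unix package looks the same on nearly any flavor, Intel-based or otherwise:

    # The configure script probes the local compiler, headers, and
    # platform quirks, then generates Makefiles tailored to this system.
    ./configure
    make
    make install    # typically installs under /usr/local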

Regardless, Microsoft would leverage the perception that Unix is splintered beyond hope, and present Windows NT as a more consistent multi-platform alternative.

Unfortunately, Microsoft misjudged the market's priorities when it came to Windows. The rules for PC buyers are quite different from those governing high-end buyers. IT consumers of high-end equipment weigh features and stability along with vendor credibility, as they assume they will have to hire the expertise to manage whatever environment they choose. IT consumers of PC equipment tend to stick with the safe lowest-common-denominator choice and leverage the expertise they already have on staff.

So the very rule that led to Microsoft's success -- 100 percent compatibility with the largest market share -- subverted Microsoft's attempt to sell Windows NT based on its cross-platform consistency. Most potential customers feel better about running their Windows applications on 100 percent Wintel-compatible machines, even if it means running them slower than they might run on an Alpha, MIPS, or some other platform.

As a result, non-Intel platforms fell away as Windows NT progressed, leaving only the Alpha version of Windows NT as an emotionally viable non-Intel platform (though even interest in the Alpha remains limited, despite its superior performance). By promoting cross-platform support, Microsoft was competing with the very concept that led to its widespread success.

Of the two-pronged attack, it was the first prong that bent back and jabbed Microsoft in the eye.

Dysfunctional equivalent
Second, Microsoft promoted Windows NT as if it were essentially equivalent to Unix technologically. In some cases, Microsoft promoted Windows NT on Intel as superior to Unix/RISC machines. If this wasn't an embarrassing mistake, it should have been.

In a March 1996 interview with InfoWorld, Bill Gates said, "Compare the performance: Buy the most expensive Sun [Microsystems Inc.] box you can and compare its Web performance to an inexpensive Windows NT box. Let's not joke around: Pentium Pro processors have more performance than the RISC community is putting out. I'm not talking about price/performance; I'm talking about performance in the absolute."

This bravado may sound convincing to customers lacking experience with Unix/RISC platforms. But everyone else understands this to be nonsense. Surely no one in his right mind would expect a cheap Windows NT box to outperform the most expensive Sun box (currently the Sun Microsystems Enterprise 10000 Server with 64 UltraSPARC processors).

One need not perform a lab comparison to extrapolate the results. An InfoWorld Web server benchmark showed that a dual Pentium Pro machine running Windows NT pegged the CPU usage while a single-processor Sun SPARC workstation barely broke a sweat.

The problem is with both the software and the hardware. Years ago, PC Week demonstrated that OS/2 and NetWare could outperform Windows NT on a single-processor machine, even when NT was given more processors to work with. InfoWorld's internal unpublished testing once demonstrated that Windows NT and SQL Server consistently crashed under high-stress loads while running on four Pentium Pro processors. Yet a single-processor IBM AS/400 machine running OS/400 and DB2 hummed along under the same load. (As one might expect, the single-processor AS/400 ran slower than the quad-processor NT box at lighter loads.)

Thus, despite the Gates bravado, Windows NT on any platform cannot yet compete with the high-end Unix/RISC machines. Result: Prong two bent back, poking Microsoft in the other eye.

Technically speaking
The pundits and analysts expecting NT to overtake Unix have one hope left: If Moore's Law continues to apply, it will eventually allow Windows NT to compete with high-end Unix/RISC boxes on performance while beating high-end RISC boxes on price. This brings us back to the original reason pundits predicted NT would eventually overtake Unix. But there are a number of flaws in this prediction, because it neglects to take the following into account:

  • Unix in general is making a comeback due to the Internet
  • Unix on Intel is invading IT both overtly and covertly
  • Premature competition between Windows NT vs. Unix on Intel is exposing flaws in NT
  • Poor timing is causing Moore's Law to work against NT, since Moore's Law helps the emerging Unix competition as much as it does NT. (Many thanks to the readers who wanted this point clarified. --Ed.)
  • Competitive threats on other fronts are pushing NT in conflicting directions

Unix is making a comeback on several fronts. The Internet is Unix. It was built on Unix. It has drawn attention to Unix. And today's college graduates were weaned on Unix as a result. This is the generation that is now entering the IT workplace. And eventually they will be IT managers.

This leads to the second stage of the Unix comeback. Unix is now growing rapidly on Intel platforms. And these Intel-based versions of Unix have a remarkable amount of free and commercial software support (see the Resources section below for a link to a list of commercial applications for Linux).

Yesterday's college students built their Unix expertise on Linux and FreeBSD. Today they're working in IT departments, and many of them are openly hostile to both Microsoft and Windows NT. As a result, Linux, BSD, Solaris, and other forms of Unix are finding their way into IT departments, both overtly and on the sly.

For example, are you sure that's an NT server you're connecting to at work? IS employees in many corporations have secretly installed Unix servers that provide native NT services. Why take such a risk? Linux and FreeBSD are free, as is SAMBA, the software that provides NT services. So the IS department saves money. And managers are unlikely to find out Unix is behind the scenes because fewer people will complain about server downtime.
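
As a minimal sketch of what this looks like in practice (the workgroup name and share path here are hypothetical), a few lines in SAMBA's smb.conf file are enough to make a Unix box advertise file shares that Windows clients treat as just another NT-style server:

    [global]
       ; match the Windows workgroup or domain the clients expect
       workgroup = ACCOUNTING
       ; authenticate individual users, as NT clients assume
       security = user

    [reports]
       ; a Unix directory exported as a writable share
       path = /home/samba/reports
       read only = no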

Fewer people will complain because the servers are more stable than Windows NT. Linux, FreeBSD, and BSDI Unix outperform Windows NT by a wide margin on limited hardware, and under some circumstances can perform as well or better than NT on the best hardware. Once behind in scalability features, Unix on Intel is catching up and may soon surpass NT in the number of processors it can use, and how it uses them.

Technicalities
Meanwhile, Windows NT already loses on many other competitive issues. Linux, FreeBSD, and other forms of Unix can be configured as a firewall right out of the box. Windows NT cannot. Free Unix operating systems have built-in features like IP masquerading. Windows NT doesn't even do basic IP filtering without additional software.
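
For instance, on a Linux 2.0 system the stock ipfwadm utility can turn a machine with two network interfaces into a basic masquerading firewall in two commands (the internal network address below is a hypothetical example):

    # Deny all forwarded traffic by default ...
    ipfwadm -F -p deny
    # ... then masquerade traffic from the internal LAN out to the world
    ipfwadm -F -a m -S 192.168.1.0/24 -D 0.0.0.0/0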

Unix comes with one or more command-line shells that support sophisticated scripting languages with easy access to its network utilities. This is often the most efficient way to automate complex administration tasks. Windows NT has no comparable capability (batch files are no match for shell scripts). You can fully administer a Unix server from any station that supports Telnet. Windows NT doesn't provide enough command-line tools to make this possible, even if you could Telnet into a Windows NT server (you cannot by default, though a Telnet daemon is available for NT).
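
To illustrate the kind of scripting involved (the administrator's address and the 90 percent threshold are arbitrary examples), a few lines of Bourne shell combine standard utilities to watch disk usage and mail a warning when a filesystem fills up -- drop it into cron and it runs unattended on virtually any Unix:

    #!/bin/sh
    # Mail the administrator a list of filesystems over 90 percent full.
    REPORT=`df -k | awk 'NR > 1 && $5 + 0 > 90 { print $6 " is at " $5 }'`
    if [ -n "$REPORT" ]; then
        echo "$REPORT" | mail -s "Disk space warning" admin@example.com
    fi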

Some versions of Unix (Linux, for example) support loadable device modules. This means you can boot Linux and reconfigure its support for hardware and software on the fly. For example, you can boot Linux without support for the SCSI card you have installed. You simply load support for that SCSI card when you need to access one or more of the SCSI-connected devices, such as an optical disk for backup. You can unload the SCSI driver when you're finished. You can also freely load and unload support for sound cards, network cards -- even filesystems such as HPFS, FAT, VFAT, and others (an NTFS driver is in the works).
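
With the standard Linux module utilities, the SCSI scenario above looks something like this (aha1542 is the driver for one common Adaptec adapter; module names vary with kernel and hardware):

    insmod aha1542    # load the SCSI driver only when it's needed
    lsmod             # confirm the module is now resident
    # ... mount the SCSI-attached optical disk and run the backup ...
    rmmod aha1542     # unload the driver when finished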

Any Unix with loadable module support is therefore by nature more appropriate for a server environment, because almost no configuration change requires a system restart.

Windows NT doesn't even come close. Even insignificant changes to a Windows NT configuration require (or at least request) a shutdown and reboot before they take effect. Change the IP address of your default gateway and you need to reboot. You can't even change the type of modem you use for a dial-up PPP connection without rebooting to update the system. None of these limitations exist in Unix.

While NT has a few advantages over Unix (it has a more flexible security model for its NTFS filesystem, for example), one could go on almost indefinitely about its disadvantages. Windows NT suffers from design flaws ranging from its annoying and inexcusable handling of system DLLs to a dangerous kernel model that invites driver crashes.

A course in collision
The problems for Microsoft don't stop there, even if one assumes Windows will continue to dominate on Intel. While Intel-based hardware continues to advance, RISC hardware continues to drop in price. As Windows NT continues to grow in size, instability, and price, Unix continues to become more mature, more streamlined, and less expensive.

When pundits predicted Windows NT would overtake Unix due to Moore's Law, they assumed Microsoft would lower prices, not raise them. And they assumed manufacturers of high-priced RISC platforms would never give up their high profit margins. The evidence is already in: Microsoft is raising its prices rapidly, even to the point of eliminating concurrent licensing on some of its products. And evidence is trickling in that some vendors of high-end equipment are prepared to lower their margins and support Intel machines with their operating systems.

No matter how you look at it, Windows NT is certainly on a collision course with Unix. It must compete with Unix at a technical level, and in addition face the issues we covered last month -- the efficiency and cost reductions of network-centric computing.

As the two forces converge, Microsoft must address problems of stability, architecture, speed, and manageability. Microsoft must also make Windows NT multiuser and provide remote access to applications. And it must do so by patching a system that was not designed for either. Worst of all, it must find a way to own Internet standards or make Windows NT conform to Internet standards. And while Microsoft addresses all of the above, it keeps its monopoly as priority number one.

Is it safe?
None of the above goals are trivial, and some of them are in direct conflict with Microsoft's priorities and techniques. This brings us back to the premise of this series of analysis articles: The future of Windows NT is threatened less by the superiority of its competition than by the inferiority of Windows NT itself, which results from Microsoft's misplaced priorities. As we demonstrated in the first installment, Microsoft's design decisions are driven more by its attempt to protect its desktop monopoly than by technical excellence.

As evidence, Windows NT is less stable than Unix because it is more vulnerable to clashing shared libraries (DLL conflicts). But it is only left vulnerable in this way because Microsoft likes to overwrite existing system DLLs with its applications (thus secretly "upgrading" the operating system in ways no competitor would dare to do) to gain unfair leverage against its competition. "Fixing" the DLL problem would be technically simple. It just isn't desirable from Microsoft's perspective.

In addition, Windows NT has a dangerous driver model because it is willing to sacrifice stability for speed in an attempt to win benchmarks against competing operating systems.

Until now, these compromises have worked because Microsoft's domain has been limited to the desktop. It is only now beginning to infiltrate the departmental server market, and is attempting to challenge higher-end systems. And as Intel-based Unix draws attention to the differences in quality between NT and Unix, the prospect of a wholesale switch to NT is looking less and less appealing.

Can Microsoft win this battle based on the "safety" factor alone? In other words, will the "Nobody got fired for buying Microsoft" rule take precedence over quality as Microsoft enters these new, high-end markets?

In our attempt to answer that question, next month we'll examine in more detail the architectural issues Microsoft must face. And we'll shed some light on the way Microsoft combats Unix -- free Unix in particular -- in the support arena and in its continuing battle in the press and PR.



Resources


About the author
Nicholas Petreley is editor-in-chief of NC World and a columnist for InfoWorld and NT World Japan. Read his column "Down to the Wire" in InfoWorld. Reach Nicholas at nicholas.petreley@ncworldmag.com.


APRIL  1998
 
 
