(LinuxWorld) -- We examine two different scenarios in an attempt to find when it's smarter to choose Unix or Windows.
- In scenario 1, a college administrator chooses Unix or Windows for a 500-student system.
- In scenario 2, the board of a 5,000-user manufacturing company looks at that same decision.
Under each scenario, we examine some of the direct and indirect consequences of the decision. We consider, for example, the impact the college's decision has on parental support costs for students as well as the more obvious costs to the college. At the manufacturing company, we look at costs and productivity, and consider how the board's decision affects the CIO's role.
Case 1: Student system
We picked the college situation because its requirements are relatively simple with few management issues to consider. The administrator wants to minimize costs while giving the students access to standard tools, including word processors, Web browsers, and e-mail. The school also needs to allow staff to communicate with each other and students using Web servers, shared files, and e-mail. The school seeks to avoid security or other systems-related crises that would detract from the business of the college, which is teaching.
In an all-Windows situation, those requirements spell out a need for a rackmount of four small servers and 500 desktop PCs. All would run a version of Windows 2000 with Back Office on the server and Office 2000 on the desktops. The installed cost to the school totals about $1 million.
The at-home cost imposed by the college's decision is similar, except that parents don't pay for servers directly and security and other support issues are absorbed in uncosted student time. If about 20 percent of students share a machine, parents and students end up buying 400 PCs. We chose the Dell Dimension 2100 with a 15-inch screen, Windows XP and Office 2000, and 256 megabytes of memory. Ignoring communication gear and related issues, these cost parents a cumulative $574,400 at start-up.
Assuming all students use Microsoft products for home use, the total cost of the college's Windows decision totals about $1.6 million.
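For readers who want to retrace the arithmetic, here's a rough sketch of how those figures combine; the unit prices come from the table that follows, and the 20 percent sharing assumption is the one described above:

```python
# Back-of-the-envelope recap of the Windows start-up figures (unit prices come
# from the table below; the 20 percent home-sharing assumption is the article's).
school_seats = 504                       # desktop seats the table prices out
desktop = 1_219 + 479                    # Dell GX150 plus Office 2000 per seat
servers = 84_421 + 3_999 + 499 * 232     # Poweredge rack + Back Office + 499 CALs
school_total = school_seats * desktop + servers

home_pcs = 400                           # 500 students, with about 20 percent sharing
home_total = home_pcs * 1_436            # Dimension 2100 with XP and Office

print(school_total)                # 1059980
print(home_total)                  # 574400
print(school_total + home_total)   # 1634380
```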
| Hardware | Software | Unit Cost | Total Cost |
| --- | --- | --- | --- |
| At work: 17-inch Dell GX150; 128 MB of RAM, 20 GB hard disk, 900-MHz Celeron | Windows 2000; Office 2000 | $1,219; $479 | $855,792 (504 users) |
| At home: 15-inch Dell 2100 with 256 MB of RAM & network card | XP/Office | $1,436 | $574,400 (400 users because of some sharing) |
| Rack of four Dell Poweredge 2550; 2 x 1-GHz PIII, 2 GB of RAM, 36 GB disk, dual controllers, 1 TB PowerVault shared storage | Windows 2000 Advanced Server; Back Office 2000; 499 CALs | $84,421; $3,999; $232 per CAL | $204,188 |
| School total | | | $1,059,980 |
| Parent total | | | $574,400 |
| Start-up total | | | $1,634,380 |

(All prices are from the vendors' Web sites as of September 19, 2001.)
In the Unix alternative, the college meets student needs with a Sun 4800 server, an administration workstation, and 500 "smart displays" running bundled software, including OpenOffice, Netscape, and the usual suite of Unix communication services for a total cost of about $630,000.
What's a smart display? It is a desktop graphics device intended solely to handle the user interface components of an application running on a server. Most provide and extend the capabilities of such predecessors as the "dumb" terminal, the X terminal, and the Microsoft "thin client." Some feature JavaOS and local browser execution.
Parents will still want to provide a home computing device of some kind. Here we assume parents and students would buy a Linux- or BSD-based PC using the same set of free software the college chooses. This eases integration while enabling the student to read and write Microsoft file formats for interchange with students at other schools.
The immediate software savings, relative to the Windows alternative, would allow students to buy a better-quality machine with a larger screen for less money. Choosing, for example, a 17-inch Dell Optiplex (instead of the 15-inch Dimension 2100) at $1,219, plus $50 for a Caldera Linux CD, costs about 12 percent less than the Windows configuration. This creates a total out-of-pocket cost of $507,600.
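The same arithmetic, sketched for the Unix side (prices again from the table that follows):

```python
# Sketch of the Unix-side start-up arithmetic (prices from the table below).
school_total = 621_470 + 5_345     # SunRays plus Sun 4800 bundle, plus admin workstation
home_unit = 1_219 + 50             # 17-inch Optiplex plus a Caldera Linux CD
home_total = 400 * home_unit       # same 400 home machines as in the Windows case

print(school_total)                       # 626815
print(home_total)                         # 507600
print(school_total + home_total)          # 1134415
print(round(1 - home_unit / 1_436, 2))    # 0.12 -- the per-household saving vs. Windows
```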
| Hardware | Software | Unit Cost | Total Cost |
| --- | --- | --- | --- |
| At work: 500 x 17-inch SunRay smart displays and one Sun 4800 with 12 GB of RAM, 2 x 750-MHz CPUs, 2 TB of SCSI disk | Solaris with all needed application software | $621,470 | $621,470 |
| Administration workstation: SPARCstation 10; 512 MB of RAM, 2 x 20 GB disk, CD-ROM & floppy drives, 21-inch monitor | Solaris | $5,345 | $5,345 |
| At home: 17-inch Dell Optiplex with Caldera Linux | OpenOffice and related open source tools | $1,269 | $507,600 |
| Cost to college | | | $626,815 |
| Cost to parents | | | $507,600 |
| Start-up total | | | $1,134,415 |
When we look at five-year costs for these two configurations, three things stand out:
- The school using Unix can reasonably expect to achieve nearly perfect system reliability while maintaining relative immunity to student attacks. Only hardware failure or serious administrator error can bring the Unix system to a stop. As a result, the Unix operation will fade into the background to become something that, like the telephone system, just works and can therefore be ignored by college management. The school choosing Windows is, in contrast, committing itself to a far more complex environment in which systems failure is a daily reality and student access to the Windows desktop opens everything to easy insider attack. As a result, this choice will impose a continuous drain on management time and energy as managers confront one crisis after another.
- The Unix administration job is really part-time although, in practice, it would be filled as a full-time position and the person hired would find additional ways to contribute to the college. The Windows-based solution, by contrast, will be under-supported with four full-time staff and will lead to a serious loss of productivity among other professionals as they become part-time PC support people.
- It makes sense to amortize the Unix investment over five years, but it would defy experience to do that for Windows.
We can reasonably expect that the experience of the next five years will reflect that of the previous five. A Sun 5500 server bought in 1996 to support 200 X-terminal users would still be in use today, albeit with upgraded applications and a later Solaris release. In contrast, someone who bought a Windows networking system for 200 users in 1996 would have been forced to upgrade both his servers and his desktop hardware at least once, and more likely twice, in that period and would now be facing yet another forced march to new hardware and software to cope with the XP/Net generation.
This difference in product life reflects a very fundamental difference in product development strategies. Unix is built on solid theoretical foundations: technical progress is faster and goes further because the next generation extends, but does not replace, the current generation. Hardware gets faster and software grows more powerful, but the human investment in learning to use the system effectively is not obsoleted each time a new product generation is released. That's why the Kernighan and Pike classic, The Unix Programming Environment (Prentice Hall, 1984), is valuable to a Linux user today, and also why it is possible (although not terribly practical) to run the hottest new Solaris 2.8 applications on a 1989 Sun IPX.
In contrast, the Windows approach is to use each new generation to obsolete the previous one while maintaining brand continuity. That's why last year's book on Windows/ME is useless to today's Windows/XP buyer, and why an expert on Windows 95 networking would first have had to abandon NetBEUI for DECnet to cope with Windows/NT and would now have to abandon that skill set to learn the basic Unix networking built into Windows/XP.
Churn is a major contributor to the long-term direct cost of ownership for Windows users. That, coupled with staffing demands, is why the roughly 31 percent cost advantage Unix holds at purchase time grows to about 67 percent over the first five years.
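Here's a rough sketch of the five-year arithmetic behind the table that follows, using its salary and refresh assumptions (and assuming, as the table does, that the Unix side needs no software or hardware refresh):

```python
# Five-year totals behind the table below, using the article's salary and
# refresh assumptions (Unix is assumed to need no software or hardware refresh).
win_capital, unix_capital = 1_634_380, 1_134_415
win_staff = 4 * 45_000 * 5                   # four Windows support staff for five years
unix_staff = 1 * 65_000 * 5                  # one Unix administrator for five years
sw_refresh = 2 * 479 * 504 + 2 * 400 * 349   # two Office refreshes, school and home copies
hw_refresh = 904 * 1_219                     # one refresh of all 904 PCs

win_total = win_capital + win_staff + sw_refresh + hw_refresh
unix_total = unix_capital + unix_staff
print(win_total, unix_total)                 # 4398388 1459415
print(round(1 - unix_total / win_total, 2))  # 0.67 -- the five-year saving with Unix
```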
| Cost Source | Windows Basis | Windows Estimate | Unix Basis | Unix Estimate | Percent Savings with Unix |
| --- | --- | --- | --- | --- | --- |
| Initial capital cost | | $1,634,380 | | $1,134,415 | 31% |
| Support staff | 4 x $45,000 x 5 years | $900,000 | 1 x $65,000 x 5 years | $325,000 | 64% |
| 24-month software refresh | 2 x $479 x 504; 2 x $400 x 349 | $482,832; $279,200 | | | 100% |
| 36-month hardware refresh | 904 x $1,219 | $1,101,976 | | | 100% |
| Parent total | | $1,341,209 | | $507,600 | 62% |
| School total | | $3,057,179 | | $951,815 | 69% |
| Total estimate | | $4,398,388 | | $1,459,415 | 67% |
Case 2: 5,000-user manufacturing operation
The 5,000-user manufacturing company example is more complex, not because of hardware or software issues, but because this comparison gets into the more important, but largely unquantifiable, areas of the overall value of systems to a company and the managerial impact of a change in systems technology.
Since we don't have a good source for estimates of the cost of the PeopleSoft suite, which we selected for its client independence, we assumed it's the same for both architectures. This probably isn't right: DB2 for Solaris, for example, is much cheaper than SQL Server on a rackmount of Compaq Proliants, and, of course, the whole client-licensing issue doesn't exist for the Unix alternative.
That said, the Windows capital cost totals about $11 million. This breaks down to $2,157 per seat before application software, wiring, installation, communications, or additional PC and server software licensing.
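A rough sketch of how that capital estimate adds up, using the component figures from the table that follows:

```python
# Sketch of the Windows capital estimate for the 5,200-seat roll-out
# (component figures from the table below).
desktops = 5_200 * (1_219 + 479)       # GX150 plus Windows 2000/Office 2000 per seat
servers  = 849_012                     # eight Proliant 8000s in four racks, per the table
storage  = 8 * 39_995                  # StorageWorks arrays
comms    = 4 * 3_999 + 5_195 * 232     # Back Office servers plus client access licenses

total = desktops + servers + storage + comms
print(total)            # 11219808
print(total // 5_200)   # 2157 -- per-seat cost before application software
```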
| Hardware | Software | Estimated Unit Cost | Estimated Total Cost |
| --- | --- | --- | --- |
| 17-inch Dell GX150; 128 MB of RAM, 20 GB disk, 900-MHz Celeron | Windows 2000; Office 2000 (5,200 users) | $1,219; $479 | $8,829,600 |
| Set of eight servers in four racks: Compaq Proliant 8000, 8 GB of RAM, 8 x 700-MHz Xeon, 4 x 36 GB internal on Dataguard controller, 2 external smart controllers, R6000 UPS; Compaq 42U racks; Compaq rack monitors | Windows 2000 Advanced Server; Microsoft Operations Manager with application pack for 64 CPUs | 8 x $86,923; 4 x $5,439; 4 x $1,625; 4 x $2,500; 64 x $1,798 | $849,012 |
| StorageWorks 4354R: 12 mirrored 72-GB drives (432 GB net) plus 2 DLT tape drives (110/220 GB) | | 8 x $39,995 | $319,960 |
| Communications and related | Estimated as Back Office with 5,200 CALs | 4 x $3,999; 5,195 x $232 | $1,221,236 |
| Start-up total | | | $11,219,808 |
We create two data centers for the Unix setting. Each has separate administration and networking, with two Sun 6800 servers and 4 terabytes of shared storage per pair. In normal operation, each data center runs roughly half the load, spreading its share across both of its machines; together the four servers provide about 60 percent more cycles than the Windows alternative while running most tasks at memory speeds.
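For the curious, here's one plausible way to arrive at that 60 percent figure. The tables give only CPU counts for the Sun 6800s, so the 750-MHz clock assumed below is our guess, and aggregate megahertz is admittedly a crude yardstick:

```python
# One plausible reading of the "60 percent more cycles" claim. The 750-MHz
# clock for the Sun side is our assumption; the Windows side comes from the
# eight 8-way, 700-MHz Proliant 8000s in the table above.
windows_mhz = 8 * 8 * 700                     # eight Proliants, 8 x 700-MHz Xeons each
unix_mhz = 2 * 2 * 24 * 750                   # two data centers, two 24-CPU 6800s each
print(round(unix_mhz / windows_mhz - 1, 2))   # 0.61 -- roughly 60 percent more raw cycles
```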
This architecture is doubly redundant. One machine in each data center or one data center can be shut down without stopping, or even seriously affecting, corporate processing.
We use 21-inch NCD NC900s for the 2,000 or so office workers here instead of the 17-inch SunRays. The NCDs sport faster graphics, more screen space, and higher-resolution monitors.
| Hardware | Software | Estimated Unit Cost | Estimated Total Cost |
| --- | --- | --- | --- |
| NCD NC900 with 21-inch monitor | NCDware | 2,000 x $1,825 | $3,650,000 |
| SunRay with 17-inch monitor | Solaris | 3,000 x $600 | $1,800,000 |
| Two sets of two Sun 6800s; 24 CPUs, 48 GB of RAM, 4 x 36 GB disk, four controllers each | Solaris, including all needed tools and applications | 4 x $892,000 | $3,568,000 |
| 2 x T3 storage arrays with 5.2 TB of disk and 1 GB of cache per controller | | | $947,000 |
| Start-up total | | | $9,965,000 |
This structure produces an initial capital cost estimate that's about 12 percent less than Windows. The Unix system, however, offers larger screens and far more processing power, and it is doubly redundant while the Windows system isn't redundant at all.
In operation, a Windows-based client-server system of this complexity will be staffed at a user:support ratio of about 30:1, and so needs about 165 full-time support people plus a base Information Systems (IS) staff of about 35 for single-shift operation.
The Unix setting, in contrast, needs two groups of 20 people in the data centers and a staff of perhaps five. This totals 45 IS staffers for 24 x 7 operation.
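Sketching out that staffing arithmetic, with the 30:1 ratio and data-center group sizes taken as given above:

```python
# Staffing arithmetic behind the support rows in the long-term cost table below.
users = 5_000
windows_support = users // 30        # ~166 at a 30:1 ratio; the article rounds to "about 165"
windows_is_base = 35                 # base IS staff, single-shift operation
unix_staff = 2 * 20 + 5              # two data-center groups of 20 plus about five core staff

print(windows_support + windows_is_base)   # about 200 Windows-side staff in total
print(unix_staff)                          # 45, covering 24 x 7 operation
```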
Ignoring maintenance, space, power, and telecommunications as minor issues, long-term costs stack up as follows:
| Cost Source | Windows Basis | Windows Estimate | Unix Basis | Unix Estimate | Percent Savings with Unix |
| --- | --- | --- | --- | --- | --- |
| Initial capital cost | | $11,219,808 | | $9,965,000 | 12% |
| Support staff | 165 x $45,000 x 5 years; 35 x $60,000 | $37,125,000; $2,100,000 | 45 x $65,000 x 5 years | $14,625,000 | 63% |
| 24-month software refresh | 2 x $3,585,000 | $7,170,000 | | | 100% |
| 36-month hardware refresh | | $7,263,972 | | | 100% |
| Total estimate | | $64,878,780 | | $24,590,000 | 62% |
The long-term Unix cost estimate totals 38 percent of the Windows cost.
The impact on home computer users is, of course, about the same as in the college case and works out to about $1,735 per household for the period. Assuming that all 2,000 office workers have home systems and switch to Linux, the corporate decision to use Unix with smart displays would save these employees a cumulative $3.4 million, or about 53 percent of what the Windows path would cost them.
The corporate saving from the Unix architecture looks like a big number -- getting $40,288,000 would certainly make our day -- but from the board's perspective it doesn't amount to much relative to daily operations. The point, after all, is not to pick the cheaper system but to pick the better one.
To the board, the business is a machine for making money. The Unix versus Windows decision, from the board's perspective, will not be determined by what each costs, but by how these choices affect the operational effectiveness of the business. Spending $1,000 is better than spending $100 if it produces a larger return or achieves substantially better results with respect to intangibles, such as information security or protection from catastrophic systems failure.
The productivity paradox & technology choices
Sidebar: Bad decisions & the paradox
The quality of board decisions should show up as changes in the return the company offers its shareholders. At the company level, such clarity is relatively rare. Although failures like Hershey's ill-fated SAP implementation get a lot of publicity, most of the effects of such choices by individual boards are hidden from public view by obfuscating factors such as market change, personnel change, or technology change in an industry.
One place the effects do show up is in aggregate economic statistics that average out inter-industry effects over time and across sectors. In 1987, Robert Solow, the Nobel Prize-winning MIT economist, said, "You can see the computer revolution everywhere except in the productivity statistics." His quip highlighted what has become known as the IT productivity paradox.
The paradox is that national accounts show trillions being poured into information technology but don't show a commensurate growth in productivity.
True believers, therefore, question the aggregate statistics or find other ways of arguing that the results do not reflect reality. It is simpler to believe, however, that the numbers are right and that most boards, which usually rubber-stamp internally vetted proposals, make bad IS decisions.
In our example, board members need to consider two critical issues before making their decision:
- Will the business become more productive and make a better return on investment if it uses Windows or Unix?
- Does this decision affect the basic structure -- organization, compensation, controls -- of the business? If so, is one set of outcomes preferable to the other?
No complete answer can be given to these difficult questions. We can start by asking two simpler questions:
- What is the relative productivity impact of Windows versus Unix in the enterprise?
- How does the Unix versus Windows decision affect the role and function of IT in the organization?
The productivity effect of failure reduction
Since the enterprise software selected, PeopleSoft, is client-independent, we can focus here on the productivity effects of the two delivery mechanisms -- Unix with smart displays or Windows client-server. We know Windows costs about $40 million more, but how does it affect user productivity?
According to the Microsoft-sponsored NSTL Test Report: Microsoft Windows 2000 Professional Comparison of the Reliability of Desktop Operating Systems, Windows 2000 Professional has an MTTF (mean time to failure) of approximately 2,900 hours.
You should also, however, consult Bugtoaster.com for a different result. Bugtoaster asks people to download and use a utility that reports the nature and cause of each system crash the user encounters. Bugtoaster results suggest failures occur at least once every 233 hours.
Bugtoaster does not report operating hours by OS. The number of PCs reported running Windows 2000 SP2 increased from 1,451 at noon on October 7 to 1,556 at noon on October 15. During those 192 hours, the number of reported crashes increased from 4,915 to 6,200. We don't know when the 105 individual PCs were added to the database. If all had been added at the beginning, the numbers would reflect 1,285 crashes in 298,752 operating hours, or one every 232.5 hours.
According to Microsoft's numbers, our manufacturing company can expect 13 desktop failures each day and one significant server failure every 15 days. The more realistic Bugtoaster number works out to about 161 desktop application crashes per day.
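Here's a rough sketch of that failure arithmetic. The Bugtoaster mean time to failure follows directly from the counts above; the usage assumptions -- 5,000 active desktops at 7.5 hours a day and eight servers running around the clock -- are ours, chosen to show how per-day figures of this magnitude can be reached:

```python
# Failure-rate arithmetic. The Bugtoaster MTTF follows the article's own counts;
# the usage assumptions (5,000 active desktops at 7.5 hours a day, eight servers
# running around the clock) are ours.
crashes = 6_200 - 4_915              # new crashes reported over the eight days
pc_hours = 1_556 * 192               # if every reporting PC had been active the whole time
print(round(pc_hours / crashes, 1))  # 232.5 hours between crashes

desktop_hours_per_day = 5_000 * 7.5
print(round(desktop_hours_per_day / 2_900))    # ~13 failures/day at Microsoft's 2,900-hour MTTF
print(round(desktop_hours_per_day / 232.5))    # ~161 crashes/day at the Bugtoaster rate
print(round(2_900 / (8 * 24), 1))              # ~15.1 days between failures across eight servers
```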
Each failure generates a crisis of some kind. Many result in partially completed transactions or loss of data. Each hurts user productivity both at the moment and in terms of the user's trust in, and so commitment to, the system.
One of the less appreciated, but more expensive, consequences of this type of daily failure coupled with continuous Windows churn is that it becomes virtually impossible to separate systems support -- meaning Windows desktop, networking, and server support -- from application support.
A user hits "Ctrl-Shift-F3" and the PC draws an empty box instead of what's expected. What went wrong? Did the user load something that changed the PC's settings or libraries? Did one of the servers hiccup? Is this a transient network issue? Should the user have typed "Ctrl-Alt-F3"? There's usually no easy way to tell. Consequently, first-line application support devolves to the Windows support staff. That has two negative, unavoidable consequences:
- Users avoid experimenting with the system or the application. As a result, they never learn to use either effectively.
- Support staff learn an application's basics and try to train users. Since they don't know the user's job, and so can't understand the application, this help is superficial and rapidly institutionalizes easily found, but ineffective, solutions to common problems that would be better addressed through functions that exist but go undiscovered.
By contrast, Unix users will be unaffected by server failures and see, company-wide, about one smart display failure every two weeks. When a smart display does fail, a user can continue to work on another device with no loss of data or other work. Smart-display failures will not cause any kind of crisis.
There is no desktop operating system to support in the smart display environment. No support people get between users and the application. That results in a very clear demarcation of responsibilities, with the systems group delivering application processing, the application vendor responsible for functionality, and lead users responsible for teaching others how to use the application.
Since these lead users know the job and are committed to the application, the direct effect is that more users learn to use more functions more effectively. The indirect effect is to build lasting trust between the parties.
The cost consequences of these differences have never been measured formally, but they dwarf most other elements of the total cost of ownership. To have 5,000 users, the company would be about the size of PolyOne, a plastics and resins maker with about 9,000 total employees based in Cleveland, Ohio. This company generates annual revenues in the range of $3 billion. For a company this size, a 1 percent decrease in productivity due to user resistance and failure to use the application suite properly amounts to a $150 million revenue hit in five years.
If we assume a 30 percent contribution margin this means the company's operating earnings will be reduced by the direct cost of Windows ($64 million) plus the lost contribution margin (30 percent of $150 million) for a net reduction in operating earnings of $109 million.
Conversely, a 1 percent productivity increase due to effective use of the software, made possible by the absence of failures and the quality effects of peer-group-based, rather than systems-based application support, leaves the company with about $21 million in net gains in operating earnings. (We get to this by subtracting $24 million in costs from the 30 percent contribution margin that comes from $150 million in productivity gains.)
In other words, a 1 percent estimate of the productivity impact produces an estimate that the Unix decision is better, over the five years, not by the $40 million difference in costs, but by $130 million in operating earnings.
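For readers who want to check the arithmetic, here's the 1 percent scenario worked through with the assumptions above -- roughly $3 billion in annual revenue and a 30 percent contribution margin:

```python
# The 1 percent productivity scenario in numbers, using the article's assumptions:
# roughly $3 billion in annual revenue and a 30 percent contribution margin.
revenue_5yr = 3_000_000_000 * 5
swing = 0.01 * revenue_5yr                     # $150M of revenue over five years
margin = 0.30
windows_cost, unix_cost = 64_000_000, 24_000_000

windows_hit = windows_cost + margin * swing    # direct cost plus lost contribution
unix_gain = margin * swing - unix_cost         # contribution gained minus direct cost
print(windows_hit / 1e6)                       # 109.0 -- million dollars off operating earnings
print(unix_gain / 1e6)                         # 21.0  -- million dollars added
print((windows_hit + unix_gain) / 1e6)         # 130.0 -- the operating-earnings gap
```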
More realistic assumptions, such as those that include costs from Windows-induced downtime, produce larger estimates. For example, a 2.5 percent productivity variance produces an impact on operating earnings in the range of $750 million over five years.
The impact on the role of IT in the organization
A massive productivity benefit is one thing, but what are the implications for the business design? Will Unix use lead to a smoother, more efficiently functioning business? Does a Unix decision make a different contribution than a Windows decision would?
From the board's perspective, organizational design is about aligning incentives with desired results. That's easy for areas like Manufacturing or Marketing, where an obvious measure such as defect rates or sales relates naturally to growth in the power, prestige, and earnings of the executives involved. It is much more difficult, however, for the CIO role. The cost-cutting mandate normally given to IS contradicts both its company-wide service mandate and the normal incentives for budget and staffing growth felt by all executives.
The organizationally right answer is to align the IS department with the organization's revenue producers instead of with Finance, and then fund IS from operations instead of from overhead. This does not diminish the need to meet user expectations with respect to the most basic "automated clerking" functions that IS undertakes. On the contrary, it makes any failure to control costs consistently more visible. That's a good thing from an incentives perspective because it forces a CIO, who cannot grow on the revenue side without the kind of user trust that comes from meeting their expectations, to focus on cutting overhead while improving services.
Sidebar: Moving to Unix
In the real world, you usually don't start out with a clear choice: Unix or Windows for 5,000 users with no previous IS support. Instead, you take over an existing infrastructure and plan for a long-term transition in which you revitalize the IS staff, develop trust with users, and reeducate senior management.
It is difficult to move a data center from a mixed or proprietary environment to Unix. That process is the subject of my book The Unix Guide to Defenestration and requires far more than technical change. The challenge is to change minds, not just technology.
Using Unix enables but does not ensure success. Get people thinking in terms consistent with core Unix ideas and you'll succeed. Apply ideas and reflexes learned in proprietary environments like Windows or OS/400 to Unix and you will soon have an expensive, dysfunctional mess.
What is needed to meet this requirement is a simple way to make the CIO's ability to pursue his revenue-side opportunities contingent on reducing overhead costs. That incentive structure comes naturally with Unix. Consider that the Unix CIO can move beyond service provision to revenue generation because he has two things going for him:
- He has his house in order. Job 1 in IS is always application processing -- and he's got that covered. To do it well and cheaply, he's relying on very fundamental Unix advantages in scalability, reliability, ease of administration, and access to advanced software running on smart displays.
- His people have developed the strong relationships with users that are critical to a partnership. Users know the applications work and that IS people, who in true Unix style are encouraged to work with users directly, are on their team.
For this CIO, the incentives are aligned properly. Growth takes place at the revenue edge of the business, not in overhead. Cutting overhead increases user confidence and frees resources for use in revenue generation. Increasing overhead creates diminishing returns as users lose confidence and growth resources are re-directed inward.
In contrast, the Windows CIO gets user social approval by going along with perceived wisdom, but must always disappoint because he can't deliver reliable application services. This isn't his fault. The same social pressures that cause users to approve of the Windows desktop preclude them from seeing it as the source of the problem. Unfortunately, this conflicted user perception has consequences: Users may like the CIO as a person but they won't partner with IS people on new revenue generation if they don't trust IS to meet their service expectations.
As a result, the Windows CIO can launch independent revenue initiatives, such as a Web site or selling homegrown software, but not merge his organization's efforts into those of the company as a whole. To do that he has to earn user trust -- but the technology and its organizational consequences prevent this.
At the technology level, each new Windows generation promises improved reliability and ease of administration. Development, however, is going in the wrong direction to achieve this. Instead, the Windows CIO is dragged along on an inexorable cost escalator as each new generation is more complex, more expensive, and more tightly integrated with other Microsoft-licensed products than the last one.
When Microsoft forces a Windows CIO to move from a 15-million-line desktop operating system to one with 40 million lines, the CIO is pushed in the wrong direction from the corporate viewpoint and the right direction from a personal one. To get reliability and costs under control, he needs to reduce, not add, complexity. The technology won't let him do that. What it will help him do is grow his share of overhead and increase his visibility in the executive suite. User unhappiness with his services, coupled with the social acceptability of the Windows approach, makes that easy. The Windows CIO can't simplify the user's desktop without replacing it with a smart display. He can't get Unix-like reliability and performance from his servers without converting them to Unix.
Once the Unix decision is made, the incentives produce both an effective service delivery mechanism and an aggressive, strategically focused IS team that enhances business flexibility while cutting costs.
There is no reasonable methodology for estimating the impact this component of total cost of ownership has on companies. It is significant, probably more so than anything else. Metaphorically, it amounts to the difference between operating the company as a sailboat dragging the IS department behind it, and operating that same boat with the systems group functioning as an additional sail. Although it is hard to measure, it spells the difference between success and failure for companies struggling to get system expenses under control.