Computers are a commodity, and not just the desktops at work but also the servers, as long as a certain price threshold is not exceeded. Prices keep falling relative to computing demands, and at an ever increasing rate, which adds to this effect. The consequence is that a new server is set up for almost any reason. The benefits are apparent: there is no interference with potentially critical systems, no dispute between business units about the sharing of costs, and so on. But there is no such thing as free beer. Every new machine adds load to the network, consumes electricity, dissipates heat, and needs constant monitoring and maintenance. These costs by far exceed the cost of the original purchase: by estimates of the big consulting companies, hardware accounts for just 20 to 33 percent of the total cost of ownership (TCO).

So why is this happening? Because complexity grows even faster than any other aspect. Add a new application to a running server, and you must respect the existing constraints (library versions, interdependencies, and so on) while working out which new constraints the application introduces. Then add another application. Soon, the accumulated constraints effectively rule out any configuration change, and the system becomes unmaintainable. So what can be done about it? That is the challenge of current IT.
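To make the constraint argument concrete, consider a minimal sketch of how version requirements accumulate on a shared server; the library name libfoo, the applications, and all version ranges below are hypothetical illustrations:

# Minimal sketch: version-range constraints accumulated on a shared server.
# All package names and version ranges are hypothetical.

def intersect(a, b):
    """Intersect two half-open version ranges (lo, hi); None if empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo < hi else None

# Each application pins the shared library to a range: (min_inclusive, max_exclusive).
constraints = {
    "app_a": (1.0, 1.1),   # app_a was only tested against libfoo 1.0.x
    "app_b": (1.1, 2.0),   # app_b needs an API introduced in libfoo 1.1
}

feasible = (0.0, float("inf"))
for app, rng in constraints.items():
    feasible = intersect(feasible, rng)
    if feasible is None:
        print(f"No libfoo version satisfies all applications once {app} is added")
        break
else:
    print(f"Any libfoo version in [{feasible[0]}, {feasible[1]}) satisfies everyone")

Each application alone is easy to satisfy; it is the intersection of all requirements that becomes empty. This is why the problem typically surfaces only after the second or third deployment, when the configuration is already in production.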