In the mid-nineties, new directions in managerial accounting made cost center bookkeeping popular in large organizations. In this model, departments within an organization traded with one another almost like participants in a miniature economy, with some departments earning a net gain for the company and some departments delivering a net loss.
The profit centers – as the money-earning departments are called – get to call the shots, because ultimately they pay for everything. The loss-generators – called cost centers – have to follow along. Information Technology departments became the poster child for the cost center (’cause those computers are EXPENSIVE!), and IT managers became sometimes destructively frugal. Because bonuses were handed out based on keeping costs low, no expense escaped the razor.
In the early 2000s, Information Technology departments began to show that they were more than cost centers. With the popularity of the Internet, the web site suddenly became a profit center, throwing the model all out of whack. A faintly golden age of IT departments dawned, and IT managers effectively got anything they asked for.
We all know the story from then to now. The bubble burst (again). The IT departments are cost centers (again). The IT managers are forced to make tough decisions and do more with less (again). It’s like the nineties all over again.
Virtualization puts a lot more tools in the IT manager’s belt than before; that’s for certain. Cloud Computing – a natural outgrowth of virtualization and internetworking – offers even more flexibility because of an interesting shift in which bucket the costs land.
Here, we will look at Cloud Computing: a little history and some implementations. We’ll perform some economic and accounting analysis, and then see how an IT manager might implement cloud computing in his or her own environment.