Editor's Note: This is part of a series written by CIOs discussing their thought processes and lessons learned from major events in their tenures as CIO. Check back tomorrow for a companion post.
I was once in charge of IT operations at a company that had decided to defer as many capital expenses as possible. That meant all the VPs and directors went through our budgets and identified infrastructure upgrades or new systems that could be postponed for a year or two. I did my duty and found several projects that could wait. One of them was upgrading the emergency power generator that provided backup power for our datacenter and headquarters building. I soon wished I hadn't been asked to make that decision.
That summer turned out to be hotter than normal. One hot day in late August, we began experiencing brownouts in the office park where our headquarters was located. Suddenly, right after lunch, the power went off completely. The building went dark and quiet.
I stepped out of my office and looked down the hall toward the datacenter. To my relief, the lights were on in the datacenter and our systems stayed up. I should have heard the roar of the generator behind the building kicking in to supply the power we needed, but the generator never came on. We quickly realized our systems were running on the batteries of the UPS unit, which gave us about 30 minutes of power. If the commercial grid didn't come back in the next five minutes or so, we would have to start an orderly shutdown of all our systems while we still had time.
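In hindsight, the arithmetic of that moment was simple. Here is a minimal sketch of the decision window; the 30-minute battery runtime is from the incident, while the 25-minute shutdown estimate is an assumption I'm adding purely for illustration:

```python
# Back-of-the-envelope decision window during the outage.
# UPS_RUNTIME_MIN is from the incident; SHUTDOWN_TIME_MIN is an
# assumed figure for an orderly shutdown of every system.

UPS_RUNTIME_MIN = 30
SHUTDOWN_TIME_MIN = 25

def decision_window(runtime_min: float, shutdown_min: float) -> float:
    """Minutes we could wait for the grid before shutdown had to begin."""
    return runtime_min - shutdown_min

print(decision_window(UPS_RUNTIME_MIN, SHUTDOWN_TIME_MIN))  # -> 5
```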
I went down the hall to the datacenter and got ready to give the order to shut down our systems. That meant not only all the company's internal systems, such as ERP and HR, but also all our Internet systems: our website, our e-commerce system, and our EDI and other data links. If those went down, you know what would hit the fan in a big way, and I would be the one responsible. There were career-limiting consequences involved. We stood around thinking about what we had to do. I looked at my watch. There really is nothing to say at times like this.
After about three and a half long minutes, the power did come back on. We looked at each other and thanked our lucky stars. My knees were weak. Then we rushed out back to check on the generator. On the first Wednesday of every month, we ran a regularly scheduled test in which the generator was switched on and run for 30 minutes to make sure it was in working order. We couldn't understand why it hadn't worked when we really needed it.
Right away, we called a service technician, who came out to check. He found that, although the generator turned on when we ran our monthly tests, the failover switch that sensed when the commercial power grid went down was not working. We hadn't been testing that.
We replaced the switch immediately and ran another test, turning off power to the building to see whether the failover switch would trigger the generator to start up. The generator did start, but within minutes it overheated and shut itself down. Then we remembered that the generator had been installed years earlier to support a smaller operation. Since then, more people had been hired, and more gear, from servers to PCs to air conditioners, had been added. When the generator was subjected to that increased load, it choked.
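The underlying failure was simple capacity math: the total load had grown past what the generator was sized for. Here is a minimal sketch of that comparison; every kW figure is hypothetical, added only to illustrate the point:

```python
# Hypothetical figures; the point is the comparison, not the numbers.
GENERATOR_RATED_KW = 150  # assumed rating from the original installation

# Loads that accumulated over the years (all values assumed)
loads_kw = {
    "datacenter servers": 90,
    "office PCs": 45,
    "air conditioning": 60,  # heaviest on a hot August afternoon
}

total_kw = sum(loads_kw.values())
status = "over capacity" if total_kw > GENERATOR_RATED_KW else "within capacity"
print(f"Load {total_kw} kW vs. rated {GENERATOR_RATED_KW} kW: {status}")
```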
This started a sequence of tune-ups, repairs, and tinkering to try to get another year or two out of the old generator. We looked at having it power only the datacenter, but then the rest of headquarters would be dark and nobody could work anyway. None of the other managers liked that idea. After a couple of days and several more tests, I realized the old generator just could not be trusted, and I felt like I was tempting fate by delaying the decision to buy a new one.
I drew up a capital expense request for a new generator: not just a replacement for what was already there, but a larger generator, plus related upgrades to the UPS unit and the electrical wiring in the datacenter. I was about to learn a lesson in the logic and the politics of requesting a large, unbudgeted capital expense in a tough year, when profits are down and money is tight. Come back tomorrow to see what I learned.