Sir Francis Bacon is credited with being the first to utter the phrase “Knowledge is power,” and despite it being a keen grasp of the obvious, who can argue with that? Examples of this maxim surround all of us. Knowing the capital of South Dakota has probably won you more than your share of bar bets, understanding that Nigeria has no deposed royal family ensured that you didn’t give that “prince” ten grand to help him get his money out of the country, and discovering that if you don’t go to the boat show, you won’t buy a boat. With all this in mind, you can see why I was shocked when I read that 50% of data center operators are feeling a little “powerless” lately.

A recent study by Intel and Dell found that approximately half of data center operators don’t really know what’s going on inside their facilities. For those of you looking for an excuse, or a remedy depending on your point of view, you’re out of luck, since 53% of the respondents said that they were already using a DCIM system. I’m not sure what’s more disturbing: the fact that these results cast a shadow over the effectiveness of today’s DCIM offerings, or that half the people supporting their organizations’ mission-critical applications are just kind of “winging it”. I’ll let you draw your own conclusions. In any event, it’s pretty evident that we have a bunch of folks out there who need to move up the knowledge curve pretty quickly.

While it’s easy to make light of these results (and I am), if I were an advocate of enhanced capabilities like software-defined data centers, I’d be more than a little concerned. If 50% of my potential client base is either not maximizing the value of their DCIM investment or still using spreadsheets to keep track of their data center fiefdoms, this does not bode well for product offerings that depend on customers who do know what’s going on inside their facilities.

From a corporate standpoint, this lack of knowledge on the part of the individuals entrusted to support your mission-critical applications should raise real doubts about the effectiveness of your current data center operations and the preparedness of your company to embrace new initiatives like the Internet of Things (IoT). I would venture to say that the companies chirping the loudest about moving to the public cloud have CIOs who can’t get this critical information.

Do findings like this survey’s mean that we aren’t as far along as an industry as we should be? I think the only answer we can give to that question, at this point, is maybe. While it’s obvious that companies like Google, Facebook, et al. are more than prepared for anything that comes down the pike, it may be that a large portion of enterprises are not. The underlying reasons for the survey’s findings are difficult to pinpoint. Are many data center operators simply overwhelmed by the rapidity of change within the industry? Are tools like DCIM too complex to be effectively implemented, or simply unequipped to perform their intended functions? Or is it a mélange of these and other factors? I suspect, as with most things, the answer is actually a mixture of all three, but whatever the reason, it appears that a number of data centers are operating below the level of reliability and cost effectiveness that their businesses require.

The good news in all this is that this is just one survey, a fleeting snapshot in time if you will, as opposed to a harbinger of impending doom. What it does seem to indicate is that many data center operators need to gain a much better understanding of the inventories, applications, and utilization of the hardware and software that comprise their domains. As data center environments become more complex, resorting to higher-level technologies to resolve the most basic questions is wasteful and ineffective. You can’t expect to manage your data center more effectively when you don’t even know what it contains. In short, the operation of your data center shouldn’t exemplify the inverse of Bacon’s assertion.
