Customers and Testimonials
When you think of the leading data center markets in the country, a few cities and states immediately come to mind: New Jersey; Northern Virginia; Santa Clara, California; Dallas, Texas; Jackson, Mississippi… While the capital city of the Magnolia State may seem more likely to conjure visions of mint juleps than of rows of racks and servers, that won’t always be the case if the state’s governmental leaders have their way.
Within the past five years, data center providers have found that it is more cost- and energy-efficient to build out these facilities incrementally. As a result, many providers have developed what they refer to as “modular” data centers. The terminology has been widely adopted, but no true definition of what constitutes a modular data center has been universally embraced.
As corporate computing and storage needs continue to grow, so does the need for more data center space. This continued escalation of demand is leading customers to expect more from wholesale data center providers. In a recent series of blogs, we described the new requirements for providers as the data center industry’s version of baseball’s five-tool player. This white paper is the compilation of that series.
IT capacity planning has long been the “Gordian Knot” for data center professionals. Trying to account for the business’s hardware and software needs, coupled with the vagaries of short- and long-term corporate strategies, typically makes projecting your organization’s capacity requirements an exercise akin to nailing Jell-O to the wall. These efforts are hardly aided by the product offerings of data center providers who view capacity planning as a euphemism for “you need to lease your future space NOW.” This paper will examine the issues that have long characterized the process of capacity planning and explore a new alternative that serves as the metaphoric equivalent of Alexander the Great’s sword: the solution to the Gordian Knot.
For many CIOs, data centers are, for lack of a better phrase, a necessary evil. Unlike the applications and hardware they house, they don’t change on a regular basis. It’s kind of like buying a Ferrari and realizing that you have to have a garage to put it in. Required, yes; sexy, no. In most instances, the decision regarding what type of data center to obtain, and whom to get it from, is a process of determining who gets the most checkmarks on a feature summary check sheet. Power, check; 36” raised floor, check; and so on.
As industries mature, they tend to adopt standards. These are typically best practices that have been formally adopted and that provide customers with a level of assurance that the product they are purchasing meets a prescribed level of performance, safe operation, and so on. Customers rely on standards as a method of performing comparative analysis on competing suppliers. As part of the maturation of this standards-development process, qualified third parties verify the legitimacy of providers’ claims.
Type “Big Data” into the search engine of your choice and you’ll get over 1 billion possible results. That’s even more than Miley Cyrus gets. Obviously, this means that you need to be thinking about how Big Data might impact your data center, since it’s more likely that you will have to accommodate that within your facility than Miley.
Remember Economics 101? Sure, it’s been a while, and for many of us it wasn’t the most riveting subject (they don’t call it the “dismal science” for nothing) but a few things probably still stand out. “Supply and demand” ring a bell? Another common term from that time that we all probably remember is “economies of scale,” probably because it made so much sense.
As Coleridge made clear, water is essential for survival. In California, for example, people are finding new meaning in the phrase, “water, water everywhere and not a drop to drink.” Since data centers hadn’t been invented when he published The Rime of the Ancient Mariner in 1798, he didn’t mention them; but if they had existed, he certainly would have included them in his epic poem, since water is all too often a requirement for their operation. This water dependency is an area of consideration that businesses should factor into their plans for upcoming data centers.
When any person, product or technology reaches the level of public awareness where nuance has given way to ubiquity, and the singularity denoted by the word “the” has become the accepted prefix to a name or descriptor, opportunity for competitors often begins to arise from its long shadow. For a number of customers, the shared nature of public clouds is an acceptable negative element of the solution’s experience. But as in the case of dedicated data centers, the unique requirements of a sizeable segment of the marketplace have led them to a perspective totally at odds with your mother’s admonishments regarding the relationship between your toys and your siblings and friends. In other words: it’s not so nice to share.
Being a growing business is always a good thing, but even growing companies find themselves with challenges they need help resolving. For Windstream Hosted Solutions, this meant finding a data center builder that could construct dedicated data centers in two nontraditional data center markets. To address these requirements, Windstream looked to Compass Datacenters.
Data Center Quarterly
Data Center Minute
In this monthly video commentary from Compass Datacenters our CEO, Chris Crosby, will briefly talk about new and emerging issues that are important to the data center industry.
This month, Chris discusses the increasing instances of arc flash within the data center and the implications for end users and providers.
The issue with many new “next big things” is that they tend to skip one or more essential steps. In this brief video, Compass Datacenters’ CEO, Chris Crosby, explains why calibrating your data center is the essential step required to accurately measure and model data center performance, and why it provides the necessary bridge to new capabilities like the Software Defined Data Center.
Predictions are tricky things. Big Data was going to be the “next big thing” and now folks are saying that it’s sinking faster than the Titanic, so obviously in the making of any prediction there is a margin for error. In this month’s Data Center Minute, Chris offers his thoughts on the predicted demise of the data center.
Industries change over time. Sometimes for the better. Sometimes for the worse. The one common element they all share is that at some point in their maturation process they reach an inflection point. In this month’s Data Center Minute, Chris discusses how the data center industry has reached its own inflection point and where it needs to go from here.
If you’ve ever heard the phrase “it looked good on paper,” then you know that there is often a real difference between what is bought and what is delivered. Data centers are no different. In this month’s Data Center Minute, Chris explains the difference between the Uptime Institute’s design and constructed facility certifications, and why most Tier III facilities really aren’t.
Everyone is talking about the Internet of Things, and if you’re not, you probably should be. Excitement alone, however, doesn’t guarantee effective implementation. The high volumes of data and latency requirements that come with the IoT dictate that many current data centers may be a part of the solution but they won’t be THE solution. In this month’s Data Center Minute, Chris talks about the elements you need to consider in planning to join the IoT.
The nature of data is changing the demands on today’s data centers, and many current structures will not be up to the coming challenges. In this month’s Data Center Minute, Chris discusses how elements like the IoT and increasingly packet-rich applications will change the way data centers are planned for and deployed in the months ahead.