A recently released report by 451 Research explores important characteristics of the global data center market. Specializing in technology innovation and market disruption, the research and advisory company presented findings on the size, scope and growth of the multi-tenant data center market. These facilities house the data center operations of multiple customers.
Key Findings of the 451 Research Study
According to the 2014 report, more than 800 data center companies provide multi-tenant services worldwide. These providers range from operators of a single data center in one market to companies running more than 20 facilities across multiple markets.
A study by Accenture Analytics revealed that big data is making good on its promises, despite getting off to a rough start. Initially, companies didn’t understand how to manage big data projects and were unsure of the value they would produce. Organizations have since come to recognize how business-transforming big data can be, and they are implementing technologies and processes to fully leverage it.
Accenture surveyed a wide range of executives, including CIOs, COOs and CMOs, from 19 countries and seven industries. The goal was to find out how technology leaders perceive the current state of big data analytics.
The build vs. lease conversation is entering a new stage. With increasing financial and operating risks associated with building, managing, and maintaining a company-owned data center, the scales are tipping in favor of a leasing model for most organizations.
Building a data center is one of the most expensive decisions a CIO will make. Changing hardware, software and business needs can quickly turn a data center investment into a resource drain on a company.
Greater Choice, but Risks to Consider
Over the past few years, cloud computing has been touted as the single best way to simplify, and at the same time strengthen, how the digital world functions. Today, its benefits are well documented. Thanks to cloud computing technologies, companies of all sizes are managing their IT operations in ways never before possible.
With cloud resources, applications and services are delivered to users over networks or the Internet. Computational work is performed remotely and delivered online, making services available on demand, anytime and anywhere. As a result, companies reduce the storage and processing power needed on local computers and devices, invest less in infrastructure assets, and operate with greater elasticity.
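To make that delivery model concrete, here is a minimal sketch of the pattern: a thin client sends work to a remote endpoint and receives the result over the Internet, so local machines need little more than a network stack. The endpoint URL and payload are hypothetical placeholders, not any specific provider’s API.

```python
import json
import urllib.request

# Hypothetical cloud endpoint; in practice this would be a
# provider's hosted service (e.g., an analytics or processing API).
SERVICE_URL = "https://api.example-cloud.com/v1/process"

def process_remotely(payload: dict) -> dict:
    """Send work to a remote service and return its result.

    The heavy lifting happens in the provider's data center; the
    local machine only serializes the request and parses the reply.
    """
    data = json.dumps(payload).encode("utf-8")
    request = urllib.request.Request(
        SERVICE_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    # On-demand usage: no local infrastructure to provision or maintain.
    result = process_remotely({"task": "summarize", "text": "..."})
    print(result)
```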
“Big data” is exactly what the name implies: sets of data so large and complex that traditional storage and analytical tools often have difficulty processing them.
As big data accumulates at record rates, companies face two major challenges. First, they need to store and manage the sheer volume of data. Second, and more important, they need to analyze that data to derive value from it.
To put this explosion in perspective, a paper presented at the 2012 IEEE Aerospace Conference estimated the size of digital data in 2011 at 1.8 zettabytes, or 1.8 trillion gigabytes.
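As a quick sanity check on that conversion (using decimal SI units, where one zettabyte is 10^21 bytes and one gigabyte is 10^9 bytes):

```python
# Unit conversion: zettabytes to gigabytes (decimal SI units).
BYTES_PER_ZETTABYTE = 10**21
BYTES_PER_GIGABYTE = 10**9

digital_data_zb = 1.8  # the 2011 estimate cited above
gigabytes = digital_data_zb * BYTES_PER_ZETTABYTE / BYTES_PER_GIGABYTE

print(f"{digital_data_zb} ZB = {gigabytes:.1e} GB")  # 1.8e+12, i.e. 1.8 trillion GB
```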
The growth of social networking, web services and cloud applications has caused an exponential increase in data and network traffic. As a result, powerful data centers are needed to handle burgeoning infrastructure requirements.
In addition, organizations must configure solutions for disaster recovery and business continuity. Ensuring ongoing operations in the event of a power outage has never been more critical.
Data center providers must respond to both demands: maintaining high performance in the face of escalating network traffic, and enabling customers to continue operations during a disaster. Interconnecting multiple data center facilities helps accomplish both goals.
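As an illustration of the continuity side, here is a minimal sketch of client-side failover across two interconnected facilities. The endpoints are hypothetical, and real deployments typically rely on DNS failover, BGP routing or load balancers rather than application code, but the sketch shows the basic idea: if the primary site stops responding, traffic moves to the secondary.

```python
import urllib.request
import urllib.error

# Hypothetical endpoints in two interconnected data centers.
PRIMARY = "https://dc1.example.com/health"
SECONDARY = "https://dc2.example.com/health"

def first_available(endpoints, timeout=3):
    """Return the first endpoint that answers its health check.

    Trying sites in order approximates a simple active/passive
    failover: the secondary serves only when the primary is down.
    """
    for url in endpoints:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # site unreachable; try the next facility
    raise RuntimeError("no data center facility is reachable")

if __name__ == "__main__":
    active = first_available([PRIMARY, SECONDARY])
    print(f"routing traffic to {active}")
```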
As data dependence intensifies, businesses of all sizes need to mitigate the risk of both human-caused and natural disasters. IT outages hurt every company’s operations and finances, regardless of industry.
To ensure business continuity, providers should follow data center best practices designed to deliver 100% availability to their customers.
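For context on what availability figures mean in practice, here is a short calculation of the annual downtime allowed at common availability levels; 100% allows none, and each added “nine” shrinks the window roughly tenfold.

```python
# Annual downtime allowed at common availability levels.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

for availability in (99.9, 99.99, 99.999, 100.0):
    downtime = MINUTES_PER_YEAR * (1 - availability / 100)
    print(f"{availability}% availability -> {downtime:.1f} minutes of downtime per year")
```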
Today, businesses expect and depend on immediate communications: 24/7 electronic communication and around-the-clock availability of data.
This past month alone, the U.S. Geological Survey (USGS) reported 17 “significant” earthquakes around the globe. The recent magnitude-8.2 earthquake in Chile, the magnitude-5.8 quake in Panama and the multiple earthquakes in Southern California have people wondering whether the events are related and whether activity will increase. Although experts believe the odds are against this spate of events being connected, data center customers remain concerned nonetheless.
Maintaining a physically secure data center should be a priority for every business. However, many companies cannot afford the military-grade security required to adequately protect valuable IT infrastructure.
Incorporating the equipment, personnel and systems needed for a highly secure company-owned facility or data center environment can be cost-prohibitive for many organizations.
Moving to a colocation strategy sometimes creates angst among IT professionals. In many cases, they’re concerned about giving up control of the data center infrastructure. In other situations, IT managers need to analyze whether colocation will reduce costs and mitigate downtime risk, as well as provide additional strategic benefits.
In recent years, numerous companies, especially emerging and medium-sized businesses (EMBs), have moved to colocation. Several factors are driving this trend.