Look for a colocation provider with a commitment to sustainability.

3 things to look for in a sustainable colocation provider

When considering your data management needs, finding a colocation provider that is committed to sustainable practices is a good idea. Beyond altruism and environmental responsibility, sustainable colocation providers can help make your organization better. 

The colocation data center market is growing bigger
Data Center Knowledge reported that the international colocation data center market is expected to reach $36 billion by 2017. A report from 451 Research indicated that the global footprint of data centers will grow by over 40 million square feet to total 150 million square feet. Today, the colocation market generates approximately $22.8 billion a year. In only two years, colocation data centers' footprint will grow by 75 percent and revenue will increase by 63 percent.

The news source also pointed out that data center market growth is driven primarily by the cloud. 451 Research estimated that today, less than half of the world's total space supporting data center IT equipment – 43 percent – is located in North America. Europe, the Middle East and Asia-Pacific represent a significant amount as well. It is difficult to say what the implications of footprints growing faster than revenues may be, but companies should do their part to make sure they are committed to sustainable growth. There are several strategies companies can use to manage their data centers in more economical and environmentally safe ways.

Sustainable, energy efficient data centers are on the rise
According to IntelligentUtility, 60 to 70 percent of total operational costs for a data center are energy-related. Data centers require large amounts of energy to power their computers, servers and networks. In fact, data centers represent one of the biggest consumers of electricity in the world. However, many organizations today commit to powering their data centers intelligently, maintain zero-waste policies, and do what they can to operate more efficiently and lessen their environmental impact. Not only does energy efficiency fit nicely with the CSR initiatives of large corporations, but 59 percent of Fortune 100 companies have already set greenhouse gas emissions reduction commitments. As such, data center efficiency is likely to be a priority for many top executives.

Data center colocation is one proven way that companies can use less energy, while still enjoying all the benefits of maintaining a fully functioning data center. Accordingly, here are three things to look for in a data center colocation provider:

1. Look for high computing density
Colocation providers that enable high-density computing are able to improve performance by reducing the amount of resources necessary for each square foot of space. Things like water, energy, maintenance and expenses can be kept to a minimum if each square foot of a data center hosts as much computing power as possible, while still adhering to safety standards and best practices. Ask your colocation provider whether it uses space efficiently, and opt to outsource your data center needs to companies that understand high-density computing.
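The figures below are purely hypothetical, but they illustrate the arithmetic behind density: dividing the space and power a facility consumes by the computing it delivers shows why a denser layout uses fewer resources per unit of work.

```python
# Illustrative sketch only: the figures are hypothetical, not from the article.
# Dividing floor space and power by compute delivered shows why higher density
# means fewer resources consumed per unit of computing.

def resources_per_compute_unit(floor_sq_ft, power_kw, compute_units):
    """Return (sq ft, kW) consumed per unit of compute delivered."""
    return floor_sq_ft / compute_units, power_kw / compute_units

# Same hypothetical 10,000 sq ft room, two different rack densities.
low_density = resources_per_compute_unit(floor_sq_ft=10_000, power_kw=600, compute_units=2_000)
high_density = resources_per_compute_unit(floor_sq_ft=10_000, power_kw=900, compute_units=6_000)

print(f"Low density:  {low_density[0]:.2f} sq ft, {low_density[1]:.2f} kW per compute unit")
print(f"High density: {high_density[0]:.2f} sq ft, {high_density[1]:.2f} kW per compute unit")
```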

2. Ask for key performance indicators
One of the best ways to ensure that you are continually working toward your sustainability goals is to receive constant updates regarding operating efficiency and resource usage. Look for colocation providers that track key metrics over time and can provide that data to you through online software. By tracking the performance of your colocation provider and outsourced IT assets, you can find areas where inefficiencies can be reduced and improvements made. The more flexible you can be regarding the management of your data center assets, the easier it will be to engage in continuous improvement activities.
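The article does not name specific indicators, but power usage effectiveness (PUE) – total facility energy divided by the energy used by IT equipment – is a common one a provider might report. A minimal sketch of tracking it over time, assuming the provider shares monthly energy readings, could look like this:

```python
# Minimal sketch: computing power usage effectiveness (PUE) from monthly
# energy readings a colocation provider might report. The readings below
# are hypothetical placeholders.

monthly_readings = [
    # (month, total facility kWh, IT equipment kWh)
    ("2015-01", 1_200_000, 750_000),
    ("2015-02", 1_150_000, 740_000),
    ("2015-03", 1_100_000, 735_000),
]

for month, facility_kwh, it_kwh in monthly_readings:
    pue = facility_kwh / it_kwh  # 1.0 is the theoretical ideal; lower is better
    print(f"{month}: PUE = {pue:.2f}")
```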

3. Plan your data center management intelligently
Having a strategy and establishing a footprint allow you to manage your data center needs more effectively. You might find that you can allocate your demand across multiple data centers, receive all the services you require, and still meet your sustainability goals. As previously mentioned, by tracking water, energy and overall resource management, you can ensure sustainability, regardless of where your data center is located.

Ultimately, outsourcing your data center needs is a good idea. Looking for energy-smart data centers is even better. In 2015, responsibility is just as important as revenue. Management teams, shareholders, employees and customers want to see a commitment to sustainability. Given that efficient operations are also good for the bottom line, there is no reason why your data center strategy should not be based on socially responsible principles.

What is the best data storage option for you?

Choosing the right data center option for your business

No longer just an option for many companies, data centers are now considered essential parts of the corporate machine. Information drives many business decisions and processes today, and making sure that data is stored and secured is paramount. However, many organizations find that the cost of maintaining and upgrading their data center is an ongoing challenge, and they have to balance those costs against what they actually need to continue operating successfully. Choosing between colocation, expanding a current data center and building a new one is not always easy.

Consider the financial aspect
There are a number of financial considerations that go into deciding the best data strategy for your company. You should ask yourself what your immediate needs are. If your current infrastructure is not enough to support your operation, it will take time to implement any solution. However, colocation will take significantly less time than building a new facility. A new facility can take months to become fully operational, whereas a colocation provider can make several servers available much faster. The Data Center Journal explained how backend infrastructure factors into data center management. 

Perhaps you think you need to have your own site, but will that be too costly? Will the implementation time affect your operations? Consider the return on investment and scope of any project before you commit serious capital.

What is your organizational need?
Some companies utilize data more intensively than others. In an industry where revenue is earned through managing large amounts of data, owning your data center assets may be more important than in industries where you just need to access your data and know it is secure. If your business is not primarily about managing data, then consider colocation as a faster and less costly option.

The solution also does not have to be cut and dried. Some companies prefer to use a combination of corporate-owned and colocation services to meet their data needs, noted Data Center Knowledge. Certain pools of data are highly sensitive and must adhere to stringent regulations. Some of the data belonging to government agencies, banks, hospitals and health care companies may be too sensitive for colocation, but the vast majority of data does not fit this description. Accordingly, it can be a good idea to outsource some of your needs to a third-party provider. If your business model allows for it, take advantage of what the market has to offer today. Colocation is safe, reliable and will not break the bank in the way that building your own facility will. Don't buy space and servers or hire IT staff if you don't need to.

Do you have the physical space?
Data Center Knowledge explained that most companies looking to increase their data center capabilities consider expanding their current infrastructure, as it is less costly than investing in new facilities. However, a company must think about whether it has the space to house new servers and the staff to cater to the additional assets. Power consumption can also present a dilemma. Data centers require a lot of power, and ensuring that there will be sufficient energy, as well as backup power in the case of an outage, might be beyond a company's capability.

If your company can pay for the upgrade, operate efficiently, and accommodate the changes, then expansion might be the correct solution. If you do not have enough physical space or if you don't believe you have the additional resources to cater to the increased IT demand, then colocation might be a better idea.

It is important to remember that, when it comes to data centers, operational efficiency and time management are key. Whether you choose colocation or house your own data center, make sure your business objectives and technology needs are in sync. Not doing so could end up costing you more than you expected.

The cloud can help companies protect their data.

Cloud-based disaster recovery

Most businesses today need to have disaster recovery plans for their data. A few types of companies do not consider data their most valuable asset, but the vast majority store confidential client information, proprietary knowledge, financial account breakdowns and scores of other files that, if lost, would be devastating for them. Earthquakes, floods and power outages present a looming threat to your data. Having an effective disaster recovery plan is essential – but it does not have to break the bank. Cloud technology offers you a more affordable option in the short term. Maintaining your own secondary data storage site has its advantages, but requires much more upfront capital investment. You should thoroughly assess your needs, but remember: Having no disaster recovery plan is the worst and most expensive plan of all. 

Safeguard your data because you never know…
It's not always easy to know when disaster will strike. That's why small business owners embrace private cloud-based disaster recovery solutions, because they can rest assured that they will be able to access their data and work remotely in the event of a sudden disruption to their systems. Artur Matzka, director of infrastructure development at Integrated Solutions, told ComputerWeekly what a brief interruption of a technology network can do to a business.

"The cost of data loss or lack of access to information can be dangerous, and not only in financial terms. For companies operating in a really competitive market, even 10 days without access to key data could cause an outflow of customers and the death of the business," Matzka said.

Disaster recovery doesn't have to be ridiculously expensive
Forbes explained that disaster recovery solutions were initially inaccessible to small- and medium-sized enterprises because the cost of setting up a secondary data center was too high. The process involved finding a backup location far enough away from the initial site, investing in hardware and ensuring that all the main data was backed up. Charles King, principal analyst for Pund-IT, explained that secondary storage was simply too costly for midsize enterprises early on.

"For many years, that separation was a leading cause in disaster recovery costs being so high, especially if synchronous replication was required," said King, according to Forbes.

SearchDisasterRecovery noted that many companies maintaining secondary sites often have facilities that are not suited to their needs. Those assets may require more IT staffing, better equipment and extensive upgrades to network, storage and server infrastructures.

However, with private cloud-based solutions, disaster recovery is now a viable option for many companies – not just large enterprises. The cloud enables small businesses to achieve the same degree of functionality while keeping capital investments in check. Using the cloud, companies can back up their customer records and other vital data, should the worst happen.

"By engaging a cloud service provider, companies can avoid the substantial costs of building, outfitting, managing and maintaining data centers, including employing information technology staff," King added. "As business-computing usage continues to grow and IT becomes increasingly complex, cloud and other hosted services look more and more attractive."

The cloud turns capital expense into operating expense
Before the cloud, setting up a secondary data storage facility meant a company had to invest in a location and new hardware and hire highly skilled IT professionals to administer the site. Now, companies can simply outsource their data storage needs to private cloud and colocation providers – achieving the same ends for a fraction of the price. It is important to point out that the cost of rebuilding data, should disaster strike, is significantly higher than the cost of backing up information with a cloud provider.

"Rebuilding transactions and files if disaster recovery fails is hugely expensive and time consuming," King said. "But over time, replication solutions have generally become cheaper, faster and more dependable to the point that a number of cloud-computing players can now offer various disaster recovery solutions."

Ultimately, some companies will choose to maintain their own secondary sites over cloud-based solutions because, if they can afford it, there are certain benefits. However, colocation services let organizations get the best of both worlds – off-site backup and cloud technology. Colocation providers can host private clouds in their facility, letting customers lease the data center space instead of paying to build a dedicated facility.

Data center colocation offers several advantages.

5 reasons why data center colocation is good for business

There are many reasons why companies choose to outsource their data center needs to colocation providers. Setting up and managing a data center can be very costly and complicated. Housing IT equipment and data at a third-party colocation provider, on the other hand, can save companies considerable money and allow them to spend their time on other value-added activities. An added benefit of working with a colocation provider is that it is more environmentally sustainable.

This article explores five reasons why you should consider data center colocation for your business.

1. You will free up resources and staff
Data Center Knowledge explained that colocation can allow you to focus on your core business. Your IT team will not have to spend their days dealing with the ins and outs of managing a data center because the colocation provider will take care of all the details. Unlike most organizations, colocation providers have teams that are available around the clock to reboot servers, manage disaster recovery plans, assess power needs and continually run diagnostics on the network. By outsourcing your data needs, you will spend less time worrying about the hardware, and more time devoted to the platforms on which you do your work – the operating systems, applications and databases themselves. 

2. You will lessen overhead costs
According to SearchDataCenter, colocation is appealing for finance directors because it allows them to realize instant cost savings while still utilizing state-of-the-art infrastructure. Instead of investing heavily in equipment, a company can outsource its data needs and immediately gain access to shared uninterruptible power supplies, auxiliary generators, cooling systems and physical security. Colocation facility owners will be responsible for maintaining these assets, and you will only pay for them through standard billing. Another hidden value in data center colocation is the sharing of resources: many providers manage multiple accounts and, as such, may be able to offer other services at a discount. Data Center Knowledge mentioned that, with respect to power supply alone, outsourced solutions clearly win out over in-house options. Sophisticated dual generators, air conditioning and battery backups are just icing on the cake.

3. You will gain better connectivity
Colocation almost always ensures that a business is connected globally and securely, noted Data Center Knowledge. In-house data center server rooms, by contrast, often lack access to a strong Internet connection, nor do companies have the resources to dedicate IT staff specifically to monitoring network speed and traffic. If you choose to work with a colocation provider, you will likely benefit from a faster and more resilient network offered at competitive pricing. In your office, delivering 100 Mbps of bandwidth may prove costly, but with a provider you will have better service without the premium price.

4. You will be more sustainable
SearchDataCenter pointed out that by outsourcing your data center needs, you will effectively shrink the environmental footprint of your facility. Working with a colocation provider, you will not have to invest heavily in servers, cooling systems, power supplies, generators and other utility-intensive technologies. Additionally, most colocation facility owners design their infrastructure intelligently, planning their buildings based on anticipated occupancy. For example, if the owner anticipates 20 percent occupancy, the internal resources will reflect that need. When occupancy increases, the owner will expand the data center's capacity to match the demand. This approach is not only cost-effective, but reflects a more sustainable way of doing business, where resources are used only as they are needed. Only when newer data center technologies are unveiled, or additional geographies must be covered, do owners build new facilities. According to Data Center Knowledge, companies that move their operations to colocation facilities typically cut their carbon emissions by 90 percent. Many data centers also invest in green technologies because they have the incentive, time and money to do so. Peter Gross, vice president at Bloom Energy, commented on the green nature of colocation.

"Companies are increasingly turning to data center colocation services to interconnect with other businesses and they want to do this in an environmentally responsible way," said Gross, according to Data Center Knowledge.

5. You will know that your data is secure
One of the best ways to establish trust is to share risk. With that in mind, you can trust a reputable colocation provider to protect your assets because they are stored in the same facility as theirs. In other words, your servers and their servers share the same walls. Colocation providers have added incentive to protect your data center hardware because they use the same infrastructure to safeguard their own. SearchDataCenter mentioned that colocation providers rely on having multiple customers in their facilities; if they don't, their business model is not that strong.

Colocation is not for everyone, but for companies that simply cannot allocate all the necessary resources, outsourcing data center needs is a good idea. 

Companies increasingly outsource their data storage.

Data centers give life to several industries

Increasingly, companies use data centers because of the numerous benefits they offer. In addition to allowing for safe and effective storage of massive amounts of data, these beacons of operational efficiency also speed up application deployment time and facilitate access to important information, regardless of user or center location. Due to the prevalence of data centers today, small sub-sectors have sprung up to cater to their creation and maintenance.

Many companies choose to protect their information with data centers today because not doing so could result in a loss of valuable assets and, as a result, lost customers and diminished revenue. With cloud technology, many of the initial concerns about setting up data centers have been addressed, as high installation costs and in-house skilled staff are no longer strictly necessary. All that is needed for a highly scalable solution in 2015 is an Internet connection and the foresight to seek a colocation provider.

Considerations for companies that outsource their data management
Data Center Knowledge provided companies that outsource their information storage to cloud providers with suggestions on how to make the experience as problem-free as possible. The idea behind the advice was to inform companies that even though they are relying on a third-party provider for their data storage, they still need to act like stakeholders in the project – outsourced does not mean forgotten.

It is important to point out that there are, however, some serious limitations to protecting data in the cloud. Companies still need to manage their files carefully. A cloud provider may not turn out to be very helpful if a company cannot store or back up its files responsibly. Just as with renting an apartment, the landlord cannot be held accountable for how the apartment is cleaned or used on a day-to-day basis. Additionally, not all cloud providers are the same. Public clouds are multi-tenant, so if another data user experiences a security breach, all the other data could be at risk of contamination as well. Companies that know certain files or folders are highly sensitive should take precautions to ensure they have backups. They should also stay aware of the fact that not all cloud providers are reputable. When it comes to bargain hunting, data storage is one area where penny-pinching can seriously backfire.

While it is obvious, it should be said that the cloud is dependent on Internet connectivity. Companies that cannot ensure a reliable connection at the office or for remote workers should take measures to guarantee access to needed documents at all times. An ISP outage is not a good reason to take a day off.

Companies that want to take measures to protect their most valuable assets should pay attention to these three areas:

1. Keep track of storage devices.
Hard drives and other storage devices should be handled with care. Because these devices are portable, they are prone to contamination and can end up affecting larger pools of data. They are not to be used as long-term storage options.

2. Schedule data protection efforts.
Data protection requires a set plan, detailed processes and accountability. The plan should include a schedule of backups and identify who is responsible for managing storage assets. If data is lost, it is important to know who to speak with. (A minimal sketch of such a plan appears after this list.)

3. Avoid duplicate effort.
Because businesses are likely to be pulling data from many different locations like social media, email, mobile and shared drives, it is important to make sure that information is stored in one centralized management location. This will help eliminate redundancy and duplication. It is also much easier for staff to access data from one central reservoir of information.
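To make the second point above concrete, here is a minimal, hypothetical sketch of a backup plan that pairs each dataset with a schedule and a named owner; the datasets, owners and frequencies are invented for illustration:

```python
# Hypothetical sketch of a backup plan with schedules and accountability.
# Dataset names, owners and frequencies are illustrative only.
from datetime import date, timedelta

backup_plan = {
    "customer_records": {"owner": "it-ops@example.com", "frequency_days": 1, "last_backup": date(2015, 6, 2)},
    "financial_data": {"owner": "finance-it@example.com", "frequency_days": 7, "last_backup": date(2015, 5, 28)},
    "shared_drives": {"owner": "it-ops@example.com", "frequency_days": 30, "last_backup": date(2015, 5, 1)},
}

def overdue(plan, today):
    """Return datasets whose last backup is older than their scheduled frequency."""
    return [name for name, p in plan.items()
            if today - p["last_backup"] > timedelta(days=p["frequency_days"])]

for name in overdue(backup_plan, date(2015, 6, 3)):
    print(f"{name} is overdue; contact {backup_plan[name]['owner']}")
```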

Growth in the data center industry is evident
Ultimately, the rise in the number of data centers is made evident by the sub-sectors that cater to this industry. WhatTech pointed out that the cooling process is responsible for approximately 40 percent of the electricity required for a data center. Cooling solutions are used to reduce heat in data centers and use air, water and liquid cooling systems to achieve this end. According to a Research Beam report, the global data center cooling market will grow at a compound annual growth rate of 14.3 percent from 2014 to 2019. Similarly, the data center construction sector is growing at a compound annual growth rate of 10 percent per year, reported Facility Executive. The growing need for cloud services has resulted in a surge in this market, covering things like design and architecture, as well as the installation of electrical and mechanical systems.

Data Center Knowledge noted that cloud infrastructure spending is expected to reach $32 billion this year. Not only are data centers the new way to manage important information, but they are actually where growth in the computer chip industry is occurring. This is why Intel is acquiring chip maker Altera for $16.7 billion.

Data center recovery plans are essential.

Contingency plans for data center disaster

Data centers are an increasingly integral part of modern business. Companies all over the world use them to store, process and distribute large amounts of information. The entire online infrastructure of an organization is typically housed in a data center. Whether a company manages its own asset or outsources its data needs to a third party, there are a few important things to consider when it comes to protecting these valuable assets. Perhaps the most important thing to keep in mind is that accidents happen. Data protection is not just about misplaced files – it also involves planning for Armageddon. All data center managers should consider the importance of their data and their contingency plans to keep that data safe.

Data centers are about more than just data
Data Center Knowledge reported on a data center outage that took place last month at Legacy Health in Oregon. Legacy Health is a nonprofit that operates several hospitals in the Northwest. Reportedly, a power surge caused a shutdown of the organization's servers and information network and prevented access to clinical and electronic management systems. Fortunately, the affected hospitals were able to continue their operations despite the shutdown, operating on generator power instead. However, the system downtime forced doctors to see patients under an emergency operations plan, which, had it not been in place, could have led to calamity. While no disruptions to patient care were reported during the outage, Legacy's team worked quickly to restore data center operations. George Brown, president and CEO of Legacy Health, referred to the event as a crisis in his official statement. 

"I'm proud of the professional response, compassionate care, and problem-solving expertise shown by employees throughout Legacy Health during this data system crisis to make sure the situation was resolved as quickly as possible and that our patients received the quality and safe care they expect from Legacy Health clinical care teams," said Brown, according to the news source.

Despite things turning out well, it is evident that data center protection is not only important for ensuring successful operations but, at times, could mean the difference between life and death. This event demonstrates the importance of having proper contingency plans in place.

Disaster recovery plans are essential
Power outages are not the only thing that can disrupt data center operations. InformationWeek explained that motor vehicle accidents, solar flares, fires and floods also worry data center operators greatly. One such incident took place in 2009 in a Texas data center. A diabetic driver passed out behind the wheel of his SUV and crashed into a building that housed data center electricity transformer equipment. A similar incident took place in Iowa last year, when an unexpected fire broke out in the state's primary data center. Robert von Wolffradt, CIO for the State of Iowa, advised data center managers to consider extreme events as part of their disaster recovery plans.

"Test complete loss of systems at least once a year," wrote Wolffradt in a blog post that appeared on Government Technology. "No simulation; take them offline."

Ultimately, companies that do not have formal plans for disaster recovery and protection are at great risk of being unable to recover from unexpected events. Not having the right contingency protocols could lead to legal problems, operational disruptions and loss of important information. As such, best practices for data protection involve centralized management and smart storage choices. Investing in recovery services will help companies ensure they can retrieve their data should sudden disruptions occur.

The Cloud is growing.

New applications are reliant on sophisticated data centers

People used to think of data centers as electronic filing cabinets. While that may have been true once, today, data centers are a different animal altogether. Thanks to Cloud technology, data centers are not only a means of storing data, but they are also central to the delivery of new applications and features that help drive business innovation.

A wide range of technology services reside in data centers
According to CloudTweaks, by 2017, 35 percent of new applications will be deployed to organizations using Cloud-based delivery. The benefit of this strategy is that product development and implementation times will be significantly reduced, and the rollout of new features will require fewer resources than before. Sophisticated data center technology makes the computing behind enterprise colocation solutions possible and enables direct connections for Cloud providers like Microsoft, Amazon and Google.

By 2018, according to the news source, over 60 percent of enterprises will have a considerable portion of their overall infrastructure housed on Cloud-based platforms. Highly interconnected data center facilities in strategic locations are the ones that best demonstrate what the Cloud can do. As a result, it is not surprising that technology companies continue to invest in and build new centers around the world.

Google and Facebook invest heavily in data centers
ZDNet reported that Facebook and Google have been busy with their respective data center projects. Both companies announced expansion plans for their existing U.S. data centers this week. Google will invest $300 million into its 500,000-square-foot data center in Georgia and expand its capabilities. Construction will begin soon and is expected to be completed at the end of 2016. Facebook will build its third data center in Iowa at the same campus where the company first announced the idea in 2013. Although the official numbers are not known, Facebook's 2013 investment was approximately $300 million as well.

Google also announced that it would extend its data center in Singapore, noted TechCrunch. The expansion will bring Google's total spending on the site to $500 million, and the new center is expected to be completed in two years. It is important to point out that Google's data centers don't exclusively serve customers in the region where they are located. The American and European data centers serve Asia in the same way the Asian data centers can serve the U.S. Ultimately, Google's commitment to Cloud-based data centers demonstrates the importance and prevalence of this technology, providing a positive example for any companies looking to take their infrastructure online.

The market for software-defined data center services is projected to see massive growth over the next five years.

Software-defined data center services set for massive growth

The trend toward software-defined networking in the data center is already a few years in the making, but a recent study showed that the technology is still far from maturity. A report featured on Research and Markets found that the software-defined data center market is projected to grow from a total size of $21.78 billion in 2015 to $77.18 billion in 2020 – a compound annual growth rate of 28.8 percent over those five years. The increasing virtualization of IT is poised to change the nature of data center services in several ways.
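As a quick sanity check, the compound annual growth rate implied by those two endpoints can be computed directly:

```python
# Sanity check: CAGR implied by the Research and Markets figures cited above.
start_value = 21.78   # market size in $ billions, 2015
end_value = 77.18     # projected market size in $ billions, 2020
years = 5

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 28.8%, matching the reported rate
```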

What is driving the expansion of the software-defined data center market?
Data has become the currency through which businesses unlock new value in their own operations and guide critical decision-making processes. The rise of big data as a means to gain a more accurate understanding of customer and business information is essential in almost every industry, and the increasing adoption of the bring-your-own-device workplace has put new strains on networks.

According to the authors of the Research and Markets report, the primary driver of this explosive market growth is the rising need for IT infrastructure to handle the rapid increases in network traffic that businesses and data center providers must accommodate. The costs associated with handling this massive influx of traffic have given the impetus to move to software-defined networking as a means by which businesses and providers can cut down on operational and capital expenditures.

The report pointed out that when data center services are tied to hardware, the costs associated with scaling up to meet customer needs can spiral out of control. In addition, several interoperability issues can crop up when disparate hardware doesn't integrate. Software-defined data centers, on the other hand, can avoid many of these problems and create a more seamless and adaptable environment that can handle increasing data storage and throughput needs. 

Software-defined networking's impact on the data center
Data Center Journal explained that at its core, software-defined networking serves as a software-based controller – or "traffic cop" – that directs all traffic coming through the network. This controller gives IT managers a total view of the network and allows them to configure switches and routers without having to manually make adjustments on every individual piece of hardware.
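The article describes the controller conceptually rather than naming a specific platform, so the following toy sketch only illustrates the pattern: a central controller holds the desired rules and pushes them to every managed switch, which is why no individual device has to be adjusted by hand. All class and rule names here are invented for illustration and do not represent a real SDN API.

```python
# Toy illustration of the software-defined networking pattern described above:
# a central controller holds flow rules and pushes them to managed switches.
# Names and rules are hypothetical; this is not a real SDN API.

class Switch:
    def __init__(self, name):
        self.name = name
        self.flow_table = []

    def install_rule(self, rule):
        self.flow_table.append(rule)
        print(f"{self.name}: installed rule {rule}")

class Controller:
    """Central 'traffic cop' with a total view of the network."""
    def __init__(self, switches):
        self.switches = switches

    def push_policy(self, rule):
        # One change here propagates everywhere -- no per-device manual edits.
        for switch in self.switches:
            switch.install_rule(rule)

network = Controller([Switch("rack-1-tor"), Switch("rack-2-tor")])
network.push_policy({"match": "dst_port=443", "action": "forward:uplink-1"})
```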

When the network, servers and storage are completely software-defined and free from the constraints of the hardware, it allows data center services to be more agile and responsive to scaling and management needs. But it also creates a new set of priorities for data center managers. Essentially, uptime becomes even more critical, so managers will have to take greater precautions to avoid the common causes of downtime.

Another change to data center management that Data Center Journal noted is that virtualization often leads to higher power densities, which can create hot spots. Traditional rack cooling systems may not be able to adequately cool the virtualized equipment, so upgrades may be necessary before a complete move to virtualized infrastructure is made. To this end, modular cooling systems may be an essential tool for ensuring cooling needs are met and uptime requirements are closely adhered to.

Enterprises are increasingly data-driven, and their IT infrastructures, whether on-premise or outsourced, must be able to handle the sheer volume of data being stored and sent through networks and servers. For the data center provider, this trend will manifest itself in virtualized equipment and software, as well as the physical assets that will support them. 

Phoenix is an increasingly popular destination for data center colocation.

Phoenix becoming increasingly popular for data center colocation

The North American market for data center colocation is growing rapidly. Data Center Knowledge looked at projections provided by 451 Research and found that from now until the end of 2016, the North American data center market will grow by 32 percent to reach a total size of $14.8 billion. So far, this growth seems to be concentrated in a few major markets, with Phoenix, Arizona being one of them. But what is leading enterprises to seek a Phoenix data center?

GlobeSt.com interviewed Michael Ortiz, associate vice president of Colliers International, and found that Phoenix is a popular region for data center colocation primarily due to favorable economic incentives and tax breaks for large developers.

In addition, the startling growth of the tech sector in nearby Silicon Valley has greatly increased the need for data center services to handle the influx of network traffic and data storage needs. Outsourcing key IT infrastructure is also a way for these businesses to lower their total cost of ownership and focus more on core competencies. Phoenix's welcoming economic climate and its proximity to key tech regions will likely keep it among the top data center markets throughout the continent. 

The social networking titan Facebook appears poised to break ground on a new Texas data center located in the Fort Worth area.

Facebook planning a massive Texas data center

The social networking titan Facebook appears poised to break ground on a new Texas data center located in the Fort Worth area. According to The Dallas Morning News, the project is worth nearly $1 billion and will be a boon to both the tech company and the region as a whole.

As the news source indicated, the developers of the new facility and Fort Worth officials had been mum about which company was behind the massive 750,000-square-foot data center, but recent documents filed with the state government of Texas confirm that the project is Facebook's. The company itself, however, has yet to comment on the development.

Incentives abound for Facebook and Texas alike
Facebook's new Texas data center will start out as a 250,000-square-foot facility, which will triple in size over the next few years to its final footprint of 750,000 square feet. It's no coincidence that the company chose the Fort Worth area for a new data center. The Dallas Morning News reported that the City Council has already voted to provide economic incentives to the tech company for locating the new facility in the municipality. In addition, the Northwest Independent School District is looking to enact tax abatements to support the large project.

The Dallas-Fort Worth area, in return, will be looking at a $1 billion investment in the region, plus the additional property taxes that Facebook would pay for the data center.

Why Texas is so attractive as a location for a data center
Beyond Facebook, it's worth it for companies with large-scale data storage needs to consider a Texas data center. The state's central location and low electricity prices make it a very attractive locale for data center colocation.

"Dallas-Fort Worth has become the third largest data center market in the world," Curt Holcomb, executive vice president with commercial real estate firm JLL, told The Dallas Morning News. "For the same reason you have all these people moving their headquarters and offices here…the data centers are coming here."

According to Data Center Knowledge, Facebook is quickly building out its arsenal of data centers just to keep up with its users' activity. Facebook currently accounts for 9 percent of all Internet traffic, with users uploading about 300 million photos each day. Data center colocation, then, is a strategy for the company to add storage capacity and boost performance for its ever-growing platform.
