The financial sector is currently going through a great deal of uncertainty, driven by both technological change and broader economic forces. For starters, the economy has been shaky since the recession that hit the U.S. in 2008, and it has been difficult ever since to predict the financial landscape – how much money people will have, how much they'll be willing to invest and so on.
At the same time, uncertainty abounds about technology as well. Financial firms' IT leaders are under a lot of pressure to reconsider their strategies for managing data. This is an ongoing trend because firms are looking for creative tactics that will help them curate their files efficiently without waste, all while maximizing employee productivity and guaranteeing compliance with ever-changing federal regulations.
It's a lot to handle all at once, and thus it should come as no surprise that the financial data center model is under review. According to Open Markets magazine, CTOs in the finance sector are actively searching for new technologies that can take their business to the next level. Data center colocation is absolutely one of them.
The key to success in finance
Financial firms are going through a difficult era in which every potential source of competitive advantage must be explored. Paul Rowady, senior analyst at the TABB Group, told Open Markets that there are four main factors driving competitiveness in finance – infrastructure, processing, data analysis and human capital management.
"You need to optimize deployment of all four, which will depend on the use case," Rowady said. "You need to optimize all four factors to be successful. Ultimately, human capital is the one that stands out. If you have the right people, you can tune hardware and software to use data and the output will be a strong competitive advantage. Not everyone will have the right people."
This means that for finance companies to edge out their rivals, they need both a better way of managing data and a staff that will be on board with such a change. This is no small order.
The value of change management
There's a lot of talk in finance today about introducing colocation solutions, which have the potential to increase the speed and flexibility of information management in finance. The greatest obstacle, however, is that such a change can only be successful if employees adapt to it well. The benefits of the technology are not guaranteed.
"There is a change management aspect to it," Rowady explained. "The technology doesn't guarantee that you have the creativity to use the tools properly. Just because you have them, doesn't make you more competitive, but it might give you a better shot at being competitive."
With data center colocation, for example, companies have the chance to manage their data in facilities that are convenient, cost-controlled and staffed by expert professionals. To really get ROI from this move, though, they need employees who are willing to leverage new data management solutions for greater efficiency and productivity.
The changing finance business model
Changes in finance technology, such as the rise of colocation, are leading to a new paradigm in the industry. Simply making transactions and recording them accurately isn't enough anymore – to be successful in the future, financial firms will need to act quickly, think creatively and use their new computing power to achieve at a higher level.
"High-performance computing is likely to unleash a level of creativity," Rowady told Open Markets. "You could see the pace of innovation go parabolic."
In today's uncertain financial climate, faster innovation is the key to a more successful business. Superior data center technology is at the heart of that.
We've reached the point in 2015 where having the best possible data management strategy is one of the most important aspects of running a business – right up there with staffing and real estate.
Every business is forced to face the same dilemma, more or less – you've got a massive amount of files, and you need a way both to curate them on a daily basis and keep them protected. Cyberattacks, natural disasters and employee errors are all a threat to your company's data. The best data management solutions are well equipped both for daily workflow and for disaster recovery.
Of course, the other part of this dilemma is a limited budget. You want to manage your data optimally without overspending on a massive IT infrastructure. For most companies, building an entirely new data center is a nonstarter – it requires far more time, real estate and labor than they can afford to invest.
That's why data center colocation has become such a popular option. It gives companies the chance to manage their data using a scalable, customizable infrastructure that's maintained by expert professionals, yet also reasonably priced for businesses of all sizes. Colocation solutions offer all the benefits of a full-blown data center without the prohibitive costs.
According to Data Center Knowledge, we're moving toward a future in which the vast majority of businesses embrace this IT model. Bill Kleyman, vice president of strategy and innovation at MTM Technologies in Connecticut, told the news source that while "cloud computing" is a popular buzzword, the reality is that information still needs to have an actual physical location. Colocation gives your data a home.
"In today's ever-changing IT environment, more emphasis is being placed on the data center," Kleyman explained. "In fact, almost all new technologies being deployed today require a place to reside. This location is the data center. It's no wonder that the modern data center is being referred to as 'the data center of everything.' In using advanced data center technologies, your organizations would literally have a secure slice of the cloud to manage and control."
Having said all of this, it's still very difficult to find the data center provider that's a perfect fit for your business. The way Kleyman sees it, the problem is largely one of physical resource allocation. Curating data for a sizable business isn't as easy as just building a facility and stashing files away – it requires a great number of key data center components. Here's a look at a few physical assets that are indispensable:
Capacity for high density
First and foremost, data centers should be built around high-density equipment. When you work with a colocation partner, your goal should be to fit as much computing into as tight a space as possible – that's what makes the real estate to house your files affordable.
Efficient cooling techniques
Another indicator of a cost-efficient data center is cooling. Having a high volume of data moving through your facility means a lot of heat is inevitable, so effective cooling and smooth airflow are vital for keeping your data safe. The best data centers have complete control of distinct hot and cold aisles.
Smart power management
To manage the flow of data in and out of a facility on a nonstop basis, you need a lot of power. Companies have to be energy-efficient, lest they go over budget – when you're running high-density servers around the clock, you can't let power go to waste.
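One common yardstick for how efficiently a facility uses power is power usage effectiveness (PUE): total facility power divided by the power that actually reaches the IT equipment. As a rough sketch (the kilowatt figures below are hypothetical, not drawn from any facility mentioned here):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power usage effectiveness: a ratio closer to 1.0 means less overhead."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Hypothetical facility: 1,500 kW total draw, 1,000 kW reaching the servers.
# The remaining 500 kW goes to cooling, lighting and power conversion losses.
print(pue(1500, 1000))
```

A provider that can quote a low PUE is one that isn't passing wasted electricity along to its customers.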
A whole lot of bandwidth
Constant communication between your office and your colocation provider puts a great deal of strain on your wide-area network (WAN). Naturally, WAN utilization eats up a lot of bandwidth. You have to be sure that your network can handle the traffic, because otherwise, you might have downtime on your hands.
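As a rough sanity check on whether a link can handle that traffic, you can estimate how long it would take to push a given volume of data to a colocation site. Every figure below (backup size, link speed, utilization cap) is an illustrative assumption:

```python
def transfer_hours(data_gb, link_mbps, utilization=0.7):
    """Hours to move data_gb over a WAN link, capped at a realistic utilization."""
    effective_mbps = link_mbps * utilization
    data_megabits = data_gb * 8 * 1000  # GB -> megabits (decimal units)
    return data_megabits / effective_mbps / 3600

# Hypothetical nightly backup: 500 GB over a 100 Mbps link at 70% utilization.
print(f"{transfer_hours(500, 100):.1f} hours")
```

If the answer comes back longer than your overnight window, the link is undersized and downtime risk follows.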
Airtight data security
Finally, you can't sleep easy unless your colocation provider offers the utmost security. This includes both cybersecurity measures to protect against hackers and physical security for the data facility itself. Every company that manages a great deal of data needs to know that its files are protected.
In 2015, companies are using more data than ever before, and they're concerned not just about sharing that data easily and using it efficiently, but also keeping it all secure. For this reason, they need to work with storage providers that can provide the utmost level of protection and, by extension, peace of mind.
While building a data center oneself is one way of tackling the challenge, there are serious drawbacks. This construction process is expensive, time-consuming and fraught with risks. It's also difficult to find the right scale – build a center too small, and you won't be able to house all your data, but too large, and you're letting resources go to waste.
The better solution may well be to work with a colocation provider. According to Data Center Knowledge, it's invaluable to have a provider that you can trust – entering into such a partnership can help you store your data, maintain security and also avoid wasteful allocation of your limited IT resources.
Bill Kleyman, vice president of strategy and innovation at MTM Technologies in Connecticut, told the news source that to establish a high level of trust with a colocation provider right off the bat, it's good to have a service level agreement that can establish the terms of the business partnership.
"When selecting the right colocation provider, creating or having a good SLA and establishing clear lines of demarcation are crucial," Kleyman explained. "Many times, an SLA can be developed based on the needs of the organization and what is being hosted within the data center infrastructure. This means identifying key workloads, applications, servers and more."
Once you have an agreement in place, you can hammer out expectations for uptime, issue resolution, response time and any other key criteria your business deems necessary.
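To make those uptime expectations concrete, it helps to translate an SLA's availability percentage into the downtime it actually permits. A quick back-of-the-envelope calculation (the SLA tiers below are common industry figures, not terms from any particular provider):

```python
def max_downtime_minutes(availability_pct, period_hours=365 * 24):
    """Maximum downtime (in minutes) an SLA permits over the given period."""
    downtime_fraction = 1 - availability_pct / 100
    return period_hours * 60 * downtime_fraction

# Common SLA tiers over one year:
for pct in (99.0, 99.9, 99.99):
    print(f"{pct}% uptime -> {max_downtime_minutes(pct):.0f} minutes/year")
```

The gap between "two nines" and "four nines" is the difference between days of outage a year and under an hour – which is why pinning the number down in the SLA matters.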
So as you enter into data center colocation with a new business partner, what specific skills and attributes are you looking for? The following is a good list of four key values you should bear in mind:
A good understanding of disaster recovery
Perhaps the most important reason to work with a colocation provider is to have a disaster recovery data center. If something goes wrong – be it a natural disaster, a cyberattack or an in-house technical malfunction – you want to be able to recover your data and get your business back up to speed with little downtime. Your provider should be able to help you achieve this.
A strong business impact analysis
You shouldn't wait for disaster to strike before you know the risks involved with cyberthreats – instead, you should do all the legwork beforehand. It's advisable to complete a comprehensive business impact analysis (BIA) – this means outlining the components of your operations that are most important to recover quickly. A good BIA can keep your company afloat.
The right supplies for effective data management
Managing your data effectively requires having the right supplies – this means not just physical assets like servers and cables, but also resources like diesel fuel for power generation and water for cooling. A good provider will maintain supplies both on-site and off, ensuring that you'll have everything you need even if a disaster limits your access.
Good, consistent communication
If you want to manage your data effectively and keep it safe on a daily basis, it's important to maintain regular communication with your colocation partner. Whenever trouble arises, you need to be confident that a trusted ally is just a phone call away, and that they know your business' needs backward and forward. Stay in touch, and keep your disaster recovery game plan ready to deploy at a moment's notice.
For anyone who curates large volumes of data – whether using a private data center, a cloud solution or a colocation provider – one of the greatest daily fears is downtime.
When your system goes down, your business goes out of commission. Losing access to your data disrupts every component of your daily operations – if you can't access internal workflow data, you can't complete regular in-house tasks as scheduled. If you can't find data about your customers, you won't be able to handle basic client interactions that are a fundamental part of what you do. Without data, you have nothing.
Therefore, it's vital for corporate IT leaders to look for data management solutions that keep downtime to an absolute minimum. You want a system that won't crash – and if it does, you want to have a comprehensive disaster recovery plan that will help you get back up and running in no time.
The only way to confirm that your system is downtime-proof is to test it, early and often. This is the case for those who manage their own data centers, and it rings just as true for companies that rely on data center colocation.
Testing helps to eliminate downtime
According to TechTarget, it's vital for companies to test their data storage infrastructures frequently and look for any abnormalities that may cause downtime in the future. Stephen Ford, who's the managing director at a data center testing company in the United Kingdom, told the news source that an integrated systems test (IST) is probably the way to go – this entails validating power sources, monitoring systems and all applications that manage data.
"It's the only opportunity you will have to test the full intensity of a facility," Ford said of the IST strategy.
Ford told TechTarget that while banks and government offices have been fairly diligent about testing their networks using ISTs, other verticals haven't shown the same level of attention to this issue. Some IT leaders are skipping the testing process or cutting corners.
"Some of the colo guys just go through things just to say they have done it rather than really test the system," Ford said.
Will other sectors start to show more vigilance about protecting their data? What's the future of the health care data center, for instance? Regular testing is vital for any industry that cares about protecting its information and keeping daily operations afloat.
Don't back down from business risk
The problem that many business leaders have with the IST approach is risk aversion. Typically, an IST entails shutting off your entire system and gauging what happens as a result. Some are afraid to do this. What if it causes massive disruptions? What if things won't turn back on afterward?
This is the type of fear-mongering you hear a lot in IT offices, at least. However, the reality is that not testing your system and risking downtime is far more dangerous than testing it.
"They only see it as testing that creates risk," Ford said. "Until something goes wrong, it is very difficult to convince them to get these done."
There's no sense in waiting until something goes wrong to test your system and look for potential causes of downtime. Experts have emphasized time and time again that testing is low-risk if it's planned, rehearsed and monitored closely. Michael Fluegeman, an engineer based in California, also told TechTarget that these tests are safer when performed at a low-traffic hour, like 3 a.m. on a Sunday.
When using colocation solutions, you must be confident that your system can stay afloat without any disruptions. Test early and test often, and that shouldn't be a problem.
In recent years, as the business world has evolved and companies have become more and more reliant on large stockpiles of data – both on their own internal operations and on their customers – there's been an increasing level of interest in creative solutions for managing all of that storage.
It's no longer feasible for employees merely to save all the files they need locally, on their own office machines – firstly, because they need the flexibility to access their data remotely and easily share it, and secondly, because as data continues to pile up, firms need more large-scale solutions for storage.
Building a data center is one option, but that's often not realistic, as the project requires a great deal of real estate and labor to complete. In this vein, data center colocation is becoming an increasingly popular option.
As companies go about the process of choosing colocation providers, they have myriad factors to consider.
Why colocation is the way to go
In this time of increasing technology needs and changing economic circumstances, colocation is a viable way of curating data in a cost-effective manner, according to Data Center Journal. Chris Alberding, vice president of product management at FairPoint Communications, told the news source that colocation merits consideration because it's a flexible solution to ever-changing data storage needs.
"Data center colocation services provide organizations with conditioned space without requiring them to invest capital for new construction," Alberding explained. "Data center colocation offers a pay-as-you-grow model that allows businesses to replace capex with a more manageable operational expenditure."
The size, location and specific needs of your business may change over time – and moreover, a disaster recovery scenario may arise that compels you to reconsider your IT infrastructure – but companies can find a colocation provider that meets their needs if they know to evaluate the right factors in advance.
The importance of your data's location
Location is an important consideration when choosing a colocation provider. Every business wants a provider that's close, but not too close. It's good to consider proximity to your company's location – wherever your data is located, you want your IT staff members to be able to access the facility when necessary.
At the same time, it's important to consider the geographic stability of your colocation center. Is the facility vulnerable to the same threats that your primary location is? The last thing you want is for an earthquake or tornado to strike both your office and your colocation center, putting your data in double danger.
Data security is a key priority
In the modern environment, where cyberattacks are always a threat, it's vital to go with a colocation provider that can offer maximum data security.
The ideal data center colocation solutions have multiple layers of data protection. Cybersecurity measures keep out hackers, of course, but physical protection matters just as much. The best providers have safeguards such as secure doors, keycard locks, alarm systems and staffed checkpoints, along with physical barriers such as fences and reinforced walls.
Evaluating various pricing models
Finally, price is obviously a key consideration when choosing a colocation provider, as it is with any major business decision.
The price of a colocation provider depends on many criteria. A major one is energy – it takes a lot of power to keep a data center up and running, and those expenses are often passed along to customers. Real estate, construction and labor costs are sure to be factored in as well.
Choosing a colocation solution for your business is a major step. It should not be taken lightly, nor without considering a variety of key factors well in advance.
When it comes to storing and managing their massive stockpiles of data, today's companies find themselves facing a serious challenge. They want to curate data optimally so that it's easy to access, share among employees and retrieve safely in the event of a disaster recovery scenario. However, they also need to think about conserving resources. They don't want to waste their employees' time or their companies' budgets.
For this reason, IT leaders are looking for alternatives to the process of building large, expensive data centers for housing all their files. Not only can these facilities cost millions to build, but accessing these data centers is also frequently an inefficient use of people's time. The challenge of today is to find a simpler, more streamlined way of managing data center services.
One such strategy that's become far more popular in recent years is data center virtualization. Rather than build huge facilities for their files, companies are looking to use reasonable facsimiles that are far cheaper and just as effective.
What is virtualization, anyway?
Rather than rely on large, bulky data centers for storing their coffers of data, companies have begun to explore using virtual resources instead. According to TechTarget, it's a way to circumvent the conventional method of working with a data center provider. There's no massive construction process required – instead, streamlined operating systems, servers and storage devices can do the heavy lifting.
Virtualization has been growing in popularity since approximately 2005, according to the news source. There are several different types of virtualized resources, including network virtualization and storage virtualization, but perhaps the most significant upgrade that companies can make is to embrace server virtualization. In other words, they would be masking their server resources from users and automating the process of managing storage space, all while increasing resource sharing among employees.
The virtualization debate evolves
IT leaders have been debating the viability of server virtualization for years, according to TechRepublic. Conner Forrest, an expert in startup and enterprise technology, told the news source that the debate has evolved as new tech innovations have made virtualization more viable.
"New virtualization technologies are arriving every day to improve the performance of virtualization in the data center and make it a better option than it once was," Forrest explained. "One of these innovations is that of the virtual storage area network. Virtualization tools, like the VSAN, help to increase flexibility in the data center, as well as helping with automation as well."
Business decision-makers are slowly coming around to the myriad benefits of data center virtualization – among them better network testing, easier data backups and faster redeployments of storage solutions. As these advantages become more pronounced, trust in virtualization around the business world is increasing.
The importance of reducing hardware costs
As companies look to cut IT costs wherever possible, one of the first budget line items they scrutinize is hardware. The prices of all those servers, cables and other storage devices add up quickly, leaving CEOs and CFOs wondering whether there's a more cost-effective way of coordinating data storage.
TechRepublic noted that reduced hardware costs are actually a primary reason that virtualization has become so prominent.
"Hardware is most often the highest cost in the data center," data center virtualization expert Jack Wallen pointed out. "Reduce the amount of hardware used and you reduce your cost. But the cost goes well beyond that of hardware – lack of downtime, easier maintenance, less electricity used. Over time, this all adds up to a significant cost savings."
Along the same lines, companies that are looking to reduce their hardware costs while still managing data effectively are increasingly considering the option of using colocation solutions. This is another way to eliminate all the bulky, unnecessary servers and still curate data without skipping a beat.
The health care sector is one that's endured a great deal of change in recent years. For one thing, health organizations are dealing with a sharply increased level of demand for their services. The aging of the baby boomer generation, along with the widespread availability of affordable health coverage under President Barack Obama, have created a climate in which seemingly everyone wants medical care.
Meanwhile, another trend is sweeping the health sector – an influx of new technology. Medical professionals are dealing with all sorts of data on the health of their patients – cloud data, mobile data, you name it. New devices and software solutions have emerged to revolutionize the way modern medicine operates.
This raises the question of what will happen to the health care data center in the near future. Health organizations have massive amounts of data on their hands these days, but what's the best strategy for curating it all? It's an open-ended question that, for the moment at least, has no clear answer.
Growth places strain on the industry
The health care sector is no doubt generating a great deal of additional revenue thanks to the trends of the last few years, but that added cash flow is not without its drawbacks. According to Data Center Knowledge, growth has actually placed a lot of strain on today's health organizations.
Todd Boucher, an expert in energy-efficient data center design strategies, told the news source that health firms have struggled to figure out how to corral all of the additional data they're now working with. Effectively managing data requires satisfying a large group of stakeholders that includes doctors, insurance providers, IT professionals and the corporate C-suite. The old data center model might not be sufficient anymore.
"By combining the increase in the number of departments that are affected by and leveraging information technology with the aging infrastructure of most hospital data centers, we can begin to understand the strain that is being placed on health care data centers today," Boucher explained. "The scale at which a hospital will need to increase data center capacity and availability has never been greater."
Today's health firms are dealing with a lot. They need to manage their data in a way that helps them comply with numerous federal regulations, most notably the Health Insurance Portability and Accountability Act. They also have to think about ease of use for their employees, who are increasingly managing their data in their own independent ways, through the use of such practices as bring your own device (BYOD).
Concerns about physical data structures
To manage large volumes of data in a way that better conforms to the changing regulatory landscape and the rise of the BYOD-powered employee, health firms would do well to move away from the old model of relying on big data centers.
According to Transparency Market Research, health organizations are under a great deal of pressure to deliver optimal care while also maintaining speed and low costs. For this reason, health IT executives are looking for more creative, dynamic ways of managing data.
This often means phasing out old strategies like building large data centers. Such projects are huge investments that require a great deal of time, money and real estate, and in today's landscape, they appear unnecessary.
The best strategy for health firms moving forward might be to rely on colocation solutions for managing data in a cost-effective way. There's no longer any need to build entire facilities and hire massive IT staffs – colocation enables health organizations of any size to oversee their own data in professionally managed facilities.
The financial services sector is one that's prone to a great deal of fluctuation. As the global economy ebbs and flows, so too does banking activity. When times are lean, the finance industry can expect their revenues to backslide. When activity picks back up, they'll be back in business.
This continual change in activity and revenue also translates to a fluid situation when it comes to data. When business is booming for financial firms, they have a great deal of information to keep track of – databases full of customers, records of numerous transactions and so on. When times are slow, there are fewer files to curate.
Therefore, in terms of technology, what companies need is a flexible approach to data center services. Sometimes they need a massive amount of storage space, while at other times, they need far less. The "one size fits all" approach to the financial data center might not work so well anymore.
The data center boom is over
The financial sector has gone through boom and bust cycles in recent years. After the nationwide fiscal crisis that struck in 2008, the economy gradually recovered, and banks weren't thriving again until the early 2010s. Once they were, they began investing in data centers.
According to InformationWeek, that surge in investment is over now. We've seen a couple of notable changes in the industry – one is that organizations' demand for computing power has leveled off, and another is that technology options have matured and become more flexible.
Gartner senior adviser Howard Rubin told the news source that financial firms are now reconsidering the old data center model.
"Things are changing in the corporate data center, especially in financial services," Rubin said. "There are a couple of big patterns. With the change in the business world and the decline in revenue in financial services, the equation for owning and running your own data center has changed. There was a time when you could predict the increasing demand for capacity, but not anymore."
Circa 2011, building big data centers was the thing to do. Now, the mindset is changing.
A time of transition
Financial firms are now rethinking their strategies. Today, corporate leaders are realizing that economic ebbs and flows are unpredictable, and they can't move forward with elaborate data center initiatives that they might not be able to afford in four years.
IT strategist Tony Bishop told InformationWeek that for this reason, some companies are considering transferring some of their data warehousing needs to third-party providers.
"We are in the beginning of a 10-year transition cycle," Bishop said. "We are seeing the traditional enterprise move to a digital enterprise."
Data center colocation is one strategy that can be cost-effective for companies that have been vulnerable during uncertain economic times. As their revenues shift dramatically, firms are looking for stable, reliable data storage providers that can adjust to their changing needs. Financial IT expert Roji Oommen believes this is necessary in a volatile economy.
"Profits and margins are down," Oommen told InformationWeek. "The overwhelming majority of financial firms are starting to outsource IT infrastructure. Today, it is very hard to differentiate from peers by investing in infrastructure."
Flexibility is a key priority
The problem with building large data centers for a financial firm is the lack of flexibility. When you construct a facility that's designed to fit an organization of a certain size, you have no recourse when your business fluctuates. You either have not enough space, or you have too much, which means you've allocated resources inefficiently.
"What happens when the business drops off 25 percent?" Rubin asked. "You need to retune your capacity because you don't need all of it. That is very hard to do. The entire business needs to be resized. If you go across every financial services firm, there is a need to fine-tune the data center."
One large-scale change that's happening at financial firms everywhere is the ongoing "cloudification" of data center services. Because cloud-based solutions allow for flexibility and scalability based on each business' specific needs, finance organizations can sleep easy, knowing they're headed toward a stable, cost-effective future.
For a long time now, financial firms have been among the world’s foremost curators of data. For every client they add and every transaction they make, there’s a paper trail, and new information is piling up on a daily basis. The best financial organizations work doggedly to save every scrap of information about their activities.
The problem, of course, is where to store all that info. For big companies with staggeringly large stockpiles of data to maintain, it can be difficult to keep it all saved locally on employees’ machines – and even if that is feasible, it ignores the challenge of saving backup copies for disaster recovery purposes. No, financial companies need a better solution than that. What they need is data centers.
Unfortunately, such facilities can be hard to come by. This is why today’s business leaders have gone off in search of more creative solutions – including data center colocation.
Finding real estate for data
Your data, like your employees or your equipment, is a business asset – you have to find space for it. And in the financial sector especially, such space tends to be limited, largely because many financial firms operate in New York, where real estate is at a premium.
According to Seeking Alpha, this leaves business leaders looking for other ways to house all that data. Building a new data center in that region of the country would be extremely difficult, as noted by Shally Bansal Stanley, managing director for global services at Acumen Solutions.
“There is no data center space in the New York and Northeast corridor,” Stanley told the news source. “It’s driven a lot of companies to change how they buy, build and manage their data centers.”
An industry-defining moment
Because there’s so little space for data, companies are trying to do more with less – for example, by using colocation to house their data more efficiently. Paolo Gorgo, founder and CEO of Nortia Research, told Seeking Alpha that we’re now at an inflection point in the financial sector, with providers lining up to help firms deploy their colocation plans.
“In spite of the distress recently experienced by many financial institutions, it looks like this is a key moment for the definition of who, among colocation providers, will gain a leading role in the sector and attract the major players within its data centers – no doubt the domino effect this event will generate might guarantee a great long term business to the winner.”
Why colocation is the answer
While housing data can be taxing both financially and in terms of space, there’s reason to believe colocation can solve the problem. With data center colocation, there’s no daunting construction project to be done – all that’s needed is to set up servers and cabling, plug and play.
Financial data center construction may not be feasible for many in today’s economy, but not to worry. There’s another way to confront the data curation challenges of the 21st century.
Increasingly in recent years, as businesses have grown in scope and begun to manage more and more data, there's been a lot of conversation about how to manage all those files that employees use on a daily basis. Simply saving them locally on people's office computers isn't enough – shared servers are necessary for easy collaboration, and data backup is important for disaster recovery scenarios.
The question for companies then becomes how best to store their data on servers that will be convenient and secure for everyone. Two main strategies have emerged. One is colocation, which lets companies house their own servers in a provider's facility with minimal IT staffing or resources; the other is cloud computing, which gives a business remote storage space to manage data online.
A binary debate often emerges between the two solutions – colocation or cloud? In reality, though, it doesn't have to be this way. There's a way to join the two together, enhancing today's colocation solutions through the use of cloud capabilities. This way, companies get the best of both worlds.
The "cloud versus colocation" debate
According to TechTarget, the choice between cloud computing and colocation is a source of much debate among corporate IT leaders. Both sides make compelling arguments. For example, Simon Withers, a product manager at SunGard Availability Services in Pennsylvania, told the news source that the cloud has become the new favorite of many.
"Cloud is the new colocation," Withers said. "The challenges are the same. The difference is that you're dealing with virtual assets versus physical ones."
Meanwhile, colocation remains a strong alternative. Bryan Chong, a sales representative at Digital Realty in California, told TechTarget that some are growing skeptical of the cloud as a sustainable business model.
"Applications that grew up in the cloud are now too expensive to live there," Chong argued. "It's cheaper to build out an owned 'base' data center and rent the 'burst' in the cloud."
Both sides raise good points. The real question, however, is what if you didn't have to choose?
Why it's time to "cloudify"
The merger between the two strategies is what's becoming known as a "cloudified" model for data center colocation. In other words, companies now have the option of taking the traditional colocation model and adding some cloud-based elements to it.
Forbes described this trend in a recent report. Michael Levy, a colocation product strategist at CenturyLink, told the news source that companies can benefit greatly from the managed hosting capabilities they get through colocation, but the easy scalability of the cloud can also be a key asset.
Under the current colocation model, for example, it's difficult for companies to add additional server space when they need it. This happens all the time – numerous businesses are currently going through growth phases, and accordingly, they need to add server space to account for all of the extra data they're taking on.
Levy pointed out that with traditional colocation, this is difficult. You need extra servers, which means picking up the phone, calling a salesperson and placing an order. It would be easier, of course, if you could simply swipe a credit card and purchase more cloud storage space. Effectively, you'd be combining colocation with a new "cloudified" element.
"What this is doing is taking the next-generation automation that cloud has brought to us – the ability to scale up and down using an online portal – and applying it to the more physical services, which aren't going anywhere," Levy said. "That's how you can fully cloudify."
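As a rough illustration of the "base plus burst" idea behind this hybrid model – a fixed colocation footprint handles steady demand, and cloud capacity is provisioned only for the overflow – consider the following sketch. All names and numbers here are hypothetical and are not drawn from any provider quoted above.

```python
# Hypothetical sketch of the "base plus burst" capacity model:
# a fixed colocation base absorbs steady load, and cloud units
# are provisioned automatically to cover any excess demand.

BASE_CAPACITY = 100  # fixed colocation servers (illustrative number)

def burst_needed(demand: int, base: int = BASE_CAPACITY) -> int:
    """Return how many cloud units must be provisioned beyond the base."""
    return max(0, demand - base)

def plan_capacity(demand_forecast: list[int]) -> list[int]:
    """For each forecast period, compute the cloud burst to provision."""
    return [burst_needed(d) for d in demand_forecast]

# When demand stays within the base, no cloud spend is needed;
# when it spikes, only the difference is rented from the cloud.
print(plan_capacity([90, 100, 150, 130]))  # bursts: [0, 0, 50, 30]
```

The point of the sketch is the asymmetry Levy describes: the physical base never shrinks or grows on short notice, while the cloud portion can be dialed up or down per period through an online portal rather than a sales call.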
Colocation isn't going anywhere. It's as strong as ever. But adding a cloud-based element to the process is a win-win proposition, and it's one that will become increasingly popular in the near future.