Top Three Best Practices for Migrating to the Cloud

Planning your migration strategy

As an Infrastructure-as-a-Service provider, Bluelock sees a lot of application migration: from physical servers to the cloud, from private cloud to public cloud, and back from public cloud to private cloud.

Migration can be tricky, and a poor migration strategy can cause costly delays, data loss and other roadblocks on your way to successfully modernizing your infrastructure.

While each scenario is different, I'd like to identify three key best practices that will help your team create a solid, successful plan for migrating your application.

Even before you begin to move your application, a lot of best practice goes into choosing which application to migrate to the cloud. Whether you are migrating that app to a public cloud or a private cloud, you should assess the application's data gravity and its connectivity to other applications.

Best Practice: Understand the Gravity of Your Data
Data Gravity is a concept first discussed by Dave McCrory in 2010. It's the idea that data has weight: the bigger the data, the harder it is to move, and the more things stick to it.

McCrory states in his original blog post about Data Gravity, "As data accumulates (builds mass) there is a greater likelihood that additional Services and Applications will be attracted to this data."

McCrory goes on to explain that large data can be virtually impossible to move because of the latency and throughput issues that develop when you try. On his website, datagravity.org, he notes that to increase an application's portability, you should lower its data gravity.

When moving tier one applications from a physical datacenter to a private or public cloud, we have to take data gravity into account because it will impact the migration.

When you talk about migrating an application, you can think of the full stack of components as a single VM or a group of VMs packaged as a vApp (see Figure 1).

Think of a VM with an OS. If we were to migrate that entire VM to the public cloud, we'd be copying anywhere from 8-20 GB of OS data for no reason at all, since the cloud you're migrating the app to might already have that OS available.

Rather than transferring the OS data, whenever possible use metadata to describe which OS and configuration you want, referencing a template or an image that already exists on the public or private cloud side. The same metadata concept can be applied to middleware instances too.
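To make that concrete, here is a minimal sketch in Python of the metadata-first idea. The template names, middleware versions and the provision_from_metadata helper are all hypothetical; the point is that only a small descriptor travels to the target cloud, not 8-20 GB of OS bits.

```python
# Hypothetical descriptor: what we want, not the bits themselves.
vm_descriptor = {
    "name": "orders-app-01",
    "os_template": "ubuntu-22.04-lts",        # assumed to exist as a catalog template on the target cloud
    "cpu": 4,
    "memory_gb": 16,
    "middleware": [
        {"name": "mysql", "version": "8.0"},   # provisioned on the target side, not copied across
        {"name": "nginx", "version": "1.24"},
    ],
    "config": {"timezone": "UTC", "app_port": 8080},
}

def provision_from_metadata(descriptor: dict) -> None:
    """Hypothetical helper: ask the target cloud's catalog/API to build the VM
    from the descriptor, so only application data still needs to be migrated."""
    print(f"Requesting {descriptor['os_template']} with "
          f"{descriptor['cpu']} vCPU / {descriptor['memory_gb']} GB RAM")
    for mw in descriptor["middleware"]:
        print(f"Installing {mw['name']} {mw['version']} from the target-side repository")

provision_from_metadata(vm_descriptor)
```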

What we're left with is the application itself and its data. The application code is static, and static content is easy to move because you copy it once; there's no need to replicate it.

The most difficult part of the migration, however, is the data. There's no easy way to shrink it down, so you need to evaluate the weight of the data in the app you're considering migrating.

This matters especially if you're a high-transaction company, or if it's a high-transaction application, because that is a lot of data to replicate. The app's data constitutes 99% of the application's data gravity.

Part of understanding the gravity of your application is understanding the ramifications of moving a tier-one application with a large amount of data, and establishing where the best home for that application is.
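A quick way to put a number on that weight is a back-of-envelope estimate of how long the initial copy would take over your available bandwidth. Here is a minimal sketch, with illustrative figures for data size and link speed:

```python
def initial_copy_hours(data_gb: float, bandwidth_mbps: float, efficiency: float = 0.7) -> float:
    """Rough time to copy data_gb over a link of bandwidth_mbps,
    assuming only `efficiency` of the raw link speed is usable."""
    usable_mbps = bandwidth_mbps * efficiency
    seconds = (data_gb * 8 * 1024) / usable_mbps   # GB -> megabits, then divide by Mb/s
    return seconds / 3600

# Example: 2 TB of application data over a 200 Mbps link
print(f"{initial_copy_hours(2048, 200):.1f} hours")   # roughly 33 hours
```

If the answer comes back in days or weeks, the application's data gravity is telling you something.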

Another aspect to evaluate as part of your pre-migration plan is how connected your VM or vApp is to other apps.

If you have a lot of applications tightly coupled to the application you want to migrate, the cloud might not be an option for that application, or at least not for that application on its own.

Best Practice: How Connected Is Your App?
Beyond identifying which applications are connected to the app you want to migrate, the important aspect to evaluate is how coupled the application in question is to those others, and how tight or loose that coupling is.

Does your application have data that other applications need to access quickly? If so, a move-all-or-nothing philosophy is your best option.

If you have an application that is tightly coupled to two or three others, you may be able to move them all to the cloud together. Because they remain tightly coupled in the same environment, you won't experience the latency that would occur if your cloud-hosted application needed to reach back to a physical server for the data it needs to run.

Beyond identifying how many apps are tied to the application you wish to migrate, work next on identifying which of those applications will be sensitive to latency problems.

How sensitive they are should be a factor in whether you migrate the app at all.

To be able to check this best practice off your list, be very sure you understand everything your application touches so you won't be surprised later, post-migration.
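One lightweight way to stay on top of this is a simple dependency map that records which connections are latency-sensitive, so the set of apps that must move together falls out mechanically. A minimal sketch with hypothetical application names:

```python
# Hypothetical dependency map: app -> list of (dependency, latency_sensitive)
dependencies = {
    "orders-app": [("inventory-db", True), ("email-service", False), ("reporting", False)],
}

def migration_group(app: str) -> set[str]:
    """Return the set of apps that should move together: the app itself plus
    every dependency that is latency-sensitive (tightly coupled)."""
    group = {app}
    for dep, latency_sensitive in dependencies.get(app, []):
        if latency_sensitive:
            group.add(dep)
    return group

print(migration_group("orders-app"))   # {'orders-app', 'inventory-db'}
```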

The final part gets down to the nitty-gritty... choosing the correct migration strategy.

Best Practice: Pick Your Migration Strategy
Your best-fit migration strategy will be a function of the features of the application.

Option one is migrating just the data. This is typically the correct choice for tier 1 and tier 2 applications.

Let's say you are able to migrate your VM or vApp, but its data is constantly changing, and if it's a tier one application you may not be able to afford much downtime. Typically, you'll have to invoke some sort of replication.

Replication is an entirely separate subject, but when I think of replication, I think of the size of the data, the rate of change and the bandwidth between our source and target.

Without going into too many details of replication, let's assume you use a SQL database such as MySQL. You set up the new cloud with the OS and a MySQL instance provisioned from templates, and the two databases talk to each other and replicate the data until you're ready to cut over.
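Whatever replication mechanism you choose, the feasibility question is the same: can the link between source and target absorb the rate of change? A rough sketch of that check, with illustrative numbers:

```python
def replication_feasible(change_gb_per_day: float, bandwidth_mbps: float,
                         efficiency: float = 0.7) -> bool:
    """True if the link can absorb the daily rate of change with headroom to spare."""
    daily_capacity_gb = bandwidth_mbps * efficiency * 86400 / (8 * 1024)  # Mb/s -> GB per day
    return change_gb_per_day < daily_capacity_gb

# Example: 50 GB of changes per day over a 100 Mbps link
print(replication_feasible(50, 100))   # True: the link has plenty of headroom
```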

Option two for migrating your application is machine replication. This is best for tier 1 and tier 2 applications that can afford some downtime. It involves migrating the whole stack. There is less configuring in this scenario, but more data to migrate.

Option two is best if you're moving to an internal private cloud. You will be able to replicate the entire stack because you have plenty of bandwidth to move stuff around.

It's important to note the portability VMware provides: it allows you to package the entire VM or vApp, the entire stack, into an OVF. If you're already running on a virtualized physical server, that OVF can then be transported anywhere.
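As a rough illustration of that packaging step, the sketch below shells out to VMware's ovftool to export a VM to an OVF package. The vCenter address, credentials and inventory path are placeholders, and the exact locator syntax depends on your environment, so treat this as a sketch rather than a recipe.

```python
import subprocess

# Placeholder locator: vi://user@vcenter/datacenter/vm/<vm-name>
source = "vi://admin@vcenter.example.com/DC1/vm/orders-app-01"
target = "/exports/orders-app-01.ovf"

# ovftool ships with VMware products; basic usage is `ovftool <source> <target>`.
subprocess.run(["ovftool", source, target], check=True)
```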

Option three involves cold P2V migration. You typically see this for tier 2 and 3 apps that are not already virtualized.

The concept involves taking a physical server and virtualizing it. VMware Converter does P2V, and it's very easy to go from a physical server to a private cloud using it. P2V is, however, an entirely different set of best practices.

In option three, there is no replication. Those apps can also be shipped off to a public cloud provider to run in the public cloud after being virtualized.

A final path some companies take is to treat the migration as a Disaster Recovery (DR) scenario: set up replication from one machine to another, replicate the entire stack from point A to point B, and then click the failover button.
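Pulling the options together, the choice usually comes down to a handful of questions about the application. The sketch below is illustrative decision logic only, with thresholds you would adjust for your own environment:

```python
def choose_strategy(tier: int, virtualized: bool, downtime_ok: bool,
                    internal_private_cloud: bool) -> str:
    """Illustrative decision logic for the options described above."""
    if not virtualized:
        return "Option three: cold P2V migration (virtualize first, then move)"
    if tier <= 2 and not downtime_ok:
        return "Option one: migrate just the data, with replication until cutover"
    if downtime_ok and internal_private_cloud:
        return "Option two: machine replication of the whole stack (e.g. OVF export)"
    return "DR-style: replicate the entire stack, then fail over"

print(choose_strategy(tier=1, virtualized=True, downtime_ok=False,
                      internal_private_cloud=False))
```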

Each application, and each migration strategy, is unique, so there is no detailed instruction manual that works for everyone. The best strategy for some applications may be to stay put, especially if the first two steps of the pre-migration evaluation show the application is closely connected to others or especially weighty. To truly enjoy the benefits of cloud, you want the right application running there, one you can leverage to the fullest extent.

When planning your migration strategy, ask for help from those who are familiar with similar use cases, and plan and evaluate extensively. Doing so will save you the time, money and headaches that come from rushing into a migration without a strategy.

More Stories By Jake Robinson

Jake Robinson is a Solutions Architect at Bluelock. He is a VCP, a former CISSP, and a VMware vExpert. Jake's specialties are infrastructure automation, virtualization, cloud computing, and security.
