Weighing the Options for Onboarding Data into the Cloud

One of the questions we hear most frequently is “how do I get my data into the cloud?” For many organizations, the benefits of expanding on-premise data storage to include hybrid cloud storage have begun to resonate, but they struggle to get started as they determine how to move data into the cloud. The decision on how to onboard initial data to the cloud, or what we call the initial ingest, is one that cannot be overlooked.

While there is more than one way to perform the initial ingest, it shouldn’t be a surprise that the best approach varies from case to case. Relevant factors influencing the decision include the amount of data intended for ingestion, the amount of available bandwidth, and the timeframe in which you want to load the data. Most organizations decide on one of the following three methods for the initial ingest:

  • Use existing bandwidth to perform the transfer over time
  • Increase or “burst” bandwidth for the duration of the transfer
  • Ship media directly to a cloud provider

Use existing bandwidth
Calculating how long it takes to upload a large amount of data across a WAN involves a bit of straightforward arithmetic. For instance, an uplink speed of 100 Mbit/sec can push roughly 1 TB per day (100 Mbit/sec × 86,400 seconds/day ÷ 8 bits/byte ≈ 1.08 TB).

While this approach sounds cut and dried, in practice organizations need to consider a few additional factors (illustrated in the sketch after this list):

  • Subtract typical WAN usage to more accurately calculate available bandwidth
  • Employ bandwidth throttling and scheduling to minimize impact on existing applications
  • Cache/buffer the data so users and applications can continue to access it during the ingest process – sometimes starting with a large buffer and shrinking it over time
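
To make the arithmetic concrete, here is a minimal sketch of the estimate. The 50 TB workload and the assumption that half the link remains usable after typical WAN usage are illustrative numbers, not measurements from any particular deployment:

```python
# Rough estimate of initial-ingest time over an existing WAN link.

def ingest_days(data_tb: float, link_mbps: float, usable_fraction: float = 0.5) -> float:
    """Days to push data_tb terabytes over a link_mbps uplink, assuming
    only usable_fraction of the link is free once typical WAN usage
    and throttling are accounted for."""
    bits_total = data_tb * 1e12 * 8                            # decimal TB -> bits
    bits_per_day = link_mbps * 1e6 * usable_fraction * 86_400  # usable bits/day
    return bits_total / bits_per_day

# 50 TB over a 100 Mbit/sec uplink with half the link available:
print(f"{ingest_days(50, 100):.0f} days")                      # ~93 days
```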

Temporarily increase bandwidth
For circumstances where existing bandwidth will not onboard data into the cloud in a timely manner, another option is to temporarily increase bandwidth during the upload process. Some telcos and internet providers offer bursting capability for short durations lasting weeks or months. Once the ingest completes, bandwidth can be restored to its previous level to accommodate the normal course of data accesses and updates.
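
Inverting the earlier calculation gives a rough sense of how large a burst to ask for. Again a hedged sketch with illustrative numbers (a 50 TB ingest, a 14-day window, and an assumed 80% of the burst link actually achievable):

```python
# Solve for the sustained burst bandwidth needed to finish by a deadline.

def required_mbps(data_tb: float, deadline_days: float, usable_fraction: float = 0.8) -> float:
    """Link speed (Mbit/sec) needed to move data_tb terabytes within
    deadline_days, assuming usable_fraction of the link is achievable."""
    bits_total = data_tb * 1e12 * 8
    usable_seconds = deadline_days * 86_400 * usable_fraction
    return bits_total / usable_seconds / 1e6

# 50 TB in 14 days:
print(f"{required_mbps(50, 14):.0f} Mbit/sec")   # ~413 Mbit/sec
```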

An alternative to increasing bandwidth is using a temporary colocation or data center facility that has higher-bandwidth access to the cloud provider. This adds costs for transportation, equipment setup and leasing, but may offer a cost-effective compromise.

Physically ship media
Ultimately, if data cannot be onboarded in a timely manner via the network (let’s say it’s a few PB in size), shipping physical media to a cloud provider is the next option. While this option may seem easy, it’s important not to ignore best practices when physically shipping media.
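
The scale argument is easy to check: even a fully dedicated gigabit link takes about six months to move a couple of petabytes. A quick sketch (the 2 PB figure is an illustrative assumption):

```python
# Why PB-scale ingest pushes toward shipping: network time for 2 PB
# over a fully dedicated 1 Gbit/sec link with no contention.
data_bits = 2e15 * 8                    # 2 PB (decimal) in bits
link_bps = 1e9                          # 1 Gbit/sec
days = data_bits / link_bps / 86_400
print(f"{days:.0f} days")               # ~185 days over the wire
```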

Many organizations have already adopted a “zero trust” model for data stored in the cloud, meaning all data is encrypted with a set of keys maintained locally; data being transported requires similar safeguards.
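
In code, a zero-trust shipment boils down to encrypting locally and keeping the key on premises. Here is a minimal sketch using the third-party Python `cryptography` package; the file names are hypothetical, and a real pipeline would stream data in chunks rather than read it all into memory:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # generated and kept on premises
aesgcm = AESGCM(key)

with open("dataset.bin", "rb") as f:        # hypothetical source data
    plaintext = f.read()

nonce = os.urandom(12)                      # must be unique per encryption
ciphertext = aesgcm.encrypt(nonce, plaintext, None)

# Only the nonce and ciphertext are written to the media that ships;
# the key never leaves the building.
with open("dataset.enc", "wb") as f:
    f.write(nonce + ciphertext)
```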

This week, TwinStrata announced the latest release of CloudArray, which includes a secure import process that encrypts and encapsulates data into the object format in which it will be stored in the cloud, prior to shipping. Following the same security practice used for storing data online in the cloud eliminates compromises that could lead to data breaches.

The bottom line
While there are benefits to expanding on-premise storage infrastructure with a secure, hybrid cloud strategy, the starting point often involves answering the question of how to get the initial data there. Choosing the right option can satisfy the need for timeliness while mitigating risks around security and disruption.

More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
