Weighing the Options for Onboarding Data into the Cloud

One of the questions we hear most frequently is “how do I get my data into the cloud?” For many organizations, the benefits of expanding on-premises storage into a hybrid cloud model have begun to resonate, but they struggle to get started because they are unsure how to move their data into the cloud. The decision on how to onboard that initial data, what we call the initial ingest, is one that cannot be overlooked.

While there is more than one way to perform the initial ingest, it shouldn’t be a surprise that the best solution varies from case to case. Relevant factors include the amount of data to be ingested, the available bandwidth, and the timeframe in which you want the data loaded. Most organizations choose one of the following three methods for the initial ingest:

  • Use existing bandwidth to perform the transfer over time
  • Increase or “burst” bandwidth for the duration of the transfer
  • Ship media directly to a cloud provider

Use existing bandwidth
Calculating how long it takes to upload a large amount of data across a WAN involves a bit of straightforward arithmetic. For instance, an uplink speed of 100 Mbit/sec can push roughly 1 TB per day at full utilization.
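That arithmetic is easy to capture in a few lines of code. The sketch below is illustrative only: the data sizes, the 100 Mbit/sec link, and the efficiency factor (discounting protocol overhead and competing traffic) are assumptions chosen for the example, not measurements.

```python
# Rough estimate of how long an initial ingest takes over a WAN link.
# All inputs are illustrative assumptions, not measurements.

def ingest_days(data_tb: float, link_mbps: float, efficiency: float = 0.8) -> float:
    """Days needed to push `data_tb` terabytes over a `link_mbps` link.

    `efficiency` discounts protocol overhead and competing WAN traffic.
    """
    data_bits = data_tb * 1e12 * 8              # terabytes -> bits
    usable_bps = link_mbps * 1e6 * efficiency   # megabits/s -> usable bits/s
    seconds = data_bits / usable_bps
    return seconds / 86400                      # seconds -> days

if __name__ == "__main__":
    for tb in (1, 10, 100):
        print(f"{tb:>4} TB over 100 Mbit/s: ~{ingest_days(tb, 100):.1f} days")
```

At 100 Mbit/sec and 80% efficiency, 1 TB takes a little over a day, 10 TB about 12 days, and 100 TB close to four months, which is why the next two options exist.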

While this approach sounds cut and dried, in practice organizations need to consider a few additional factors:

  • Subtract typical WAN usage to calculate the bandwidth actually available for the transfer
  • Employ bandwidth throttling and scheduling to minimize the impact on existing applications (see the sketch after this list)
  • Cache or buffer the data locally so applications can continue to access it during the ingest – sometimes starting with a large buffer and shrinking it over time
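To make the throttling and scheduling point concrete, here is a minimal sketch of a paced upload loop. It is not any particular product’s scheduler; `upload_chunk` is a hypothetical stand-in for whatever transfer call the gateway or tool actually provides, and the business-hours window and rate caps are assumptions.

```python
import time

def upload_chunk(chunk: bytes) -> None:
    """Placeholder for the real transfer call (S3 multipart upload, rsync, etc.)."""
    pass

def throttled_upload(chunks, business_cap_mbps: float, off_peak_cap_mbps: float,
                     business_hours=(8, 18)):
    """Push chunks sequentially while capping average throughput.

    A lower cap applies during business hours so the ingest does not starve
    production traffic; the cap is relaxed outside those hours.
    """
    for chunk in chunks:
        hour = time.localtime().tm_hour
        in_business = business_hours[0] <= hour < business_hours[1]
        cap_mbps = business_cap_mbps if in_business else off_peak_cap_mbps

        start = time.monotonic()
        upload_chunk(chunk)
        elapsed = time.monotonic() - start

        # Pace the loop so the average rate stays at or below the cap.
        min_duration = (len(chunk) * 8) / (cap_mbps * 1e6)
        if elapsed < min_duration:
            time.sleep(min_duration - elapsed)

if __name__ == "__main__":
    data = (b"\0" * 1_000_000 for _ in range(5))   # five 1 MB dummy chunks
    throttled_upload(data, business_cap_mbps=20, off_peak_cap_mbps=80)
```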

Temporarily increase bandwidth
For circumstances where existing bandwidth cannot onboard data to the cloud in a timely manner, another option is to temporarily increase bandwidth during the upload. Some telcos and internet providers offer bursting capability for short durations of weeks or months. Once the ingest completes, bandwidth can be scaled back to accommodate the normal course of data accesses and updates.

An alternative to increasing bandwidth is using a temporary colocation or data center facility that has higher-bandwidth access to the cloud provider. This adds costs for transportation, equipment setup and leasing, but may offer a cost-effective compromise.

Physically ship media
Ultimately, if data cannot be onboarded in a timely manner over the network (let’s say it’s a few PB in size), shipping physical media to a cloud provider is the next option. While this option may seem deceptively easy, it’s important not to ignore best practices when physically shipping media.

Whereas many organizations have already adopted a “zero trust” model for their data stored in the cloud (meaning all data is encrypted with a set of keys maintained locally), data in transit on physical media requires the same safeguards.
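As a rough illustration of that principle, the sketch below encrypts data with a key that never leaves the local site before the data is written to media for shipment. It is a generic example built on the third-party Python `cryptography` package; the function names and key handling are assumptions for illustration, not CloudArray’s actual import process.

```python
# Sketch of a "zero trust" export: data is sealed with a locally held key
# before it leaves the premises on physical media.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_shipment(plaintext: bytes, key: bytes) -> bytes:
    """AES-256-GCM encrypt an object; prepend the nonce to the ciphertext."""
    nonce = os.urandom(12)                     # unique per object
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext

def decrypt_after_ingest(blob: bytes, key: bytes) -> bytes:
    """Reverse of encrypt_for_shipment, run only where the local key is available."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

if __name__ == "__main__":
    local_key = AESGCM.generate_key(bit_length=256)   # key stays on premises
    sealed = encrypt_for_shipment(b"example object payload", local_key)
    assert decrypt_after_ingest(sealed, local_key) == b"example object payload"
```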

This week, TwinStrata announced the latest release of CloudArray, which includes a secure import process that encrypts and encapsulates data into the same object format used in the cloud before the media is shipped. Applying the same security practice used for storing data online in the cloud eliminates the compromises that could lead to data breaches.

The bottom line
While there are benefits to expanding on-premises storage infrastructure with a secure, hybrid cloud strategy, the starting point often involves answering the question of how to get the initial data there. Choosing the right option satisfies the need for timeliness while mitigating the risks of security exposure and disruption.

More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
