Docker + Stackato: The Perfect Workload Portability Solution

Looking to ease application development and deployment while retaining maximum flexibility in where you deploy?

If you work in technology, you'd have to have been living under a rock not to have heard about Docker. In a nutshell, Docker provides a lightweight container for code that can be installed onto a Linux system, providing both an execution environment for applications and partitioning that securely segregates sets of application code from one another. While this high-level description doesn't sound that exciting, Docker addresses three key issues confronting application developers:

  • Efficient resource use: One of the problems confronting IT organizations is how to get the most benefit from computing resources; this translates into raising server utilization so that a server's cost and power use actually go toward computing rather than keeping a machine running that performs no useful work. The previous solution to this issue was virtualization, which enabled a single server to support multiple virtual machines, each containing an operating system and a software payload. While virtualization helps address utilization, running multiple virtual machines, each with its own operating system, means a large share of the server's resources is tied up running operating systems rather than application code, which is where all the value resides. Said another way, the operating system is a necessary evil, but it's not where business value resides. A solution that reduces the proportion of the server's overall processing capacity devoted to running operating systems would be extremely valuable. Docker is that solution -- it requires only one operating system per server and uses containers to provide the segregated execution environment that individual virtual machines previously provided. My colleague Phil Whelan used the analogy of a server as a jar -- and choosing sand rather than marbles to fill it most efficiently; just so, containers are more efficient at optimizing overall server use and waste less computing capacity (i.e., leave less "wasted space in the jar") than virtualization.
  • Workload encapsulation: A container offers exactly what it sounds like -- an environment to hold something. In the case of Docker, it holds a set of executable code that runs inside the Docker container. This means the container encapsulates the execution code and can be transferred from one location to another. This simplifies the application lifecycle, as containers can be passed from one group to another with no need for each group to re-create the same application in a different environment through recompiling and repeated configuration. (A short sketch of this build-and-run workflow appears after this list.)
  • Workload portability: It's a fact of life that businesses use a variety of application deployment environments -- a single company may deploy applications into an on-premises VMware vSphere environment, a virtual private cloud run by an OpenStack-based provider, and Amazon Web Services. Each uses a different hypervisor and a different set of operational controls, which presents a challenge to organizations that want greater flexibility and choice in workload deployment. The previous vendor solution to this issue was OVF -- the Open Virtualization Format -- which promised workload portability but in practice ended up being a mechanism to transport proprietary virtual machine images along with operational metadata. This reduced the vision of true workload portability to vendor-constrained islands of technology homogeneity, which didn't really address end-user objectives at all. By contrast, Docker containers are easily transported and run in any hypervisor environment that supports Linux -- which is all of them. Docker is therefore a much better solution to workload portability and addresses a key user desire; a second sketch after this list shows how an image moves between environments. You'll hear much more about how Docker enables workload portability over the coming months and years.
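
To make the container model concrete, here is a minimal sketch of packaging and running an application with the standard Docker CLI. The base image, file names, and port are hypothetical, chosen purely for illustration:

cat > Dockerfile <<'EOF'
# Base image supplies the Linux userland; every container shares the host's kernel
FROM python:2.7
# Copy the (hypothetical) application code into the image
COPY app.py /srv/app.py
# Command the container runs when it starts
CMD ["python", "/srv/app.py"]
EOF

# Build an image from the Dockerfile, then run it as an isolated,
# resource-efficient process on the single shared host operating system
docker build -t myorg/myapp:1.0 .
docker run -d -p 8080:8080 --name myapp myorg/myapp:1.0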
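
And to make the portability point concrete: once built, the same image can be handed to another team or moved to a different environment without rebuilding anything, either through an image registry or as a plain archive. A minimal sketch, using a hypothetical registry host and the image name from the previous example:

# Option 1: publish the image to a registry, then pull and run it on any Linux host
# (a vSphere VM, an OpenStack instance, an AWS instance, or bare metal)
docker tag myorg/myapp:1.0 registry.example.com/myorg/myapp:1.0
docker push registry.example.com/myorg/myapp:1.0
docker pull registry.example.com/myorg/myapp:1.0    # on the destination host
docker run -d -p 8080:8080 registry.example.com/myorg/myapp:1.0

# Option 2: move the image as a tar archive, with no registry involved
docker save myorg/myapp:1.0 > myapp-1.0.tar
docker load < myapp-1.0.tar    # on the destination host, after copying the archive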

Given the advantages Docker offers, it's easy to understand why it has been so avidly embraced by the vendor and user community. It makes efficient use of server resources, encapsulates workloads, and provides far better workload portability.

On the other hand, Docker does not solve all application problems. In fact, its benefits expose a significant issue: if it's easier to run and distribute workloads, then efficient creation and management of application workloads is all the more important. And Docker does nothing to ease application creation and management -- it merely does a fantastic job of deploying workloads once they are created.

And application creation and management is where Stackato shines. Its Cloud Foundry-based framework accelerates application development and management by providing easy-to-use code deployment into Docker containers, as well as predefined, managed application data storage (i.e., databases). Moreover, Stackato makes it easy to grow and shrink the pool of Docker containers within which an application operates. A rough sketch of that developer experience follows.
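
The sketch below shows what pushing an application to a Stackato endpoint might look like with the Stackato client. The API endpoint, application name, and service name are hypothetical, the stackato.yml keys are abbreviated, and exact options vary by client version:

# stackato.yml describes the application and the data services it needs (names are illustrative)
cat > stackato.yml <<'EOF'
name: myapp
mem: 256M
services:
    myapp-db: postgresql
EOF

# Point the client at a Stackato API endpoint, authenticate, and push the code;
# Stackato stages the application and runs it inside Docker containers on the cluster,
# and the number of container instances can later be grown or shrunk as load changes
stackato target https://api.stackato.example.com
stackato login
stackato push -n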

For organizations looking to ease application development and deployment while retaining maximum flexibility in deployment location, combining Docker and Stackato is the perfect solution. In fact, ActiveState agrees with this so strongly that it integrates Docker into its Stackato product.

So if you're a company or IT organization looking to address the issue of workload portability, Docker and Stackato are a good place to start your search.

Source: ActiveState; originally published here.

About Bernard Golden

Bernard Golden has vast experience working with CIOs to incorporate new IT technologies and meet their business goals. Prior to joining ActiveState, he was Senior Director, Cloud Computing Enterprise Solutions, for Dell Enstratius. Before that, Bernard was CEO of HyperStratus, a Silicon Valley cloud computing consultancy that focused on application security, system architecture and design, TCO analysis, and project implementation. He is also the Cloud Computing Advisor for CIO Magazine, and his blog was named a "Top 50 Cloud Computing Blog" by SYS-CON Media. Bernard's writings on cloud computing have been published by The New York Times and the Harvard Business Review, and he is the author of Virtualization for Dummies and Amazon Web Services for Dummies and co-author of Creating the Infrastructure for Cloud Computing. Bernard has an MBA in Business and Finance from the University of California, Berkeley.
