Cloud Expo: Article

Leveraging Your Private PaaS for Feature Delivery

Improving the product and delivering value to users

The growth of cloud services for business has been a hot topic for years now, but 2012 was the year when the cloud went from market hype to mainstream deployment. Most organizations have now adopted a private cloud of some kind, but caution is preventing them from taking full advantage. Exploring the potential benefits of new tools is vital if IT departments hope to see real performance gains.

Recent Gartner research highlights the importance of digital technologies for CIOs in the coming year. Gartner's Mark McDonald described the problem succinctly, "IT needs new tools if it hopes to hunt for technology-intensive innovation and harvest raised business performance from transformed IT infrastructure, operations and applications. Without change, CIOs and IT consign themselves to tending a garden of legacy assets and responsibilities."

What Is the Problem?
As it stands, we are seeing widespread adoption of private clouds that essentially act as virtualized infrastructure. That's undeniably useful, but it doesn't solve the underlying business problem, which is how to accelerate the delivery of features. Features come from the application. If we really want to leverage the potential power of the private cloud, we should be working toward a setup that supports fast, cost-efficient and error-free changes to the application layer.

What we're talking about here is adding value by rolling out features internally and externally at a much quicker clip without jeopardizing the end quality. We need to go beyond infrastructure to private PaaS.

Why PaaS May Fall Short
There are quite a few PaaS solutions on the market, but many of them are not suitable for today's enterprise, and there are a number of reasons for that. The majority run fully or mainly in the public cloud, which immediately raises security concerns. They tend to support a very limited subset of middleware and database solutions, so integration is difficult. Interoperability has not been given enough weight, and that can lead to serious difficulties down the line. There's also a real lack of mobility for an application deployed via a typical PaaS service, so you may find your business locked into a cloud service platform provider.

A start-up might see the value in adopting one of these PaaS solutions because it allows them to avoid capital expenditure at the outset, but what if your business already has a large datacenter? Many enterprises will want the option to use their existing setup and they'll naturally shy away from becoming reliant on a particular vendor environment.

What Is the Goal?
What we are really looking for here is the ability to deliver the benefits of PaaS with your existing middleware environment. You need a solution that supports automated, efficient, error-free application updates. You need a system that supports auto-scaling of your runtime environment. And you need a system that delivers end-to-end insight into your running applications and their configuration.
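As an illustration, the auto-scaling requirement above often boils down to a small decision loop. The following sketch shows one such policy; the `PoolMetrics` shape, thresholds and instance bounds are purely illustrative and not taken from any particular PaaS product:

```python
# Sketch of an auto-scaling policy for a private PaaS runtime pool.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class PoolMetrics:
    instances: int          # currently running application instances
    avg_cpu_percent: float  # average CPU utilization across the pool


def scaling_decision(metrics: PoolMetrics,
                     min_instances: int = 2,
                     max_instances: int = 10,
                     scale_up_at: float = 75.0,
                     scale_down_at: float = 25.0) -> int:
    """Return the desired instance count for the next interval."""
    desired = metrics.instances
    if metrics.avg_cpu_percent > scale_up_at:
        desired += 1
    elif metrics.avg_cpu_percent < scale_down_at:
        desired -= 1
    # Clamp to configured bounds so the pool never scales to zero
    # or beyond the capacity the platform has reserved.
    return max(min_instances, min(max_instances, desired))
```

A real platform would feed this from its monitoring pipeline and act on the result, but the core idea is the same: the scaling rule is explicit, testable code rather than a manual capacity decision.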

The aim is to free your business and your development team from the cost and complexity of managing the underlying hardware and software systems that allow you to deploy your applications. When a new feature request comes in or customer feedback leads development in a new direction, private PaaS should enable you to deliver faster than ever before and with fewer errors. An automation interface that is accessible to the whole team is far more efficient, not to mention more cost-effective.
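To make the "fewer errors" point concrete, here is a minimal sketch of the kind of deploy-with-rollback step such an automation interface might expose. `apply_version` and `health_check` are hypothetical stand-ins for whatever deployment and monitoring calls your platform actually provides:

```python
# Sketch of a deploy step that automatically rolls back a bad release.
# apply_version and health_check are hypothetical platform callbacks.

from typing import Callable


def deploy_with_rollback(apply_version: Callable[[str], None],
                         health_check: Callable[[str], bool],
                         new_version: str,
                         current_version: str) -> str:
    """Apply new_version; restore current_version if the health check fails.

    Returns the version left running after the operation.
    """
    apply_version(new_version)
    if health_check(new_version):
        return new_version
    # Roll back so a failed release never stays live.
    apply_version(current_version)
    return current_version
```

Because the rollback path is part of the same automated step, a failed release is corrected without waiting for a human to notice and intervene.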

Working Towards Automation
Once you have a private PaaS, it's important to ensure that your new features are rolled out onto the platform automatically if you want to get the maximum benefit from the system. Integrating your development tooling and test suites will enable your company to ship frequent, incremental feature improvements automatically.
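A test-then-deploy gate of this kind can be sketched in a few lines. The `deploy` function below is a placeholder for your platform's real deployment API or CLI, and the test runner is injectable so the gate itself stays testable; the point is only that a rollout is blocked unless the test suite passes:

```python
# Sketch of a "tests must pass before deploy" gate tying build tooling
# to a private PaaS. The runner and deployer are placeholder callables.

import subprocess
import sys
from typing import Callable


def run_tests() -> bool:
    """Run the project's test suite; rollout proceeds only on success."""
    result = subprocess.run([sys.executable, "-m", "pytest", "-q"])
    return result.returncode == 0


def deploy(package: str, environment: str) -> None:
    """Placeholder for the platform's deployment call (CLI or REST API)."""
    print(f"deploying {package} to {environment}")


def release(package: str,
            test_runner: Callable[[], bool] = run_tests,
            deployer: Callable[[str, str], None] = deploy) -> bool:
    """Deploy the package only if the test suite passes."""
    if not test_runner():
        return False
    deployer(package, "production")
    return True
```

Hooked into a CI job that fires on every commit, this is the basic shape of the automated, incremental rollout described above.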

The benefits are obvious both internally and externally. Not only can you automatically scale and manage your existing functionality as required, but you can also add new functionality without fear of introducing errors. It's easy to get a clear overview of your complete application state every step of the way.

If you can roll out new features and updates in this way then your development team can remain focused on what's important - improving the product and delivering value to its users.

More Stories By Andrew Phillips

Andrew Phillips heads up product management at XebiaLabs. He is an evangelist and thought leader in the DevOps, Cloud and Continuous Delivery space. He sits on the management team and drives product direction, positioning and planning.

