Leveraging Your Private PaaS for Feature Delivery

Improving the product and delivering value to users

The growth of cloud services for business has been a hot topic for years, but 2012 was the year when the cloud went from market hype to mainstream deployment. Most organizations have now adopted a private cloud of some kind, but caution is preventing them from taking full advantage of it. Exploring the potential benefits of new tools is vital if IT departments hope to see real performance gains.

Recent Gartner research highlights the importance of digital technologies for CIOs in the coming year. Gartner's Mark McDonald described the problem succinctly, "IT needs new tools if it hopes to hunt for technology-intensive innovation and harvest raised business performance from transformed IT infrastructure, operations and applications. Without change, CIOs and IT consign themselves to tending a garden of legacy assets and responsibilities."

What Is the Problem?
As it stands, we are seeing widespread adoption of private clouds that essentially act as virtualized infrastructure. That's undeniably useful, but it doesn't solve the underlying business problem, which is how to accelerate the delivery of features. Features come from the application. If we really want to leverage the power of the private cloud, then we should be working toward a setup that supports fast, cost-efficient and error-free changes to the application layer.

What we're talking about here is adding value by rolling out features internally and externally at a much quicker clip without jeopardizing the end quality. We need to go beyond infrastructure to private PaaS.

Why PaaS May Fall Short
There are quite a few PaaS solutions on the market, but many of them are not suitable for today's enterprise, for a number of reasons. The majority run fully or mainly in the public cloud, which immediately raises security concerns. They tend to support a very limited subset of middleware and database solutions, so integration is difficult. Interoperability has not been given enough weight, and that can lead to serious difficulties down the line. There is also a real lack of mobility for an application deployed via a typical PaaS service today, and you may find your business locked into a single cloud platform provider.

A start-up might see the value in adopting one of these PaaS solutions because it avoids capital expenditure at the outset, but what if your business already has a large datacenter? Many enterprises will want the option to use their existing setup, and they will naturally shy away from becoming reliant on a particular vendor environment.

What Is the Goal?
What we are really looking for is the ability to deliver the benefits of PaaS on top of your existing middleware environment. You need a solution that supports automated, efficient, error-free application updates. You need a system that supports auto-scaling of your runtime environment. And you need a system that can deliver end-to-end insight into your running applications and their configuration.

The aim is to free your business and your development team from the cost and complexity of managing the underlying hardware and software systems on which your applications are deployed. When a new feature request comes in or customer feedback takes development in a new direction, a private PaaS should enable you to deliver faster than ever before and with fewer errors. An automation interface that is accessible to the whole team is far more efficient than manual deployment processes, not to mention more cost-effective.
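To make that concrete, here is a minimal sketch of what such an automation interface might look like from the team's side. The endpoint, token handling, payload fields and deploymentId response are hypothetical placeholders rather than any specific vendor's API; the point is simply that a deployment becomes a single, scriptable call instead of a manual runbook.

    import requests  # third-party HTTP client; assumed to be available

    # Hypothetical endpoint and credentials for an internal PaaS API;
    # substitute whatever interface your platform actually exposes.
    PAAS_API = "https://paas.internal.example.com/api/v1"
    API_TOKEN = "replace-with-a-real-token"

    def deploy(app_name: str, version: str, package_url: str) -> str:
        """Request deployment of a packaged application version; return a job id."""
        response = requests.post(
            f"{PAAS_API}/applications/{app_name}/deployments",
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            json={"version": version, "package": package_url},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()["deploymentId"]

    if __name__ == "__main__":
        job = deploy(
            "billing-service",
            "1.4.2",
            "https://artifacts.example.com/billing-service-1.4.2.zip",
        )
        print(f"Deployment requested, job id: {job}")

Once deployments are reduced to a call like this, they can be triggered by build servers and scripts as easily as by people, which is what makes the automation described in the next section possible.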

Working Towards Automation
Once you have a private PaaS, it's important to ensure that new features are rolled out onto the platform automatically if you want to get the maximum benefit from the system. Integrating your development tooling and test suites enables your company to deliver frequent, automated updates of incremental feature improvements.
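As a rough illustration, the glue between the test suite and the platform can be as small as the following script, run by whatever build server the team already uses. The pytest invocation and the promotion step are assumptions standing in for your own tooling, not a prescribed pipeline.

    import subprocess
    import sys

    # Illustrative pipeline step, not a specific CI product's syntax:
    # run the test suite and, only if it passes, hand the build over to
    # the platform's deployment interface (e.g. the deploy() call
    # sketched earlier).

    def run_tests() -> bool:
        """Run the project's test suite; return True only on a clean pass."""
        result = subprocess.run(["pytest", "--quiet"], check=False)
        return result.returncode == 0

    def main() -> int:
        if not run_tests():
            print("Tests failed; not promoting this build to the platform.")
            return 1
        # In a real pipeline this would call the PaaS deployment API
        # with the freshly built artifact.
        print("Tests passed; triggering automated deployment.")
        return 0

    if __name__ == "__main__":
        sys.exit(main())

The design choice that matters here is the gate: nothing reaches the platform unless the automated checks pass, which is what keeps frequent releases from turning into frequent regressions.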

The benefits are obvious both internally and externally. Not only can you automatically scale and manage your existing functionality as required, but you can also add new functionality without fear of introducing errors. It's easy to get a clear overview of your complete application state every step of the way.

If you can roll out new features and updates in this way then your development team can remain focused on what's important - improving the product and delivering value to its users.

More Stories By Andrew Phillips

Andrew Phillips is Vice President, Product Management, XebiaLabs. XebiaLabs is a global provider of Application Release Automation and offers a Continuous Delivery platform to help enterprises improve and accelerate the application delivery process.


