Application Performance Doesn’t Have to Be a Cloud Detractor

The transition to the cloud offers IT teams an excellent opportunity to emerge as protectors of their organizations

Introduction: The Cloud as a Point of IT and Business Disconnect
For years, the benefits of moving to the cloud - including lower costs, greater flexibility and faster time-to-market - have been touted. But among IT professionals, there remains widespread reticence about migrating mission-critical applications, due in large part to performance concerns. "Can the cloud really deliver the same level of speed and reliability as physical servers?" IT asks. Business managers often downplay these worries, arguing that the potential business benefits are simply too significant to ignore.

Therein lies a major conflict. Regardless, as numerous surveys like this one from IDC demonstrate, cloud adoption is moving forward at a rapid pace. Clearly, the cloud is where we're headed. IT must learn to accept the cloud as just part of how services are delivered today, rather than some exotic and potentially dangerous new technology. The good news is IT's concerns about application performance are not insurmountable and can actually be eased with specific approaches.

This article will discuss factors leading to IT's wariness of the cloud. It will also highlight recent survey findings that show just how concerned IT professionals actually are, even as organizations move to the cloud en masse. Finally, the article offers several recommendations to help assuage IT's concerns and minimize risks as cloud adoption continues at a rapid clip.

Why Is IT Uneasy?
It would seem that cloud computing and high performance should go hand-in-hand. Theoretically, performance for cloud-based applications should match that of applications hosted on physical servers, assuming the configuration is right. However, in the real world, many factors can impact the performance of cloud-based applications, including limited bandwidth, disk space, memory and CPU cycles. Cloud customers often have little visibility into their cloud service providers' capacity management decisions, leaving IT professionals with a sense of distrust.

IT's concerns are exacerbated by the fact that when major cloud services do fail, they tend to fail spectacularly, and the media flock to the news, further undermining IT's confidence. As an example, this past summer Apple's iCloud service, which connects iPhones, iPads and other Apple devices to key services, went down for more than six hours, capturing headlines around the world. While Apple claimed that the outage impacted less than one percent of iCloud customers, the sheer size of its user base - some 300 million users - meant that approximately three million people were cut off from services for the duration of the outage. Shortly thereafter, an Amazon EC2 outage took down Instagram, Vine, Netflix and several other major customers of the service, inflicting unplanned downtime across all of them and igniting a frenzy of negative press.

In an attempt to ease IT's worries, several cloud service providers have begun offering "premium" performance features along with cloud instances. For example, Amazon EC2 now offers its customers provisioned IOPS (input/output operations per second) to guarantee a consistent level of disk performance. Other cloud service providers are also marketing ways to configure their platforms for different performance thresholds. The challenge is that few companies can afford premium features for every cloud-based node and service. Without visibility into end-user performance on the other side of the cloud, it is nearly impossible to identify poor end-user experiences, never mind determine where these premium features could be applied for maximum ROI.

Finally, an awareness of their growing - and often precarious - reliance on cloud services further forces IT to face their own vulnerability. Enterprise use of cloud technology grew 90 percent between early 2012 and mid 2013, according to Verizon's recent "State of the Enterprise Cloud Report." Another important trend worth noting is that businesses are hosting less and less of what gets delivered on their websites. Instead, they're relying on a growing number of externally hosted (third-party) web elements to enrich their web properties, such as ad servers and social media plug-ins. This often results in a company becoming a cloud customer indirectly, without their even knowing it.

Recent Survey Results Demonstrate IT's Wariness
Recently, Research in Action (on behalf of Compuware) conducted a survey of 468 CIOs and other senior IT professionals from around the world, which determined cloud computing to be the top IT investment priority. No surprises there, as clearly these professionals are being driven by the promised benefits of greater agility, flexibility and time-to-value.

What is surprising is the fact that 79 percent of these professionals expressed concern over the hidden costs of cloud computing, with poor end-user experience resonating as the biggest management worry. According to the survey, here are the four leading concerns with cloud migration:

  • Performance Bottlenecks (64%): Respondents worry that cloud-hosted resources and e-commerce applications will suffer bottlenecks when application usage is heavy.
  • Poor End-User Experience (64%): End users may be left dissatisfied with cloud performance, particularly during periods of heavy application traffic.
  • Reduced Brand Perception (51%): Customer loyalty may be significantly eroded by poor experiences and poor cloud performance.
  • Loss of Revenue (44%): Companies may lose revenue as a result of poor performance, reduced availability or slow technical troubleshooting.

Ironically, these responses come at a time when the cloud is increasingly being used to support mission-critical applications like e-commerce. More than 80 percent of the professionals surveyed are either already using cloud-based e-commerce platforms or are planning to do so within the next year. It's evident that even as cloud adoption marches forward, a layer of trepidation remains, at least among IT staffs.

Business managers believe the efficiency benefits of the cloud are simply too significant to ignore. But IT's primary concern - application performance - is also mission-critical, and perhaps more visceral and tangible. After all, a major service outage is a blatant, clear-cut event, while efficiency gains or losses are often more subtle and less quantifiable. Ultimately, it's IT that takes the blame when business services don't work as planned.

It used to be that issues like security and cost dominated the list of cloud concerns. But application performance is increasingly making headway as users grow more demanding. For the average user, 0.1 seconds is an instantaneous, acceptable response, similar to what they experience with a Google search. As response times increase, interactions begin to slow and dissatisfaction rises. The impact of a slowdown can be devastating: Amazon has calculated that a page load slowdown of just one second could cost it $1.6 billion in sales each year. In addition, Google found that slowing search response times by just four-tenths of a second would reduce the number of searches by eight million per day - a sizeable amount.

Getting the Performance You Need from the Cloud
As more and more companies begin or extend their journey to the cloud, there are things IT can do to increase their comfort level. These include:

1. Don't Be Afraid to Experiment: Cloud computing offers businesses the opportunity to leverage computing resources they might not otherwise have the expertise or wherewithal to employ. But it can be intimidating to move critical operations out of one's own hands. That's where free trials come in. A number of cloud computing vendors offer free test runs that let companies figure out how cloud services would meld with their current operations.

Getting the most out of a trial period takes some planning and effort, and this includes making certain to measure the cloud service provider's performance. Unfortunately, most cloud service providers today don't measure and provide performance statistics as part of these trial periods, so it's incumbent upon prospective customers to do so. It's often best to experiment in the cloud with a non-critical system, such as a sales support application that doesn't have a huge impact on customers, should performance degrade. Organizations should also be sure to measure performance for as broad a cross-section of users as possible.
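
As a concrete illustration of what measuring a provider's performance during a trial can look like, here is a minimal sketch in Python. It is not any vendor's tooling, and the TRIAL_URL endpoint, sample count and timing thresholds are assumptions to be replaced with the details of your own trial deployment.

```python
import statistics
import time
import urllib.request

# Hypothetical trial endpoint -- replace with the application you have
# deployed on the provider's trial instance.
TRIAL_URL = "https://trial-app.example.com/health"
SAMPLES = 50


def measure_response_times(url, samples):
    """Time a series of simple HTTP requests against the trial deployment."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                response.read()
            timings.append(time.perf_counter() - start)
        except OSError:
            # Failed requests matter as much as slow ones; record them as misses.
            timings.append(None)
        time.sleep(1)  # space samples out rather than hammering the trial instance
    return timings


if __name__ == "__main__":
    results = measure_response_times(TRIAL_URL, SAMPLES)
    successes = sorted(t for t in results if t is not None)
    failures = len(results) - len(successes)
    print(f"requests: {len(results)}, failures: {failures}")
    if successes:
        p95 = successes[int(0.95 * (len(successes) - 1))]
        print(f"median response time: {statistics.median(successes):.3f}s")
        print(f"95th percentile:      {p95:.3f}s")
```

Running the same script from several office locations, and at different times of day, is one simple way to capture the broad cross-section of users the trial measurement needs.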

2. Insist on Performance-Focused SLAs: Inherent cloud attributes like on-demand resource provisioning and scalability are designed to increase confidence in the usability of applications and data hosted in the cloud. But a common mistake is to interpret availability guarantees as performance guarantees in a cloud computing environment. Availability shows that a cloud service provider's servers are up and running - but that's about it. Service-level agreements (SLAs) based on availability say nothing about the user experience, which can be significantly impacted by the cloud - for example, when an organization's "neighbor" in the cloud experiences an unexpected spike in traffic. Yet, despite the mission-critical nature of many cloud applications, our survey found that 73 percent of companies are still using outdated methods like availability measurements to track and manage application performance.
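
The gap between an availability figure and what users actually experience is easy to see with a little arithmetic. The sketch below, in Python and using an invented request log rather than any real provider's data, computes both numbers from the same records: the service can be "up" 100 percent of the time while half of the requests still miss an assumed one-second response target.

```python
# Illustrative only: a hypothetical request log of (succeeded, response_time_seconds).
request_log = [
    (True, 0.25), (True, 0.30), (True, 2.80), (True, 3.10),
    (True, 0.45), (True, 2.95), (True, 0.20), (True, 3.40),
    (True, 0.35), (True, 2.60),
]

SLA_RESPONSE_TARGET = 1.0  # assumed contractual target: respond within 1 second

total = len(request_log)
served = sum(1 for ok, _ in request_log if ok)
fast_enough = sum(1 for ok, t in request_log if ok and t <= SLA_RESPONSE_TARGET)

availability = 100.0 * served / total         # the figure an uptime SLA reports
performance = 100.0 * fast_enough / total     # the figure users actually feel

print(f"availability: {availability:.0f}%")                   # 100% -- every request was answered
print(f"within {SLA_RESPONSE_TARGET:.0f}s: {performance:.0f}%")  # only 50% met the target
```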

The fact is that most traditional monitoring tools simply don't work in the cloud. Effectively monitoring and managing modern cloud-based applications and services requires a new approach based on more granular user metrics such as response time and page rendering time. This approach must be based on an understanding of the true user interaction "on the other side" of the cloud. It must enable cloud customers to directly measure the performance of their cloud service providers and validate SLAs. With this type of approach, cloud customers can be better assured that application performance issues will not undercut the benefits of moving to the cloud. In addition, an understanding of true end-user experiences across key geographies can help companies identify the most strategic opportunities for applying premium performance features, as discussed above.
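
As a companion sketch - again in Python, with invented real-user beacon data rather than the output of any particular monitoring product - the snippet below groups measurements by geography to surface the regions where response and render times are worst, a reasonable first pass at deciding where premium performance features would pay for themselves.

```python
from collections import defaultdict
from statistics import median

# Hypothetical real-user measurements: (region, response_time_s, render_time_s).
# In practice these would come from a real-user monitoring beacon, not a literal list.
beacons = [
    ("us-east", 0.8, 1.9), ("us-east", 0.7, 2.1), ("us-east", 0.9, 2.0),
    ("eu-west", 1.6, 3.8), ("eu-west", 1.9, 4.2), ("eu-west", 1.7, 3.9),
    ("ap-south", 2.8, 6.1), ("ap-south", 3.1, 5.7), ("ap-south", 2.9, 6.4),
]

by_region = defaultdict(list)
for region, response_s, render_s in beacons:
    by_region[region].append((response_s, render_s))

# Rank regions by median render time so the slowest user populations stand out.
ranked = sorted(
    by_region.items(),
    key=lambda item: median(render for _, render in item[1]),
    reverse=True,
)

for region, samples in ranked:
    resp = median(r for r, _ in samples)
    rend = median(r for _, r in samples)
    print(f"{region:9s} median response {resp:.1f}s, median render {rend:.1f}s")
```

In this made-up data set, the "ap-south" users would be the obvious first candidates for premium options such as provisioned IOPS or a closer edge presence.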

3. Utilize Industry Resources: There are resources available to help companies assess whether the source of a performance problem lies with them or with a cloud service provider, as well as the likely impact on customers. As an example, Compuware's Outage Analyzer is a free performance analytics tool that tracks web service outages, including cloud service outages, around the world in real time. It provides instant insight into the performance of thousands of cloud services and the resulting impact on the websites they serve. Resources like this may not prevent cloud service outages from happening, but they can help companies better understand the source of performance problems so they can get in front of them more confidently and efficiently.

Conclusion: Cloud Computing Is the "New Normal"
Like it or not, cloud computing is here to stay, and its adoption will only accelerate further in the years to come. In many ways, the move to the cloud is reminiscent of the adoption of Linux. At one time, IT administrators had significant concerns about Linux, including its scalability and reliability. But sure enough, businesses continued their adoption of Linux, propelled largely by the promise of lower costs and greater efficiencies. Today, Linux is a well-integrated component of corporate data centers worldwide.

In reality, neither IT nor the business is wrong in its strong opinions on adopting the cloud for mission-critical applications. Ultimately, both sides share the same goal: to maximize the company's revenue and profit. The two teams simply approach the problem differently - IT emphasizes application performance as a means of driving productivity and conversions, while business leaders look to increase cash flow, seek the greatest return on capital investments and lower operating expenses.

The move to the cloud can be a very good thing for today's enterprises. It's also a good thing to be cloud-wary, and this is where the business will ultimately depend on IT to be vigilant. By paying due attention to performance issues, the transition to the cloud offers IT teams an excellent opportunity to emerge as protectors of their organizations, thus maximizing return on cloud investments.

More Stories By Ronald Miller

Ronald Miller is a Marketing Manager in Compuware's Application Performance Management (APM) Business Unit. For over a decade he has served in a variety of product marketing roles in the software, mobile and high-technology industries. In his current role managing Compuware's go-to-market efforts for Cloud and Big Data, he is dedicated to helping Compuware APM customers get the most ROI and performance out of their Cloud and Big Data deployments.
