
Seven Modeling Tools to Help Assess Cloud ROI

The potential savings and agility gains offered by cloud services are too important for IT to ignore

Network Computing posted a good article on tools for assessing cloud ROI. Check it out below.
Posted by David Greenfield, January 08, 2013

Gone are the days when the cloud meant simply a hosted virtual instance or resource on some provider's network. We're seeing all sorts of new variants and twists emerge. I'm not just referring to the use of storage or compute resources or those that allow for elastic computing, such as Amazon's Reserved Instances.

We're likely to see industry-specific community clouds emerge. These clouds are built to address the security and compliance needs of specific industries. Examples include Verizon's Health Insurance Portability and Accountability Act (HIPAA) cloud service, which targets the healthcare community, and Metal Lynx, a cloud community for buyers and sellers of precious metals.

Other options include private clouds that are managed by a third party off premises, according to Chris Morris, associate VP for Asia/Pacific cloud services and computing at IDC. Morris's prediction is detailed by writer Joe McKendrick in a Forbes post, "7 Predictions for Cloud Computing in 2013 That Make Perfect Sense."

These variants can sound like just the ticket for certain verticals, but do they make financial sense? That can be harder to judge. And before IT can adopt and take responsibility for these cloud services, the same sort of rational, fiscal arguments that have been made for any other service or equipment will have to be made for them.

This requirement marks another stage of cloud development, as IT can't simply assume that "going to the cloud" equates to savings. This is a point made by James Staten, Forrester Research's principal analyst serving infrastructure and operations professionals, in a blog post where he notes that many applications are more expensive to run in the cloud.

This is particularly the case when evaluating the use of Reserved Instances, where determining break-even points can be difficult. There are many factors to consider: usage patterns, costs, preferences around utilization level, commitment term, and more. The good news: seven vendors already offer cloud modeling and costing tools for determining the true cost of cloud services and how best to use on-demand resources.
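To make the break-even question concrete, here is a minimal sketch, in Python, of the kind of arithmetic these tools automate. The rates, upfront fee, and term below are hypothetical placeholders, not actual AWS prices.

# Illustrative only: a minimal break-even sketch for an Amazon Reserved
# Instance versus on-demand pricing. All rates and terms below are
# hypothetical placeholders, not actual AWS prices.

HOURS_PER_YEAR = 8760

def reserved_cost(upfront, ri_hourly, utilization, years):
    # Total cost over the commitment term, assuming the reserved hourly
    # rate is paid only for the hours the instance actually runs.
    return upfront + ri_hourly * HOURS_PER_YEAR * years * utilization

def on_demand_cost(od_hourly, utilization, years):
    # Total cost of running the same workload purely on demand.
    return od_hourly * HOURS_PER_YEAR * years * utilization

def break_even_utilization(upfront, ri_hourly, od_hourly, years):
    # Fraction of hours the instance must run before the reservation pays off.
    return upfront / ((od_hourly - ri_hourly) * HOURS_PER_YEAR * years)

# Hypothetical one-year commitment: $300 upfront, $0.05/hr reserved,
# versus $0.12/hr on demand.
u = break_even_utilization(300, 0.05, 0.12, 1)
print(f"break-even utilization: {u:.0%}")                    # roughly 49%
print(f"reserved:  ${reserved_cost(300, 0.05, u, 1):,.0f}")  # equal at break-even
print(f"on demand: ${on_demand_cost(0.12, u, 1):,.0f}")

Under these made-up numbers the reservation pays for itself only if the instance runs about half the hours in the term; the tools below layer real price lists and observed usage on top of this kind of logic.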

6Fusion allows IT to benchmark existing cloud operations against public services. Its Workload Allocation Cube (WAC) measures the critical compute resources required to operate an x86-based software application. 6Fusion introduced the Cloud Resource Meter for VMware vSphere during the summer. The company also meters resources in Linux and Windows environments.
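As an aside, the idea of a single unit of measurement can be illustrated with a toy example. The sketch below folds several resource measurements into one composite number; the dimensions and weights are hypothetical and are not 6fusion's actual WAC formula.

# Illustrative only: a toy composite "workload unit" that folds several
# resource measurements into a single number, in the spirit of a single
# unit of measurement for x86 workloads. The dimensions and weights are
# hypothetical; this is NOT 6fusion's actual WAC formula.

# One hour of measurements for a hypothetical workload.
sample = {
    "cpu_ghz_hours":    12.0,   # CPU cycles consumed
    "ram_gb_hours":     32.0,   # memory held over the hour
    "storage_gb_hours": 500.0,  # disk capacity occupied
    "disk_io_gb":       4.0,    # disk traffic
    "lan_io_gb":        2.5,    # LAN traffic
    "wan_io_gb":        0.8,    # WAN traffic
}

# Hypothetical normalization weights, chosen for illustration only.
weights = {
    "cpu_ghz_hours":    1.0,
    "ram_gb_hours":     0.5,
    "storage_gb_hours": 0.01,
    "disk_io_gb":       0.25,
    "lan_io_gb":        0.25,
    "wan_io_gb":        0.5,
}

def workload_units(measurements):
    # Collapse the per-resource measurements into one comparable figure.
    return sum(weights[k] * v for k, v in measurements.items())

print(f"{workload_units(sample):.1f} workload units for this hour")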

Apptio expanded its line of IT monitoring and costing tools in December with Cloud Express, a free cloud-costing tool. Cloud Express lets you enter your costing information and receive recommendations for managing your Reserved Instances across any cloud platform.

Cloudyn added a tool in the fall that lets an organization calculate the number, type and duration of Reserved Instances it should purchase to meet its operational requirements. Cloudyn says it has seen the number of companies using Reserved Instances jump from 29% to 48% in nine months. Supported services are Amazon Elastic Compute Cloud (EC2), Amazon Elastic Block Store (EBS) and Amazon Relational Database Service (RDS).

Cloud Cruiser offers chargeback tools for enterprises and, more recently, service providers. Dashboards let individuals monitor their personal cloud usage. Products are available for enterprises (Cloud Cruiser Enterprise Edition) and for service providers (Cloud Service Provider Edition, announced in December). Cloud Cruiser collects information from Amazon, Microsoft Windows Azure and leading hypervisors.

Cloudability lets you track a wide range of key performance indicators (KPIs) and lets you see whether you're saving money with Reserved Instances. Cloudability works across a wide variety of cloud platforms, including AWS, Google Apps and HP Cloud.

Newvem baselines and tracks the assets, costs and risks of using a cloud service. The platform is available for Amazon EC2 and Amazon Simple Storage Service (S3), and in November the company announced 30 new partnerships with members of the Amazon Partner Network. Newvem's Cloud Smart Meter for AWS is a native iPad and iPhone application that lets IT track AWS costs and assets.

RightScale simulates a cloud deployment to identify costs. You model the deployment and then see both the cost of elasticity and a three-year cost projection. You can also run what-if scenarios. Platforms supported include AWS, Google Compute Engine, Microsoft Azure, Rackspace, and SoftLayer.
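Here is a rough, hypothetical what-if scenario of the kind such simulators run: a fleet sized for peak demand and run around the clock on committed pricing, versus an elastic fleet that scales down off-peak at a higher on-demand rate. All rates and demand figures are made up for illustration.

# Illustrative only: a tiny what-if scenario comparing a fleet sized for
# peak demand and run around the clock against an elastic fleet that
# scales down off-peak. All rates and demand figures are hypothetical.

HOURS_PER_DAY = 24
DAYS = 3 * 365  # a three-year horizon

def fixed_fleet_cost(instances, hourly_rate):
    # Run a constant number of instances 24x7 for the full horizon.
    return instances * hourly_rate * HOURS_PER_DAY * DAYS

def elastic_fleet_cost(peak, off_peak, peak_hours_per_day, hourly_rate):
    # Run `peak` instances during busy hours, `off_peak` the rest of the day.
    daily = hourly_rate * (peak * peak_hours_per_day
                           + off_peak * (HOURS_PER_DAY - peak_hours_per_day))
    return daily * DAYS

# What-if: 10 instances sized for peak at a discounted committed rate,
# versus scaling between 10 and 3 instances at a higher on-demand rate.
print(f"fixed fleet:   ${fixed_fleet_cost(10, 0.08):,.0f}")
print(f"elastic fleet: ${elastic_fleet_cost(10, 3, 8, 0.12):,.0f}")

Changing the peak hours, instance counts, or rates in a model like this is exactly the sort of what-if exercise these tools let you run without spreadsheet gymnastics.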

The cloud as a term may not be here forever (see point seven of the Forbes article), but whatever you call them, cloud services will not disappear anytime soon and are increasingly viable. The potential savings and agility gains offered by cloud services are too important for IT to pass up, but only if IT can get buy-in from management. Demonstrating the real fiscal value, not just the promise, of going to the cloud is the best way to secure that purchase order for more cloud services.

David Greenfield is a long-time technology analyst. He currently works in product marketing for Silver Peak.

The post 7 Modeling Tools To Help Assess Cloud ROI appeared first on 6fusion.


More Stories By John Cowan

John Cowan is co-founder and CEO of 6fusion. John is credited as 6fusion's business model visionary, bridging the concepts and services behind cloud computing to the IT service channel. In 2008, he and his 6fusion collaborators launched the industry's first single unit of measurement for x86 computing, known as the Workload Allocation Cube (WAC). John is a 12-year veteran of business and product development in the IT and telecommunications sectors and a graduate of Queen's University at Kingston.
