Reference Architecture for Cloud Computing

IBM releases second version of its cloud reference architecture

Admittedly, when I was heads-down in code earlier in my career, I did not pay much attention to reference architectures. We had our own internal architectures that served as ‘the way and the truth', and reference architectures for our product or solution domain were simply out of scope. Besides, reference architectures are, by design, not detailed enough to guide someone implementing one of the hundreds of components that fall under them. So, for the most part, I ignored them, even though I could hear rumblings coming from rooms full of folks arguing over revision 25 of the reference architecture for some problem domain or another.

Fast forward a few years to a change of professional venue, and my outlook on reference architectures is a good deal different. If I were still developing, I'm sure my outlook would be much the same. However, talking with users on a frequent basis has made me aware that such architectures and solution domain overviews can be of great value to both buyers and providers. For buyers, reference architectures can help to orient them in a particular domain, and they can guide implementation and buying strategies. For providers, reference architectures serve to clearly communicate their outlook on a particular domain to both the buyers and broader market. Put simply, reference architectures serve both sides of the coin.

Now, that's not to say that reference architectures come without detractors. There are always those who stand ready to point out holes and biases in a particular provider's reference architecture. In fact, some seem to write off reference architectures entirely as an instrument of marketing. In my opinion, some of these complaints are without merit and a bit overly cynical. Others rise above typical inter-vendor sniping and point out valid holes, oversights, and biases in a particular provider's architecture. Open discourse and communication are good. In that light, I was glad to see IBM publish the second version of its cloud computing reference architecture to the Open Group earlier this week.

The document, which you can download here, explains the reference architecture in detail, but I want to look at the major highlights. To start, let's consider the high-level diagram for the architecture:

As you can see, the architecture orients itself around user roles for cloud computing. On either end, you have the cloud service creator and the cloud service consumer. As its name implies, the cloud service creator role encompasses the tools used to create cloud services. These include software development environments, virtual image development tools, process choreography solutions, and anything else a developer may use to create services for the cloud.

On the other side of the architecture, the cloud service consumer comes into focus. As you well know, in a cloud environment there are many potential service consumers. The architecture above accounts for in-house IT as well as cloud service integration tools as consumers. There are countless more, but just with these you can begin to appreciate the challenge of effectively enabling the ‘consumer.' This requires self-service portals, service catalogs, automation capability, federated security, federated connectivity, and more. It is certainly no small task.
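To make the consumer-enablement challenge concrete, here is a minimal sketch of a self-service catalog that filters offerings by consumer role. All names, fields, and entries are invented for illustration; they are not taken from IBM's reference architecture document.

```python
from dataclasses import dataclass, field

# Hypothetical catalog entry; the fields are illustrative only.
@dataclass
class CatalogEntry:
    name: str
    service_type: str               # e.g., "IaaS", "PaaS", "SaaS"
    provisioning_automated: bool    # self-service implies automation
    allowed_consumers: list = field(default_factory=list)

catalog = [
    CatalogEntry("small-vm", "IaaS", True, ["in-house-it", "integration-tools"]),
    CatalogEntry("managed-db", "PaaS", True, ["in-house-it"]),
]

def visible_to(consumer: str, entries):
    """Return the catalog entries a given consumer role may order."""
    return [e for e in entries if consumer in e.allowed_consumers]

print([e.name for e in visible_to("integration-tools", catalog)])
```

Even this toy version hints at why the real thing is no small task: a production catalog also has to carry pricing, entitlements, and the federated security context of each consumer.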

Finally, in the middle of the diagram, we have perhaps the most complex role, the cloud service provider. This section builds on top of a shared, usually virtualized infrastructure to address two basic facets for providers: services and service management. From a services perspective, we see the trinity of the cloud (IaaS, PaaS, SaaS), with an added wrinkle, Business Process as a Service. As the diagram acknowledges, existing services and partner services will nearly always augment these services, thereby implying the need for tools that provide both functional and non-functional integration capabilities.
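The services facet above can be sketched as a small taxonomy: the provider's native delivery models plus the existing and partner services that augment them. The partner and in-house service names below are hypothetical, invented purely to show why integration tooling becomes necessary.

```python
from enum import Enum

class DeliveryModel(Enum):
    IAAS = "Infrastructure as a Service"
    PAAS = "Platform as a Service"
    SAAS = "Software as a Service"
    BPAAS = "Business Process as a Service"

# A composed solution mixes provider-native services with existing
# and partner services, which is what drives the need for both
# functional and non-functional integration capabilities.
solution = {
    "native": [DeliveryModel.IAAS, DeliveryModel.PAAS],
    "partner": ["payments-gateway"],   # hypothetical partner service
    "existing": ["on-prem-ledger"],    # hypothetical in-house system
}

needs_integration = bool(solution["partner"] or solution["existing"])
print(needs_integration)
```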

Opposite the services, we see the common management framework that divides into two major categories: Operational Support Services (OSS) and Business Support Services (BSS). Naturally, the OSS accounts for those capabilities that a provider needs to effectively operate a cloud environment. This includes provisioning, monitoring, license management, service lifecycle management, and a slew of other considerations. BSS outlines the capabilities providers need to support the business requirements of cloud, and this includes pricing, metering, billing, order management, order fulfillment, and more.
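As a rough illustration of the BSS metering-to-billing flow described above, here is a hedged sketch that prices usage records into a simple invoice. The metrics, rates, and quantities are invented for the example, not drawn from the architecture document.

```python
# Hypothetical per-unit rates; a real BSS would pull these from
# the pricing and order-management capabilities.
RATES = {"vm-hours": 0.12, "gb-storage": 0.05}

# Hypothetical metering output for one billing period.
usage = [
    {"metric": "vm-hours", "quantity": 720},
    {"metric": "gb-storage", "quantity": 200},
]

def bill(usage_records, rates):
    """Price metered usage into invoice line items and a total."""
    lines = [(u["metric"], u["quantity"] * rates[u["metric"]])
             for u in usage_records]
    return lines, round(sum(amount for _, amount in lines), 2)

lines, total = bill(usage, RATES)
print(total)
```

The point of the sketch is the separation of concerns: metering (the usage records) is an OSS-adjacent activity, while pricing and billing sit squarely in the BSS.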

Of course, there are non-functional requirements that span all three roles, including security, performance, resiliency, consumability, and governance. These wrap the three major roles in the reference architecture shown above.

I know there will be some who disagree with certain elements of this reference architecture, but that is good and healthy. For those who have strong opinions on this subject (one way or another), I encourage you to get involved. That is the benefit of this being in the Open Group. You can download the reference architecture, review it at your leisure, and then discuss and influence change via the mailing list. In other words, speak up!

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
