
The Intelligence Inside: Cloud Developers Change the World of Analytics

Evidence is mounting that embedding analytics inside the applications business people use every day can lead to quantifiable benefits

Slide Deck from Karl Van den Bergh's Cloud Expo Presentation: The Intelligence Inside: How Developers of Cloud Apps Will Change the World of Analytics

We live in a world that requires us to compete on our differential use of time and information, yet only a fraction of information workers today have access to the analytical capabilities they need to make better decisions. Now, with the advent of a new generation of embedded business intelligence (BI) platforms, cloud developers are disrupting the world of analytics. They are using these new BI platforms to inject more intelligence into the applications business people use every day. As a result, data-driven decision-making is finally on track to become the rule, not the exception.

The Increased Focus on Analytics
With the emphasis on data-driven decision-making, it is perhaps no surprise that the focus on analytics continues to mount. According to IDC's Dan Vesset, 2013 was poised to be the first year that the market for data-driven decision making enabled by business analytics broke through the $100 billion mark. IT executives are also doubling down on analytics, a fact highlighted by Gartner's annual CIO survey, which has ranked analytics the number one technology priority in three of the last five years. So, given the importance of and spend on analytics, everyone should have access to the insight they need, right?

Most Business People Still Don't Use Analytics
Amazingly, in spite of this growth in spending and focus, most information workers today do not have access to business intelligence. In fact, Cindi Howson of BI Scorecard has found that end-user adoption of BI seems to have stagnated at about 25%. This stagnation is difficult to reconcile. How is it possible that, at best, one quarter of information workers have access to what is arguably the capability most critical to their success in a world that runs on data?

There are a variety of reasons for stagnant end-user adoption, including the high costs associated with BI projects and an overall lack of usability. However, the biggest impediment to BI adoption has nothing to do with the technology. The reality is that the vast majority of business decision makers do not spend their day working in a BI tool - nor do they want to. Users already have their preferred tool or application: sales representatives use a CRM service; marketers use a campaign management or marketing automation platform; back-office workers spend much of their day in an ERP application; executives typically work with their preferred productivity suite; and the list goes on. Unless you are a data analyst, you are not going to want to spend much of your day in a BI tool. But just because business people prefer not to use a BI tool does not mean they don't want access to pertinent data to support better decision-making.

The Need for More Intelligence Inside Applications
What's the solution? Simply put, bring the data TO users inside their preferred applications instead of expecting them to go to a separate BI system to find the report, dashboard or visualization that's relevant to the question at hand. If we want to reach the other 75% of business people who don't have access to a standalone BI product, we have to inject intelligence inside the applications and services they use every day. It is only through more intelligent applications that organizations can benefit from broader data-driven decision-making. In fact, according to Gartner, BI will only become pervasive when it essentially becomes "invisible" to business people as part of the applications they use daily. In a 2013 report highlighting key emerging tech trends, Gartner concludes that in order "to make analytics more actionable and pervasively deployed, BI and analytics professionals must make analytics more invisible and transparent to their users." How? The report explains this will happen "through embedded analytic applications at the point of decision or action."

If the solution to pervasive BI is to deliver greater intelligence inside applications, why don't more applications embed analytics? The reality is that only a small fraction of applications built today have embedded intelligence. Sure, they might include a table or a chart, but there is no intelligent engine behind them; users typically can't personalize a report or dashboard, or self-serve to generate new visualizations on an ad-hoc basis. The culprit is that business intelligence was originally conceived as a standalone activity, not something designed to be embedded. Specifically, the reasons driving developers to ignore BI platforms boil down to cost and complexity.

Cost and Complexity Are Barriers to Embedded BI
Traditionally, BI tools have carried a user-based licensing model, with licenses typically costing anywhere from tens of thousands to millions of dollars. Such high per-user costs might be justified for a relatively small, predictably sized population that includes a large percentage of power users who will spend a good amount of time working with the BI tool. This user-based model, however, is totally unsuitable for the embedded use case, which is geared toward business users who will access the BI features less frequently and likely have less analytics experience than the traditional power user - in this scenario, high per-user costs simply can't be justified.

Complexity is the other barrier. BI products are complex on a number of levels. First, they are complex to deploy, often requiring months if not years to roll out to any reasonable number of users. Second, they are complex to use, both for the developers building the reports and dashboards and for the business people interacting with the tool. Third, they are complex to embed: designed as standalone products, BI tools are not architected to plug into another application.

Given the cost and complexity of traditional standalone BI offerings, it is no surprise that developers often turn to charting libraries to deliver the visualizations within their application. The cost is low and they are relatively simple for a developer to embed. In the short term, a charting library is a reasonable solution, but over time it falls flat. The demands for more charts, dashboards and reports quickly grow, and end users begin looking for the ability to self-serve and create their own visualizations. As a result of these mounting demands, many application developers find themselves essentially building a BI tool, taking them outside their core competency and stealing precious time away from advancing their own application.
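To make the contrast concrete, below is a minimal sketch of the charting-library approach, using Chart.js as a representative open-source library; the element ID and data values are purely illustrative. It renders one static bar chart - quick to embed, but with no analytics engine, personalization or self-service behind it.

    // Minimal charting-library embed: one static chart, no BI engine behind it.
    // Assumes Chart.js is loaded and the page contains <canvas id="revenue-chart"></canvas>.
    const ctx = document.getElementById('revenue-chart');
    new Chart(ctx, {
      type: 'bar',
      data: {
        labels: ['Q1', 'Q2', 'Q3', 'Q4'],     // illustrative labels
        datasets: [{
          label: 'Revenue',
          data: [120, 150, 170, 210]          // illustrative, hard-coded values
        }]
      }
    });
    // Every additional chart, filter or drill-down request means more hand-written code
    // like this, which is how application teams end up building a BI tool by accident.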

Could a New Generation of Embedded BI Provide the Solution?
Fortunately, a new generation of embedded analytics platforms is emerging that looks set to address these challenges of cost and complexity. Wayne Eckerson, a noted BI analyst, identifies this as the third generation of embedded analytics in his article on the Evolution of Embedded BI. In summary, Eckerson describes the third generation as "moving beyond the Web to the Cloud," where developers can "rent these Cloud-based BI tools by the hour." These BI platforms can "support a full range of BI functionality including data exploration and authoring" and can be embedded through standard interfaces like REST and JavaScript. So, how does this third generation address the issues of cost and complexity?

Utility Pricing Dramatically Reduces Cost
To address the challenge of cost, a new generation of embedded analytics platforms employs a utility-based licensing model where the software is available on a per-core, per-hour or per-gigabyte basis. From a developer's perspective, this is a much fairer model, as one only pays for what is used. At the beginning of the application lifecycle when usage is sporadic, developers can limit their costs. As the application becomes successful and use grows, usage can be easily scaled up. A recent report by Nucleus Research concluded that utility pricing for analytics can save organizations up to 70% of what they would pay for a traditional BI solution. I've written previously about how utility pricing will dramatically increase the availability of analytics, reaching a much broader set of organizations. The rapid adoption of Amazon's Redshift data warehousing service and Jaspersoft's reporting and analytics service on the AWS Marketplace provides rich testimony to the benefits of this model.
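As a back-of-the-envelope illustration of why the utility model scales with actual usage (every figure below is a hypothetical assumption, not any vendor's real price), compare a seat-based license with per-core, per-hour pricing:

    // Hypothetical comparison of per-user licensing vs. utility (per-core, per-hour) pricing.
    // All numbers are illustrative assumptions, not actual vendor prices.
    const users = 500;
    const perUserLicense = 1000;                   // assumed annual cost per named user
    const perUserTotal = users * perUserLicense;   // 500,000 per year, regardless of usage

    const cores = 4;
    const ratePerCoreHour = 0.50;                  // assumed per-core, per-hour rate
    const hoursPerYear = 24 * 365;
    const utilityTotal = cores * ratePerCoreHour * hoursPerYear;   // 17,520 per year at full-time use

    console.log(`Per-user: ${perUserTotal}, utility: ${utilityTotal}`);
    // Early in the application lifecycle, when usage is sporadic, the metered bill is a fraction
    // of a seat-based license; as adoption grows, cost scales with demand rather than head count.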

Cloud and Web-Standard APIs Reduce Complexity
A cloud-based BI platform significantly simplifies deployment, as there is no BI server to install or configure. The Nucleus Research report found that utility-priced, cloud-based BI solutions could be deployed in weeks or even days, as opposed to the months commonly required for a traditional BI product.

By leveraging web-standard APIs like REST and JavaScript, the third-generation platforms also simplify the task of embedding analytics on both the front end and back end of the application. Importantly, these APIs allow full-featured, self-service BI capabilities to be embedded, not just reports and dashboards. This gives the application far greater ability to respond to the ad-hoc information requests of business users.
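To give a sense of what embedding through such web-standard APIs can look like on the front end, here is a short sketch; the endpoint path, query parameters, token handling and container ID are all hypothetical assumptions rather than any specific platform's documented API.

    // Hypothetical example of embedding a rendered report via a REST + JavaScript interface.
    // The endpoint, parameters and authentication scheme are assumptions for illustration only.
    const sessionToken = 'replace-with-a-real-token';   // normally supplied by the host application

    async function embedSalesReport(containerId, region) {
      const response = await fetch(
        `/bi/api/reports/sales-summary?region=${encodeURIComponent(region)}&format=html`,
        { headers: { Authorization: 'Bearer ' + sessionToken } }
      );
      if (!response.ok) {
        throw new Error(`Report request failed: ${response.status}`);
      }
      // Drop the rendered report fragment into the host page, so the user never leaves
      // the CRM, ERP or marketing application they already work in.
      document.getElementById(containerId).innerHTML = await response.text();
    }

    embedSalesReport('sales-report-panel', 'EMEA');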

The Benefits of Embedded Intelligence
Intuitively, it would seem that, by providing analytics within the applications business people use every day, an organization should experience the benefits of more data-driven decision-making. But is there any proof?

A recent report by the Aberdeen Group, based on data from over 130 organizations, has helped shed light on some of the benefits of embedded analytics. First, as might be expected, those companies using embedded analytics saw 76% of users actively engaged in analytics versus only 11% for those with the lowest embedded BI adoption. As a result, 89% of the business people in these best-in-class companies were satisfied with their access to data versus only 21% in the industry laggards. The bottom line? Companies leading embedded BI adoption saw an average 19% increase in operating profit versus only 9% for the other companies.

Andre Gayle, who helps manage a voicemail service at British Telecom, illustrates the difference embedded analytics can make. "We had reports [before] but they had to be emailed to users, who had to wait for them, then dig through them as needed. It was inefficient and wasteful." Now, thanks to embedded analytics, British Telecom has seen a huge savings in time and cost. As Gayle explains, capacity planning for the voicemail service used to be a "laborious exercise, involving several days of effort to dig up the numbers," but now can be done "on demand, in a fact-based manner, in just a few minutes."

The evidence is mounting that embedding analytics inside the applications business people use every day can lead to quantifiable benefits. However, the protagonist here, unlike in the traditional world of analytics, must be the developer, not the analyst. A new generation of embedded BI platforms is making it easier and more cost-effective for developers to deliver the analytical capabilities needed inside the Cloud applications they are building. As developers increasingly avail themselves of these new platforms, we can hope that BI will finally become pervasive as an information service that informs day-to-day operations. As Wayne Eckerson puts it, "In many ways, embedded BI represents the fulfillment of BI's promise." Now it's up to Cloud developers to help us realize that promise.

More Stories By Karl Van den Bergh

Karl Van den Bergh is the Vice President of Product Strategy at Jaspersoft, where he is responsible for product strategy, product management and product marketing. Karl is a seasoned high-tech executive with 18 years of experience in software, hardware, open source and SaaS businesses, both startup and established.

Prior to Jaspersoft, Karl was the Vice President of Marketing and Alliances at Kickfire, a venture-funded data warehouse appliance startup. He also spent seven years at Business Objects (now part of SAP), where he held progressively senior leadership positions in product marketing, product management, corporate development and strategy – ultimately becoming the General Manager of the Information-On-Demand business. Earlier in his career, he was responsible for EMEA marketing at ASG, one of the world’s largest privately-held software companies. Karl started his career as a software engineer.
