Standards and APIs: Searching for Standards on Identity & Authentication

How to best manage identity and security in the mobile era

The advent of the application programming interface (API) economy has created a huge, pressing need for organizations to both embrace openness and improve security for accessing mobile applications, data, and services anytime, anywhere, and from any device.

Awash in inadequate passwords and battling subsequent security breaches, business and end-users alike are calling for improved identity management and federation technologies. They want workable standards to better chart the waters of identity management and federation, while preserving the need for enterprise-caliber risk remediation and security.

Meanwhile, the mobile tier is becoming an integration point for scads of cloud services and APIs, yet unauthorized access to data remains common. Mobile applications are not yet fully secure, and identity control that meets audit requirements is hard to come by. And so developers are scrambling to find the platforms and tools to help them manage identity and security, too.

Clearly, the game has changed for creating new and attractive mobile processes, yet the same old requirements remain unmet around security, management, interoperability, and openness.

BriefingsDirect assembled a panel of experts to explore how to address these pressing needs: Bradford Stephens, Developer and Platforms Evangelist in the CTO's Office at Ping Identity; Ross Garrett, Senior Director of Product Marketing at Axway; and Kelly Grizzle, Principal Software Engineer at SailPoint Technologies. The sponsored panel discussion is moderated by me, Dana Gardner, Principal Analyst at Interarbor Solutions.

Here are some excerpts:

Gardner: We are approaching the Cloud Identity Summit 2014 (CIS), which is coming up on July 19 in Monterey, Calif. There's a lot of frustration around finding identity services that meet the needs of developers and enterprise operators alike. So let's talk a little bit about what's going on with APIs and identity.

What are the trends in the market that keep this problem pressing? Why is it so difficult to solve?

Interaction changes

Stephens: Well, as soon as we've settled on a standard, the way we interact with computers changes. It wasn't that long ago that if you had Active Directory and SAML, and you hand-wrote security endpoints for your security products, you were pretty much covered.


But in the last three or four years, we've gone to a world where mobile is more important than web. Distributed systems are more important than big iron. And we communicate with APIs instead of channels and SDKs, and that requires a whole new way of thinking about the problem.

Garrett: Ultimately, APIs are becoming the communication framework, the fabric, through which all of the products that we touch today talk to each other. That, by extension, creates a new identity challenge, and it's a big reason why we've seen some friction and schizophrenia around the types of identity technologies that are available to us.

So we see waves of different technologies come and go, depending on what is the flavor of the month. That has caused some frustration for developers, and will definitely come up during our Cloud Identity Summit in a couple of weeks.

Grizzle: APIs are becoming exponentially more important in the identity world now. As Bradford alluded to, the landscape is changing. There are mobile devices as well as software-as-a-service (SaaS) providers out there who are popping up new services all the time. The common thread between all of them is the need to be able to manage identities. They need to be able to manage the security within their system. It makes total sense to have a common way to do this.


APIs are key for all the different devices and ways that we connect to these service providers. Becoming standards based is extremely important, just to be able to keep up with the adoption of all these new service providers coming on board.

Gardner: As we describe this as the API economy, I suppose it’s just as much a marketplace and therefore, as we have seen in other markets, people strive for predominance. There's jockeying going on. Bradford, is this a matter of an architectural shift? Is this a matter of standards? Or is this a matter of de-facto standards? Or perhaps all of the above?

Stephens: It's getting complex quickly. I think we're settling on standards, like it or not, and mostly for the better. I see most people settling on at least OAuth 2.0 as the standard for tokens, and OpenID Connect for conveying authentication information, but I think that's about as far as we get.

There's a lot of struggle with established vendors vying to implement these protocols. They try to bridge the gap between the old world of say SAML and Active Directory and all that, and the new world of SCIM, OAuth, OpenID Connect. The standards are pretty settled, at least for the next two years, but the tools, how we implement them, and how much work it takes developers to implement them, are going to change a lot, and hopefully for the better.

Evolving standards

Garrett: We have identified a number of new standards that are bridging this new world of API-oriented connectivity. Learning from the past of SAML and legacy single sign-on infrastructure, we definitely need some good technology choices.


The standards seem to be leading the way. But by the same token, we should keep a close eye on how fast the market is changing relative to the standards. We've all seen things like OAuth progress more slowly than some of the implementations out there. That means the ratification of the standard was happening after many providers had actually implemented it. It's the same for OpenID Connect.

We are in line there, but the actual standardization process doesn’t always keep up with where the market wants to be.

Gardner: We've seen this play out before: standards can lag. Getting consensus, developing the documentation and details, and getting committees to sign off can take time, and markets move at their own velocity. Many times in the past, organizations have hedged their bets by adopting multiple standards or tracking multiple ways of doing things, which requires federation and integration.

Kelly, are there big tradeoffs with standards and APIs? How do we mitigate the risk and protect ourselves by both adhering to standards, but also being agile in the market?

Grizzle: That’s kind of tricky. You're right in that standards tend to lag. That’s just part and parcel of the standardization process. It’s like trying to pass a bill through Congress. It can go slow.

Something that we've seen some of these standards do right, from OAuth and from the SCIM perspective, is that both of those have started their early work with a very loose standardization process, going through not one of the big standards bodies, but something that can be a little bit more nimble. That’s how the SCIM 1.0 and 1.1 specs came out, and they came out in a reasonable time frame to get people moving on it.

Now that things have moved to the Internet Engineering Task Force (IETF), development has slowed down a little bit, but people have something to work with and are able to keep up with the changes going on there.

I don’t know that people necessarily need to adopt multiple standards to hedge their bets, but by taking what’s already there and keeping a pulse on the things that are going to change, as well as the standard being forward-thinking enough to allow some extensibility within it, service providers and clients, in the long run, are going to be in a pretty good spot.

Quick primer

Gardner: We've talked a few technical terms so far, and just for the benefit of our audience, I'd like to do a quick primer, perhaps with you Bradford. To start: OAuth, this is with the IETF now. Could you just quickly tell the audience what OAuth is, what it’s doing, and why it’s important when we talk about API, security and mobile?

Stephens: OAuth is the foundation protocol for authorization when it comes to APIs for web applications. OAuth 2 is much more flexible than OAuth 1.

Basically, it allows applications to ask for access to stuff. It sounds very vague, but it's really powerful once you start getting the right tokens for your workflows. And it provides the foundation for everything else we do with identity and APIs.

The best example I can think of is when you log into Facebook, and Facebook asks whether you really want this app to see your birthday, all your friends' information, and everything else. Being able to communicate all that over OAuth 2.0 is a lot easier than it was with OAuth 1.0 a few years ago.
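
To make that delegation step a bit more concrete, here is a minimal Python sketch of the OAuth 2.0 authorization-code exchange that sits behind that kind of consent screen. The token endpoint, client credentials, and redirect URI are placeholder values for illustration, not any particular provider's.

```python
# Hypothetical OAuth 2.0 authorization-code exchange (RFC 6749 style).
# All URLs and credentials below are placeholders, not a real provider's values.
import requests

TOKEN_ENDPOINT = "https://auth.example.com/oauth2/token"  # assumed endpoint

def exchange_code_for_token(auth_code: str) -> dict:
    """Trade the authorization code the user approved for an access token."""
    response = requests.post(
        TOKEN_ENDPOINT,
        data={
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": "https://app.example.com/callback",
            "client_id": "my-client-id",
            "client_secret": "my-client-secret",
        },
        timeout=10,
    )
    response.raise_for_status()
    # Typically contains access_token, token_type, expires_in, and
    # (for OpenID Connect requests) an id_token as well.
    return response.json()

if __name__ == "__main__":
    tokens = exchange_code_for_token("code-from-the-redirect")
    print(tokens.get("access_token"))
```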

Gardner: How about OpenID Connect. This is with the OpenID Foundation. How does that relate, and what is it?

Stephens: If OAuth actually is the medium, OpenID Connect can be described as the content of the message. It's not the message itself. That's usually done with a JavaScript Object Notation (JSON) Web Token, but OpenID Connect provides the actual identity information.

When you access an API and you authenticate, you choose a scope, and one of the most common scopes is the OpenID profile scope. That profile will just have things like your username, maybe your address, and various other pieces of identity information, and it describes who "you" are.
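
As a rough illustration of what that profile information looks like in practice, the sketch below decodes the payload of a JWT-formatted ID token so you can see its claims. This is for inspection only and the sample claim values are hypothetical; a real application must verify the token's signature with a proper JWT library before trusting anything inside it.

```python
# Peek at the claims inside an OpenID Connect ID token (unverified!).
import base64
import json

def peek_id_token_claims(id_token: str) -> dict:
    """Base64-decode the middle (payload) segment of a JWT and return its claims."""
    payload_b64 = id_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

# A token requested with the "openid profile" scopes typically carries claims like:
# {"iss": "https://idp.example.com", "sub": "user-1234",
#  "aud": "my-client-id", "name": "Bob Example", "exp": 1400000000}
```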

Gardner: And SCIM, you mentioned that, Kelly, and I know you have been involved with it. So why don't you take the primer for SCIM, which I believe is Simple Cloud Identity Management?

Grizzle: That's the historical name for it, Simple Cloud Identity Management. When we took the standard to the IETF, we realized that the problems that we were solving were a little bit broader than just the cloud and within the cloud. So the acronym now stands for the System for Cross-domain Identity Management.

That's kind of a mouthful, but the concept is pretty simple. SCIM is really just an API and a schema that allow you to manage identities and identity-related information. And by manage them, I mean to create identities in systems, to delete them, update them, change entitlements and group memberships, and things like that.
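
As a hedged sketch of what such a management call can look like, the snippet below creates a user through a SCIM 2.0 /Users endpoint using the core user schema. The base URL and bearer token are placeholders, and the attribute set is kept deliberately minimal.

```python
# Hypothetical SCIM 2.0 user-provisioning call. The service URL and token are
# placeholders; the schema URN and attributes follow the SCIM core user schema.
import requests

SCIM_BASE = "https://idm.example.com/scim/v2"  # assumed service-provider URL
HEADERS = {
    "Authorization": "Bearer <access-token>",  # placeholder credential
    "Content-Type": "application/scim+json",
}

def create_user(user_name: str, given: str, family: str) -> dict:
    """Create a user resource and return the provider's representation of it."""
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given, "familyName": family},
        "active": True,
    }
    resp = requests.post(f"{SCIM_BASE}/Users", json=payload,
                         headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()  # includes the provider-assigned "id" used for later updates
```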

Gardner: From your perspective, Kelly, what is the relationship then between OAuth and SCIM?

Managing identities

Grizzle: OAuth, as Bradford mentioned, is primarily geared toward authorization, and answers the question, "Can Bob access this top-secret document?" SCIM is really not in the authorization and authentication business at all. SCIM is about managing identities.

OAuth assumes that an identity is already present. SCIM is able to create that identity. You can create the user "Bob." You can say that Bob should not have access to that top-secret document. Then, if you catch Bob doing some illicit activity, you can quickly disable his account through a SCIM call. So they fit together very nicely.
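
Under SCIM 2.0, that "quickly disable his account through a SCIM call" step could look roughly like the PATCH below, which flips the standard active attribute to false. Again, the service URL, token, and user ID are placeholders for illustration.

```python
# Sketch: disable a user by PATCHing the SCIM "active" attribute to false.
import requests

def disable_user(scim_base: str, token: str, user_id: str) -> None:
    """Send a SCIM 2.0 PatchOp that deactivates the given user."""
    patch = {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "replace", "path": "active", "value": False}],
    }
    resp = requests.patch(
        f"{scim_base}/Users/{user_id}",
        json=patch,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/scim+json"},
        timeout=10,
    )
    resp.raise_for_status()

# Example (placeholder values):
# disable_user("https://idm.example.com/scim/v2", "<access-token>", "user-1234")
```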

Gardner: In the real world, developers like to be able to use APIs, but they might not be familiar with all the details that we've just gone through on some of these standards and security approaches.

How do we make this palatable to developers? How do we make this something that they can implement without necessarily getting into the nitty-gritty? Are there some approaches to making this a bit easier to consume as a developer?

Stephens: As a developer who's relatively new to this field -- I worked in databases for three years -- I've had personal experience of how hard it is to wrap your head around all the standards and all these flows. The best thing we can do is have tool providers give developers tools in their native language, or in the way developers already work with things.

This needs well-documented, interactive APIs -- things like Swagger -- and lots of real-world code examples. Once you've actually done the process of authentication through OAuth, getting a JSON Web Token, and getting an OpenID Connect profile, it's really simple to see how it all works together, if you do it all through a SaaS platform that handles all the nitty-gritty, like user creation and all that.
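
As a rough sketch of that end state: once the OAuth flow has produced an access token, a single call to the provider's OpenID Connect UserInfo endpoint returns the profile claims. The endpoint URL below is an assumed placeholder; real providers publish theirs in their discovery metadata.

```python
# Fetch the OpenID Connect profile for the user behind an OAuth access token.
import requests

USERINFO_ENDPOINT = "https://idp.example.com/userinfo"  # assumed placeholder

def fetch_profile(access_token: str) -> dict:
    """Present the bearer token to the UserInfo endpoint and return the claims."""
    resp = requests.get(
        USERINFO_ENDPOINT,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # e.g., sub, name, email -- depending on the granted scopes
```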

If you have to roll your own, though, there's not a lot of information out there besides white papers and blog posts. It's just a nightmare. I tried to roll my own. You should never roll your own.

So having SaaS platforms do all this stuff, instead of just having documents, means that developers can focus on delivering their applications, and not on which tokens carry which information across OAuth and OpenID Connect.

I don’t really care how it all works together; I just know that I have this token and it has the information I need. And it’s really liberating, once you finally get there.

So I guess the best thing we can do is provide really great tools that solve the identity-management problems.

Tools: a key point

Garrett: Tools, that’s the key point here. Whether we like it or not, developers tend to be kind of lazy sometimes and they certainly don’t have the time or the energy to understand every facet of the OAuth specification. So providing tools that can wrap that up and make it as easy to implement as possible is really the only way that we get to really secure mobile applications or any API interaction. Because without a deep understanding of how this stuff works, you can make pretty fundamental errors.

Having said that, at least we've started to take steps in the right direction with the standards. OAuth is built at least with the idea of mobile access in mind. It’s leveraging REST and JSON types, rather than SOAP and XML types, which are really way too heavyweight for mobile applications.

So the standards, in their own right, have taken us in the right direction, but we absolutely need tools to make it easy for developers.

Grizzle: Tools are of the utmost importance, and some of the identity providers and people with skin in the game, so to speak, are helping to create these tools and to open-source them, so that they can be used by other people.

Another thing that Ross touched on was keeping the simplicity in the spec. The things we're addressing -- authorization, authentication, and managing identities -- are not always simple concepts. So in the standards that are being created, finding the right balance between complexity on one hand and completeness and flexibility on the other is a tough line to walk.

With SCIM, as you said, the first letter of the acronym used to stand for Simple. That's still a guiding principle: we try to keep these interactions as simple as possible. SCIM uses REST and JSON, just like some of these other standards, and developers are familiar with that. Putting the implementation burden on the right parties is very important, too. Making it easy on clients, the ones who are going to be implementing these the most, is pretty important.

Gardner: Do these standards do more than help the API economy settle out and mature? Cloud providers or SaaS providers want to provide APIs and they want the mobile apps to consume them. By the same token, the enterprises want to share data and want data to get out to those mobile tiers. So is there a data-management or brokering benefit that goes along with this? Are we killing multiple birds with one set of standards?

Garrett: The real issue here, when we think about the new types of products and services that the API economy is helping us deliver, is around privacy and, ultimately, customer confidence. Putting users in control of who gets to access which parts of their identity profile, or of how contextual information about them can make identity decisions easier, allows us to lock down, or at least better understand, the privacy concerns that the world has.

Identity isn’t the most glamorous thing to talk about -- except when it all goes wrong -- and some huge leak makes the news headlines, or some other security breach has lost credit-card numbers or people’s usernames and passwords.

Hand in hand

Garrett: In terms of how identity services and the API economy are developing, the two things go hand in hand. Unless people are absolutely certain about how their information is being used, they will simply choose not to use these services. That's why all the work that the API management vendors and the identity management vendors are doing to bring these together is so important.

Gardner: You mentioned that identity might not be sexy or top of mind, but how else can you manage all these variables on an automated or policy-driven basis? When we move to the mobile tier, we're dealing with multiple networks. We're dealing with multiple services ... cloud, SaaS, and APIs. And then we're linking this back to enterprise applications. How other than identity can this possibly be managed?

Stephens: Identity is often thought of as usernames and passwords, but it's quickly evolving to be so much more. This is something I harp on a lot, but who we are online is quickly becoming more important than who we are in real life. How I identify myself online is more important than the driver's license I carry in my wallet.

As you know, your driver’s license is like a real-life token of information that describes what you're allowed to do in your life. That’s part of your identity. Anybody who has lost their license knows that, without that, there's not a whole lot you can do.

Bringing that analogy back to the Internet, what you're able to access, and what access you're able to give other people or other applications -- to change important things like your Facebook posts or your tweets, or to go through your email and help categorize it -- matters. All these little tasks that help define how you live are part of your identity. And it's important that developers understand that, because any connected application is going to have to have a deep sense of identity.

Gardner: Let me pose the same question, but in a different way. When you do this well, when you can manage identity, when you can take advantage of these new standards that extend into mobile requirements and architectures, with the API economy in mind, what do you get? What does it endow you with? What can you do that perhaps you couldn’t do if you were stuck in some older architectures or thinking?

Grizzle: Identity is key to everything we do. Like Bradford was just saying, the things that you do online are built on the trust that you have with who is doing them. There are very few services out there where you want completely anonymous access. Almost every service that you use is tied to an identity.

So it’s of paramount importance to get a common language between these. If we don’t move to standards here, it's just going to be a major cost problem, because there are a ton of different providers and clients out there.

If every provider tries to roll their own identity infrastructure, without relying on standards, then, as a client, if I need to talk to two different identity providers, I need to write to two different APIs. The problem just explodes, given how connected everything is these days.

So it’s key. I can’t see how the system will stand up and move forward efficiently without these common pieces in place.

Use cases

Gardner: Do we have any examples along these same lines of what you get when you do this well and appropriately, based on what you all think is the right approach and direction? We've been talking at a fairly abstract level, but it really helps solidify people's thinking and understanding when they can look at a use case, a named situation, or an application.

Stephens: If you want a good example of how OAuth delegation works, building a Facebook app or just working on Facebook app documentation is pretty straightforward. It gives you a good idea of what it means to delegate certain authorization.

Likewise, Google is very good. It’s very integrated with OAuth and OpenID Connect when it comes to building things on Google App Engine.

So if you want to secure an API that you built on Google App Engine, which is trivial to do, Google Cloud Endpoints provides a really good example. In fact, there is a button you can hit in their example called Use OAuth, and that OAuth flow carries the OpenID Connect profile, so that's a pretty easy way to go about it.

Garrett: I'll just take a simple consumer example, and we've touched on this already. In the past, every individual service or product offered only its own identity solution, so I had to create a new identity profile for every product or service that I used. This has been the case for a long time in the consumer web, and in the enterprise setting as well.

So we have to be able to solve that problem and offer a way to reuse existing identities. Taking technologies like OpenID Connect, which are totally hidden from the end user, and simply saying that you can use an existing identity -- your LinkedIn or Facebook credentials, for example -- to access some new product takes a lot of the burden away from the consumer. Ultimately, that gives us a better security model end to end.

What these new identity service providers have been offering has, behind the scenes, been making our lives more secure. Even though some people might shy away from using their Facebook identity across multiple applications, in many ways it's actually better to do so, because that's one centralized place where I can see, audit, and adjust the way I'm presenting my identity to other people.

That’s a really great example of how these new technologies are changing the way we interact with products everyday.

Standardized approach

Grizzle: At SailPoint, the company that I work for, we have a client, a large chip maker, who has seen the identity problem and really been bitten by it within their enterprise. They have somewhere around 3,500 systems that have to be able to talk to each other, exchange identity data, and things like that.

The issue is that every time they acquire a new company or bring a new group into the fold, that company has its own set of systems that speak their own language, and it takes forever to get them integrated into their IT organization there.

So they've said that they're not going to support every app that these groups bring into the IT infrastructure. They're going with SCIM, and they're saying that, for all the apps that come in, if they speak SCIM, they'll take ownership of those and pull them into their environment. Those apps should just plug in nice and easy. They're doing it purely from a resourcing perspective: they can't keep up with the amount of change to their IT infrastructure and keep everything automated.

Gardner: I want to quickly look at the Cloud Identity Summit that’s coming up. It sounds like a lot of these issues are going to be top of mind there. We're going to hear a lot of back and forth and progress made.

Does this strike you, Bradford, as a tipping point of some sort, that this event will really start to solidify thinking and get people motivated? How do you view the impact of this summit on cloud identity?

Stephens: At CIS, we're going to see a lot of talk about real-world implementation of these standards. In fact, I'm running the Enterprise API track and I'll be giving a talk on end-to-end authentication using JAuth, OAuth, and OpenID Connect. This year is the year that we show that it's possible. Next year, we'll be hearing a lot more about people using it in production.
