
Big Data, IoT, API – Newer Technologies Protected by Older Security

With the evolution of these technologies, there is a very raw, basic, and yet incontrovertible need being expressed

Nowadays every CIO, CTO, or business executive I speak to is captivated by three technologies: Big Data, API management, and the IoT (Internet of Things). Every one of them confirms that they either have current projects actively using these technologies, or they are in the planning stages and about to embark on the mission soon.

Though the underlying need and purpose served are unique to each of these technologies, they all have one thing in common: they all necessitate newer security models and security tools to serve any organization well. I will explain that in a bit, but first let us look at the value each of these technologies adds to an organization.

IoT
The IoT consists of specific data collection points, sensors placed anywhere and everywhere. Much of the time the information collected by these devices is sensitive and contains specifically identifiable data. The IoT allows organizations to analyze behaviors and patterns as needed, but it also poses an interesting problem. Gone are the days of terabytes (TB) of data; now we are talking about petabytes (PB) that continue to grow exponentially. IoT devices communicate over M2M (machine-to-machine) channels, a newer medium that creates a newer set of threat vectors.
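For instance, device identity on that M2M channel is something older models never had to account for. The sketch below is purely illustrative (the field names, shared secret, and message format are assumptions, not anything from a specific product): each reading is signed with a per-device HMAC so the collector can verify which sensor a message actually came from.

```python
import hashlib
import hmac
import json
import time

# Illustration only: a per-device shared secret. In practice each device gets its own
# key, provisioned from a device registry or HSM rather than hard-coded.
DEVICE_SECRET = b"per-device-shared-secret"

def build_reading(device_id: str, value: float) -> dict:
    """Package a sensor reading and sign it so the collector can verify its origin."""
    payload = {"device_id": device_id, "value": value, "ts": int(time.time())}
    body = json.dumps(payload, sort_keys=True).encode()
    payload["sig"] = hmac.new(DEVICE_SECRET, body, hashlib.sha256).hexdigest()
    return payload

def verify_reading(message: dict) -> bool:
    """Recompute the HMAC over the unsigned fields and compare in constant time."""
    unsigned = {k: v for k, v in message.items() if k != "sig"}
    body = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(DEVICE_SECRET, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(message.get("sig", ""), expected)
```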

Big Data
Big Data platforms store massive amounts of data (some of it coming from the aforementioned IoT devices) and provide the software and infrastructure to access that data faster, at a fraction of what it costs today, further enabling you to capture as many data points as possible.

API
APIs are the enabler and interconnector between systems, providing a uniform and portable interface (whether to the Big Data itself or to the platform that enables Big Data).
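As a rough sketch of what that uniform interface looks like in practice, consider a client querying sensor data through a REST API that fronts the Big Data store. The host, path, parameters, and token below are hypothetical placeholders, not a real service:

```python
import requests  # third-party HTTP client (pip install requests)

# Hypothetical API layer fronting the Big Data store; all names here are placeholders.
API_BASE = "https://api.example.com/v1"
TOKEN = "REPLACE_WITH_OAUTH_TOKEN"

response = requests.get(
    f"{API_BASE}/sensor-readings",
    params={"device_id": "thermostat-42", "since": "2014-01-01T00:00:00Z"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=10,
)
response.raise_for_status()
readings = response.json()  # the caller sees the same JSON interface regardless of the backing store
```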

While each of these technologies at first glance appears to serve a different constituency within an enterprise, there is an undeniable interconnectedness among them. The IoT collects data from everywhere, pouring in tons of data that need to be not only stored somewhere, but also analyzed properly so the dots can be connected and ultimately form meaningful patterns that people can make use of.


Assume all communications to the central neural system are via APIs.

With the evolution of these technologies, there is a very raw, basic, and yet incontrovertible need being expressed. Every business yearns to be better than its competitors in catering to the needs of its consumers. I mean the "consumer" in a loose sense here - an individual or for that matter, an organization that is consuming your offerings. Ipso facto, this means you need to capture as much information as you possibly can about the target consumer behavior so that it can be analyzed, protected, stored, shared selectively and, most important, so that it can serve your consumer better (or perhaps to be used when strategically monetizing an area of your business).

None of these technologies is in a trial phase any more. If anything, the social media explosion provided ample evidence that these technologies are already being used quite effectively (real-life POCs). Of late, all of them have been gaining adoption in the sacred technology worlds, such as the healthcare and financial sectors. However, when you employ these technologies with your production applications, you need enterprise-grade security that is built from the ground up to provide the necessary level of protection.

In the social world, the model had always been, "build [it] first and secure later based on the need" (or never in some cases). With healthcare, federal and financial sectors, that model is no longer tenable. You need to secure data at any cost, question anybody who wants access, and be hyper-vigilant without compromise.

What is particularly troublesome is that these organizations seem to believe they can extend existing security measures to protect all of these newer technologies. While your SSL, identity systems, and other existing controls can serve as the baseline for these technologies, you need a newer set of security controls and tools in place. Your security model needs to make the necessary accommodations instead of force-fitting everything so the older set of tools will do, which is like trying to fit a square peg in a round hole. I have seen customers trying to bend RACF to fit the newer SOA, API, and Big Data paradigm. While it can be done, it ends up costing you more, is very inflexible, and defeats the fundamental purpose of security. Don't get me wrong - everything has a place in this universe.

Remember, I wrote recently about the disappearing perimeter defenses and the moving, thin lines of defense. This is due to shared data centers, cloud adoption, multiple shared tenants, deeper integration, wider exposure to multiple partners, and so on. Regardless of the scenario, you need to protect your own data and be accountable for it. Cyber attackers are very sophisticated and are funded by organizations (or even countries), which means they are determined to get to the proverbial data gold mine. Without adequate protection, your data can prove to be that gold mine. The thing that scares me the most is the underlying threat to all of the above technologies when you try to fit them into the older security model. Most of these technologies, from what I have observed, are either under-protected or unprotected. While it is great for organizations to maximize monetization and consumer satisfaction and gain a competitive edge over others, that shouldn't come at the cost of security or increased risk. Especially when it comes to security, Murphy's Law is always right; it is not a question of if a security loophole will be exploited, it is a question of when.

You not only need to identify the users, authenticate them, and authorize them, but also make sure they are allowed access during the time window in which they are requesting the information (and throw in location-based and device-based identification on top).
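To make that concrete, here is a minimal sketch of such a layered check. Everything in it (the business-hours window, the device registry, the role name) is an assumed example policy for illustration, not a prescription; in a real deployment these would come from your IAM and policy stores.

```python
from datetime import datetime, time, timezone

# Assumed example policies, hard-coded here only to keep the sketch self-contained.
ALLOWED_WINDOW = (time(8, 0), time(18, 0))        # business-hours access only
TRUSTED_DEVICES = {"laptop-4711", "tablet-0042"}  # registered device IDs
TRUSTED_REGIONS = {"US", "CA"}                    # allowed request origins

def is_access_allowed(token_valid, roles, device_id, region, now=None):
    """Layer the checks: authentication, authorization, time window, device, location."""
    now = now or datetime.now(timezone.utc)
    if not token_valid:                                             # 1. authenticated?
        return False
    if "bigdata:read" not in roles:                                 # 2. authorized for this resource?
        return False
    if not (ALLOWED_WINDOW[0] <= now.time() <= ALLOWED_WINDOW[1]):  # 3. inside the allowed time window?
        return False
    if device_id not in TRUSTED_DEVICES:                            # 4. device-based identification
        return False
    if region not in TRUSTED_REGIONS:                               # 5. location-based identification
        return False
    return True
```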

In addition, you also need to worry about protecting the Big Data store itself, including strong encryption of data at rest, in transit, and in process.
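As one illustration of the at-rest piece (transport would normally be handled by TLS, and the key by a KMS or HSM rather than generated inline), sensitive fields can be encrypted before a record ever lands in the store. This sketch uses the third-party cryptography package, and the record contents are made up:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Illustration only: in production the key comes from a KMS/HSM, never generated inline.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"patient_id": "12345", "glucose": 104}'  # hypothetical sensitive record
ciphertext = fernet.encrypt(record)                  # encrypt before writing to the Big Data store
# ... persist ciphertext (HDFS, object storage, etc.); transport should still use TLS ...
plaintext = fernet.decrypt(ciphertext)               # decrypt only inside a controlled process
assert plaintext == record
```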

But then, most important of all, you need to mitigate the threat vectors that are created by these new technologies. In the next few articles I will write about how you can protect all of these areas with minimal effort while keeping your TCO very low. I will also talk about specific use cases and usage models that make sense.

Blake recently wrote a great blog on "touchless" Big Data security. I urge you to check it out here. Demo version is here.

More Stories By Andy Thurai

Andy Thurai is Program Director for API, IoT and Connected Cloud with IBM, where he is responsible for solutionizing, strategizing, evangelizing, and providing thought leadership for those technologies. Prior to this role, he held technology, architecture leadership, and executive positions with Intel, Nortel, BMC, CSC, and L-1 Identity Solutions.

You can find more of his thoughts at www.thurai.net/blog or follow him on Twitter @AndyThurai.
