
Continuous Monitoring – Part 2

By Bob Gourley

I previously wrote about the various “functional areas” of continuous monitoring. According to the federal model, there are 15 functional areas comprising a comprehensive continuous monitoring solution, as shown in the graphic below:

[Figure: Federal CM Model]

These functional areas are grouped into the following categories:

  • Manage Assets
  • Manage Accounts
  • Manage Events
  • Security Lifecycle Management

Each category addresses a general area of vulnerability in an enterprise. The rest of this post describes each category, the complexities involved in integration, and the difficulties in making sense of the information. “Making sense” means a number of things here – understanding and remediating vulnerabilities, detecting and preventing threats, estimating risk to the business or mission, ensuring continuity of operations and disaster recovery, and enforcing compliance with policies and standards.

The process of Managing Assets includes tracking all hardware and software products (PCs, servers, network devices, storage, applications, mobile devices, etc.) in your enterprise, and checking them for secure configuration and known vulnerabilities. While some enterprises standardize on common infrastructure components, it still takes a variety of products, sensors, and data to obtain a near-real-time understanding of an organization’s security posture. In implementing an initial capability in our cybersecurity lab, we integrated fourteen products and spent considerable time massaging data so that it conforms to NIST and DoD standards. This lets pertinent information pass seamlessly across the infrastructure, and allows correlation and standardized reporting from a single dashboard. The integration will only grow more complex as we add more capabilities and more products. In addition, the new style of IT extends enterprise assets to include mobile devices and cloud resources – so our work to understand and manage security in this area is just beginning.
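To make that concrete, here is a minimal sketch (in Python) of the kind of adapter needed for each integrated product: it maps a vendor-specific scanner row onto one common record so a single dashboard can correlate across sensors. The record is loosely SCAP-flavored (CPE for platforms, CVE for vulnerabilities), but the field names, input keys, and sample row are hypothetical illustrations, not an actual NIST or DoD schema.

```python
from dataclasses import dataclass

# Hypothetical normalized asset record. CPE and CVE are real SCAP
# identifiers, but this record layout is illustrative only.
@dataclass
class AssetRecord:
    asset_id: str         # stable identifier assigned by our inventory
    cpe: str              # platform identifier in CPE form
    open_cves: list[str]  # known vulnerabilities reported against this asset
    last_seen: str        # ISO-8601 timestamp of the most recent sensor report

def normalize_scanner_row(row: dict) -> AssetRecord:
    """Map one vendor-specific scanner row onto the common record.

    Each integrated product needs its own adapter like this one;
    the input keys ('host_id', 'platform', ...) are hypothetical.
    """
    return AssetRecord(
        asset_id=row["host_id"],
        cpe=row["platform"],
        open_cves=sorted(set(row.get("findings", []))),
        last_seen=row["scan_time"],
    )

# One adapter per sensor feed gives the dashboard a uniform shape to query.
example = normalize_scanner_row({
    "host_id": "srv-0042",
    "platform": "cpe:2.3:o:canonical:ubuntu_linux:14.04",
    "findings": ["CVE-2014-0160"],
    "scan_time": "2015-03-01T12:00:00Z",
})
print(example.asset_id, len(example.open_cves), "open CVEs")
```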

The next area deals with Managing Accounts, covering both people and services. Typically, we think of account management as monitoring for too many unsuccessful login attempts or making sure an account is deleted when someone leaves the organization. However, if you look at the subsections of the graphic, you’ll see that managing accounts involves additional characteristics – evaluating trust, managing credentials, and analyzing behavioral patterns. This applies not only to people but to services – system-to-system, server-to-server, domain-to-domain, process-to-process, and any other combination thereof. Think about the implications of recording and analyzing behavior, and you’ll realize that any solution will require big data. Security-related behavior is a somewhat nebulous concept, but if you drill down into the details, you can envision capturing information on location, performance characteristics, schedule, keywords, encryption algorithms, access patterns, unstructured text, and more. Accounts (whether a person, system, computer, or service) use assets and groups of assets. As such, the information gleaned from the relationships and interactions between accounts and assets provides another layer of intelligence to the continuous monitoring framework.
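As a toy illustration of behavioral analysis on accounts, the sketch below flags two crude patterns: repeated failed logins and a successful login from a never-before-seen source. The event shape (account, source, timestamp, success) is hypothetical; a real deployment would correlate signals like these from log data at big-data scale.

```python
from collections import defaultdict

# Hypothetical threshold for "too many unsuccessful login attempts".
FAILED_THRESHOLD = 5

def flag_anomalies(events):
    """Yield (account, reason) pairs for suspicious login patterns."""
    failures = defaultdict(int)      # consecutive failures per account
    seen_sources = defaultdict(set)  # sources previously used per account
    for e in sorted(events, key=lambda e: e["timestamp"]):
        acct = e["account"]
        if not e["success"]:
            failures[acct] += 1
            if failures[acct] == FAILED_THRESHOLD:
                yield acct, "repeated failed logins"
        else:
            failures[acct] = 0
            # A success from a never-before-seen source is a crude
            # proxy for a change in the account's behavioral pattern.
            if seen_sources[acct] and e["source"] not in seen_sources[acct]:
                yield acct, f"new source {e['source']}"
            seen_sources[acct].add(e["source"])

events = [
    {"account": "svc-backup", "source": "10.0.0.5",
     "timestamp": "2015-03-01T02:00:00", "success": True},
    {"account": "svc-backup", "source": "203.0.113.9",
     "timestamp": "2015-03-01T02:05:00", "success": True},
]
for acct, reason in flag_anomalies(events):
    print(acct, "->", reason)
```

The same pattern applies unchanged to service accounts, which is why system-to-system and process-to-process interactions belong in the same monitoring stream as human logins.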

The Managing Events category is organized around preparing for and responding to incidents. In the cybersecurity realm, incidents can cover anything from a spear-phishing attack to denial of service, digital annihilation, critical infrastructure disruption, or destruction of physical plant. Protecting an organization’s assets against that spectrum demands a wide range of methods – and any physical asset that is somehow connected to a network needs cyber protection. The first thing to do to manage events is to plan! Backup and recovery, continuity of operations, business continuity, disaster recovery – call it what you will, but a plan will help you understand what needs protecting, how to protect it, and how to recover when those protections fail. This is the next layer of functionality – and complexity – in the continuous monitoring framework; the functions all build upon one another to provide a more secure and resilient enterprise. The security-related data and information aggregated across these layers provide the raw materials to understand your security posture and manage your risk.
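In the spirit of “plan first”, here is a minimal sketch of a plan-coverage check: for every asset you track, confirm that a protection and a recovery step exist before an incident does. The plan structure and asset names are hypothetical.

```python
# Hypothetical plan: each tracked asset should name both a protection
# and a recovery step; a None (or missing) entry is a planning gap.
plan = {
    "db-primary": {
        "protection": "nightly encrypted backup",
        "recovery": "restore to db-standby",
    },
    "web-frontend": {
        "protection": "load-balanced pair",
        "recovery": None,  # gap: no recovery step defined
    },
}

def coverage_gaps(plan):
    """Return (asset, field) pairs where the plan is missing an entry."""
    return [
        (asset, field)
        for asset, entry in plan.items()
        for field in ("protection", "recovery")
        if not entry.get(field)
    ]

for asset, missing in coverage_gaps(plan):
    print(f"{asset}: no {missing} defined")
```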

The final set of functions deals with Security Lifecycle Management. The lifecycle helps to identify the security requirements of an organization, the associated plans and policies needed to satisfy those requirements, and the processes used to monitor and improve security operations. That improvement is based on the data collected and correlated across all the other functional areas described above. Depending on the size of an organization (dozens of assets to millions of assets) and the granularity of the data (firewall alerts to packet capture), the continuous monitoring framework leads to “big security data”. Timing is also very important. Whereas today we mostly hear about cybersecurity incidents after the fact (hours, days, months, and sometimes years later), continuous monitoring operates on real-time or near-real-time information. The benefits are three-fold:

  • Intrusions, threats, and vulnerabilities can be detected much more quickly.
  • You can perform continuous authorization of your systems. Typically, after the initial Certification & Accreditation approval, re-certification occurs only after a major change to the system or once every two or three years.
  • Big security data can lead to predictive analytics – the holy grail of cybersecurity: the ability to accurately and consistently predict vulnerabilities, threats, and attacks against your enterprise, business, or mission.
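To illustrate the continuous-authorization idea, the sketch below re-evaluates a system’s controls on every fresh monitoring snapshot instead of waiting years for a periodic re-certification. The two controls and the system state are hypothetical placeholders for real control assessments.

```python
import time

# Hypothetical controls: each maps a name to a check over the latest
# monitored state of a system. Real assessments would be far richer.
CONTROLS = {
    "encryption_at_rest": lambda state: state.get("disk_encrypted", False),
    "patch_currency":     lambda state: state.get("days_since_patch", 999) <= 30,
}

def assess(state: dict) -> list[str]:
    """Return the controls that currently fail for this system state."""
    return [name for name, check in CONTROLS.items() if not check(state)]

def monitor(feed, interval_seconds=60):
    """Continuously authorize: poll the feed, report failing controls."""
    for state in feed:  # each item is a fresh monitoring snapshot
        failing = assess(state)
        status = "AUTHORIZED" if not failing else f"AT RISK: {failing}"
        print(state.get("system", "?"), status)
        time.sleep(interval_seconds)

# One-shot example instead of the long-running loop:
print(assess({"system": "hr-portal",
              "disk_encrypted": True,
              "days_since_patch": 45}))  # -> ['patch_currency']
```

The point of the design is the feedback loop: authorization becomes a property that is continuously re-derived from monitoring data, not a certificate that slowly goes stale.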

There are other benefits to this approach besides improving an organization’s security posture – because, let’s face it, everything I’ve described looks like it incurs additional cost. Yet after some initial investment, depending on the size of your organization and the security products already in your infrastructure, there are actual cost savings. At the top of the list, continuous monitoring automates many of today’s manual processes, reduces disruptions in your enterprise, and minimizes periodic accreditation costs. This is, however, a complex undertaking. We’ve learned a lot about what works and what doesn’t as we continue to integrate products and build continuous monitoring capabilities in our lab. Feel free to contact me for best practices or if you have any other questions.

This post first appeared at George Romas’ HP blog.

More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity, and emerging technology firms. He has extensive industry experience in intelligence and security. He was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an Infoworld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
