Continuous Monitoring – Part 2

By Bob Gourley

I previously wrote about the various “functional areas” of continuous monitoring. According to the federal model, there are 15 functional areas comprising a comprehensive continuous monitoring solution, as shown in the graphic below:

Federal CM Model

These functional areas are grouped into the following categories:

  • Manage Assets
  • Manage Accounts
  • Manage Events
  • Security Lifecycle Management

Each category addresses a general area of vulnerability in an enterprise. The rest of this post will describe each category, the complexities involved in integration, and the difficulties in making sense of the information. “Making sense” means a number of things here – understanding and remediating vulnerabilities, detecting and preventing threats, estimating risk to the business or mission, ensuring continuity of operations and disaster recovery, and enforcing compliance with policies and standards.

The process of Managing Assets includes tracking all hardware and software products (PCs, servers, network devices, storage, applications, mobile devices, etc.) in your enterprise, and checking for secure configuration and for known/existing vulnerabilities. While some enterprises institute standard infrastructure components, it still takes a variety of products, sensors and data to obtain a near-real-time understanding of an organization’s security posture. In implementing an initial capability in our cybersecurity lab, we have integrated fourteen products and spent considerable time massaging data so that it conforms to NIST and DoD standards. This enables pertinent information to pass seamlessly across the infrastructure, and allows correlation and standardized reporting from a single dashboard. This integration will become more complex as we add more capabilities and more products. In addition, the new style of IT extends enterprise assets to include mobile devices and cloud resources – so our work to understand and manage the security within this area is just beginning.
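To make that data-massaging step a little more concrete, here is a minimal sketch of normalizing vulnerability findings from two hypothetical scanner feeds into one common record so they can be correlated on a single dashboard. The feeds, field names, and severity buckets are illustrative assumptions, not the actual products integrated in our lab.

```python
# Minimal sketch: normalize findings from two hypothetical scanner feeds
# into a common record format so they can be correlated on one dashboard.
from dataclasses import dataclass


@dataclass(frozen=True)
class Finding:
    hostname: str   # asset identifier
    cve_id: str     # standardized vulnerability ID (CVE)
    severity: str   # normalized to LOW / MEDIUM / HIGH / CRITICAL
    source: str     # which sensor reported it


def from_scanner_a(record: dict) -> Finding:
    # Scanner A reports numeric CVSS scores; bucket them into severities.
    score = float(record["cvss"])
    severity = ("CRITICAL" if score >= 9.0 else
                "HIGH" if score >= 7.0 else
                "MEDIUM" if score >= 4.0 else "LOW")
    return Finding(record["host"], record["cve"], severity, "scanner_a")


def from_scanner_b(record: dict) -> Finding:
    # Scanner B already labels severity, but uses its own host field name.
    return Finding(record["asset_name"], record["cve_id"],
                   record["severity"].upper(), "scanner_b")


# Correlate: the same (host, CVE) pair seen by multiple sensors is one issue.
feed_a = [{"host": "web01", "cve": "CVE-2014-0160", "cvss": "7.5"}]
feed_b = [{"asset_name": "web01", "cve_id": "CVE-2014-0160", "severity": "high"}]

merged = {}
for f in [from_scanner_a(r) for r in feed_a] + [from_scanner_b(r) for r in feed_b]:
    merged.setdefault((f.hostname, f.cve_id), []).append(f)

for (host, cve), findings in merged.items():
    print(host, cve, "reported by", [f.source for f in findings])
```

The real work in the lab involves mapping tool output to NIST and DoD data standards rather than a hand-rolled schema, but the shape of the problem – different field names, different severity scales, one common record – is the same.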

The next area deals with Managing Accounts of both people and services. Typically, we think of account management as monitoring too many unsuccessful login attempts or making sure that we delete an account when someone leaves the organization. However, if you look at the subsections of the graphic, you’ll see that managing accounts involves additional characteristics – evaluating trust, managing credentials, and analyzing behavioral patterns. This applies not only to people but also to services – system-to-system, server-to-server, domain-to-domain, process-to-process, and any other combination thereof. Think about the implications of recording and analyzing behavior, and you’ll realize that any solution will require big data. Security-related behavior is a somewhat nebulous concept, but if you drill down into the details, you can envision capturing information on location, performance characteristics, schedule, keywords, encryption algorithms, access patterns, unstructured text, and more. Accounts (whether a person, system, computer, or service) use assets and groups of assets. As such, the information gleaned from the relationships and interactions between accounts and assets adds another layer of intelligence to the continuous monitoring framework.
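As a rough illustration of one behavioral check, the sketch below flags logins that fall outside an account’s historical hours or come from a location it has never used. The accounts, fields, and baseline logic are assumptions for illustration, not a production anomaly-detection model.

```python
# Minimal sketch of one "behavioral pattern" check: flag logins outside an
# account's historical hours or from a never-before-seen location.
from collections import defaultdict

history = [
    {"account": "svc-backup", "hour": 2, "location": "datacenter-1"},
    {"account": "svc-backup", "hour": 3, "location": "datacenter-1"},
    {"account": "jdoe",       "hour": 9, "location": "hq"},
]

# Build a simple baseline per account: hours seen and locations seen.
hours_seen = defaultdict(set)
locations_seen = defaultdict(set)
for event in history:
    hours_seen[event["account"]].add(event["hour"])
    locations_seen[event["account"]].add(event["location"])


def is_anomalous(event: dict) -> bool:
    """True if the login hour or location was never seen for this account."""
    acct = event["account"]
    return (event["hour"] not in hours_seen[acct]
            or event["location"] not in locations_seen[acct])


new_event = {"account": "svc-backup", "hour": 14, "location": "branch-office"}
print(is_anomalous(new_event))  # True: unusual hour and new location
```

Scale that idea up to every person, service, and system-to-system interaction in an enterprise and the big-data requirement becomes obvious.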

The Managing Events category is organized around preparing for and responding to incidents. In the cybersecurity realm, incidents can cover anything from a spear phishing attack to denial of service to digital annihilation to critical infrastructure disruption to destruction of physical plant. Protecting an organization’s assets against that range requires an equally wide range of methods – and any physical asset that is somehow connected to a network needs cyber protection. The first thing to do to manage events is to plan! Backup and recovery, continuity of operations, business continuity, disaster recovery – call it what you will, but a plan will help you to understand what needs protecting, how to protect it, and how to recover when those protections fail. This is the next layer of functionality – and complexity – in the continuous monitoring framework; the functions all build upon one another to provide a more secure and resilient enterprise. The security-related data and information aggregated across these layers provide the raw materials to understand your security posture and manage your risk.
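One way to make the “plan first” point concrete is to treat the plan itself as data that can be checked continuously. The sketch below uses illustrative asset names and steps (not any real plan format) to flag monitored assets that have no documented protection or recovery entry.

```python
# Minimal sketch: represent a recovery plan as data and check that every
# monitored asset has protection and recovery steps defined.
monitored_assets = ["web01", "db01", "fileserver01"]

recovery_plan = {
    "web01": {"protect": ["daily backup", "WAF"],
              "recover": ["restore from backup", "failover to web02"]},
    "db01":  {"protect": ["hourly snapshot"],
              "recover": ["restore latest snapshot"]},
}

# Any asset with no plan entry is a gap to close *before* an incident occurs.
gaps = [asset for asset in monitored_assets if asset not in recovery_plan]
print("Assets without a documented recovery plan:", gaps)  # ['fileserver01']
```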

The final set of functions deals with Security Lifecycle Management. The lifecycle helps to identify the security requirements of an organization, the associated plans and policies needed to satisfy those requirements, and the processes used to monitor and improve security operations. That improvement is based on the data collected and correlated across all the other functional areas described above. Depending on the size of an organization (dozens of assets to millions of assets) and the granularity of the data (firewall alerts to packet capture), the continuous monitoring framework leads to “big security data”. Timing is also very important. Whereas today we mostly hear about cybersecurity incidents after the fact (hours, days, months, and sometimes years later), continuous monitoring operates on real-time or near-real-time information. The benefits are threefold: 1) intrusions, threats and vulnerabilities can be detected much more quickly; 2) you can perform continuous authorization of your systems (typically, after the initial Certification & Authorization approval, re-certification occurs either after a major change in the system or once every two or three years); and 3) big security data can lead to predictive analytics. That’s the holy grail of cybersecurity – the ability to accurately and consistently predict vulnerabilities, threats, and attacks against your enterprise, business, or mission.
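To illustrate the difference between periodic re-certification and continuous authorization, here is a small sketch that keeps a running posture score as findings arrive or are remediated and flags when the score crosses a threshold. The severity weights and threshold are assumptions for illustration, not a federal scoring standard.

```python
# Minimal sketch of continuous (rather than periodic) assessment: maintain a
# running posture score from streaming findings and flag when it drops below
# an authorization threshold.
SEVERITY_WEIGHT = {"LOW": 1, "MEDIUM": 3, "HIGH": 7, "CRITICAL": 15}
AUTHORIZATION_THRESHOLD = 20   # total open-finding weight we tolerate

open_findings = {}   # finding_id -> severity


def ingest(finding_id: str, severity: str, resolved: bool = False) -> None:
    """Update the posture as findings arrive or are remediated."""
    if resolved:
        open_findings.pop(finding_id, None)
    else:
        open_findings[finding_id] = severity
    score = sum(SEVERITY_WEIGHT[s] for s in open_findings.values())
    status = "OK" if score <= AUTHORIZATION_THRESHOLD else "REVIEW REQUIRED"
    print(f"posture score={score} -> {status}")


ingest("F-001", "HIGH")                 # score=7  -> OK
ingest("F-002", "CRITICAL")             # score=22 -> REVIEW REQUIRED
ingest("F-001", "HIGH", resolved=True)  # score=15 -> OK
```

The point is that the authorization decision is re-evaluated every time the data changes, instead of once every two or three years.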

There are other benefits to this approach besides improving an organization’s security posture – because, let’s face it, all the things I’ve described look like they incur additional costs. Yet after some initial investment, depending on the size of your organization and the security products you already have in your infrastructure, there are actually cost savings. At the top of the list, continuous monitoring automates many of the manual processes you have today, reduces disruptions in your enterprise, and minimizes periodic accreditation costs. This is, however, a complex undertaking. We’ve learned a lot about what works and what doesn’t as we continue to integrate products and build continuous monitoring capabilities in our lab. Feel free to contact me for best practices or if you have any other questions.

This post first appeared at George Romas’ HP blog.

 


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and the publisher of CTOvision.com.
