
Continuous Monitoring – Part 2

By Bob Gourley

I previously wrote about the various “functional areas” of continuous monitoring. According to the federal model, there are 15 functional areas comprising a comprehensive continuous monitoring solution, as shown in the graphic below:

[Figure: Federal CM Model]

These functional areas are grouped into the following categories:

  • Manage Assets
  • Manage Accounts
  • Manage Events
  • Security Lifecycle Management

Each category addresses a general area of vulnerability in an enterprise. The rest of this post will describe each category, the complexities involved in integration, and the difficulties in making sense of the information. “Making sense” means a number of things here – understanding and remediating vulnerabilities, detecting and preventing threats, estimating risk to the business or mission, ensuring continuity of operations and disaster recovery, and enforcing compliance with policies and standards.

The process of Managing Assets includes tracking all hardware and software products (PCs, servers, network devices, storage, applications, mobile devices, etc.) in your enterprise, and checking them for secure configuration and for known vulnerabilities. While some enterprises standardize on common infrastructure components, it still takes a variety of products, sensors and data to obtain a near-real-time understanding of an organization’s security posture. While implementing an initial capability in my cybersecurity lab, we integrated fourteen products and spent considerable time massaging data so that it conforms to NIST and DoD standards. This enables pertinent information to pass seamlessly across the infrastructure, and allows correlation and standardized reporting from a single dashboard. The integration will only become more complex as we add more capabilities and more products. In addition, the new style of IT extends enterprise assets to include mobile devices and cloud resources – so our work to understand and manage security in this area is just beginning.
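To make the data-massaging step concrete, here is a minimal sketch of the kind of normalization layer this integration work implies. It is illustrative only – the field names and the per-product adapter are hypothetical – but it shows the general pattern: each product’s proprietary output is mapped into one common record shape keyed to NIST identifiers (CPE for platforms, CVE for vulnerabilities, CVSS for severity), so the correlation and dashboard code only ever sees one format.

```python
from dataclasses import dataclass

@dataclass
class AssetFinding:
    asset_id: str  # inventory key for the hardware/software asset
    cpe: str       # platform identifier (NIST CPE naming scheme)
    cve: str       # vulnerability identifier (CVE)
    cvss: float    # severity (CVSS base score, 0.0-10.0)
    source: str    # which integrated product reported the finding

def normalize_scanner_a(raw: dict) -> AssetFinding:
    # Hypothetical adapter for one product's proprietary output;
    # each of the fourteen integrated products would get its own.
    return AssetFinding(
        asset_id=raw["host"],
        cpe=raw["platform"],
        cve=raw["vuln_id"],
        cvss=float(raw["score"]),
        source="scanner_a",
    )

record = normalize_scanner_a(
    {"host": "srv-042", "platform": "cpe:/a:openssl:openssl:1.0.1",
     "vuln_id": "CVE-2014-0160", "score": "7.5"}
)
```

Downstream correlation and reporting code never touches a product-specific format, which is what makes the single-dashboard view possible.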

The next area, Managing Accounts, deals with the accounts of both people and services. Typically, we think of account management as monitoring for too many unsuccessful login attempts, or making sure that we delete an account when someone leaves the organization. However, if you look at the subsections of the graphic, you’ll see that managing accounts involves additional characteristics – evaluating trust, managing credentials, and analyzing behavioral patterns. This applies not only to people but to services – system-to-system, server-to-server, domain-to-domain, process-to-process, and any other combination thereof. Think about the implications of recording and analyzing behavior, and you’ll realize that any solution will require big data. Security-related behavior is a somewhat nebulous concept, but if you drill down into the details, you can envision capturing information on location, performance characteristics, schedule, keywords, encryption algorithms, access patterns, unstructured text, and more. Accounts (whether a person, system, computer, or service) use assets and groups of assets. As such, the information gleaned from the relationships and interactions between accounts and assets provides another layer of intelligence to the continuous monitoring framework.
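As a toy illustration of the behavioral-pattern idea, the sketch below builds a per-account baseline of login hours and flags events that fall outside it. Everything here is hypothetical and deliberately simple; a real system would compute baselines over many more features (location, access patterns, service-to-service calls) and at a scale that is exactly where the big-data requirement comes from.

```python
from collections import defaultdict

baseline_hours = defaultdict(set)  # account -> hours of day seen in history

def train(events):
    # events: iterable of (account, hour_of_day) pairs from historical logs
    for account, hour in events:
        baseline_hours[account].add(hour)

def is_anomalous(account: str, hour: int) -> bool:
    # An unknown account, or a never-before-seen hour, merits a closer look.
    seen = baseline_hours.get(account)
    return not seen or hour not in seen

train([("svc-backup", 2), ("svc-backup", 3), ("alice", 9), ("alice", 10)])
print(is_anomalous("svc-backup", 3))  # False – fits the baseline
print(is_anomalous("alice", 2))       # True – a 2 a.m. login is unusual for alice
```

The same pattern applies unchanged to service accounts; only the event source differs.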

The Managing Events category is organized around preparing for and responding to incidents. In the cybersecurity realm, incidents can cover anything from a spear-phishing attack to denial of service to digital annihilation to critical infrastructure disruption to destruction of physical plant. Protecting an organization’s assets against that range requires an equally wide range of methods – and any physical asset that is somehow connected to a network needs cyber protection. The first thing to do to manage events is to plan! Backup and recovery, continuity of operations, business continuity, disaster recovery – call it what you will, but a plan will help you understand what needs protecting, how to protect it, and how to recover when those protections fail. This is the next layer of functionality – and complexity – in the continuous monitoring framework; the functions all build upon one another to provide a more secure and resilient enterprise. The security-related data and information aggregated across these layers provide the raw materials to understand your security posture and manage your risk.
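To show how detection and the plan connect, here is a minimal sketch of a correlation rule wired to a playbook action. The window, threshold, and response steps are illustrative placeholders for whatever an organization’s own plan prescribes.

```python
from collections import deque
import time

WINDOW_SECONDS = 60   # correlation window
THRESHOLD = 5         # failed logins within the window that trigger a response
recent_failures = deque()  # timestamps of failed logins for one host

def on_failed_login(ts: float) -> None:
    recent_failures.append(ts)
    # Discard events that have aged out of the correlation window.
    while recent_failures and ts - recent_failures[0] > WINDOW_SECONDS:
        recent_failures.popleft()
    if len(recent_failures) >= THRESHOLD:
        open_incident("possible brute-force attempt")

def open_incident(summary: str) -> None:
    # Stand-in for the plan's actual steps: notify the on-call responder,
    # preserve evidence, and optionally isolate the affected host.
    print(f"INCIDENT at {time.ctime()}: {summary}")

for offset in range(5):  # five rapid failures trip the rule
    on_failed_login(time.time() + offset)
```

The value is not in any single rule but in having the response path decided – and tested – before the incident occurs.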

The final set of functions deals with Security Lifecycle Management. The lifecycle helps to identify the security requirements of an organization, the associated plans and policies needed to satisfy those requirements, and the processes used to monitor and improve security operations. That improvement is based on the data collected and correlated across all the other functional areas described above. Depending on the size of an organization (dozens of assets to millions of assets) and the granularity of the data (firewall alerts to packet capture), the continuous monitoring framework leads to “big security data”. Timing is also very important. Whereas today we mostly hear about cybersecurity incidents after the fact (hours, days, months, and sometimes years later), continuous monitoring operates on real-time or near-real-time information. The benefits are three-fold: 1) intrusions, threats, and vulnerabilities can be detected much more quickly; 2) you can perform continuous authorization of your systems – typically, after the initial Certification & Accreditation approval, re-certification occurs either after a major change in the system or once every two or three years; and 3) big security data can lead to predictive analytics. That’s the holy grail of cybersecurity – the ability to accurately and consistently predict vulnerabilities, threats, and attacks against your enterprise, business, or mission.
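As a sketch of what continuous authorization could look like in practice: rather than re-certifying every few years, recompute a simple risk score whenever new monitoring data arrives, and flag the system when it drifts past the risk level accepted at initial authorization. The weighting scheme and threshold below are illustrative assumptions, not taken from any federal standard.

```python
ACCEPTED_RISK = 50.0  # risk level signed off at initial authorization

def risk_score(findings) -> float:
    # findings: iterable of (cvss, criticality) pairs, where criticality
    # is a 1-3 weight the system owner assigns to the affected asset.
    return sum(cvss * criticality for cvss, criticality in findings)

def check_authorization(findings) -> str:
    score = risk_score(findings)
    if score > ACCEPTED_RISK:
        return f"re-authorization review required (risk {score:.1f})"
    return f"authorization holds (risk {score:.1f})"

print(check_authorization([(7.5, 2), (5.0, 1)]))  # holds (risk 20.0)
print(check_authorization([(9.8, 3), (7.5, 3)]))  # review required (risk 51.9)
```

A score history like this, accumulated across an enterprise, is also the raw material for the predictive analytics mentioned above.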

There are other benefits to this approach besides improving an organization’s security posture – because, let’s face it, everything I’ve described looks like it incurs additional cost. Yet after some initial investment, depending on the size of your organization and the security products you already have in your infrastructure, there are actually cost savings. At the top of the list, continuous monitoring automates many of the manual processes you have today, reduces disruptions in your enterprise, and minimizes periodic accreditation costs. This is, however, a complex undertaking. We’ve learned a lot about what works and what doesn’t as we continue to integrate products and build continuous monitoring capabilities in our lab. Feel free to contact me for best practices or if you have any other questions.

This post first appeared at George Romas’ HP blog.


More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity and emerging technology firms. He has extensive industry experience in intelligence and security, was awarded an intelligence community meritorious achievement award by AFCEA in 2008, and has been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
