
Don't Stick Your Head in the Sand, Create a Proactive Security Strategy

Preventing data leakage from the cloud

In business, data is currency. It is the oil that keeps the commercial engine in motion, and databases are the digital banks that store and retrieve this valuable information. According to IDC, data is doubling every two years, and as the overall volume grows, so does the amount of sensitive and regulated data that demands high levels of security. Yet, again according to IDC, only about a quarter of that data is properly protected today. Like all currency, data must be protected.

And herein lies a key issue. Too many executives see security as a cost center and are reticent to invest beyond the bare minimum: whatever keeps the nasty viruses out, whatever is absolutely necessary for compliance. Their thought process runs along the lines of "we haven't been attacked before" or "we don't have a high enough profile for hackers to care." I call this "ostriching": putting your head in the sand and hoping misfortune never darkens your door.

To justify this attitude, many organizations point to on-premises protection that encrypts or monitors network traffic containing critical information. For the average company, this can be a budget buster and a significant resource drain... that is, until they look toward the cloud and explore cloud-based security options.

Yet regardless of deployment options, most security experts will agree the best defense is a proactive strategy.

Data leak prevention (DLP), like most security efforts, is a complex challenge: it is meant to stop both the deliberate and the inadvertent release of sensitive information. Yet too many companies try to cure the symptoms rather than prevent them in the first place.

Part of the protection equation is being overlooked: database management systems must also be a component of a proactive data security strategy. Like the bank vault, the database requires strong protections at its foundation. DLP is one part of a comprehensive enterprise data security program that includes security best practices for protecting mission-critical data repositories. That security must both foil attackers who are financially motivated and won't be deterred by minimalist defenses, and prevent the accidental release of data. Data security will go nowhere without robust, proactive database security.

To achieve these goals, organizations need to implement functions that comprise a variety of solutions. Used cooperatively, they let a company instantly discover who is doing what and when on the network, identify the potential impact, and take the necessary steps to prevent or allow access and usage. Just like a bank vault: security cameras follow you to see who you are, you need a password to get into the vault itself (during business hours!), and you're only allowed to open your own safety deposit box (as long as you have the key). Here are four proactive measures you can take:

Intrusion detection (security information and event monitoring): The first step in protection is to know who is proverbially knocking on the door... or sneaking around the back entrance. Activity monitoring and blocking is the first line of defense at your firewall and beyond (this includes BYOD access). Vigilance on the front lines creates real-time correlation to detect patterns of traffic, spot usage anomalies, and prevent internal or external attacks. SIEM provides the forensic analysis that determines whether access to a network is friendly and permissible, suspicious, or threatening. This analysis is the basis for the alerts that trigger appropriate action to prevent data leakage.
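The kind of real-time correlation described above can be sketched as a simple sliding-window rule. This is a minimal, hypothetical illustration (the event shape, threshold, window size, and IP addresses are all assumptions, not any particular SIEM product's behavior):

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # illustrative sliding window
MAX_FAILURES = 3      # illustrative alert threshold

def correlate(events):
    """events: iterable of (timestamp, source_ip, outcome) tuples,
    chronological per source. Returns the set of source IPs that
    produced MAX_FAILURES or more failures inside the window."""
    recent = defaultdict(deque)   # source_ip -> timestamps of failures
    flagged = set()
    for ts, ip, outcome in events:
        if outcome != "failure":
            continue
        window = recent[ip]
        window.append(ts)
        # Discard failures that fell outside the sliding window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= MAX_FAILURES:
            flagged.add(ip)
    return flagged

events = [
    (0, "10.0.0.5", "failure"),
    (10, "10.0.0.5", "failure"),
    (20, "10.0.0.5", "failure"),   # third failure within 60s -> flagged
    (0, "10.0.0.9", "failure"),
    (300, "10.0.0.9", "failure"),  # failures too far apart -> not flagged
]
print(correlate(events))  # {'10.0.0.5'}
```

A real SIEM correlates far richer event streams (ports, geolocation, user agents), but the core pattern, accumulate, window, threshold, alert, is the same.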

Traffic monitoring (log management): Once you know who's accessing the network, log management makes sense of patterns and historical usage so you can identify suspicious IP addresses, locations, and users as likely transgressors. If you can predict the traffic, you can create rules to block sources, prevent access, and create a reportable audit trail of activity. But to be proactive, monitoring must be continuous and in real time. Combing through reams of machine logs days or weeks later might uncover breaches and careless users, but it can't prevent them. It is the proverbial equivalent of chasing the horse after it has left the barn.
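Comparing current traffic against historical usage can be sketched as a baseline check. The function name, deviation factor, and addresses below are illustrative assumptions, not taken from any specific log-management product:

```python
def anomalous_sources(baseline, current, factor=5.0):
    """baseline/current: dicts mapping source IP -> request count.
    Flags IPs whose current volume exceeds `factor` times their
    historical baseline, plus IPs never seen before."""
    flagged = []
    for ip, count in current.items():
        expected = baseline.get(ip)
        if expected is None or count > factor * expected:
            flagged.append(ip)
    return sorted(flagged)

baseline = {"192.0.2.1": 100, "192.0.2.2": 80}
current = {"192.0.2.1": 120,      # near baseline -> fine
           "192.0.2.2": 900,      # >5x baseline -> flagged
           "203.0.113.7": 40}     # never seen before -> flagged
print(anomalous_sources(baseline, current))
# ['192.0.2.2', '203.0.113.7']
```

Run continuously over streaming logs, a rule like this feeds the blocking and audit-trail actions described above while the traffic is still in flight, rather than weeks later.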

Provisioning (identity management): One of the best ways to ensure users access only the data they are entitled to see or use is through proper delegation of user rights, handled through identity management provisioning. In all too many documented cases, a user (typically an employee) leaves the fold but never relinquishes access to sensitive information. Just as provisioning gives users certain rights, automatic deprovisioning keeps former employees and others away from restricted sections of your database. And when provisioning is connected to SIEM and log management, if deprovisioned users try to use retired passwords or accounts, you know about it as it happens.
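The tie between deprovisioning and monitoring can be sketched as a check of each login against the active-user directory; any hit on a retired account raises an immediate alert. Account names and the alert format here are purely hypothetical:

```python
# Illustrative active-user directory; in practice this would come from
# the identity management system, not a hard-coded set.
active_users = {"alice", "bob"}

def check_login(username, alerts):
    """Permit logins only for provisioned accounts; record an alert
    (for the SIEM/log-management layer) for anyone else."""
    if username not in active_users:
        alerts.append(f"ALERT: retired account '{username}' attempted login")
        return False
    return True

alerts = []
check_login("alice", alerts)    # current employee -> permitted
check_login("mallory", alerts)  # deprovisioned account -> alert raised
print(alerts)
# ["ALERT: retired account 'mallory' attempted login"]
```

The point of the sketch is the wiring, not the lookup: deprovisioning an account is only half the job; the alert path is what turns a blocked login into actionable intelligence.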

Authentication and credentialing (access management): This is more than password management (and making sure those codes are more substantial than "password123"). It means controlling access with at least two credentials: multi-factor authentication. For example, a hospital may require authorized personnel to present login credentials such as a password plus a unique variable code to access protected or sensitive areas of the network or database. In doing so, it gains additional protection against the use of lost or stolen credentials. It is another layer of defense that can deflect potential data leakage.
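The "password plus a unique variable code" pattern can be sketched with an RFC 4226-style HOTP one-time code as the second factor. This is a teaching sketch only; the shared secret, stored hash, and account details are illustrative, and a real deployment would use a vetted authentication library:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                                   # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(password, otp, stored_hash, secret, counter):
    """Both factors must check out: something you know + something you have."""
    pw_ok = hashlib.sha256(password.encode()).hexdigest() == stored_hash
    otp_ok = hmac.compare_digest(otp, hotp(secret, counter))
    return pw_ok and otp_ok

secret = b"12345678901234567890"          # RFC 4226 test secret
stored = hashlib.sha256(b"correct horse").hexdigest()
print(verify("correct horse", hotp(secret, 0), stored, secret, 0))  # True
print(verify("password123", hotp(secret, 0), stored, secret, 0))    # False
```

Even with the one-time code in hand, a stolen "password123" gets an attacker nowhere, which is exactly the extra layer the hospital example relies on.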

In this assessment, there are at least four individual solutions that require implementation and monitoring. If the executives were unwilling before, how can an IT department muster the leverage to find the money or the staffing to deploy this preventive strategy? The good news is they don't have to do either. A unified security model (real-time event and access correlation technology) delivered from the cloud combines the capabilities of each of these toolsets into a strong, cost-effective enterprise platform. It leverages the key features in a single, cooperative, centralized source that enhances visibility throughout the enterprise. All the cost-saving benefits inherent in cloud computing are realized, and as security-as-a-service, the need for additional headcount is moot. Part of the service is live expert analysts watching over your virtual borders 24/7/365.

An additional benefit is the ability to fold existing investments into a REACT (real-time event and access correlation technology) platform. If a company previously invested in a log management or single sign-on solution, it can easily integrate the other pieces of the puzzle to ensure a layered, holistic approach, so all the independent silos are monitored and covered. Because each of the solutions interacts and intersects with the others, the seamless communication creates a layered, responsive defense that anticipates, controls, and alerts, as opposed to attempting to put the toothpaste back in the tube. The damage of a breach (whether through user carelessness, internal sabotage, or direct attack) is more than the compliance fines and the blowback on the data currency affected. Substantial and detrimental as those are, they can't touch the cost of broken trust. That, in itself, is a driving reason to get ahead of the issue with proactive security.

As enterprise systems are exposed to substantial risk from data loss, theft, or manipulation, a unified security platform from the cloud strikes the fine balance between data leakage prevention, protection of IP assets, and maintenance of compliance standards on one side, and cost and resource responsibility on the other. It is an accountable way of becoming proactive.

Kevin Nikkhoo

CloudAccess

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup Cloud Access. CloudAccess is at the forefront of the latest evolution of IT asset protection--the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, Master of Computer Engineering at California State University, Los Angeles, and an MBA from the University of Southern California with emphasis in entrepreneurial studies.
