Big Data Journal: Article

Simplified Data Retention on a Massive Scale Speeds Access to Big Data

Organizations can gain competitive advantages when able to rely on data retention for improved decision making & trend analysis

There are numerous applications for cost-effective data retention. Organizations can gain substantial competitive advantages when they can rely on retained data for improved decision making and trend analysis. Research enterprises can make use of large-scale data sets, enabling them to study information more completely than ever before.

Simplified data retention on a massive scale speeds access to Big Data: data sets too large to analyze and manage using conventional methods. This data, in both structured and unstructured form, is valuable and comes from sources such as trading systems.

In many cases existing systems cannot process data of this variety and volume. Some organizations store such data in file systems so as not to overburden their databases. This may serve as a stopgap, but because Big Data is increasing at an exponential rate, it will not suffice in the long run. It's likely that machine-generated data will exceed the processing capability of conventional systems, and the cost of extracting value from this data can be so high that many organizations simply shy away from it.

Today technology is just beginning to address Big Data issues. Many organizations try to apply existing strategies to manage this data effectively. Standard methods, from relational database queries to complex analysis tools, are being used, and data retention software is also being applied to extract relevant information from Big Data sources.

Currently Big Data retention technology is available that is scalable and easy to implement. Using this technology, it's possible to access Big Data online using SQL along with business intelligence software. Such a system combines storage platforms running specialized software with a massive-scale data repository developed for online data retention. This Big Data management system is scalable and designed to process machine-generated data at 40:1 compression ratios while maintaining its online availability.
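The article does not name the product or its schema, so as a purely illustrative sketch, assuming a trades table exposed through a standard SQL interface, a BI-style aggregate query against such a repository might look like the following (using SQLite in memory as a stand-in; the table and column names are assumptions):

```python
import sqlite3

# Stand-in for the SQL-accessible archive described above; the schema
# (trades: symbol, volume) is a hypothetical example, not the vendor's.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, volume INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("AAPL", 5_000_000), ("INTC", 4_800_000),
     ("MSFT", 4_200_000), ("ORCL", 1_000_000)],
)

# Top three symbols by total traded volume, restricted to heavy traders
top3 = conn.execute(
    """SELECT symbol, SUM(volume) AS total
       FROM trades
       GROUP BY symbol
       HAVING total > 4000000
       ORDER BY total DESC
       LIMIT 3"""
).fetchall()
```

The point of the sketch is that once the archive is addressable through SQL, ordinary business intelligence tooling can query it without any special extraction step.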

Organizations that need to process Big Data may benefit from databases specifically designed for this purpose. Such databases are cost-effective and are currently in use in numerous organizations internationally. They work in parallel, allowing tens of billions of records to be processed each day, while retention capacity is practically limitless. These databases can run on content-addressable storage (CAS), direct-attached storage (DAS), or a storage area network (SAN). Benefits of this storage-and-retrieval approach include reduced infrastructure through lower physical storage demand and effective, configurable records management.
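The parallelism described above — partitioning a record stream across workers so billions of records can be handled per day — can be sketched in a few lines. This is a minimal illustration of the technique, not the vendor's implementation; the partition function and round-robin sharding are assumptions for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def process_partition(records):
    # Stand-in for per-partition work (parsing, filtering, indexing);
    # here we simply count even-valued records so the result is checkable.
    return sum(1 for r in records if r % 2 == 0)

records = list(range(1_000_000))
n_workers = 4
# Round-robin partitioning; a real system would shard by key or time range.
chunks = [records[i::n_workers] for i in range(n_workers)]
with ThreadPoolExecutor(max_workers=n_workers) as pool:
    per_partition = list(pool.map(process_partition, chunks))
total = sum(per_partition)
```

Because each partition is independent, throughput scales with the number of workers until storage bandwidth becomes the bottleneck — the same property that lets the parallel databases described above absorb tens of billions of records a day.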

One Big Data retention solution has three components. The first is a pair of server-level service managers that share metadata and provide import and query capability. The second is a data archive residing on a cluster-services node and storage nodes, designed with enough scalability to process billions of objects. The third is shared storage, which can be local direct-attached storage, a network file system, or a comprehensive clustered file system.

This type of system was recently tested on 508 GB of artificially generated stock trading data modeled after NASDAQ. Performance tests showed an import rate of close to 12 billion records per hour. Compression reduced the data by 476.1 GB, leaving an archive only about 6.3% of the original size. A SQL query was executed selecting the three largest-volume stocks, each with well over 4 million trades per day; run against 11.6 billion records, the query took approximately 5.5 seconds to execute.
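The reported figures hang together arithmetically. A quick back-of-the-envelope check (my own calculation, not part of the original test report):

```python
original_gb = 508.0
reduction_gb = 476.1
archive_gb = original_gb - reduction_gb            # 31.9 GB retained on disk
pct_of_original = archive_gb / original_gb * 100   # ~6.28%, i.e. "about 6.3%"
effective_ratio = original_gb / archive_gb         # ~15.9:1 for this data set
records_per_sec = 12e9 / 3600                      # ~3.3 million records/second
```

So an hourly import rate of 12 billion records corresponds to roughly 3.3 million records per second sustained, and the 476.1 GB reduction is exactly what a 6.3%-of-original archive implies.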

Big Data is high-volume, high-velocity, and often highly variable as well. Big Data retention solutions can lead to better decision making, new discoveries, and even process optimization. Science is a major area that can benefit from these solutions; meteorology is just one field that can reap rewards from new advances in data retention on a massive scale. The ability to do research and analysis on extremely large data sets gives greater understanding to those modeling weather, oceanographic conditions, the economy, or social trends. With cost-effective technology now available, many more organizations will consider the possibilities of Big Data retention in their enterprise.

More Stories By Alan McMahon

Alan McMahon works for Dell, where he has spent the past 13 years involved in enterprise solution design across a range of products, from servers and storage to virtualization. He now focuses on marketing for Dell. He is based in Ireland and enjoys sailing as a pastime.
