
Effective Storage for Growing Data Volumes

By Ian Kilpatrick, chairman of Wick Hill Group

Dealing with vast amounts of data used to be a problem faced purely by large enterprises. However, in today's world of rapidly increasing data, it's now an issue for companies of all sizes.

Data is increasing for all sorts of reasons, including the growth of social media and the spread of mobile devices such as smartphones and tablets, which have dramatically increased the amount of data being backed up on the network.

Symantec's recent State of Information Survey, conducted across 38 countries worldwide, found that SMEs expected their storage needs to increase by 178% over the next year.

The large amounts of data companies now produce, which can run to terabytes or even petabytes, need to be backed up and stored so they can be accessed easily and quickly. Data needs to be archived in case it is needed in the future, and for compliance reasons; and it needs to be replicated, so it's available in the event of a disaster. All this needs to be done cost-effectively and securely.

Storage is a notoriously boring subject for many people, and it tends to get pushed down the list of priorities. While large enterprises have the resources to manage their back-up and storage properly, many smaller organisations consider it a necessary evil and don't review its financial effectiveness or its fitness for future purpose.

This is a shame, as in all matters IT, times have moved on from things such as old-style tape systems. Modern solutions are bigger, better, easier and lower cost, often with features previously only available on enterprise systems.

Increases in data volume now make storage a key business issue for companies. Performance issues, caused by networks being overwhelmed by data volumes or by back-up traffic, could impact profitability. The inability to access data quickly, in the event of a disaster, could put a company out of business.

Some recent statistics indicate that 43% of businesses that close after a natural disaster never re-open, and a further 29% close within two years. One of the key reasons for this is the failure of their disaster-recovery planning. Back-ups and back-up plans need to be tested regularly to ensure that they are relevant and actually working. Finding that your back-ups are corrupted, when you absolutely need them, is too late.
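Regular testing need not be elaborate. One simple routine check, sketched below purely as an illustration (the function names are hypothetical, not from any particular product), is to verify that a back-up copy still matches its source by comparing checksums:

```python
import hashlib

def checksum(path):
    """SHA-256 of a file, read in chunks so large back-ups fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):
            h.update(block)
    return h.hexdigest()

def verify_backup(original_path, backup_path):
    """A back-up you never test is a back-up you cannot trust:
    compare the current checksum of the source with that of the copy."""
    return checksum(original_path) == checksum(backup_path)
```

A full disaster-recovery test should go further, of course, and actually restore data to a clean system; a checksum comparison only catches silent corruption of the copy.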

Which route forward?
The question is how to choose a storage solution that will cope with current storage needs while also taking you forward into a future of unpredictable, mushrooming data growth.

Storage solutions for companies outside the Top 500 range from traditional tape to the cloud, with other options and permutations in between. For some organisations, cloud may be the answer. For others, particularly if they have been using tape in the past, the leap to the cloud is just too great and they want something more tangible, such as disc-based storage.

For some, the solution may be a hybrid one which gives local back-up with background cloud back-up. This option combines the speed of local access with the security of offsite cloud storage.

Companies still using traditional tape for back-up and archiving will, if they have reviewed it recently, be finding it increasingly inadequate for their needs. Tape also has inherent disadvantages.

It's cumbersome, expensive, has a finite life and is easily damaged. Back-ups take longer and longer as the volume of data grows. And it can be very difficult to find things quickly on tape when you need them.

The cloud, while it may seem like a great option to many, isn't for everyone. One disadvantage of the cloud, which many aren't aware of, is that the data is probably going to be stored on traditional tape.

Another issue to be aware of is how long it takes to seed and download data over limited-capacity internet connections. Seeding a terabyte of data to the cloud over a 10Mbps connection, with nothing else using the link, will take over 220 hours (more than nine days), even before protocol overheads are accounted for.
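The back-of-the-envelope arithmetic is worth having to hand when sizing a connection. A small sketch, assuming an idealised link running flat out with no protocol overhead and decimal units (1 TB = 8×10¹² bits):

```python
def transfer_hours(data_terabytes, link_mbps):
    """Hours to move data over a link at full line rate.

    Idealised estimate: decimal units (1 TB = 8e12 bits,
    1 Mbps = 1e6 bits/s), no protocol or retransmission overhead.
    Real-world transfers will take longer.
    """
    bits = data_terabytes * 8e12
    seconds = bits / (link_mbps * 1e6)
    return seconds / 3600

# Seeding 1 TB over a dedicated 10Mbps link:
# transfer_hours(1, 10) -> about 222 hours (roughly nine days)
```

Doubling the data or halving the link speed doubles the time, which is why initial cloud seeding is often done by shipping physical media rather than over the wire.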

This means that accessing and downloading it could take days. In a serious disaster-recovery situation, this may be acceptable, but it's a serious hindrance if you need to access stored data quickly during the normal day-to-day running of a business.

One solution for more conservative SMEs is to use RDX removable hard disc cartridges for storage. They combine the best of hard disc and tape storage. They scale with a business but, unlike tape, are very rugged and reliable. For those who have been using tape, it's a step forward that isn't too different from what went before.

RDX cartridges back up and restore data very quickly (much faster than tape) and are very secure. They are typically available in sizes from 160 gigabytes to 1.5 terabytes. You just add extra cartridges as your storage needs expand.

Imation, for example, which has a specialist division for SME storage, provides an RDX solution, the A8, which can accommodate up to 12 terabytes of data.

The A8 helps SMEs with high-performance back-up, data protection, archiving, restoration and cloud enablement. It can mix RDX cartridges of different sizes, so users can start with lower-capacity cartridges and add higher-capacity ones as needs dictate.

A solution like this allows organisations to quickly back up and instantly access their crucial data. It gives them more operational flexibility and the ability to cost-effectively and quickly recover their data in the event of a disaster. Users get the benefits of RDX cartridge storage, but also keep cloud options open.

Imation also provides another product which gives companies a comprehensive set of storage and back-up options.

DataGuard is a network attached storage (NAS) backup appliance which uses hard disk drives, removable RDX® disk cartridges, replication, and cloud storage to provide up to four layers of data protection. It shortens back-up windows and allows for fast recovery.

DataGuard is capable of making multiple copies of content as local online copies, replicated copies, optional offline RDX copies and remote online (cloud) copies.

It means companies can have all bases covered. They don't have to go with the cloud straight away, but the facility is there to do it when and if they are ready.

Another company offering the best of both worlds, local and cloud storage, is Barracuda with its Barracuda Backup Service.

This provides full local data back-up, combined with a storage subscription that replicates data to the cloud at two offsite locations. So organisations get onsite back-ups for fast restore times and secure, offsite storage for disaster recovery.

The Barracuda system uses a technology called deduplication, which reduces traditional back-up storage requirements by 20 to 50 times, while also reducing back-up windows and bandwidth requirements.

Deduplication works by eliminating redundant data. Only one unique instance of the data is actually retained. The redundant data is replaced with a pointer to the original copy.
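The mechanism can be shown in a toy sketch (illustrative only, and not how Barracuda's engine is actually implemented): chunks of data are identified by a hash, each unique chunk is stored once, and repeats are recorded as pointers to the stored copy.

```python
import hashlib

def deduplicate(chunks):
    """Store each unique chunk once; redundant chunks become
    pointers (here, SHA-256 digests) to the original copy."""
    store = {}     # digest -> unique chunk data
    pointers = []  # one pointer per original chunk, in order
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:
            store[digest] = chunk  # first instance: keep the data
        pointers.append(digest)    # repeats: keep only the pointer
    return store, pointers

def restore(store, pointers):
    """Rebuild the original stream by following the pointers."""
    return b"".join(store[p] for p in pointers)
```

Because back-ups of the same systems contain mostly unchanged data from one run to the next, the pointer list is far smaller than the data it replaces, which is where the 20- to 50-fold storage reduction comes from.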

One constant is guaranteed: storage and data access requirements will, as they always have done, continue to grow, and the pace appears to be accelerating. Alongside the change in data volumes, new options have become available which provide enterprise-level solutions at affordable prices.

A range of such options is available, from RDX cartridges to cloud services, plus a variety of combinations in between. Such solutions allow organisations to back up and store data effectively, so it doesn't cause serious performance problems on the network; they make data quickly available both for the running of the business and for compliance purposes; and they offer a disaster recovery option.

Bio of author
Ian Kilpatrick is chairman of international value added distributor Wick Hill Group plc, specialists in market development for secure IP infrastructure solutions and convergence. Kilpatrick has been involved with the Group for more than 35 years. Wick Hill supplies organisations from enterprises to SMEs, through an extensive value-added network of accredited VARs.

Kilpatrick has in-depth experience of IT and unified communications (UC), with a strong vision of the future. He looks at these areas from a business point of view, and his approach reflects his philosophy that business benefits, ease of use and cost of ownership are key factors, rather than just technology. He has authored numerous articles and publications and is a regular speaker at conferences, exhibitions and seminars. For more information, please visit the Wick Hill website.


For further press information, please contact Annabelle Brown on 01326 318212, email [email protected]. For reader queries, contact Wick Hill on 01483 227600. For a photo of Ian Kilpatrick, please contact Annabelle Brown.


More Stories By RealWire News Distribution

RealWire is a global news release distribution service specialising in online media. The RealWire approach focuses on delivering relevant content to the receivers of our clients' news releases, since it is only through relevance that influence can be achieved.
