Big Data Journal: Blog Feed Post

HP's New StoreEver Hardware

Offers organizations the opportunity to meet some of the high demands of Big Data

In January of this year, HP made a huge leap forward: it announced the transition of its entire tape storage portfolio to the LTO-6 specification, branded as HP StoreEver Storage. This announcement has major implications, especially for organizations trying to manage a constantly growing pool of data, much of which is simply archival.

The Benefits of LTO-6
For some organizations, this move will prove an immediate benefit. LTO-6 doubles the storage capacity of LTO-5 and increases transfer speed by roughly half.

Transfer rates are estimated at approximately 1.4 terabytes per tape drive per hour. This promises a new environment for storing and retrieving cloud computing data, as well as new options for disaster recovery configurations. LTO-6 is also more scalable and affordable than many of the other archival options on the market today.
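To put that throughput figure in context, here is a minimal sketch that estimates backup windows from the roughly 1.4 TB-per-drive-per-hour rate quoted above. The function name and the parallel-drive scaling are illustrative assumptions, not part of HP's specification; real-world rates depend on compression ratio and how well the data stream keeps the drive busy.

```python
# Estimate how long an archive takes to stream to LTO-6 tape,
# assuming the ~1.4 TB/hour-per-drive figure quoted above.
# (Assumption: throughput scales linearly across parallel drives.)

TB_PER_HOUR_PER_DRIVE = 1.4  # assumed effective throughput per drive


def backup_hours(dataset_tb: float, drives: int = 1) -> float:
    """Hours to write dataset_tb terabytes across `drives` parallel drives."""
    return dataset_tb / (TB_PER_HOUR_PER_DRIVE * drives)


print(round(backup_hours(100), 1))            # 100 TB on a single drive
print(round(backup_hours(100, drives=4), 1))  # same archive across four drives
```

Under these assumptions, a 100 TB archive occupies a single drive for about three days, which is why multi-drive libraries matter at Big Data scale.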

Longer Shelf Life
This move may also enable HP customers to better maintain archival data over longer periods of time. LTO-6 cartridges, according to HP, have a shelf life of as much as three decades. A single tape can handle 6.25 terabytes of (compressed) data, giving organizations a whopping 44 petabytes in just one StoreEver tape library.
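The capacity figures above can be sanity-checked with a little arithmetic. This sketch works back from the quoted numbers (6.25 TB compressed per cartridge, 44 PB per library); the implied slot count is an inference from those figures, not an HP-published specification, and the helper function is purely illustrative.

```python
import math

# Figures quoted above: 6.25 TB (compressed) per LTO-6 cartridge,
# up to 44 PB in a single StoreEver tape library.
TB_PER_CARTRIDGE = 6.25
LIBRARY_PB = 44

# Implied number of cartridge slots in a fully populated library
# (an inference from the quoted figures, using 1 PB = 1000 TB).
slots_implied = int((LIBRARY_PB * 1000) / TB_PER_CARTRIDGE)
print(slots_implied)


def cartridges_for(archive_tb: float) -> int:
    """Cartridges required to hold an archive, rounding up."""
    return math.ceil(archive_tb / TB_PER_CARTRIDGE)


print(cartridges_for(500))  # cartridges for a 500 TB archive
```

By this arithmetic, the 44 PB maximum implies a library on the order of seven thousand cartridge slots, and a mid-size 500 TB archive fits on well under a hundred tapes.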

Big Data Challenges Met
StoreEver offers organizations the opportunity to meet some of the high demands of Big Data. This is especially true in two critical areas: security and compliance.

In fact, more and more companies grappling with the challenges of Big Data are moving back to tape storage. The fragmented way many organizations currently protect and retain data creates unnecessary risk. HP StoreEver allows those storage needs to be integrated and converged, consolidating them on high-capacity, highly resilient technology.

Data Integrity
StoreEver uses hardware-based encryption as well as WORM (Write Once Read Many) protection. It also offers support for LTFS (Linear Tape File System) via the management console in HP StoreOpen.

HP's new LTO-6 StoreEver Storage products are available today. If your organization is struggling to archive its ever-growing data, this might be the solution you need.

More Stories By Unitiv Blog

Unitiv, Inc., is a professional provider of enterprise IT solutions. Unitiv delivers its services from its headquarters in Alpharetta, Georgia, USA, and its regional office in Iselin, New Jersey, USA. Unitiv provides a strategic approach to its service delivery, focusing on three core components: People, Products, and Processes. The People to advise and support customers. The Products to design and build solutions. The Processes to govern and manage post-implementation operations.
