Quantum's New Scalar i6000 HD Stores 5 PB in a Single Rack and Scales to Over 75 PB

Delivers Industry's Best Tape Slot Density for Large-Scale Archiving and Long-Term Retention of Big Data

SAN JOSE, CA -- (Marketwire) -- 01/30/13 -- Quantum Corp. (NYSE: QTM), a proven global expert in data protection and big data management, today announced the new Scalar i6000 HD enterprise tape library, providing the industry's highest slot density. Designed to address customers' big data and archive needs with slot densities that are twice those offered by competitors, this new library makes nearly 5 PB of data available in a single 19" rack and scales to more than 75 PB of capacity. Quantum has also added new high-availability and management features to the Scalar i6000, as well as greater access capabilities for storing big data archives on tape.
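For context, the headline figures roughly work out if one assumes LTO-6-era media (about 2.5 TB native, 6.25 TB at 2.5:1 compression). The short sketch below uses those assumed cartridge capacities and hypothetical slot counts to show the arithmetic behind "5 PB in a single rack" and "more than 75 PB" at full scale; it is an illustration, not Quantum's published configuration.

```python
# Back-of-the-envelope capacity math for a high-density tape library.
# Cartridge capacities are assumptions (LTO-6-era figures), not Quantum specifications.
NATIVE_TB = 2.5          # assumed native capacity per cartridge, in TB
COMPRESSED_TB = 6.25     # assumed 2.5:1 compressed capacity per cartridge, in TB

def rack_capacity_pb(slots: int, tb_per_cartridge: float) -> float:
    """Total capacity of a fully populated set of slots, in petabytes."""
    return slots * tb_per_cartridge / 1000.0

# Hypothetical slot counts chosen to reproduce the press-release figures.
print(rack_capacity_pb(800, COMPRESSED_TB))     # ~5 PB in a single rack
print(rack_capacity_pb(12000, COMPRESSED_TB))   # ~75 PB at full library scale
```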

In addition to best-in-class slot density and scalability, the Scalar i6000 HD delivers high performance and availability, with new active-active dual robotics for fast data access times. Scheduled to be available in March and backward compatible with Quantum's Scalar i2000 systems, the Scalar i6000 HD offers a comprehensive suite of capabilities, including proactive diagnostics and redundant systems such as power, network ports for encryption key management, and connectivity for multi-fabric SAN architectures. Leveraging its embedded iLayer™ software, the Scalar i6000 HD also includes Quantum's unique Active Vault technology for storing vaulted tapes securely inside the library, along with automated policy-based integrity checking, delivering the richest feature set for managing massive data archives.
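The policy-based integrity checking described above can be pictured as a scheduled scan that re-verifies each vaulted cartridge against a stored checksum. The sketch below is only a simplified illustration of that idea, not Quantum's iLayer implementation; the catalog structure, verification interval, and checksum routine are all hypothetical.

```python
import hashlib
from datetime import datetime, timedelta

# Hypothetical policy: re-verify any vaulted cartridge not checked in the last 90 days.
VERIFY_INTERVAL = timedelta(days=90)

def sha256_of(path: str) -> str:
    """Checksum an exported cartridge image in 1 MB chunks (illustrative only)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def run_integrity_policy(catalog: dict) -> list:
    """Return barcodes whose current checksum no longer matches the catalog entry."""
    failures = []
    now = datetime.utcnow()
    for barcode, entry in catalog.items():
        if now - entry["last_checked"] < VERIFY_INTERVAL:
            continue  # policy: this cartridge was verified recently enough
        if sha256_of(entry["image_path"]) != entry["checksum"]:
            failures.append(barcode)
        entry["last_checked"] = now
    return failures
```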

Tape's New Role in Managing Big Data
As the capture and monetization of data grows, so does the role of tape in big data environments. Built to be reliable and inexpensive, tape technology now serves as an effective tier for storing large-scale archives behind disk storage and next-generation object storage systems. To manage data intelligently between storage tiers, Quantum's StorNext AEL6000 Archive uses integrated, policy-based software to migrate files automatically. The StorNext AEL6000 Archive will incorporate the new high-density capabilities next quarter, enabling StorNext® customers to benefit from the greater tape slot density, smaller footprint, and increased capacity as well.
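Policy-based migration between a disk tier and a tape archive tier generally amounts to scanning the primary tier and moving files that match an age or size policy. The following sketch illustrates that general pattern only; the mount points, age threshold, and migration routine are assumptions for illustration, not the StorNext policy engine.

```python
import os
import shutil
import time

# Hypothetical tiering policy: move files untouched for 30+ days to the archive tier.
PRIMARY_TIER = "/mnt/stornext/primary"   # assumed disk-tier mount point
ARCHIVE_TIER = "/mnt/stornext/archive"   # assumed tape-backed tier mount point
AGE_THRESHOLD_SECONDS = 30 * 24 * 3600

def migrate_cold_files() -> int:
    """Move files older than the threshold from the primary tier to the archive tier."""
    moved = 0
    now = time.time()
    for root, _dirs, files in os.walk(PRIMARY_TIER):
        for name in files:
            src = os.path.join(root, name)
            if now - os.path.getatime(src) > AGE_THRESHOLD_SECONDS:
                rel = os.path.relpath(src, PRIMARY_TIER)
                dst = os.path.join(ARCHIVE_TIER, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)  # a real system would leave a stub for transparent recall
                moved += 1
    return moved
```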

In addition, Quantum's Scalar LTFS provides Scalar i6000 users with greater flexibility for accessing big data archives on tape. Available now, Scalar LTFS provides an easy-to-use NAS file system presentation for tape users in industries working with big data, such as media and entertainment and life sciences. For example, Biola University has deployed Scalar LTFS in its workflow as a flexible and cost-effective approach to storing and accessing large video archives.
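Because LTFS presents tape contents as an ordinary file system, applications can read archived assets through standard file APIs once the library's LTFS volume is mounted. The snippet below assumes a hypothetical mount point and simply lists and reads files from it; it illustrates the access model in general, not Quantum's Scalar LTFS interface specifically.

```python
from pathlib import Path

# Hypothetical mount point where an LTFS-presented tape archive appears as NAS storage.
LTFS_MOUNT = Path("/mnt/ltfs_archive")

def list_video_masters(extension: str = ".mov") -> list:
    """List archived video files on the LTFS volume using ordinary file-system calls."""
    return sorted(p for p in LTFS_MOUNT.rglob(f"*{extension}"))

def read_clip_header(path: Path, nbytes: int = 64) -> bytes:
    """Read the first bytes of an archived clip; the tape mount behaves like any other path."""
    with path.open("rb") as f:
        return f.read(nbytes)

if __name__ == "__main__":
    for clip in list_video_masters():
        print(clip, len(read_clip_header(clip)))
```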

Supporting Quotes
Alex Rodriguez, VP of System Engineering and Product Development, Expedient
"The Scalar platform enables us to easily license additional slots and not have to worry about significant fixed asset costs. This flexibility plus the reliable performance of the Scalar i6000 makes it a truly attractive tape library for us. Additionally, Quantum's introduction of high density capabilities addresses the challenges of managing data growth in a service provider environment where efficient use of floor space is critical."

Robert Amatruda, research director, Data Protection and Recovery, IDC
"Today's announcement demonstrates Quantum's continued focus on extending its tape leadership in the market. The latest density, scalability and redundancy features added to the Scalar i6000 tape libraries make it an ideal solution for managing big data archives and cost effectively maintaining data stored in the cloud."

Robert Clark, senior vice president, Data Protection, Quantum
"85 percent of the Fortune 100 have turned to Quantum to meet their storage needs, and we're continuing to deliver innovative new solutions to address the challenges that large enterprise customers face. This includes adding new offerings such as the Scalar i6000 HD to our tape automation portfolio, which not only is unmatched in its breadth and depth but also reflects the unique benefits tape technology provides in today's evolving data protection environment."

About Quantum
Quantum is a proven global expert in data protection and big data management, providing specialized storage solutions for physical, virtual and cloud environments. From small businesses to major enterprises, more than 100,000 customers have trusted Quantum to help maximize the value of their data by protecting and preserving it over its entire lifecycle. With Quantum, customers can Be Certain™ they're able to adapt in a changing world -- keeping more data longer, bridging from today to tomorrow, and reducing costs. See how at www.quantum.com/BeCertain.

Quantum, the Quantum logo, Be Certain, iLayer, Scalar and StorNext are either registered trademarks or trademarks of Quantum Corporation and its affiliates in the United States and/or other countries. All other trademarks are the property of their respective owners.

"Safe Harbor" Statement: This press release contains "forward-looking" statements. All statements other than statements of historical fact are statements that could be deemed forward-looking statements. Specifically, but without limitation, statements relating to 1) customer benefits and value to customers from using Quantum's Scalar i6000 libraries (including the Scalar i6000 HD libraries), Scalar LTFS and StorNext AEL6000, 2) the future availability of the Scalar i6000 HD libraries and 3) customer demand for and Quantum's future revenue from such libraries and appliances are forward-looking statements within the meaning of the Safe Harbor. All forward-looking statements in this press release are based on information available to Quantum on the date hereof. These statements involve known and unknown risks, uncertainties and other factors that may cause Quantum's actual results to differ materially from those implied by the forward-looking statements. These risks include operational difficulties, unforeseen technical limitations, unexpected changes in market conditions and unanticipated changes in customers' needs or requirements, as well as the risks set forth in Quantum's periodic filings with the Securities and Exchange Commission, including, but not limited to, those risks and uncertainties listed in the section entitled "Risk Factors," in Quantum's Quarterly Report on Form 10-Q filed with the Securities and Exchange Commission on November 9, 2012 and Quantum's Annual Report on Form 10-K filed with the Securities and Exchange Commission on June 14, 2012, especially those risks listed in this section under the heading "Our operating results depend on a limited number of products and on new product introductions, which may not be successful, in which case our business, financial condition and operating results may be materially and adversely affected." Quantum expressly disclaims any obligation to update or alter its forward-looking statements, whether as a result of new information, future events or otherwise.
