
HGST Joins Open Compute Project To Help Define Best Practices For Datacentre Storage

Leading Storage Provider Works with Facebook and Industry Partners to Help Define and Implement Efficient Tiered Storage Strategies that Reduce Total Cost of Ownership for Corporate and Cloud Datacentres

Open Compute Summit (Booth C7), 17 January 2013 - HGST (formerly Hitachi Global Storage Technologies and now a Western Digital company, NASDAQ: WDC) today announced that it has joined the Open Compute Project, an initiative launched by Facebook in 2011 to increase technology efficiency and reduce the environmental impact of datacentres.

With the explosion of data resulting from mobile devices, Internet services, social media and business applications, corporate, cloud and big data customers are constantly looking for ways to reduce their storage infrastructure costs and improve their bottom line. The Open Compute Project applies open-source software principles to the hardware industry to drive the development of the most efficient computing infrastructures at the lowest possible cost. HGST will contribute its expertise toward defining storage solutions that deliver the required performance and density while achieving low total cost of ownership (TCO), reflected in metrics such as cost-per-TB, watts-per-TB, TB-per-system-weight and TB-per-square-foot.
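For readers curious how such metrics are typically derived, the short Python sketch below works through the arithmetic using purely hypothetical figures for a single storage server (none of the numbers come from HGST or the Open Compute Project):

    # Hypothetical single-server figures, for illustration only.
    server = {
        "usable_tb": 240.0,     # assumed usable capacity, e.g. 60 x 4TB drives
        "cost_usd": 18000.0,    # assumed acquisition cost
        "power_watts": 600.0,   # assumed average power draw
        "weight_kg": 120.0,     # assumed system weight
        "footprint_sqft": 7.0,  # assumed floor space
    }

    cost_per_tb = server["cost_usd"] / server["usable_tb"]
    watts_per_tb = server["power_watts"] / server["usable_tb"]
    tb_per_kg = server["usable_tb"] / server["weight_kg"]
    tb_per_sqft = server["usable_tb"] / server["footprint_sqft"]

    print(f"Cost per TB:  ${cost_per_tb:,.2f}")
    print(f"Watts per TB: {watts_per_tb:.2f} W")
    print(f"TB per kg:    {tb_per_kg:.2f}")
    print(f"TB per sq ft: {tb_per_sqft:.1f}")

Lower cost-per-TB and watts-per-TB, and higher TB-per-kg and TB-per-square-foot, all translate directly into lower TCO for a datacentre operator.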

"Demand for storage is booming as IT managers strive to handle the avalanche of new data being generated by cloud datacentres, Big Data analytics, social networking, HD video and millions of mobile devices," said Brendan Collins, vice president of product marketing at HGST. "As a strategic drive supplier and consultant to Facebook and in collaboration with the Open Compute Project, we're defining best practices in the storage industry to afford end-users with greater capital savings, operational efficiencies and energy conservation in the datacentre."

The fourth Open Compute Summit takes place January 16-17, 2013, at the Santa Clara Convention Center, 5001 Great America Parkway, Santa Clara, Calif. As a Summit sponsor, HGST will be showcasing its Ultrastar™ 7K4000, the world's first 4TB enterprise-class hard drive, which provides space-efficient, high-performance, low-power storage for traditional enterprises as well as for the fast-growing big data and cloud/Internet markets, where storage density, watts-per-GB and cost-per-GB are critical parameters. The 4TB Ultrastar 7K4000 family raises the bar with a five-year limited warranty and a 2.0-million-hour MTBF specification, resulting in a 40 percent lower annualised failure rate (AFR) than enterprise drives rated at 1.2 million hours MTBF. As a leader in enterprise-class SAS SSDs, HGST will also be showcasing its Ultrastar enterprise-class SSDs, which meet the performance, capacity, endurance and reliability demands of today's Tier 0, mission-critical datacentre applications.
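The 40 percent figure is consistent with the standard exponential reliability model that relates MTBF to AFR; the Python sketch below reproduces the arithmetic (the model is a common industry approximation, not necessarily HGST's exact methodology):

    import math

    HOURS_PER_YEAR = 8760  # 24 hours x 365 days

    def annualised_failure_rate(mtbf_hours: float) -> float:
        """Approximate AFR from MTBF, assuming an exponential failure model."""
        return 1.0 - math.exp(-HOURS_PER_YEAR / mtbf_hours)

    afr_2m = annualised_failure_rate(2_000_000)    # Ultrastar 7K4000 rating
    afr_1_2m = annualised_failure_rate(1_200_000)  # comparison rating cited above

    print(f"AFR at 2.0M hours MTBF: {afr_2m:.3%}")             # about 0.44%
    print(f"AFR at 1.2M hours MTBF: {afr_1_2m:.3%}")           # about 0.73%
    print(f"Relative reduction: {1 - afr_2m / afr_1_2m:.0%}")  # about 40%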

About HGST
HGST (formerly known as Hitachi Global Storage Technologies or Hitachi GST), a Western Digital company (NASDAQ: WDC), develops advanced hard disk drives, enterprise-class solid state drives, innovative external storage solutions and services used to store, preserve and manage the world's most valued data. Founded by the pioneers of hard drives, HGST provides high-value storage for a broad range of market segments, including Enterprise, Desktop, Mobile Computing, Consumer Electronics and Personal Storage. HGST was established in 2003 and maintains its U.S. headquarters in San Jose, California. For more information, please visit the company's website at http://www.hgst.com.

One GB is equal to one billion bytes, and one TB equals 1,000 GB (one trillion bytes). Actual capacity will vary depending on operating environment and formatting.
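As an illustration of why reported capacity differs between the decimal rating above and the binary units many operating systems display, consider this small Python sketch (the 4TB figure matches the Ultrastar drive mentioned earlier; the conversion itself is generic):

    # A "4 TB" drive holds 4 trillion bytes (decimal); many OS tools report
    # capacity in binary tebibytes (TiB), making the same drive look smaller.
    decimal_bytes = 4 * 10**12
    tib = decimal_bytes / 2**40
    print(f"{decimal_bytes:,} bytes = 4.00 TB (decimal) = {tib:.2f} TiB")  # ~3.64 TiB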

###

Contact:
Caroline Sumners
HGST
Office: +44 2392459719
[email protected]

Keira Anderson
Porter Novelli
Office: +44 020 7853 2289
[email protected]


