
@BigDataExpo: Article

Supercomputing Tackles Big Data Challenges

Bringing innovations to the Big Data era

We all feel it: data use and growth are explosive. Individuals and businesses are consuming - and generating - more data every day.

The challenges are common to nearly all businesses in every industry: ingest all the data, analyze it, make good sense of it, and ultimately drive smart decisions that positively affect the business - all as fast as possible!

Fujitsu brings supercomputing-class processors to the Fujitsu M10 enterprise server family to help organizations meet these everyday challenges. What was once reserved for data-intensive scientific computing is now especially relevant to mission-critical business computing, and it's all driven by big data.

Fujitsu M10 Server Family

Fujitsu M10 Server Family:  Fujitsu M10-1, Fujitsu M10-4, Fujitsu M10-4S

From high-throughput connectivity to data sources, to high-speed data movement to and from high-performance processing units, all the way to serving data and analytics results to mission-critical business intelligence (BI) applications, Fujitsu's innovative technologies, honed over decades in High-Performance Computing (HPC), enable enterprises to improve decisions, boost business productivity, and enhance the bottom line.

Dynamic Scaling

The Fujitsu M10 server line scales from one to 64 SPARC64 X+ processors, each with 16 cores running at speeds up to 3.7 GHz. Extending the existing SPARC V9 instruction set architecture, the SPARC64 X+ adopts HPC-ACE (High Performance Computing Arithmetic Computational Extension), originally developed for Fujitsu supercomputers.

"Software on Chip" technology on all M10 models delivers faster computational results by taking functions previously handled by software and putting them directly into the processor itself. Through this approach, the decimal floating-point arithmetic required by financial applications and database computing is performed directly in processor hardware, eliminating the overhead of binary-to-decimal conversions and speeding up overall results.

Further, Single Instruction Multiple Data (SIMD) technology, once reserved for HPC, processes multiple data elements in a single instruction cycle, dramatically accelerating results. This vector processing capability allows workloads such as columnar databases and business analytics to churn through massive data sets faster.
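The payoff of native decimal arithmetic is easiest to see from the software side: binary floating point cannot represent most decimal fractions exactly, so financial code must fall back on a software decimal library, which is exactly the class of overhead that on-chip decimal support removes. A minimal Python sketch of the contrast (illustrative only, not a SPARC64 X+ benchmark):

```python
from decimal import Decimal

# Binary floating point cannot represent 0.10 exactly, so a thousand
# repeated additions accumulate rounding error.
binary_total = sum(0.10 for _ in range(1000))
print(binary_total == 100.0)               # False: drifted off the exact sum

# Decimal arithmetic is exact, but on most platforms it runs as a
# software library; the SPARC64 X+ executes this class of operation
# in processor hardware instead.
decimal_total = sum(Decimal("0.10") for _ in range(1000))
print(decimal_total == Decimal("100.00"))  # True: exact to the cent
```

The correctness requirement forces financial and database workloads into decimal arithmetic one way or another; doing it in hardware rather than in a library is where the speedup comes from.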

SPARC64 X+ processor die: 16 cores

The Fujitsu M10's interconnect technology matches the power of its processors. Fujitsu's innovative interconnect provides high bandwidth and low latency for internal data access and transfers among all SPARC64 X+ processors in the Fujitsu M10 server. All processors access all memory and all I/O slots equally, so a large workload is as easy to deploy and support as a smaller one.

The Fujitsu M10 also offers the latest PCI Express serial I/O technology for fast connectivity to data warehouses, operational systems, and other data sources. Data can be fed to high-performance BI systems through as many as 128 internal I/O slots on the Fujitsu M10 servers, and up to 928 slots with the optional PCI Expansion Units.

SPARC64 X+ Processor

With the extreme performance of SPARC64 X+ processors and the utilization efficiency of innovative supercomputing technologies, the Fujitsu M10 servers let customers quickly leverage large data sets and access the critical information they need for real-time decisions in ever-changing environments. They help companies get results from Big Data, faster.

More Stories By Ferhat Hatay

Dr. Ferhat Hatay is Senior Manager of Strategy and Innovation at the Fujitsu Oracle Center of Excellence driving incubation programs in the areas of Cloud and Big Data.

His experience includes serving in key roles at Sun Microsystems, Oracle, and HAL (A Fujitsu Company, not the HAL 9000!) driving innovative open large-scale infrastructure solutions for high performance and enterprise computing.

Ferhat started his career at NASA Ames Research Center, building infrastructures for large-scale computer simulations and Big Data analysis. He forever remains a rocket scientist. Follow him on Twitter: @FerhatSF.
