
Optimizing The Data Warehouse: A Big Data Blueprint from Pentaho

By Bob Gourley

We previously wrote about the Pentaho Big Data Blueprints series, which includes design packages useful to enterprise architects and other technologists seeking operational concepts and repeatable designs. With this post we provide more information from the blueprint on Optimizing the Data Warehouse:

Optimizing your data warehouse can reduce strain on existing systems and lower overall project cost: less frequently used data, and the transformation workloads that go with it, are offloaded to Hadoop without hand coding, legacy scripts, or the limitations of traditional ETL products. Done right, this saves money and makes the overall system more functional at the same time.
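To make the offload concrete, here is a minimal, hypothetical sketch of the kind of data movement involved: cold rows older than a cutoff date are read from a warehouse table over JDBC and written to HDFS as CSV. The warehouse URL, table, columns, cutoff date, and HDFS path are all illustrative, and this is exactly the hand-written plumbing that a visual tool like Pentaho Data Integration is meant to replace.

```java
// Hypothetical offload sketch: copy "cold" rows (older than a cutoff date)
// from a warehouse table into HDFS as CSV over plain JDBC. Every name here
// (warehouse URL, table, columns, cutoff, HDFS path) is illustrative.
import java.io.BufferedWriter;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ColdDataOffload {
    public static void main(String[] args) throws Exception {
        String jdbcUrl = "jdbc:postgresql://dw-host:5432/warehouse"; // assumed warehouse connection
        String sql = "SELECT order_id, customer_id, order_date, amount "
                   + "FROM sales_history WHERE order_date < ?";      // hypothetical "cold" data query

        Configuration conf = new Configuration();                    // reads core-site.xml / hdfs-site.xml
        FileSystem fs = FileSystem.get(conf);
        Path target = new Path("/warehouse_offload/sales_history/part-00000.csv");

        try (Connection cn = DriverManager.getConnection(jdbcUrl, "etl_user", "secret");
             PreparedStatement ps = cn.prepareStatement(sql)) {
            ps.setDate(1, java.sql.Date.valueOf("2012-01-01"));      // cutoff for "less frequently used" rows
            try (ResultSet rs = ps.executeQuery();
                 BufferedWriter out = new BufferedWriter(new OutputStreamWriter(
                         fs.create(target, true), StandardCharsets.UTF_8))) {
                while (rs.next()) {
                    // Write one CSV line per offloaded warehouse row
                    out.write(rs.getLong("order_id") + "," + rs.getLong("customer_id") + ","
                            + rs.getDate("order_date") + "," + rs.getBigDecimal("amount"));
                    out.newLine();
                }
            }
        }
    }
}
```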

Here is more from Pentaho:

What Is It?

  • Hadoop Made Simple, Accessible and up to 15x Faster
  • Pentaho simplifies offloading to Hadoop and speeds development and deployment time by as much as 15x versus hand-coding approaches. Complete visual integration tools eliminate the need for hand coding SQL or Java-based MapReduce jobs; a sketch of that hand-coded boilerplate appears just below.
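For comparison, here is a hypothetical sketch of the hand-coded MapReduce boilerplate those visual tools eliminate: a simple job that sums an amount column per customer across the offloaded CSV files. Class names, field positions, and input/output paths are illustrative only.

```java
// Hypothetical hand-coded MapReduce job: sum the "amount" column per customer
// from offloaded CSV files. This is the boilerplate that visual integration
// tools aim to replace; all names and paths are invented for illustration.
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SalesByCustomer {

    public static class SalesMapper extends Mapper<LongWritable, Text, Text, DoubleWritable> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Assumed CSV layout: order_id,customer_id,order_date,amount
            String[] f = value.toString().split(",");
            if (f.length == 4) {
                ctx.write(new Text(f[1]), new DoubleWritable(Double.parseDouble(f[3])));
            }
        }
    }

    public static class SumReducer extends Reducer<Text, DoubleWritable, Text, DoubleWritable> {
        @Override
        protected void reduce(Text key, Iterable<DoubleWritable> values, Context ctx)
                throws IOException, InterruptedException {
            double total = 0;
            for (DoubleWritable v : values) {
                total += v.get();      // accumulate per-customer spend
            }
            ctx.write(key, new DoubleWritable(total));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "sales-by-customer");
        job.setJarByClass(SalesByCustomer.class);
        job.setMapperClass(SalesMapper.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(DoubleWritable.class);
        FileInputFormat.addInputPath(job, new Path("/warehouse_offload/sales_history"));
        FileOutputFormat.setOutputPath(job, new Path("/warehouse_offload/sales_by_customer"));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

In Pentaho Data Integration the equivalent logic is assembled visually rather than written by hand; a job like the one above is representative of what the 15x development-time comparison is measuring.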

Save data costs and boost analytics performance

  • An intuitive, graphical, no-coding approach to big data integration.
  • Access to every data source – from operational to relational to NoSQL technologies.
  • Support for every major Hadoop distribution with a future-proof adaptive big data layer.
  • Achieve higher processing performance with Pentaho MapReduce when running in-cluster.
  • 100% Java, fast and efficient.

Because these capabilities are part of the Pentaho Business Analytics Platform, there is no quicker or more cost-effective way to get immediate value from data through integrated reporting, dashboards, data discovery, and predictive analytics.

How It Works

Here is an example of how this may look within an IT landscape:

  • The company leverages data from disparate sources, including CRM and ERP systems.
  • A Hadoop cluster has been implemented to offload less frequently used data from the existing data warehouse.
  • The company saves on storage costs and speeds up query performance and access to its analytic data mart (see the query sketch below the diagram).

(Figure: data-warehouse-example, an example IT landscape for data warehouse optimization)
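On the consumption side, here is a hedged sketch of how the offloaded data might be reached from the analytics layer, in this case through Hive's JDBC driver. The host, credentials, and table name are assumptions for illustration, not part of the Pentaho blueprint itself.

```java
// Hypothetical sketch: query offloaded data in Hadoop through Hive's JDBC
// driver, much as an analyst would query the warehouse. Host, credentials,
// and table name are illustrative.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OffloadedDataQuery {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");   // Hive JDBC driver
        try (Connection cn = DriverManager.getConnection(
                     "jdbc:hive2://hadoop-edge:10000/default", "analyst", "");
             Statement st = cn.createStatement();
             ResultSet rs = st.executeQuery(
                     "SELECT customer_id, SUM(amount) AS total "
                   + "FROM sales_history_offload GROUP BY customer_id")) {
            while (rs.next()) {
                System.out.println(rs.getString("customer_id") + "\t" + rs.getDouble("total"));
            }
        }
    }
}
```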

The Results

  • Staff savings and productivity: Pentaho’s Visual MapReduce GUI and big data integration mean existing data warehouse developers can move data between the data warehouse and Hadoop without coding.
  • Time to value: MapReduce development time is reduced by up to 15x compared with hand-coded approaches.
  • Faster job execution: Pentaho MapReduce runs faster in-cluster than code-generating scripting tools.

 

Customer ROI

A leading global network storage company set out to scale machine data management in order to enhance product performance and customer success. Its goals were to:

  • Affordably scale machine data from storage devices for customer applications
  • Predict device failure
  • Enhance product performance

Architecture Example:

 

(Figure: DW-NetApp, the customer's architecture example)

Pentaho Benefits:

  • Easy-to-use ETL and analysis for Hadoop, HBase, and Oracle data sources (a minimal HBase write sketch follows this list)
  • 15x improvement in data costs
  • Stronger performance against customer service-level agreements (SLAs)
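For readers curious what the HBase side of such a machine-data pipeline can look like, here is a minimal, hypothetical sketch of writing one device telemetry row into an HBase table. The table name, row-key scheme, and column family are invented for illustration and are not drawn from the customer's actual design.

```java
// Hypothetical sketch: store one device telemetry reading in HBase.
// Table name, row-key format, column family, and metric names are invented.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class DeviceTelemetryWriter {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Table table = conn.getTable(TableName.valueOf("device_telemetry"))) {
            // Row key combines device id and timestamp so readings sort per device
            Put put = new Put(Bytes.toBytes("dev-0042#2014-06-01T00:00:00Z"));
            put.addColumn(Bytes.toBytes("m"), Bytes.toBytes("temp_c"), Bytes.toBytes("41.5"));
            put.addColumn(Bytes.toBytes("m"), Bytes.toBytes("io_errors"), Bytes.toBytes("0"));
            table.put(put);
        }
    }
}
```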

 

For more on these and other blueprints, see Pentaho’s Blueprints to Big Data Success.

 

Read the original blog entry...

More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder of Crucial Point and publisher of CTOvision.com.
