
An Update on Platfora: Great capability for enhancing clarity, agility, user self service and mission support from Big Data

By Bob Gourley

We have been tracking Platfora for quite a while. They have long been on our Disruptive IT list; we knew long ago these guys would change the world. Now, with their announcement that their flagship capability is available for widespread use (see: Platfora, Now Available, Cuts Through Big Data Hype and Delivers Business Value on Hadoop), we can all dive a bit deeper into their capabilities.

My first recommendation is to think through use cases before getting into their approach. Taking that customer-focused view first really helps you get your mind around why this matters. One easy way to do that is to watch the video here:

Did that get your attention? Clearly Platfora is onto something. Here is how they describe what they do on their website:

HARNESS THE POWER OF APACHE HADOOP

Platfora drives Hadoop like a work engine, leveraging its near-linear scalability to perform the heavy lifting to make access to big data fast. Adaptive Job Synthesis™ generates MapReduce jobs without developer or IT intervention, efficiently adapts work based on previous output, and monitors job progress to completion.
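To make that concrete, here is a rough sketch (in Python, Hadoop streaming style) of the kind of hand-written MapReduce job that Adaptive Job Synthesis is meant to generate on your behalf: counting events per type from tab-delimited logs. The log layout and the event-type column are made-up examples for illustration, not anything from Platfora.

```python
#!/usr/bin/env python
# mapper.py -- emits one (event_type, 1) pair per tab-delimited log line.
# This is the hand-coded boilerplate Adaptive Job Synthesis is meant to replace.
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split("\t")
    if len(fields) >= 3:
        print("%s\t1" % fields[2])   # fields[2] = hypothetical event-type column
```

```python
#!/usr/bin/env python
# reducer.py -- sums the counts emitted by mapper.py for each event type.
import sys

current_key, count = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current_key:
        if current_key is not None:
            print("%s\t%d" % (current_key, count))
        current_key, count = key, 0
    count += int(value)
if current_key is not None:
    print("%s\t%d" % (current_key, count))
```

Submitting a pair like this by hand typically looks something like `hadoop jar hadoop-streaming.jar -input logs/ -output counts/ -mapper mapper.py -reducer reducer.py`. Platfora's pitch is that analysts never write or submit any of this.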

SINGLE VIEW INTO ALL THE DATA

Datasets in Hadoop are cataloged and searchable in the Platfora Data Catalog. IT teams can maintain standard and commonly used datasets and end users can import their own to join into the data model.
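As a thought experiment, the workflow they describe, where IT registers standard datasets and end users add their own and find them by search, boils down to something like this hypothetical sketch (not Platfora's API):

```python
# Hypothetical illustration of the cataloging workflow described above.
catalog = {}

def register(name, location, owner, fields):
    """Add a dataset entry so it can be found by search."""
    catalog[name] = {"location": location, "owner": owner, "fields": fields}

def search(term):
    """Return dataset names whose name or fields mention the term."""
    return [n for n, d in catalog.items()
            if term in n or any(term in f for f in d["fields"])]

# IT-maintained standard dataset
register("web_logs", "hdfs:///data/web_logs", "IT", ["timestamp", "user_id", "url"])
# End-user-imported dataset, joinable on user_id
register("crm_accounts", "hdfs:///user/jane/crm.csv", "jane", ["user_id", "segment"])

print(search("user_id"))   # -> ['web_logs', 'crm_accounts']
```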

READS RAW DATA IN HADOOP

Platfora natively processes standard file types and structures typically used in big data. Log files, delimited files, even structured files in formats like JSON are easily interpreted, and their elements are extracted in order. Platfora will even interface with a Hive metastore to import metadata. Platfora's architecture supports the rapid extension of file types by Platfora and by third parties.
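To illustrate the variety of raw inputs they are talking about, here is a small Python sketch that reads a JSON record, a delimited record, and a semi-structured log line into the same shape with no preprocessing step; the field names are invented for the example:

```python
import csv
import json
import re
from io import StringIO

# Sketch of the raw formats mentioned above being read directly: a JSON record,
# a delimited record, and a raw log line each end up as the same dict shape.
json_line = '{"user_id": 42, "url": "/home", "ms": 118}'
csv_line = "42,/pricing,87"
log_line = "42 GET /checkout 203ms"

records = [json.loads(json_line)]                               # structured JSON
row = next(csv.reader(StringIO(csv_line)))                      # delimited file
records.append({"user_id": int(row[0]), "url": row[1], "ms": int(row[2])})
m = re.match(r"(\d+) \w+ (\S+) (\d+)ms", log_line)              # semi-structured log
records.append({"user_id": int(m.group(1)), "url": m.group(2), "ms": int(m.group(3))})

print(records)
```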

NO ETL

Each dataset can be enriched through a complete expression language. Normalizing data, adding columns, building aggregates and computed fields are all completed through a straightforward interface. Data integration is an agile process and can be changed as requirements change or new data arrives.
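Here is a minimal sketch, using pandas rather than Platfora's expression language, of the kind of enrichment being described: normalize a field, add a computed column, and build an aggregate on the fly, with no separate ETL job. The data and column names are invented for the example.

```python
import pandas as pd

# Enrichment without a separate ETL pipeline: normalize, derive, aggregate.
events = pd.DataFrame({
    "user_id": [42, 42, 7],
    "url": ["/Home", "/pricing", "/HOME"],
    "ms": [118, 87, 203],
})

events["url"] = events["url"].str.lower()          # normalize a column
events["slow"] = events["ms"] > 100                # computed field
summary = events.groupby("user_id").agg(           # built aggregate
    visits=("url", "count"),
    avg_ms=("ms", "mean"),
)
print(summary)
```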

SOLID SECURITY, SIMPLE SHARING

Platfora allows you to control your sensitive data and ensures it does not fall into the wrong hands. Independent data access and object access controls put data security first and make sure users cannot accidentally share data without permission. Platfora integrates with your LDAP or Active Directory server for user and group authentication.
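The LDAP/Active Directory integration they mention is the familiar bind-as-the-user pattern. A hedged sketch with the Python ldap3 library, with placeholder host and DN values rather than any Platfora configuration, looks like this:

```python
from ldap3 import Server, Connection, ALL

# Authenticate a user by binding to the directory with their own credentials.
# The host and DN layout below are placeholders for illustration only.
def authenticate(username, password):
    server = Server("ldaps://ldap.example.com", get_info=ALL)
    user_dn = "uid=%s,ou=people,dc=example,dc=com" % username
    conn = Connection(server, user=user_dn, password=password)
    ok = conn.bind()        # a successful bind means the credentials are valid
    conn.unbind()
    return ok

if authenticate("jane", "not-a-real-password"):
    print("access granted")
```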

In the coming weeks we will provide more information on their technology, use cases, and impressive return on investment. For now, learn more here: http://www.platfora.com/



More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder of Crucial Point and the publisher of CTOvision.com.
