Getting Automation Right with Big Data | @BigDataExpo #BigData

Things To Remember While Automating With Big Data

Why exactly do you need big data for your enterprise projects? Many industry observers have noted that although enterprises like to claim their big data projects are aimed at "deriving insights" that replace human intuition with data-driven alternatives, the real objective is usually automation. They point out that the role of data scientists at many organizations has little to do with replacing human intuition with big data. Instead, it is about augmenting human expertise by making work easier, faster and more efficient.

But automating big data processing is easier said than done, and the biggest problem is that big data is, well, big. There is a lot of chaos and inconsistency in the available data, so a single MapReduce script that ingests all your data and spits out processed results is wishful thinking. In reality, big data automation can mean writing dozens of scripts to process different input sources and aligning them in order to consolidate all this data and produce the required output.
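
To make that concrete, here is a minimal Python sketch of the pattern, with hypothetical source names and field mappings: each input source gets its own small parser that normalizes records to a shared shape, and a separate consolidation step aligns them on a common key.

import csv
import json
from typing import Dict, Iterable, List

def parse_clickstream(path: str) -> Iterable[dict]:
    # Semi-structured JSON lines from a hypothetical web front end.
    with open(path) as f:
        for line in f:
            event = json.loads(line)
            yield {"user_id": event["uid"], "value": float(event.get("value", 0))}

def parse_crm_export(path: str) -> Iterable[dict]:
    # Structured CSV dump from a hypothetical CRM system.
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {"user_id": row["customer_id"], "value": float(row["spend"])}

def consolidate(sources: List[Iterable[dict]]) -> Dict[str, float]:
    # Align per-source records on the shared key and aggregate.
    totals: Dict[str, float] = {}
    for source in sources:
        for record in source:
            totals[record["user_id"]] = totals.get(record["user_id"], 0.0) + record["value"]
    return totals

# totals = consolidate([parse_clickstream("events.jsonl"),
#                       parse_crm_export("crm.csv")])

Every new input source means another parser like the two above, which is why "dozens of scripts" is not an exaggeration in practice.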

The first thing to get right with respect to automating big data is the architecture. One of the most popular approaches is the data lake. To put it simply, a data lake is a large storage repository that holds all the raw data until it is needed for processing. Unlike traditional hierarchical data warehouses, a data lake stores raw data in a flat architecture. A key advantage is that a data lake can store all sorts of data - structured, semi-structured and unstructured - and is thus well suited for big data automation.
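
As a rough sketch of what "flat" means in practice, the hypothetical Python below writes raw payloads untouched under a single source/date prefix with a small metadata sidecar. No schema is imposed at write time, so structured, semi-structured and unstructured data all land the same way; the paths and metadata fields are illustrative assumptions, not a standard.

import json
import os
from datetime import date, datetime, timezone

LAKE_ROOT = "/data/lake"  # hypothetical mount point

def ingest_raw(source: str, payload: bytes, extension: str) -> str:
    # Store the raw bytes untouched; schema is applied later, on read.
    ts = datetime.now(timezone.utc)
    prefix = os.path.join(LAKE_ROOT, source, date.today().isoformat())
    os.makedirs(prefix, exist_ok=True)
    name = ts.strftime("%H%M%S%f")
    data_path = os.path.join(prefix, f"{name}.{extension}")
    with open(data_path, "wb") as f:
        f.write(payload)
    # A sidecar makes the raw file discoverable for later processing.
    with open(data_path + ".meta.json", "w") as f:
        json.dump({"source": source, "ingested_at": ts.isoformat(),
                   "format": extension}, f)
    return data_path

# Structured and unstructured data land identically:
# ingest_raw("crm", b"customer_id,spend\n42,99.5\n", "csv")
# ingest_raw("support", b"Customer called about a billing issue...", "txt")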

The next thing to get right is agility. Traditional data sources are structured, and a data warehouse ensures their seamless and efficient processing. With big data, though, that rigidity becomes a disadvantage. Data scientists need to build agile systems that can be easily configured and reworked in order to navigate the multitude of data sources quickly and efficiently and build an automation system that works.
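
One common way to get that agility is to make the pipeline configuration-driven, so adding or rewiring a source is a config change rather than a code change. The Python below is a minimal sketch of the idea; the formats, source names and file paths are hypothetical.

import csv
import json
from typing import Callable, Dict, Iterable, List

# Registry of reusable readers; a new format is added here once.
READERS: Dict[str, Callable[[str], Iterable[dict]]] = {}

def reader(fmt: str) -> Callable:
    def register(fn: Callable[[str], Iterable[dict]]) -> Callable:
        READERS[fmt] = fn
        return fn
    return register

@reader("csv")
def read_csv(path: str) -> Iterable[dict]:
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

@reader("jsonl")
def read_jsonl(path: str) -> Iterable[dict]:
    with open(path) as f:
        for line in f:
            yield json.loads(line)

# Reworking the pipeline means editing this declaration, not the code.
PIPELINE: List[dict] = [
    {"name": "crm", "format": "csv", "path": "crm.csv"},
    {"name": "events", "format": "jsonl", "path": "events.jsonl"},
]

def run(pipeline: List[dict]) -> Iterable[dict]:
    for step in pipeline:
        for record in READERS[step["format"]](step["path"]):
            record["_source"] = step["name"]  # keep provenance for debugging
            yield record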

While challenges such as those mentioned above can be tackled by choosing the right technologies, other problems with big data need to be dealt with at a more granular level. One example is algorithm manipulation: rogue or incompetent developers can introduce changes that produce vastly different outputs and cause automation issues that are extremely difficult to track down and fix. Another issue is the misinterpretation of data. An automated big data system can magnify minor discrepancies in the data and feed them into a loop, leading to grossly misleading outputs.
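
A toy Python example makes the amplification point vivid: if each automated cycle treats the previous output as ground truth and re-applies the same small bias, a 1% discrepancy compounds into a wildly wrong answer.

def feedback_loop(true_value: float, error: float, iterations: int) -> float:
    estimate = true_value * (1 + error)  # small initial discrepancy
    for _ in range(iterations):
        # Each cycle takes the previous output as ground truth and
        # re-applies the same biased adjustment.
        estimate *= (1 + error)
    return estimate

print(feedback_loop(100.0, 0.01, 50))  # ~166: a 1% error becomes a 66% drift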

These issues cannot be wished away, and the only way to get automation right in such cases is to diligently monitor and evaluate the code and its outputs. This makes it possible to identify discrepancies in the algorithm and its outputs before they blow up. From a business perspective, this means additional resources to test and validate the code and output at each stage of the development and operational cycle, which could erode the cost advantage that big data automation brings. But it is a necessary expense if businesses want to establish a sustainable big data automation product that actually works.
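
In code, stage-by-stage validation can be as simple as wrapping each pipeline step with sanity checks, so a discrepancy is caught where it enters rather than after it compounds. The sketch below is one way to do it in Python; the thresholds and check functions are illustrative assumptions.

from typing import Callable, List

def validated(stage: Callable[[list], list],
              checks: List[Callable[[list], bool]]) -> Callable[[list], list]:
    # Run a pipeline stage, then fail fast if any output check is violated.
    def run(records: list) -> list:
        out = stage(records)
        for check in checks:
            if not check(out):
                raise ValueError(f"validation failed after {stage.__name__}: "
                                 f"{check.__name__}")
        return out
    return run

def no_row_explosion(records: list) -> bool:
    # Guard against a join or loop silently multiplying rows.
    return len(records) < 1_000_000

def no_nulls_in_key(records: list) -> bool:
    return all(r.get("user_id") is not None for r in records)

# usage (deduplicate is a hypothetical stage):
# clean = validated(deduplicate, [no_row_explosion, no_nulls_in_key])(raw)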

About the Author

Harry Trott is an IT consultant from Perth, WA. He is currently working on a long-term project in Bangalore, India. Harry has over seven years of experience on cloud and networking projects. He is also working on a SaaS-based startup that is currently in stealth mode.
