Getting Automation Right with Big Data | @BigDataExpo #BigData

Things To Remember While Automating With Big Data

Big data automation can mean writing dozens of scripts to process different input sources and aligning them in order to consolidate all this data and produce the required output.

Why exactly do you need big data for your enterprise projects? Many industry observers have noted that although a lot of enterprises claim their big data projects are aimed at "deriving insights" that replace human intuition with data-driven alternatives, in reality the objective appears to be automation. They point out that the role of data scientists at many organizations has little to do with replacing human intuition with big data. Instead, it is about augmenting human experience by making work easier, faster and more efficient.

But automating big data processing is easier said than done, and the biggest problem is that big data is, well, big. There is a lot of chaos and inconsistency in the data available, so creating a single MapReduce script that can ingest all your data and process the results is wishful thinking. In reality, big data automation can mean writing dozens of scripts to process different input sources and aligning them to consolidate all this data and produce the required output.
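
As a rough illustration of that "many small scripts" reality, here is a minimal sketch (Python, standard library only, with hypothetical file names and field names) of per-source parsers being aligned into one common schema before consolidation:

import csv
import json

def parse_sales_csv(path):
    """Yield normalized records from a CSV export (hypothetical schema)."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            yield {"source": "sales_csv", "id": row["order_id"], "amount": float(row["amount"])}

def parse_events_json(path):
    """Yield normalized records from a JSON-lines event feed (hypothetical schema)."""
    with open(path) as f:
        for line in f:
            event = json.loads(line)
            yield {"source": "events_json", "id": event["id"], "amount": float(event.get("value", 0))}

def consolidate(parsers, out_path):
    """Align records from every parser into one flat, consistent output file."""
    with open(out_path, "w", newline="") as out:
        writer = csv.DictWriter(out, fieldnames=["source", "id", "amount"])
        writer.writeheader()
        for parse in parsers:
            for record in parse():
                writer.writerow(record)

if __name__ == "__main__":
    consolidate(
        parsers=[
            lambda: parse_sales_csv("sales.csv"),
            lambda: parse_events_json("events.jsonl"),
        ],
        out_path="consolidated.csv",
    )

In practice each new source tends to need its own parser like the two above, which is exactly why the script count grows so quickly.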

The first thing to get right with respect to automating big data is the architecture. One of the most popular ways to set up big data automation is through data lakes. To put it simply, a data lake is a large storage repository that holds all the raw data until it is needed for processing. Unlike traditional hierarchical data warehouses, a data lake stores raw data in a flat architecture. One of the key advantages here is that a data lake can store all sorts of data - structured, semi-structured and unstructured - and is thus well suited for big data automation.
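
To make the "store raw first, process later" idea concrete, the sketch below lands an incoming file untouched in a flat, date-partitioned layout with a small metadata sidecar. The local directory layout and metadata fields are illustrative assumptions, not any specific product's API:

import json
import shutil
from datetime import date
from pathlib import Path

LAKE_ROOT = Path("datalake/raw")  # hypothetical local stand-in for object storage

def land_raw_file(source_name: str, src_path: str) -> Path:
    """Copy an incoming file as-is into a flat, date-partitioned prefix."""
    src = Path(src_path)
    target_dir = LAKE_ROOT / f"source={source_name}" / f"ingest_date={date.today()}"
    target_dir.mkdir(parents=True, exist_ok=True)

    target = target_dir / src.name
    shutil.copy2(src, target)  # keep the raw bytes untouched; no transformation yet

    # Sidecar metadata so downstream jobs can discover what landed and when.
    meta = {"source": source_name, "file": src.name, "bytes": target.stat().st_size}
    (target_dir / (src.name + ".meta.json")).write_text(json.dumps(meta))
    return target

if __name__ == "__main__":
    land_raw_file("clickstream", "events.jsonl")

Because nothing is transformed at ingestion time, structured, semi-structured and unstructured sources can all land the same way, which is what gives the lake its flexibility.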

The next thing to get right is agility. Traditional data sources are structured, and data warehouse technology handles them with seamless, efficient processing. With big data, though, that rigidity can be a disadvantage. Data scientists need to build agile systems that can be easily configured and reworked, so they can navigate the multitude of data sources quickly and efficiently and build an automation system that works.
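
One common way to get that agility is to make the pipeline configuration-driven, so adding or reworking a source is a config edit rather than a code rewrite. A minimal sketch, with all names and the config layout being illustrative assumptions:

import csv
from typing import Callable, Dict, Iterable, List

Record = Dict[str, object]

def read_csv_source(path: str) -> Iterable[Record]:
    """Read one CSV source lazily as dict records."""
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def drop_empty_rows(records: Iterable[Record]) -> Iterable[Record]:
    """Filter out rows where every field is blank."""
    return (r for r in records if any(v not in ("", None) for v in r.values()))

READERS: Dict[str, Callable[[str], Iterable[Record]]] = {"csv": read_csv_source}
STEPS: Dict[str, Callable[[Iterable[Record]], Iterable[Record]]] = {"drop_empty": drop_empty_rows}

PIPELINE_CONFIG = [  # hypothetical config; in practice this might live in YAML or JSON
    {"name": "sales", "reader": "csv", "path": "sales.csv", "steps": ["drop_empty"]},
]

def run(config: List[dict]) -> Dict[str, List[Record]]:
    results = {}
    for source in config:
        records = READERS[source["reader"]](source["path"])
        for step_name in source["steps"]:
            records = STEPS[step_name](records)
        results[source["name"]] = list(records)
    return results

if __name__ == "__main__":
    print({name: len(rows) for name, rows in run(PIPELINE_CONFIG).items()})

Swapping in a new source or processing step then means registering one function and adding one config entry, which keeps rework cheap as the data landscape shifts.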

While challenges such as those mentioned above can be tackled by choosing the right technologies, other problems with big data need to be dealt with at a more granular level. One example is manipulative algorithms: rogue or incompetent developers can produce vastly different outputs and cause automation issues that are extremely difficult to track down and fix. Another issue is misinterpretation of data. An automated big data system can magnify minor discrepancies in the data and feed them into a loop, leading to grossly misleading outputs (a toy illustration follows).
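
The toy example below (not from the article; the numbers are made up) shows how such a loop can amplify a small discrepancy: an automated estimate that feeds its own output back in compounds a "minor" 1% bias into a large error over a month:

TRUE_DAILY_VALUE = 100.0
BIAS = 1.01  # a "minor" 1% discrepancy introduced upstream

estimate = TRUE_DAILY_VALUE
for _ in range(30):
    # The automated system trusts yesterday's output and re-applies the bias.
    estimate = estimate * BIAS

print(f"After 30 days the estimate is {estimate:.1f} vs the true {TRUE_DAILY_VALUE}")
# ~134.8 vs 100.0: roughly a 35% error produced entirely by looping a 1% discrepancy.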

These issues cannot be wished away, and the only way to get automation right in such cases is to diligently monitor and evaluate the code and its outputs. This way, discrepancies in the algorithm and its outputs can be identified before they blow up. From a business perspective, this means additional resources to test and validate the code and output at each stage of the development and operational cycle, which can erode the cost advantage that big data automation offers. But it is a necessary expense if businesses want to establish a sustainable big data automation product that actually works.
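
That stage-by-stage validation does not have to be elaborate to be useful. A minimal sketch of an output gate, with checks and thresholds that are purely illustrative assumptions:

from statistics import mean

def validate_batch(amounts, expected_min=0.0, expected_mean_range=(10.0, 10_000.0)):
    """Raise early if a batch of output values looks like a discrepancy being amplified."""
    if not amounts:
        raise ValueError("empty batch: upstream job may have silently failed")
    if min(amounts) < expected_min:
        raise ValueError(f"value below expected minimum: {min(amounts)}")
    batch_mean = mean(amounts)
    lo, hi = expected_mean_range
    if not (lo <= batch_mean <= hi):
        raise ValueError(f"batch mean {batch_mean:.2f} outside expected range [{lo}, {hi}]")
    return True

if __name__ == "__main__":
    validate_batch([120.0, 99.5, 430.0])       # passes
    # validate_batch([120.0, -5.0, 430.0])     # would fail the minimum-value check

Running a gate like this between pipeline stages is one way to catch a looping discrepancy before it reaches the final output.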

More Stories By Harry Trott

Harry Trott is an IT consultant from Perth, WA. He is currently working on a long-term project in Bangalore, India. Harry has over 7 years of experience on cloud and networking-based projects. He is also working on a SaaS-based startup that is currently in stealth mode.
