Testing #BigData Applications | @CloudExpo @YourZephyr #AI #Analytics

Big Data applications are used for everything from targeting retail customers to bringing new pharmaceuticals to market more quickly. As their name suggests, they are built to acquire, sort and put to use vast sets of information, whether the data points are historical sales figures for a product, results of clinical trial simulations or details in an online dating profile.

There are plenty of freely available tools for building Big Data applications, most notably Apache Hadoop, which supports distributed storage and processing on commodity hardware. Partnerships between companies like Intel and Cloudera, a company that specializes in Hadoop, have also made it easier for organizations to build Hadoop applications in recent years; the chipmaker has worked on Big Data frameworks such as its Trusted Analytics Platform, which is available on Rackspace and Amazon Web Services.

"Hadoop helps here with its HDFS (Hadoop Distributed File System), which lets you store a large amount of data on a cloud of machines," explained Vasu Swaminathan, director of quality engineering for Aspire Systems, in a blog post. "On top of HDFS, Hadoop provides an API to process the stored data which is Map Reduce. The idea is since the data is stored in a distributed manner across nodes, it can be processed well in that manner where each node can process the data stored on it instead of getting hit by performance degradation issues by moving it over the network."

So, "how do I build a Big Data application? is becoming an easier question to answer. But what about "how do I test a Big Data application?" That one is harder to deal with. Since these programs go beyond the capabilities and infrastructures that underpin typical pieces of PC or mobile software, they require a unique, scalable approach to testing.

Requirements and Tips for Testing a Big Data Application
Many so-called "Big Data applications" can be categorized as one of the following:

  • A data warehouse (DWH), which can be a federated physical or logical repository that collects data from a variety of sources (just like a literal warehouse may contain items from all over). DWH programs only work with structured data, and use batch processing to handle it. They are typically built on relational databases such as Oracle.
  • A Big Data storage system, which has fewer limitations and can accept more types of data. Such an application can process streaming data, unstructured data and so on. It might have a capacity well into the petabytes, whereas a DWH would typically only reach into the neighborhood of gigabytes or terabytes. It is built on a file system such as HDFS; the sketch after this list shows what that schema-on-read approach looks like in practice.
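The practical difference shows up at ingestion time. A data warehouse validates records against a fixed schema as they are loaded; HDFS simply accepts bytes, whether they are log lines, JSON, images or video chunks, and leaves interpretation to whatever job reads them later. Below is a minimal sketch using the Hadoop FileSystem API; the namenode address, port and target path are placeholders rather than details from any particular setup.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsIngestSketch {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder namenode URI; substitute your own cluster's address.
        FileSystem fs = FileSystem.get(URI.create("hdfs://namenode:8020"), conf);

        // HDFS does not care what the bytes are: no schema is enforced on write.
        Path target = new Path("/data/raw/events/2016-01-01.log");
        try (FSDataOutputStream out = fs.create(target)) {
          out.writeBytes("2016-01-01T00:00:00Z\tuser42\tsearch\t\"big data testing\"\n");
        }

        // Structure is imposed later, at read time, by whatever job processes the file.
        System.out.println("Wrote " + fs.getFileStatus(target).getLen()
            + " bytes to " + target);
      }
    }

A DWH would reject a record that fails its schema at load time; here nothing is rejected, which is precisely why validating what landed in the cluster becomes a testing concern.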

Both types can be set up to work with items such as web search queries, medical records, video archives and documents of different kinds. The goal of testing these applications is largely to validate all of this information. In a widely read Quora answer on this topic, Parul Gangwar explained that there are four Vs to look out for in the data in question: variety, volume, velocity and value.

Gauging these metrics and verifying the data as a whole is usually done through performance and functional testing. A few things to note along the way:

  • An early step is to make sure that the right data has been pulled into the system in the first place. Source data may be compared to what has actually landed in the Hadoop cluster (a minimal reconciliation check is sketched after this list).
  • Business logic may be evaluated on each node to verify that MapReduce is working for the application (the unit-test sketch after this list shows one way to do this without a cluster).
  • The output of the application may then be checked for corruption, and to confirm that the correct transformation rules were applied.
  • Hadoop architectures may also be subjected to failover testing to make sure that they can recover when individual nodes fail and still hold up under the heavy workloads they are regularly saddled with.
  • Finally, the system should be checked for basic performance characteristics such as its throughput and how well its subcomponents perform at tasks such as indexing messages.
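As a rough illustration of the first check, the sketch below reconciles a row count from a relational source against the number of records that landed in the Hadoop cluster. The JDBC URL, credentials, table name and HDFS path are all hypothetical, and counting lines in the test driver only makes sense for modest volumes; at real Big Data scale the HDFS-side count would itself be a MapReduce or similar distributed job.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class IngestReconciliationCheck {
      public static void main(String[] args) throws Exception {
        // 1. Count the rows in the source system (URL, credentials and table
        //    are placeholders; the JDBC driver must be on the classpath).
        long sourceCount;
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:oracle:thin:@source-db:1521/ORCL", "user", "pass");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM sales")) {
          rs.next();
          sourceCount = rs.getLong(1);
        }

        // 2. Count the records that actually landed in the Hadoop cluster,
        //    assuming one record per line in the ingested files.
        FileSystem fs = FileSystem.get(new Configuration());
        long hdfsCount = 0;
        for (FileStatus file : fs.listStatus(new Path("/data/ingested/sales"))) {
          try (BufferedReader reader =
                   new BufferedReader(new InputStreamReader(fs.open(file.getPath())))) {
            while (reader.readLine() != null) {
              hdfsCount++;
            }
          }
        }

        // 3. Fail loudly on any mismatch so the pipeline run is flagged.
        if (sourceCount != hdfsCount) {
          throw new AssertionError(
              "Ingest mismatch: source=" + sourceCount + ", hdfs=" + hdfsCount);
        }
        System.out.println("Ingest check passed: " + sourceCount + " records");
      }
    }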
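For the per-node business logic, Apache MRUnit lets mappers and reducers be exercised as plain JUnit tests, without standing up a cluster. Assuming the WordCount classes sketched earlier, a test of the map and reduce logic might look like this:

    import java.util.Arrays;

    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mrunit.mapreduce.MapDriver;
    import org.apache.hadoop.mrunit.mapreduce.ReduceDriver;
    import org.junit.Before;
    import org.junit.Test;

    public class WordCountLogicTest {

      private MapDriver<Object, Text, Text, IntWritable> mapDriver;
      private ReduceDriver<Text, IntWritable, Text, IntWritable> reduceDriver;

      @Before
      public void setUp() {
        // Drive the real mapper and reducer classes in-memory.
        mapDriver = MapDriver.newMapDriver(new WordCount.TokenizerMapper());
        reduceDriver = ReduceDriver.newReduceDriver(new WordCount.IntSumReducer());
      }

      @Test
      public void mapperEmitsOneCountPerToken() throws Exception {
        // Expected outputs are listed in the order the mapper emits them.
        mapDriver.withInput(new LongWritable(1), new Text("big data big tests"))
                 .withOutput(new Text("big"), new IntWritable(1))
                 .withOutput(new Text("data"), new IntWritable(1))
                 .withOutput(new Text("big"), new IntWritable(1))
                 .withOutput(new Text("tests"), new IntWritable(1))
                 .runTest();
      }

      @Test
      public void reducerSumsPartialCounts() throws Exception {
        reduceDriver.withInput(new Text("big"),
                               Arrays.asList(new IntWritable(1), new IntWritable(1)))
                    .withOutput(new Text("big"), new IntWritable(2))
                    .runTest();
      }
    }

Because these run in ordinary JVM processes, they fit naturally into a continuous integration job, which is where tools like Jenkins come into the picture below.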

Properly testing a Big Data application calls for familiarity with a variety of tools, frameworks and programming languages, everything from Java to Jenkins. An enterprise test management system is also valuable when vetting Big Data applications, since it can track test runs and results in real time.

Given the vast scale of Big Data programs, automation is particularly important during the building and testing phases. It streamlines the testing process so that DevOps teams can validate the data throughout the application more quickly.

"[T]est automation can be a good approach in testing Big Data implementations," Swaminathan explained at the end of his post. "Identifying the requirements and building a robust automation framework can help in doing comprehensive testing. However, a lot would depend on how the skills of the tester and how the Big Data environment is set up."

More Stories By Sanjay Zalavadia

As the VP of Client Service for Zephyr, Sanjay Zalavadia brings over 15 years of leadership experience in IT and Technical Support Services. Throughout his career, Sanjay has successfully established and grown premier IT and Support Services teams across multiple geographies for both large and small companies.

Most recently, he was Associate Vice President at Patni Computers (NYSE: PTI), responsible for the Telecoms IT Managed Services Practice, where he established IT Operations teams supporting Virgin Mobile, ESPN Mobile, Disney Mobile and Carphone Warehouse. Prior to this, Sanjay was responsible for Global Technical Support at Bay Networks, a leading routing and switching vendor, which was acquired by Nortel. He has also held management positions in Support Services organizations at start-up Silicon Valley Networks, a vendor of Test Management software, and SynOptics.


