Finding the Right Little Data | @CloudExpo #BigData #ML #InternetOfThings

Even with the great strides technology has taken, data quality remains a tremendous challenge for genealogy researchers

Over the years, one of my favorite pastimes has been working on the family genealogy. I first started work on it in the early 1990s. At that time, records were not digitized, and research involved going to libraries, newspapers, and various local, state, and federal archives. There one would have to sift through reams of old records, documents, and microfiche. If you were lucky, someone had created printed indices of the information contained in those documents. These indices, when they existed, were based on manual transcription and were prone to the data quality issues inherent in transcribing what were frequently old handwritten documents. The challenge was then finding those nuggets of information that would relate and connect to the individuals you were trying to locate in your research.

For anyone operating in the Big Data space of today, this may all sound very familiar. Genealogists, amateur and otherwise, were dealing with the challenges of Big Data, unstructured data, and data quality long before the terms became technology buzzwords. There are tools and products today to help; who hasn't seen the commercials for ancestry.com? However, even with technology, there are still challenges. These challenges are not unique to genealogy, and they make a good lens for viewing and discussing business needs for Big Data in general. Let's take a closer look at some of them.

Data Quality
Even with the great strides technology has taken, data quality remains a tremendous challenge for genealogy researchers. Let's take U.S. Census data as an example. Every 10 years, the U.S. government conducts a census of the population, and the results of these censuses become public 72 years after they are taken (the 1940 Census material just recently became available). U.S. Census data is a gold mine for genealogy research.

Below is a sample from the 1930 Federal census. Census forms were filled out by individuals going door to door and asking the residents questions. However, there are a number of data quality factors you must take into consideration. The sample here has fairly good quality handwriting, although that's not always the case. You are also constrained by the census taker's interpretation of the person's answers and pronunciation, which could result, for example, in different variations on the spelling of names.
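Genealogy indexes have long coped with these spelling variations using phonetic coding; American Soundex, for instance, was used to index several U.S. censuses. As a sketch of the idea (a simplified version, not a complete implementation of the official coding rules), two spellings of the same surname collapse to one code:

```python
def soundex(name: str) -> str:
    """Simplified American Soundex: encode a surname as a letter plus
    three digits so that spelling variants map to the same code."""
    codes = {}
    for digit, letters in enumerate(
            ("bfpv", "cgjkqsxz", "dt", "l", "mn", "r"), start=1):
        for ch in letters:
            codes[ch] = str(digit)

    name = name.lower()
    encoded = name[0].upper()
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        code = codes.get(ch, "")
        if code and code != prev:
            encoded += code
        if ch not in "hw":        # h and w do not separate duplicate codes
            prev = code
    return (encoded + "000")[:4]  # pad or truncate to four characters

# Variant spellings a census taker might have produced:
print(soundex("Smith"), soundex("Schmidt"))          # S530 S530
print(soundex("Featherston"), soundex("Fetherstone"))  # F362 F362
```

Matching on the phonetic code rather than the literal string is one way a search can still surface a record even when the census taker wrote the name down differently than the family spelled it.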

When this document gets transcribed, you could still have multiple sources of data quality problems: the original census taker could have written the information down incorrectly, or the person transcribing it could have made a transcription error.

This challenge is not unique to genealogy research. Data quality has been an issue in IT systems since the first IT system. In the world of Big Data, unstructured data (such as social media) and things like crowd-sourced data can become a daunting challenge. As with any challenge, we must understand the impact of those issues, the risk, and what can be done to mitigate that risk. In my example above, Ancestry.com takes an interesting approach to mitigation. Because they have millions of records based on scanned documents, checking each one is beyond reasonable expectation, so they crowd-source corrections. As a customer, I locate a particular record for someone I am looking for (that little data in all the Big Data). If I notice some type of error, I can flag the record, categorize the error, and provide what I believe is the correct information. Ancestry will then review my correction and, if appropriate, cleanse the transcribed data.
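The flag-categorize-review-cleanse flow described above can be sketched as a small correction queue. Ancestry's actual workflow and data model are not public, so every name here is hypothetical; this is only an illustration of the pattern:

```python
from dataclasses import dataclass

# Hypothetical sketch of a crowd-sourced correction workflow:
# a customer flags a record, a reviewer accepts or rejects,
# and accepted corrections cleanse the transcribed data.
@dataclass
class Correction:
    record_id: str
    field_name: str
    error_category: str       # e.g. "transcription", "original-entry"
    suggested_value: str
    status: str = "pending"   # pending -> accepted / rejected

class CorrectionQueue:
    def __init__(self, records: dict):
        self.records = records                  # record_id -> {field: value}
        self.pending: list[Correction] = []

    def flag(self, correction: Correction) -> None:
        """A customer flags a suspected error on a record."""
        self.pending.append(correction)

    def review(self, correction: Correction, accept: bool) -> None:
        """A reviewer decides; accepted fixes update the record."""
        correction.status = "accepted" if accept else "rejected"
        if accept:
            record = self.records[correction.record_id]
            record[correction.field_name] = correction.suggested_value

# Illustrative record and correction (all identifiers made up):
records = {"1930-census-0412": {"surname": "Smyth"}}
queue = CorrectionQueue(records)
fix = Correction("1930-census-0412", "surname", "transcription", "Smith")
queue.flag(fix)
queue.review(fix, accept=True)
print(records["1930-census-0412"]["surname"])   # Smith
```

The key design point is that the crowd only proposes changes; the data owner stays in the review loop before anything in the system of record is altered.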

Data Pedigree
Even though we are discussing genealogy, data pedigree is not about the family tree. Data pedigree asks 'Where did the data come from?' and 'How reliable is that source?' If, as an organization, you own the data, that is not an issue. In today's Big Data world, many sources are outside of your direct control (unstructured social media data, crowd-sourced data). For genealogy research, data pedigree has always been an issue and a concern. A date of birth is a lot more reliable from a town birth record than from, say, the census example above, where the information is 'the age on last birthday, as provided in the interview' (I have seen variations of multiple years across sequential census forms for the same individual). In my Ancestry.com example again, in addition to source records, Ancestry members can make their research available for online search and sharing. When using others' data (i.e., crowd-sourced research), one must always feel comfortable with the reliability of the source. Ancestry allows you to identify what your source of information was, and you can identify multiple sources (for example, I may source a date of birth from a birth record, a marriage record, and a death record). That information is more reliable than a date of birth with no source cited. When I find a potential match (again, that little data I am truly looking for), I can determine whether it truly is a match or possibly a false correlation.
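One way to make this concrete is to score a fact by the sources that back it. The weights below are purely illustrative assumptions (a birth record recorded at the event outranks a self-reported census age, which outranks an uncited member tree); the combining rule treats sources as independent and asks how likely it is that at least one is right:

```python
# Illustrative, made-up reliability weights for genealogy sources.
SOURCE_WEIGHTS = {
    "birth_record": 0.95,     # primary record, created at the event
    "death_record": 0.80,
    "marriage_record": 0.75,
    "census": 0.50,           # "age on last birthday", self-reported
    "member_tree": 0.30,      # crowd-sourced research, often uncited
}

def pedigree_score(sources: list[str]) -> float:
    """Probability that at least one cited source is correct,
    treating the sources as independent."""
    p_all_wrong = 1.0
    for s in sources:
        p_all_wrong *= 1.0 - SOURCE_WEIGHTS.get(s, 0.0)
    return 1.0 - p_all_wrong

# A date of birth backed by multiple records beats a single weak source:
print(round(pedigree_score(["birth_record", "census"]), 3))  # 0.975
print(round(pedigree_score(["member_tree"]), 3))             # 0.3
```

Real pedigree assessment is of course a judgment call, not a formula, but even a rough score like this forces you to record where each fact came from, which is the point.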

Similarly, in any Big Data implementation, we must understand the pedigree of our data sources, because it impacts any analytics we perform and the resulting correlations. If you don't, you run the risk of false correlations and assumptions. For some entertaining examples of false correlations, check out www.tylervigen.com.
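It is easy to see how those false correlations arise: any two series that merely trend in the same direction will correlate strongly. The figures below are made up for illustration; both simply grow year over year, and the Pearson coefficient comes out close to 1 despite there being no causal link:

```python
def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Fabricated yearly figures, both trending upward for unrelated reasons:
cheese_consumption = [29.8, 30.1, 30.5, 30.6, 31.3, 31.7, 32.6, 33.1]
data_center_count  = [510, 540, 585, 590, 640, 660, 700, 730]

r = pearson(cheese_consumption, data_center_count)
print(round(r, 3))   # close to 1.0, yet the series are unrelated
```

Knowing the pedigree of each series, and whether the two have any plausible connection, is what separates a real insight from a tylervigen.com punch line.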

Finding That Gem of Little Data in the Huge Oceans of Big Data
The ultimate value of Big Data is not the huge ocean of data. It's being able to find the gems of little data that provide the information you seek. In genealogy, it is wonderful that I have millions of public records, documents, and other genealogy research available to sift through, but that's not the value. The value is when I find that record for the one individual in the family tree I have been trying to find. The analysis and matching of the data depend heavily on the challenges we have been discussing: data quality and data pedigree. The same is true for any Big Data implementation. Big Data without a good understanding of the data is just a big pile of data taking up space.

No technology negates the need for good planning and design. Big Data is not just about storing structured and unstructured data, and it's not just about providing the latest and greatest analytic tools. As technologists, we must work with the business to plan and design how to leverage and balance the data and its analysis. Work with the business to ensure there is a correct understanding of the data that is available, its quality, its pedigree, and the impact of those. Then the true value of Big Data will shine through as all the gems of little data are found.

This post is sponsored by SAS and Big Data Forum.

More Stories By Ed Featherston

Ed Featherston is VP, Principal Architect at Cloud Technology Partners. He brings 35 years of technology experience in designing, building, and implementing large complex solutions. He has significant expertise in systems integration, Internet/intranet, and cloud technologies. He has delivered projects in various industries, including financial services, pharmacy, government and retail.


