Big Data Good, Fast Big Data Better

Speed has become an integral part of the Big Data ethos, yet it is mentioned with comparative scarcity

This post is sponsored by The Business Value Exchange and HP Enterprise Services

The IT industry is nothing if not a breeding ground for an infinite variety of acronyms and neologisms. Alongside cloud computing today sits the term Big Data, which of course we understand to mean the volume of data that a traditional database would find hard to store and process in the normal course of its work.

Neo-neologisms
But what is a neologism if you can't turn it into a neo-neologism? Big Data in its own right is a term that we are just about getting used to, but the sooner we move towards an appreciation of 'fast Big Data' the better.

Technology analysts have been fond of the standard 'four Vs' definition used to describe the shape of Big Data, i.e., volume, velocity, variety and variability - but it is the 'velocity' factor that sits somewhat incongruously among its V-shaped bedfellows, i.e., it is the only one of the four that describes speed or motion. Without a velocity layer, Big Data lies in a state of inertia.

In the new world of data 2.0 we find that the velocity factor is extremely important. More of our computing channels are described as real-time or near real-time (both definitions are important) as users demand applications that rely upon ubiquitous connections to the Internet, other users, other data events and other application services.
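To make the idea of a velocity layer slightly more concrete, here is a minimal sketch (hypothetical Python, with made-up event names, not any particular product's API) of what reacting to data at speed means in practice: rather than letting events pile up for a later batch job, we count them over a short sliding window as they arrive.

    # Minimal sketch of a "velocity layer": per-key counts over a short
    # sliding window, updated as each event arrives. Illustrative only.
    import time
    from collections import deque, Counter

    WINDOW_SECONDS = 60          # keep only the last minute of events
    events = deque()             # (timestamp, key) pairs, oldest first

    def ingest(key, now=None):
        """Record an event and drop anything that has aged out of the window."""
        now = time.time() if now is None else now
        events.append((now, key))
        while events and events[0][0] < now - WINDOW_SECONDS:
            events.popleft()

    def counts_right_now():
        """Per-key counts for the current window - the 'react as it happens' view."""
        return Counter(key for _, key in events)

    # Example: three page views arrive in near real time
    for page in ("home", "checkout", "home"):
        ingest(page)
    print(counts_right_now())    # Counter({'home': 2, 'checkout': 1})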

Suddenly speed has become an integral part of the Big Data ethos, yet it is mentioned with comparative scarcity. Press and analyst (and vendor) comment pieces talk up the zany, incomprehensible world of zettabytes, petabytes and yottabytes. These are low-hanging fruit and easy to comment on. Forget terabytes, they are so 2009.
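For a sense of the scale being invoked, a back-of-the-envelope calculation (assuming decimal SI prefixes, where each step up is a factor of 1,000) is enough:

    # Rough scale of the prefixes mentioned above, in plain bytes (SI, base 10).
    PREFIXES = {"terabyte": 10**12, "petabyte": 10**15,
                "zettabyte": 10**21, "yottabyte": 10**24}

    for name, size in PREFIXES.items():
        print(f"1 {name} = {size:,} bytes (= {size // 10**12:,} terabytes)")
    # A single zettabyte is a billion terabytes - hence 'forget terabytes'.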

Speed Is the Unloved Second Cousin of Big Data
If speed is the unloved second cousin of Big Data, it shouldn't be. Major enterprise players (the vendors, not the customers in the first instance) are spending their hard-earned acquisition and development dollars on the technology positioned as the antidote to our Big Data woes, namely "analytics" - and analytics without a real-time capability is like a car at full throttle without a steering wheel, i.e., we need to be able to react to data in the real world and navigate through it without crashing.

Of course the fact of the matter here is that Big Data should be considered for its size, girth and overall hugeness as much as for its speed of movement. To contemplate an analysis of one without the other is fallacious and foolhardy. These two factors form two mutually interdependent sides of the contemporary data balancing equation that props up the Big Data economic model.

Software requirements in terms of compute capacity and depth of storage (okay, that's hardware, we know) both increase proportionally as the economic values we attach to data and to time approach zero. As fast, real-time Big Data comes of age, we need more back-office technology to support it.

None of this happens without layers of management technology and this is where much of the industry discussion is focused today with regard to Big Data. The trouble is, people aren't calling it fast Big Data yet. It will happen, but it needs to happen in real time and that means today.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development as well as all related aspects of software engineering, project management and technology as a whole.
