
Big Data On-ramp

Due to the unprecedented volume, variety, and velocity of Big Data, finding a clear path to jumpstart the Big Data journey is neither trivial nor straightforward. The space is crowded with immature options and evolving solutions, which makes it confusing and daunting. Where can you find an entry point? What is the most effective way to get on board? Which aspects should you be mindful of? How do you avoid missing what matters most? Why do you need to begin with the basics?
 
Here are five areas of consideration for the Big Data on-ramp: Structure, Location, Objective, Participant, and Event (SLOPE).
  • Structure: The data format is the first and foremost factor. Confining analysis to traditional structured content is no longer sufficient in this era. We have to pay close attention to how we will ingest and analyze unstructured and semi-structured information in both the short term and the long run (see the normalization sketch after this list).
  • Location: Where data reside and how they move around, inside or outside an enterprise, have a significant impact on the overall Big Data ecosystem. A sophisticated messaging platform should be employed in a complex environment that involves heterogeneous data sources and consumers (see the messaging sketch below). Data locality is also important for distributed processing with a viable hosting model.
  • Objective: Reasonable goals and the right level of expectations should be set at the very beginning to develop solid business cases. Big Data should not be adopted merely for the sake of moving to Big Data technology. Rather, it is an advanced discipline for transforming a problematic environment into a realistic target state. For example, eliminating data silos is a must, but it also brings pain and conflict during execution.
  • Participant: It is crucial to conceive Big Data solutions from a user-centric perspective. All stakeholders involved need to think coherently about the value chain of data as an asset. The priorities and preferences of end users, partners, data feeders, brokers, and so on must be balanced and harmonized. A RACI or RASCI matrix should be established to specify roles and responsibilities in the governance (see the RACI sketch below).
  • Event: The types of interactions and access dictate which Big Data platforms are the most suitable candidates for both transactional and analytical processing. Large-scale data streaming is a feasible option for enabling near real-time analytics in event-driven scenarios such as fraud detection. Quantifiable thresholds ought to be defined to state explicitly how real "real-time" must be (see the streaming sketch below).
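
To make the Structure point concrete, here is a minimal sketch of normalizing a structured feed and a semi-structured feed into one common schema. Both feeds and all field names are hypothetical, chosen purely for illustration:

```python
import csv
import io
import json

# Hypothetical feeds: a structured CSV extract and a semi-structured JSON feed.
csv_feed = "id,amount\n1,250.00\n2,75.50"
json_feed = '[{"id": 3, "amount": 99.95, "tags": ["priority"]}]'

def normalize(record):
    """Map a raw record onto a common schema, tolerating optional fields."""
    return {
        "id": int(record["id"]),
        "amount": float(record["amount"]),
        "tags": record.get("tags", []),  # only the JSON feed carries tags
    }

rows = [normalize(r) for r in csv.DictReader(io.StringIO(csv_feed))]
rows += [normalize(r) for r in json.loads(json_feed)]
print(rows)  # both feeds now share one structure for downstream analysis
```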
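For the Location point, a messaging platform decouples data producers from consumers. Below is a minimal sketch using the kafka-python client; the broker address and the "orders" topic are assumptions for illustration, not a prescribed setup:

```python
import json

from kafka import KafkaProducer  # pip install kafka-python

# Assumed broker address and topic name; adjust to your environment.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publishing to a topic lets heterogeneous consumers, inside or outside
# the enterprise, subscribe without point-to-point coupling.
producer.send("orders", {"order_id": 42, "amount": 99.95, "region": "EMEA"})
producer.flush()
```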
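For the Participant point, a RACI matrix can start life as a simple data structure before it is formalized in governance tooling. The activities and role names below are purely hypothetical:

```python
# Hypothetical RACI matrix: activity -> Responsible/Accountable/Consulted/Informed.
raci = {
    "define data quality rules": {
        "R": "data steward", "A": "data owner",
        "C": "end users", "I": "partners",
    },
    "approve schema changes": {
        "R": "data architect", "A": "data owner",
        "C": "data feeders", "I": "brokers",
    },
}

for activity, roles in raci.items():
    print(f"{activity}: {roles}")
```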
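For the Event point, "how real is real-time" can be pinned down with explicit, quantifiable thresholds. Here is a minimal sketch of a sliding-window fraud check; the window size and transaction limit are assumed values for illustration, not a recommendation:

```python
from collections import deque
from datetime import datetime, timedelta

# Assumed thresholds: flag a card when more than 3 transactions
# arrive within a 60-second window.
WINDOW = timedelta(seconds=60)
LIMIT = 3

recent = {}  # card_id -> deque of recent transaction timestamps

def on_transaction(card_id, ts):
    """Handle one stream event; return True when the rule fires."""
    window = recent.setdefault(card_id, deque())
    window.append(ts)
    while window and ts - window[0] > WINDOW:
        window.popleft()  # evict events older than the window
    return len(window) > LIMIT

# Five rapid-fire transactions: the last two trip the threshold.
start = datetime.now()
for i in range(5):
    print(on_transaction("card-123", start + timedelta(seconds=i)))
```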

For more information, please contact Tony Shan ([email protected]). ©Tony Shan. All rights reserved.


More Stories By Tony Shan

Tony Shan works as a senior consultant and advisor at a global applications and infrastructure solutions firm, helping clients realize the greatest value from their IT. Shan is a renowned thought leader and technology visionary with many years of field experience and guru-level expertise in cloud computing, Big Data, Hadoop, NoSQL, social, mobile, SOA, BI, technology strategy, IT roadmapping, systems design, architecture engineering, portfolio rationalization, product development, asset management, strategic planning, process standardization, and Web 2.0. He has directed the lifecycle R&D and buildout of large-scale, award-winning distributed systems on diverse platforms at Fortune 100 companies and public-sector organizations, including IBM, Bank of America, Wells Fargo, Cisco, Honeywell, and Abbott.

Shan is an inventive expert with a proven track record of influential innovations such as Cloud Engineering. He has authored dozens of top-notch technical papers on next-generation technologies and over ten books that have won multiple awards. He is a frequent keynote speaker and serves as a chair, panelist, advisor, judge, and organizing committee member at prominent conferences and workshops; an editor and editorial advisory board member for IT research journals and books; and a founder of several user groups, forums, and centers of excellence (CoE).
