The Tip of the Spear II: Connecting Big Data Project Management with Enterprise Data Strategy
By Dennis D. McDonald

Introduction
In Meeting the Mission of Transportation Safety, Richard McKinney, the U.S. Department of Transportation's CIO, describes four components of what I call an “enterprise data strategy”:

  1. Data governance
  2. Data sharing
  3. Data standards
  4. Data analysis

He also mentions additional factors relevant to DOT’s data strategy:

  1. The volume of data is increasing and we need to be ready for it.
  2. Managing data is not the same as analyzing it.
  3. We need to be thinking now about what type of analysis we need to be doing and what resources will be needed to do the analysis.

Based on the 20+ personal, telephone, and email interviews I’ve conducted so far[2] as part of my big data project management research, I would add a fourth item to McKinney's list:

  4. We need to spend at least as much time planning and managing the people and business processes that make data analysis possible as we do on the analysis process itself and the technologies that support it.

Tip of the Spear
If data analysis is Big Data’s “tip of the spear” when it comes to delivering data-dependent value to customers or clients, we also must address how that spear is shaped, sharpened, aimed, and thrown – and, of course, whether or not it hits its intended target.

We also want the processes associated with throwing that spear to be both effective and efficient.

Making the data analysis process, the tip of the Big Data spear, effective and efficient is where good project planning and management come in. Challenges to doing this on data-intensive projects are identifiable and include:

  1. Siloes. Data are often generated and managed in system- or mission-specific siloes. As a result, creating and implementing an effective enterprise-level data strategy that rises above and encompasses multiple programs, systems, and/or missions requires more than data analysis skills and good project management; it requires a mix of technical, organizational, and political skills.
  2. Sharing. Making data accessible and useful often means that data need to be shared with systems and processes outside the control of those who "own" the data to be analyzed. Key steps in sharing data are that (a) data need to be identified and inventoried, and (b) technical and business ownership of the inventoried data must be determined. In many organizations this inventorying is easier said than done and may require both manual and automated approaches to building the necessary inventories (see the sketch after this list).
  3. Standards. Efficient and sustainable analysis of data and metadata may require development or implementation of data standards. The existence and use of such standards differ by industry, data type, and system. The effort required to develop and adopt standards that facilitate data sharing and analysis will also vary and may have cost and schedule implications at the project, program, enterprise, and industry or community levels.
  4. Delivering value. Modern data analysis tools and techniques provide mechanisms to identify patterns and trends in the increasing volumes of data generated by a steadily widening variety of data capture mechanisms. The difficulty of predicting what will be found when data are analyzed places a premium on making sure we are asking the right questions. This in turn affects our ability to justify project expenditures in advance.
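To make the inventory step in the Sharing item above more concrete, below is a minimal sketch of an automated pass over a shared file store. It assumes a hypothetical layout in which the top-level folder names a program or silo and a hand-maintained lookup table records business and technical owners; all paths, field names, and owner names are illustrative assumptions, not a prescription for any particular system.

```python
# Minimal sketch of an automated data-inventory pass (illustrative only).
# Assumptions: datasets are CSV files under a single shared root, the first
# folder level identifies the owning program/silo, and OWNERS is maintained
# by hand as part of the manual side of the inventory effort.
import json
from datetime import datetime, timezone
from pathlib import Path

DATA_ROOT = Path("/data/shared")  # hypothetical shared data store

OWNERS = {  # hypothetical owner lookup; "TBD" entries flag manual follow-up work
    "safety": {"business_owner": "Office of Safety", "technical_owner": "Data Services"},
}

def inventory_datasets(root: Path) -> list:
    """Walk the shared store and record basic descriptive metadata per dataset."""
    records = []
    for path in root.rglob("*.csv"):
        program = path.relative_to(root).parts[0]
        owner = OWNERS.get(program, {"business_owner": "TBD", "technical_owner": "TBD"})
        info = path.stat()
        records.append({
            "dataset": str(path.relative_to(root)),
            "program": program,
            "size_bytes": info.st_size,
            "last_modified": datetime.fromtimestamp(info.st_mtime, tz=timezone.utc).isoformat(),
            **owner,
        })
    return records

if __name__ == "__main__":
    # Persist the draft inventory so business and technical owners can review it.
    with open("data_inventory.json", "w") as out:
        json.dump(inventory_datasets(DATA_ROOT), out, indent=2)
```

The automated pass only seeds the inventory; resolving the "TBD" owners and validating the entries is the manual, organizational part of the work described above.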

Portfolio Management
Responding to the above challenges requires not only project management skills but also a project planning process that takes into consideration alignment with an organization’s goals and objectives.

As one of my interviewees suggested, the challenge faced in complex “big data” projects has at least as much to do with overall strategy and “portfolio management” as with how individual projects are planned and managed. Effectively designing and governing a portfolio of projects and processes requires not only an understanding of how the portfolio supports (relates to, is aligned with, interacts with) the organization’s objectives but also a rational process for defining project requirements and then governing how the organization’s resources are managed and applied.

Given how pervasive and fundamental data are to an organization’s operation, skill in data science and analytics is necessary but, in many cases, not sufficient to guarantee success. Technical and analytical skills must be accompanied by effective planning, oversight, and management to ensure that the data analysis “spear” is being thrown in the right direction.

Delivering Value Quickly
Ideally a portfolio of projects will support an organization’s strategic plan and the goals or missions the organization is charged with pursuing. We may also need to “get tactical” by delivering value to the customer or client as quickly as possible, perhaps by focusing on better-controlled and better-understood product-centric data early on via a “data lake” approach.

Doing so will be good for the customer and will help create a relationship of trust moving forward. Such a relationship will be needed when complications or uncertainties arise and need to be dealt with.
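As a rough illustration of the “land well-understood product data first” tactic mentioned above, here is a minimal sketch of copying source extracts into a date-partitioned raw zone of a data lake, leaving standardization and transformation for later. The paths, dataset names, and manifest fields are assumptions made up for this example.

```python
# Minimal sketch of landing a product-centric extract in a raw data-lake zone
# (illustrative only; paths and naming conventions are assumed, not prescribed).
import json
import shutil
from datetime import date
from pathlib import Path

LAKE_ROOT = Path("/lake/raw")  # hypothetical raw ("landing") zone

def land_raw_extract(source_file: Path, dataset_name: str) -> Path:
    """Copy a source extract into a date-partitioned folder, untransformed."""
    target_dir = LAKE_ROOT / dataset_name / f"ingest_date={date.today().isoformat()}"
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / source_file.name
    shutil.copy2(source_file, target)  # keep the original bytes; clean up downstream
    # Small sidecar manifest so analysts can trace where the data came from.
    (target_dir / "_manifest.json").write_text(json.dumps({
        "dataset": dataset_name,
        "source": str(source_file),
        "ingested": date.today().isoformat(),
    }, indent=2))
    return target

# Example (hypothetical file): land_raw_extract(Path("exports/orders.csv"), "product_orders")
```

Keeping the landing step this simple is what lets a short, tactically focused project show value early while the broader standards and governance questions are worked out in parallel.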

In organizations that are not historically “data centric” or in organizations where management and staff have a low level of data literacy, an early demonstration of value from data analysis is especially important. An agile approach to project management, accompanied by openness, transparency, and collaboration, will help to accomplish this.

Unfortunately, given the usual pressures of time and budget, challenges such as those identified above often cannot be addressed effectively in tactically focused, short-term projects. Such challenges can be complex or rooted in how the organization has traditionally been structured and managed.

Still, it’s not unusual for a tactically-focused “sprint” project, even while delivering an effective model or other deliverable, to uncover the need for a more global (or strategic) approach to managing data, metadata, data security, privacy, or data quality.

Balancing Tactics and Strategy
When focusing on delivery of useful data-related deliverables it always pays to keep two questions in mind:

  1. What needs to be done immediately to make data useful?
  2. What does this tell us about what needs to be done more globally in order to maintain and increase data usefulness?

Attention to enterprise-level data strategy while delivering useful results in the short term has implications beyond what is being attempted in an individual project’s scope. Treating data as an enterprise resource may even require changes to how the enterprise itself is managed. As we all know, it’s not unusual for change to be resisted.

An effective enterprise-level data strategy will be one that balances the management of a portfolio of individual data-intensive “agile” projects with parallel development of an upgraded enterprise data strategy. Doing one without the other could have negative consequences, for example:

  1. Focusing only on a narrowly defined, data-intensive analytics project may generate immediate value through frequent useful deliverables but may not address underlying technical process issues that affect long-term efficiency and sustainability.
  2. Focusing only on an enterprise data strategy without delivering tactical benefits reduces the likelihood that less data-savvy managers will understand the “big picture” down the road.

As experienced project managers know, concentrating on “quick and dirty” or “low hanging fruit” deliverables when under the gun to deliver value to a client can generate short-term benefits. The same approach, however, may actually increase costs over time if strategic data management issues related to data standards or quality are repeatedly kicked “down the road.” Also, delivering a “strategy” without engaging users in the development of real-world analytical deliverables might mean that strategically important recommendations end up gathering dust on a shelf somewhere.

Communication Strategy
As experienced project managers understand all too well, one of the most important elements in effective project management is communication:

  • Communication among project staff
  • Communication with the client
  • Communication with stakeholders

In the case of a big data or data-intensive project, even one that delivers incremental value to the customer by focusing initially on specific or narrowly targeted goals, we want communications about project activities, especially among key stakeholders, to address both tactical and strategic objectives.

This may require accommodating a variety of communication styles as well as different levels of data and analytical literacy, especially when both business-focused and technology- or analytics-focused staff are involved. But if we follow this balanced approach we will:

  1. Deliver a useful project.
  2. Develop a trusted relationship with the client.
  3. Build the foundation for a realistic, sustainable enterprise data strategy going forward.

Summary
In summary, how a data-intensive project is planned must take into account both short- and long-term goals. The planning process must be collaborative and, even if led by the organization’s IT department (not an unusual situation), it must involve business or operating units from the start to ensure success.

I’ll be turning my attention to this planning process in future posts. If you’re interested in learning more about this process please let me know.

Notes:

[1] Copyright (c) 2015 by Dennis D. McDonald, Ph.D. Dennis is an independent Washington DC area management consultant. His services include preproposal research and analysis, proposal development and costing, marketing and sales support, project and program management, project plan development, requirements analysis, and strategic planning. Reach him by phone at 703-402-7382 or by email at [email protected]. An earlier version of this post was published at http://www.ddmcd.com/spear.html and distributed at the Dec. 8, 2015 ATARC Federal Big Data Summit in Washington, DC.

[2] Thanks are due the following for sharing their thoughts with me: Aldo Bello, Kirk Borne, Clive Boulton, Doug Brockway, Ana Ferreras, Keith Gates, Douglas Glenn, Jennifer Goodwin, Jason Hare, Christina Ho, Randy Howard, Catherine Ives, Ian Kalin, Michael Kaplan, Jim Lola, David McClure, Jim McLennan, Trevor Monroe, Brian Pagels, John Parkinson, Dan Ruggles, Nelson Searles, Sankar Subramanian, and Tom Suder.
