The DevOps Database | Part 2

Applying Systems Thinking to Database Change Management

In my first post in this series, I discussed the underpinning principles of all DevOps patterns as eloquently stated by Gene Kim, author of "The Phoenix Project."  In this post I'd like to dig a little deeper into The First Way.  As a refresher:

The First Way: Systems Thinking - This Way stresses the performance of the entire system of value delivery. Instead of becoming laser-focused on the part of the process for which an individual or team is responsible, the individual or team works to understand the entire process, from requirements generation to customer delivery. The goal is to eliminate the delivery impediments that arise when a project transitions from one isolated silo to another. Understanding the entire system allows business, development, and operations to work toward a common goal in a consistent manner.

When we started Datical, our first step was to perform extensive market validation. We quickly learned that database schema management was going to be a tough nut to crack. In the scores of conversations we had with people across the ALM spectrum, we learned that the process for managing and updating the database schema that supports an application was at best murky and at worst a black box. So how do we elevate a process owned and understood by a few people in an organization to the level of visibility required to understand it as part of a larger system?

What follows is a list of concepts pertaining to Systems Thinking that we've rallied around in building our database change management solution, Datical DB. Instead of a black box, database change management can become a transparent and flexible part of your value delivery system.

Start With Reality
When beginning a new project based on previous development, don't rely on a stack of SQL scripts on a file server or even in a source code control system. As you know, databases evolve over time, and sometimes out-of-process changes happen. When a database schema is modified to resolve an error condition or performance degradation, those alterations are usually handled in a support ticketing system, and you can never be certain they were added to the stack of scripts used to build out a fresh environment. In light of this, any database change management solution should start by generating a baseline from the working system: your production schema. This ensures that design and development activity take into account not only the schema objects generated in Dev, but also those that originated at other stops in the system.
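As a rough sketch of what "starting from reality" looks like in practice, the queries below walk the standard information_schema catalog of a live database to enumerate what actually exists before a baseline is generated. The schema filter is illustrative, and this is not Datical DB's baselining mechanism; it simply shows the kind of inventory a baseline must capture.

```sql
-- Enumerate the objects that actually exist in production so the
-- baseline reflects reality, not just the scripts checked into
-- source control. Uses the standard information_schema catalog.
SELECT table_schema,
       table_name,
       table_type
FROM   information_schema.tables
WHERE  table_schema NOT IN ('pg_catalog', 'information_schema')  -- system schemas vary by vendor
ORDER  BY table_schema, table_name;

-- Capture column-level detail so out-of-process changes (e.g., a
-- column widened during an emergency fix) make it into the baseline.
SELECT table_name,
       column_name,
       data_type,
       is_nullable
FROM   information_schema.columns
WHERE  table_schema = 'public'  -- schema name is illustrative
ORDER  BY table_name, ordinal_position;
```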

Don't Script! Model!
To keep this blog post short(er), I'll point you to a post I wrote a few months ago on modeling vs. scripting.
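For readers who don't follow the link, here is a compressed illustration of the distinction in plain SQL. The table and column names are hypothetical, and this is not Datical DB's model format; the point is that a script records the steps taken, while a model records the desired end state.

```sql
-- Scripting: an ordered pile of imperative steps. To know what the
-- table looks like today, you must replay every script ever written.
ALTER TABLE customer ADD COLUMN email VARCHAR(100);
ALTER TABLE customer ALTER COLUMN email TYPE VARCHAR(255);  -- PostgreSQL syntax

-- Modeling: a single declarative definition of the desired end state.
-- Tooling compares this model to a live database and derives the
-- ALTER statements needed to converge, instead of you hand-writing them.
CREATE TABLE customer (
    customer_id  BIGINT PRIMARY KEY,
    email        VARCHAR(255)
);
```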

Version Twice, Deploy Once
The first level of versioning any database change management tool should provide is versioning of the gold standard: the scripts or model used to build a fresh database instance for your application. This yields several benefits. You can track how your schema has evolved over time and across releases; you can tightly couple a schema definition to the version of the application it supports, using the same branch/tag/merge workflow you use for your application code; and you can quickly stand up a new database instance that you know is correct for any released version or experimental branch you are working in. The second level of versioning takes place in each database instance. Tracking the version of the schema deployed on a specific instance makes deployment and troubleshooting much easier. Will this application build work with the schema on this instance? What changes need to be applied to this database to catch it up to the latest version? Were the right changes applied to this test database to validate a closed defect? If you track the individual changes applied to a database and the impetus for those changes, these questions can be answered quickly and easily.
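To make the second level of versioning concrete, below is a minimal sketch of a per-instance tracking table in plain SQL. The table and column names are illustrative, not Datical DB's actual bookkeeping schema; tools in this space maintain an equivalent table automatically.

```sql
-- One row per change applied to this specific instance: what ran,
-- when, by whom, which application release it belongs to, and why.
CREATE TABLE schema_changelog (
    change_id    VARCHAR(100) NOT NULL,
    applied_at   TIMESTAMP    NOT NULL,
    applied_by   VARCHAR(100) NOT NULL,
    app_version  VARCHAR(50)  NOT NULL,  -- application release this change supports
    ticket_ref   VARCHAR(50),            -- impetus for the change, if any
    PRIMARY KEY (change_id)
);

-- "Will this build work with the schema on this instance?" becomes
-- a query instead of a guess: list releases by most recent deploy.
SELECT app_version,
       MAX(applied_at) AS last_deploy
FROM   schema_changelog
GROUP  BY app_version
ORDER  BY last_deploy DESC;
```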

Unify Your Modes of Delivery
We've found that a lot of uncertainty is generated when database updates are handled by several individuals using their own tools and methods to effect the required changes. To remove this uncertainty, all database changes must be executed using the same tools and processes across departments and individuals. The tricky part: a continuous integration system has different requirements than a developer working iteratively or a DBA processing a batch of changes in a headless environment during a maintenance window. To unify your database change process, the solution you use should be accessible to all of these individuals while maintaining the consistency of deployment activities. That's why Datical DB provides a rich GUI experience that helps developers craft and deploy changes in dev environments; tight integrations with popular build and release automation frameworks that preserve the frameworks' workflows while providing Datical DB functionality; and a command line interface that allows users in headless environments to deploy changes in exactly the same manner as the GUI or a third-party integration would. By unifying the modes of delivery, you are constantly testing your release practices. By the time the production push rolls around, your system works.

More Stories By Pete Pickerill

Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin’s technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high-profile acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful startups and the companies that acquired them, including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by Symantec, Inc.) and Phurnace Software.
