The DevOps Database | Part 2

Applying Systems Thinking to Database Change Management

In my first post in this series, I discussed the underpinning principles of all DevOps patterns as eloquently stated by Gene Kim, author of "The Phoenix Project."  In this post I'd like to dig a little deeper into The First Way.  As a refresher:

The First Way: Systems Thinking - This Way stresses the performance of the entire system of value delivery. Instead of becoming laser-focused on the part of the process for which an individual or team is responsible, the individual or team works to understand the entire process, from requirements generation to customer delivery. The goal is to eliminate the delivery impediments that arise when a project transitions from one isolated silo to another. Understanding the entire system allows business, development, and operations to work towards a common goal in a consistent manner.

When we started Datical, our first step was to perform extensive market validation. We quickly learned that database schema management was going to be a tough nut to crack. In the scores of conversations we had with people across the ALM spectrum, we learned that the process for managing and updating the database schema that supports an application was at best murky and at worst a black box. So how do we elevate a process owned and understood by only a few people in an organization to the level of visibility required to treat it as part of a larger system?

What follows is a list of concepts pertaining to Systems Thinking that we've rallied around in building our database change management solution, Datical DB. Instead of a black box, database change management can become a transparent and flexible part of your value delivery system.

Start With Reality
When beginning a new project based on previous development, don't rely on a stack of SQL scripts on a file server or even in a source code control system. As you know, databases evolve over time, and sometimes out-of-process changes happen. When a database schema is modified to resolve an error condition or performance degradation, those alterations are usually handled in a support ticketing system, and you can never be certain they were added to the stack of scripts used to build out a fresh environment. In light of this, any database change management solution should start by generating a baseline from the working system: your production schema. This ensures that design and development activity take into account not only the schema objects generated in Dev, but also those that originated at other stops in the system.
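
To make the idea concrete, here is a minimal sketch of "starting with reality" using SQLAlchemy reflection to capture whatever schema actually exists in production. The connection URL and output path are hypothetical placeholders, and this is an illustration of the concept rather than how Datical DB performs its baselining.

    # Baseline a model from the schema that is actually running in production,
    # rather than from a pile of accumulated scripts.
    from sqlalchemy import MetaData, create_engine
    from sqlalchemy.schema import CreateTable

    PROD_URL = "postgresql://readonly@prod-db/appdb"  # hypothetical read-only connection

    def capture_baseline(url: str, out_path: str) -> None:
        engine = create_engine(url)
        metadata = MetaData()
        metadata.reflect(bind=engine)  # introspect tables, columns, constraints as they exist today

        with open(out_path, "w") as f:
            for table in metadata.sorted_tables:  # dependency order, parents first
                f.write(str(CreateTable(table).compile(engine)).strip() + ";\n\n")

    if __name__ == "__main__":
        capture_baseline(PROD_URL, "baseline_schema.sql")

The point is simply that the baseline is derived from the running system, so out-of-process changes are captured instead of silently dropped.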

Don't Script! Model!
To keep this blog post short(er), I'll point you to a blog post I wrote a few months ago on modeling vs. scripting.
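
As a quick taste of the distinction (a toy illustration only, not Datical DB's actual model format): with a model you declare the desired end state and let tooling compute the change, instead of hand-writing and ordering ALTER scripts yourself.

    # Toy illustration: the model declares the desired state;
    # the tool derives the ALTER statements. Names are hypothetical.
    desired_columns = {"id": "BIGINT", "email": "VARCHAR(255)", "created_at": "TIMESTAMP"}
    current_columns = {"id": "BIGINT", "email": "VARCHAR(255)"}

    def diff_to_sql(table: str, desired: dict, current: dict) -> list[str]:
        """Compute the ALTER statements needed to move `current` to `desired`."""
        statements = []
        for name, col_type in desired.items():
            if name not in current:
                statements.append(f"ALTER TABLE {table} ADD COLUMN {name} {col_type};")
        for name in current:
            if name not in desired:
                statements.append(f"ALTER TABLE {table} DROP COLUMN {name};")
        return statements

    print(diff_to_sql("users", desired_columns, current_columns))
    # ['ALTER TABLE users ADD COLUMN created_at TIMESTAMP;']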

Version Twice, Deploy Once
The first level of versioning any database change management tool should provide is versioning of the gold standard. Versioning the scripts or model that you use to build a fresh database instance for your application provides a ton of benefits: you can track how your schema has evolved over time and across releases; you can tightly couple a schema definition to the version of the application it supports, using the same branch/tag/merge workflow you use for your application code; and you can quickly stand up a new database instance that you know is correct for any released version or experimental branch you are working in. The second level of versioning takes place in each database instance. Tracking the version of the schema deployed on a specific instance makes deployment and troubleshooting much easier. Will this application build work with the schema on this instance? What changes need to be applied to this database to catch it up to the latest version? Were the right changes applied to this test database to validate a closed defect? If you are tracking the individual changes applied to a database and the impetus for those changes, these questions can be answered quickly and easily.
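
To show what the second level of versioning looks like in spirit, here is a minimal sketch of per-instance change tracking. The table and column names are illustrative, not Datical DB's actual bookkeeping, and SQLite stands in for whatever database you run.

    # Minimal sketch of per-instance schema change tracking.
    import sqlite3
    from datetime import datetime, timezone

    conn = sqlite3.connect("app.db")
    conn.execute("""
        CREATE TABLE IF NOT EXISTS schema_changelog (
            change_id   TEXT PRIMARY KEY,   -- e.g. '4.2.0-add-users-email-index'
            applied_at  TEXT NOT NULL,      -- when the change landed on THIS instance
            ticket      TEXT                -- the impetus: defect, story, or support ticket
        )
    """)

    def record_change(change_id: str, ticket: str) -> None:
        conn.execute(
            "INSERT INTO schema_changelog (change_id, applied_at, ticket) VALUES (?, ?, ?)",
            (change_id, datetime.now(timezone.utc).isoformat(), ticket),
        )
        conn.commit()

    def pending_changes(all_changes: list[str]) -> list[str]:
        """Which changes from the versioned model have not yet reached this instance?"""
        applied = {row[0] for row in conn.execute("SELECT change_id FROM schema_changelog")}
        return [c for c in all_changes if c not in applied]

With a record like this in every instance, "what version is this database at?" and "what does it still need?" become simple queries instead of archaeology.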

Unify Your Modes of Delivery
We've found that a lot of uncertainty was generated when database updates were handled by several individuals using their own tools and methods to effect the required changes. To remove this uncertainty, all database changes must be executed using the same tools and processes across departments and individuals. The tricky part: a continuous integration system has different requirements than a developer working iteratively or a DBA processing a batch of changes in a headless environment during a maintenance window. In order to unify your database change process, the solution you use should be accessible to all of these individuals while maintaining the consistency of deployment activities. That's why Datical DB provides a rich GUI experience for developers that helps them craft and deploy changes in dev environments; tight integrations with popular build and release automation frameworks that preserve the frameworks' workflows while providing Datical DB functionality; and a command line interface that allows users in headless environments to deploy changes in the exact same manner that the GUI or a 3rd-party integration would. By unifying the modes of delivery, you are constantly testing your release practices. By the time the production push rolls around, your system works.
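
The underlying pattern, sketched below in hypothetical form (the function and argument names are mine, not Datical DB's), is that every mode of delivery is a thin wrapper around one shared deployment routine, so the GUI, the CI integration, and the headless CLI all exercise the same code path.

    # One deployment routine, many entry points: GUI, CI integration, and CLI
    # all call deploy(); only the wrapper differs. Names are illustrative.
    import argparse

    def deploy(target: str, version: str, dry_run: bool = False) -> None:
        """Apply all changes needed to bring `target` up to `version`."""
        # ... look up pending changes, apply them, record them in the changelog ...
        action = "Would apply" if dry_run else "Applying"
        print(f"{action} changes for {version} to {target}")

    def main() -> None:
        # Headless entry point, e.g. a DBA in a maintenance window or a CI job.
        parser = argparse.ArgumentParser(description="Deploy database changes")
        parser.add_argument("target", help="database connection alias, e.g. 'prod'")
        parser.add_argument("version", help="application/schema version to deploy")
        parser.add_argument("--dry-run", action="store_true")
        args = parser.parse_args()
        deploy(args.target, args.version, dry_run=args.dry_run)

    if __name__ == "__main__":
        main()

Because every environment runs the same routine, each dev or test deployment is also a rehearsal of the production deployment.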

More Stories By Pete Pickerill

Pete Pickerill is Vice President of Products and Co-founder of Datical. Pete is a software industry veteran who has built his career in Austin's technology sector. Prior to co-founding Datical, he was employee number one at Phurnace Software and helped lead the company to a high-profile acquisition by BMC Software, Inc. Pete has spent the majority of his career in successful startups and the companies that acquired them, including Loop One (acquired by NeoPost Solutions), WholeSecurity (acquired by Symantec, Inc.) and Phurnace Software.
