Subversion Disaster Recovery

Disaster Recovery is something most people only think about right after a natural disaster. Business continuity (Latin for “Will you go out of business if your office floods?”) is usually an afterthought. The truth is that most of us are too busy to spend a lot of energy on the hypotheticals of disaster recovery and business continuity planning. While we often think of our workplaces as impervious, the truth is that computers are fragile and buildings are damaged frequently. Flooding, fire, theft, and tornadoes are all common enough that every development team should have a contingency plan. But as we enter the traditional peak of hurricane season in the Atlantic, let’s take a quick look at the recoverability of Subversion and Git repositories in case of catastrophe.

Hurricane Sandy Aftermath in New York City: Is your server safe?
Image used under Creative Commons License from David Shankbone.

If you’re a Git user, you probably already know that Git is a distributed version control system (DVCS) that is designed without a single point of failure. Every clone of a Git repository contains all the information it needs to become the “master” at any time. So if your main Git server is destroyed in a catastrophic event, you can recover completely using any developer’s clone of the repository. Its peer-to-peer nature is one of the major differences between Git and Subversion. I wrote recently about some factors influencing how to choose between Git and Subversion.
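To make that recovery path concrete, here’s a minimal sketch of rebuilding a lost central Git repository from a developer’s clone. It simulates the whole scenario with local directories under /tmp; in a real recovery you would substitute your new server’s SSH or HTTPS URL for the `new-central.git` path.

```shell
set -e
rm -rf /tmp/git-dr-demo && mkdir /tmp/git-dr-demo && cd /tmp/git-dr-demo

# Stand-ins for the original central repo and one developer's clone of it:
git init --bare central.git
git clone central.git dev-clone
cd dev-clone
git config user.email "dev@example.com"   # demo identity only
git config user.name "Dev"
git commit --allow-empty -m "initial work"
git push origin HEAD
cd ..

# Disaster strikes: the central server is gone.
rm -rf central.git

# Recovery: provision a fresh bare repo and mirror-push the clone's
# complete set of refs and history into it.
git init --bare new-central.git
git -C dev-clone push --mirror ../new-central.git

# The new central repository now has the full history:
git -C new-central.git log --oneline --all
```

The key command is `git push --mirror`, which transfers every branch and tag from the clone, so nothing that the developer had fetched is lost.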

Subversion is a centralized version control system, so clients don’t carry the entire history of the repository. That means the master repository needs to be protected to ensure that you don’t lose any code if the master is destroyed. One easy way to do that is to use ProjectLocker Subversion. In addition to our redundant primary storage, repositories are backed up to redundant storage over 2,000 miles away from the primary servers. So using ProjectLocker gives you a high degree of disaster protection.
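If you run your own Subversion server, the standard building blocks for off-site backups are `svnadmin dump` and `svnadmin hotcopy`. The sketch below demonstrates both against a throwaway repository under /tmp; the paths are illustrative, and in practice the dump file or hotcopy would be shipped to remote storage.

```shell
set -e
rm -rf /tmp/svn-backup-demo && mkdir /tmp/svn-backup-demo && cd /tmp/svn-backup-demo

svnadmin create project                   # stand-in for your real repository

# A portable dump of every revision, suitable for off-site archiving:
svnadmin dump project > project.dump

# A byte-for-byte copy that is itself an immediately usable repository:
svnadmin hotcopy project project-backup

# Restoring from the dump into a fresh repository:
svnadmin create restored
svnadmin load restored < project.dump
```

Dumps are version-portable but must be loaded before use; hotcopies are ready to serve immediately but are tied to the repository format that produced them.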

But you may be surprised to learn that Subversion has built-in features that can let you easily build in provisions for business continuity in the case of a technical disaster. With Subversion replication, you can configure a Subversion repository to replicate to a mirror repository. The mirror will automatically receive the entire history of the repository and all the information needed to reconstruct the repository. So if your company has multiple offices, you can easily mirror your primary Subversion repositories to a backup server in a different location.
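The replication feature described above is driven by the `svnsync` tool that ships with Subversion. Here’s a minimal sketch of setting up a mirror, using local `file://` URLs so everything runs on one machine; a real deployment would point at the backup server’s network URL instead. One non-obvious requirement: the mirror must have a `pre-revprop-change` hook that permits revision-property changes, because that is how `svnsync` records its bookkeeping.

```shell
set -e
rm -rf /tmp/svnsync-demo && mkdir /tmp/svnsync-demo && cd /tmp/svnsync-demo

svnadmin create primary    # stand-in for your real repository
svnadmin create mirror

# svnsync stores its state in revision properties on the mirror,
# so the mirror must allow revprop changes:
printf '#!/bin/sh\nexit 0\n' > mirror/hooks/pre-revprop-change
chmod +x mirror/hooks/pre-revprop-change

# One-time setup: tell the mirror which repository it tracks.
svnsync initialize file:///tmp/svnsync-demo/mirror file:///tmp/svnsync-demo/primary

# Copy over all revisions committed so far (re-run any time to catch up):
svnsync synchronize file:///tmp/svnsync-demo/mirror
```

After initialization, `svnsync synchronize` can be re-run as often as you like; it only transfers revisions the mirror doesn’t yet have.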

You can even use ProjectLocker as a business continuity mirror for an in-house Subversion repository. If you have an in-house Subversion server, you can improve your disaster recovery planning by configuring your repository to replicate to ProjectLocker. Your mirrored repository will automatically benefit from all of the data protection, backup, and security we apply to primary repositories. And you’ll have a quick way to recover in case a disaster incapacitates your office server. Reach out to us if you’d like to set up ProjectLocker as a mirror of your in-house Subversion repository.
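To keep such a mirror continuously up to date rather than syncing by hand, a common pattern is to trigger `svnsync` from the primary repository’s `post-commit` hook. The sketch below assumes an already-initialized mirror; the hook path and mirror URL are placeholders for whatever your server and hosting provider actually use.

```shell
#!/bin/sh
# Illustrative: /srv/svn/project/hooks/post-commit on the primary server.
# Push each new revision to the off-site mirror as soon as it is committed.
# Running it in the background keeps commits from blocking on the network.
svnsync synchronize https://mirror.example.com/svn/project &
```

Because `svnsync` is incremental and idempotent, a sync that fails (say, during a network outage) is harmlessly retried the next time the hook fires.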

What else do you do for disaster planning for your development environment?


More Stories By Damon Young

Damon Young is Director of Sales at ProjectLocker.com. ProjectLocker was founded in 2003 to provide on-demand tools for software developers. Guided by the simple mission of helping companies build better software, ProjectLocker's services have expanded to include services for the complete lifecycle of software projects, from requirements documentation to build and test automation. ProjectLocker serves companies from startups to Fortune 1000 multinationals.
