Is Service Virtualization’s “Shift Left” a Burden to Developers?

Service Virtualization can be both a blessing and a curse for developers

Service virtualization undeniably benefits the development process, but it can be both a blessing and a curse for developers. Minimizing the burden that "shift left" can place on developers is key to achieving maximum acceleration of delivery cycles.

Service Virtualization's Shift Left Benefits the Development Process
Service virtualization's potential to "shift left" testing is relatively well accepted throughout the industry. With simulated test environments eliminating constraints that commonly delay or curtail testing efforts, testing can begin earlier. And, as we all know by now, the earlier you find a defect, the faster, easier, and cheaper it is to fix. Beyond that, service virtualization allows teams to test more extensively and more frequently (e.g., for continuous regression testing).
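To make the idea concrete, here is a minimal sketch of what a simulated dependency can look like: a throwaway HTTP endpoint, written in plain Java, that returns a canned response in place of a still-evolving downstream service so consumer-side tests can start before the real thing is staged. This is an illustration only, not any particular vendor's tooling; the endpoint path, port, and payload are hypothetical.

import com.sun.net.httpserver.HttpServer;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal simulated dependency: stands in for a downstream service that is
// still being built, so testing against it can begin earlier.
public class SimulatedQuoteService {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8089), 0);
        server.createContext("/quotes", exchange -> {
            // Canned JSON response standing in for the behavior the real service will eventually provide
            byte[] body = "{\"symbol\":\"ACME\",\"price\":42.0}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().add("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            exchange.getResponseBody().write(body);
            exchange.close();
        });
        server.start();
        System.out.println("Simulated service listening on http://localhost:8089/quotes");
    }
}

A full service virtualization solution goes far beyond this (recorded behavior, varied test data, performance profiles), but even a sketch like this shows why a team no longer has to wait for the real service to be completed or staged.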

Service virtualization's shift left certainly yields significant benefits to the development process in terms of accelerating time to market, reducing risks, and lowering the costs associated with dev/test environment management. However, its impact on the actual development team is often overlooked.

But Does Service Virtualization Burden Developers?
In many respects, service virtualization is a gift to developers. First and foremost, it means their development and testing tasks aren't stalled because they're waiting for still-evolving components to be completed and/or staged test environments to be available. It allows them to create and modify "disposable" test environments on demand...without having to rely on someone else each time they need to tweak an existing configuration or access a new one. It relieves them from the minutiae involved in developing and managing effective stubs or mocks. It also enables them to access much more sophisticated behavior than stubbing or mocking can provide.
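For contrast, here is the kind of hand-rolled stub that teams otherwise write and maintain themselves. The interface and class names are hypothetical; the point is that every behavior (latency, error cases, happy path) has to be coded and kept up to date by hand, which is exactly the minutiae mentioned above.

// Hand-coded stub: each behavior must be written and maintained manually.
interface PaymentGateway {
    String authorize(String accountId, double amount);
}

class StubPaymentGateway implements PaymentGateway {
    private final long simulatedLatencyMs;

    StubPaymentGateway(long simulatedLatencyMs) {
        this.simulatedLatencyMs = simulatedLatencyMs;
    }

    @Override
    public String authorize(String accountId, double amount) {
        try {
            Thread.sleep(simulatedLatencyMs); // crude stand-in for real network latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        // Only the cases someone remembered to code are available to tests
        return amount <= 0 ? "DECLINED" : "APPROVED";
    }
}

A virtual service can expose richer, more realistic behavior without this per-case hand coding, which is what makes it more than a glorified stub.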

Yet, this "shift left" is not necessarily a panacea from the developers' perspective. When you shift testing left, you also hasten the point at which QA is discovering and reporting the most defects. This means that instead of defect reports peaking during the testing phase, they peak during the development phase, which is when developers are already scrambling to implement the functionality needed to meet their development deadlines.

[Figure: Without Service Virtualization]

[Figure: With Service Virtualization - The Shift Left]

Getting barraged with defect reports during this critical phase is likely to cut into the development team's time and focus on creating the innovative functionality that (you're hoping) will set your organization apart from the competition.

To understand what this shift left must feel like to developers, assume you're expecting houseguests to arrive on Sunday evening, giving you an entire weekend to tidy up and prepare. Now, imagine that on Thursday evening they call to say they'll be arriving Friday evening...and you have a major work deadline Friday afternoon.

So what do you do? Obviously, you don't want to throw out the baby with the bathwater here. After all, service virtualization stands to deliver remarkable benefits and provide tremendous value to your organization as a whole.

Shift Left + Compress
The good news is that service virtualization doesn't have to place extra burdens on development. The trick is to not only shift testing left, but also compress the defect curve. In other words, reduce the overall error injection rate so there are fewer defects to find and fix.

[Figure: Shift Left + Compress]

As you can see, this "shift left + compress" strategy avoids taxing development at its most critical juncture. Even though the defect curve peaks earlier, developers are not burdened with an increase in reported defects during construction time because the peak is lower. Moreover, because there are fewer defects to find and fix across the SDLC, the team is able to complete the entire iteration significantly earlier.

To return to our analogy, this is akin to your houseguests arriving early...but now they're planning to stay in a hotel. Because you can focus on meeting your work deadline without worrying about cleaning, shopping, etc., the early arrival isn't nearly as stressful.

How do you reduce the overall error injection rate? Through Development Testing: the synchronized application of a broad spectrum of automated defect prevention and defect detection strategies in a way that reduces development risks, time, and costs. Depending on the organization's expectations and priorities, Development Testing might include static analysis, peer code reviews, unit testing, runtime error detection, and other software verification practices.
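As one illustrative slice of Development Testing, a fast unit test like the sketch below catches a defect at construction time, before it ever shows up in QA's defect reports. This assumes JUnit 5 on the classpath, and the class and method names are hypothetical.

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import org.junit.jupiter.api.Test;

// Code under test: a small pricing rule.
class DiscountCalculator {
    double priceAfterDiscount(double orderTotal) {
        if (orderTotal < 0) {
            throw new IllegalArgumentException("Order total cannot be negative");
        }
        // 10% discount on orders of 100.0 or more
        return orderTotal >= 100.0 ? orderTotal * 0.9 : orderTotal;
    }
}

// Unit tests run on every build, so a broken rule surfaces immediately
// instead of being injected into the code that reaches QA.
class DiscountCalculatorTest {
    @Test
    void appliesTenPercentDiscountToLargeOrders() {
        assertEquals(90.0, new DiscountCalculator().priceAfterDiscount(100.0), 0.001);
    }

    @Test
    void rejectsNegativeOrderTotals() {
        assertThrows(IllegalArgumentException.class,
                () -> new DiscountCalculator().priceAfterDiscount(-5.0));
    }
}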

But isn't this just placing a different burden on development? Not if it's implemented smartly and unobtrusively. In fact, Development Testing can actually improve productivity while reducing risks. But that's the topic for another blog...

Service Virtualization ROI Paper

Curious about what ROI your organization can achieve with service virtualization? Read Parasoft's new 5-page Service Virtualization ROI white paper to learn about the business drivers behind service virtualization purchase decisions, as well as the substantial opportunities for ROI in terms of OpEx reduction, CapEx reduction, risk reduction, and incremental top-line revenue.

More Stories By Cynthia Dunlop

Cynthia Dunlop is the lead technical writer for Parasoft.
