
DevOps, Automation, and Mid-Market Companies

When you think about the largest and most dynamic networks in the world, topics like automation are a no-brainer

The overall networking landscape has been going through a fairly deliberate shift over the past couple of years. Where we used to talk CapEx, we are now talking OpEx. Where we used to talk features, we are now talking about workflows. This change in industry dialogue mirrors the rise of trends like SDN and DevOps. I have been a huge fan of automation in general and DevOps in particular for many years now. But, as an industry, are we leaving people behind unintentionally?

When you think about the largest and most dynamic networks in the world (typically characterized as either service providers or web-scale companies), topics like automation are a no-brainer. The sheer number of devices in the networks that these companies manage demands something more than keying in changes manually. And for these types of companies, the network is not just an enabler – it is a central part of their business. Without the network, there is no business. It’s not terribly surprising that these companies hire small armies of capable engineers and developers to make everything function smoothly.

In these environments, automation is not a nice-to-have. It’s closer to food and water than it is to sports and entertainment. Accordingly, their interest in technologies that support automation is high. Their capability in putting automation tools to use is high. And if their abilities do not match their requirements, they open up their wallets to make sure they get there (think: OSS/BSS).

In networking, there is a prevailing belief that what is good for these complex environments will eventually make its way into smaller, less complex networks. It might take time, but the technologies and best practices that the most advanced companies employ will eventually trickle down to everyone else. It’s sort of the networking equivalent of Reaganomics.

But is this necessarily true?

First, let me reiterate that I am a huge advocate for automation and DevOps. But these capabilities might not be universally required. Automation is most important in environments where either the volume or the rate of change is high enough to justify the effort. If the network is relatively static, changing primarily to swap out old gear for new, functionally equivalent gear, it might not be necessary to automate much at all. Or if network changes are tied to incremental growth, it might not make sense to automate very much.
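To put rough numbers on that intuition, here is a hypothetical back-of-the-envelope payback calculation in Python. Every figure below is made up for illustration; plug in your own:

```python
# Hypothetical "is the juice worth the squeeze" math for a near-static network.
# All inputs are illustrative assumptions, not measured values.
hours_to_build_automation = 80     # effort to script, test, and document a workflow
minutes_saved_per_change = 20      # manual time eliminated by each automated change
changes_per_month = 4              # a relatively static network

hours_saved_per_month = minutes_saved_per_change * changes_per_month / 60
months_to_break_even = hours_to_build_automation / hours_saved_per_month
print(f"Break-even in roughly {months_to_break_even:.0f} months")  # ~60 months here
```

With those inputs, the automation project pays for itself in about five years. Crank the change rate up to web-scale levels and the answer flips almost immediately, which is exactly the point: the math, not the ideology, should drive the decision.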

Automation enthusiasts (myself included) will likely react somewhat viscerally to the idea that automation isn’t necessary. “But even in these cases, automation is useful!” Certainly, it is useful. But what if your IT team lacks the expertise to automate all the things? What then? Sure, you can change the team up, but is it worth the effort?

And even if it is worth the effort, how far along the automation path will most companies need to go? It could be that simple shell scripts are more than enough to manage the rate of change for some companies. Full-blown DevOps would be like bringing a cruise missile to a water gun fight.
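For a sense of what that low end of the spectrum looks like, here is a minimal Python sketch of a “simple script” that pushes the same handful of commands to a short, static device list over SSH using paramiko. The hostnames, credentials, and commands are hypothetical placeholders, and this is an illustration rather than a recommended practice:

```python
# Minimal sketch: run the same commands on a small, fixed list of devices.
# Hostnames, credentials, and commands are hypothetical placeholders.
import paramiko

DEVICES = ["switch01.example.net", "switch02.example.net"]
COMMANDS = ["show version"]  # stand-in for whatever change is being made

def run(host, commands):
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username="admin", password="secret")  # use SSH keys in practice
    for cmd in commands:
        _, stdout, _ = client.exec_command(cmd)
        print(f"{host}: {stdout.read().decode().strip()}")
    client.close()

for device in DEVICES:
    run(device, COMMANDS)
```

For a network of a dozen devices that changes a few times a quarter, something this small may be all the automation that is ever needed.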

In saying this, I am not trying to suggest that automation or DevOps are not important. Rather, the tools we associate with these are just that: tools. They need to be applied thoughtfully and where it makes sense. Vendors that build these tools and then try to push them too far down into the market will find that the demand for cruise missiles drops off pretty precipitously after the top-tier companies.

Even smaller-scale infrastructure does require workflows, though. The trick is in packaging the tools so that they are right-sized for the problems they are addressing.

This obviously starts with discarding the notion that workflows are common across all sizes of networks. That is simply not true. The reason there is pushback when people say the future of network engineering is programming is that, for many people, it is not yet a foregone conclusion that full-blown automation is worth the effort.

For these people, the juice isn’t worth the squeeze.

The conclusion to draw here is not that automation is not a good thing. It’s that automation packaged as a complex DIY project isn’t always the right fit. Not everyone wants to do it themselves. At home, it turns out I am capable of repainting a room, but it just isn’t worth my time, so I hire a professional. In a network, people might be fully capable of automating policy provisioning and still find that it isn’t worth doing because policy for them just isn’t that complex.

What vendors ought to be doing is packaging their workflow optimizations in a way that is far easier to consume. Rather than building scaffolding around the network to handle management, it might make sense to make the management itself much more intuitive and more a core part of the way devices are architected.

This might sound like a brain-dead statement, but consider that most networking devices are designed by people who do not run networks. And even worse, the workflows that dictate how things are used are frequently the last thing designed. If the mid-market and below are to get the advantages of the automation capabilities that the big guys are driving, vendors will need to design workflows explicitly for broad adoption.

If we really want to make the juice worth the squeeze, we need to make the squeeze a lot less painful. We need to move beyond automated networking closer to intuitive networking.

[Today’s fun fact: Lake Nicaragua boasts the only freshwater sharks in the entire world. I would be very motivated not to fall down while water skiing.]

The post DevOps, automation, and mid-market companies appeared first on Plexxi.

More Stories By Michael Bushong

The best marketing efforts pair deep technology understanding with a highly approachable means of communicating. Plexxi's Vice President of Marketing Michael Bushong acquired these skills over 12 years at Juniper Networks, where he led the product management, product strategy and product marketing organizations for Juniper's flagship operating system, Junos. Michael spent his last several years at Juniper leading its SDN efforts across both service provider and enterprise markets. Prior to Juniper, Michael spent time at database supplier Sybase, and at ASIC design tool companies Synopsys and Magma Design Automation. Michael's undergraduate work at the University of California, Berkeley in advanced fluid mechanics and heat transfer lends new meaning to the marketing phrase "This isn't rocket science."
