The Internet of Things and DNS

The Internet of Things will result in an increasing need for scalable DNS services

JANUARY 8, 2014 02:00 PM EST

When we talk about the impact of BYOD and BYOA and the Internet of Things, we often focus on data center architectures. That's because there will be an increasing need for authentication, for access control, for security, for application delivery as the number of potential endpoints (clients, devices, things) increases. That means scale in the data center.

What we gloss over, what we skip, is that before any of these "things" ever makes a request to access an application, it first has to execute a DNS query. Every. Single. Thing.
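
To make that concrete, here's a minimal sketch in Python (the device hostnames are hypothetical placeholders) of the sequence every endpoint follows: resolve the name first, then talk to the application.

    import socket

    # Hypothetical hostnames standing in for whatever your "things" phone home to.
    things = [
        "thermostat.api.example.com",
        "doorlock.api.example.com",
        "sensor-hub.api.example.com",
    ]

    for host in things:
        try:
            # Step 1 for every client, device, or thing: the DNS query.
            ip = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)[0][4][0]
            # Step 2: only now can a request to the application itself be made.
            print(f"{host} -> {ip}")
        except socket.gaierror as err:
            print(f"{host}: resolution failed ({err})")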

Maybe that's because we assume DNS can handle the load. So far it has done well. You rarely, if ever, hear of disruptions or outages caused directly by DNS itself. Oh, there have been issues with misconfiguration of DNS and with its exploitation (hijacking, illicit use in reflection attacks, and so on), but in general you rarely see a report that a DNS service was overwhelmed by traffic and fell over.

That is exactly the problem. We've been successful with scaling DNS.

"Success breeds complacency. Complacency breeds failure. Only the paranoid survive." - Andrew Grove.

In the face of rapidly expanding endpoints (things), it behooves us all to take a second look at DNS and ensure it's ready to meet the challenge.

This is not just about availability. Remember operational axiom #2 - as load increases, performance decreases. That's true for DNS, too. It doesn't get a pass. That's why it's called an axiom, after all, because it's kind of the law, like gravity.

Browsers do a good job of masking the latency incurred by DNS, and native mobile applications never surface it at all, so it's difficult for a user to separate the latency of an overloaded DNS service from that of a generally poorly performing application. Not that they care, actually. A slow app is a slow app to an end user. They aren't interested in the gory details; they're interested in speedy applications. Period.
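
If you're curious how much of that "slow app" feeling is actually DNS, a rough sketch like the one below (example.com stands in for any application hostname) splits the lookup time from the connect time - exactly the breakdown a browser or mobile app never shows the user.

    import socket
    import time

    host = "example.com"  # hypothetical application endpoint

    # Time the DNS lookup on its own...
    t0 = time.perf_counter()
    ip = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)[0][4][0]
    dns_ms = (time.perf_counter() - t0) * 1000

    # ...and then the TCP connect separately.
    t1 = time.perf_counter()
    with socket.create_connection((ip, 443), timeout=5):
        connect_ms = (time.perf_counter() - t1) * 1000

    print(f"{host}: DNS {dns_ms:.1f} ms, TCP connect {connect_ms:.1f} ms")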

Interestingly, though, the Internet of Things is made up of more than just users. Devices and applications form a sprawling "overlay" network of endpoints, created by the connections among those devices and "things".

Devices don't care about latency (unless, of course, they're being driven by users; then the users care, but the devices surely don't). But DNS latency is generally incurred at initial connection time, and there's no way to tell, before a connection is made, whether it's a device or a real, live person on the other end. Even once the query arrives, UDP isn't exactly the most verbose of protocols: you've only got a few header fields, and none of them offer any insight into what kind of endpoint is making the request.
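
For the curious, the entire fixed DNS header is twelve bytes: a transaction ID, a flags field, and four counts. A quick sketch of parsing one (the sample header below is made up) shows there's simply nowhere for "I'm a thermostat, not a person" to live.

    import struct

    def parse_dns_header(packet: bytes) -> dict:
        # The fixed DNS header: six unsigned 16-bit fields in network byte order.
        qid, flags, qdcount, ancount, nscount, arcount = struct.unpack(
            "!HHHHHH", packet[:12]
        )
        return {
            "id": qid,            # transaction ID
            "flags": flags,       # QR, opcode, AA, TC, RD, RA, RCODE bits
            "questions": qdcount,
            "answers": ancount,
            "authority": nscount,
            "additional": arcount,
        }

    # A made-up query header: ID 0x1234, recursion desired, one question.
    sample = struct.pack("!HHHHHH", 0x1234, 0x0100, 1, 0, 0, 0)
    print(parse_dns_header(sample))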

The imperative, then, is to ensure really fast connections and responses to every single query.

That may mean you need to reevaluate your DNS infrastructure to ensure it's ready to handle the coming flood of "things". Test and verify the maximum queries per second (QPS) your systems can manage while maintaining what your business defines as acceptable latency. Make sure to plot latency against connections and queries per second to get an idea of the point at which your DNS starts to become part of the performance problem.
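
A starting point might look something like the sketch below. It assumes the dnspython package and uses 192.0.2.53 as a placeholder for a DNS service you own and are allowed to test. It only issues serial queries to establish a latency baseline; generating sustained QPS load takes a purpose-built tool (dnsperf, for example) or a pile of concurrent workers, but the principle - watch latency as the query rate climbs - is the same.

    import statistics
    import time

    import dns.exception
    import dns.resolver  # pip install dnspython (2.0 or newer)

    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = ["192.0.2.53"]  # placeholder: your DNS service under test
    resolver.lifetime = 2.0                # per-query timeout, in seconds

    def sample_latency(name="www.example.com", count=200):
        """Latency in milliseconds for `count` serial A-record lookups."""
        latencies = []
        for _ in range(count):
            start = time.perf_counter()
            try:
                resolver.resolve(name, "A")
                latencies.append((time.perf_counter() - start) * 1000)
            except dns.exception.DNSException:
                latencies.append(float("inf"))  # treat failures as worst case
        return latencies

    results = sorted(l for l in sample_latency() if l != float("inf"))
    if results:
        print(f"median {statistics.median(results):.1f} ms, "
              f"p95 {results[int(0.95 * len(results))]:.1f} ms "
              f"over {len(results)} successful queries")
    else:
        print("all queries failed")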

As the Internet expands and more devices and users access your applications, it would be a mistake to forget about DNS. We all know the old saying about "assuming" things - and it certainly holds true when you simply assume your DNS can handle the increasing load.

Be paranoid. Test often. CYA(pps).

More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
