Review | The Internet of Things: QNX Seminar

The IoT has to fit onto the Internet of Reality (the in-place Network Infrastructure) in order to be a real tool for businesses

This is a summary of a QNX webinar presented on July 10, 2014, by BlackBerry (BBRY) and QNX.

This was an anticipated seminar because many people are looking for answers to questions they have about the big buzzword, the Internet of Things (IoT). They also want to know what role the cloud-based QNX product plays within this environment.

The QNX webinar gave a good overview of what the QNX product is and how it fits into the overall business market. The platform is operating-system agnostic, which means it can work with any device rather than being tied to one type of operating system; it has multi-OS adaptability.

They want to be totally inclusive of all devices and use one platform to manage everything. They have focused on end-to-end security, which is a huge issue across many industries as well as many applications.

They focused on the core principles that have to be architected into any platform up front, especially if it is addressing a massive-scale application:

Cloud Platform Core Principles

Simplicity

Scalability

Security

They offered the right "strategic buzzwords" and marketing "buzz phrases," but they didn't give concrete examples of how all of this is going to work on a real network. I would have liked them to cite referenceable accounts where QNX was beyond the beta-test project and supporting a real mission-critical application. They talk about their long-standing work across 30 years with customers like Cisco, Delphi, General Electric, and Siemens, but the seminar did not get into anything that was customer-specific.

Give Me the Field Test Results
Optimal conditions yield optimal results. If I am testing something in the lab, it is going to perform well. Unless they have a traffic simulator that actually simulates total network traffic, testing out specific modules and specific applications is not going to show what happens when you have traffic congestion. What has worked well in the field under real conditions?

This may have been too much to ask for in a one-hour seminar, but you would think they could put up some real numbers as to what the IoT servers can handle and where they bottom out.
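To show what "bottoming out" looks like in practice, here is a minimal back-of-the-envelope sketch. It uses the textbook M/M/1 queueing formula, and the 10,000 transactions-per-second service rate is purely an illustrative assumption of mine, not a QNX figure; the point is only that response time which looks negligible in a lightly loaded lab explodes as a server approaches saturation.

```python
# Illustrative sketch only: assumed service rate, textbook M/M/1 queueing model.
# Average time in system for an M/M/1 queue: W = 1 / (mu - lambda),
# which grows without bound as offered load approaches capacity.

def avg_latency_seconds(service_rate_tps, offered_load_tps):
    """Average time a transaction spends queued plus in service (M/M/1)."""
    if offered_load_tps >= service_rate_tps:
        return float("inf")  # saturated: the queue grows without bound
    return 1.0 / (service_rate_tps - offered_load_tps)

SERVICE_RATE_TPS = 10_000  # assumed capacity of one IoT server (illustrative)

for utilization in (0.50, 0.80, 0.95, 0.99, 1.00):
    load = SERVICE_RATE_TPS * utilization
    latency_ms = avg_latency_seconds(SERVICE_RATE_TPS, load) * 1000
    print(f"{utilization:4.0%} load -> {latency_ms:8.2f} ms average latency")
```

The last doubling of traffic does far more damage than the first, which is exactly the behavior a lightly loaded lab test will never reveal.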

What they left out were the details of how billions and billions of transactions are going to be transmitted across our current network infrastructure. In the question and answer period, I posed several questions as to the problems of running all these transactions to the end users on current network infrastructure, but specific answers were not given.
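As a rough illustration of why I kept pressing on this point, here is a back-of-the-envelope sketch of what machine traffic alone could mean for one metro area's infrastructure. Every figure in it (device count, reporting interval, payload size) is my own assumption for illustration, not a number quoted in the webinar.

```python
# Back-of-the-envelope sketch; all inputs are illustrative assumptions.
DEVICES_IN_METRO = 10_000_000   # assumed connected devices in one metro area
REPORTS_PER_MINUTE = 6          # assumed: one small report every 10 seconds
BYTES_PER_REPORT = 1_000        # assumed 1 KB per report, protocol overhead included

reports_per_second = DEVICES_IN_METRO * REPORTS_PER_MINUTE / 60
sustained_gbps = reports_per_second * BYTES_PER_REPORT * 8 / 1e9

print(f"Reports per second: {reports_per_second:,.0f}")
print(f"Sustained IoT traffic for the metro area: {sustained_gbps:.1f} Gbps")
```

Under those assumptions the sensors alone generate roughly 8 Gbps of sustained traffic in a single metro area, before anyone streams a video - which is exactly the part of the story that was left to "the carriers."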

I asked several questions after the presentation to see how they anticipate the realities of running on a network infrastructure that is not ready for gigabit and multi-gigabit speeds to the average end user. Here was the first question:

You talk about IoT transactions and data - what about the actual network infrastructure that needs to be updated before this type of traffic can be handled? What efforts are being made by you and others to get the network part updated?

"We partner with all the carriers. The carriers are working on that" - Hmmmm...Are you sure?

The next question was "half-asked" by the host moderating the Q&A portion; she did not read the question verbatim. It appeared as if she didn't know what Gbps stood for (gigabits per second), so she did not ask about gigabit and multi-gigabit speeds, or about terabit speeds for backbone (backhaul) capabilities. This was the question:

What speeds are optimal for this environment? 1 Gbps to a mobile end user? 5 Gbps? 50 Gbps to a stationary user? Terabit and multi-terabit for the backbone (backhaul component of the network)?

Note to ALL companies giving a presentation on cutting-edge buzzwords and your key products: make sure your moderator is well-versed in the industry's major terminology; otherwise you might be shooting yourself in the foot when they gloss over something that should be easily understood and answered by the subject matter experts.

NOTE to ALL executives from any organization: When you give a presentation on your company or strategic vision, take on the questions directly from the floor. Do NOT have some moderator shuffle through the questions and only pick out the easy ones to answer. That's why you get paid the big bucks -- to answer the tough questions. (I don't think this happened at the QNX seminar, but it has been an issue in past executive presentations at major conferences and is a huge pet peeve that must be addressed.)

Do IoT proponents know what the data rates need to be for their applications? Speeds must be upgraded to at least what we see here in this chart of data rates for 5G Networks:

Data Rate for 5G Networks

Speed          Market Segment
100 Gbps+      Specialized enterprise users (stationary)
50 Gbps        Low-mobility users
5 Gbps         High-mobility users
1 Gbps         Anywhere (baseline speed)

Source: James Carlini

My other two questions were not asked, or answered, in the rest of the session:

IoT is ONLY as good as the Network Infrastructure it is running on - with that being said, what strategic upgrades are you demanding from a network infrastructure that is still dependent on copper (esp. the Last Mile) - and not fiber?

This should be a question that is driven home to anyone proposing IoT applications. What will your IoT require as to network speeds? How degraded will the application become if you are on a sub-standard network? Those buying your product or service need to know this - what are the limits of the product, and how does it perform if it is not running on an optimal network?
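To make "degraded" concrete, here is a small hypothetical comparison of how long the same payload takes to move over different last-mile links. The link speeds and the 50 MB payload are assumptions for illustration only, not measurements of any vendor's product.

```python
# Hypothetical comparison; link speeds and payload size are assumed values.
PAYLOAD_MB = 50  # e.g., a firmware image or a daily batch upload (assumed)

links_mbps = {
    "ADSL copper uplink": 1,      # assumed
    "VDSL copper": 25,            # assumed
    "Fiber (1 Gbps)": 1_000,
}

for name, mbps in links_mbps.items():
    seconds = PAYLOAD_MB * 8 / mbps   # megabits divided by megabits per second
    print(f"{name:20s}: {seconds:7.1f} s to move {PAYLOAD_MB} MB")
```

A job that takes under a second on fiber takes nearly seven minutes on a copper uplink; that is the kind of limit buyers deserve to see spelled out.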

My other question that was not asked was this:

5G is supposed to be rolled out by 2020 - multi-gigabit speeds will need to be deployed if this is going to work. Do you think the carriers will have these speeds available? Will those speeds be ubiquitous (available ALL over, or just in major cities)?

All in all, I thought the webinar was informative as to the basic principles of what QNX is supporting, but it would have been nice for them to provide some hard feedback, good or bad, on the actual operation of an IoT-type application on a real-world network, not in the lab.

Here is what everyone thinking about IoT has to aim for and what needs to be embedded in the network infrastructure:

Design Criteria for Network Speeds (Type of Use - Embedded Speed)

Common End-User/Subscriber - 1 Gbps (one gigabit per second): this includes wireless, due to what smartphones are demanding in bandwidth.

Industrial Park, Business Campus, Commercial Space - 40-100 Gbps: this would include next-generation Intelligent Business Campuses (some parks already have multiple carriers providing 40 Gbps today).

Downtown/Commercial Space - 40-100 Gbps: for downtown urban areas.

Backbone/Carrier Backhaul - 1 Tbps (one terabit per second): this sounds high, but the way demand is growing, this should be the goal.

Source: James Carlini
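As a quick sanity check on that last row, here is a sketch of the backhaul arithmetic. The 50:1 oversubscription ratio is an assumed, illustrative figure, not carrier data.

```python
# Sketch of backhaul sizing; the oversubscription ratio is an assumption.
BACKHAUL_TBPS = 1        # the backbone goal from the table above
ACCESS_RATE_GBPS = 1     # the 1 Gbps baseline end-user speed
OVERSUBSCRIPTION = 50    # assumed: subscribers average ~2% of their peak rate

subscribers_supported = BACKHAUL_TBPS * 1_000 / ACCESS_RATE_GBPS * OVERSUBSCRIPTION
print(f"A {BACKHAUL_TBPS} Tbps backhaul supports roughly "
      f"{subscribers_supported:,.0f} subscribers at {ACCESS_RATE_GBPS} Gbps "
      f"({OVERSUBSCRIPTION}:1 oversubscription)")
```

Fifty thousand gigabit subscribers per terabit segment is not many for a dense downtown, and always-on IoT traffic will only erode the oversubscription assumption, which is why the terabit figure should be treated as a floor rather than a stretch goal.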

Carlini's visionary book, "Location, Location, Connectivity," will be coming out later this year.

Follow daily Carlini-isms at www.TWITTER.com/JAMESCARLINI

Copyright 2014 - James Carlini

More Stories By James Carlini

James Carlini, MBA, a certified Infrastructure Consultant, keynote speaker and former award-winning Adjunct Professor at Northwestern University, has advised on mission-critical networks. Clients include the Chicago Mercantile Exchange, GLOBEX, and City of Chicago’s 911 Center. An expert witness in civil and federal courts on network infrastructure, he has worked with AT&T, Sprint and others.
