By Hollis Tibbetts
August 6, 2014 07:15 AM EDT
A completely new computing platform is on the horizon. They're called Microservers by some, ARM Servers by others, and sometimes even ARM-based Servers. No matter what you call them, Microservers will have a huge impact on the data center and on server computing in general.
What Is a Microserver...and What Isn't
Although few people are familiar with Microservers today, their impact will be felt very soon. This is a new category of computing platform that is available today and is predicted to have triple-digit growth rates for some years to come - growing to over 20% of the server market by 2016 according to Oppenheimer ("Cloudy With A Chance of ARM" Oppenheimer Equity Research Industry Report).
According to Chris Piedmonte, CEO of Suvola Corporation - a software and services company focused on creating preconfigured and scalable Microserver appliances for deploying large-scale enterprise applications, "the Microserver market is poised to grow by leaps and bounds - because companies can leverage this kind of technology to deploy systems that offer 400% better cost-performance at half the total cost of ownership. These organizations will also benefit from the superior reliability, reduced space and power requirements, and lower cost of entry provided by Microserver platforms".
This technology might be poised to grow, but today Microservers are far from mainstream, holding well under 1% of the server market. Few people know about them, and there is a fair amount of confusion in the marketplace. There isn't even agreement on what to call them: different people use different names - Microserver, ARM Server, ARM-based Server and who knows what else.
To further confuse the issue, a number of products on the market are called "Microservers" but aren't Microservers at all - for example, the HP ProLiant MicroServer or the HP Moonshot chassis. These products are smaller and use less power than traditional servers, but they are just a slightly different flavor of the standard Intel/AMD servers we are all familiar with. Useful, but not at all revolutionary - and with a name that causes unfortunate confusion in the marketplace.
Specifically, a Microserver is a server based on "system-on-a-chip" (SoC) technology - where the CPU, memory, system I/O and so on are all integrated on a single chip, not spread across multiple components on a system board (or even multiple boards).
What Makes ARM Servers Revolutionary?
ARM Servers are an entirely new generation of server computing - and they will make serious inroads into the enterprise in the next few years. A serious innovation - revolutionary, not evolutionary.
These new ARM Server computing platforms are an entire system - multiple CPU cores, memory controllers, input/output controllers for SATA, USB, PCIe and others, high-speed network interconnect switches, etc. - all on a SINGLE chip measuring only one square inch. This is hyperscale integration technology at work.
To help put this into context, you can fit 72 quad-core ARM Servers into the space used by a single traditional server board.
Today's traditional server racks are typically packed with boards based on Intel XEON or AMD Opteron chips and are made up of a myriad of discrete components. They're expensive, powerful, power-hungry, use up a considerable amount of space, and can quickly heat up a room to the point where you might think you're in a sauna.
In contrast, the ARM Servers with their SoC design are small, very energy efficient, reliable, scalable - and incredibly well-suited for a wide variety of mainstream computing tasks dealing with large numbers of users, data and applications (like Web services, data crunching, media streaming, etc.). The SoC approach of putting an entire system on a chip results in a computer that can operate on as little as 1.5 watts of power.
Add in memory and a solid-state "disk drive" and you could have an entire server that runs on under 10 watts of power. For example, Calxeda's ECX-1000 quad-core ARM Server node with built-in Ethernet and SATA controllers, and 4GB of memory uses 5 watts at full power. In comparison, my iPhone charger is 7 watts and the power supply for the PC on my desk is 650 watts (perhaps that explains the $428 electric bill I got last month).
Realistically, these ARM Servers use about 1/10th the power, and occupy considerably less than 1/10th the space of traditional rack-mounted servers (for systems of equivalent computing power). And at an acquisition price of about half of what a traditional system costs.
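To make those power numbers concrete, here is a rough annual energy-cost sketch. The 5-watt node figure comes from the article; the 400-watt draw for a traditional server and the $0.12/kWh electricity rate are hypothetical assumptions for illustration only:

```python
# Back-of-envelope energy-cost comparison using the figures cited above.
# The 5 W node figure comes from the article; the traditional server's
# draw and the electricity rate are hypothetical assumptions.

ARM_NODE_WATTS = 5          # Calxeda ECX-1000 node at full power
TRADITIONAL_WATTS = 400     # hypothetical draw for a traditional server
USD_PER_KWH = 0.12          # hypothetical electricity rate

HOURS_PER_YEAR = 24 * 365   # 8,760 hours

def annual_energy_cost(watts, rate=USD_PER_KWH):
    """Cost of running a device at a constant draw for one year."""
    kwh = watts * HOURS_PER_YEAR / 1000
    return kwh * rate

# The article's rule of thumb: roughly ten ARM nodes deliver the
# computing power of one traditional server.
arm = annual_energy_cost(ARM_NODE_WATTS * 10)
x86 = annual_energy_cost(TRADITIONAL_WATTS)
print(f"Ten ARM nodes:      ${arm:.2f}/year")
print(f"Traditional server: ${x86:.2f}/year")
```

Even under these rough assumptions, the energy bill for equivalent computing power drops by nearly an order of magnitude - and that is before counting the cooling savings.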
And they are designed to scale - the Calxeda ECX-1000 ARM Servers are packaged into "Energy Cards," each composed of four quad-core chips and 16 SATA ports. Each card embeds an 80 gigabit per second interconnect switch, which lets you easily connect potentially thousands of nodes without the cabling inherent in traditional rack-mounted systems (a large Intel-based system could have upwards of 2,000 cables). The fabric also provides extreme performance - node-to-node communication occurs on the order of 200 nanoseconds.
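Those interconnect figures translate into very fast node-to-node transfers. A quick sketch shows the scale involved - the 80 Gbps and 200 ns figures come from the article, while the 4 KB message size is an arbitrary example:

```python
# Rough estimate of the time to move one message between two ECX-1000
# nodes over the embedded fabric. Bandwidth and latency figures are
# from the article; the message size is a hypothetical example.

LINK_BANDWIDTH_BPS = 80e9   # 80 gigabits per second fabric switch
NODE_LATENCY_S = 200e-9     # ~200 nanoseconds node-to-node latency

def transfer_time_s(message_bytes,
                    bandwidth_bps=LINK_BANDWIDTH_BPS,
                    latency_s=NODE_LATENCY_S):
    """Fixed latency plus serialization time for a single message."""
    return latency_s + (message_bytes * 8) / bandwidth_bps

# A 4 KB page crosses the fabric in well under a microsecond:
t = transfer_time_s(4096)
print(f"4 KB transfer: {t * 1e6:.2f} microseconds")
```

At these speeds, moving data between nodes starts to look more like moving data between sockets in a single machine than like traditional rack-to-rack networking.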
You can have four complete ARM Servers on a board that is only ten inches long and uses only about 20 watts of power at full speed - that's revolutionary.
How Do ARM Servers Translate into Business Benefits?
When you account for reduced computing center operations costs, lower acquisition costs, increased reliability due to simpler construction / fewer parts, and less administrative cost as a result of fewer cables and components, we're talking about systems that could easily cost 70% less to own and operate.
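The shape of that TCO argument can be sketched with a simple model. The cost categories mirror the ones listed above; every dollar figure here is a hypothetical placeholder, used only to show how the article's individual claims (half the acquisition price, roughly a tenth of the power, fewer parts to administer) compound:

```python
# A hedged sketch of the total-cost-of-ownership comparison described
# above. Cost categories follow the article; all dollar figures are
# hypothetical placeholders for illustration.

def total_cost_of_ownership(acquisition, power_cooling, administration):
    """Sum the three cost categories discussed in the article."""
    return acquisition + power_cooling + administration

# Hypothetical 3-year costs for a traditional rack (illustrative only):
traditional = total_cost_of_ownership(
    acquisition=100_000, power_cooling=60_000, administration=40_000)

# Applying the article's claims: ~half the acquisition price, ~1/10th
# the power, and fewer cables/components to administer.
microserver = total_cost_of_ownership(
    acquisition=50_000, power_cooling=6_000, administration=20_000)

savings = 1 - microserver / traditional
print(f"Estimated TCO reduction: {savings:.0%}")
```

Even with these made-up inputs, the reductions compound into savings in the same ballpark as the figure cited above - the exact percentage depends entirely on your own cost structure.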
If you toss in the cost to actually BUILD the computing center and not just operate it, then the cost advantage is even larger. That's compelling - especially to larger companies that spend millions of dollars a year building and operating computing centers. Facebook, for example, has lately been spending about half a billion (yes, with a "b") dollars a year building and equipping its computing centers. Mobile devices are driving massive spending in this area - and in many cases, these are exactly the applications that are ideal for ARM Server architectures.
Why Don't I See More ARM Servers?
So - if all this is true, why do Microservers have such a negligible market share of the Server market?
My enthusiasm for ARM Servers lies in their potential. This is still an early-stage technology - Microserver hardware has really only been available since the second half of 2012, and I doubt any companies are going to trade in all their traditional rack servers for Microservers this month. The ecosystem for ARM Servers isn't fully developed yet. And ARM Servers aren't the answer to every computing problem - the hardware has some limitations (it's 32-bit, at least for now), and it's a platform better suited to some classes of computing than others. Oh, and although it runs various flavors of Linux, it doesn't run Windows - whether that is a disadvantage depends on your individual perspective.
Microservers in Your Future?
Irrespective of these temporary shortcomings, make no mistake - this is a revolutionary shift in the way that server systems will be (and should be) designed. Although you personally may never own one of these systems, within the next couple of years, you will make use of ARM Servers all the time - as they have the potential to shrink the cost of Cloud Computing, "Big Data", media streaming and any kind of Web computing services to a fraction of the cost of what they are today.
Keep your eye on this little technology - it's going to be big.
Note: The author of this article works for Dell. The opinions stated are his own and not those of his employer.