ARM Server to Transform #BigData to #IoT | @CloudExpo #IIoT #AI #ML #DX

New Microserver computing platform offers compelling benefits for the right applications

A completely new computing platform is on the horizon. Some call these machines Microservers, others ARM Servers, and still others ARM-based Servers. No matter what you call them, Microservers will have a huge impact on the data center and on server computing in general.

What Is a Microserver...and What Isn't
Although few people are familiar with Microservers today, their impact will be felt very soon. This new category of computing platform is available now and is predicted to see triple-digit growth rates for some years to come - growing to over 20% of the server market by 2016, according to Oppenheimer ("Cloudy With A Chance of ARM," Oppenheimer Equity Research Industry Report).

According to Chris Piedmonte, CEO of Suvola Corporation - a software and services company focused on creating preconfigured and scalable Microserver appliances for deploying large-scale enterprise applications - "the Microserver market is poised to grow by leaps and bounds - because companies can leverage this kind of technology to deploy systems that offer 400% better cost-performance at half the total cost of ownership. These organizations will also benefit from the superior reliability, reduced space and power requirements, and lower cost of entry provided by Microserver platforms."

This technology might be poised to grow, but today these Microservers aren't mainstream at all - they hold well under 1% of the server market. Few people know about them, and there is a fair amount of confusion in the marketplace. There isn't even agreement on what to call them: different people use different names - Microserver, ARM Server, ARM-based Server and who knows what else.

To further confuse the issue, there are a number of products on the market called "Microservers" that aren't Microservers at all - for example, the HP ProLiant MicroServer or the HP Moonshot chassis. These products are smaller and use less power than traditional servers, but they are just a slightly different flavor of the standard Intel/AMD servers we are all familiar with. Useful, but not at all revolutionary - and with a name that causes unfortunate confusion in the marketplace.

Specifically, a Microserver is a server based on "system-on-a-chip" (SoC) technology - where the CPU, memory controllers and system I/O are all on a single chip, not spread across multiple components on a system board (or even multiple boards).

What Makes ARM Servers Revolutionary?
ARM Servers are an entirely new generation of server computing - and they will make serious inroads into the enterprise in the next few years. This is a genuine innovation - revolutionary, not evolutionary.

These new ARM Server computing platforms are an entire system - multiple CPU cores, memory controllers, input/output controllers for SATA, USB, PCIe and others, high-speed network interconnect switches, etc. - all on a SINGLE chip measuring only one square inch. This is hyperscale integration technology at work.

To help put this into context, you can fit 72 quad-core ARM Servers into the space used by a single traditional server board.
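If you want to sanity-check that density claim, a quick back-of-envelope calculation (sketched in Python below) does the trick. The 72-node and quad-core figures come from the claim above, and the ~5-watt-per-node figure appears later in this article; the traditional dual-socket board numbers are my own rough assumptions, not vendor specifications.

```python
# Back-of-envelope density comparison: ARM Server SoC nodes vs. a traditional
# dual-socket server board occupying roughly the same footprint.
# Assumptions (illustrative only):
#   - 72 quad-core ARM nodes fit in the space of one traditional board (claim above)
#   - ~5 W per ARM node at full load (Calxeda ECX-1000 figure cited later)
#   - traditional dual-socket board: 2 x 8 cores at ~350 W (assumed)

arm_nodes = 72
arm_cores_per_node = 4
arm_watts_per_node = 5.0           # assumed, from the ECX-1000 example below

trad_cores = 2 * 8                 # assumed dual-socket, 8 cores per socket
trad_watts = 350.0                 # assumed board-level power draw

arm_cores = arm_nodes * arm_cores_per_node
arm_watts = arm_nodes * arm_watts_per_node

print(f"ARM: {arm_cores} cores at ~{arm_watts:.0f} W in one board's footprint")
print(f"Traditional: {trad_cores} cores at ~{trad_watts:.0f} W")
print(f"Cores per watt: ARM {arm_cores / arm_watts:.2f} vs. x86 {trad_cores / trad_watts:.2f}")
```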

Today's traditional server racks are typically packed with boards based on Intel XEON or AMD Opteron chips and are made up of a myriad of discrete components. They're expensive, powerful, power-hungry, use up a considerable amount of space, and can quickly heat up a room to the point where you might think you're in a sauna.

In contrast, ARM Servers with their SoC design are small, very energy efficient, reliable, scalable - and incredibly well-suited for a wide variety of mainstream computing tasks that deal with large numbers of users, data and applications (like Web services, data crunching, media streaming, etc.). The SoC approach of putting an entire system on a chip results in a computer that can operate on as little as 1.5 watts of power.

Add in memory and a solid-state "disk drive" and you could have an entire server that runs on under 10 watts of power. For example, Calxeda's ECX-1000 quad-core ARM Server node with built-in Ethernet and SATA controllers, and 4GB of memory uses 5 watts at full power. In comparison, my iPhone charger is 7 watts and the power supply for the PC on my desk is 650 watts (perhaps that explains the $428 electric bill I got last month).
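To put those wattage numbers in dollar terms, here's a minimal sketch of the annual electricity cost per always-on node. The 5- and 10-watt figures come from the examples above; the electricity rate and the 300-watt "traditional server" average are assumptions for illustration.

```python
# Rough annual electricity cost for a single always-on machine, at an assumed
# rate of $0.12 per kWh (illustrative; actual rates vary widely).

HOURS_PER_YEAR = 24 * 365
RATE_PER_KWH = 0.12  # assumed rate, in dollars

def annual_cost(watts: float) -> float:
    """Yearly electricity cost in dollars for a constant load."""
    kwh = watts * HOURS_PER_YEAR / 1000.0
    return kwh * RATE_PER_KWH

for label, watts in [("ARM Server node (ECX-1000)", 5),
                     ("Complete ARM server w/ memory + SSD", 10),
                     ("Traditional server (assumed ~300 W average)", 300)]:
    print(f"{label}: ~${annual_cost(watts):,.2f}/year")
```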

Realistically, these ARM Servers use about 1/10th the power and occupy considerably less than 1/10th the space of traditional rack-mounted servers (for systems of equivalent computing power) - and at an acquisition price of about half of what a traditional system costs.

And they are designed to scale. The Calxeda ECX-1000 ARM Servers are packaged into "Energy Cards" composed of four quad-core chips and 16 SATA ports, and each card embeds an 80 gigabit per second interconnect switch that lets you connect potentially thousands of nodes without all the cabling inherent in traditional rack-mounted systems (a large Intel-based system could have upwards of 2,000 cables). This also provides extreme performance - node-to-node communication occurs on the order of 200 nanoseconds.
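To get a feel for what an embedded 80 Gb/s fabric and ~200 nanosecond latency mean in practice, here's a rough sketch comparing the time to ship a 1 MiB block between nodes over the on-card fabric versus an assumed conventional 10 GbE hop. The fabric figures come from the article; the 10 GbE bandwidth and its ~10 microsecond hop latency are my assumptions, not Calxeda specifications.

```python
# Illustrative time to move one 1 MiB block between two nodes.
# Fabric figures (80 Gb/s, ~200 ns latency) come from the article; the
# 10 GbE baseline and its ~10 microsecond hop latency are assumptions.

def transfer_time_us(size_bytes: int, gbps: float, latency_ns: float) -> float:
    """One-way latency plus serialization time, in microseconds."""
    serialization_us = (size_bytes * 8) / (gbps * 1e9) * 1e6
    return latency_ns / 1000.0 + serialization_us

BLOCK = 1 * 1024 * 1024  # 1 MiB

print(f"Embedded 80 Gb/s fabric: {transfer_time_us(BLOCK, 80, 200):8.1f} us")
print(f"Assumed 10 GbE link    : {transfer_time_us(BLOCK, 10, 10_000):8.1f} us")
```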

You can have four complete ARM Servers on a board that is only ten inches long and uses only about 20 watts of power at full speed - that's revolutionary.

How Do ARM Servers Translate into Business Benefits?
When you account for reduced computing center operations costs, lower acquisition costs, increased reliability due to simpler construction / fewer parts, and less administrative cost as a result of fewer cables and components, we're talking about systems that could easily cost 70% less to own and operate.
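Here's a toy sketch of how a number like that could shake out over three years. Every dollar figure below is a placeholder assumption (the ratios roughly track the claims above: half the acquisition cost, a tenth of the power and space, lower administrative overhead); with these particular inputs the savings lands in the same neighborhood as that 70% figure, but the result is entirely driven by what you plug in.

```python
# Toy 3-year cost-of-ownership comparison. All dollar figures are placeholder
# assumptions for illustration only; substitute your own numbers.

YEARS = 3

def tco(acquisition, power_per_year, admin_per_year, facility_per_year):
    """Simple additive total cost of ownership over the period."""
    return acquisition + YEARS * (power_per_year + admin_per_year + facility_per_year)

traditional = tco(acquisition=6000, power_per_year=600,
                  admin_per_year=1800, facility_per_year=1000)
arm_server  = tco(acquisition=3000, power_per_year=60,
                  admin_per_year=600, facility_per_year=100)

print(f"Traditional server, 3-year TCO: ${traditional:,.0f}")
print(f"ARM Server equivalent, 3-year TCO: ${arm_server:,.0f}")
print(f"Estimated savings: {100 * (1 - arm_server / traditional):.0f}%")
```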

If you toss in the cost to actually BUILD the computing center and not just "operate it", then the cost advantage is even larger. That's compelling - especially to larger companies that spend millions of dollars a year building and operating computing centers. Facebook, for example, has been spending about half a billion (yes, with a "b") dollars a year lately building and equipping their computing centers. Mobile devices are driving massive spending in this area - and in many cases, these are applications which are ideal for ARM Server architectures.

Why Don't I See More ARM Servers?
So - if all this is true, why do Microservers have such a negligible share of the server market?

My enthusiasm for ARM Servers is in their potential. This is still an early-stage technology, and Microserver hardware has really only been available since the second half of 2012. I doubt any companies are going to trade in all their traditional rack servers for Microservers this month. The "eco-system" for ARM Servers isn't fully developed yet. And ARM Servers aren't the answer to every computing problem - the hardware has some limitations (it's 32-bit, at least for now), and it's a platform better suited for some classes of computing than others. Oh, and although it runs various flavors of Linux, it doesn't run Windows - whether that is a disadvantage depends on your perspective.

Microservers in Your Future?
Irrespective of these temporary shortcomings, make no mistake - this is a revolutionary shift in the way server systems will be (and should be) designed. Although you personally may never own one of these systems, within the next couple of years you will make use of ARM Servers all the time - they have the potential to shrink the cost of cloud computing, "Big Data," media streaming and any kind of Web computing service to a fraction of what it is today.

Keep your eye on this little technology - it's going to be big.


Note: The author of this article works for Dell. The opinions stated are his own and not those of his employer.
