

The Three Salient Features of Cloud Computing

Accessibility, Availability, and Scalability: Cloud computing provides tangible benefits, available to users on request

The Duo Consulting Blog

When you boil it down to brass tacks, cloud computing is just a new take on an old idea. Businesses are drawn to the facilities that cloud computing has to offer because the availability of our resources dictates our current needs...and our needs always expand beyond the capacity of our resources.

The first computers took up the space of a three-car garage. It’s amazing to think that all the computing power of those vacuum tube behemoths now fits on a silicon chip the size of your fingernail, with processing power to spare. Yet even with all these advances in technology, we find it’s still not enough.

As a multimedia guy, I recognize two governing laws of data:

  • The availability of our resources dictates our current needs.
  • Our needs always expand beyond the capacity of our resources.

It is because of these two laws that more businesses are drawn to the facilities that cloud computing has to offer. Before they reach this decision, they usually first follow a process similar to this:

  • Build a bigger system.
  • Compress the bigger system into a smaller space.
  • Connect systems together to share resources.

My Dad is Bigger than Your Dad

Our first inclination is to put more stuff into the existing box. Where previously we would add more vacuum tubes to give our computers the ability to calculate floating point numbers, nowadays we install faster processors (or multiprocessors), larger hard drives, and more powerful graphics cards with more video RAM. The more technically inclined may even tweak the hardware to improve the speed of the data path between each of these components as well.

These bigger systems initially appear to provide all the computing power we need. We are able to run virtually any application, and store all of our data, on a single machine. Before long, though, we begin to run into new problems:

  • These devices are finite. Disk space is a concrete measurement, and fairly soon we are horrified to learn we can’t store the digital photos of little Mischa’s tenth birthday party because our hard drives are already maxed out.
  • These devices are insecure. Because all the applications and data reside on the same machine, that information is not safe in the event that computer is stolen, or worse, destroyed during a disaster.
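The first of those problems, finite disk space, is concrete enough to measure. A minimal sketch in Python (standard library only; the 5 GB threshold is an arbitrary illustration):

```python
import shutil

def free_space_gb(path="/"):
    """Return the free disk space at `path` in gigabytes."""
    usage = shutil.disk_usage(path)  # named tuple: (total, used, free), in bytes
    return usage.free / 1024 ** 3

# Warn before the drive is maxed out, e.g. before saving a photo library.
if free_space_gb("/") < 5:
    print("Low disk space: time to offload those birthday photos.")
```

No amount of checking changes the underlying limit, of course; the drive is still a fixed-size box.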

Mainframe Computing

What if we were to instead invest in a mainframe computer—a really large computer that can hold our vast quantities of information? Then all our data would be in a centralized location safely protected from the elements. Also, theft of mainframe computers is really not an issue: I doubt very much that even a useful part of one would fit inside a hockey bag.

In a mainframe environment, to use any of our applications, all we’d need to do is connect a dumb terminal to the mainframe. Although the terminal has no processing ability of its own, it puts all the power of the mainframe at our disposal. Connect several hundred dumb terminals to the same mainframe and we can run an organization the size of IBM.

Reality Bites

While the processing power of mainframe computers is high, the upfront cost to install a mainframe environment is also high, and prohibitive for the average user. But what if we took our powerful desktop computers and connected them all together? For small- to medium-size enterprises, network computing is a lower-cost alternative to investing in a mainframe.

Most local area networks are set up in one of two ways, or a combination of both:

  • Peer-to-Peer: Applications and data are stored on individual computers. Users run applications on their own computers and can allow other users to work with their files from across the wire.
  • Network Server: Applications and data are stored on a central computer or group of computers. Users can run applications installed on their local computer or on the network server. While users are encouraged to store their data on the central server, there is usually nothing to prevent a user from storing files locally as well.

In a networked environment, the security of the data is still at risk, because unlike mainframe components, a network server, usually the size of a standard desktop computer, can fit into a hockey bag. Also, even though a network setup can cost less than a mainframe, properly configuring and securing it requires a full-time IT specialist, which adds overhead. And once again, space is finite: an organization could easily outgrow even the highest-capacity hard drive on its network server.

Head in the Cloud Computing

Through cloud computing, you can have all the power of several mainframe computers, the interconnectivity of a network system, the security of having all your data backed up on a regular basis, and the expertise of several IT specialists, all for a cost equivalent to buying lunch for your staff once a week. With faster Internet connection speeds becoming the norm, many users are already experiencing the benefits of cloud computing without even realizing it.

Cloud computing is described on Wikipedia as, “…a style of computing where IT-related capabilities are provided ‘as a service’, allowing users to access technology-enabled services ‘in the cloud’ without knowledge of, expertise with, or control over the technology infrastructure that supports them.”

If a computer network provides the combination of computing power, storage capacity, user availability, and security that we want, cloud computing is a really large network, with all those features on steroids. Applications and data can be stored on any computer on this network, and while these computers may vary in size, several of them have the processing capability of mainframe computer systems.

We haven’t quite returned to the days of dumb terminals, however. Instead, we use the features of a standard web browser to access Rich Internet Applications (RIA) that simulate the smooth look and feel of a desktop application.

Soft Serve, Not the Ice Cream Kind

Cloud computing provides tangible benefits, available to users on request. Providing these features “as a service” means that the resources can be shared between several users without any noticeable decrease in performance.

Software as a Service (SaaS) reduces the need to install and upgrade software on users’ desktops. The user always opens the most up-to-date copy of the software, because it is maintained at all times. Google Apps and Zoho are examples of companies providing common application software. Both systems even provide an offline mode for times when users aren’t connected to the Internet. Their files are uploaded as soon as their computers are reconnected.
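The offline mode described above amounts to queuing changes locally and draining the queue once connectivity returns. A minimal sketch of that pattern (the `OfflineSync` class and its `upload` callback are invented for illustration, not any vendor’s actual API):

```python
class OfflineSync:
    """Queue file saves while offline; flush them when reconnected."""

    def __init__(self, upload):
        self.upload = upload   # callable that sends one file to the service
        self.pending = []      # files saved while disconnected

    def save(self, filename, online):
        if online:
            self.upload(filename)
        else:
            self.pending.append(filename)

    def reconnect(self):
        """Upload everything queued while offline, oldest first."""
        while self.pending:
            self.upload(self.pending.pop(0))

# Usage: work offline, then flush the queue on reconnect.
sent = []
sync = OfflineSync(upload=sent.append)
sync.save("budget.xls", online=False)
sync.save("memo.doc", online=False)
sync.reconnect()   # both files are uploaded now, in order
```

Real services add conflict resolution on top of this, but the core idea is the same queue-and-drain loop.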

Hardware as a Service (HaaS) provides users with additional computing power, whenever they require it. For instance, if a retailer has a short-term need to process a high volume of point-of-sale (POS) transactions on Boxing Day, applications can be set up to share the processing across additional computers as necessary.
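The scale-out idea behind that Boxing Day scenario can be sketched on a single machine by fanning a batch of transactions across a pool of workers; in a real HaaS setting each worker would instead be an extra machine provisioned on demand. A sketch, with `process_sale` and the transaction format invented for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def process_sale(transaction):
    """Stand-in for real POS work: total up one sale."""
    return sum(price for _item, price in transaction["items"])

def process_batch(transactions, workers=4):
    """Spread a high-volume batch across a pool of workers, the way
    HaaS would spread it across additional computers as necessary."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_sale, transactions))

# Simulate the Boxing Day rush: 100 identical sales, processed in parallel.
sales = [{"items": [("tea", 3.50), ("scone", 2.25)]} for _ in range(100)]
totals = process_batch(sales)
```

The point is that `process_batch` doesn’t change when `workers` does; capacity is a dial, not a rewrite.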

Platform as a Service (PaaS) is another growing enterprise. These shared hosting systems provide a development environment for developers to build their own applications using prebuilt modules or custom code.

Accessibility, availability, scalability: The salient features of cloud computing. But when you boil it down to brass tacks, cloud computing is just a new take on an old idea. I didn’t even get to discuss virtualization, which is what gives all these “as a service” features their power. Because of virtualization, while the two governing laws of data still apply, it will take a LONG time for our needs to “expand beyond the capacity of our resources.”


More Stories By Tony Chung

Tony Chung is a creative communications consultant who draws from his broad range of experiences and abilities to find parallel strategies for solving problems quickly and efficiently. He combines words, music, multimedia, web programming, technological passion, and analytical wisdom to build solutions timed to suit your business needs and requirements.

Submit a request for information on his consulting services now.
