Cloud Economics – Amazon, Microsoft, Google Compared

A Platform Comparison

Any new technology adoption happens for one of three reasons:
  1. Capability: It allows us to do something that was not feasible earlier
  2. Convenience: It simplifies something we already do
  3. Cost: It significantly reduces the cost of doing something

What do we expect from cloud computing? As I stated earlier, it is all about cost saving … (1) through elastic capacity and (2) through economies of scale. So, for any CIO who is interested in moving to the cloud, it is very important to understand the cost elements of different cloud solutions. I am going to look at three platforms: Amazon EC2, Google App Engine and Microsoft Azure. They are sufficiently different from each other, and each of these companies is following a different cloud strategy – so we need to understand their pricing models.

(A word of caution: this analysis is based on the data published as of 20th January 2010, and text in green italics is my interpretation.)

[Update on the Amazon offering as of June 2011]

Quick Read: Market forces seem to have ensured that all the prices are similar – for a quick rule-of-thumb viability calculation, use the following numbers irrespective of the provider. You will not go too far off the mark.

  • Base machine = $0.10 per hour (for a 1.5 GHz Intel processor)
  • Storage = $0.15 per GB per month
  • I/O = $0.01 per 1,000 writes and $0.001 per 1,000 reads
  • Bandwidth = $0.10 per GB for incoming traffic and $0.15 per GB for outgoing traffic
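
For a quick viability check, these rule-of-thumb rates can be dropped into a few lines of Python. This is only a rough sketch; the workload figures in the example call are hypothetical placeholders.

    # Back-of-envelope monthly cost using the rule-of-thumb rates above.
    HOURS_PER_MONTH = 730  # approximate hours in a month

    def estimate_monthly_cost(machines, storage_gb, writes, reads, gb_in, gb_out):
        compute = machines * 0.10 * HOURS_PER_MONTH       # $0.10 per machine-hour
        storage = storage_gb * 0.15                       # $0.15 per GB-month
        io = (writes / 1000.0) * 0.01 + (reads / 1000.0) * 0.001
        bandwidth = gb_in * 0.10 + gb_out * 0.15
        return compute + storage + io + bandwidth

    # e.g. 2 machines, 100 GB stored, 5M writes, 20M reads, 50 GB in, 200 GB out
    print("%.2f USD per month" % estimate_monthly_cost(2, 100, 5000000, 20000000, 50, 200))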

However, if you have time, you can go through the detailed analysis given below.

Amazon:
  • Overview: You can create one or more instances of a virtual machine for processing and for storage
    • You pay based on the time the instances are running, not on how much they are used – if an instance is idle, you still pay for it
    • The facility is available in three physically separate locations (called regions) – US (N. Virginia, N. California) and EU (Ireland)
    • If you shut down the machine instance, or it crashes for whatever reason, you lose all your data
    • It is possible to have a reserved instance (for 1 year or 3 years) for an upfront payment and a discounted usage rate – however, I do not think it provides any guarantee against data loss from a machine crash
    • Data storage can be both relational and non-relational
  • Machine Instance: Virtual machines come in different capacities – Standard (Small, Large, Extra Large), High-Memory (Double Extra Large, Quadruple Extra Large), High-CPU (Medium, Extra Large)
    • Charge for Machine Usage: You are charged for the time you keep the machine instance running – the time is calculated in hours, and any fraction of an hour is billed as a full hour
      • Hourly charges vary from $0.085 (Small – Linux – N. Virginia) to $3.16 (Quadruple Extra Large – Windows – N. California)
      • Both Linux and Windows machine instances are supported – Windows machines are about 40% more expensive – other software charges are extra
    • There are separate charges for mapping IP addresses, for monitoring & auto scaling ($0.015 per instance per hour) and for load balancing
    • A message queue is available (Simple Queue Service – SQS), but again it has a separate charge – $0.10 to $0.17 per GB depending on the total monthly volume
  • Data Persistence: For persistent data storage you can use one of three alternatives – SimpleDB, Simple Storage Service (S3) or Relational Database Service (RDS)
    • SimpleDB and S3 are not RDBMS storage mechanisms – that is, you do not have tables, so you cannot retrieve records using JOINs
    • RDS is an instance of MySQL – so you can use it like a normal RDBMS
    • Charges for SimpleDB: You pay separately for CPU, disk space and data transfer – though up to a limit they are free (25 CPU hours, 1 GB of data transfer, 1 GB of storage)
      • CPU usage is normalized to a 1.7 GHz Xeon (2007) processor and works out to $0.14 to $0.154 per hour depending on location
      • Data transfer in is free until June 2010, and the charge for transfer out is between $0.10 and $0.17 per GB depending on the total monthly volume
      • Actual storage is charged at $0.25 to $0.275 per GB per month – this includes 45 bytes of overhead for each item uploaded
    • Charges for S3: You are charged for disk space, data transfer and the number of requests made, instead of CPU usage – the data transfer charges are the same
      • The storage charge varies from $0.055 to $0.165 per GB per month, making it slightly cheaper than SimpleDB, but only at a higher level of usage (more than 1,000 TB)
      • I/O requests are charged separately – you pay $0.01 to $0.011 per 1,000 write requests and $0.01 to $0.011 per 10,000 read requests – deletes are free
    • Charges for RDS: You pay for storage, I/O requests, data transfer and the machine instance (Small, Large, Extra Large, Double Extra Large, Quadruple Extra Large) based on usage
      • You pay for the RDS instance – charges vary from $0.11 to $3.10 per hour depending on the instance size
      • The storage charge is not pay-as-you-use – you have to decide the size in advance (5 GB to 1 TB) and the charge is $0.10 per GB per month
      • There is no charge for backup up to the amount of storage you have chosen, but you pay $0.15 per GB per month for extra backup
      • You pay separately for I/O at $0.10 per 1 million I/O requests
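
To make the Amazon billing rules concrete, here is a minimal, illustrative sketch: it rounds running time up to whole hours as described above, and adds S3 storage and request charges. The instance type, region and usage figures are hypothetical, and the S3 storage rate uses the rule-of-thumb figure rather than an exact volume tier.

    import math

    # Hypothetical month on Amazon: one Small Linux instance (N. Virginia) plus S3.
    def ec2_billable_hours(running_seconds):
        # Any fraction of an hour is billed as a full hour
        return math.ceil(running_seconds / 3600.0)

    instance = ec2_billable_hours(500 * 3600 + 1) * 0.085   # 500 h + 1 s -> 501 billable hours
    s3_storage = 40 * 0.15                                  # 40 GB-month at the rule-of-thumb rate
    s3_requests = (200000 / 1000.0) * 0.01 + (2000000 / 10000.0) * 0.01   # writes per 1,000, reads per 10,000

    print(instance + s3_storage + s3_requests)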

Google:
  • Overview: Applications written in Python or Java can be deployed directly – the runtime supports only a subset of the standard language libraries
    • No need to instantiate any virtual machine
    • You are charged on the actual normalized CPU cycles used
    • Storage is only non-relational
    • Charges are calculated on these parameters – bandwidth, CPU, storage and emails sent
    • You have a free quota for each of these parameters – it is enough for development, testing and small deployments
    • There are limits imposed on peak usage for many different parameters – with daily limits and limits on usage in a burst
    • You will need to rewrite your application to work on Google App Engine – see this
  • Charge for CPU usage: It is calculated in CPU-seconds normalized to a 1.2 GHz Intel x86 processor
    • You pay $0.10 per hour of CPU usage for processing requests
    • 6.5 hours of CPU time is free
    • You do not pay for CPU idle time
  • Charge for storage: Only non-relational storage is available
    • You pay $0.15 per GB per month – the size includes overhead, metadata and the storage required for indexes
    • It includes data stored in the datastore, memcache and blobstore
    • You pay for the CPU used for data I/O at $0.10 per hour
    • 60 hours of CPU time for data I/O is free
    • Up to 1 GB of storage is free (the FAQ page says that it is 500 MB)
    • You are charged every day at $0.005 per GB per day after subtracting your free quota
  • Charge for bandwidth usage: Inward and outward bandwidth usage is charged at different rates
    • You pay $0.10 per GB for incoming traffic
    • You pay $0.12 per GB for outgoing traffic
    • 1 GB of incoming traffic and 1 GB of outgoing traffic is free
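
As an illustration of how the free quotas interact with the rates above, here is a minimal sketch. It assumes the free quotas are applied daily (as App Engine's quotas were); all usage figures in the example call are hypothetical.

    # Hypothetical day on Google App Engine using the rates and free quotas above.
    def gae_daily_cost(cpu_hours, io_cpu_hours, stored_gb, gb_in, gb_out):
        cost = max(cpu_hours - 6.5, 0) * 0.10        # request CPU beyond the 6.5 free hours
        cost += max(io_cpu_hours - 60, 0) * 0.10     # data-I/O CPU beyond the 60 free hours
        cost += max(stored_gb - 1, 0) * 0.005        # storage beyond 1 GB free, billed per day
        cost += max(gb_in - 1, 0) * 0.10             # incoming bandwidth beyond 1 GB free
        cost += max(gb_out - 1, 0) * 0.12            # outgoing bandwidth beyond 1 GB free
        return cost

    print(gae_daily_cost(cpu_hours=10, io_cpu_hours=70, stored_gb=20, gb_in=3, gb_out=8))
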
Microsoft:
  • Overview: The offering has three main parts – Windows Azure, SQL Azure and AppFabric
    • The details available on the Microsoft site are more about the vision of the product than about what is implemented here and now
    • However, the document "Introducing Windows Azure" is good
    • It uses Hyper-V for virtualization – it works more like Amazon than like Google
    • There is an introductory offer under which the service can be used for free
    • The development environment is Visual Studio, through an SDK
    • The emphasis is on creating applications which run partly on premise and partly in the cloud
    • Microsoft wants to keep the programming model as unaltered as possible – see this
  • Charge for CPU usage: It is calculated in CPU-seconds normalized to a 1.2 GHz Intel x86 processor
    • You pay $0.12 per hour of CPU usage for processing requests
  • Charge for storage: Only non-relational storage is available
    • You pay $0.15 per GB per month
    • Storage transactions are charged separately at $0.01 per 10,000 transactions
  • Charge for bandwidth usage: Inward and outward bandwidth usage is charged at different rates
    • You pay $0.10 per GB for incoming traffic – the rate for Asia is different, at $0.30 per GB
    • You pay $0.15 per GB for outgoing traffic – the rate for Asia is different, at $0.45 per GB
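
A similar minimal sketch for Windows Azure, using the rates above. In this price list the only regional difference is bandwidth, so the sketch takes the region as a parameter; all usage figures are hypothetical.

    # Hypothetical month on Windows Azure at the rates quoted above.
    BANDWIDTH_RATES = {"us_eu": (0.10, 0.15), "asia": (0.30, 0.45)}   # ($/GB in, $/GB out)

    def azure_monthly_cost(compute_hours, stored_gb, transactions, gb_in, gb_out, region="us_eu"):
        rate_in, rate_out = BANDWIDTH_RATES[region]
        return (compute_hours * 0.12                  # compute at $0.12 per hour
                + stored_gb * 0.15                    # storage at $0.15 per GB-month
                + (transactions / 10000.0) * 0.01     # $0.01 per 10,000 storage transactions
                + gb_in * rate_in + gb_out * rate_out)

    print(azure_monthly_cost(730, 50, 2000000, 20, 100))                  # US/EU bandwidth rates
    print(azure_monthly_cost(730, 50, 2000000, 20, 100, region="asia"))   # Asia bandwidth rates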

Looking at the complexity of the pricing, I see great prospects for anybody who specializes in optimizing applications for the cloud – unlike traditional applications, any improvement in a cloud application can be directly measured in $$$ saved.

More Stories By Udayan Banerjee

Udayan Banerjee is CTO at NIIT Technologies Ltd, an IT industry veteran with more than 30 years' experience. He blogs at http://setandbma.wordpress.com.
The blog focuses on emerging technologies like cloud computing, mobile computing and social media (aka Web 2.0). It also covers agile methodology and trends in architecture. It is a world view seen through the lens of a software services provider based out of Bangalore and serving clients across the world. The focus is mostly on...

  • Keep the hype out and project a realistic picture
  • Uncover trends that are not readily apparent
  • Draw conclusions from real-life experience
  • Point out fallacies & discrepancies when I see them
  • Talk about trends which I find interesting

Most Recent Comments
tom.eberhard 01/25/10 03:58:00 PM EST

From what I've read, Azure offers relational storage in SQL Azure.
Look at these two different links for example.

Please clarify your article.
Sincerely,
Tom Eberhard.
