Windows Azure Blob Storage vs. Amazon S3

I am happy to see the result, because the more cloud service providers are available, such as Amazon S3 and Windows Azure Storage, the more choices consumers have.

This week, Gladinet Cloud Desktop reached version 1.4.2 (build 232). Windows Azure Blob Storage has been officially added to the list of supported cloud storage providers (Amazon S3, AT&T, Google, etc.).

This means you can map a network drive to Windows Azure Blob Storage and start using it from Windows Explorer (see this link for detailed how-to information).

As always, every time we add a new storage provider integration, we compare it to the existing ones, such as Amazon S3.

Since Gladinet Cloud Desktop can map both as virtual folders in a network drive, a quick drag-and-drop upload/download experiment should be easy.

Some time ago, I did a simple comparison between EMC Atmos onLine and Amazon S3, and another between AT&T Synaptic Storage and Amazon S3.

The performance data were pretty much in line with one another, with the bottleneck on the server end (in the cloud). I would expect Azure Storage to be in line as well. Let's begin...

The test is simple: I take a 27 MB zip file, drag and drop it into a folder under Amazon S3, watch the upload progress in the Gladinet Task Manager, and time it from beginning to end. Then I repeat the same thing with Windows Azure Blob Storage.
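
For anyone who wants to reproduce a similar timing without Gladinet, here is a minimal sketch using the standard Python SDKs (boto3 and azure-storage-blob). The bucket, container, file name and credentials below are placeholders rather than the ones used in this test, and the original experiment was done purely by drag-and-drop in Windows Explorer.

```python
# Rough timing sketch (assumptions: boto3 and azure-storage-blob are installed,
# credentials are configured in the environment, and the names below are
# placeholders -- the original test used Gladinet's mapped drive instead).
import time

import boto3
from azure.storage.blob import BlobServiceClient

FILE_PATH = "test-27mb.zip"  # the ~27 MB zip file being timed


def time_s3_upload(bucket: str, key: str) -> float:
    """Upload the file to Amazon S3 and return the elapsed seconds."""
    s3 = boto3.client("s3")
    start = time.perf_counter()
    s3.upload_file(FILE_PATH, bucket, key)
    return time.perf_counter() - start


def time_azure_upload(conn_str: str, container: str, blob_name: str) -> float:
    """Upload the file to Azure Blob Storage and return the elapsed seconds."""
    blob = BlobServiceClient.from_connection_string(conn_str) \
        .get_blob_client(container=container, blob=blob_name)
    start = time.perf_counter()
    with open(FILE_PATH, "rb") as data:
        blob.upload_blob(data, overwrite=True)
    return time.perf_counter() - start


if __name__ == "__main__":
    print("Amazon S3:  %.0f seconds" % time_s3_upload("my-test-bucket", "test-27mb.zip"))
    print("Azure Blob: %.0f seconds" % time_azure_upload(
        "<azure-storage-connection-string>", "my-test-container", "test-27mb.zip"))
```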

Amazon S3: 176 seconds
Windows Azure Blob Storage: 175 seconds
(Upload speed: ~154 KBytes/sec)

It is very interesting that the results are so close, given that the two cloud storage providers are completely different: the servers hosting the data are different and the IP routes are different. There must be some kind of server-side mechanism that throttles a single connection to a certain speed. Maybe an expert on the cloud server side can explain this.

A speed test shows my upload speed to a random server in the US is 3.97 Mbit/s (~490 KBytes/sec).

So my upload pipe is much wider than 154 KBytes/sec, meaning my computer and my internet connection are not the bottleneck.
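
For reference, here is a quick back-of-the-envelope check of those numbers (a sketch only; the 154 KBytes/sec figure comes from the Gladinet Task Manager, which measures the whole transfer including overhead):

```python
# Back-of-the-envelope check of the figures quoted above.
file_size_kbytes = 27 * 1024      # ~27 MB zip file
upload_seconds = 176              # measured Amazon S3 upload time

observed = file_size_kbytes / upload_seconds
print("Observed rate:  ~%.0f KBytes/sec" % observed)       # ~157, close to the reported 154

line_capacity = 3.97 * 1000 / 8   # 3.97 Mbit/s from the speed test
print("Line capacity:  ~%.0f KBytes/sec" % line_capacity)  # ~496, i.e. roughly 490 KBytes/sec
```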

I am happy to see this result, because the more cloud service providers are available, such as Amazon S3, Windows Azure Storage and AT&T Synaptic Storage, the more choices consumers like myself have.

Go Cloud!

Grab a copy of the Gladinet Cloud Desktop here and try it yourself.

More Stories By Jerry Huang

Jerry Huang, an engineer and entrepreneur, founded Gladinet with his close friends and is pursuing his interests in cloud computing. He publishes articles on the company blog and follows up on the company's Twitter activities. He graduated from the University of Michigan in 1998 and has lived in West Palm Beach, Florida, ever since.
