
Cloud Developers, Developers, Developers - Amazon Revolutionizes IT

Amazon is on track to become a leading player in the enterprise software industry

In an interesting article today, Matt Asay argues that Amazon may soon emerge, surprisingly, as a leading player in the enterprise software industry, not just a low-end cloud hardware provider. Stephen O'Grady of RedMonk has written on the same theme, charting the rise of AWS and its ongoing expansion into many areas of the software industry.

Many big IT vendors are putting out the idea that Amazon is just a "retailer" that will quickly be overtaken by the established vendors. That is a big mistake on their part. AWS is winning because it exploits three big factors: commodity hardware pricing, open source software, and startups. In the modern software world, where the game is "Developers, Developers, Developers", that is a strong winning hand. Others can compete, but they will have to compete on all three fronts, or change the game. I, for one, have been waiting for the big IT vendors and/or Google to change the game for several years, but with every month that goes by with AWS way out in front, changing the game in a decisive way becomes more difficult for them.

Most innovation in enterprise software today is coming from open source projects and from a new generation of enterprise software startups taking a "cloud-first" approach. In both cases, the developers involved really like what AWS offers, and the offering just gets better every month.

At Cloudscale, we've built the world's first Realtime Data Warehouse, and we've built it on AWS first. If you want it in-house on your own physical hardware, we can certainly provide that, but before you decide you should check it out on standard AWS and on dedicated AWS clusters with 10GigE networking and high-performance instances. These clusters are now even being used for supercomputing workloads in research labs, so they're the real deal in terms of performance.

AWS has allowed us to quickly build a major next-generation enterprise software system that enables business users and IT teams to easily design and launch realtime, massively parallel big data analytics apps, and to do it all in the cloud. No hardware, no software, no hassle. Of course, if you really must "get physical" and go in-house (or "private cloud" as it's euphemistically called) then you can, but over time (although not overnight) the reasons for doing that will gradually fade away.
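The "massively parallel" part of apps like these boils down to a familiar scatter-gather pattern: partition the data, process the partitions concurrently, then merge the partial results. As a toy sketch only (this shows the generic pattern, not Cloudscale's actual engine), here it is as a parallel word count in Python:

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor

def count_words(chunk):
    """Map step: word counts for one partition of the input."""
    return Counter(chunk.split())

def parallel_word_count(chunks, workers=4):
    """Scatter partitions across workers, then merge (reduce) the partial counts.

    Threads keep this sketch portable and self-contained; a real engine
    would scatter across many machines, not threads in one process.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(count_words, chunks)
    total = Counter()
    for partial in partials:
        total += partial
    return total

if __name__ == "__main__":
    data = ["aws aws cloud", "cloud data", "aws data data"]
    print(parallel_word_count(data))  # Counter({'aws': 3, 'data': 3, 'cloud': 2})
```

The same map-then-merge shape scales from one laptop to a 10GigE cluster; what changes is only the transport between the scatter and gather steps.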

Up until now, Amazon has mainly used standard, widely available open source software in its cloud offerings. They know, however, that to move to the next level as an enterprise software company they will also have to develop or acquire a number of new proprietary software products that can continue to give them an unfair advantage in the cloud space. As their main competitors see the same scenario playing out, with Amazon threatening to move further and further out in front, we can expect a sudden rush by Amazon, Microsoft, IBM, Google, Oracle, Salesforce, EMC, and SAP to acquire the cloud software companies whose products can give their clouds a massive proprietary edge.

More Stories By Bill McColl

Bill McColl left Oxford University to found Cloudscale. At Oxford he was Professor of Computer Science, Head of the Parallel Computing Research Center, and Chairman of the Computer Science Faculty. Along with Les Valiant of Harvard, he developed the BSP approach to parallel programming. He has led research, product, and business teams in a number of areas: massively parallel algorithms and architectures, parallel programming languages and tools, datacenter virtualization, realtime stream processing, big data analytics, and cloud computing. He lives in Palo Alto, CA.
