Amazon Web Services Database in the Cloud

A new web service that makes it easy to set up, operate, and scale relational databases in the cloud

Amazon Cloud Journal

Amazon Web Services LLC, an Amazon.com company (NASDAQ: AMZN), today introduced Amazon Relational Database Service (Amazon RDS), a new web service that makes it easy to set up, operate, and scale relational databases in the cloud. Amazon RDS provides cost-efficient and resizable capacity while automating time-consuming database administration tasks, freeing users to focus on their application and their business. As with all Amazon Web Services, there are no up-front investments required, and you pay only for the resources you use. Also announced today, AWS has lowered prices and introduced a new family of High-Memory instances for Amazon EC2. To get started using Amazon RDS, and other Amazon Web Services, visit http://aws.amazon.com.

“For almost two years, many AWS customers have taken advantage of the simplicity, reliability, and seamless scalability that Amazon SimpleDB provides; however, many customers have told us that their applications require a relational database. That’s why we built Amazon RDS, which combines a familiar relational database with automated management and the instant scalability of the AWS cloud,” said Adam Selipsky, Vice President, Amazon Web Services.

Amazon RDS provides a fully featured MySQL database, so the code, applications, and tools that developers use today with their existing MySQL databases work seamlessly with Amazon RDS. The service automatically handles common database administration tasks such as setup and provisioning, patch management, and backup, storing backups for a user-defined retention period. Customers also have the flexibility to scale the compute and storage resources associated with their database instance through a simple API call. Amazon RDS is easy to deploy and simple to manage.
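Because Amazon RDS exposes a standard MySQL endpoint, pointing existing application code at it is typically just a matter of changing connection details. The sketch below illustrates this with a hypothetical RDS endpoint hostname; everything else about the connection string stays the same.

```python
# Sketch: existing MySQL client code only needs a new hostname to use
# Amazon RDS. The RDS endpoint below is hypothetical; the service
# assigns one per DB instance.

def mysql_dsn(user, password, host, port=3306, database=""):
    """Build a standard MySQL connection URL."""
    return f"mysql://{user}:{password}@{host}:{port}/{database}"

# Self-managed database:
local = mysql_dsn("appuser", "secret", "db01.internal", database="shop")

# The same application code against Amazon RDS (hypothetical endpoint):
rds = mysql_dsn("appuser", "secret",
                "mydb.example.us-east-1.rds.amazonaws.com",
                database="shop")
```

Queries, schema definitions, and MySQL tooling are unaffected; only the host (and credentials) differ between the two deployments.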

“I found Amazon RDS to be a very efficient way to deploy MySQL, and a natural fit for cloud-based application deployment. The instance is up and running in minutes, and very sensible defaults are baked in. The APIs provide streamlined administration, with an ability to programmatically automate administration functions, which is a key feature in cloud-based applications,” said David Tompkins, Sr. Computer Scientist at Adobe Systems Advanced Technology Labs. “Most importantly, Amazon RDS provides pain-free scalability, which is typically one of the most time-consuming and expensive aspects of database deployment.”

“We started using Amazon RDS to store metadata for each and every publisher, advertiser, and creative we serve through the system,” said Michael Lugassy, Founder and CEO of Kehalim, an advertising optimization and monetization platform. “After noticing a big performance improvement, we decided to use Amazon RDS to track all of our impression, click, and earnings data as well. The results were amazing and freed us from the need to run our own MySQL instances. Amazon RDS allows us to focus on frontend features rather than backend database complexity.”

“Our customers have been clamoring for a MySQL option as part of the Heroku platform, so we were thrilled to learn about Amazon RDS,” said Morten Bagai, Director of Business Development at Heroku, a Ruby Platform as a Service Provider. “Amazon Web Services has made it painless to provision and manage a MySQL database. Based on our testing, we expect Amazon RDS to be a very popular database option for our customers.”

Separately, AWS is also lowering prices on all Amazon EC2 On-Demand compute instances, effective November 1st. Charges for Linux-based instances will drop 15 percent: a small Linux instance will now cost just 8.5 cents per hour, down from the previous price of 10 cents per hour.
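The quoted figures are consistent: a 15 percent cut on the previous 10-cents-per-hour rate works out to exactly 8.5 cents per hour, as the short check below shows (the monthly figure assumes a hypothetical 720-hour month for illustration).

```python
# Checking the EC2 price cut quoted above: 15% off the previous
# 10-cents-per-hour small Linux instance rate.
old_rate = 0.10                    # USD per hour
new_rate = old_rate * (1 - 0.15)   # 0.085 USD per hour, i.e. 8.5 cents

# Illustrative monthly cost for an always-on instance (720 hours):
monthly = new_rate * 720           # 61.20 USD
```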

Along with today’s announcements, AWS is also introducing a new family of High-Memory Instances for Amazon EC2. This new instance family further expands the available selection of computing configurations for Amazon EC2, helping customers to choose the CPU capacity, memory resources, and networking throughput that their applications require. High-Memory Instances are designed to be used with memory-intensive workloads such as databases, caching, and rendering, and are optimized for low-latency, high-throughput performance.

With the addition of Amazon RDS and Amazon EC2 High-Memory Instances, AWS now provides customers with a multitude of cloud database alternatives. A summary of AWS database options is provided below:

Amazon RDS

For customers whose applications require relational storage, but who want to reduce the time spent on database management, Amazon RDS automates common administrative tasks to reduce complexity and total cost of ownership. Amazon RDS automatically backs up a customer’s database and maintains the database software, allowing customers to spend more time on application development. With the native database access Amazon RDS provides, customers get the programmatic familiarity, tooling, and application compatibility of a traditional RDBMS. Customers also benefit from the flexibility of being able to scale the compute resources or storage capacity associated with a relational database instance via a single API call.

With Amazon RDS, customers still control the database settings that are specific to their business (including the schema, indices, and performance tuning). Customers also take an active role in the scaling decisions for their database – they tell the service when they want to add more storage or change to a larger or smaller DB Instance class.

Amazon RDS is recommended for customers who:

  • Have existing or new applications, code, or tools that require a relational database
  • Want native access to a MySQL relational database, but prefer to offload the infrastructure management and database administration to AWS
  • Like the flexibility of being able to scale their database compute and storage resources with an API call, and only pay for the infrastructure resources they actually consume
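The "single API call" scaling described above can be sketched as follows. The function assembles the parameters of an RDS ModifyDBInstance request; the instance identifier and target sizes are hypothetical, and in practice a client library or signed HTTPS request would carry these parameters to the service.

```python
# Sketch: assembling the parameters for an RDS ModifyDBInstance call,
# the API action behind "scale compute and storage with one call".
# Instance name and sizes below are hypothetical examples.

def modify_db_instance_params(instance_id, db_instance_class=None,
                              allocated_storage_gb=None):
    """Assemble request parameters for an RDS ModifyDBInstance action."""
    params = {"Action": "ModifyDBInstance",
              "DBInstanceIdentifier": instance_id}
    if db_instance_class:
        params["DBInstanceClass"] = db_instance_class
    if allocated_storage_gb:
        params["AllocatedStorage"] = str(allocated_storage_gb)
    return params

# Grow storage to 50 GB and move to a larger DB Instance class:
request = modify_db_instance_params("mydb", "db.m1.large", 50)
```

Because scaling is expressed as a parameterized request rather than a hardware change, it can be automated from application code, which is the property the Adobe quote above highlights.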

Amazon SimpleDB

For database implementations that do not require a relational model, and that principally demand index and query capabilities, Amazon SimpleDB eliminates the administrative overhead of running a highly available production database, and is unbound by the strict requirements of an RDBMS. With Amazon SimpleDB, customers store and query data items via simple web services requests, and Amazon SimpleDB does the rest. In addition to handling infrastructure provisioning, software installation, and maintenance, Amazon SimpleDB automatically indexes customers’ data, creates geo-redundant replicas of the data to ensure high availability, and performs database tuning on customers’ behalf. Amazon SimpleDB also provides no-touch scaling: there is no need to anticipate and respond to changes in request load or database utilization, as the service simply responds to traffic as it comes and goes, charging only for the resources consumed. Finally, Amazon SimpleDB doesn’t enforce a rigid schema for data. This gives customers flexibility: if their business changes, they can easily reflect these changes in Amazon SimpleDB without any schema updates or changes to the database code.

Amazon SimpleDB is recommended for customers who:

  • Principally utilize index and query functions rather than more complex relational database functions
  • Don’t want any administrative burden at all in managing their structured data
  • Want a service that scales automatically up or down in response to demand, without user intervention
  • Require the highest possible availability
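The schemaless model described above can be illustrated with a small in-memory sketch: domains hold named items, items hold attribute name/value pairs, and no schema is declared up front. This mimics the behavior the text describes; it is not the SimpleDB API itself.

```python
# Illustrative in-memory model of SimpleDB's schemaless data model.
# Not the real service API -- just a sketch of the semantics.

class Domain:
    def __init__(self):
        self.items = {}

    def put_attributes(self, item_name, attributes):
        # New attributes can appear on any item at any time; no schema
        # update is needed when the business changes.
        self.items.setdefault(item_name, {}).update(attributes)

    def select(self, attr, value):
        """Return names of items whose attribute matches a value."""
        return [name for name, attrs in self.items.items()
                if attrs.get(attr) == value]

products = Domain()
# Two items with entirely different attribute sets in one domain:
products.put_attributes("item1", {"category": "books", "title": "SICP"})
products.put_attributes("item2", {"category": "music", "format": "CD"})
```

Note that `item1` and `item2` carry different attributes, yet both are queryable, which is the flexibility the paragraph above attributes to SimpleDB.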

Amazon EC2 - Relational Database AMIs

Developers may use a number of leading relational databases on Amazon EC2. An Amazon EC2 instance can be used to run a database, and the data can be stored reliably on an Amazon Elastic Block Store (Amazon EBS) volume. Amazon EC2 provides a variety of instance sizes for developers to match the resource needs of their database, including the newly released High-Memory Instance types, which are specifically optimized for latency-sensitive, I/O-intensive workloads. Amazon EBS is a fast and reliable persistent storage feature of Amazon EC2. By designing, building, and managing their own relational database on Amazon EC2, developers avoid the friction of provisioning and scaling their own infrastructure, while gaining access to a variety of standard database engines over which they can exert full administrative control. Available AMIs include IBM DB2, Microsoft SQL Server, MySQL, Oracle, PostgreSQL, Sybase, and Vertica.

Amazon EC2 Relational Database AMIs are recommended for customers who:

  • Wish to select from a wide variety of database engines
  • Want to exert complete administrative control over their database server

More Stories By Salvatore Genovese

Salvatore Genovese is a Cloud Computing consultant and an i-technology blogger based in Rome, Italy. He occasionally blogs about SOA, start-ups, mergers and acquisitions, open source and bleeding-edge technologies, companies, and personalities. Sal can be reached at hamilton(at)sys-con.com.
