
Amazon Web Services Database in the Cloud

A new web service that makes it easy to set up, operate, and scale relational databases in the cloud

Amazon Cloud Journal

Amazon Web Services LLC, an Amazon.com company (NASDAQ: AMZN), today introduced Amazon Relational Database Service (Amazon RDS), a new web service that makes it easy to set up, operate, and scale relational databases in the cloud. Amazon RDS provides cost-efficient and resizable capacity while automating time-consuming database administration tasks, freeing users to focus on their application and their business. As with all Amazon Web Services, there are no up-front investments required, and you pay only for the resources you use. Also announced today, AWS has lowered prices and introduced a new family of High-Memory instances for Amazon EC2. To get started using Amazon RDS, and other Amazon Web Services, visit http://aws.amazon.com.

“For almost two years, many AWS customers have taken advantage of the simplicity, reliability, and seamless scalability that Amazon SimpleDB provides; however, many customers have told us that their applications require a relational database. That’s why we built Amazon RDS, which combines a familiar relational database with automated management and the instant scalability of the AWS cloud,” said Adam Selipsky, Vice President, Amazon Web Services.

Amazon RDS provides a fully featured MySQL database, so the code, applications, and tools that developers use today with their existing MySQL databases work seamlessly with Amazon RDS. The service automatically handles common database administration tasks such as setup and provisioning, patch management, and backup, storing backups for a user-defined retention period. Customers can also scale the compute and storage resources associated with their database instance through a simple API call. Amazon RDS is easy to deploy and simple to manage.
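To make the compatibility claim concrete, here is a minimal sketch of what "existing MySQL tools work seamlessly" means in practice: the application only changes the hostname it connects to. The endpoint, credentials, and database names below are hypothetical placeholders, not real RDS values.

```python
# Sketch: moving an application to Amazon RDS typically changes only the
# connection endpoint. Hostnames and credentials here are hypothetical.

def mysql_dsn(host, user, password, database, port=3306):
    """Build a standard MySQL connection URL; any MySQL driver accepts this form."""
    return f"mysql://{user}:{password}@{host}:{port}/{database}"

# Before: a self-managed MySQL server
on_prem = mysql_dsn("db01.internal.example.com", "app", "s3cret", "orders")

# After: the same application code, pointed at an RDS instance endpoint
rds = mysql_dsn("mydb.abc123xyz.us-east-1.rds.amazonaws.com", "app", "s3cret", "orders")

print(rds)
```

Everything above the endpoint (queries, schema, client libraries) stays as it was.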

“I found Amazon RDS to be a very efficient way to deploy MySQL, and a natural fit for cloud-based application deployment. The instance is up and running in minutes, and very sensible defaults are baked in. The APIs provide streamlined administration, with an ability to programmatically automate administration functions, which is a key feature in cloud-based applications,” said David Tompkins, Sr. Computer Scientist at Adobe Systems Advanced Technology Labs. “Most importantly, Amazon RDS provides pain-free scalability, which is typically one of the most time-consuming and expensive aspects of database deployment.”

"We started using Amazon RDS to store metadata for each and every publisher, advertiser and creative we serve through the system,” said Michael Lugassy, Founder and CEO of Kehalim, an advertising optimization and monetization platform. “After noticing a big performance improvement, we decided to use Amazon RDS to track all of our impression, click, and earnings data as well. Results were amazing and freed us from the need to run our own MySQL instances. Amazon RDS allows us to focus on frontend features, rather than backend database complexity."

“Our customers have been clamoring for a MySQL option as part of the Heroku platform, so we were thrilled to learn about Amazon RDS,” said Morten Bagai, Director of Business Development at Heroku, a Ruby Platform as a Service Provider. “Amazon Web Services has made it painless to provision and manage a MySQL database. Based on our testing, we expect Amazon RDS to be a very popular database option for our customers.”

Separately, AWS is also lowering prices on all Amazon EC2 On-Demand compute instances, effective November 1st. Charges for Linux-based instances will drop 15%: a small Linux instance will now cost just 8.5 cents per hour, down from the previous price of 10 cents per hour.
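The quoted figures are easy to verify: a 15% cut on the previous $0.10/hour rate yields the new 8.5-cent rate. The 30-day figure below is an illustrative round-the-clock estimate, not a quoted price.

```python
# Checking the quoted price drop: 15% off the previous small-instance rate.
old_rate = 0.10                      # USD per instance-hour (previous price)
new_rate = old_rate * (1 - 0.15)     # 15% reduction -> 0.085, i.e. 8.5 cents

# Rough cost of running one small Linux instance around the clock for 30 days
hours = 24 * 30
monthly = new_rate * hours

print(f"new rate: ${new_rate:.3f}/hour, ~${monthly:.2f}/month")
```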

Along with today’s announcements, AWS is also introducing a new family of High-Memory Instances for Amazon EC2. This new instance family further expands the available selection of computing configurations for Amazon EC2, helping customers to choose the CPU capacity, memory resources, and networking throughput that their applications require. High-Memory Instances are designed to be used with memory-intensive workloads such as databases, caching, and rendering, and are optimized for low-latency, high-throughput performance.

With the addition of Amazon RDS and Amazon EC2 High-Memory Instances, AWS now provides customers with a multitude of cloud database alternatives. A summary of AWS database options is provided below:

Amazon RDS

For customers whose applications require relational storage, but want to reduce the time spent on database management, Amazon RDS automates common administrative tasks to reduce complexity and total cost of ownership. Amazon RDS automatically backs up a customer’s database and maintains the database software, allowing customers to spend more time on application development. With the native database access Amazon RDS provides, customers get the programmatic familiarity, tooling and application compatibility of a traditional RDBMS. Customers also benefit from the flexibility of being able to scale the compute resources or storage capacity associated with a Relational Database Instance via a single API call.

With Amazon RDS, customers still control the database settings that are specific to their business (including the schema, indices, and performance tuning). Customers also take an active role in the scaling decisions for their database – they tell the service when they want to add more storage or change to a larger or smaller DB Instance class.
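The "change to a larger or smaller DB Instance class" decision described above is a single API call. The sketch below builds such a request in the style of the modern boto3 SDK; the instance identifier, class name, and storage size are hypothetical examples, and the actual client call is shown only in a comment so the sketch stays self-contained.

```python
# Sketch: scaling an RDS instance's compute and storage via one API call.
# Identifier, instance class, and storage size are hypothetical examples.
scale_up_request = {
    "DBInstanceIdentifier": "mydb",
    "DBInstanceClass": "db.m1.large",  # move to a larger compute class
    "AllocatedStorage": 100,           # grow storage to 100 GB
    "ApplyImmediately": True,          # apply now rather than at the next window
}

# With the boto3 SDK this request would be sent as:
#   import boto3
#   boto3.client("rds").modify_db_instance(**scale_up_request)

print(scale_up_request["DBInstanceClass"])
```

Shrinking the instance is the same call with a smaller class name.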

Amazon RDS is recommended for customers who:

  • Have existing or new applications, code, or tools that require a relational database
  • Want native access to a MySQL relational database, but prefer to offload the infrastructure management and database administration to AWS
  • Like the flexibility of being able to scale their database compute and storage resources with an API call, and only pay for the infrastructure resources they actually consume

Amazon SimpleDB

For database implementations that do not require a relational model, and that principally demand index and query capabilities, Amazon SimpleDB eliminates the administrative overhead of running a highly available production database, and is unbound by the strict requirements of an RDBMS. With Amazon SimpleDB, customers store and query data items via simple web services requests, and Amazon SimpleDB does the rest. In addition to handling infrastructure provisioning, software installation and maintenance, Amazon SimpleDB automatically indexes customers’ data, creates geo-redundant replicas of the data to ensure high availability, and performs database tuning on customers’ behalf. Amazon SimpleDB also provides no-touch scaling. There is no need to anticipate and respond to changes in request load or database utilization; the service simply responds to traffic as it comes and goes, charging only for the resources consumed. Finally, Amazon SimpleDB doesn’t enforce a rigid schema for data. This gives customers flexibility – if their business changes, they can easily reflect these changes in Amazon SimpleDB without any schema updates or changes to the database code.
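The schema-less, index-and-query model can be illustrated with a small in-memory analogue (this is an illustration of the data model, not the SimpleDB API): a domain holds named items, each item is a bag of attribute/value pairs, and items need not share attributes.

```python
# Illustration of SimpleDB's data model: a domain holds items, each item is a
# set of attribute/value pairs, and no fixed schema is enforced across items.
domain = {
    "item1": {"category": "book", "title": "Dune", "year": "1965"},
    "item2": {"category": "cd", "artist": "Miles Davis"},  # different attributes: fine
}

def query(domain, attr, value):
    """Return names of items whose attribute matches -- the index-and-query pattern."""
    return [name for name, attrs in domain.items() if attrs.get(attr) == value]

print(query(domain, "category", "book"))
```

Adding a new attribute to future items requires no schema migration; older items simply lack it.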

Amazon SimpleDB is recommended for customers who:

  • Principally utilize index and query functions rather than more complex relational database functions
  • Don’t want any administrative burden at all in managing their structured data
  • Want a service that scales automatically up or down in response to demand, without user intervention
  • Require the highest possible availability

Amazon EC2 - Relational Database AMIs

Developers may use a number of leading relational databases on Amazon EC2. An Amazon EC2 instance can be used to run a database, and the data can be stored reliably on an Amazon Elastic Block Store (Amazon EBS) volume. Amazon EC2 provides a variety of instance sizes for developers to match the resource needs of their database, including the newly released High-Memory Instance types, which are specifically optimized for latency-sensitive, I/O-intensive workloads. Amazon EBS is a fast and reliable persistent storage feature of Amazon EC2. By designing, building, and managing their own relational database on Amazon EC2, developers avoid the friction of provisioning and scaling their own physical infrastructure while gaining access to a variety of standard database engines over which they can exert full administrative control. Available AMIs include IBM DB2, Microsoft SQL Server, MySQL, Oracle, PostgreSQL, Sybase, and Vertica.

Amazon EC2 Relational Database AMIs are recommended for customers who:

  • Wish to select from a wide variety of database engines
  • Want to exert complete administrative control over their database server
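The self-managed pattern described above has two moving parts: launch an instance from a database AMI, then attach an EBS volume to hold the data files. The sketch below builds both requests in boto3 style; the AMI, volume, and instance IDs are hypothetical placeholders, and the client calls appear only as comments.

```python
# Sketch of the self-managed pattern: run a database on an EC2 instance and
# keep its data on an attached EBS volume. All IDs here are hypothetical.
run_request = {
    "ImageId": "ami-12345678",     # e.g. a MySQL or Oracle database AMI
    "InstanceType": "m2.2xlarge",  # a High-Memory instance suited to DB workloads
    "MinCount": 1,
    "MaxCount": 1,
}
attach_request = {
    "VolumeId": "vol-abcdef01",    # EBS volume that will hold the database files
    "InstanceId": "i-0123abcd",    # the instance launched above
    "Device": "/dev/sdf",          # device name under which the volume appears
}

# With the boto3 SDK:
#   import boto3
#   ec2 = boto3.client("ec2")
#   ec2.run_instances(**run_request)
#   ec2.attach_volume(**attach_request)

print(attach_request["Device"])
```

The database's data directory is then placed on the filesystem created on that EBS device, so the data survives instance replacement.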

More Stories By Salvatore Genovese

Salvatore Genovese is a Cloud Computing consultant and an i-technology blogger based in Rome, Italy. He occasionally blogs about SOA, start-ups, mergers and acquisitions, open source and bleeding-edge technologies, companies, and personalities. Sal can be reached at hamilton(at)sys-con.com.
