Isolated Customer Data: A Better Fit for the Cloud?

The heavy burden of database management has made massive, multi-customer databases the default choice for ISVs with SaaS apps

For all that global business has embraced cloud computing - welcoming its low cost, low barrier to entry and reduced IT burden - not everything about this architectural sea change is working out perfectly. Limitations in security and customization are a growing source of discontent for many current and prospective cloud customers.

Hosted environments typically need to serve similar data to many customers, and for the majority of providers the default way to satisfy that requirement is an amalgamated, "multi-tenant" customer database.

Unfortunately, multi-tenant database architecture is inherently less data-safe than an isolated, per-customer approach. Concerns over trust prevent many organizations from adopting hosted applications. Not every organization is willing or able to accept the risk of letting its data reside in a shared repository. Beyond that, a shared database limits the way that providers can service their customers, forcing them to leave revenue on the table.

Is it time to rethink our default approach to data architecture in the cloud?

If it's been a while since you've performed an objective assessment of your data isolation strategy, the answer is probably yes. Too many organizations settled on an architectural standard early on and stuck with it - paying little attention to subtle shifts in customer demand as the marketplace matured. Worse, they've often based this important decision primarily on what's convenient for the IT department, not what's most beneficial for the business at large.

The Case for Isolation
New ideas are steadily emerging about how best to service customers through the cloud, and these are challenging the status quo. Robust data security and governance policy compliance are two requirements that have long been incompatible with shared databases. But businesses are also beginning to demand flexible service options and data schema customizations that cloud providers often simply can't accommodate.

Still, there are reasons why ISVs tend to prefer a multi-tenant database for their cloud applications. Foremost among these is reducing the management burden, an issue that will be discussed in detail later.

After management, data analysis is an oft-cited advantage of the shared database approach. With all the customer data in one large database, the application provider has a hassle-free way to analyze the data extensively for interrelationships. The data is readily available for a broad range of business analytics purposes.
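
As a rough illustration of that convenience, the sketch below runs a single cross-customer query against a shared, multi-tenant table. It uses SQLite purely as a self-contained stand-in for whatever engine an ISV actually runs, and the table and column names are invented for the example.

```python
import sqlite3

# A shared, multi-tenant schema: every customer's rows live in one table,
# distinguished only by a tenant_id column. (Schema invented for illustration.)
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE orders (
        tenant_id  INTEGER NOT NULL,
        order_date TEXT    NOT NULL,
        amount     REAL    NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "2011-01-05", 120.0), (1, "2011-01-09", 75.5), (2, "2011-01-07", 310.0)],
)

# Cross-customer analysis is a single query over a single database --
# no extraction or aggregation pipeline required.
for tenant_id, total in conn.execute(
    "SELECT tenant_id, SUM(amount) FROM orders GROUP BY tenant_id"
):
    print(tenant_id, total)
```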

A shared database will also minimize hardware costs and make it easier to maximize the utilization of your server resources. Finally, for those ISVs that have been using multi-tenant architectures for years, there will be the additional benefit of familiarity and comfort, which can result in faster development time.

Isolated data architecture, on the other hand, can't provide the same level of simplicity that an amalgamated database can for data analysis. Data analysis is still possible, but it requires the additional step of creating an automated routine to extract the data you need from each database, anonymize it and aggregate it for your analysis. It's an inconvenience to be sure, but not a disastrous one.
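
A minimal sketch of that extra routine, again with SQLite files standing in for per-customer databases (the directory layout, table, and columns are assumptions for illustration): aggregate inside each isolated database, keep only anonymous totals, and then combine the results.

```python
import sqlite3
from pathlib import Path

def extract_anonymized_totals(db_paths):
    """Pull aggregate figures out of each isolated customer database,
    keeping only anonymous totals before combining the results."""
    combined = []
    for path in db_paths:
        conn = sqlite3.connect(path)
        # Aggregate inside each customer's database so only totals --
        # never raw, identifiable rows -- leave it.
        order_count, revenue = conn.execute(
            "SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM orders"
        ).fetchone()
        conn.close()
        combined.append({"orders": order_count, "revenue": revenue})
    return combined

# e.g. one .db file per customer sitting in a data directory
results = extract_anonymized_totals(Path("customer_dbs").glob("*.db"))
print(results)
```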

More important, isolated data architecture has some notable advantages in the win column:

  • Security
    —Eliminates risk of data leakage
  • Governance
    —Simplifies audit process compliance
    —Makes it possible to comply with laws related to physical data storage boundaries
  • Customization
    —Introduces flexibility to customize databases on a customer-by-customer basis
    —Creates opportunities to provide premium services

Next we'll look at each of these benefits in more detail.

Security
In shared databases, massive and damaging data leaks can be caused by surprisingly trivial software bugs. By isolating each customer's data you virtually eliminate the risk of a competitive data breach. Data stored in disparate databases cannot be accidentally leaked to another customer. Even with an application-level error, the worst that can happen is that the company's own information is leaked within the company. For many companies, especially large enterprises, a sense of security, control and ownership over the data is mandatory.
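
To make the "trivial bug" point concrete, the hypothetical sketch below shows how one missing tenant filter in a shared schema returns every customer's rows, while the same mistake against an isolated database can only ever expose that customer's own data (names and schema are invented).

```python
import sqlite3

# --- Shared (multi-tenant) schema ---
shared = sqlite3.connect(":memory:")
shared.execute("CREATE TABLE invoices (tenant_id INTEGER, detail TEXT)")
shared.executemany("INSERT INTO invoices VALUES (?, ?)",
                   [(1, "Acme pricing"), (2, "Rival pricing")])

def invoices_for(conn, tenant_id):
    # BUG: the WHERE tenant_id = ? clause was dropped, so every tenant's
    # rows come back. One missing predicate is all it takes to leak data.
    return conn.execute("SELECT detail FROM invoices").fetchall()

print(invoices_for(shared, 1))  # includes tenant 2's data

# --- Isolated (per-customer) database ---
isolated = sqlite3.connect(":memory:")  # one connection per customer database
isolated.execute("CREATE TABLE invoices (detail TEXT)")
isolated.execute("INSERT INTO invoices VALUES ('Acme pricing')")
# The same buggy query can only return this customer's own rows.
print(isolated.execute("SELECT detail FROM invoices").fetchall())
```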

Governance
Many industries today are subject to rigorous standards (such as HIPAA and SOX for the health care and finance industries, respectively), and these standards often require a formal audit. The regulations are rarely specific about how to architect data stores but when it comes time to do the auditing, a shared database can make it difficult or impossible to present the data in a way that complies with auditors' demands. In some industries, isolation of data is mandatory, and customers will need to be able to show proof of their compliance.

Occasionally, customers are subject to strict limitations on where their data can be stored. These limitations can be internal or part of a regulatory requirement, but when ISVs can't guarantee that data will reside within the physical boundaries mandated by these policies, the sale is lost. In these cases, only a disparate, standalone database will satisfy the requirements. A globally distributed data center - the most common architectural choice for large multi-tenant hosted apps - is a non-starter.
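
As one hedged sketch of how isolated databases make residency guarantees tractable: keep a per-customer record of the jurisdiction its database must live in and route every connection through it. The regions, hosts, and helper function below are hypothetical.

```python
# Hypothetical routing table: each customer's isolated database is pinned
# to a data center inside the jurisdiction its policy requires.
CUSTOMER_REGIONS = {
    "acme":   {"region": "eu-frankfurt", "dsn": "db-eu.example.com/acme"},
    "globex": {"region": "us-east",      "dsn": "db-us.example.com/globex"},
}

def dsn_for(customer):
    """Return the connection string for a customer's region-pinned database."""
    try:
        return CUSTOMER_REGIONS[customer]["dsn"]
    except KeyError:
        raise ValueError(f"No residency mapping for customer {customer!r}")

print(dsn_for("acme"))
```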

Customization
When each customer has a disparate data store, the possibilities for customization and premium services are virtually limitless. For ISVs looking for new ways to monetize their existing assets, premium services can be an exceptional source of accretive revenue. Obviously the types of premium services you might offer will vary based on the application and the customer, but they can include the following (the last two are illustrated with a short sketch after the list):

  • Charging usage fees, by the month or the hour
  • Providing an option to query the database directly for reporting purposes
  • Providing an option for local backups, for added peace of mind
  • Adding custom tables to one customer's database without forcing the change on others
  • Adding custom indexes without negatively impacting other customers' performance
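
The last two items are straightforward when each customer owns a separate database: the schema change simply runs against that one database and no other. A minimal sketch, again using SQLite as a stand-in and invented table names:

```python
import sqlite3

def add_customer_specific_reporting(db_path):
    """Apply a custom table and index to ONE customer's isolated database.
    No other customer's schema -- or query performance -- is affected."""
    conn = sqlite3.connect(db_path)
    conn.executescript("""
        CREATE TABLE IF NOT EXISTS custom_kpi (
            metric   TEXT NOT NULL,
            value    REAL NOT NULL,
            recorded TEXT NOT NULL
        );
        CREATE INDEX IF NOT EXISTS idx_custom_kpi_metric ON custom_kpi (metric);
    """)
    conn.commit()
    conn.close()

# Only the customer paying for the premium feature receives the change.
add_customer_specific_reporting("acme.db")
```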

When considered across the whole spectrum of pros and cons, it quickly becomes clear that the benefits of isolation far outweigh the costs. Or do they? All of these disparate databases have to be managed.

The Management Dilemma
The benefit list for single-tenant data architecture is undoubtedly long. But for ISVs, many of whom are managing thousands of very similar databases, the entire debate can boil down to one issue: management. It's a universal truth that managing one thing is easier than managing many, and databases are no exception.

Sadly, with little or no tooling available to help providers deploy changes to some databases but not others, the business advantages of isolated data architecture are moot for many ISVs. The cost to hire sufficient DBA expertise to manage all of the databases effectively would be extraordinary. The management burden trumps all other considerations.
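
The kind of tooling that paragraph says is missing doesn't have to be elaborate. Below is a hypothetical, minimal sketch of a fleet-deployment routine that applies one schema change to every customer database except those that have opted out (SQLite stand-in; the file layout and opt-out mechanism are assumptions).

```python
import sqlite3
from pathlib import Path

MIGRATION = "ALTER TABLE orders ADD COLUMN currency TEXT DEFAULT 'USD'"

def deploy_to_fleet(db_dir, skip=()):
    """Apply one schema change across many isolated customer databases,
    skipping any customer that has opted out of it."""
    for path in sorted(Path(db_dir).glob("*.db")):
        customer = path.stem
        if customer in skip:
            print(f"skipped  {customer}")
            continue
        conn = sqlite3.connect(path)
        try:
            conn.execute(MIGRATION)
            conn.commit()
            print(f"migrated {customer}")
        except sqlite3.OperationalError as exc:
            # e.g. the column already exists; record it and move on
            print(f"failed   {customer}: {exc}")
        finally:
            conn.close()

deploy_to_fleet("customer_dbs", skip={"acme"})
```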

Interestingly, not all ISVs agree with that logic. Familiarity seems to play a big role in how different data architects view the significance of the management dilemma. In a 2011 survey of North American and European ISVs conducted by Sybase, respondents who said that they already were in the habit of managing many disparate customer databases reported that single-tenant architectures eased management issues rather than making them more complex.

Regardless, the widespread use of multi-tenant databases means an uphill battle for hosted application providers as they adapt to new market pressure for more secure and flexible alternatives. Solving the management dilemma, by providing significant tools and automation for common processes such as provisioning new databases, would tilt the scale drastically in the opposite direction.
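
Provisioning is one of those common processes worth automating end to end. A hedged sketch, with SQLite standing in and an invented baseline schema: stamping out a fresh, isolated database for each new customer becomes a one-call routine.

```python
import sqlite3

# Invented baseline schema that every new customer database starts from.
BASELINE_SCHEMA = """
CREATE TABLE IF NOT EXISTS orders (
    order_date TEXT NOT NULL,
    amount     REAL NOT NULL
);
CREATE TABLE IF NOT EXISTS users (
    name  TEXT NOT NULL,
    email TEXT NOT NULL
);
"""

def provision_customer(customer_name):
    """Stamp out a fresh, isolated database for a new customer from the
    baseline schema -- the sort of routine worth automating end to end."""
    db_file = f"{customer_name}.db"
    conn = sqlite3.connect(db_file)
    conn.executescript(BASELINE_SCHEMA)
    conn.commit()
    conn.close()
    return db_file

print(provision_customer("newco"))
```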

Such automation is hardly a pipe dream. Self-managing and self-tuning databases have existed for decades in niche and embedded environments. A broader-spectrum database solution that can make managing many databases as easy as managing one is close at hand. ISVs that are strategically prepared for the opportunity will reap a competitive advantage in the cloud marketplace of the future.

More Stories By Eric Farrar

Eric Farrar is a senior product manager at Sybase, an SAP company, working on the SQL Anywhere embedded database. He is focused on bringing the power of embedded database technology into the new world of the web and cloud computing environments.
