Applying MaaS to the DaaS (Database as a Service) Contract: An Introduction to the Practice

The Cloud offers a great opportunity to run highly available and scalable databases while decreasing cost, time and risk. We have already introduced how the DaaS life cycle helps in applying best practices when migrating to the Cloud or administering day-by-day Cloud activities [4]. Taking into consideration the risks associated with Cloud contracts, we now introduce a set of best practices that assist organizations in defining the best possible DaaS agreement. These practices help define the regulatory controls that determine when and how applications can be deployed in the Cloud; this matters because Cloud computing platforms are made up not only of components from a variety of vendors but also of a variety of legal jurisdictions (countries, politics, risk management and compliance).

MaaS applied to drawing up the DaaS contract (and to controlling the services)

Applying MaaS can help manage data storage by using location constraints to check where your data is deployed and how it is implemented. Such constraints need to be clearly defined in the contract; persistence and dependencies have to be classified (and regularly updated) in the data model in order to standardize the platform technologies that underpin the service provided. The main obligations that must be stipulated in the DaaS contract are the following:

1. Integrity defined at the model level has to be maintained throughout the service. Monitoring driven by the data model, for example, has to confirm that deployed data matches what is defined in the initial data structure and is classified in the same way;

2. Country location has to be defined in the model partition and regularly monitored and compared. Any mismatch is an infringement of the agreement and must be reconciled under the terms outlined in the SLA;

3. Include and specify the international regulations that both the Provider and the Vendor are responsible for during the service life cycle. In particular, highlight directives containing data-breach rules. The Provider and Vendor remain protected, although any violation constitutes a service penalty, and the data owner must notify both Provider and Vendor in case of a breach;

4. Specify location properties, and not only in terms of country. The site housing machines, racks and so on has to be appropriate (load per square meter, fire safety, flood protection, employee privileges and security service personnel);

5. Identify trust boundaries throughout the IT architecture. Data models and partitions are the right way to define trust boundaries and stewardship so as to prevent unauthorized access and sharing;

6. Include the method used to encrypt data transmitted across the network. If different encryption schemes are used by the Provider/Vendor, specify which is to be used and when. The contract has to include how encryption is applied on multi-tenant storage, and list the rules governing key management;

7. When data has to be deleted, specify that data retention and deletion are the responsibility of the Provider. Following the data model mapping, data has to be destroyed in all the locations defined and monitored. The Provider has to disclose whether data has, for any reason, been copied to other media and then shredded. The contract must include a provision for the customer to request an audit certifying that data has been deleted. This is strategic because it satisfies two important clauses:

7.1) Service Closure: the Provider should not be able to terminate the service at its convenience. Mergers, acquisitions and other unpredictable events cannot stop the service (a clause of irrevocable guarantee of continuous service). If the service has to be shut down, the Provider has the obligation to retain the data (and services) for an acceptable period of time and to migrate them to the new provider without cost. Of course, data retention and unrecoverable deletion after the migration remain the responsibility of the Provider;

7.2) Right to Closure: if the contract's clauses are not respected (value proposition violated, upgrades charged as extras, infrastructure maintenance without appropriate assistance, services not rendered adequately, location security out of order, and so on), you should be able to terminate the contract without penalty. Again, the Provider has the obligation to retain the data (and services) for an acceptable period of time and then to migrate them to the new provider.

8. Models are key to ensuring that logical data segregation and controls remain effective after backup and recovery, and after test-and-compare runs are completed. Include in the contract that a data model is to be used to define the data architecture throughout the data life cycle. MaaS preserves the right to audit, that is, to test that all the agreed clauses are honored: the data models hold that record.
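Several of the obligations above (model-level integrity, country location, trust boundaries) reduce to the same mechanical check: compare what the data model declares with what the provider actually reports, and treat any mismatch as an SLA infringement. A minimal sketch of such a check, in which the partition names, field names and sample data are all hypothetical, not a real DaaS API:

```python
# Hypothetical sketch: verify a provider's deployment report against the
# location and classification constraints declared in the data model.
# All field names and sample data are illustrative.

def find_violations(model, deployment):
    """Return mismatches between the contracted data model
    and the provider's reported deployment."""
    violations = []
    for partition, constraints in model.items():
        reported = deployment.get(partition)
        if reported is None:
            violations.append((partition, "missing from provider report"))
            continue
        if reported["country"] not in constraints["allowed_countries"]:
            violations.append((partition, f"deployed in {reported['country']}"))
        elif reported["classification"] != constraints["classification"]:
            violations.append((partition, "classification drift"))
    return violations

# What the contract's data model declares (illustrative).
model = {
    "customers": {"allowed_countries": {"DE", "FR"}, "classification": "personal"},
    "orders": {"allowed_countries": {"DE"}, "classification": "internal"},
}
# What the provider reports at audit time (illustrative).
deployment = {
    "customers": {"country": "US", "classification": "personal"},
    "orders": {"country": "DE", "classification": "internal"},
}

for partition, reason in find_violations(model, deployment):
    print(f"SLA mismatch in '{partition}': {reason}")
```

In practice the `deployment` dictionary would be populated from the provider's inventory or audit API, but the principle stands: the data model is the single source of truth against which the running service is compared.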
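The deletion audit of clause 7 can be sketched the same way: every location the data model maps for a dataset, including copies on other media, must report the data as destroyed before the audit passes. The location names and status values below are invented for illustration:

```python
# Hypothetical sketch of a deletion audit (clause 7): every location the
# data model maps for a dataset must report the data as destroyed,
# including copies on backup media. Names and statuses are illustrative.

def audit_deletion(model_locations, provider_report):
    """Return locations that still hold data after a deletion request."""
    not_deleted = []
    for location in model_locations:
        status = provider_report.get(location, "unknown")
        if status != "destroyed":
            not_deleted.append((location, status))
    return not_deleted

# Locations the data model maps for the dataset (illustrative).
model_locations = ["eu-primary", "eu-replica", "backup-tape-07"]
# Deletion status as reported by the Provider (illustrative).
provider_report = {"eu-primary": "destroyed", "eu-replica": "destroyed",
                   "backup-tape-07": "retained"}

for location, status in audit_deletion(model_locations, provider_report):
    print(f"Deletion clause breached at {location}: status is '{status}'")
```

An empty result is what the customer's audit provision should require before the Provider's deletion obligation is considered discharged.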

Although the best practices introduced above are helpful guidelines in defining DaaS contracts, negotiating the contractual clauses of your Cloud agreement is the first hurdle. Ensure that all standard functionality is guaranteed, and that special measures are taken into consideration to secure data and services both in transit from/to the Provider and in storage:

1)    Enforce and ensure security compliance following the ISO 27001/27002 guidelines. Schedule vulnerability assessments and maintain regular, real-time visibility into data applications. MaaS can define "on-premise" how multi-tenancy works in the provider's infrastructure and applications. Models map the service requirements onto a given infrastructure; compliance officers then have to periodically verify the requirements assessment and its outcomes across the infrastructure.

2)    Apply SSL/TLS and IPSec constraints to secure data movement within the data center. Perimeter protection is essential to prevent denial-of-service threats;

3)    Consider and include VLAN and VPN rules to secure data movement from/to the data center;

4)    Include full disclosure. The Provider's employees and data administrators have to be certified against the relevant regulatory and compliance obligations. ISO 27001/27002 have to be the Provider's standards (extended to its employees) with regard to privacy and data residency. Always specify in the contract who is responsible for establishing the compliance policy.
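On the customer's side, the transport-security requirement in point 2 amounts to refusing any connection to the Provider that is not certificate-verified and strongly encrypted. A minimal sketch with Python's standard `ssl` module; the TLS 1.2 floor is only an example, as your contract may mandate a newer version:

```python
import ssl

# Sketch: a client-side TLS policy matching contractual transport
# requirements: certificate verification on, hostname checking on,
# and no protocol older than TLS 1.2 (an example floor).
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2
context.check_hostname = True            # default, made explicit
context.verify_mode = ssl.CERT_REQUIRED  # default, made explicit

# Any socket wrapped with this context will refuse plaintext or
# legacy-protocol connections to the Provider's endpoint.
```

Encoding the contracted policy in one shared context object, rather than per-connection flags, makes it auditable in the same spirit as the data model: one declared source of truth that every connection is checked against.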

Conclusion

MaaS is the "compass" for defining on-premise the DaaS (Database as a Service) properties, such as security range, DB partitioning and scaling, multi-tenancy and geo-location, so that all required assets can be defined "early". Moreover, models increase the efficiency of defining, updating and sharing data models and database designs. In other words, models provide continuity with the databases' structure, extending to the Cloud the preconfigured levels of security, compliance and everything else registered inside the data models.

References
[1] N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo – CA ERwin® Data Modeler's Role in the Relational Cloud
[3] D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
[4] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[5] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[6] R. Livingstone – Four Barriers to Cloud Due Diligence
[7] N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data http://cloudbestpractices.net/2012/10/21/maas/
[8] N. Piscopo – MaaS Workshop, Awareness, Courses Syllabus
[9] N. Piscopo – DaaS Workshop, Awareness, Courses Syllabus
[10] N. Piscopo – DaaS Contract templates: main constraints and examples, in press


More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
