Applying MaaS to the DaaS (Database as a Service) Contract: An Introduction to the Practice

The Cloud offers a great opportunity to manage highly available and scalable databases while decreasing cost, time and risk. We have already introduced [4] how the DaaS life cycle helps in applying best practices when migrating to the Cloud or administering day-by-day Cloud activities. Taking into consideration the risks associated with Cloud contracts, we now introduce a set of best practices that assist organizations in defining the best possible DaaS agreement. These best practices help define regulation controls that determine when and how applications can be deployed in the Cloud. This matters because Cloud computing platforms are made up not only of components from a variety of vendors but also of a variety of legal jurisdictions (countries, politics, risk management and compliance).

MaaS applied to drawing up the DaaS contract (and to control the Services)

Applying MaaS can help manage data storage by using location constraints to check where your data is deployed and how it is implemented. Such constraints need to be clearly defined in the contract; persistence and dependencies have to match those classified (and regularly updated) in the data model in order to standardize the platform technologies that underpin the service provided. The main obligations that must be stipulated in the DaaS contract are the following:
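Such model-level location constraints can be made checkable. The sketch below (Python; the entity names, regions and data structures are hypothetical assumptions, not a real MaaS API) validates a deployment against the constraints classified in the data model:

```python
# Hypothetical sketch: contract constraints classified in the data model,
# checked against what is actually deployed. Entity names, regions and the
# structure itself are illustrative, not a real MaaS API.

CONTRACT_CONSTRAINTS = {
    "customer_db": {
        "allowed_regions": {"eu-west-1", "eu-central-1"},  # country/location clause
        "encryption_required": True,                       # storage clause
    },
}

def check_deployment(entity: str, region: str, encrypted: bool) -> list[str]:
    """Return the list of contract violations for one deployed entity."""
    rules = CONTRACT_CONSTRAINTS.get(entity)
    if rules is None:
        return [f"{entity}: not classified in the data model"]
    violations = []
    if region not in rules["allowed_regions"]:
        violations.append(f"{entity}: deployed in {region}, outside the allowed regions")
    if rules["encryption_required"] and not encrypted:
        violations.append(f"{entity}: stored unencrypted")
    return violations

# A mismatch reported here would be reconciled under the terms of the SLA:
print(check_deployment("customer_db", "us-east-1", True))
```

Any non-empty result is a candidate infringement to reconcile against the agreement.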

1. Integrity defined at the model level has to be maintained throughout the service. Monitoring executed against the data model, for example, has to match what is defined in the initial data structure and classified in the same way;

2. Country location has to be defined in the model partition and regularly monitored and compared. Any mismatch is an infringement of the agreement and must be reconciled with the terms outlined in the SLA;

3. Include and specify the international regulations that both the Provider and the Vendor are responsible for during the service life cycle. In particular, highlight directives containing data breach rules. Provider and Vendor are protected, but any violation carries a service penalty, and the data owner must notify both Provider and Vendor in case of a breach;

4. Specify location properties, and not only in terms of country. The site housing machines, racks and so on has to be appropriate (load per square meter, fire safety, flood protection, employee privileges and security service personnel);

5. Identify trust boundaries throughout the IT architecture. Data models and partitions are the right way to define trust boundaries and stewardship to prevent unauthorized access and sharing;

6. Include the method used to encrypt data transmitted across the network. If different encryption schemes are used by the Provider/Vendor, specify which is to be used and when. The contract has to include how encryption is applied on multi-tenant storage. List the rules concerning key adoption and management;

7. When data has to be deleted, specify that data retention and deletion are the responsibility of the Provider. Following the data model mapping, data has to be destroyed in all the locations defined and monitored. The Provider has to specify whether data has, for any reason, been copied to different media and then shredded. The contract must include a provision for the customer to request an audit in order to certify that data has been deleted. This is strategic because it satisfies two important clauses:

7.1) Service Closure: the Provider should not be able to terminate the service at its convenience. Mergers, acquisitions and other unpredictable events cannot stop the service (clause of irrevocable guarantee of continuous service). Should the service have to be shut down, the Provider has the obligation to retain the data (and services) for an acceptable period of time and to migrate them to the new provider at no cost. Of course, data retention and unrecoverable deletion after the migration remain the responsibility of the Provider;

7.2) Right to Closure: if the contract’s clauses are not respected (value proposition violated, upgrades charged as extras, infrastructure maintenance without appropriate assistance, services not rendered adequately, location security failures …) you should be able to terminate the contract without penalties. Again, the Provider has the obligation to retain the data (and services) for an acceptable period of time and then to migrate them to the new provider.

8. Models are key to ensuring that logical data segregation and control remain effective after backup, recovery, test and compare operations are completed. Include in the contract that a data model should be used to define the data architecture throughout the data life cycle. MaaS preserves the right to audit and to test that all agreed clauses are honored: the data models keep them in scope.
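Obligation 7 above can be sketched as a deletion audit: certification is withheld until every location recorded in the data model for an entity, including secondary media, confirms destruction. The entity and location names below are hypothetical:

```python
# Hypothetical deletion-audit sketch for obligation 7: deletion is certified
# only when every location recorded in the data model for an entity --
# including copies on secondary media -- confirms destruction.

MODEL_LOCATIONS = {
    "customer_db": ["eu-west-1/primary", "eu-central-1/replica", "tape-archive-03"],
}

def certify_deletion(entity: str, destroyed_at: set[str]) -> tuple[bool, list[str]]:
    """Return (certified, still-pending locations) for a deletion request."""
    pending = [loc for loc in MODEL_LOCATIONS[entity] if loc not in destroyed_at]
    return (not pending, pending)

ok, pending = certify_deletion(
    "customer_db", {"eu-west-1/primary", "eu-central-1/replica"}
)
print(ok, pending)  # → False ['tape-archive-03'] (the tape copy is still outstanding)
```

The customer-requested audit in the contract would run exactly this comparison against the monitored locations.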

Although the best practices introduced above are helpful guidelines in defining DaaS contracts, negotiating the contractual clauses of your Cloud agreement is the first hurdle. Ensure that all standard functionality is guaranteed, and that special measures are taken to secure data and services both in transit from/to the Provider and in storage:

1)    Enforce and ensure security compliance through the ISO 27001/27002 guidelines. Schedule vulnerability assessments and maintain regular, real-time visibility into data applications. MaaS can define “on-premise” the multi-tenancy in the Provider’s infrastructure and applications. Models map the service requirements onto a given infrastructure; compliance officers then have to periodically verify the requirements assessment and its outcomes across the infrastructure.

2)    Apply SSL, IPSec constraints to secure data movement into the data center. Perimeter protection is essential to prevent denial-of-service threats;

3)    Consider and include VLAN, VPN rules to secure data movement from/to the data center;

4)    Include full disclosure. The Provider’s employees and data administrators have to be certified against regulatory and compliance obligations. ISO 27001/27002 have to be the Provider’s standards (extended to its employees) in regard to privacy and data residency. Always include in the contract who is responsible for establishing the compliance policy.
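The transport controls in points 2 and 3 can be enforced on the client side with a strict TLS configuration. The sketch below uses Python’s standard ssl module and is illustrative, not a complete perimeter design; host names would come from the contract:

```python
# Sketch of client-side transport controls for points 2-3: refuse any
# connection to the data center that is not certificate-verified TLS 1.2+.
import socket
import ssl

def make_strict_context() -> ssl.SSLContext:
    """TLS context that verifies the provider's certificate and host name."""
    ctx = ssl.create_default_context()            # CERT_REQUIRED, hostname checks on
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
    return ctx

def open_secure_channel(host: str, port: int = 443) -> str:
    """Connect to the provider and return the negotiated TLS version."""
    ctx = make_strict_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            return tls.version()  # e.g. "TLSv1.3"
```

IPSec, VLAN and VPN rules sit below this layer and belong in the Provider’s network design, but the contract should name them explicitly.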

Conclusion

MaaS is the “compass” for defining on-premise the DaaS (Database as a Service) properties such as security range, DB partitioning and scaling, multi-tenancy and geo-location, so that all requested assets can be defined “early”. Moreover, models increase the efficiency of defining, updating and sharing data models and database designs. In other words, models provide continuity with the databases’ structure, extending to the Cloud the preconfigured levels of security, compliance and everything registered inside the data models.

References
[1] N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo – CA ERwin® Data Modeler’s Role in the Relational Cloud
[3] D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
[4] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[5] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[6] R. Livingstone – Four Barriers to Cloud Due Diligence
[7] N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data http://cloudbestpractices.net/2012/10/21/maas/
[8] N. Piscopo – MaaS Workshop, Awareness, Courses Syllabus
[9] N. Piscopo – DaaS Workshop, Awareness, Courses Syllabus
[10] N. Piscopo – DaaS Contract templates: main constraints and examples, in press



More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
