By Cloud Best Practices Network
November 4, 2012 12:18 PM EST
The Cloud offers a great opportunity to manage highly available and scalable databases while decreasing cost, time and risk. We have introduced how the DaaS life cycle helps in applying best practices when migrating to the Cloud or administering day-by-day Cloud activities. Taking into consideration the risks associated with Cloud contracts, we now introduce a set of best practices that assist organizations in defining the best possible DaaS agreement. Best practices help define the regulatory controls that determine when and how applications can be deployed in the Cloud. This matters because Cloud computing platforms are made up not only of components from a variety of vendors but also of a variety of legal jurisdictions (countries, politics, risk management and compliance).
MaaS applied to drawing up the DaaS contract (and to controlling the Services)
Applying MaaS can help manage data storage by using location constraints to check where your data is deployed and how it is implemented. Such constraints need to be clearly defined in the contract; persistence and dependencies have to be those classified (and regularly updated) in the data model, in order to standardize the platform technologies that underpin the service provided. The main obligations that must be stipulated in the DaaS contract are the following:
1. Integrity defined at the model level has to be maintained throughout the service. Monitoring executed through the data model, for example, has to match what is defined in the initial data structure and classified in the same way;
2. Country location has to be defined in the model partition and regularly monitored and compared (a monitoring sketch follows this list). Any mismatch is an infringement of the agreement and must be reconciled with the terms outlined in the SLA;
3. Include and specify the international regulations that both the Provider and the Vendor are responsible for during the service life cycle. In detail, highlight directives containing data-breach rules. Provider and Vendor remain protected, although any violation carries a service penalty, and the data owner must notify both Provider and Vendor in case of a breach;
4. Specify location properties, and not only in terms of country. The site housing machines, racks and so on has to be appropriate (weight per square meter, fire safety, anti-flood measures, employee privileges and security service personnel);
5. Identify trust boundaries throughout the IT architecture. Data models and partitions are the right way to define trust boundaries and stewardship to prevent unauthorized access and sharing;
6. Include the method used to encrypt data transmitted across the network. If different encryption schemes are used by the Provider/Vendor, specify which is to be used and when. The contract has to include how encryption is run on multi-tenant storage. List the rules concerning key adoption;
7. When data has to be deleted, specify that data retention and deletion are the responsibility of the Provider. Following the data model mapping, data has to be destroyed in all defined and monitored locations. The Provider has to disclose whether data has, for any reason, been copied to other media and subsequently shredded. The contract must include a provision for the customer to request an audit in order to certify that data has been deleted (a deletion-audit sketch follows this list). This is strategic because it satisfies two important clauses:
7.1) Service Closure: the Provider should not be able to terminate the service at its convenience. Mergers, acquisitions and other unpredictable events cannot stop the service (a clause of irrevocable guarantee of continuous service). In case the service has to be shut down, the Provider has the obligation to retain the data (and services) for an acceptable period of time and to migrate them to the new provider without cost. Of course, data retention and unrecoverable deletion after the migration are the responsibility of the Provider;
7.2) Right to Closure: in case the contract’s clauses are not respected (value proposition violated, upgrades charged as extras, infrastructure maintenance without appropriate assistance, services not rendered adequately, location security out of order …), the customer should be able to close the contract without penalties. Again, the Provider has the obligation to retain the data (and services) for an acceptable period of time and then to migrate them to the new provider.
8. Models are key to ensuring that logical data segregation and control remain effective after backup and recovery, test and compare are completed. Include in the contract that a data model should be used to define the data architecture throughout the data life cycle. MaaS preserves the right to audit and to test that all the agreed clauses are honored: the data models keep the evidence.
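To make the country-location obligation (point 2 above) operational, the constraints recorded in the data model can be compared programmatically against what the Provider reports. The following Python sketch is a minimal illustration under assumed structures: the `ALLOWED_LOCATIONS` mapping, the `provider_report` input and the `verify_locations` helper are hypothetical, not part of any MaaS or ERwin® tooling.

```python
# Minimal sketch: verify deployed data locations against the constraints
# recorded in the data model (obligation 2). All structures here are
# hypothetical illustrations, not an actual MaaS API.

ALLOWED_LOCATIONS = {
    "customer": {"DE", "FR"},        # model partition: EU-only personal data
    "catalog":  {"DE", "FR", "US"},
}

def verify_locations(provider_report):
    """Compare the Provider's reported placements with the data model.

    provider_report: mapping of partition name -> set of country codes
    where the Provider says the data (including copies) currently lives.
    Returns a list of SLA infringements to reconcile per the contract.
    """
    infringements = []
    for partition, countries in provider_report.items():
        allowed = ALLOWED_LOCATIONS.get(partition)
        if allowed is None:
            infringements.append(f"{partition}: not classified in the data model")
            continue
        outside = countries - allowed
        if outside:
            infringements.append(
                f"{partition}: data found in {sorted(outside)}, "
                f"allowed only in {sorted(allowed)}"
            )
    return infringements

# Example: a copy of the 'customer' partition surfaced in a US region.
report = {"customer": {"DE", "US"}, "catalog": {"DE"}}
for issue in verify_locations(report):
    print("SLA infringement:", issue)
```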
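Similarly, the deletion obligation (point 7) becomes auditable when every location mapped in the data model has to produce a destruction receipt. The sketch below is again a hypothetical illustration: `MODEL_LOCATIONS` and the `deletion_receipts` input stand in for whatever location map and attestations the model and the Provider actually supply.

```python
# Sketch of a deletion audit (obligation 7): every location recorded in
# the data model, including copies on other media, must report a
# destruction receipt. The data structures are hypothetical illustrations.

MODEL_LOCATIONS = {
    "customer": ["eu-west-1/disk", "eu-west-2/disk", "eu-west-1/backup-tape"],
}

def audit_deletion(partition, deletion_receipts):
    """Return the modeled locations still lacking a deletion receipt.

    deletion_receipts: set of location identifiers for which the Provider
    has certified unrecoverable destruction (e.g., shredding of copied media).
    """
    expected = set(MODEL_LOCATIONS[partition])
    return sorted(expected - deletion_receipts)

# Example: the backup-tape copy was never certified as shredded.
receipts = {"eu-west-1/disk", "eu-west-2/disk"}
outstanding = audit_deletion("customer", receipts)
if outstanding:
    print("Audit failed; request certification for:", outstanding)
else:
    print("Deletion certified in all modeled locations.")
```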
Although the best practices introduced above are helpful guidelines in defining DaaS contracts, negotiating the contractual clauses of your Cloud agreement is the first constraint. Ensure that all standard functionality is guaranteed, and enforce the special measures that should be taken into consideration to secure data and service, both in transit from/to the Provider and during storage:
1) Enforce and ensure security compliance through the ISO 27001/27002 directions. Schedule vulnerability assessments and maintain regular, real-time visibility into data applications. MaaS can define “on-premise” the multi-tenancy in the Provider’s infrastructure and applications. Models map the service requirements to a given infrastructure; compliance officers then have to periodically verify the requirements assessment and its outcomes across the infrastructure;
2) Apply SSL and IPSec constraints to secure data movement within the data center (see the sketch after this list). Perimeter protection is essential to prevent denial-of-service threats;
3) Consider and include VLAN and VPN rules to secure data movement from/to the data center;
4) Include full disclosure. The Provider’s employees and data administrators have to be certified against regulatory and compliance obligations. ISO 27001/27002 have to be the Provider’s standards (extended to its employees) in regard to privacy and data residency. Always include in the contract who is responsible for establishing the compliance policy.
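As a concrete illustration of point 2 above, the snippet below opens a TLS-protected channel with certificate verification enforced before any data moves. It uses only the Python standard library; the endpoint `daas.example.com` is a placeholder, not a real Provider address.

```python
# Sketch: enforce TLS with certificate verification for data in transit.
# The host and port are placeholders for a Provider endpoint.
import socket
import ssl

context = ssl.create_default_context()            # verifies certificates by default
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse weak protocol versions

with socket.create_connection(("daas.example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="daas.example.com") as tls:
        print("Negotiated:", tls.version(), tls.cipher()[0])
        # ... transfer data only over this verified channel ...
```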
MaaS is the “compass” for defining on-premise the DaaS (Database as a Service) properties, such as security range, DB partitioning and scaling, multi-tenancy and geo-location, so that all requested assets can be defined “early”. Further, models increase the efficiency of defining, updating and sharing data models and database designs. In other words, models provide continuity with the databases’ structure, extending to the Cloud the preconfigured levels of security, compliance and whatever else has been registered inside the data models.
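To give a rough sense of what defining these DaaS properties “early” might look like, the fragment below sketches a service descriptor that travels with the data model. The field names and values are illustrative assumptions, not a standard MaaS or DaaS schema.

```python
# Illustrative DaaS service descriptor, defined "on-premise" before migration.
# Field names and values are assumptions for the sketch, not a standard schema.
daas_service = {
    "security_range": {"encryption_at_rest": "AES-256",
                       "encryption_in_transit": "TLS 1.2+"},
    "partitioning":   {"scheme": "by_country", "partitions": ["EU", "US"]},
    "scaling":        {"min_nodes": 2, "max_nodes": 8},
    "multi_tenancy":  {"isolation": "schema-per-tenant"},
    "geo_location":   {"allowed_countries": ["DE", "FR"]},
    "audit":          {"right_to_audit": True, "frequency_days": 90},
}

# Such a descriptor travels with the data model, so the same constraints
# that governed the on-premise database are carried into the Cloud contract.
for prop, value in daas_service.items():
    print(f"{prop}: {value}")
```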
References
N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
N. Piscopo – CA ERwin® Data Modeler’s Role in the Relational Cloud
D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
R. Livingstone – Four Barriers to Cloud Due Diligence
N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data, http://cloudbestpractices.net/2012/10/21/maas/
N. Piscopo – MaaS Workshop, Awareness, Courses Syllabus
N. Piscopo – DaaS Workshop, Awareness, Courses Syllabus
N. Piscopo – DaaS Contract templates: main constraints and examples, in press