MaaS – The Solution to Design, Map, Integrate and Publish Open Data

Data models can be shared, off-line tested and verified to define data design requirements, data topology, performance, placement and deployment

Open Data is data that can be freely used, reused and redistributed by anyone – subject, at most, to the requirement to attribute and share alike (Open Software Service Definition – OSSD). As a consequence, Open Data should create value and can have a positive impact in many different areas, such as government (how tax money is spent), health (medical research, hospital admissions by pathology) and quality of life (the air we breathe in our cities, pollution), and it can influence public decisions such as investments, public economy and expenditure. We are talking about services: Open Data is the service that connects the community with public bodies. The required open data should therefore be designed first, and then integrated, mapped, updated and published in a form that is easy to use. MaaS is the Open Data driver and enables Open Data portability into the Cloud.

Introduction
Data models used as a service mainly address the following areas:

  • Implementing and sharing data structure models (a minimal sketch follows this list);
  • Verifying data model properties against private and public cloud requirements;
  • Designing and testing new query types, including the specific query classes needed to support heterogeneous data;
  • Designing the data storage model, which should enable query processing directly against the database while ensuring privacy and securing the changes arising from data updates and reviews;
  • Modeling data to predict usage “early”;
  • Portability, a central property when data is shared among fields of application;
  • Sharing, redistribution and participation of data among datasets and applications.
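
To make the first point concrete, here is a minimal sketch in Python; all names are hypothetical and not tied to any specific modeling tool. The idea is simply that a data structure model is itself ordinary data, so it can be serialized, redistributed among applications and inspected off-line:

    import json
    from dataclasses import dataclass, field, asdict

    @dataclass
    class Column:
        name: str
        type: str                 # logical type, e.g. "string", "integer", "float", "date"
        nullable: bool = True
        description: str = ""

    @dataclass
    class DataModel:
        dataset: str
        columns: list = field(default_factory=list)
        license: str = "open"     # the open licence is recorded in the model itself

        def to_json(self) -> str:
            """Serialize the model so it can be shared and redistributed."""
            return json.dumps(asdict(self), indent=2)

    # Example: a public air-quality dataset described as a model
    air_quality = DataModel(
        dataset="city_air_quality",
        columns=[
            Column("station_id", "string", nullable=False),
            Column("measured_at", "date", nullable=False),
            Column("pm10_ug_m3", "float", description="PM10 concentration"),
        ],
    )
    print(air_quality.to_json())

Because the model is plain data, two applications can exchange it (as JSON here) and agree on the dataset structure without exchanging the dataset itself.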

As a consequence, the data should be available as a whole and at no more than a reasonable cost, preferably so that it can be found, navigated and downloaded over the Cloud. It should also be available in a usable and modifiable form. This means modeling Open Data and then using the models to map location and usage, configuration, integration and changes along the Open Data lifecycle.

What is MaaS
Data models can be shared, off-line tested and verified to define data design requirements, data topology, performance, placement and deployment. This means the models themselves can be supplied as a service to allow providers to verify how and where data has to be designed to meet the Cloud service’s requirements: this is MaaS. As a consequence, by using MaaS, Open Data designers can verify “on-premise” how and why datasets meet Open Data requirements. With this approach, Open Data models can be tuned to real usage and then mapped “on-premise” to the public body’s service. Further, MaaS inherits all the defined service’s properties, so the data model can be reused, shared and classified for new Open Data design and publication.
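
Continuing the hypothetical DataModel sketch above, the fragment below suggests what such an “on-premise” verification could look like: the model, not the data, is checked against a few assumed publication requirements (required columns, documentation, column names that hint at personal data) before anything is moved to the Cloud. The rules and their names are illustrative only, not part of any standard:

    # Assumed publication requirements; in practice they come from the target Cloud service.
    REQUIRED_COLUMNS = {"station_id", "measured_at"}
    SENSITIVE_PREFIXES = ("name", "ssn", "address")

    def verify_model(model: DataModel) -> list:
        """Run off-line checks on the model before the dataset is opened."""
        issues = []
        names = {c.name for c in model.columns}
        for missing in REQUIRED_COLUMNS - names:
            issues.append(f"missing required column: {missing}")
        for c in model.columns:
            if c.name.startswith(SENSITIVE_PREFIXES):
                issues.append(f"column may contain personal data: {c.name}")
            if not c.description and not c.nullable:
                issues.append(f"mandatory column is undocumented: {c.name}")
        return issues

    # An empty list means the model meets the assumed requirements.
    print(verify_model(air_quality))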

Open Data implementation is MaaS (Model as a Service) driven
Open Data is fully supported by data modeling, and MaaS in turn fully supports Open Data. MaaS should be the first practice applied, helping to tune the analysis and the Open Data design. Furthermore, data models govern design, deployment, storage, changes and resource allocation, hence MaaS supports:

  • Applying Best Practice for Open Data design;
  • Classifying the Open Data field of application (a small sketch follows this list);
  • Designing Open Data taxonomy and integration;
  • Guiding Open Data implementation;
  • Documenting data maturity and evolution by applying the DaaS lifecycle.
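
As a small illustration of the classification and lifecycle points above, a model can carry a field-of-application tag and a DaaS lifecycle stage, which makes grouping and maturity reporting straightforward. The domain names and stage names below are assumptions for the example, not a prescribed taxonomy:

    # Hypothetical catalog: each model is tagged with a field of application and a DaaS stage.
    LIFECYCLE_STAGES = ["analysis", "design", "test", "published", "retired"]

    catalog = [
        {"model": "city_air_quality",    "domain": "environment", "stage": "published"},
        {"model": "hospital_admissions", "domain": "health",      "stage": "design"},
        {"model": "public_expenditure",  "domain": "government",  "stage": "analysis"},
    ]

    # Group models by field of application for navigation and reuse.
    by_domain = {}
    for entry in catalog:
        by_domain.setdefault(entry["domain"], []).append(entry["model"])

    # Document maturity: how many models have reached each lifecycle stage.
    by_stage = {stage: sum(1 for e in catalog if e["stage"] == stage) for stage in LIFECYCLE_STAGES}
    print(by_domain)
    print(by_stage)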

Accordingly, MaaS provides the following “on-premise” properties supporting Open Data design and publication:

  1. Analysis – What data are you planning to make open? When working with MaaS, a data model is used to perform the data analysis. This means the Open Data designer can return to this step to correct, update and improve the initial analysis: he always works on an “on-premise” data model. Analysis performed through the model helps identify data integration and interoperability, which in turn assists in choosing what data has to be published and in defining the open datasets;
  2. Design – The design is carried out during the analysis step and can be changed and traced along the Open Data lifecycle. Remember that with MaaS the model is a service, and the opened data delivers the designed service;
  3. Data security – Data security becomes the key property governing data access and navigation. MaaS plays a crucial role here: the models contain all the infrastructure properties and include the information needed to classify accesses, classes of users, perimeters and risk-mitigation assets. Models are the central means of enabling data protection within the Open Data service;
  4. Participation – Because the goal is that “everyone must be able to use Open Data”, participation includes all people and groups without any discrimination or restriction. Models contain the data access rules and accreditations (open licensing);
  5. Mapping – The MaaS mapping property matters because many people reach the data only after long navigation and across several “bridges” connecting different fields of application. Here MaaS helps the Open Data designer define the best initial “route” of transformations and aggregations linking the different areas. Continually engaging citizens, developers, sector experts, managers and others then helps refine the model to better update and scale the Open Data contents: the easier it is for outsiders to discover data, the faster new and useful Open Data services will be built;
  6. Ontology – Defining the metadata vocabulary used to describe ontologies. Starting from standard naming definitions, data models group and reorganize the vocabulary for further metadata reuse, integration, maintenance, mapping and versioning;
  7. Portability – Models contain all the properties belonging to the data, so MaaS can enable the Open Data service’s portability to the Cloud. The model is portable by definition and can be generated for different databases and infrastructures (see the sketch after this list);
  8. Availability – The DaaS lifecycle ensures structure validation in terms of MaaS accessibility;
  9. Reuse and distribution – Open Data can be merged with additional datasets belonging to other fields of application (for example, medical research and air pollution). Open Data built with MaaS has this advantage: merging open datasets means merging models by comparing and synchronizing old and new versions, if needed;
  10. Change Management and History – Data models are organized in libraries to preserve Open Data changes and history. Changes are traced and maintained so that the model and/or datasets can be restored if necessary;
  11. Redesign – Redesigning Open Data means redesigning the model it belongs to: the model drives the history of the changes;
  12. Fast BI – Publishing Open Data is strictly related to the BI process. Redesigning and publishing Open Data are two automated steps that start from the design of the data model and from its successive updates.
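
To illustrate the portability point (item 7) with the DataModel sketch introduced earlier: the same model can be generated for different target databases. The logical-to-physical type mappings below are deliberately simplified and not vendor-complete:

    # Illustrative type mappings for two hypothetical targets.
    TYPE_MAP = {
        "postgresql": {"string": "TEXT", "integer": "INTEGER", "float": "DOUBLE PRECISION", "date": "DATE"},
        "sqlserver":  {"string": "NVARCHAR(255)", "integer": "INT", "float": "FLOAT", "date": "DATE"},
    }

    def generate_ddl(model: DataModel, target: str) -> str:
        """Generate CREATE TABLE DDL for the chosen target from the same model."""
        cols = ",\n  ".join(
            f"{c.name} {TYPE_MAP[target][c.type]}{'' if c.nullable else ' NOT NULL'}"
            for c in model.columns
        )
        return f"CREATE TABLE {model.dataset} (\n  {cols}\n);"

    # One model, two physical schemas: portability handled at the model level.
    print(generate_ddl(air_quality, "postgresql"))
    print(generate_ddl(air_quality, "sqlserver"))

Because the DDL is derived from the model, a change to the model (items 10 and 11) propagates to every target the next time generation runs.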

Conclusion
MaaS is the emerging solution for Open Data implementation. Open Data is publicly and privately accessible data designed to connect the social community with public bodies. This data should be made available without restriction, although it remains subject to security and open licensing. In addition, Open Data must always be up to date, and transformation and aggregation have to be simple and time-saving for inexperienced users. To achieve these goals, the Open Data service has to be designed in a model-driven way, providing data integration, interoperability, mapping, portability, availability, security and distribution – all properties assured by applying MaaS.

References
[1] N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo – CA ERwin® Data Modeler’s Role in the Relational Cloud
[3] N. Piscopo – DaaS Contract templates: main constraints and examples, in press
[4] D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
[7] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[8] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[9] The Open Software Service Definition (OSSD) at opendefinition.org

More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
