MaaS – The Solution to Design, Map, Integrate and Publish Open Data

Data models can be shared, off-line tested and verified to define data design requirements, data topology, performance, placement and deployment

Open Data is data that can be freely used, reused and redistributed by anyone, subject at most to the requirement to attribute and share alike (Open Software Service Definition, OSSD). As a consequence, Open Data should create value and can have a positive impact in many different areas, such as government (how tax money is spent), health (medical research, hospital admissions by pathology) and quality of life (the air breathed in our cities, pollution), and it can inform public decisions on investment, the public economy and expenditure. We are talking about services: open data are the services that connect the community with public bodies. The required open data should therefore be designed first and then integrated, mapped, updated and published in a form that is easy to use. MaaS is the Open Data driver and enables Open Data portability into the Cloud.

Introduction
Data models used as a service mainly address the following:

  • Implementing and sharing data structure models;
  • Verifying data model properties against private and public cloud requirements;
  • Designing and testing new query types, since specific query classes are needed to support heterogeneous data;
  • Designing the data storage model, which should enable query processing directly against the databases to preserve privacy and to keep changes from data updates and reviews secure;
  • Modeling data to predict usage early;
  • Portability, a central property when data is shared among fields of application;
  • Sharing, redistribution and participation of data among datasets and applications.

As a consequence, the data should be available as a whole and at no more than a reasonable fee, preferably by finding, navigating and downloading it over the Cloud. It should also be available in a usable and modifiable form. This means modeling Open Data and then using the models to map location, usage, configuration, integration and changes along the Open Data lifecycle.
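To make the first two topics above concrete, here is a minimal sketch of a shareable data structure model together with an off-line verification step. The dict-based model format and all names (`AIR_QUALITY_MODEL`, `verify_model`) are illustrative assumptions, not an API from the article:

```python
# A hypothetical shareable data model for an open "air quality" dataset.
AIR_QUALITY_MODEL = {
    "name": "air_quality",
    "version": "1.0",
    "entities": {
        "station": {"key": "station_id",
                    "fields": {"station_id": "string", "city": "string"}},
        "reading": {"key": "reading_id",
                    "fields": {"reading_id": "string", "station_id": "string",
                               "pm10": "float", "taken_at": "datetime"}},
    },
}

def verify_model(model):
    """Off-line verification: every entity must declare a key
    that is one of its typed fields. Returns a list of errors."""
    errors = []
    for name, entity in model["entities"].items():
        key = entity.get("key")
        if key is None:
            errors.append(f"{name}: missing key")
        elif key not in entity["fields"]:
            errors.append(f"{name}: key '{key}' is not a declared field")
    return errors

print(verify_model(AIR_QUALITY_MODEL))  # prints [] - the model passes the check
```

Because the model is plain data, it can be shared, versioned and re-verified before any dataset is published against it.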

What is MaaS
Data models can be shared, tested off-line and verified to define data design requirements, data topology, performance, placement and deployment. This means the models themselves can be supplied as a service, allowing providers to verify how and where data has to be designed to meet the Cloud service's requisites: this is MaaS. By using MaaS, Open Data designers can verify "on-premise" how and why datasets meet Open Data requirements. With this approach, Open Data models can be tuned on real usage and then mapped "on-premise" to the public body's service. Further, MaaS inherits all the defined service's properties, so the data model can be reused, shared and classified for new Open Data design and publication.
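One way to picture "the model as a service" that can be reused, shared and classified is a versioned model registry. The sketch below is a minimal in-memory stand-in under assumed names (`ModelRegistry`, `publish`, `fetch`); a real MaaS offering would expose this over the Cloud:

```python
import copy

class ModelRegistry:
    """Stores published model versions so they can be
    reused, shared and restored (change management)."""
    def __init__(self):
        self._store = {}  # (name, version) -> model dict

    def publish(self, name, version, model):
        # Deep-copy so later edits to the caller's dict
        # cannot mutate the published version.
        self._store[(name, version)] = copy.deepcopy(model)

    def fetch(self, name, version):
        return copy.deepcopy(self._store[(name, version)])

registry = ModelRegistry()
registry.publish("air_quality", "1.0",
                 {"entities": {"station": {}}})
registry.publish("air_quality", "1.1",
                 {"entities": {"station": {}, "reading": {}}})

# Restoring an earlier version supports tracing and rollback.
v1 = registry.fetch("air_quality", "1.0")
print(sorted(v1["entities"]))  # prints ['station']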

Open Data implementation is MaaS (Model as a Service) driven
Open Data is completely supported by data modeling, and thus MaaS completely supports Open Data. MaaS should be the first practice, helping to tune the analysis and the Open Data design. Furthermore, data models govern design, deployment, storage, changes and resource allocation; hence MaaS supports:

  • Applying best practice to Open Data design;
  • Classifying the Open Data field of application;
  • Designing Open Data taxonomy and integration;
  • Guiding Open Data implementation;
  • Documenting data maturity and evolution by applying the DaaS lifecycle.

Accordingly, MaaS provides "on-premise" properties supporting Open Data design and publication:

  1. Analysis – What data are you planning to make open? With MaaS, a data model is used to perform the data analysis. The Open Data designer can return to this step to correct, update and improve the incoming analysis, always working on an "on-premise" data model. Analysis performed through the model helps identify data integration and interoperability, which in turn assists in choosing what data to publish and in defining open datasets;
  2. Design – The design is carried out during the analysis step, and it can be changed and traced along the Open Data lifecycle. Remember that with MaaS the model is a service, and the opened data offers the designed service;
  3. Data security – Data security becomes the key property governing data access and navigation. MaaS plays a crucial role here: the models contain all the infrastructure properties and include the information needed to classify accesses, classes of users, perimeters and risk-mitigation assets. Models are the central means of enabling data protection within the Open Data device;
  4. Participation – Because the goal is "everyone must be able to use Open Data", participation encompasses people and groups without any discrimination or restriction. Models contain data access rules and accreditations (open licensing);
  5. Mapping – The MaaS mapping property matters because many people reach the data only after long navigation and several "bridges" connecting different fields of application. MaaS helps the Open Data designer define the best initial "route" between transformation and aggregation linking the different areas. Continually engaging citizens, developers, sector experts, managers and others then helps modify the model to better update and scale Open Data contents: the easier it is for outsiders to discover data, the faster new and useful Open Data services will be built;
  6. Ontology – Defining a metadata vocabulary for describing ontologies. Starting from standard naming definitions, data models provide a grouped and reorganized vocabulary for further metadata reuse, integration, maintenance, mapping and versioning;
  7. Portability – Models contain all the properties belonging to the data, so MaaS can enable the Open Data service's portability to the Cloud. The model is portable by definition and can be generated to different databases and infrastructures;
  8. Availability – The DaaS lifecycle assures structure validation in terms of MaaS accessibility;
  9. Reuse and distribution – Open Data can be merged with additional datasets belonging to other fields of application (for example, medical research vs. air pollution); Open Data built by MaaS has this advantage. Merging open datasets means merging models by comparing and synchronizing old and new versions, if needed;
  10. Change management and history – Data models are organized in libraries to preserve Open Data changes and history. Changes are traced and maintained so that models and/or datasets can be restored if necessary;
  11. Redesign – Redesigning Open Data means redesigning the model it belongs to: the model drives the history of the changes;
  12. Fast BI – Publishing Open Data is strictly related to the BI process. Redesigning and publishing Open Data are two automated steps starting from the design of the data model and from its successive updates.
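The portability point above (a model "generated to different databases and infrastructures") can be sketched as generating DDL for two different targets from one model. The dict-based model and the type mappings below are illustrative assumptions, not taken from any particular modeling tool:

```python
# A hypothetical model for one entity of an open dataset.
MODEL = {
    "entities": {
        "station": {"key": "station_id",
                    "fields": {"station_id": "string", "city": "string"}},
    },
}

# Illustrative logical-to-physical type mappings per target database.
TYPE_MAP = {
    "postgresql": {"string": "TEXT", "float": "DOUBLE PRECISION",
                   "datetime": "TIMESTAMP"},
    "sqlserver":  {"string": "NVARCHAR(255)", "float": "FLOAT",
                   "datetime": "DATETIME2"},
}

def ddl_for(model, target):
    """Emit CREATE TABLE statements for the given target database."""
    types = TYPE_MAP[target]
    statements = []
    for name, entity in model["entities"].items():
        cols = ", ".join(f"{f} {types[t]}"
                         for f, t in entity["fields"].items())
        statements.append(f"CREATE TABLE {name} ({cols}, "
                          f"PRIMARY KEY ({entity['key']}));")
    return statements

print(ddl_for(MODEL, "postgresql")[0])
# prints: CREATE TABLE station (station_id TEXT, city TEXT, PRIMARY KEY (station_id));
```

The same model feeds both generators, which is the sense in which the model, not any one physical schema, is the portable artifact.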

Conclusion
MaaS is the emerging solution for Open Data implementation. Open Data is publicly and privately accessible data designed to connect the social community with public bodies. This data should be made available without restriction, although it is placed under security and open licensing. In addition, Open Data is always up to date, and transformation and aggregation have to be simple and time-saving for inexperienced users. To achieve these goals, the Open Data service has to be designed model-driven, providing data integration, interoperability, mapping, portability, availability, security and distribution: all properties assured by applying MaaS.

References
[1] N. Piscopo - ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo - CA ERwin® Data Modeler’s Role in the Relational Cloud
[3] N. Piscopo - DaaS Contract templates: main constraints and examples, in press
[4] D. Burbank, S. Hoberman - Data Modeling Made Simple with CA ERwin® Data Modeler r8
[7] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[8] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[9] The Open Software Service Definition (OSSD) at opendefinition.org


More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
