A Tactical Cloud Computing Ontology

What architectural functions should a secure cloud computing ontology address?


In an effort to encourage standards and interoperability, the cloud computing community is currently discussing architectural ontologies. Core to most approaches is an assumption of open access, public Internet connectivity, and security provisioning by the cloud service provider. Solutions for the government marketplace, however, cannot make these assumptions. This article discusses an expansion of the cloud computing ontology put forth in a paper by the University of California at Santa Barbara and IBM. To address the needs of this marketplace, the Tactical Cloud Computing Ontology explicitly addresses critical functions such as access management, explicit network connectivity, and SOA-based workflow orchestration, application security, and service management.

The concept of expanding on the UCSB/IBM ontology was prompted by an invitation to speak at two rather specialized conferences. The first invitation was to the Simulation Interoperability Standards Organization (SISO) Workshop held in San Diego, CA. SISO is an international organization dedicated to the promotion of modeling and simulation (M&S) interoperability and reuse for the benefit of a broad range of M&S communities. SISO's Conference Committee organizes Simulation Interoperability Workshops (SIWs) in the US and Europe. SISO's Standards Activity Committee develops and supports simulation interoperability standards, both independently and in conjunction with other organizations. Los Angeles, CA was the site of the second conference, the Ground System Architecture Workshop (GSAW). Hosted by the Aerospace Corporation, GSAW provides a forum for the world's spacecraft ground system experts to collaborate with other ground system users, developers, and researchers through tutorials, presentations, working groups, and panel discussions on issues and solutions.

The common interest was, of course, cloud computing, and both presentations focused on how to establish a common framework for developing cloud computing solutions. To set a baseline for the discussion, I first introduced the UCSB/IBM framework. Even though this layered approach hides many integration details, it served very well as a common discussion platform.

To better adapt this excellent framework for my audience, I then presented my personal views on how it could be modified to address Federal government community requirements.

Key modifications include:

  • The addition of an access management layer;
  • Explicit SOA related layers to address workflow orchestration, application security and service management; and
  • An explicit connectivity layer, in order to avoid the common assumption that the public Internet is always used as the networking layer in cloud computing solutions (a sketch of the combined stack follows this list).
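
To make the layered idea concrete, here is a minimal sketch, in Python, of how the modified stack could be represented. The base layer names are paraphrased from the UCSB/IBM ontology as I recall it, the placement of the added layers is illustrative only, and none of the class or variable names below come from either framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class OntologyLayer:
    """One layer of the cloud computing ontology."""
    name: str
    concerns: List[str] = field(default_factory=list)

# Base layers, paraphrased from the UCSB/IBM layered ontology (illustrative, not authoritative).
BASE_LAYERS = [
    OntologyLayer("Firmware / Hardware", ["physical hosts"]),
    OntologyLayer("Software Kernel", ["OS, hypervisor, middleware"]),
    OntologyLayer("Cloud Software Infrastructure", ["compute", "storage", "communications"]),
    OntologyLayer("Cloud Software Environment", ["platform runtimes and APIs"]),
    OntologyLayer("Cloud Application", ["end-user services"]),
]

# Tactical additions described in this article (layer names taken from the list above).
TACTICAL_LAYERS = [
    OntologyLayer("Connectivity", ["explicit network choice; no assumption of the public Internet"]),
    OntologyLayer("Access Management", ["identity, authentication, authorization"]),
    OntologyLayer("SOA Services", ["workflow orchestration", "application security", "service management"]),
]

def tactical_stack() -> List[OntologyLayer]:
    """Combine base and tactical layers, lowest layer first (placement is a guess)."""
    return BASE_LAYERS[:3] + [TACTICAL_LAYERS[0]] + BASE_LAYERS[3:] + TACTICAL_LAYERS[1:]

if __name__ == "__main__":
    for layer in tactical_stack():
        print(f"{layer.name}: {', '.join(layer.concerns)}")
```

Running the sketch simply prints the layers in order; the point is that the tactical concerns become first-class elements of the model rather than implicit assumptions about the provider or the network.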

I also briefed some advantages of using an ontology for developing federated cloud computing solutions, which included:

  • Providing a framework for enterprise architecture development, maintenance, and use that aligns, locates, and links disparate architectures and architecture information via information exchange standards to deliver a seamless outward appearance to users (a small sketch of this alignment idea follows the list);
  • Recognizing the uniqueness and specific purpose of disparate architectures and allowing for their autonomy and local governance while enabling the enterprise to benefit from their content; and
  • Organizing an enterprise's body of knowledge (architecture) about its activities (processes), people, and things within a defined context and current/future environment.
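
As a rough illustration of the first advantage, the sketch below shows how a shared ontology concept can locate related elements across two independently governed architectures. The architecture names, local terms, and concept labels are all invented for this example.

```python
from collections import defaultdict
from typing import Dict, List

# Each federated architecture keeps its own vocabulary but registers a mapping
# from local terms to shared ontology concepts (all names here are hypothetical).
ONTOLOGY_MAPPINGS: Dict[str, Dict[str, str]] = {
    "ground-system-architecture": {
        "MissionDataStore": "cloud.storage",
        "OpsWorkflow": "soa.orchestration",
    },
    "simulation-architecture": {
        "ScenarioRepository": "cloud.storage",
        "FederateScheduler": "soa.orchestration",
    },
}

def locate(concept: str) -> Dict[str, List[str]]:
    """Return every local term, per architecture, that maps to a shared concept."""
    hits: Dict[str, List[str]] = defaultdict(list)
    for architecture, mapping in ONTOLOGY_MAPPINGS.items():
        for local_term, shared_concept in mapping.items():
            if shared_concept == concept:
                hits[architecture].append(local_term)
    return dict(hits)

# One query aligns and locates storage elements across both architectures
# without changing their local naming or governance.
print(locate("cloud.storage"))
# {'ground-system-architecture': ['MissionDataStore'], 'simulation-architecture': ['ScenarioRepository']}
```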

This approach seemed to enhance the conversation and interest, so with great expectations I'm now putting this out to the wider community for consideration. Your comments are welcomed and appreciated.

 



More Stories By Kevin Jackson

Kevin Jackson, founder of the GovCloud Network, is an independent technology and business consultant specializing in mission critical solutions. He has served in various senior management positions including VP & GM Cloud Services NJVC, Worldwide Sales Executive for IBM and VP Program Management Office at JP Morgan Chase. His formal education includes MSEE (Computer Engineering), MA National Security & Strategic Studies and a BS Aerospace Engineering. Jackson graduated from the United States Naval Academy in 1979 and retired from the US Navy earning specialties in Space Systems Engineering, Airborne Logistics and Airborne Command and Control. He also served with the National Reconnaissance Office, Operational Support Office, providing tactical support to Navy and Marine Corps forces worldwide. Kevin is the founder and author of “Cloud Musings”, a widely followed blog that focuses on the use of cloud computing by the Federal government. He is also the editor and founder of “Government Cloud Computing” electronic magazine, published at Ulitzer.com.
