Log for Better Clouds - Part 8: Cloud Portability

What happens to your logs when the honeymoon is over?

Cloud Portability.

(In the context of Logs, of course!)

So the honeymoon is over.

The Cloud Provider that you so carefully selected is not performing as you expected, and you are eyeing the competition. You might even be considering re-insourcing some of your IT services.

So what happens to all the logs? As a customer, can you Trust that your Provider(s) will not let you down and mess with your logs?

Well, first off, whose logs are they? Are they the Provider's logs because they are generated by his physical equipment, or are they your logs because they trace your applications and your virtual systems?

Actually they're both at the same time. Let's see why both parties would need access to the logs and reports.

From a Customer perspective, logs are important because they reflect my business processes, and I need visibility into them. I need visibility into the usage of my applications, for purposes such as trending and capacity planning. I also need them for internal reports, to tell the Business Units how they fared, and I probably need them for chargeback.

For example, say that I'm using a Platform as a Service Provider to host applications that enable my sales team to better serve my customers. What is the usage trend for that application? Is the application's take rate growing, or did we reach a peak and need to reinvent ourselves? What kind of horsepower do we need to ensure good quality of service and keep our applications humming? How much do I need to bill each BU for its fair share of usage? Can I use this usage information to select my next Provider? What kind of usage expectations do I need to set? All of these are valid questions, and they can be answered by logs, or more specifically by the reports on logs.
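Just to make this concrete, here is a minimal sketch in Python of how such a chargeback report could be rolled up from usage logs. The log format, the field names (bu, duration_s) and the internal rate are hypothetical illustrations, not any particular Provider's schema.

```python
from collections import defaultdict
import csv

# Hypothetical log format: one CSV row per application request,
# recording which Business Unit made it and how long it took.
# Columns: timestamp,bu,app,duration_s
RATE_PER_SECOND = 0.002  # assumed internal chargeback rate, for illustration only

def chargeback_report(log_path):
    usage = defaultdict(float)  # seconds of compute consumed per BU
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            usage[row["bu"]] += float(row["duration_s"])
    # Turn raw usage into an internal bill per Business Unit.
    return {bu: round(seconds * RATE_PER_SECOND, 2)
            for bu, seconds in usage.items()}

if __name__ == "__main__":
    # "app_usage.csv" is a placeholder for whatever export the Provider delivers.
    print(chargeback_report("app_usage.csv"))
```

The same aggregation, run month over month, also gives the trending and capacity-planning views mentioned above.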

Providers have questions of their own: which services and systems are the most popular, how usage trends over time with granularity by time of day or period of the year, where resources should be focused, what the take rate is on each offer, where to better market those offers, and how much to charge customers. All of these questions can be answered with logs and log reports, so Providers also need to keep these logs.

OK, so both parties need to keep the logs or at least reports on the logs.

As a customer, I understand why my Provider needs to generate a last set of reports from "my" logs, and I will let him do it provided I get the guarantee that he will not be able to use this data to infer information about my business. Who knows, maybe he just signed my worst competitor as a client and might be tempted to give him access to my logs? The Provider can generate reports based on my logs, but is no longer allowed to access the raw logs.

And because I am changing Providers, there is no reason why I should still have access to the (old) raw logs either; I'm not paying usage fees anymore... By the same token, I do need a last set of reports based on "my" logs.

So the solution could be that raw logs are still stored for a while longer, but quarantined: they remain available in case of a dispute, or in case Law Enforcement needs access to them, but otherwise they can't be accessed.

Logs end up in a quarantined bucket for both parties, and any access to the raw logs or generation of reports should trigger alerts sent to both parties, so that there are proper checks and balances on raw log access and on report generation. This bucket does not necessarily need to be a physical bucket; it can be a logical one. All we need is a segregation mechanism that prevents unapproved accesses, or at least alerts on them.
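As a sketch of those checks and balances (with hypothetical names and notification endpoints, not any real Provider's API), every read of the quarantined raw logs could be forced through a single gate that alerts both parties:

```python
import datetime

# Hypothetical contacts; in practice these would be the notification
# endpoints both parties agree on when the contract is terminated.
PARTIES = ["security@customer.example", "audit@provider.example"]

def notify(recipient, message):
    # Placeholder: a real system might send email, a webhook, or a SIEM event.
    print(f"ALERT to {recipient}: {message}")

def read_quarantined_logs(path, requester, reason):
    """Single mediation point for raw-log access: every read alerts both parties."""
    stamp = datetime.datetime.utcnow().isoformat()
    for party in PARTIES:
        notify(party, f"{stamp} raw-log access by {requester} ({reason}) on {path}")
    with open(path, encoding="utf-8") as f:
        return f.readlines()

# Example: Law Enforcement requests the quarantined logs during a dispute.
# lines = read_quarantined_logs("quarantine/2017-09.log", "law_enforcement", "court order")
```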

Techniques for segregation have existed in the Industry for several years; for example, this is exactly how accesses are mediated in a data warehouse: all data is mixed together, including several owners' data, and access mediation is done at the logical level.
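Here is a minimal sketch of that kind of logical segregation, assuming a shared table with an owner column (an illustration, not any specific warehouse's schema): every query is scoped to the requesting tenant, so several owners' data can live side by side while access stays separated.

```python
import sqlite3

def query_logs(conn, tenant_id, since):
    """All tenants' logs share one table; mediation happens in the query layer,
    which always scopes results to the requesting tenant."""
    return conn.execute(
        "SELECT ts, message FROM logs WHERE owner = ? AND ts >= ?",
        (tenant_id, since),
    ).fetchall()

# Demo with an in-memory database mixing two owners' data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (owner TEXT, ts TEXT, message TEXT)")
conn.executemany("INSERT INTO logs VALUES (?, ?, ?)", [
    ("customer_a", "2017-09-01", "app login"),
    ("customer_b", "2017-09-01", "vm started"),
])
print(query_logs(conn, "customer_a", "2017-01-01"))  # only customer_a's rows
```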

Phew, so are we done with Trust?

In absolute terms we will never be done with Trust. Building Trust is a process, not an event, and all parties need to make constant efforts so that everyone else's level of Trust is maintained, and better yet improved.

I hope that by now, thanks to logs (how they have been collected, how we can prove their integrity, and the higher-level reports that use them), both Providers and Customers trust each other a little bit more and can engage in a mutually beneficial business relationship.

Next time, we'll talk about some specific use cases, starting with Pay Per Use.  Stay tuned!

More Stories By Gorka Sadowski

Gorka is a natural-born entrepreneur with a deep understanding of Technology, IT Security and how these create value in the Marketplace. He is today offering innovative European startups the opportunity to benefit from the Silicon Valley ecosystem of accelerators. Gorka has spent the last 20 years initiating, building and growing businesses that provide technology solutions to the Industry: General Manager for Spain, Italy and Portugal at LogLogic, defining Next Generation Log Management and Security Forensics; Director at Unisys France, bringing Cloud Security service offerings to the market; Director of Emerging Technologies at NetScreen, defining the Next Generation Firewall; and Director of Performance Engineering at INS, removing WAN and Internet bottlenecks. He has always been involved in innovative Technology and IT Security solutions, creating successful Business Units within established Groups and helping launch breakthrough startups such as KOLA Kids OnLine America, a social network for safe computing for children; SourceFire, a leading network security solution provider; and Ibixis, a boutique European business accelerator.

@CloudExpo Stories
As DevOps methodologies expand their reach across the enterprise, organizations face the daunting challenge of adapting related cloud strategies to ensure optimal alignment, from managing complexity to ensuring proper governance. How can culture, automation, legacy apps and even budget be reexamined to enable this ongoing shift within the modern software factory?
SYS-CON Events announced today that Elastifile will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 - Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Elastifile Cloud File System (ECFS) is software-defined data infrastructure designed for seamless and efficient management of dynamic workloads across heterogeneous environments. Elastifile provides the architecture needed to optimize your hybrid cloud environment, by facilitating efficient...
SYS-CON Events announced today that Grape Up will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct. 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Grape Up is a software company specializing in cloud native application development and professional services related to Cloud Foundry PaaS. With five expert teams that operate in various sectors of the market across the U.S. and Europe, Grape Up works with a variety of customers from emergi...
SYS-CON Events announced today that Grape Up will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct. 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Grape Up is a software company specializing in cloud native application development and professional services related to Cloud Foundry PaaS. With five expert teams that operate in various sectors of the market across the U.S. and Europe, Grape Up works with a variety of customers from emergi...
SYS-CON Events announced today that Golden Gate University will exhibit at SYS-CON's 21st International Cloud Expo®, which will take place on Oct 31 – Nov 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA. Since 1901, non-profit Golden Gate University (GGU) has been helping adults achieve their professional goals by providing high quality, practice-based undergraduate and graduate educational programs in law, taxation, business and related professions. Many of its courses are taug...
@DevOpsSummit at Cloud Expo taking place Oct 31 - Nov 2, 2017, at the Santa Clara Convention Center, Santa Clara, CA, is co-located with the 21st International Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is ...
DevOps at Cloud Expo, taking place October 31 - November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 21st Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is no time to w...
When shopping for a new data processing platform for IoT solutions, many development teams want to be able to test-drive options before making a choice. Yet when evaluating an IoT solution, it’s simply not feasible to do so at scale with physical devices. Building a sensor simulator is the next best choice; however, generating a realistic simulation at very high TPS with ease of configurability is a formidable challenge. When dealing with multiple application or transport protocols, you would be...
yperConvergence came to market with the objective of being simple, flexible and to help drive down operating expenses. It reduced the footprint by bundling the compute/storage/network into one box. This brought a new set of challenges as the HyperConverged vendors are very focused on their own proprietary building blocks. If you want to scale in a certain way, let’s say you identified a need for more storage and want to add a device that is not sold by the HyperConverged vendor, forget about it....
With Cloud Foundry you can easily deploy and use apps utilizing websocket technology, but not everybody realizes that scaling them out is not that trivial. In his session at 21st Cloud Expo, Roman Swoszowski, CTO and VP, Cloud Foundry Services, at Grape Up, will show you an example of how to deal with this issue. He will demonstrate a cloud-native Spring Boot app running in Cloud Foundry and communicating with clients over websocket protocol that can be easily scaled horizontally and coordinate...
21st International Cloud Expo, taking place October 31 - November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, will feature technical sessions from a rock star conference faculty and the leading industry players in the world. Cloud computing is now being embraced by a majority of enterprises of all sizes. Yesterday's debate about public vs. private has transformed into the reality of hybrid cloud: a recent survey shows that 74% of enterprises have a hybrid cloud strategy. Me...
DevOps at Cloud Expo, taking place October 31 - November 2, 2017, at the Santa Clara Convention Center in Santa Clara, CA, is co-located with 21st Cloud Expo and will feature technical sessions from a rock star conference faculty and the leading industry players in the world. The widespread success of cloud computing is driving the DevOps revolution in enterprise IT. Now as never before, development teams must communicate and collaborate in a dynamic, 24/7/365 environment. There is no time to w...
Recently, WebRTC has a lot of eyes from market. The use cases of WebRTC are expanding - video chat, online education, online health care etc. Not only for human-to-human communication, but also IoT use cases such as machine to human use cases can be seen recently. One of the typical use-case is remote camera monitoring. With WebRTC, people can have interoperability and flexibility for deploying monitoring service. However, the benefit of WebRTC for IoT is not only its convenience and interopera...
Vulnerability management is vital for large companies that need to secure containers across thousands of hosts, but many struggle to understand how exposed they are when they discover a new high security vulnerability. In his session at 21st Cloud Expo, John Morello, CTO of Twistlock, will address this pressing concern by introducing the concept of the “Vulnerability Risk Tree API,” which brings all the data together in a simple REST endpoint, allowing companies to easily grasp the severity of t...
In his session at 20th Cloud Expo, Scott Davis, CTO of Embotics, discussed how automation can provide the dynamic management required to cost-effectively deliver microservices and container solutions at scale. He also discussed how flexible automation is the key to effectively bridging and seamlessly coordinating both IT and developer needs for component orchestration across disparate clouds – an increasingly important requirement at today’s multi-cloud enterprise.
Connecting to major cloud service providers is becoming central to doing business. But your cloud provider’s performance is only as good as your connectivity solution. Massive Networks will place you in the driver's seat by exposing how you can extend your LAN from any location to include any cloud platform through an advanced high-performance connection that is secure and dedicated to your business-critical data. In his session at 21st Cloud Expo, Paul Mako, CEO & CIO of Massive Networks, wil...
Any startup has to have a clear go –to-market strategy from the beginning. Similarly, any data science project has to have a go to production strategy from its first days, so it could go beyond proof-of-concept. Machine learning and artificial intelligence in production would result in hundreds of training pipelines and machine learning models that are continuously revised by teams of data scientists and seamlessly connected with web applications for tenants and users.
When shopping for a new data processing platform for IoT solutions, many development teams want to be able to test-drive options before making a choice. Yet when evaluating an IoT solution, it’s simply not feasible to do so at scale with physical devices. Building a sensor simulator is the next best choice; however, generating a realistic simulation at very high TPS with ease of configurability is a formidable challenge. When dealing with multiple application or transport protocols, you would be...
WebRTC is great technology to build your own communication tools. It will be even more exciting experience it with advanced devices, such as a 360 Camera, 360 microphone, and a depth sensor camera. In his session at @ThingsExpo, Masashi Ganeko, a manager at INFOCOM Corporation, will introduce two experimental projects from his team and what they learned from them. "Shotoku Tamago" uses the robot audition software HARK to track speakers in 360 video of a remote party. "Virtual Teleport" uses a...
IT organizations are moving to the cloud in hopes to approve efficiency, increase agility and save money. Migrating workloads might seem like a simple task, but what many businesses don’t realize is that application migration criteria differs across organizations, making it difficult for architects to arrive at an accurate TCO number. In his session at 21st Cloud Expo, Joe Kinsella, CTO of CloudHealth Technologies, will offer a systematic approach to understanding the TCO of a cloud application...