Cloud Computing and the 4th Dimension

Unfortunately, when that 4th dimension is overlooked, many businesses conclude that “the Cloud doesn’t work”

It’s time to think about time.

According to Wikipedia, in physics, spacetime is any mathematical model that combines space and time into a single continuum. In cosmology, the concept of spacetime combines space and time into a single abstract universe. But you know we’re not here to talk about physics, cosmology, or the universe, as interesting as they may be.

For this moment in time, I want you to think of Cloud computing along the lines of space and time, because too many businesses shifting to the Cloud are focused on space, and space alone. You see, traditionally, time has not been a factor for business or enterprise IT, so it was rarely given much consideration.

Unfortunately, when that 4th dimension is overlooked, many businesses conclude that “the Cloud doesn’t work”.

If you’re shifting your business to the Cloud, it is imperative to understand your processes and the underlying technology infrastructure. It’s true that infrastructure can be abstracted, and that an application may simply require resources (processing, memory, storage) to be available. Still, we need to remember that for the business logic to work, all the necessary components must be available when the business logic (or tool) is executing.

In a traditional enterprise setting, that was not an issue, because servers, once put into operation in your datacenter, were not usually taken down. If servers were taken down, it was planned and coordinated during an outage window; if the outage was unplanned or an emergency, the focus was on bringing them back into operation. So your servers were always “hot” or “live” 24 hours a day, 7 days a week, within or across departments. In many cases, inter-departmental coordination was non-existent or simply unnecessary.

With a transition to the Cloud, one of the cost savings is shutting VMs down when they’re not in use. This is where you have to think four-dimensionally.

If you have an automated process that runs in the evenings and traverses multiple departments, say departments A, B, and C, and you get unexpected results or the process does not complete, you’ll have to consider the availability of those departments’ resources during the window in which the process runs. Department B may have been achieving cost savings by shutting down its VM during evenings and weekends, when no one in the department was using it, unwittingly breaking an enterprise process as it did so.
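To make that concrete, here is a minimal sketch of what a coordinated shutdown policy could look like: Department B’s VM is only powered down when the department is idle and the nightly cross-department job is not scheduled to run. The idle hours, the batch window, and the stop_vm call are all hypothetical placeholders for whatever your scheduler or cloud provider actually exposes.

```python
from datetime import datetime, time

# Hypothetical shutdown policy for Department B's VM.
# Idle window: evenings and weekends.
# Exception: the nightly enterprise process that traverses departments A, B, and C.

DEPT_B_IDLE_START = time(19, 0)          # department is idle after 7:00 PM
DEPT_B_IDLE_END = time(7, 0)             # and before 7:00 AM the next day
BATCH_WINDOW = (time(1, 0), time(4, 0))  # assumed window for the nightly cross-department job


def dept_b_vm_may_shut_down(now: datetime) -> bool:
    """Allow shutdown only when Department B is idle AND the
    enterprise batch job is not due to run on this VM."""
    t = now.time()
    weekend = now.weekday() >= 5  # Saturday or Sunday
    idle = weekend or t >= DEPT_B_IDLE_START or t < DEPT_B_IDLE_END
    in_batch_window = BATCH_WINDOW[0] <= t < BATCH_WINDOW[1]
    return idle and not in_batch_window


if __name__ == "__main__":
    if dept_b_vm_may_shut_down(datetime.now()):
        print("Safe to stop Department B's VM for cost savings.")
        # stop_vm("dept-b-app-01")  # hypothetical call to your provider's API
    else:
        print("Keep the VM up: the department or the nightly job still needs it.")
```

The point of the sketch is simply that the shutdown decision has two inputs, not one: the department’s own usage and the enterprise processes that happen to run on that VM.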

Historically, we’ve always assumed that the resources would be available when we kicked off an automated process intended to run while we are not working, whether it’s an enterprise billing tool, a security scan, a backup, or other business logic. Now we have to consider not only where the application and process reside, but also which resources the application or tool requires and what their availability state will be.
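In the same spirit, the automated process itself can check the availability state of the resources it depends on before it starts, and either bring them up or stop with a clear error instead of producing incomplete results. Below is a rough sketch assuming AWS EC2 and the boto3 SDK; the instance IDs are made-up placeholders, and other providers offer equivalent describe, start, and wait calls.

```python
import boto3

# Hypothetical instance IDs for the department A, B, and C VMs
# that the nightly enterprise process depends on.
REQUIRED_INSTANCES = [
    "i-0aaaaaaaaaaaaaaaa",  # department A
    "i-0bbbbbbbbbbbbbbbb",  # department B
    "i-0cccccccccccccccc",  # department C
]

ec2 = boto3.client("ec2")


def ensure_dependencies_running(instance_ids):
    """Check the availability state of every required VM, start any
    that were shut down, and wait until they are actually running."""
    resp = ec2.describe_instances(InstanceIds=instance_ids)
    stopped = [
        inst["InstanceId"]
        for reservation in resp["Reservations"]
        for inst in reservation["Instances"]
        if inst["State"]["Name"] != "running"
    ]
    if stopped:
        ec2.start_instances(InstanceIds=stopped)
        ec2.get_waiter("instance_running").wait(InstanceIds=stopped)
    return stopped


if __name__ == "__main__":
    restarted = ensure_dependencies_running(REQUIRED_INSTANCES)
    if restarted:
        print(f"Started {len(restarted)} dependency VM(s) before the nightly run.")
    # ... kick off the cross-department process here ...
```

Whether you restart the VMs automatically or simply alert and abort is a policy choice; what matters is that the job no longer assumes its resources are “always on.”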

So as you move to the Cloud, remember to consider the time factor, and coordinate the availability of your VMs accordingly. It’ll save you a lot of time. The Cloud does work; you just have to know how to use it.

How are you handling the Cloud-time factor?

-Tune The Future-

Twitter: @RayDePena | LinkedIn | Facebook | Google+


More Stories By Ray DePena

Ray DePena worked at IBM for over 12 years in various senior global roles in managed hosting sales, services sales, global marketing programs (business innovation), marketing management, partner management, and global business development.
His background includes software development, computer networking, systems engineering, and IT project management. He holds an MBA in Information Systems, Marketing, and International Business from New York University’s Stern School of Business, and a BBA in Computer Systems from the City University of New York at Baruch College.

Named one of the World's 30 Most Influential Cloud Computing Bloggers in 2009, Top 50 Bloggers on Cloud Computing in 2010, and Top 100 Bloggers on Cloud Computing in 2011, he is the Founder and Editor of Amazon.com Journal, Competitive Business Innovation Journal, and Salesforce.com Journal.

He currently serves as an Industry Advisor for the Higher Education Sector on a National Science Foundation Initiative on Computational Thinking. Born and raised in New York City, Mr. DePena now lives in northern California. He can be followed on:

Twitter: @RayDePena | LinkedIn | Facebook | Google+
