By Irfan Khan
December 13, 2009 07:00 PM EST
Doing more with less is a familiar refrain for IT professionals, and today's challenging business environment has only increased the pressure on managers to achieve efficiencies, maximize performance and improve responsiveness of the data center. More and more frequently, IT is turning to virtualization to accomplish its mission-critical goals.
The hot new trend in cloud computing is a natural extension of this drive toward virtualization. In the case of the public cloud, IT can add processing power and infrastructure as needed, and in the case of the private cloud, IT can improve the utilization of existing infrastructure. In other words, cloud computing platforms offer IT the opportunity to increase efficiencies and become more agile, transforming the data center into an environment that delivers greater benefits to end-users.
This tremendous potential of cloud computing can be seen by examining how organizations currently manage, analyze and mobilize data. Cloud has given IT organizations the opportunity to fundamentally shift the way data is created, processed and shared.
In order to take advantage of cloud computing for data management, IT must familiarize itself with the latest issues and trends. This requires a greater understanding of each of the three prominent cloud models - private, public and hybrid - along with proper evaluation of the criteria around a cloud strategy.
Cloud Gazing: Three Prominent Models
Much like those ubiquitous, puffy white masses drifting through the troposphere, each of the prominent cloud computing models has its own unique qualities and benefits. The type of cloud strategy an organization chooses will depend on the specific issues it is addressing.
For IT organizations operating a virtual data center environment, a private or internal cloud may offer important advantages. An internal cloud is an effective way to boost productivity while giving users a measure of self-sufficiency, and such a deployment also mitigates the need to purchase new systems. Through virtualization and cloud infrastructure, users can self-provision a virtual machine in the cloud environment, which can easily be expanded or contracted as a given project reaches fruition. Once IT develops tools to handle life-cycle management of virtual machines, a user can request one through a menu, specifying resource requirements (CPUs, memory, storage), operating system details, lease period and other configuration information.
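The self-provisioning flow described above can be sketched in a few lines. This is a minimal, hypothetical illustration, not any vendor's API: the `VMRequest` class, `provision` function and the capacity pool are all invented here to show how a request's resource fields are checked against remaining pool capacity.

```python
# Hypothetical sketch of a self-service VM request in an internal cloud.
# VMRequest, provision() and the capacity pool are illustrative only; real
# private-cloud platforms expose their own provisioning APIs.
from dataclasses import dataclass


@dataclass
class VMRequest:
    cpus: int
    memory_gb: int
    storage_gb: int
    os: str
    lease_days: int


def provision(request: VMRequest, capacity: dict) -> dict:
    """Validate a request against remaining pool capacity and allocate it."""
    if request.cpus > capacity["cpus"] or request.memory_gb > capacity["memory_gb"]:
        raise ValueError("insufficient capacity in the pool")
    capacity["cpus"] -= request.cpus
    capacity["memory_gb"] -= request.memory_gb
    return {"status": "provisioned", "os": request.os, "lease_days": request.lease_days}


pool = {"cpus": 64, "memory_gb": 256}
vm = provision(
    VMRequest(cpus=4, memory_gb=16, storage_gb=100, os="Linux", lease_days=30), pool
)
```

The point of the sketch is that the menu the user fills in maps directly onto a small, structured request object, which the life-cycle tooling can then validate, track and reclaim when the lease expires.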
An internal cloud can provide critical efficiencies: avoided data center expansion, faster provisioning of server capacity, faster upgrades of memory, CPU and storage, improved reliability, and stronger business continuity/disaster recovery. Case in point: a major financial services provider recently turned its compute backbone into an internal cloud focused on hosting virtual desktops. The company found it easy to share that infrastructure with the computational jobs that run after the local trading day ends, yielding a clean segregation of workloads in the local data center. The solution has also delivered faster ROI.
Intense interest in public or external cloud strategies is being fueled by the potential for improved efficiencies, flexible and dynamic environments, on-demand infrastructure and smaller maintenance requirements. External cloud services offer the potential for improved business agility, better scalability and greater versatility. One of the world's leading electronic stock exchanges, for example, turned to an external cloud provider to enable brokerage firms to show customers and regulators that best-execution requirements were met for a given trade. An application was written to upload data for every stock in files representing ten-minute increments of trading data from the major exchanges. This allows an accurate reconstruction of the trading environment without building out an internal storage infrastructure strictly for emulation (or testing) purposes. A further benefit of this cloud deployment has been the flexibility of pay-as-you-go pricing.
In today's challenging economy, with IT increasingly finding itself strapped for resources, external cloud services are proving to be an effective means for lowering upfront costs and reducing the workload on overburdened IT staff.
Experts agree that the hybrid cloud environment, a mix of in-house and outsourced computing and networking resources, will be a leading choice for enterprises in the near term. The hybrid model offers the greatest flexibility when dealing with the dynamic data requirements of most businesses today.
Regardless of the model, cloud computing is proving to be highly attractive because of its dynamic infrastructure, ability to support any application and operating system, accessibility via Internet protocols, automatic scalability and the fact that no software or hardware installation is required on site. Additional benefits for end-users include the:
- ability to shift resources from undifferentiated heavy lifting to differentiated value creation,
- abstraction from infrastructure,
- "pay for what you use" approach and
- adaptability to resource requirements.
As a result, cloud computing is fast becoming an important tool in the IT arsenal.
Cloud Potential: Classes of Applications Deserving of Deployment
At present, cloud isn't for everyone. Where the most stringent privacy and security policies are in place, a public cloud offering may not satisfy the regulatory requirements IT must meet. Conversely, the applications that lend themselves most readily to cloud are those whose user interface is easily presented in a Web browser and where time sensitivity isn't an issue, such as sales force automation or customer relationship management.
Some classes of applications in use today that might not appear to be obvious cloud candidates will in fact be deployed to the cloud in the near future. Increasingly, we are seeing more functions of data management and data analytics being moved to cloud environments with great success.
One such class of applications is database management systems. For organizations with large databases and high transaction volumes, cloud enables a measure of strategic agility that can meet today's enterprise needs. Sybase Adaptive Server Enterprise (ASE) is a prime example of a mission-critical data management system that allows IT to get a handle on exploding information demands. This application provides access and control over data, transforming information into a vital, accessible and decisive mission asset made all the more effective when utilized in cloud.
As organizations employ larger data warehouses for purposes ranging from standard reporting to strategic business analytics, complex event processing and deep-dive data mining, the need for performance will continue to outpace the capabilities of traditional relational databases. Cloud computing offers IT a powerful means for meeting the performance and scalability requirements of the enterprise data warehouse. With large-scale systems continuing to expand, alternate approaches to support standard reporting, analysis, and power-user ad hoc queries will become increasingly established as the platforms of choice for very large database systems.
As with other cloud environments, analytics in the cloud benefit end-users by offering a pay-as-you-go model and adaptable resource requirements, freeing IT from purchasing additional hardware and going through an extensive procurement process.
In order to satisfy the rapidly expanding need for analytical performance, Sybase offers a solution well suited to exploit the benefits of cloud. Sybase IQ takes an alternate database approach, storing data oriented by columns instead of rows. This approach has proven superior in sustaining the performance and rapid growth requirements of analytical applications and, when combined with cloud computing, offers significant advantages. The column-oriented methodology provides a combination of architectural simplicity and the ability to configure data in a way that can reduce the physical amount of data that must be accessed. Reducing the storage footprint while optimizing column access will reduce data access latency and improve use of network bandwidth, thereby contributing to a scalable environment that continues to provide consistently high performance as data volumes, number of users and number of queries increase.
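The core idea behind column orientation can be illustrated without any database at all. The sketch below is a toy model, not Sybase IQ: it pivots the same three records into per-column arrays and counts how many field values an aggregate query must touch under each layout.

```python
# Illustrative toy model of row-store vs. column-store data access.
# A query that aggregates one column touches only that column's values in a
# column store, versus every field of every row in a row store.
rows = [
    {"id": 1, "region": "East", "amount": 120.0},
    {"id": 2, "region": "West", "amount": 75.5},
    {"id": 3, "region": "East", "amount": 200.0},
]

# Row-oriented: SUM(amount) must walk whole row records.
row_total = sum(r["amount"] for r in rows)

# Column-oriented: the same data pivoted into per-column arrays;
# SUM(amount) reads a single contiguous column.
columns = {key: [r[key] for r in rows] for key in rows[0]}
col_total = sum(columns["amount"])

fields_touched_row_store = len(rows) * len(rows[0])  # every field of every row
fields_touched_col_store = len(columns["amount"])    # only the queried column
```

Both layouts return the same answer, but the column layout reads a third of the field values here, and the gap widens with every additional column in the table. That is the scan reduction the paragraph above attributes to column-oriented storage.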
With a growing trend toward mobility in the workforce, IT faces increased demand for access to mobilized applications. Managing data and exchange technologies can place significant demands on IT, necessitating regular onsite maintenance of mobile devices. Cloud offers IT the scalability, flexibility, performance and responsiveness needed to deploy and manage mobile applications. For example, Sybase SQL Anywhere provides data management and data exchange technologies that enable the rapid development and deployment of database-powered applications. Design and management tools within SQL Anywhere enable IT to implement and deploy mobile applications, while also easily providing support.
For one major telecommunications provider, cloud computing is key in offering Web-interfaced applications hosted and deployed in a managed service (SaaS) model. This company leverages Sybase's Afaria solution, which provides comprehensive management and security capabilities to ensure that mobile data and devices are up-to-date, reliable and secure. The provisioning and scalability advantages of cloud make it possible for the company to offer an integrated suite of managed mobility applications for their customers that can be contracted individually or in combination. Afaria enables the company to offer its hosted applications, and assist customers in managing the deployment, expenses and ongoing support to a global mobile workforce without technical complexity and security risks.
Cloud computing can enable IT to more effectively handle the wide ranging database requirements of mobile workers, while minimizing the impact on end-users, thus allowing them to focus on the work at hand rather than getting bogged down with the technology.
There are numerous considerations IT should take into account when pursuing a data center cloud strategy. These range from security and compliance requirements to the need for a rich user interface, effective management and control, and the need to establish standards, provide support and improve performance.
In the case of multi-tenant applications, IT must consider the separation of data from a security, privacy and compliance perspective. Data bound by strict privacy regulations, such as medical information covered by the Health Insurance Portability and Accountability Act (HIPAA), will require that tenants log in, after which they are routed directly to their own secure database server. In this way, data remains separate, secure and in compliance with government privacy mandates.
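The per-tenant routing step can be sketched as a simple registry lookup. The tenant names and host names below are hypothetical, invented purely to illustrate the pattern of directing each authenticated tenant to its own dedicated database server.

```python
# Hypothetical sketch of per-tenant database routing: after login, each tenant
# is directed to its own database server, keeping regulated data physically
# separate. The registry contents are illustrative only.
TENANT_DATABASES = {
    "clinic-a": "db-server-1.internal",
    "clinic-b": "db-server-2.internal",
}


def route_tenant(tenant_id: str) -> str:
    """Return the dedicated database host for an authenticated tenant."""
    try:
        return TENANT_DATABASES[tenant_id]
    except KeyError:
        raise PermissionError(f"unknown tenant: {tenant_id}")
```

Because every tenant resolves to a different physical database, no query can ever cross tenant boundaries, which is what makes this layout attractive for HIPAA-class data despite its higher provisioning cost.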
Another factor IT must consider is the impact of application performance on end users. If one tenant is running a large, complex report, overall performance on the database server can be slowed. IT must gauge the sensitivity of other users to such performance issues and factor this into decisions regarding external cloud providers as well as internal cloud initiatives.
In terms of choosing a cloud schema, IT can establish a single database on the server with separate schemas, allowing one tenant to employ one set of tables within the database while another tenant utilizes a different set of tables in that same database. In this scenario, the cost of provisioning is lower, but there is a trade-off in performance and data isolation. A second option is a shared schema, where all tenants' data is contained in the same tables and each row carries a tenant identifier column. This option is the most cost-effective to provision, but application development becomes more complex, since every query must filter on the tenant identifier.
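The shared-schema option can be shown concretely. The sketch below uses Python's built-in SQLite, with an invented `orders` table and tenant names, purely to illustrate the pattern: one table for everyone, a `tenant_id` column on each row, and a filter on that column in every query.

```python
# Sketch of the shared-schema option: all tenants' rows live in one table,
# and every row carries a tenant identifier that each query must filter on.
# In-memory SQLite and the table/tenant names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (tenant_id TEXT, order_id INTEGER, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("acme", 1, 99.0), ("acme", 2, 10.0), ("globex", 1, 42.0)],
)


def tenant_orders(tenant_id: str):
    # Every query is scoped by tenant_id; the isolation burden shifts
    # from the database layout to the application code.
    return conn.execute(
        "SELECT order_id, total FROM orders WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()


acme_rows = tenant_orders("acme")  # only acme's rows come back
```

The added application complexity the paragraph mentions is visible here: a single forgotten `WHERE tenant_id = ?` clause would leak one tenant's rows to another, which is exactly the trade-off against the costlier separate-database and separate-schema options.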
Operational Considerations of Cloud
When hosting data in the cloud, IT must take various operational considerations into account. These considerations should include how backups will be performed and how often data will be backed up. Additional operational details, such as the existence of offsite storage and the robustness of disaster recovery protocols should be examined.
A further consideration IT should take into account as cloud computing gains wider acceptance is the ability for users to access the data center from both desktop and mobile environments. When considering mobile access, IT must evaluate technical considerations, such as how users will traverse an organization's firewall. Existing security requirements around data, and the ability to synchronize that data to mobile devices when appropriate, clearly add to the complexity of cloud deployment. Areas such as these should be top of mind for IT as they move forward.
The Sky Is the Limit With Cloud
As IT seeks to maximize the efficiency, performance and responsiveness of the data center, virtualization and cloud computing are providing important new tools to meet mission-critical goals. Cloud is opening the door for IT to get the most out of vital processing power and infrastructure.
The tremendous potential of cloud computing is fundamentally shifting the way data is created, processed and shared. The most prominent cloud models (private, public and hybrid) are already transforming the data center into an environment that delivers significant benefits to end-users in how they access and manage data.
As new strategies evolve and more classes of applications become viable in the cloud, IT has the opportunity to dramatically improve the agility, flexibility, performance and efficiency of data centers which, in turn, will ultimately lead to bottom-line organizational success.