By Lori MacVittie
June 3, 2014 10:00 AM EDT
It's probably no surprise that I have long advocated the position that hybrid cloud would eventually become "the standard" architecture with respect to, well, cloud computing. As the dev/ops crowd at GlueCon was recently reminded by the self-styled "most obnoxious man in cloud," Josh McKenty, you can only add to what exists in the data center. You can't simply rip and replace; forklifts are not allowed, and allowances must be made for integrating with existing systems, no matter how onerous that might be. The future is, as he put it, open and closed, traditional and modern, automated and human.
I would add that, with respect to cloud, it is both public and private.
Hybrid cloud models were inevitable for all these reasons and more. Suffice it to say that there is unlikely to be a technology that will turn data centers into the green fields every starry-eyed young architect and engineer wishes they could be.
So if the question is no longer what cloud model will ultimately win the hearts and minds of the enterprise, the question must turn to other more tactical concerns, such as integrating the two models into a seamless, well-oiled machine.
Right now, hybrid cloud models are disconnected and managed manually. Oh, there are scripts and APIs, yes. But those are mainly concerned with provisioning and management. They aren't about actually using the cloud as the extension of the data center it was promised to be. They're still separate entities, for the most part, and treated as such. They're secondary and tertiary data centers. Stand-alone centers of computing that remain as disconnected operationally as they are physically.
They aren't a data center fabric, yet, even though the unicorn and rainbow goal of hybrid cloud is to achieve just that: distributed resources that act as a single, unified entity. Like a patchwork quilt, sewn from many different blocks but in the end, a single cohesive product. If not in topology, then in usage. Which is the point of many technologies today: abstraction. Abstraction enables the decoupling of interface from implementation, applications from networks, and control from data.
Doing so liberates applications (which are, ultimately, the reason for what we all do) from being bound to a given location, frees resources to meld with the broader data center fabric, and offers the business greater freedom.
But it isn't just the applications that must be unchained from the data center jail. It is the numerous services within the network that support those applications that must also be set free. Security. Availability. Identity. Access. Performance. Applications are not islands, they are worlds unto themselves comprised of a variety of network and application services that must accompany them as they traverse these new, unfettered boundaries.
As Barrett Lyon, founder of Defense.Net put it so well in his recent blog, what we need is to seamlessly merge these environments without concern for their physical separation:
By having such a solid foundation, the next step is to seamlessly merge the DDoS defense network with F5’s hardware to create the world’s first true hybrid cloud. The vision is that customers can create their own local DDoS defense, and when volumetric attacks hit, at a specific point they’re “automatically” offloaded to the cloud.
Barrett's proposal regarding a hybrid DDoS model carries with it shades of cloud bursting for applications, but goes a step further with the notion that hybrid cloud (at least for DDoS) should be seamless. And why shouldn't it? The definition of cloud brokers includes this capability: seamlessly automating the provisioning of services and applications based on some relevant criteria. For DDoS, certainly there is a consideration of bandwidth consumption. For applications, it may be demand and capacity, or it might consider costs and the location of the user.
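As a rough illustration of that criteria-driven decision, here is a minimal sketch of the kind of logic a broker might apply. All names, thresholds, and the Metrics type are hypothetical assumptions for illustration, not any vendor's actual API:

```python
# Hypothetical sketch of a broker's offload decision.
# Thresholds and field names are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Metrics:
    inbound_gbps: float       # current bandwidth consumption
    capacity_used: float      # fraction of local capacity in use
    cost_per_hour_local: float
    cost_per_hour_cloud: float

def choose_environment(m: Metrics,
                       ddos_gbps_threshold: float = 10.0,
                       capacity_threshold: float = 0.8) -> str:
    """Return 'cloud' when work should offload to the cloud, else 'local'."""
    # Volumetric attack: offload scrubbing to the cloud, per the DDoS model.
    if m.inbound_gbps > ddos_gbps_threshold:
        return "cloud"
    # Demand/capacity: burst when local capacity is nearly exhausted...
    if m.capacity_used > capacity_threshold:
        # ...but only if the cloud is actually cheaper to run right now.
        if m.cost_per_hour_cloud < m.cost_per_hour_local:
            return "cloud"
    return "local"

print(choose_environment(Metrics(25.0, 0.3, 1.0, 2.0)))  # volumetric attack -> cloud
print(choose_environment(Metrics(1.0, 0.9, 3.0, 1.5)))   # near capacity, cloud cheaper -> cloud
print(choose_environment(Metrics(1.0, 0.2, 1.0, 2.0)))   # normal load -> local
```

The point is not these particular thresholds, but that the decision is automated against whatever criteria matter: bandwidth for DDoS, demand and cost for applications.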
The criteria are not so much the important point but rather it is the capability to achieve this functionality. To be able to seamlessly take advantage of a data center distributed across multiple environments, both on-premise and cloud, both public and private. We've seen the beginnings of these types of seamless integrations with cloud identity federation - the use of standards like SAML to promote access control over applications that reside beyond the corporate borders but within its overall perimeter.
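To make the federation idea concrete: a cloud-hosted application can sit outside the corporate border yet inside its perimeter by trusting only assertions issued by the corporate identity provider. The sketch below is a simplified stand-in for what a SAML assertion conveys (issuer, subject, validity window); the issuer URL and field names are hypothetical, and no real SAML library is used:

```python
# Illustrative sketch of identity federation: a cloud-hosted app
# accepting users asserted by the corporate IdP. Field names are
# simplified stand-ins for a SAML assertion, not an actual library.
import time
from dataclasses import dataclass
from typing import Optional

# Hypothetical corporate identity provider the cloud app trusts.
TRUSTED_ISSUERS = {"https://idp.corp.example.com"}

@dataclass
class Assertion:
    issuer: str
    subject: str
    not_on_or_after: float  # expiry, epoch seconds

def accept(assertion: Assertion, now: Optional[float] = None) -> bool:
    """Accept the user only if the assertion comes from a trusted
    corporate IdP and has not expired: access control stays within
    the corporate perimeter even though the app lives beyond it."""
    now = time.time() if now is None else now
    return assertion.issuer in TRUSTED_ISSUERS and now < assertion.not_on_or_after
```

In a real deployment the assertion's signature would also be verified against the IdP's certificate; the sketch only shows the trust relationship that lets the perimeter extend past the border.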
Corporate borders are expanding. They must necessarily include all manner of cloud environments, and those environments cannot continue to be disconnected operational islands. If the future is hybrid and composable, we ought to be able to manage such an environment more seamlessly, with greater attention to architectures that not only accept that premise but exploit it to the advantage of IT and the business.