By Brian Gracely
November 28, 2012 08:45 AM EST
At VMworld this year, both in San Francisco and Barcelona, VMware CEO Pat Gelsinger introduced the concept of the Software-Defined Datacenter (SDDC). The vision builds on the idea that as more of the Data Center becomes virtualized (servers, desktops), delivering greater cost savings and agility to customers, software-defined automation and functionality (network, security, storage, backup) become the next logical steps in helping IT deliver greater value to the business.
As with any new technology or vision, there are many questions about how this will impact the market and how it will affect IT organizations. Wikibon did a nice job providing their view on "Software-led Infrastructure". It's one of many attempts I've seen to put a scope around this concept. Some portions are agreed upon, while others are creating some headaches.
I created this short FAQ to help answer some of those questions:
1. VMware is using a new term, "Software-Defined Datacenter" (SDDC), at the center of the 2012 conference. What is Software-Defined Datacenter?
[Steve Herrod blog]. Software Defined Data Center is VMware's vision that greater business value can be created from IT when intelligent software is abstracted from standardized hardware. In the simplest technical definition, it is the separation (or abstraction) of the "control plane" (configuration, topology awareness, management, operations) from the "data plane" (moving data, storing data).
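To make the control-plane/data-plane separation concrete, here is a minimal Python sketch. All class, subnet, and port names are illustrative, not taken from VMware's products: the point is only that the control plane holds topology awareness and compiles it into rules, while the data plane just matches and forwards with no topology knowledge of its own.

```python
class ControlPlane:
    """Knows the topology; compiles it into simple forwarding rules."""
    def __init__(self, topology):
        # topology: {destination_subnet: output_port} (illustrative mapping)
        self.topology = topology

    def compile_flow_table(self):
        # A real controller would run a path-computation algorithm here;
        # this sketch just hands the mapping down to the data plane.
        return dict(self.topology)


class DataPlane:
    """Holds no topology knowledge; just matches packets against a table."""
    def __init__(self, flow_table):
        self.flow_table = flow_table

    def forward(self, dst_subnet):
        # Unknown destinations are dropped.
        return self.flow_table.get(dst_subnet, "drop")


controller = ControlPlane({"10.0.1.0/24": "port-1", "10.0.2.0/24": "port-2"})
switch = DataPlane(controller.compile_flow_table())
print(switch.forward("10.0.1.0/24"))    # port-1
print(switch.forward("192.168.0.0/16"))  # drop
```

Because the intelligence lives in `ControlPlane`, it can be updated or replaced in software without touching the forwarding device, which is the essence of the abstraction.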
1a. Is there a clear spelling of this term?
- Meh. Maybe, but it will have at least 3-5 variations in 2013. Just call it "SDDC" and save yourself a lot of auto-correct headaches.
2. Is there a clear, agreed upon definition (or standard) for Software-Defined Datacenter at this time?
- Software-Defined Datacenter is not defined by an existing standards body (e.g. IETF, ITU, NIST); rather, it is a vision for the evolution of how Data Center environments will become more flexible in responding to business demands. SDDC builds upon the abstraction that server virtualization has created and extends this to broader elements of the Data Center (e.g. network, storage), as well as expanding the role that automation will play in the future.
3. How is "Software-Defined Datacenter" different than "Cloud"?
- Cloud (or Cloud Computing) is fundamentally a new operational model for IT, where resources are delivered on-demand. While Cloud uses technologies such as virtualization or converged infrastructure, it's primarily about the shift in delivery and consumption of IT services. Software Defined Data Center is the next evolution of the underlying technology, where software delivers greater levels of intelligence and value, on top of standardized hardware.
4. Does Software-Defined Datacenter eliminate the need for traditional Data Center hardware?
- No. There will still be a need for physical servers (CPU, memory), network devices to connect ports and deliver bandwidth, and devices that can store data on flash/disk/tape. But the trend in the industry is that these devices are becoming more standardized on x86 chips, mass-produced memory/disks and mass-produced ASICs. This trend should allow faster, more simplified "fabrics" (interconnecting servers, networks and storage) to be built, with the intelligence for policy, security and operations continuing to move into software, which is faster to develop and adapt to changing business requirements. Leading companies have been shifting their product strategies to embrace this trend for the last few years.
5. Which market segments and use cases does Software-Defined Datacenter target?
- Software-Defined Datacenter technologies are applicable to markets of all sizes (Enterprise, Mid-Market, Service Provider), but the initial adopters have been large Service Providers attempting to solve challenges in large-scale Data Centers. As competition for Public and Hybrid Cloud services increases (Amazon, Google, Rackspace, Microsoft, Cloud Service Providers), the need to drive greater operational efficiency, and to reduce the associated costs and time-to-market, is pushing them to solve problems in new software-centric ways.
- As more Enterprise and Mid-Market customers adopt Private Cloud and deliver IT-as-a-Service, I also expect SDDC technologies to evolve to solve challenges at different scales, as well as user-centric challenges such as BYOD.
6. How will Software-Defined Datacenter impact IT organizations?
- More than ever, the current era of IT is defined by rapid change, whether in new devices (smartphones, tablets), new application consumption models (PaaS, SaaS), or converging technology silos (virtualization, converged infrastructure). Software-Defined Datacenter is the next step in converging functional areas, while attempting to give IT the ability to respond to business challenges faster.
7. Is Software-Defined Datacenter a competitive threat to traditional hardware companies?
- As mentioned above, Software-Defined Datacenter does not eliminate the need for physical hardware within the Data Center. Rather it is a vision to enable customers to better take advantage of the trend towards delivering software intelligence on standardized hardware. As with many technology transitions, there are opportunities to evolve technology portfolios, evolve business models and unlock new partnership opportunities.
8. Is Software-Defined Datacenter explicitly linked with open-source technologies such as OpenStack, OpenFlow or Open vSwitch?
- While there are open-source projects today that will influence Software-Defined Datacenters, open source is by no means the only delivery mechanism through which customers can obtain the technology needed for this evolution. A few examples:
- VMware's acquisition of Nicira - while Nicira was a major contributor to the OpenStack Quantum project (network virtualization) and the Open vSwitch project, both of which are open source, its core NVP product was a commercial offering.
- OpenFlow is a standards-based protocol for software-defined networking that can be implemented by any vendor, in either open-source or commercial products.
- "Project Razor" is an open-source project that was jointly created by EMC and Puppet Labs to deliver advanced server and application automation for Data Center and Cloud environments. The software can be used with either commercial products (eg. VMware vSphere, Cisco UCS, etc.) or open-source projects (OpenStack, KVM, CloudFoundry)