What Is Software-Defined Datacenter (SDDC)?

At VMworld this year, both in San Francisco and Barcelona, VMware CEO Pat Gelsinger introduced the concept of the Software-Defined Datacenter (SDDC). The idea builds on a simple progression: as more and more of the Data Center becomes virtualized (servers, desktops), delivering greater cost savings and agility to customers, software-defined automation and functionality (network, security, storage, backup) become the next logical steps to help IT deliver greater value to the business.

As with any new technology or vision, there are many questions about how it will impact the market and how it will affect IT organizations. Wikibon did a nice job providing their view on "Software-led Infrastructure". It's one of many attempts I've seen to put a scope around this concept. Some portions are agreed upon, while others are creating some headaches.

I created this short FAQ to help answer some of those questions:

1. VMware put a new term, "Software-Defined Datacenter" (SDDC), at the center of its 2012 conference. What is the Software-Defined Datacenter?
[Steve Herrod blog]. Software Defined Data Center is VMware's vision that greater business value can be created from IT when intelligent software is abstracted from standardized hardware.  In the simplest technical definition, it is the separation (or abstraction) of the "control plane" (configuration, topology awareness, management, operations) from the "data plane" (moving data, storing data).
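
To make the control-plane/data-plane split a little more concrete, here is a minimal Python sketch. It is purely illustrative (every class and name in it is hypothetical, and it is not how any VMware product is implemented), but it shows the basic pattern: the control plane holds desired state and decides what should change, while the data plane only executes the low-level actions it is told to perform.

```python
# Hypothetical illustration only -- not a real VMware or SDDC API.
from dataclasses import dataclass, field


@dataclass
class DataPlane:
    """Dumb execution layer: it knows how to create and delete VMs, nothing more."""
    running_vms: set = field(default_factory=set)

    def create_vm(self, name: str) -> None:
        print(f"[data plane] creating VM {name}")
        self.running_vms.add(name)

    def delete_vm(self, name: str) -> None:
        print(f"[data plane] deleting VM {name}")
        self.running_vms.discard(name)


class ControlPlane:
    """Holds desired state (configuration, topology) and reconciles the data plane to it."""

    def __init__(self, data_plane: DataPlane) -> None:
        self.data_plane = data_plane
        self.desired_vms: set = set()

    def declare(self, *vm_names: str) -> None:
        # The operator describes intent; nothing is executed yet.
        self.desired_vms = set(vm_names)

    def reconcile(self) -> None:
        # Compare desired state with observed state and issue only the needed changes.
        for name in self.desired_vms - self.data_plane.running_vms:
            self.data_plane.create_vm(name)
        for name in self.data_plane.running_vms - self.desired_vms:
            self.data_plane.delete_vm(name)


if __name__ == "__main__":
    dp = DataPlane()
    cp = ControlPlane(dp)
    cp.declare("web-01", "web-02", "db-01")
    cp.reconcile()  # the control plane drives the data plane toward the declared state
```

The same separation applies whether the "data plane" is a hypervisor, a switch ASIC or a storage array: the hardware-facing layer stays simple, and the decision-making lives in software that can be changed quickly.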

1a. Is there a clear spelling of this term?

  • Meh. Maybe, but it will have at least 3-5 variations in 2013. Just call it "SDDC" and save yourself a lot of auto-correct headaches.

2. Is there a clear, agreed upon definition (or standard) for Software-Defined Datacenter at this time?

  • Software-Defined Datacenter is not defined by an existing standards body (e.g., IETF, ITU, NIST); rather, it is a vision for how Data Center environments will become more flexible in responding to business demands. SDDC builds upon the abstraction that server virtualization has created and extends it to broader elements of the Data Center (e.g., network, storage), while also expanding the role that automation will play in the future.

3. How is "Software-Defined Datacenter" different than "Cloud"?

  • Cloud (or Cloud Computing) is fundamentally a new operational model for IT, where resources are delivered on-demand. While Cloud uses technologies such as virtualization or converged infrastructure, it's primarily about the shift in delivery and consumption of IT services. Software Defined Data Center is the next evolution of the underlying technology, where software delivers greater levels of intelligence and value, on top of standardized hardware.

4. Does Software-Defined Datacenter eliminate the need for traditional Data Center hardware?

  • No. There will still be a need for physical servers (CPU, memory), network devices to connect ports and deliver bandwidth, and devices that can store data on flash/disk/tape. But the industry trend is for these devices to become standardized around x86 chips, mass-produced memory/disks and mass-produced ASICs. This trend should allow faster, simpler "fabrics" (interconnecting servers, networks and storage) to be built, while the intelligence for policy, security and operations continues to move into software, which is faster to develop and adapt to changing business requirements. Leading companies have been shifting their product strategies to embrace this trend for the last few years. A simplified sketch of what "policy in software" can look like follows below.
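
As a rough, hypothetical sketch of what "policy intelligence in software" can look like, the snippet below expresses a tier-level security policy as plain data and compiles it into per-host allow rules that standardized devices could enforce. The policy format, host names and rule strings are all made up for illustration; no specific product works exactly this way.

```python
# Hypothetical sketch -- not any vendor's product. The point: policy lives in
# software as data and is compiled/pushed to standardized hardware, rather than
# being hand-configured box by box.
from typing import Dict, List, Tuple

# High-level intent: which application tiers may talk to which, and on what port.
POLICY: List[Tuple[str, str, int]] = [
    ("web", "app", 8080),
    ("app", "db", 5432),
]

# Inventory of standardized hosts, grouped by tier (names are made up).
HOSTS: Dict[str, List[str]] = {
    "web": ["web-01", "web-02"],
    "app": ["app-01"],
    "db": ["db-01"],
}


def compile_rules(policy: List[Tuple[str, str, int]],
                  hosts: Dict[str, List[str]]) -> Dict[str, List[str]]:
    """Expand tier-level policy into per-host allow rules a device could enforce."""
    rules: Dict[str, List[str]] = {h: [] for tier in hosts.values() for h in tier}
    for src_tier, dst_tier, port in policy:
        for src in hosts[src_tier]:
            for dst in hosts[dst_tier]:
                rules[dst].append(f"allow tcp from {src} to {dst} port {port}")
    return rules


if __name__ == "__main__":
    for host, host_rules in compile_rules(POLICY, HOSTS).items():
        print(host, host_rules)
```

Changing the business requirement (say, adding a new tier) means editing the policy data and recompiling, not reconfiguring every device by hand.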

5. Which market segments and use cases does Software-Defined Datacenter target?

  • Software-Defined Datacenter technologies are applicable to markets of all sizes (Enterprise, Mid-Market, Service Provider), but the initial adopters have been large Service Providers attempting to solve challenges in large-scale Data Centers. As competition for Public and Hybrid Cloud services increases (Amazon, Google, Rackspace, Microsoft, Cloud Service Providers), the need for greater operational efficiency, lower costs and faster time-to-market is pushing them to solve problems in new software-centric ways.
    • As more Enterprise and Mid-Market customers adopt Private Cloud and deliver IT-as-a-Service, I also expect SDDC technologies to evolve to solve challenges at different scales, as well as user-centric challenges such as BYOD.

6. How will Software-Defined Datacenter impact IT organizations?

  • More than ever, the current era of IT is defined by rapid change, whether in new devices (smartphones, tablets), new application consumption models (PaaS, SaaS), or converging technology silos (virtualization, converged infrastructure). Software-Defined Datacenter is the next step in converging functional areas, while attempting to give IT the ability to respond to business challenges faster.

7. Is Software-Defined Datacenter a competitive threat to traditional hardware companies?

  • As mentioned above, Software-Defined Datacenter does not eliminate the need for physical hardware within the Data Center. Rather it is a vision to enable customers to better take advantage of the trend towards delivering software intelligence on standardized hardware. As with many technology transitions, there are opportunities to evolve technology portfolios, evolve business models and unlock new partnership opportunities.

8. Is Software-Defined Datacenter explicitly linked with open-source technologies such as OpenStack, OpenFlow or Open vSwitch?

  • While there are open-source projects today that will influence Software-Defined Datacenters, open source is by no means the only way for customers to obtain the technology needed for this evolution of IT. A few examples:
    • VMware's acquisition of Nicira - while Nicira was a major contributor to the OpenStack Quantum project (network virtualization) and the Open vSwitch project, which are both open-source, their core NVP product was a commercial offering.
    • OpenFlow is a standards-based protocol for network virtualization that can be implemented by any vendor, in either open-source or commercial products (a toy illustration of the flow-table idea behind it appears after this list).
    • "Project Razor" is an open-source project that was jointly created by EMC and Puppet Labs to deliver advanced server and application automation for Data Center and Cloud environments. The software can be used with either commercial products (e.g., VMware vSphere, Cisco UCS) or open-source projects (OpenStack, KVM, CloudFoundry).

More Stories By Brian Gracely

A 20-year technology veteran, Brian Gracely is VP of product management at Virtustream. He holds CCIE #3077 and an MBA from Wake Forest University.

Throughout his career Brian has led Cisco, NetApp, EMC and Virtustream into emerging markets and through technology transitions. An active participant in the virtualization and cloud computing communities, his industry viewpoints and writing can also be found on Twitter @bgracely, on his blog Clouds of Change and his podcast The Cloudcast (.net). He is a VMware vExpert and was named a "Top 100" Cloud Computing blogger by Cloud Computing Journal.
