
Types of Network Automation

In networking, workflows are awfully complicated

In networking, workflows are awfully complicated.  There are many workflows, and the exact nature of each depends on a number of variables.  What task comes next often depends on the outcome of the previous task, and sometimes there is a large amount of data to navigate in order to complete a workflow.  Nevertheless, there is plenty of opportunity to identify and automate common tasks and segments of workflows.  Once we’ve identified these, we need to ask ourselves: how exactly should we automate them?

“Encapsulation” means a vendor (possibly a third-party vendor) has written software that accomplishes the same thing the workflow does, but usually not in the same linear way a customer would do it.  Sub-components within an encapsulation have well-designed interfaces built for the purpose of accomplishing the goal.  The encapsulation would likely be written in Java or C.  In networking, encapsulated workflows are usually specific to a vendor’s product and often lack flexibility and features.  Encapsulated workflows manifest as products or product features.

Consider the following workflow:

[Figure 1: Simplified packet walkthrough for a device]

Figure 1 shows a simplified packet walkthrough for a device.  Here, in the course of evaluating what is happening to a packet passing through this device, we have discovered a filter policy applied to the ingress interface.  This policy has two terms, and each of these terms references an access-list.  A network engineer would need to evaluate this filter policy to determine whether it is doing something to the packets of interest.  The thing is, policy languages are highly expressive, with their own grammars.  They are also proprietary.  After the filter policy is evaluated, this workflow follows the forwarding pipeline to the egress interface.  If you are an experienced network engineer, you will know that there are other elements in the pipeline that should be checked for any given network device.  However, there is enormous variation in the structure of the pipeline from one platform to the next.  Therefore, this is a great candidate for discrete encapsulation.* There are more effective ways of achieving the goal of a packet walkthrough than the way a network engineer must do it now (particularly for SDN products), and vendors know their platforms and policy idioms best.

*Discrete means it’s a workflow with a beginning and an end.  It can be manually invoked by a user and runs for a finite amount of time, reaching some conclusion.
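To make the idea of a discrete encapsulation concrete, here is a minimal Python sketch of the shape such a packet walkthrough might take.  Everything in it is hypothetical and drastically simplified: the FilterTerm and FilterPolicy classes, the matching logic, and the pipeline stages do not correspond to any real vendor API, and a real encapsulation would understand the platform’s actual policy grammar rather than a toy prefix match.

```python
# Hypothetical sketch of a discrete packet-walkthrough encapsulation.
# None of these classes correspond to a real vendor API; they only
# illustrate the shape of the workflow described above.
from dataclasses import dataclass, field
from ipaddress import ip_address, ip_network


@dataclass
class FilterTerm:
    """One term of an ingress filter policy, referencing an access-list."""
    name: str
    access_list: list[str]          # prefixes the term matches on
    action: str = "accept"          # "accept" or "discard"

    def matches(self, src_ip: str) -> bool:
        return any(ip_address(src_ip) in ip_network(p) for p in self.access_list)


@dataclass
class FilterPolicy:
    name: str
    terms: list[FilterTerm] = field(default_factory=list)

    def evaluate(self, src_ip: str) -> str:
        """Return the action the policy would take for a packet from src_ip."""
        for term in self.terms:
            if term.matches(src_ip):
                return f"{term.name}: {term.action}"
        return "implicit: discard"


def packet_walkthrough(policy: FilterPolicy, src_ip: str) -> list[str]:
    """Discrete workflow: run once, report each stage, reach a conclusion."""
    steps = [f"ingress filter -> {policy.evaluate(src_ip)}"]
    # A real encapsulation would continue through the forwarding pipeline
    # (route lookup, egress filters, queuing) in a platform-specific way.
    steps.append("forwarding pipeline -> (platform specific, omitted)")
    return steps


if __name__ == "__main__":
    policy = FilterPolicy("EDGE-IN", [
        FilterTerm("block-lab", ["10.20.0.0/16"], "discard"),
        FilterTerm("allow-corp", ["10.0.0.0/8"], "accept"),
    ])
    for line in packet_walkthrough(policy, "10.20.5.7"):
        print(line)
```

The point is not the toy matching logic; it is that the policy evaluation and pipeline traversal live behind one well-defined call, which is exactly what makes it an encapsulation rather than something the engineer reassembles by hand on each platform.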

A workflow automation, on the other hand, consists of sub-components that are “glued together.”  These components were not built especially for automation, and the interfaces between them were not designed for any particular workflow.  Automations can be developed by the customer, and discrete automations are very frequently employed by network engineers.  A great example here would be a script to configure the login banner on some number of devices.  These automations are written in “softer” languages like Python or Perl.
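A banner script of that kind might look like the sketch below.  It assumes the Netmiko library is installed and the devices are reachable over SSH; the addresses, credentials, and banner text are placeholders, not anything from a real environment.

```python
# Minimal sketch of a discrete workflow automation: push a login banner
# to a list of devices. Assumes Netmiko (pip install netmiko) and SSH
# reachability; the addresses and credentials below are placeholders.
from netmiko import ConnectHandler

DEVICES = ["192.0.2.11", "192.0.2.12"]          # example addresses
BANNER = "banner login ^Authorized access only.^"

for host in DEVICES:
    conn = ConnectHandler(
        device_type="cisco_ios",                # platform-specific driver
        host=host,
        username="netops",                      # placeholder credentials
        password="changeme",
    )
    conn.send_config_set([BANNER])              # glue: generic config push
    conn.save_config()
    conn.disconnect()
    print(f"{host}: banner updated")
```

Notice how the pieces being glued together (an SSH library, a device list, a config snippet) were never designed for this particular workflow; the script itself is the only thing that knows what the workflow is.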

There is a clear need for Continuous Automation in networking.  Plexxi’s own DSE, now an integral part of the OpenStack Congress project, attempts to address this need.  As the name implies, Continuous means it’s an on-going process.  In the case of Congress, it is a modular, event/data-driven system.  In an environment where there are a plethora of protocols and APIs, each with its own idiosyncrasies, this kind of automation makes sense, particularly in the context of an open-source community.
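The sketch below is not the DSE or Congress; it is only meant to show the general shape of a continuous, event/data-driven automation: ingest data on an on-going basis, evaluate a policy against it, act on violations, and never reach a natural end.  The poll_state, violations, and remediate functions are hypothetical placeholders.

```python
# Illustrative sketch of a continuous, event/data-driven automation loop.
# This is NOT the Plexxi DSE or OpenStack Congress; it only shows the
# general shape: ingest data continuously, evaluate policy, act, repeat.
import time


def poll_state() -> dict:
    """Placeholder: harvest state from protocols/APIs (LLDP, BGP, REST...)."""
    return {"leaf1": {"uplinks_up": 2}, "leaf2": {"uplinks_up": 1}}


def violations(state: dict, min_uplinks: int = 2) -> list[str]:
    """Simple declarative-style policy: every leaf needs min_uplinks up."""
    return [dev for dev, facts in state.items()
            if facts["uplinks_up"] < min_uplinks]


def remediate(device: str) -> None:
    """Placeholder: open a ticket, shift traffic, or reconfigure the device."""
    print(f"policy violation on {device}: fewer uplinks than required")


def run_forever(interval: float = 30.0) -> None:
    while True:                      # continuous: no natural end state
        for device in violations(poll_state()):
            remediate(device)
        time.sleep(interval)


if __name__ == "__main__":
    run_forever(interval=5.0)
```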

Curiously, some workflows may best be addressed by a combination of automation types.  For instance, if a customer wanted to know what was going on in the network relative to a particular application, that workflow automation could use a vendor’s packet-walkthrough encapsulation combined with an automation tool like the DSE to harvest network meta-data about application endpoints from external systems.  This could yield a network map of the application’s endpoints along with visual indicators of issues in the network that could be impacting the application.  In this way, the network engineer could quickly and accurately gauge the health of the network in the context of an application, versus engaging in a tedious and error-prone search “by hand.”
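As a rough sketch of that combination, the snippet below gathers an application’s endpoint pairs from external systems (the automation side) and asks a vendor packet-walkthrough API about each path (the encapsulation side).  Every function, address, and return value here is a hypothetical stand-in.

```python
# Hypothetical sketch of combining automation types: an external metadata
# harvest (automation) feeding a vendor packet-walkthrough API
# (encapsulation). Every function and endpoint here is a placeholder.

def harvest_endpoints(app_name: str) -> list[tuple[str, str]]:
    """Automation side: pull the app's talker pairs from CMDB/IPAM/flow data."""
    return [("10.1.1.10", "10.2.2.20"), ("10.1.1.11", "10.2.2.20")]


def vendor_packet_walkthrough(src: str, dst: str) -> dict:
    """Encapsulation side: stand-in for a vendor API that walks the path."""
    issues = [] if src.endswith("10") else ["drops on ingress filter"]
    return {"src": src, "dst": dst, "issues": issues}


def application_health(app_name: str) -> None:
    """Combine both: one application-centric view instead of a manual hunt."""
    for src, dst in harvest_endpoints(app_name):
        result = vendor_packet_walkthrough(src, dst)
        status = "OK" if not result["issues"] else "; ".join(result["issues"])
        print(f"{app_name}: {src} -> {dst}: {status}")


if __name__ == "__main__":
    application_health("billing")
```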

Customers, vendors, and open source communities should work together to make networking better.  Identifying common workflows and determining the best way to automate them is a good first step.  This will require vendors to think differently about how they develop their products, with their users’ needs in mind first.  Traditionally, just getting a network feature to work and interoperate was the goal, but now we must consider how each feature fits into the common workflows performed by network engineers.

[Fun fact:  Broccoli is a member of the cabbage family.  In spite of this, Broccoli tastes good.  When someone offers you cabbage, they are insulting you.]


More Stories By Derick Winkworth

Derick Winkworth has been a developer, network engineer, and IT architect in various verticals throughout his career. He is currently a Product Manager at Plexxi, Inc., where he focuses on workflow automation and product UX.
