|By Michael Bushong||
|July 21, 2014 06:00 AM EDT||
Whenever a new networking platform is evaluated, one of the early sales calls includes a packet walkthrough. In excruciating detail, someone walks the customer through the path a packet takes from ingress port, through the device, across the switching or routing ASIC, and back down to the egress port. The technical deep dive frequently includes internals that even the vendor engineers are not all familiar with.
Some people will justify the depth by talking about troubleshooting complex systems. Others will seize on random technical details that suggest one platform is better than another in some regard or under some set of circumstances. Still others will parrot some of the vendor’s marketing efforts with claims of flexibility, scalability, or extensibility.
While all of these are absolutely valid, they actually miss the biggest reason the packet walkthrough is a ubiquitous part of every selling motion.
We networking gearheads are a skeptical lot. We learned long ago that taking someone at their word was a short path to operational hell. Their words might have sounded true, but their promises rang hollow. The platform, or even the architecture, did not perform as advertised. And because the result of a network failure is catastrophically worse than most other infrastructure failures, we have collectively vowed to look at every opportunity with a sideways glance from a somewhat disbelieving perspective.
Trust but verify
The real reason that we evaluate new platforms and solutions in such detail is not because of the inherent troubleshooting value of examining the architecture. Nor is it because we can determine with any certainty what the scaling limits are based on a cursory glance at the internals of a system. We examine architectures in detail because it allows us to put the vendor under a bit of scrutiny. If they stand up to a few somewhat randomly placed questions (less random if you have had particularly painful issues in the past), then we lend a bit more credence to the other claims that are made.
I don’t mention this because I think this is a bad way to do things, mind you. Rather, I bring this up because the collective psyche of the networking buyer needs to be understood if architectural advances like SDN and abstractions are to bring any real value.
Control freaks and abstraction
Networking generally has operated through meticulous control for decades. Network management via configuration knob puts a ton of power in the hands of the network architect. Behavior can be precisely specified. And when something goes wrong, the device can be queried to surmise the cause.
A shift to abstractions might make things easier in terms of actual physical workload (how much typing there is), but it comes with a gigantic leap of faith. Control freaks might complain about how much effort things take, but they absolutely cringe at the thought of giving any of that work up lest something go wrong.
When behavior is specified by an abstraction (as with an edge policy abstraction), not only must the syntax be correct but also the translation of that abstraction into underlying behavior. The former is easy to verify, but the latter requires a bit of faith on behalf of the user that the vendor has done the right thing under the hood.
A peek under the hood
There are already a bunch of industry efforts around SDN and abstractions. Whether it’s vendor-specific (as with Cisco’s ACI) or a part of open source (OpenDaylight, for example), there are a number of movements that either focus on or include some abstraction as part of the solution. But if our past teaches us anything, it is that network architects are not happy with a basic understanding of what the abstractions do. They require additional information so they have at least some concept of how they do it.
It would seem that people peddling abstractions will ultimately need to provide the equivalent of a packet walkthrough. With platforms, this is easy. Where does the packet physically enter the device, and where does it leave? But with abstractions, the equivalent is a bit harder.
Initially, this dynamic favors abstractions that merely replace well-understood configuration with something less. The abstraction walkthrough for a replacement is essentially an expansion of the abstraction into the underlying configuration knobs. Think of this as more indirection than abstraction, more similar to header files than anything else.
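The indirection-style "abstraction walkthrough" described above can be sketched in a few lines. This is a minimal, hypothetical example (the policy name, template lines, and port identifier are all invented for illustration): an edge-policy name expands mechanically into the underlying configuration knobs, much like a header file or macro expansion, which makes the walkthrough trivial to produce.

```python
# A sketch of "indirection, not abstraction": a hypothetical edge-policy
# name expands mechanically into the underlying config knobs it stands for.
# All policy names and config lines here are invented for illustration.

POLICY_TEMPLATES = {
    "web-tier": [
        "interface {port} description web-tier",
        "interface {port} switchport access vlan 10",
        "interface {port} spanning-tree portfast",
    ],
}

def expand(policy: str, port: str) -> list:
    """Expand an abstraction into the configuration lines it replaces."""
    return [line.format(port=port) for line in POLICY_TEMPLATES[policy]]

# The "walkthrough" for this kind of abstraction is just the expansion itself.
lines = expand("web-tier", "Ethernet1/1")
for line in lines:
    print(line)
```

Because the mapping is one-to-one and visible, a skeptical architect can verify it the same way they would verify hand-typed configuration.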
But if abstractions are about more than saving keystrokes, this type of walkthrough will not suffice for even slightly more complex scenarios. This leaves the abstraction salesperson in a tough spot: how do you demonstrate that something works if you cannot provide a meaningful look at the internals?
Behavior determines success
The long-term answer here is going to necessarily fall to actual behavior. The creators of abstractions will need to show in the affirmative that the network (or the applications) behave appropriately when an abstraction is used. This might seem obvious, but the implications are actually quite profound.
For networks today, there are lots of ways to verify specific state in the network (BGP neighbors, interface stats, and so on). And when there is no network state, the configuration itself serves as the check. But what if that configuration is not there?
In the long term, the infrastructure broadly (including but not limited to the network) will need to be instrumented with meaningful abstractions in mind. If abstractions become common around managing edge policy, there will need to be additional ways to instrument specific applications, tenants, and flows. For example, if abstractions allow network engineers to specify a particular application as PCI compliant, then there might need to be ways to verify PCI compliance via a command.
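The behavior-based verification suggested here might look something like the following sketch. It is hypothetical: the flow records, the "pci" application tag, and the rule (cardholder traffic must only traverse encrypted links) are all invented for illustration. The point is that the check inspects observed behavior rather than looking for a configuration stanza.

```python
# A hedged sketch of behavior-based verification: instead of checking that a
# config stanza exists, check that observed flows tagged for an application
# actually satisfy the policy. The rule here (traffic tagged "pci" may only
# cross encrypted links) and all data are illustrative assumptions.

def verify_pci(flows, encrypted_links):
    """Return flows that violate the policy; an empty list means compliant."""
    violations = []
    for flow in flows:
        if flow["app_tag"] == "pci" and flow["link"] not in encrypted_links:
            violations.append(flow)
    return violations

observed = [
    {"app_tag": "pci", "link": "leaf1-spine1"},
    {"app_tag": "web", "link": "leaf2-spine1"},
]
result = verify_pci(observed, encrypted_links={"leaf1-spine1"})
print(result)  # an empty list: no PCI-tagged flow crossed an unencrypted link
```

A command-line front end to a check like this is what would let an operator answer "is this application still compliant?" without reading the expanded configuration at all.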
The bottom line
The abstraction market initially will be focused on keyboard time reduction. That is a fine place to start, and it is easy to verify. But if the real value of abstractions is in the removal of complexity (not just masking it) and the increased collaboration of infrastructure, then abstraction salespeople are going to need to think through the post-sales elements of their products. Those that do this early will certainly find that having an abstraction walkthrough shortens the evaluation time for new solutions. And if no one else has done this, the existence of such a walkthrough could prove a killer element of the product sales cycle.
[Today’s fun fact: Right-handed people tend to chew food on the right side of their mouths, and lefties on the left side.]
The post Network abstractions need equivalent of packet walkthrough appeared first on Plexxi.