Wednesday, October 14, 2015

The 6 laws (for reference)

I'm posting this with references mostly for my own archiving purposes.

http://www.techrepublic.com/article/the-6-laws-every-cloud-architect-should-know-according-to-werner-vogels/?utm_campaign=buffer&utm_content=buffer5c6e4&utm_medium=social&utm_source=twitter.com

This post is excerpted from the TechRepublic article above.  Appreciation to Werner Vogels for the discussion and to Conner Forrest for the article.

Lucas Critique

"It is naive to try to predict the effects of a change entirely on the basis of relationships observed in historical data."

Gall's Law

"A complex system that works is invariably found to have evolved from a simple system that worked. A complex system designed from scratch never works and cannot be patched up to make it work. You have to start over with a working simple system."

Law of Demeter

"Each unit should have only limited knowledge about other units—only units 'closely' related to the current unit. Each unit should only talk to its friends; don't talk to strangers."

Occam's Razor

"The one with the fewest assumptions should be selected."

Reed's Law

"The utility of large networks, particularly social networks, can scale exponentially with the size of the network."

The Gestalt Principle

"The whole is greater than the sum of its parts."

....

The only thing I'd add to this is "don't get inexorably tied to a single mode," as doing so almost guarantees future failure.  There are exceptions, so it's likely not a law, but there's a lot of truth to it.

....

"THE LACK OF HISTORIC KNOWLEDGE IS SO FRUSTRATING" -- Ivan Pepelnjak.  I wish this were a law.  Those who fail to learn from history....

....

RFC 1925

Friday, October 9, 2015

Modern Network Engineering

I don't know how to stress this enough, but Network Engineering will NOT go away with the advent of Software Defined Networking.

Here's why: Software Defined Networking abstracts the control plane from the data plane. Both the control and data mechanisms continue to exist. What does change is the means by which we interact with them.

So, as Software Defined Networking and Network Function Virtualization (and SD-WAN, etc., etc.) continue to deepen the abstraction, fundamental knowledge of how the logic of these systems functions is still required.

I can't say for sure that knowing a particular command line interface is going to remain valuable, though. Look at what happened with the controller (control plane) developments in wireless networking. The value of the CLI diminished drastically with the advent of the controller for enterprise wireless networking, while the value of understanding how the wireless system actually operates increased, and so did the value of the people who could navigate the change. The same thing will happen with SDN technology.

REST API and/or eAPI calls are going to replace current system management methods. Any number of programming languages are going to provide the basis for automation of the service. Consider the Python-scripted interaction with the control plane already being done with OpenStack and you'll have a glimpse of that future.
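
Here's a minimal sketch of what that REST-style interaction can look like in Python. The controller URL, endpoint path, token, and response fields are hypothetical placeholders, not any particular vendor's API:

import requests

CONTROLLER = "https://controller.example.com:8443"
HEADERS = {
    "Authorization": "Bearer <api-token>",
    "Content-Type": "application/json",
}

def get_interfaces(device_id):
    """Fetch interface state for one device from the controller."""
    resp = requests.get(
        f"{CONTROLLER}/api/v1/devices/{device_id}/interfaces",
        headers=HEADERS,
        verify=False,  # lab-only: skip TLS verification for a self-signed cert
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    # Print operational status for a hypothetical leaf switch.
    for intf in get_interfaces("leaf-01"):
        print(intf["name"], intf["operStatus"])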

You also shouldn't forget that this isn't the first time software definition has been applied to networking. Some of you may remember Tcl. Just saying: not the first time.

The other element driving these advancements in networking is the structure of the interface itself. JSON or a similar structure is surely going to play a role in managing software-defined elements.
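
As a hedged illustration, the kind of JSON-structured payload a software-defined element might accept looks something like this; the field names are illustrative, not a specific vendor's schema:

import json

vlan_config = {
    "vlan": {
        "id": 110,
        "name": "web-tier",
        "trunkPorts": ["Ethernet1", "Ethernet2"],
    }
}

# Serialize to the JSON body that would ride inside a REST or eAPI call.
print(json.dumps(vlan_config, indent=2))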

I urge networkers who are concerned about what looks like an uncertain future in networking to explore the pieces that make up the software-defined element. That is the path to individual value with Software Defined Networking.

Take a course, read a book, grab stuff off the web like this, and figure out how it's done. You won't regret it.


Tuesday, October 6, 2015

Macrosegmentation is now in the Networking Lexicon (soon we'll abbreviate it)

Macro-segmentation at the networking level just got a definition today; we're sure to remove the hyphen (macrosegmentation) and abbreviate it before long.

A warm-up on the details here:

http://www.arista.com/blogs/?p=1245

At its most basic level, macrosegmentation allows traffic between specific points in the network to be redirected, enabling logical topologies that insert firewalls and load balancers.

Very nifty trick considering:

     Placement of the equipment is location-independent.

     Cybersecurity can continue to manage FW rules without blended support requirements.

     It should* work with multi-vendor equipment.
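
To make that concrete, here's a conceptual sketch only: a macrosegmentation-style policy expressed as data, i.e. "steer traffic between these two network points through a firewall and a load balancer." The structure and names are my own illustration, not Arista's API:

segmentation_policy = {
    "name": "web-to-db",
    "source": {"segment": "web-tier", "vlan": 110},
    "destination": {"segment": "db-tier", "vlan": 120},
    "serviceChain": [
        {"type": "firewall", "device": "fw-01"},       # cybersecurity keeps owning the rules
        {"type": "load-balancer", "device": "lb-01"},  # placement is location-independent
    ],
    "action": "redirect",
}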

It also brings back fond memories of an Arista whiteboard barstool at VMworld.  Wish I'd taken a picture of it.





Monday, October 5, 2015

VP of Electricity

Randy Rayess on TechCrunch proposes that the CIO is the next VP of Electricity.

Imagine the turn of the 20th century, when electricity delivery for a company had to be managed in-house.  Companies would often stand up their own infrastructure to deliver electricity within a building or factory.  To support this infrastructure, there would be a team of electrical specialists, electricians, who would maintain the equipment and support the infrastructure.  Once electricity became a utility, much of this was replaced by the vendor and sold as-a-Service to the customer.

Comparing that to the CIO is interesting because there are some parallels, but they aren't complete, and they break down on the single-component delivery requirement that sat under the VP of Electricity.

Consider, for instance, that the CIO's primary goal is to keep the IT infrastructure delivering the application.  Actually, applications.  <--  The plural is incredibly important here. It's not a single service; it's tens, hundreds, sometimes even thousands of applications.

These applications need to inter-operate enough to use common foundational services like networking and data center, platform systems and virtualization, and the growing analytics necessary to make increasingly critical business decisions.  We may think about them from a consumption perspective, which reduces many of the applications to in-business-quarter costs, and that's great for a business controlling its financial run rate, but it's only part of the picture.

Inter-operation doesn't happen by magic, and someone needs to be in a position to manage these as-a-Service applications before they sprawl into a buffet-style line of out-of-control applications that not only fail to support the business objectives but also fail to deliver the critical value the business requires.  Not to mention the potential risks of data breach and loss that come with applications deployed without planning.

Then consider that the cost of electricity isn't going down.  At best it's flat over long periods at commercial rates; more often it's rising, and it's all but guaranteed to keep rising.  The use of electricity is also increasing as we put in more general-purpose hardware to support more applications on ever more virtualized platforms.

My contention with the article is that while we don't need the turn-of-the-20th-century VP of Electricity, we do need to keep thinking about the sunk cost in the delivery of applications.  We need someone thinking about the plethora of applications each industry needs to operate, as well as the infrastructure and critical access to both private and public services.

Who better than the person who understands the business demand?