Friday, July 6, 2018

Evolving a Business Process with a Wardley Map, Part 2

Following on from Evolving a Business Process with a Wardley Map, we pick up the discussion with the Cost mechanism.

The question was, if we understand the BOM, "Can we get to ROM Cost quicker?"

5. Cost Calculator
This part of the evolution of the business process is actually more difficult.  It involves having ALL of the partners/vendors willing and able to work with you to create a Cost Calculator.

The good thing: if it is possible to create a BOM Calculator, most of the business logic is understood well enough to introduce the concept of Cost with the same (or similar) business logic.  The reverse is also true; we just needed a starting point.

In Figure 5, we're evolving a Cost Calculator as a product of the BOM Calculator output.  It is most assuredly in the Product category because it requires product-like updates at a regular interval with the partners/vendors to maintain currency with their cost changes.
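To make that concrete, here's a minimal sketch of the idea in Python. It assumes a partner/vendor unit-cost table that gets refreshed at that regular interval; the item names and prices are made up for illustration, not taken from any real tooling.

```python
# Minimal sketch of a Cost Calculator driven by BOM Calculator output.
# The item names and vendor costs are illustrative assumptions.

# Partner/vendor unit costs, refreshed on a regular interval (the
# product-like update that keeps the calculator current).
VENDOR_UNIT_COSTS = {
    "small_widget": 100.0,
    "medium_widget": 180.0,
    "large_widget": 320.0,
}

def rom_cost(bom_lines: dict[str, int]) -> float:
    """Roll a BOM (item -> quantity) up into a ROM cost."""
    return sum(VENDOR_UNIT_COSTS[item] * qty for item, qty in bom_lines.items())

# The BOM Calculator emits quantities; the Cost Calculator prices them.
print(rom_cost({"large_widget": 2, "small_widget": 4}))  # 1040.0
```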

In Figure 6, we've created the new Cost method.

6. New Cost Method
The benefit of this change in process: we've shaved an additional 7-10 days off the process.  We're also able to discuss pricing with a real BOM and a tolerance-based Cost.

The problem: it is unlikely to be as exact as the Quotation step, but it is fine for ROM review.  Provided we can live with small fluctuations between the Cost Calculator and the Quotation, everything should be fine.  Just make absolutely sure you completely understand the cost tolerance requirements.
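As a hedged sketch, the tolerance check might look something like the snippet below; the 10% default is purely an assumed placeholder for whatever your cost tolerance requirements actually dictate.

```python
# Hypothetical tolerance check between the Cost Calculator's ROM cost
# and the eventual Quotation. The 10% default is an assumption.

def within_tolerance(rom: float, quotation: float, tolerance: float = 0.10) -> bool:
    """True when the ROM cost is within +/- tolerance of the quotation."""
    return abs(rom - quotation) <= tolerance * quotation

print(within_tolerance(rom=1040.0, quotation=980.0))  # True (about 6% variance)
```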

7. Optimization Method
There's quite a good chance, especially with a mixed product-service opportunity, that the variation exceeds the required tolerance unless we look at Optimization of the BOM to ROM, Figure 7.

The Question for the Cost Calculator, "Is 1 large widget more cost-effective than 2 medium widgets or even 4 small widgets?"

Consider the scenario where the product is a combination of labor, hardware and software licensing.  Optimization of the Calculators (a Calculator of Calculators, if you will) provides the means to create a deeper awareness of the product as well as a mechanism for Sizing appropriately for the need with cost in mind.
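Here's a small brute-force sketch of that Calculator of Calculators, answering the widget question above; the capacities, costs, and search bound are assumptions for the sake of the example.

```python
# Illustrative optimization: pick the cheapest widget mix that meets a
# required capacity. All sizes, capacities and costs are assumptions.

from itertools import product

WIDGETS = {  # name: (capacity_units, unit_cost)
    "small": (1, 100.0),
    "medium": (2, 180.0),
    "large": (4, 320.0),
}

def cheapest_mix(required_capacity: int, max_qty: int = 8):
    """Brute-force the lowest-cost mix meeting the required capacity."""
    best = None
    for counts in product(range(max_qty + 1), repeat=len(WIDGETS)):
        capacity = sum(c * WIDGETS[n][0] for c, n in zip(counts, WIDGETS))
        cost = sum(c * WIDGETS[n][1] for c, n in zip(counts, WIDGETS))
        if capacity >= required_capacity and (best is None or cost < best[0]):
            best = (cost, dict(zip(WIDGETS, counts)))
    return best

# "Is 1 large widget more cost-effective than 2 medium or 4 small?"
print(cheapest_mix(4))  # (320.0, {'small': 0, 'medium': 0, 'large': 1})
```

In this made-up pricing, one large widget wins; change the cost table and the answer changes with it, which is exactly the awareness the Optimization step is after.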

8. New ROM and BOM tool
We then shift our attention to Customer Pricing.

The Question, "Can we get to Customer Price quicker with better awareness?"

In the mapping exercise, we could very well have started with this question, as it would be the most customer-centric pursuit.  I'm not entirely sure it would have changed the path we followed, but I'm relatively certain it would have been a more daunting starting point.

9. New Costing and Pricing Method

It's obvious that a Calculator for pricing solves this problem most expediently.

At this point, we've achieved the knowledge necessary to create calculators for the change in process steps.  We can also use those calculators to feed the Quotation requirements, potentially shortening that process step in Final Pricing.

The big win: we've moved the Sales Agent's view of the process from Custom Build to Product and improved their time to respond to the customer.

The overall process, from start to finish goes like this:

Customer Input -> ROM and BOM Calculator (as many times as Customer Input changes) -> Feed the Sizing from the ROM and BOM Calculator to Quotation -> Run the Quotation(s) through the Pricing Tool -> Sales Agent happiness
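In code-sketch form, the flow chains the calculators together. Every function body below is a placeholder assumption standing in for a tool described above, not the real thing.

```python
# Hedged sketch of the end-to-end flow; the bodies are placeholders.

def bom_calculator(customer_input: dict) -> dict:
    # Assumed sizing rule: one large widget per 4 units of capacity.
    return {"large_widget": customer_input.get("capacity", 4) // 4}

def cost_calculator(bom: dict) -> float:
    # Assumed flat unit cost; see the Cost Calculator sketch above.
    return sum(qty * 320.0 for qty in bom.values())

def pricing_tool(cost: float, margin: float = 0.25) -> float:
    # Assumed margin; Final Pricing would also fold in the Quotation.
    return cost * (1 + margin)

def customer_price(customer_input: dict) -> float:
    bom = bom_calculator(customer_input)  # rerun as often as input changes
    cost = cost_calculator(bom)           # tolerance-based ROM cost
    return pricing_tool(cost)             # Quotation waits for Final Pricing

print(customer_price({"capacity": 8}))  # 800.0
```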

This Wardley Mapping exercise ended up being mostly about convenience and a bit about standardization.  How do you execute an On-Demand set of actions that would normally take between 22 and 44 days in the shortest time possible?

As always, I'm interested in your feedback.  Find me on Twitter @abusedbits

Friday, June 29, 2018

Evolving a Business Process with a Wardley Map

Wardley Mapping (Value Chain Mapping) can be a great asset in describing the evolution of a business process.  In this case, I'm going to illustrate the changes of the Costing and Pricing action to show how improvements may be identified once the current or base activity is mapped.

In this scenario, Costing and Pricing, a sales agent needs to arrive at a Price for the customer.

The process includes engagement of a high skill technical person to create a Bill of Materials (BOM) for the customer.

It requires a Quotation that includes all of the BOM items; this is then used to create a Cost.

The Costs are then all rolled up, and margin/cost-of-money and services/labor are included to create a Customer Price, in such a way that the Sales Agent may have a Rough Order of Magnitude (ROM) conversation with the Customer.
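A minimal sketch of that rollup, assuming illustrative margin and cost-of-money rates rather than anyone's actual figures:

```python
# Sketch of the Customer Price rollup. The 20% margin and 3% cost of
# money are illustrative assumptions.

def rollup_customer_price(item_costs: list[float], labor_cost: float,
                          margin: float = 0.20, cost_of_money: float = 0.03) -> float:
    """Roll BOM item costs and services/labor up into a ROM Customer Price."""
    cost = sum(item_costs) + labor_cost
    return cost * (1 + margin + cost_of_money)

print(rollup_customer_price([1040.0], labor_cost=400.0))  # ~1771.2
```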

Here's what that map looks like.

1.  Costing and Pricing Activity
Keeping in mind that Wardley Map placement is relative, the BOM activity is in the Genesis category primarily because "we know very little" about the customer's desire until a BOM is generated that can be discussed and modified to the customer's requirements.

Quotation is in Custom Build, because it is generated uniquely based on the BOM for the customer.  It is subject to change, also based on customer requirements.

The same can be said for Cost and Customer Price.  Without a customer qualified BOM, the entire process flow is subject to correction and reset until that action is completed.

The Question to ask, "Can we move this process to Product or Utility?"

2.  Optimize the BOM activity
Any one of the items in the process flow could be looked at for possible improvement.  In determining where to start, sometimes it is just a matter of picking a point, mapping out possible evolution steps, and seeing what happens to the map.

In this case, we're looking at the BOM, if for no other reason than it feeds the more visible aspects of the value chain.

The Question for BOM, "Can we improve the BOM by creating a calculator for the BOM?"

There are some implications to this question, the principal one being: can we create a standard method?  The standard method would be an idealization of the Bill of Materials such that changes to it become mathematical multiples of, or well understood relationships between, the elements.  This implies a standard and baseline awareness of a design or architecture that, at the very smallest level, consists of units that can be both measured and used as incremental building blocks.
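A tiny sketch of what such a standard method might look like, assuming a made-up base unit; the point is that a sizing change becomes a multiplication rather than a redesign.

```python
# Sketch of an idealized BOM built from a measured unit building block.
# The base unit's contents are assumptions for illustration.

BASE_UNIT = {"cpu_cores": 4, "ram_gb": 16, "storage_gb": 250}  # smallest block

def bom_from_units(units: int) -> dict:
    """Scale the idealized BOM as a mathematical multiple of the base unit."""
    return {part: qty * units for part, qty in BASE_UNIT.items()}

# Doubling the requirement is one multiplication, not a new design.
print(bom_from_units(2))  # {'cpu_cores': 8, 'ram_gb': 32, 'storage_gb': 500}
```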

3. New BOM Method
In the case where that is true, the Bill of Materials Calculator (Figure 3) can potentially replace the BOM action in Genesis.  Utilizing well understood units of measure, the calculator makes it possible to manipulate the BOM output toward the customer's desired goals/end-state much more rapidly.  The output of this tooling is arguably utility in nature, but because it continues to rely on manual understanding/input, we'll put it squarely on the product/utility line.

The benefit of this change in process: we've shaved 7-14 days off the process.

The Question for BOM Calculator, "How important is the Quotation?"

As it turns out, the Quotation with the source suppliers is only required when finalizing a deal with the customer.  So, as Figure 4 indicates, we're going to skip over Quotation and start to consider the Cost analysis.  Because supply chain/purchasing requires the Quotation, we cannot remove it from the process, but it doesn't have to be done until we're in a position to make a purchase.


4.  Quotation
With Quotation removed (for the moment), and a good awareness of the BOM sizing activity, we'll tie the BOM directly to the Costing activity.

The benefit of this change in process: we don't have to Quote each and every time we need to understand a Rough Order of Magnitude (ROM) cost.

The Question for the BOM Calculator mechanism then becomes, "Can we get to ROM cost quicker if we understand the BOM?"

In the next installment, I'll review how Costing evolves to improve the process.

Wednesday, April 25, 2018

Getting to the New Problems

At some point in the evolution of a product, a new product will come along and displace it.  Examples are quite prolific in IT, much like what happened with video rental, where the entire market was replaced.

In the Settlers and New Problem entry, I made what turned out to be an abortive attempt at rationalizing this within the context of the Value Chain.  It should have been in an Evolution Map, as in Descriptive Evolution Mapping.

There still exists the issue of identifying the new problems to be solved that create the displacement.  It can happen almost anywhere on the continuum of the product.  It's my feeling that it is most identifiable, and most "felt", when the product being displaced is at the assumed stage of "Solutions to Known Problems" or "Refinement of Solutions."  With that in mind, it can happen at any stage.

The following Wardley Map shows the associated change from the perspective of a Value Chain map.  It's also how, in "Next IT Shift, back to Distributed", the step function that could be used to show a possible future is realized.

The example, in "Next IT Shift, back to Distributed", implies that Cloud Computing is perpetually in a state of Refinement of Solution, but "Sensors, IoT and Machine Learning" cannot all be run from Cloud Computing.  This is the New Problem: how do you get data to where it needs to be acted on in an enormously distributed system?  Should we try to move data in this way?
Getting to the New Problems

Tuesday, April 17, 2018

Descriptive Evolution Mapping

Thanks to @swardley and the community for steering me to a better definition of the Wardley Maps I was attempting to produce.

Wardley Maps - Evolution Map

The graphic above introduces the product view of each phase of Design.

As a starting point, it is the output of the workload or application that has value.

The applications, and indeed the early computers, were Designed for Purpose.  They were also Custom built.  This is illustrated by the simple fact that repurposing an early computer could mean changing the equipment in some pretty fundamental ways to run a different program.

     Evolution of computing led to computers that were capable of multi-purpose computing.  The applications were no longer specifically tied to the hardware they operated on (I know there's an argument brewing here, but it's the generalization I'm illustrating).

This provided the means for individuals and companies to run a multitude of programs on computing equipment.  This was a generational improvement in application development and led to an entire industry of boxed software delivery.

     The boxed software products were developed within the framework of a particular operating system, with guidelines about what commands were processed and how.  This is a manufacturing guideline, and therefore we can identify the boxed software industry as Design for Manufacture; each "box" contained a Product on some type of media.

     Interestingly, the Product area is not necessarily the valuable output of the workload or application, but a means integrated to produce the valuable output.  This means that, while boxed software may very well be a Product, the overall application may still be Custom built, retroactively putting the entirety of said application back into Design for Purpose.

     Another interesting tidbit: the tight control over boxed software led to the creation of competitors and the OpenSource industry.  This happened because the design of the overall application mandated variation that the boxed software vendors were either unwilling or unable to easily produce, or that came at great licensing expense.

We reach another evolutionary state with what are arguably extremely easy-to-consume application outputs, built specifically with the consumer in mind and creating a service category (Utility Service) built upon commodity components.  These are more easily offered in a volume consumption model of cost per volume-time.

     Utility specifically targets ease of consumption, foremost the usage mechanisms or operation.  It therefore represents a Design for Operations method of delivery: constantly being updated, and firmly rooted in the idea that the easier it is to use, the more difficult it will be to replace.

     An interesting consequence of Design for Operations is that competitive pressures in Utility drive new business practices like Business Agility/Agile DevOps.

What's next, you might ask?  Google serverless, then google codeless.


Friday, April 6, 2018

Settlers and the New Problems

There are some fundamental truths that drive technology forward; not meaning to be entirely inclusive, but they certainly include:

     Technology evolves on top of other technology advancements
     Technology needs to be simplified or abstracted away from complexity to advance

This leads to interesting side cases of where the interdependence of technology must co-evolve with a relatively uncertain future state to continue moving forward.  Part of the solution is an evolution on top of the old technology and part of the solution is an abstraction to solve a complexity problem.

There are also characteristics of technology that meet as part of an industrialization of the technology.  Chris Swan (@cpswan) speaks very elegantly on this: the move to a technology that is designed for operations rather than designed for manufacture or designed for purpose.

Industrialization of this sort is a stepping stone, much like the technology changes that move the state of the art from Town Planners back to Genesis and Design for Purpose.

Settlers and the New Problem
Here's the diagram I'm using to try to illustrate this effect.  It is loosely based on the Wardley Map of Settlers and an evolution step function that shows the evolution toward the "New Problems."

Looking forward to your feedback!

Update Apr 9: 


Friday, December 15, 2017

Enterprise Virtualization vs Public Cloud 2018 (Mapping, Wardley)

Enterprise virtualization prediction for 2018, tl;dr: no drastic changes from 2017.

There are some interesting possibilities, though, and I've used Simon Wardley's mapping technique to diagram them.

Enterprise Virtualization 2018 prediction vs Public Cloud
As shown in the map on the left, we can now argue that the consumption of virtual machines has trended all the way into commodity consumption (with standardized tooling, automation and operating environment).  If it hasn't in your company, you may want to start asking why.

One of the more interesting possibilities for 2018, if the equipment vendors do this correctly, is composable infrastructure. This could completely displace traditional compute and push it into commodity consumption.  I'm going to leave it as a dotted line in the figure for now, as the business impact of technical accounting for corporations might make this a non-starter. That said, I have to imagine that a true utility in any premises would be good for the industry and the consumers.

In the public cloud map on the right, we may need to incorporate some changes based on enterprise use of public cloud to include the difference between “cloud native” capability vs enterprise hosting requirements.

Cloud native capability is the service consumption of the public cloud that relies only on the tools and capabilities built into the public cloud platform element.  Using it for new application development, including things like serverless application design, is growing as AWS and Azure partners and consumers learn to take advantage of those cloud native features.

     Cloud Native platforms are not particularly well placed for traditional enterprise workloads, though, which often require more specific (as well as traditional) care and feeding.  Furthermore, refactoring enterprise applications to take advantage of Cloud Native features may not be a worthwhile endeavor considering the cost to do transformation of applications.  The general thought is to give them a safe place to run until they are no longer needed. 

The enterprise hosting data center exodus from 2017 provides some of the highlights of why workloads will move out of the data center.  It may not be obvious, but the unifying element of both of the diagrams is how Hybrid Computing will be handled between enterprise virtualization and public cloud.  This integration still looks very much like early product (see diagram above).

One of the possible next steps is being taken by both Microsoft Azure and AWS / VMware, who have announced methods to move non-cloud-native workloads to their IaaS: Microsoft Azure Stack and VMware Cloud on AWS.  Over time, both of these services should drive workloads more than "smooth out the peaks".  This is a major movement from what I'd titled my prediction last year, and it's why I say "Public Cloud Continues to Win Workload Love."

If you've followed the mapping threads on my blog, here are the historical links on these predictions:


And if you want to know more about my Wardley mapping misadventures, follow this link.

updated 19 Dec 2017

Saturday, August 26, 2017

It's about the data

A friend of mine recently asked me what my thoughts were around the connected home/small business.  He's kindly agreed to my sharing the response.
...

In my humble opinion, it's all just verticals until something comes along to unify things and make life simpler.

It is about the Platform today.  In the future it will be about how we act on the data we have.  This will most likely shift to AI in some IoT fashion.

At the moment, I think we're approaching a Wardley WAR stage, with quite a lot of the capabilities moving from custom into product. http://blog.gardeviance.org/2015/02/on-evolution-disruption-and-pace-of.html

It's the pre-industrialization of capabilities that actually prove themselves useful, and that offset either a cost, a time pressure, or a labor effort, that typically seems to win.

It'll almost always be about situational awareness, though.  A person will eventually see the thing that stands to make a serious and significant change, one that takes hold of one of the ecosystems and gives it a big push.

That's almost always an abstraction of the capability.  I took a wild swing at one of the computing areas as an example: the abstraction of programming I called "codeless programming", which uses visual context rather than programming languages to build applications.  It's the next step past "serverless", where inputs are tied to outputs by people who aren't necessarily programming any of the intermediary application chains. http://www.abusedbits.com/2017/04/codeless-programming.html

Zuckerberg is running an experiment along these lines with his integration of capabilities using an AI.

https://www.youtube.com/watch?v=vvimBPJ3XGQ 

Thing is, getting everything to work together is ridiculously complex.  Either commonality in method for integration, or something that can manage the complexity, is required before this is anything other than a really expensive and time-consuming hobby.  And then there's the data capture....

My brother has a sensor network that he literally built into his house to record a variety of data. The first dimension value of that data was to have it and be able to view it. The second dimension value of that data is to macroscopically act on it, like integrated thermostat control based on conditions and presets. The third dimension value would be to have similar information from 1500 houses and use it to design houses better, as an example.

In each one of the industries you are thinking of, the third dimension value far outweighs the two below it, but getting the data AND being able to act on it is ... difficult.  The connected home products are about the only way to get at that data.