Tuesday, October 6, 2015

Telling a Good Story

I recently came across a fantastic success story in one of our process improvement efforts. Unfortunately, the effort and accomplishment were not fully appreciated -- due in part to the way the information was presented. Month after month, we presented the change as an unadorned bar chart. The bar chart showed clear improvements, but the sense of story was lacking.

After working with the team, we applied three recommendations to improve how the information was communicated.
1. We added control limits to the chart to convey a sense of context. How do we know when something changed? When the line breaks the control limits, something has changed.
2. We decided to show both the meter (the # of expired LIMDU PRDs) and the levers (the actions we had taken to influence change).
3. We made the math easy. Senior leadership no longer needs to calculate the size of the impact or guess; we spelled out the accomplishment explicitly.
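For readers curious about the mechanics behind recommendation 1, control limits for a monthly metric like ours can be sketched with an individuals (XmR) chart calculation. This is a generic SPC convention (the 2.66 factor converts the average moving range into three-sigma limits), not the specific tool we used, and the function names are illustrative:

```python
# Sketch: individuals (XmR) control limits for a series of monthly values.
def control_limits(values):
    center = sum(values) / len(values)
    # Moving range: absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 * average moving range approximates 3-sigma limits.
    ucl = center + 2.66 * avg_mr
    lcl = center - 2.66 * avg_mr
    return lcl, center, ucl

def signals(values):
    """Indices of points outside the control limits -- 'something changed'."""
    lcl, _, ucl = control_limits(values)
    return [i for i, v in enumerate(values) if v < lcl or v > ucl]
```

The moving range is used instead of the raw standard deviation so that a genuine process shift widens the limits less than it would inflate an overall standard deviation.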

Thursday, June 11, 2015

Measures of Performance: A Losing Record with a Winning Spirit

How do you measure performance in a recreational league?
Do you take into account bad umpires who don't seem to know the rules of the game?
Does your assessment take into account teams from competitive leagues who stay sharp in the off-season by playing in a recreational league?
Do the league-mandated priorities of fun first, learning second, and winning third play any role?

Although we have one game remaining, we will end this season with a losing record (currently, we are 1-7-1). However, we started the season with 5 of 14 players who had significant problems hitting a pitched ball. At our last game, every player on our roster got a hit. We started the season with 7 of 14 players who did not understand the basic flow of the game or how to get the other team out. At our last game, we held a very good team to two runs in the first inning (a significant accomplishment for our team).

Everyone plays both infield and outfield. Everyone has improved. Our defensive play as a team has improved by leaps and bounds. And we're having fun. Although it is always more fun to win, we are enjoying the game of baseball.

Unfortunately, the scoreboard has been stubbornly resistant to measuring our true performance.

Friday, May 22, 2015

Building Demand for Black Belts

Because our deployment strategy has focused on developing bottom-up support for Continuous Process Improvement, we have been very deliberate about making gradual training investments. Initially, we focused on yellow belt and champion training, with the "belt" role on improvement projects filled by CPI program resources. As our trained population of yellow belts and champions grew, we began training green belts and broadening the base of belt leadership for projects. After three years of building, we felt it was finally time to invest in black belt training to expand the leadership team for the CPI program. This photo represents the first group to complete the 160-hour black belt training curriculum using our own instructors.

Tuesday, March 3, 2015

The Concept of "Process Entitlement" Drives Record Performance

During Lean Six Sigma Yellow Belt training, we introduce the concept of "process entitlement" as a way to drive record performance. Using a simple process simulation with building blocks to make pyramids, we lead a class discussion on what level of performance would be possible if the process were perfect. Perfect is defined as no waste, no variation, no constraints, etc. Collectively, we arrive at a general consensus on the natural physical limits of the process -- the maximum number of pyramids that could be constructed in a 5-minute production period. Then we steer the discussion toward the question, "What would have to be different to produce at a rate equivalent to the natural physical limit?" At some point during this discussion, we challenge the students with concepts such as Takt time and process metrics. Then we reveal the current world record and set production goals somewhere between the current record and the maximum number possible. The record seems to inch higher with every new class and currently stands at 74 (with a competitor team in the same class producing 73).
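Takt time, one of the concepts we challenge students with, is simply available production time divided by demand. A minimal sketch using the numbers from this simulation -- a 300-second (5-minute) production period, with the current record of 74 pyramids standing in for demand:

```python
# Takt time = available production time / customer demand.
def takt_time(available_seconds, units_demanded):
    return available_seconds / units_demanded

# At the record pace of 74 pyramids in 300 seconds, each pyramid has
# roughly 4 seconds of available production time.
pace = takt_time(300, 74)
```

Comparing the class's actual cycle time against this Takt time makes it concrete how far the process still sits from its entitlement.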

Monday, December 8, 2014

Trained With An Expectation of Project Work

In November, we added 26 newly trained green belts to the program. All students entered the training with the expectation that they would follow through and complete two green belt projects to earn green belt certification. In theory, that should mean that 52 green belt projects will flow out of this one green belt class.

In reality, only about 40 percent of the students (10 of the 26, in this case) historically follow through to complete projects. Some of the reasons for the project deficit are (1) the rotational nature of our workforce (i.e., green belts move on to new assignments before completing projects), (2) our "bottom up" deployment strategy that relies on green belts to generate project areas, and (3) our strategy of conducting projects with collateral-duty green belts (i.e., project work is voluntary extra duty).
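The arithmetic behind the theoretical and realistic project counts is simple enough to write out directly; the function and parameter names here are illustrative, not part of any tracking system we use:

```python
# Expected project yield from one green belt class.
def project_yield(students, projects_per_belt=2, follow_through=1.0):
    completing = round(students * follow_through)
    return completing * projects_per_belt

theoretical = project_yield(26)                      # 26 belts x 2 projects = 52
realistic = project_yield(26, follow_through=0.40)   # ~10 belts x 2 projects = 20
```

Planning around the realistic figure rather than the theoretical one keeps the project pipeline honest.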

Monday, October 20, 2014

MAES Workshop at the 40th Annual Symposium

I recently presented an overview of Lean Six Sigma to a group of engineering professionals and students at the 40th Annual MAES Symposium in San Diego. Our Creative shop put together a nice poster board to display on the easel outside the event.
The idea was to promote attendance at the event by the symposium attendees.

Wednesday, August 20, 2014

When to Take Credit for Improvements?

From August of 2012 through August of 2014, we initiated 50 projects in our relatively modest Continuous Process Improvement (CPI) program. Some of those projects resulted in significant improvements and big financial savings. Some resulted in marginal improvements. And a few had no impact, were never completed, or were cancelled. However, every single one of the 50 projects had collateral benefits and training value to the organization.

By collateral benefits, I mean three types of improvements that are not typically attributed to a CPI program.
1. The first type of improvement is undocumented changes that result from focus on a problem. Whenever I initiate a project and start asking questions about data availability, the process changes for the better. It happens every time. Scrutiny of a process leads to undocumented process improvements.

2. The second type of improvement is documented changes that are not called CPI. These improvements would not have happened if the CPI program did not exist, but no one seems to acknowledge the connection. Changes in a focal process always generate collateral changes in related processes -- often without the need for a follow-on project to drive the change.

3. The third type of improvement is changes that are generated using CPI tools but not as part of a formal project. The covert CPI program, at least in my present circumstances, is often a more powerful tool than the overt program. As individuals internalize the principles of CPI, these tools become second nature, and documenting improvements as formal CPI projects becomes less likely.

As a CPI program manager, I could legitimately credit my program with all of the improvements. Such a stance might be considered an attempt to steal credit that rightfully belongs to other efforts. I could chase some of the collateral benefits and attempt to formally document them as CPI-driven improvements. Such an approach might be perceived as a desperate attempt to make my program seem relevant. Another approach would be to only take credit for improvements that are formally documented in CPI-related projects. Of course, that method would grossly underestimate the value of the CPI program to the organization.

When should a CPI program take credit for improvements? It is hard to offer a definitive answer.