Volume 49, April 2005: Use of Measurement

Keyzine: An E-zine for Leaders about the People Side of Business

This is a monthly electronic magazine for anyone who wants to be a better leader, coach, or facilitator, or simply to tune up their people skills. It is a complimentary publication, devoted to the next evolution of Quality Thinking.

Publisher: © Key Associates, LLC, 2005 ISSN # 1545-8873

“In God we trust. All others bring data.” — W. Edwards Deming, American Statistician

“All improvement will require change, but not all change will result in improvement.” — Gerald Langley et al. (1996)

“There has never been a measure that would survive the fear of those being measured.” — Don Berwick, Institute for Healthcare Improvement

IN THIS ISSUE:

  • What’s Hot in Leadership
  • Maintaining Yourself as a Leader
  • Frequently Asked Questions from Leaders
  • Educational Opportunities
  • Useful Websites & Newsletters
  • Articles/Publications

What’s Hot in Leadership

  • Using measurement to accelerate learning and improvement.
  • Seeking information not just data.
  • Not reacting to single data points in decision-making.
  • Developing adequate resources for data management.
  • Asking: What could we change? How could we predict more accurately?

Maintaining Yourself as a Leader

Fire-fighting and quickly fixing problems are, for some, the measure of leadership prowess. How quickly can you react (i.e., knee-jerk) to correct a situation? The pressure is there to Do-Do-Do-Do. Occasionally, you may have a chance to Plan a bit before you Do. But learning – for you and your organization – is not possible without feedback. Having a theory and data allows you to Study the value of a change. This is the Plan-Do-Study-Act (PDSA) cycle that W. Edwards Deming adapted from his mentor, Walter Shewhart. It is a learning process, a scientific method applied to profit from experience. This is not to say that an experiment cannot be done quickly, or that some situations are not truly urgent, but why sacrifice learning and improvement for speed?

Frequently Asked Questions

“We make management decisions based on our quarterly reports – which give current quarter compared to last year same quarter and year-to-date. Isn’t this sufficient?”
Unless you view data over time using Statistical Process Control (SPC) methods, it is hard to determine whether a process is producing acceptable or unacceptable results. Furthermore, a quarterly comparison aggregates three months (90 days) of business into a single number. Customers do not care about the average order time or average cycle time for the quarter; they care about what is happening right now! SPC methods are designed to provide such an understanding. To learn more, please see Quality Healthcare: A Guide to Developing and Using Indicators by Robert Lloyd (2004).
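As a rough illustration, the short Python sketch below contrasts a single quarterly average with the same data viewed week by week. The weekly fulfillment times are invented for this example; none of the numbers come from a real report.

    # Why a single quarterly average can hide what customers experience week to week.
    # All values below are made up for illustration.
    from statistics import mean, median

    weekly_days = [4.1, 4.3, 4.0, 4.2, 4.4, 4.3, 4.6, 4.8, 5.1, 5.4, 5.9, 6.3, 6.8]

    print(f"Quarterly average: {mean(weekly_days):.1f} days")  # one number; the drift is invisible

    med = median(weekly_days)
    for week, value in enumerate(weekly_days, start=1):
        flag = "above median" if value > med else "at/below median"
        print(f"Week {week:2d}: {value:.1f} days ({flag})")
    # Plotted as a run chart, the long run of points above the median late in the
    # quarter signals a non-random, worsening pattern, even though the single
    # quarterly average may still look acceptable.
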
“I’ve heard criticism of managing by numbers. Isn’t that what we are being urged to do here?”
Deming said that managers who did not understand a process would manage by the numbers alone. This argues for process knowledge: how the process works, what its normal variation is, and what it is capable of producing. You can only know this by studying data over time (SPC). He was also against setting arbitrary numerical quotas and goals, which have no regard for what is possible. We then exhort people to work harder, raise the bar, or ask for 10% more, when nothing about the process has changed. The Red Bead game below elaborates on this idea. It is well to remember that a number is an “indicator,” not the real thing itself. In fact, Deming was well known for saying, “There is no such thing as a fact.” He was trying to get people to realize that all measurement is fraught with error. Knowledge of variation is more important than having the number.
“How does measurement apply to people?”
For Deming, everything was an N of 1. In management, however, we tend to confuse the system and the person. Deming used the Red Bead experiment to teach the errors of management. As the game goes, the teacher picks a few people from the audience to be "Workers" making widgets. They do their work by using a perforated paddle to scoop tiny balls out of a sampling bowl (a random process). If they scoop up any red balls, they get demerits from other members of the audience who were picked as "Inspectors." The "Accountant" keeps track of the quality of each Worker's results. After a few rounds, the teacher (Manager) calls up the Worker with the best score, praises them, and gives them a raise, then calls up the Worker with the poorest score, scolds them, and threatens to fire them if they don't improve. After a few more rounds, the top performer is offered a promotion and the poorest Worker is fired. This can bring people in the class to tears, because they recognize that there is no basis for what is going on, either in the class or in their own jobs. Too often, we blame people for the ill effects of the system we helped create, and only we leaders can fix that. (A simple simulation of the game is sketched at the end of this entry, after the list of measures below.)
“Financial data is fairly easy to come by, as is customer satisfaction. It is outcomes data that we struggle with.”
There are other measures to consider. Let me draw an example from healthcare. Jim Handyside, at the Vermont Oxford Network NIC/Q meeting (referenced below), made these useful distinctions among measures:

  • Outcome measures relate directly to the aim or result of your study – e.g., reduction in the number of transfusions per patient day.
  • Intermediate measures predict an outcome – e.g., fewer infections (which lead to fewer deaths).
  • Process measures assess the points in the sequence or flow of the process that lead to an outcome – e.g., actual performance compared to a guideline, or percent compliance.
  • Proxy measures are indirect measures that coincide with or approximate the outcome – e.g., rate of material usage when the material is related to the outcome (for example, hand hygiene cleanser usage).
  • Balancing measures capture the unintended consequences or adverse side effects that can occur when you make changes – e.g., when reducing the number of painful procedures, are you missing important care?
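Returning to the Red Bead experiment described above, here is a minimal simulation sketch in Python. The worker names, bead mix, paddle size, and number of rounds are arbitrary assumptions for illustration; the only point is that every score is produced by the system (a random draw), not by the worker.

    import random

    random.seed(1)
    workers = ["Ann", "Ben", "Cal", "Dee", "Eve", "Flo"]
    RED_FRACTION = 0.20   # assume 20% of the beads in the bowl are red (defects)
    PADDLE_SIZE = 50      # beads scooped per round
    ROUNDS = 4

    totals = {w: 0 for w in workers}
    for _ in range(ROUNDS):
        for w in workers:
            # Each scoop is a random draw; the Worker has no influence on the count.
            reds = sum(random.random() < RED_FRACTION for _ in range(PADDLE_SIZE))
            totals[w] += reds

    for name, reds in sorted(totals.items(), key=lambda kv: kv[1]):
        print(f"{name}: {reds} red beads over {ROUNDS} rounds")
    # Whoever lands at the top or bottom of this ranking did nothing different;
    # praising, firing, or promoting on these numbers rewards and punishes chance.
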
“I had statistics in school, isn’t this enough to understand measurement and variation?”
Most professionals receive some training in “enumerative statistics,” such as descriptive statistics, tests of significance, and regression analysis. SPC is a distinct branch of statistics, initially developed by Dr. Walter Shewhart in the 1920s. According to Dr. Bob Lloyd, the key distinction is that enumerative statistics examine aggregated data at fixed points in time to determine whether one group of data is statistically different from another. SPC, or analytic statistics, seeks to understand the variation that occurs in the data over time, through the use of run and control charts. The question becomes whether the data reflect common or special causes of variation, and whether prediction is possible, not whether two sets of data are different. This is applied science, not controlled research or experimentation.
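As a small illustration of the analytic view, the sketch below builds an individuals (XmR) control chart from a series of made-up monthly values, using the standard 2.66 x (mean moving range) formula for the limits, and flags points that fall outside them. It is only a sketch; real charts also apply run rules and need enough data points.

    from statistics import mean

    values = [23, 25, 24, 26, 25, 27, 24, 26, 25, 38, 26, 25]  # e.g., monthly complaint counts (invented)

    center = mean(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = mean(moving_ranges)
    ucl = center + 2.66 * mr_bar  # upper control limit
    lcl = center - 2.66 * mr_bar  # lower control limit

    print(f"Center line {center:.1f}, control limits [{lcl:.1f}, {ucl:.1f}]")
    for month, v in enumerate(values, start=1):
        verdict = "special cause - investigate" if v > ucl or v < lcl else "common cause"
        print(f"Month {month:2d}: {v}  ({verdict})")
    # Points within the limits reflect common-cause variation (the process behaving
    # as it routinely does); the point outside the limits (month 10 here) signals a
    # special cause worth investigating before changing anything else.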

Educational Opportunities

  • Key Associates offers a course on Statistical Thinking. Contact us.
  • Web-based training on the basics of control charts: SPC Workout Basic SPC Training.
  • More online instruction: Quality America.
  • On-site training, PowerPoint training modules, and a free monthly e-zine: SPC PowerPoint Training.

Useful Websites & Newsletters

  • See the annual reviews of software products in Quality Progress and Quality Digest.
  • Free trial of CHARTrunner by PQ Systems: CHARTrunner Lean.
  • Statistical software for Windows: Minitab 16.
  • QI Analyst, an SPC product with more industrial and real-time applications: Wonderware QI Analyst Software.
  • Charts, graphs, and diagrams as add-ons to MS Excel: QI Macros.

Articles/Publications

Books are linked to web descriptions:


 

Buy MK’s latest book!