Building Knowledge Through Predictions: Deming in Schools Case Study with John Dues (Part 4)

In this episode (part 4 of the series), John and Andrew continue their discussion from part 3. They talk about how to use data charting in combination with the Plan-Do-Study-Act cycle to gain the knowledge managers need to lead effectively.
0:00:00.1 Andrew Stotz: My name is Andrew Stotz, and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I am continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. The topic for today is Prediction is a Measure of Knowledge. And John, to you and the listeners, I have to apologize. I'm a bit froggy today, but John, take it away.
0:00:30.9 John Dues: Yeah, Andrew, it's great to be back. I thought what we could do is build off what we were talking about in the last episode. We left off with an introduction to process behavior charts and the importance of charting your data over time. And the idea this time, like you said at the outset, is that prediction is a measure of knowledge, and prediction is a big part of improvement. So I thought we'd get into that: what role prediction plays in improvement, how it factors in, and how we can use our chart in combination with another powerful tool, the Plan-Do-Study-Act cycle, to bring about improvement in our organizations.
0:01:15.1 AS: And when you say that prediction is a measure of knowledge, you're saying that prediction is a measure of how much you know about a system? Or how would you describe that in simpler terms, so that someone who may not understand it could understand?
0:01:31.4 JD: Yeah, it took me a while to understand this. Basically, the accuracy of your prediction about any system or process is an observable measure of knowledge. So when you make a prediction about how a system or a process, and I use those words interchangeably, is gonna perform, the closer that initial theory, that initial prediction, is to what actually happens in reality, the more you know about that system or process. So when I say prediction is a measure of knowledge, that's what I'm talking about: you make a prediction about how something's gonna perform, and the closer that prediction is to how it actually performs, the more you know about that system or process.
0:02:19.1 AS: I was just thinking about how a parent who understands their kid very well can oftentimes predict their response to a situation. But if you brought a new kid into that house, one whose history, background, and way of reacting the parent didn't know anything about, then the parent doesn't really have anything to go on to predict, except maybe general knowledge of kids and specific knowledge of their own kid. How could that relate to what you're saying, that prediction is a measure of knowledge?
0:02:52.3 JD: Well, I think that's a great analogy. One of the things that Dr. Deming said that it took me some time to understand was that knowledge has temporal spread - just a few words, but they really cause some deep thinking. And I think what he meant was that your understanding, your knowledge of some topic or system or process, or your kid, has temporal spread. That understanding increases as you have increased interaction with that system or process, or in this analogy, your own kid. So when you replace a parent who knows their kid well with some other person that doesn't know that kid as well, they haven't had that same shared time together, so they don't have that same understanding. It's gonna take time for that understanding to build. I think the same thing happens when we're trying to change a system or a process, improve it, or implement a new idea in our system or process. The prediction at the outset is probably gonna be off, right? And then over time, hopefully, as we learn about that system or process, or the kid in this instance, that prediction is gonna get better and better as we learn, basically.
0:04:15.8 AS: Yeah, it's interesting, because saying the words temporal spread kind of gives way to the idea that Dr. Deming was educated around 1910, 1915, in speaking, reading, and writing. And he also said that his objective wasn't to just completely simplify. I think the messages he was bringing were difficult to simplify, but you could say that "improves over time" is what temporal spread may mean. Right? Okay. Let's keep going on this. This is interesting.
0:04:55.0 JD: Yeah, I think, maybe it'd be helpful if I share my screen and we can sort of connect the dots from last time to...
0:05:00.8 AS: Yep. And for the listeners out there, we'll walk you through what John's showing on his screen in just a moment. All right. Now we can see a chart on his screen.
0:05:11.7 JD: Yeah, so we see a process behavior chart. To orient the watchers, and even the listeners: the chart is a process behavior chart. That terminology can be a little bit confusing. Some people would call this a control chart, some people would call it a Shewhart chart; my preferred terminology is process behavior chart because it's literally charting some process over time. The example I used last time was charting my own weight. So you can chart personal items, and you can also obviously chart things that are important to you in your organization. But the main thing is, instead of leaving the numbers in a spreadsheet - that's what we talked about last time - pull the numbers out of the table and plot that same data over time. So you can see how it varies naturally, perhaps, or how it varies in special ways over time. So for the watchers, the blue dots are individual data points. The dates are running along the X-axis of the chart. And so you can see those moving up and down over time as I weigh myself every morning. Then we have the green line.
0:06:30.6 AS: At the beginning of the chart, we see those individual data points hovering around maybe 179 to 180, something like that.
0:06:41.8 JD: Yeah. Bouncing around in the 180, 178, 176 range. And then...
0:06:48.8 AS: And just for the international listener, John is not 180 kilograms [laughter], he's 180 pounds. Okay. Continue.
0:06:56.8 JD: That's right, that's right. On the Y-axis, we have weight in pounds. And so in addition to the blue dots, we've added a green line, which is the average over time. And then we have the last component of the process behavior chart, the red lines, which are the upper and lower natural process limits, or what some people call control limits; they are the bounds of this particular system at a given point in time. And so, as we watch this data unfold, we can see that it does move up and down in different ways, in different patterns, but it's far more illustrative than if I was just looking at that table of numbers. So when I do this daily, I don't wanna overreact to any single data point. Instead, what I'm trying to do is get a sense of how this data is performing over time, right? So I can see this unfold over the course of days and then weeks and then months, and all along, my knowledge of my weight system is increasing.
0:08:09.7 JD: Even if you don't know anything about process behavior charts, you could do this on a simple line chart or run chart without the limits, and you'd still learn much more than you would with that table of numbers. But with the addition of the red lines, the natural process limits, what I am doing is saying, based on some simple mathematical calculations, that these are the bounds of my system that I would expect - empirically, based on the actual dots on the chart. And if a point happens to fall outside of those red lines, I know something special has happened, because it's so mathematically improbable that it's not to be expected. And there are a few other patterns in the data, too, that you can look for besides a single point outside of one of those red lines.
0:09:08.4 JD: But I'm looking for those patterns to see if something special has happened, or I'm seeing if my data is generally bouncing around between those red lines. And in either case, there are different approaches to trying to improve that data over time. One other thing that I like to do: I always make my data blue, my average line green, and my natural process limits red. And then whenever I do this internally with data from our own organization, whether it's attendance data or test data or financial data, whatever the data is, I always use that same pattern. So people get used to seeing these colors, and they associate blue with data, green with the average, and red with the limits.
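For listeners who want to try this with their own numbers, here is a minimal sketch of the kind of chart John is describing: an individuals (XmR-style) process behavior chart using his color convention of blue data, green average, and red limits. John doesn't spell out his exact calculation in the episode; the 2.66 scaling constant is the standard one for individuals charts, and the daily weights below are made-up placeholders.

```python
# Minimal sketch of an individuals (XmR-style) process behavior chart,
# using the color convention from the episode: blue data, green average,
# red natural process limits. Assumes matplotlib is installed; the daily
# weights are placeholder values, not John's actual data.
import matplotlib.pyplot as plt

weights = [180.2, 179.6, 181.0, 178.8, 179.4, 180.5, 178.1,
           179.0, 177.9, 178.6, 179.8, 178.3, 177.5, 178.0]
days = list(range(1, len(weights) + 1))

average = sum(weights) / len(weights)  # the green line

# Average moving range: mean absolute difference between consecutive points.
moving_ranges = [abs(b - a) for a, b in zip(weights, weights[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits (the red lines) for an individuals chart.
upper_limit = average + 2.66 * avg_mr
lower_limit = average - 2.66 * avg_mr

# Flag any point beyond the limits - the "something special has happened"
# signal described above (other run rules exist but are omitted here).
signals = [(d, w) for d, w in zip(days, weights)
           if w > upper_limit or w < lower_limit]
print("points outside the limits:", signals)

plt.plot(days, weights, marker="o", color="blue", label="daily weight")
plt.axhline(average, color="green", label="average")
plt.axhline(upper_limit, color="red", label="natural process limits")
plt.axhline(lower_limit, color="red")
plt.xlabel("Day")
plt.ylabel("Weight (lb)")
plt.legend()
plt.show()
```

Plotting the limits this way makes the check visual: any blue dot beyond a red line, or one of the other run patterns, is a signal worth investigating rather than routine noise.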
0:10:00.7 AS: So tell us more about - I mean, one of the things, before we even talk about PDSA: what's happening here is that the upper limit and the lower limit shift down at two points in this chart. If you didn't change the upper and lower limits and you just had that one standard set across the whole chart, then it probably starts to lose its value, because the process you're describing goes back in time to such an extent that things were different. Tell us about why you've made this adjustment.
0:10:46.0 JD: Yeah, I'd say if the natural process limits, the red lines, stay in the same spot, so if I don't see those special patterns, basically what I can assume is that, despite the fact that the data is bouncing around a little bit naturally, there's nothing significant that's happened, either in terms of my weight system getting worse or, what I want in this case, getting better. Obviously, I wanna lose a little bit of weight.
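On the chart John is showing, the red limits are recalculated and shift down at two points. One simple way to mirror that, once you've decided where a sustained shift begins (for example, after a signal such as a long run of points below the old average), is to compute the same limits separately for each phase. The phase break points and weights below are assumed purely for illustration.

```python
# Sketch: recomputing natural process limits for separate phases of the data,
# as in a chart where the limits shift down after sustained changes. The
# segment boundaries would normally come from a detected signal; here they
# are simply assumed, and the weights are made-up values.

def xmr_limits(values):
    """Return (average, lower limit, upper limit) for an individuals chart."""
    average = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return average, average - 2.66 * avg_mr, average + 2.66 * avg_mr

weights = [180, 179, 181, 178, 180,   # phase 1
           176, 175, 177, 174, 176,   # phase 2: sustained shift down
           172, 171, 173, 170, 172]   # phase 3: another shift down

phase_breaks = [0, 5, 10, len(weights)]  # assumed shift points

for start, end in zip(phase_breaks, phase_breaks[1:]):
    avg, lo, hi = xmr_limits(weights[start:end])
    print(f"points {start + 1}-{end}: average {avg:.1f}, limits [{lo:.1f}, {hi:.1f}]")
```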