Education is often touted as data- or evidence-driven. But in this discussion, John Dues contends that educational data is often fiction: it is easy to distort, whether by manipulating the system that produces it or the data itself.
0:00:02.6 Andrew Stotz: My name is Andrew Stotz and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today, I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. The topic for today is Data is Meaningless Without Context. John, take it away.
0:00:28.3 John Dues: Yeah, thanks for having me back, Andrew. I've been thinking a lot about educational data and how it's often presented, and I think that so often what we're actually doing with our educational data is what I call writing fiction. We take a lot of liberties with the data that comes into our system, whether it's state testing data or some other type of data that gets reported out to the public, and we manipulate or distort that data in a way that paints our organization, our school system, or our state in a positive light. I think we sometimes do that to the detriment of actually working to improve those organizations or systems, because we spend so much time trying to paint this positive picture instead of just putting the data out there.
0:01:26.3 AS: It's interesting, what you're saying: the data itself could be accurate and good, but the context or the structure of how it's presented makes it meaningless.
0:01:39.9 JD: Yeah, we try so hard to paint it in this positive light, to make it look like we're doing a good job. Everybody wants to do a good job, but I think we often do that to the detriment of our systems.
0:01:54.7 AS: One of the things that made me think about it: in the financial world, we have a code of ethics, particularly for CFA charterholders, financial analysts, and it basically says you have to present a complete picture of your performance. So if you're managing money for 10 customers and you really bombed out on one of them, and you decide you're going to take the average of the nine you did well on and go out to your clients and say, "This is my performance," that violates your obligation to accurately represent your performance. And when I think about all the charts and graphs that people are making in education, I would say most of them are what I'd call CYA, cover-your-ass charts [laughter], charts built around "How do we make this look good?"
0:02:43.9 JD: Yeah, I read somewhere that there are three ways you can respond to your data. You can actually work to improve the system, which is the positive response, and the other two are forms of a negative: you can distort the system itself, or you can distort the data. A lot of times there's no nefarious motivation underneath that distortion, just this desire to paint your organization or your system in a positive light. Sometimes there is straight-up unethical behavior or cheating, but most of the time that's not what I'm seeing, and that's not what I'm talking about here today. It's more this taking of liberties, this writing of fiction: "Okay, we declined from two years ago, but it's up from last year." Those types of distortions of the data are fairly common in the education sector, and probably in all sectors too.
0:03:49.9 JD: Maybe I'll share my screen for the folks that have video, and I'll talk through it for the listeners that don't. One of the things I often focus on is state testing data, because so many people are looking at that data, all the way from state departments of education to school systems, individual schools, and individual teachers and classrooms with their students. And of course, families get these state testing reports as well.
0:04:23.8 JD: A handful of years ago, I was looking at one of these reports from the Ohio Department of Education. Picture this fancy, glossy, colorful PDF. It's got a big headline that says, "Ohio students continue to show improved achievement in academic content areas." Then it's got a table with all the state tests, all the different grade levels, and three columns for three different years of data. In the last column there are green up arrows where there's been year-over-year improvement and red down arrows where there's been a decline. And I was thinking to myself, "Well, some of these percentage changes are so small that they're essentially meaningless." For example, fifth grade science goes from 68.3% in one year to 68.5% the next. That's essentially a rounding error when you're talking about 100,000 or so students taking the test. Calling that improvement is a stretch at best.
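To make that "rounding error" point concrete, here is a minimal sketch, not from the episode, that treats each year's proficiency rate as a sample proportion and asks how large a 0.2 percentage-point change is relative to ordinary sampling variation. The cohort size of 100,000 per year is John's rough figure; everything else is an illustrative assumption.

```python
import math

def two_proportion_z(p1: float, p2: float, n1: int, n2: int) -> float:
    """z-score for the difference between two independent proportions."""
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return (p2 - p1) / se

# Fifth grade science figures quoted above, with an assumed cohort of
# ~100,000 test-takers in each year.
z = two_proportion_z(0.683, 0.685, 100_000, 100_000)
print(f"z = {z:.2f}")  # ~0.96: about one standard error, well below any
                       # conventional signal threshold
```

On these assumed inputs, the 68.3% to 68.5% shift is about one standard error of the difference, which is consistent with calling it noise rather than improvement.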
0:05:39.0 JD: Then I was focusing on third grade reading specifically, because that's such a critical area. In Ohio, there's actually a third grade reading guarantee: if you don't pass the test, there's the potential that you could get held back in third grade, so there's a lot of focus on that data. Reading on in that state education department document, it said, in effect, "Well, third grade did see a decrease this year, but when you look back two years, third graders actually had an increase in proficiency." So you have a decline from the previous school year to the most recent one, and they're still claiming improvement because the most recent year is up from two years before. And you start to ask yourself, "Well, what is improvement? Do we have a definition of improvement? And if so, what has to be present?"
0:06:43.4 JD: A few years ago, I came across a definition in a seminal work in our area called The Improvement Guide. The authors outline a definition of improvement with three components, and it made a lot of sense to me. If you're going to claim improvement: one, you have to alter how work or activity is done, or the makeup of a tool; basically, you have to change something about the work you're doing. Two, that change has to produce visible positive differences in results relative to historical norms. And three, it has to have a lasting impact. So when I go back and think about that state testing data, or really any type of data, you start to ask the question: is this really improvement, or is this writing fiction? Is this not really improvement, but twisting the numbers to fit our narrative?
0:07:45.0 JD: So when we think about that state testing data: do we have knowledge of how work or activity has been altered systematically? If I can't point to that, then how am I going to take the so-called improvement and bring it to other places in the state that may not have had those same gains? Do I have visible positive differences in results compared to historical norms, not just last year or even two years ago, but five, six, eight, or 10 years' worth of data? And have I been able to sustain that improvement? Has there been a lasting impact? Have I been able to hold the gains? If I can't do those three things, point to what we changed, compare to historical norms, and sustain the improvement, then I would argue we haven't really brought about improvement. We can't claim that we've improved our system.
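One way to operationalize the second criterion, comparing results to historical norms rather than to a single prior year, is a process behavior chart in the Deming and Wheeler tradition. The sketch below is not John's actual method; it assumes five hypothetical earlier years of data (only the 54.9, 63.8, and 61.2 figures come from the episode) and asks whether the latest result falls outside the natural process limits computed from the prior run.

```python
def xmr_limits(history: list[float]) -> tuple[float, float, float]:
    """Return (lower limit, mean, upper limit) for an XmR chart of
    individual values computed from a historical run."""
    mean = sum(history) / len(history)
    moving_ranges = [abs(b - a) for a, b in zip(history, history[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    # 2.66 is the standard XmR scaling factor for individual values.
    return mean - 2.66 * avg_mr, mean, mean + 2.66 * avg_mr

# First five values are hypothetical; the last two (54.9, 63.8) and the
# latest result (61.2) are the third grade reading figures from the episode.
history = [55.6, 56.9, 54.2, 57.4, 55.8, 54.9, 63.8]
latest = 61.2

lower, mean, upper = xmr_limits(history)
print(f"natural process limits: [{lower:.1f}, {upper:.1f}]")
if lower <= latest <= upper:
    print("within routine year-to-year variation: no evidence of improvement")
else:
    print("signal: investigate whether a real, sustained change occurred")
```

On these assumed inputs, 61.2 falls inside the limits, which is exactly the point: a wiggle up and down around a stable average is routine variation, not improvement.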
0:08:46.9 AS: It's interesting. Before we go on: in the numbers you were showing, the average is roughly something like 60%. That 60% is what? And what about the other 40%?
0:09:07.7 JD: Yeah, I'll go back. When you're thinking about state test scores, most states have some type of threshold: a goal that X percent of students will be considered proficient on any given test. In Ohio, that threshold is 80%. So the state says that in order to meet the benchmark, any given school needs to have 80% of its students, let's say on the third grade reading test, meet the proficiency standard. And what we saw in this particular data is that in the 2015-16 school year, 54.9% of the kids met that proficiency threshold. The following year, in '16-17, 63.8% of the kids met that threshold, and then in the most recent year in this testing document, '17-18, 61.2% of the kids were proficient. So just about 40...
0:10:04.8 AS: So even if it had been a sizable increase rather than a statistically insignificant one, roughly 40% of the students still aren't proficient. No matter what the government says the minimum standard is, it would be hard to argue much about improvement when you're that low. [chuckle]
0:10:32.8 JD: That's right, yeah. And that's what you often see in these types of documents. So 40%, a significant minority of students, are not proficient on the third grade reading test, 60% are, and there are these incremental increases and decreases depending on the year you're looking at.
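The arithmetic behind that exchange, using only the 80% benchmark and the three yearly figures quoted above (the script itself is just a convenience):

```python
# Gap to Ohio's 80% proficiency benchmark for the three years quoted above.
BENCHMARK = 80.0
for year, pct in [("2015-16", 54.9), ("2016-17", 63.8), ("2017-18", 61.2)]:
    print(f"{year}: {pct}% proficient, {100 - pct:.1f}% not proficient, "
          f"{BENCHMARK - pct:.1f} points below the benchmark")
```

Every year in the table sits 16 to 25 points below the benchmark, with 36% to 45% of students not proficient, regardless of which arrows point up or down.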
0:10:54.6 AS: It's like the Titanic heading for an iceberg and you say, "I've turned the ship one degree, but we're still gonna hit the iceberg."
0:11:01.9 JD: But we're still gonna hit, yep, yep.
[chuckle]
0:11:04.3 AS: Alright, keep going.
0:11:06.0 JD: Yeah. So I think what's really important, thinking about...