-by George Hess-
We all know that one of the central tenets of medicine is that doctors should “do no harm.” It’s a good standard to abide by in education as well. Over the past five years or so, numerous studies have established that Flipped Learning meets this test: students learn at least as well in flipped classrooms as they do with traditional methods. So when I see a pessimistic headline like the one in a recent Campus Technology article (“Study Finds Flipped Classroom Model Does Not Improve Grades in Health Science Course”), it makes me want to do a little investigating.
The accompanying article wasn’t that negative, and it was clear that the researchers themselves supported Flipped Learning, so I decided to go to the source and reconcile the discrepancy. But then I was sidetracked, because what I found was a study exhibiting many of the flaws found all too often in educational research.
The study compares the grades and opinions of students in a graduate health science course across two consecutive years: the first taught with a traditional lecture-homework model, the second with a flipped model. A quick look at the data on grades and test scores revealed that prior to implementing Flipped Learning, the average grade was already 94. Since the course was already very successful (no surprise in a graduate class), there was little chance of any significant improvement in the average grade, a classic ceiling effect.
This points to one of the main problems in educational research: the over-reliance on existing tests and assessments to measure a new methodology. It’s an easy trap to fall into, since numbers are readily understood, but they don’t always tell us what we need to know. Tests have an inherent bias toward traditional learning because they’re designed only to assess what has been taught in the past. When a new method is introduced, and it’s been established that it works at least as well as the old one, we should be asking whether the new method offers additional benefits. The researchers seemed aware of this, yet both the paper and the article still chose to focus primarily on grades based on test scores. They did conduct the obligatory student opinion surveys, which revealed that the flipped model had a positive effect on time management for most students, but there was no attempt to look at whether any learning beyond the test had occurred.
This misguided focus extends to the course design as well. As Jon Bergmann says, Flipped Learning is not about the videos. Yet it was the individual space that received the most attention here, and to be fair, it was well done. The group space, however, received far less. The researchers knew that activities varied across tutorial groups, and they conducted observations and surveys of what happened in the group space. But the data from the six groups are averaged, so there is no way to correlate specific activities with learning. Nor was there any attempt to implement new approaches for group space activities. At FLGI, we’ve learned that Flipped Learning makes it possible to apply many different models in the group space, and that’s what makes it so powerful.
Looking again at the data, another statistic the researchers overlooked jumped out: a substantial reduction in the standard deviation of grades. One of the primary goals of education is to teach every student. Practitioners of Flipped Learning have observed that it appears to accommodate differentiation better than traditional methods. That may be what happened here, but since the researchers don’t address it, it’s impossible to determine the cause. I strongly encourage them to revisit this.
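To see why that statistic matters, consider a quick sketch with made-up numbers (the grades below are hypothetical, not the study’s actual data): two cohorts can share the same average while differing sharply in spread. A smaller standard deviation at the same mean suggests fewer students are being left behind.

```python
import statistics

# Hypothetical grade lists, NOT the study's data:
# both cohorts average 94, but the "flipped" cohort is far less spread out.
traditional = [84, 88, 92, 96, 98, 100, 94, 100]
flipped = [92, 93, 94, 95, 94, 95, 94, 95]

for name, grades in [("traditional", traditional), ("flipped", flipped)]:
    mean = statistics.mean(grades)   # identical for both cohorts: 94.0
    sd = statistics.stdev(grades)    # much smaller for the flipped cohort
    print(f"{name:11s} mean={mean:.1f} sd={sd:.1f}")
```

A test-score comparison that looks only at the means would call these two cohorts identical, even though the second one tells a very different story about how uniformly the class learned.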
Educational research is an essential tool. We’ve had far too many untested methodologies foisted upon us that either failed or were flatly wrong. But research needs to be well designed and ask the right questions. The only information we can gather from this study is that Flipped Learning does no harm in a graduate health science class. That’s worth knowing, I suppose, but there was so much more we could have learned. And it certainly didn’t merit the negative headline from Campus Technology.