Do Your Flipped Lessons Include This Critical Piece?

Editors' Features | October 16, 2019

 – Thomas Mennella –

“I hate watching videos, Professor.” “I learn enough from the class activities.” “This class takes up too much time at home.”  “I can multi-task and watch your videos while doing other things, too.”

We’ve heard it all, right?  Over the past seven years that I’ve been flipping, I’ve heard all of the excuses above, and more, for why and how students don’t need to do the prework (or don’t need to do it well).  From where I sit, asking students to simply and passively watch two videos a week doesn’t seem like much.  But many students hate it.  So a question recently emerged for me: how important is it for students to fully engage in the prework?

To be clear, this wasn’t a research question that I planned to answer; I always believed that students actively watching the prework videos was critical to the success of a flipped classroom.  Instead, this question answered itself for me when my students took their first exams of the semester. The initial curiosity arose from the extreme bimodality of my recent exam grade distribution.  As you can see below, among the 47 students who took my Genetics Unit 1 exam, the majority (59.6%) received a grade either in the 90s–100s (there are ten extra credit points available, making the top score a 110) or a failing grade in the 60s or below.  Far fewer students fall in between those two extremes. One doesn’t need to be a scientist to look at this data and suspect that something is differentiating these two groups of students.

[Figure: histogram of the 47 Genetics Unit 1 exam grades, showing the bimodal distribution described above]

Since all students sit through and engage in the same group space experiences and activities, I immediately suspected engagement with the prework videos to be the differentiator in the grade distribution.  Luckily, I finally had a way to measure this. Last year, I decided to fully embrace EdPuzzle as my platform for hosting my videos. EdPuzzle allows you to track student video-watching and to embed questions into your videos.  Students receive points for watching the videos in my class, so all students ‘watch’ the videos, but nothing stops them from doing other things while a video plays, or from simply zoning out. To incentivize active watching, I embed questions into my videos; some test comprehension of lower-level Bloom’s concepts, but many simply test whether the student actually watched (e.g., in a lecture where billy goat beards are used to explain a heritable trait, an unflattering picture of Brad Pitt is used as an example.  Soon after, an embedded question asks: “Which celebrity was used to illustrate billy goat beards?” A student who was paying attention would know). Student performance on these EdPuzzle questions is therefore a measure of active engagement with the videos.

The stage was set.  The fundamental question I asked was: Does student performance on the EdPuzzle questions correlate to student performance on the unit exam?  Or, in other words, do students who watch the videos actively do better on their exams? The answer is a resounding yes!

The graph below plots each student’s average accuracy on the EdPuzzle questions (a measure of engagement with the videos, on the X axis) against their score on the unit exam (on the Y axis).  The graph itself tells the story: indeed, students with better accuracy in EdPuzzle do better on their exams. More impressively, though, the correlation coefficient of this relationship is a strong 0.55 (where a value of one represents a perfect positive linear relationship and zero represents no linear relationship at all; correlation, of course, does not by itself prove causation).  And, still more striking, there appears to be a sharp cut-off: do better than 90% on your EdPuzzle question accuracy and you can expect to do well on the exam; no student with an EdPuzzle question accuracy above 90% received a grade below 78.5 on the exam.

While this result is no surprise, it is striking to see the data bear it out so starkly.  What’s more, the implications of poor video-watching habits bleed over into the group space.  If the group space activities leverage and build on the prework, students who didn’t reach Bloom’s lower levels at home are not prepared to make sense of those activities.  The activities have no context and likely go right over those students’ heads.

So, next week I will once again explain to my students that Flipped Learning is a formula for success.  That formula requires actively and attentively watching the videos, participating in class, and brushing up at home, week after week.  Faltering on any of those steps compromises the power of Flipped Learning. The only difference is that next week, I will not be preaching this advice from the pulpit of my classroom.  I’ll instead be presenting it with hard data and statistics behind me to back it all up. And it’s my hope that this will make the difference.






Dr. Thomas Mennella
I have been an instructor in higher education for over ten years. Starting as a lecturer at the University of Massachusetts – Amherst, and then moving on to an Assistant Professorship at Delaware State University (DSU), a small public university, I experimented with Problem-Based Learning (PBL) and was an early adopter of the iClicker student response system. Now an Associate Professor at Bay Path University, a private liberal arts institution in western Massachusetts, I primarily teach Genetics and Cell and Molecular Biology. I am Flipped Learning 3.0 Level-II Certified and a founding member of the FLGI International Faculty.



