– Thomas Mennella –
“Show me the data.” The words sliced right through me. My face flushed; my palms got sweaty. I knew I had misstepped. I was a third-year doctoral student giving my first department-wide seminar presentation. The chair of the department had asked me a question and I’d overreached in my answer. I’d extrapolated. And now, I was paying the price. “Show me the data.” A scientist’s way of saying, “I’m not convinced. You over-interpreted. You screwed up.” I learned a lifelong lesson from that experience: don’t make claims without the data. My claim to you today? Experience with Flipped Learning can make the transition to remote instruction seamless. Students can continue to learn without skipping a beat, and enjoy every minute of it. Your post-COVID-19 teaching experience can be a huge success. And I have the data.
Search Google for any term related to online teaching, or any Twitter hashtag referencing remote instruction, and you’ll find dozens of “experts” telling you exactly how to transition your teaching to an online environment. This echo chamber of self-proclaimed expertise can be confusing and contradictory. Where’s the data?
Best practices
Long before the COVID-19 pandemic, Flipped Learning already had a vetted, open-source, and consensus-driven collection of best practices. They’re codified as the Global Elements of Effective Flipped Learning (GEEFL) and were compiled by more than a hundred Flipped Learning practitioners across 49 countries. Those of us who were aware of the GEEFL quickly aligned our Flipped Learning instruction to it. Little did we know, we were already preparing for the pandemic that was on the way.
Though it was rarely discussed or appreciated before COVID-19, Flipped Learning is a hybrid experience perfectly suited to online instruction. I’ll use my own transition as an example. Before the pandemic, my students watched video lectures through EdPuzzle with interactive questions embedded throughout. They also completed WSQs (watch-summarize-question) on each lecture video and submitted them through our LMS. This was the pre-work done in the individual space. Our class met twice each week. In our first in-class session, I would review the concepts and topics reported as still unclear in the submitted WSQs. These reviews were informal, using only the whiteboard; I even referred to them as “office hours in class.” Our second meeting was used for deep, critical-thinking “Challenge Questions” that pushed students into the higher levels of Bloom’s taxonomy. Each week ended with a take-home quiz, and the cycle repeated each subsequent week.
When it became clear that COVID-19 was going to hit New England hard, and institutions neighboring my own began to close for the semester, I could see the writing on the wall. Proactively, I began planning my own online transition. Students would still watch their EdPuzzle videos and submit their WSQs. I would now record my mini-reviews – by topic – and post them on YouTube. Challenge Questions would be converted into discussion prompts in our LMS (with students required to post their answers on Wednesday and reply to at least three classmates’ posts by Friday), and the take-home quiz would become an online quiz. On paper, the transition looked seamless: I’d sacrifice no learning resource, and the entire course design would be maintained. But would it work?
A shaky start
Our first week online was rocky, to say the least. Online learning was new to almost all of my students, and the uncertainty of the unfolding pandemic was hard on us all. Student engagement was low – far too low to support genuine learning – and I worried. I had no more carrots to offer (too much extra credit inflates the course grade to meaninglessness, and I had no more course points to spare). And I’m not a ‘stick’ kind of instructor; I don’t penalize, as a general rule. So I used the last tool I had: modeling. I made it a point to reply to every single student post in the discussion forums for the Challenge Questions. I modeled best discussion forum practices, and it worked. Students became acclimated to the format and engagement skyrocketed. Yes, it was incredibly time-consuming, but it was worth it.
From then on, it was smooth sailing. Weeks unfolded in a predictable procession. Student engagement was, for the most part, high. And learning appeared to be occurring. I was thrilled. But to be sure, only summative assessments of student learning, compared against the same measures from previous years, could tell the full story. And so, here’s the data.
Show me the data
This chart shows the year-to-year comparison of student performance on unit assessments: eReports, mastery checks and the cumulative, essay-based final exam. As you can see, there were no significant differences between the non-COVID-19 Spring 2019 cohort of students and those who transitioned to online learning in Spring 2020 (error bars represent the standard deviation of the mean). Students did equally well in both semesters.
But was I a confounding variable? In other words, was I skewing the data by being unconsciously more lenient this spring? To answer this, I needed to leverage my summative assessment reports. In my courses, I assess student learning on all course learning objectives (CLOs) as listed in my syllabus. I do this using a pre-test/post-test approach: students take a comprehensive test on all course topics in the first week of class (where they’re expected to do quite poorly), and then take the same test in the last week of class (where, hopefully, they do much better). By grouping test questions by CLO and comparing student performance pre- versus post-, I can effectively measure student learning. I use the same pre-/post-test each year, making year-over-year comparisons valid. Here’s an example of student learning from Spring 2020:
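For readers who want to try this kind of analysis themselves, here is a minimal sketch of how per-CLO learning gains could be computed from pre-/post-test scores. It assumes Python with pandas and SciPy, and the file and column names (“pre_post_scores.csv”, “student”, “clo”, “pre_score”, “post_score”) are hypothetical – this illustrates the approach described above, not my actual grading scripts.

```python
# Minimal sketch of the pre-/post-test comparison described above.
# Assumes one row per student per CLO, with percent-correct scores.
# File and column names are hypothetical.
import pandas as pd
from scipy import stats

scores = pd.read_csv("pre_post_scores.csv")  # columns: student, clo, pre_score, post_score

for clo, group in scores.groupby("clo"):
    # Paired t-test: the same students answer the same questions before and after the course.
    t_stat, p_value = stats.ttest_rel(group["post_score"], group["pre_score"])
    gain = group["post_score"].mean() - group["pre_score"].mean()
    print(f"{clo}: mean gain = {gain:.1f} points (p = {p_value:.3g})")
```

For the cohort-to-cohort comparison (Spring 2019 versus Spring 2020 post-tests, where the students differ), an unpaired test such as scipy.stats.ttest_ind on the post-test scores would be the analogous check.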
As you can see, my students learned a lot despite transitioning to online learning. Overall, performance increased from below 20% accuracy on the pre-test to roughly 75% on the post-test. Additionally, significant learning gains were made in all CLO categories. But how does the Spring 2020 cohort compare to Spring 2019 on post-test performance? Did my students who transitioned to online instruction do more poorly on their post-test in Spring 2020 than in Spring 2019? No, they didn’t.
Without Flipped Learning, I would have been clueless about how to transition to an online format. Lecture-based instruction, my previous mode of content delivery, requires a passive audience, and I could never envision lecturing over Zoom as an effective online modality for teaching and learning. Embracing Flipped Learning before the pandemic, and then sticking with it through our transition, gave my students consistency – which they clearly appreciated – and yielded no loss of learning outcomes. I claim that Flipped Learning is the perfect bridge between face-to-face, on-ground instruction and an online format. It excels in both worlds and makes transitioning between the two seamless. I am not overreaching. I am not extrapolating. And I claim to be no expert. I simply showed you the data.
Join us as we discuss the ideas in this article with educators around the globe at the Second Wave Summit | 2020
GEEFL best practices covered in this article: