Flipped learning: Beyond the surface to self-regulation

This is the first post in what I hope will become a recurring feature here at the blog, where I take a journal article I've read and give a short review. I need more research articles in my weekly intellectual diet, and tying my reading to blogging seems like a good way to keep me on track with both. And maybe this will give a signal boost to some of the good research that's out there. So, enjoy.

Introduction

The article I'm focusing on today is Flipped classrooms and student learning: Not just surface gains by Sarah McLean, Stefanie Attardi, Lisa Faden, and Mark Goldszmidt of Western University in London, Ontario. McLean and Attardi are in the Department of Anatomy and Cell Biology. The article appeared in Advances in Physiology Education earlier this year.

I have a Google Scholar alert that sends me notifications of research articles that have certain keywords, and "flipped classroom" is one of them. That's how the paper showed up on my radar. What made me want to dig deeper into it was this line from the abstract:

Focusing on a new flipped classroom-based course for basic medical sciences students, the purpose of the present study was to evaluate students’ adjustments to the flipped classroom, their time on task compared with traditional lectures, and their deep and active learning strategies.

The phrase "self-regulated learning" never appears in the paper, but the items mentioned here in the abstract pertain directly to that idea. There are any number of research results that show that students in flipped learning environments perform at least as well as their counterparts in unflipped environments -- on standard tests of content knowledge. But what about the bigger picture? It's always seemed to me that students in a flipped environment would show big gains in self-regulated learning strategies as well, because the main ideas of self-regulation come to the point every day, and students practice and get training on independent learning every day. So the focus on the effects of the flipped environment on learning strategies and metacognition intrigued me.

The methods

The study focused on students in the "Medical Sciences 4200" course, which is "an elective for students in their fourth year of a nonthesis-based honors specialization basic medical sciences bachelor’s degree" -- which is quite a mouthful, and it raises some questions that I'll mention in a minute. The course was designed from the start as a flipped learning experience, with students completing online learning modules (OLMs) prior to class and then participating in a variety of in-class activities. The OLMs were essentially hour-long lectures with interactive elements like quizzes embedded within them. The classes met once a week for two hours.

The students who consented to be in the study were given surveys and weekly reflective questions to gauge how they were approaching the flipped design, including how much time they were spending on task, how they were using the OLMs, and so on. Some of the questions students answered were objective while others were free-response.

Once the data were collected, the objective items were analyzed for frequency, and the free-response items were coded by two different reviewers to look for patterns in the qualitative data. The paper goes into a lot more depth about the coding process and the particular codes the authors ended up using.

The results

It's probably not surprising, but 70% of the students in the study reported doing the OLMs (the pre-class work) the night before class rather than spacing the work out over a longer period. Also, 80% of the students completed their OLMs in one sitting and did not return to them later or watch them in shorter bursts. In other words, as the authors point out, these students tended to treat pre-class work exactly the same as an in-class lecture -- you do it in a "just in time" way, and you sit and listen to the whole thing straight through.

After the first OLM, half of the students reported that they would be changing their strategy for completing the OLMs based on their experiences -- mostly in the form of time-management adjustments. This is a key self-regulatory behavior. When learners take an approach to learning that doesn't work for them, you would like to see them spontaneously work to come up with better ways of learning. And these students did. (But again, that demographic... See below.)

One of the more interesting results of the study -- and it wasn't explicitly advertised in the title -- was the reduction in multitasking behavior that students exhibited in the flipped learning environment versus a traditional lecture environment. When asked, most students said that during a traditional lecture class meeting they will attempt to multitask in some way -- checking social media, checking email, texting, or surfing the web. The self-reported rates of multitasking for students in this course were much lower, and the differences between the flipped course and the students' "favorite lecture course" were statistically significant at the 0.01 level on two measures and at the 0.001 and 0.0001 levels on others.

The study also found that students were engaging in deep learning practices that can be attributed to the course design. For example, the authors found that students were handwriting notes onto the study sheets provided with the OLMs as they worked through them, even though they had the option to type their notes into the study sheets. The students were taking the information in at their own pace -- something you can't do easily in a live lecture -- and reframing it in their own ways.

My take on the results

I appreciate studies like this that focus on more than just exam performance. For me, the real value of flipped learning is in the way "lifelong learning" -- so often used merely as a marketing catchphrase -- becomes a major goal of the course and an intentional focus of instruction. A flipped learning environment lets students build their self-regulatory muscles day in, day out. And more research needs to be done to highlight this and to see where this process can be improved. But let's take a moment and just appreciate the main takeaway from this study, which is that flipped learning experiences seem to lead to deep and active learning as well as mature self-regulatory behavior.

The multitasking results are interesting because I had never thought about multitasking as a potential barrier to learning in a flipped course. But as the authors point out, in a flipped learning environment technology is often ubiquitous, and so is the temptation to misuse it. The thing to realize is that technology is ubiquitous, period. The best thing we as professors can do is design our courses so that active learning is at the forefront -- so the temptation to check Twitter, for instance, is minimized -- and the proper use of technology is stressed. And these results seem to suggest that flipped learning design does this.

If there's one major issue I have with this study, it's the demographic. Students in the course are advanced undergraduates taking it as an elective (so there's personal investment built in), and it's part of an honors degree program. This is not your typical Calculus 1 or Freshman Comp course. That forces me to take most of the results with a grain of salt. For example, how much of the self-regulatory behavior students reported -- like making adjustments to how they work through the OLMs -- is really attributable to the course design, versus just their experience as veteran, high-achieving students taking a course because they want to? Would we see the same kinds of results if we transplanted the course design to, say, Calculus 1 or Freshman Comp and then ran the same study?

One other question I had was about the length of the OLMs. An hour is a long time for this sort of thing, and I was surprised to read that students don't spontaneously use "chunking" behavior when approaching a video of that length -- that is, breaking the video up into smaller units and digesting them one at a time. Then I realized, perhaps the length of the video itself is getting in the way. When you're presented with a long-form video, it seems like the natural thing to do is watch it in one sitting. And that's precisely what students did most of the time in this study. On the other hand, if you present students with ten 6-minute videos instead, it makes me wonder if the chunking behavior (which we know is beneficial to learning) would be more natural. ("Looks like I have ten videos to watch; I'll do three tonight and then do the rest the night before class.")

Odds and ends

I learned in this paper that there is a statistic for measuring interrater reliability called Fleiss' $\kappa$, and it turns out there are other interrater reliability statistics as well. That sounds like it could be handy in any qualitative study, as well as in studies about the reliability of evaluations -- for example, if you did a study about whether SBSG (standards-based/specifications grading) has better interrater reliability than points-based grading.
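To make the idea concrete, here's a minimal sketch in Python of how Fleiss' $\kappa$ is computed. Everything here is illustrative -- the `fleiss_kappa` function and the example table are mine, not the authors' -- but it follows the standard definition: compare the agreement you actually observe among raters to the agreement you'd expect by chance.

```python
import numpy as np

def fleiss_kappa(ratings: np.ndarray) -> float:
    """Fleiss' kappa for an (items x categories) matrix of counts.

    ratings[i, j] = number of raters who assigned item i to category j.
    Assumes every item was rated by the same number of raters.
    """
    n_items, _ = ratings.shape
    n_raters = ratings[0].sum()  # raters per item (constant by assumption)

    # Proportion of all assignments that went to each category
    p_j = ratings.sum(axis=0) / (n_items * n_raters)

    # Per-item agreement: fraction of rater pairs that agree on item i
    P_i = ((ratings ** 2).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))

    P_bar = P_i.mean()      # mean observed agreement
    P_e = (p_j ** 2).sum()  # agreement expected by chance alone

    return (P_bar - P_e) / (1 - P_e)

# Made-up example: 5 free-response items coded by 2 reviewers into 3 codes.
# Each row sums to 2 because both reviewers coded every item.
table = np.array([
    [2, 0, 0],
    [1, 1, 0],
    [0, 2, 0],
    [0, 0, 2],
    [2, 0, 0],
])
print(round(fleiss_kappa(table), 3))  # 0.677
```

Values near 1 mean the reviewers agree almost perfectly; values near 0 mean their agreement is no better than chance.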

You can continue the discussion in the comments area below -- and I am taking requests for other articles to review in later installments.