Probably the most important ingredient of teaching excellence is **frequent communication with students**. Most of the mistakes I've made as a teacher — and over 25 years, there have been a lot — have boiled down, in the end, to mistakes in communication. We can't read minds, and our reading of body language is usually flawed too. So to really know what students are thinking, what they know and don't know, and how they're doing, we have to maintain clear lines of back-and-forth communication at all times — for example, by giving frequent informal course evaluations during the semester and surveying students on a regular basis — otherwise our teaching is just guesswork.

At my last Trimesterly Review I set a goal to improve the student experience in my courses, specifically the Precalculus course I'm teaching online this summer. I've taught Calculus online before, and in a hybrid format back in Fall 2018, and the classes were *OK...* but *just* OK. Each time, I've come away thinking that something was missing from the student experience. I couldn't quite put my finger on it — which told me that I wasn't communicating well enough with my students and wasn't collecting the right amounts or the right kinds of data from them. All I know is that I got decent course evaluations, but there were signs that a lot of improvements could be made. So this summer I am really focusing on finding and making those improvements. To find them, I realized I needed some metrics for gauging how the students are doing.

In the past, I've experimented with two kinds of approaches to getting information from students through mid-semester evaluations:

*The minimalist approach*. I've asked students to fill out a Start/Stop/Continue form — just three questions — and looked at the verbal data. I've even done a two-question version where I ask students "What do you love about this course?" and "What would you change about this course?" The "pro" of this approach is that it's simple and free-form and generates a lot of qualitative data. The "con" is that it's all qualitative and subject to my interpretation; there aren't any quantitative items that I can track.

*The maximalist approach*. This involves taking a significant subset of the end-of-semester course evaluation questions and putting them in a form for a mid-semester evaluation. The "pro" here is that it generates a lot of numerical data that I can look at, and it trains the students on the course evaluation questions so that they're familiar by the end of the semester. The "con" is that it's often 10+ questions long — too much data to slice effectively, and students are put off from responding to such a long survey.

I realized that I needed to make up my own metric that is somewhere in between these two approaches: quantitative in nature but grounded in good qualitative ideas and sound theory; short and sweet enough that students will want to finish it, but not so short that the data have no explanatory power; and something that I could give students on a very frequent (weekly) basis and that would inform my goals for teaching.

What I have come up with is something I call the **Five Question Summary.** It's just the following five questions, which students rate on a scale from 1 (strongly disagree) to 5 (strongly agree):

1. I was challenged intellectually by the content and activities this week.
2. I had plenty of support from the professor, my classmates, and the course tools as I worked this week.
3. I am closer to mastering the ideas of the course now than I was at the beginning of the week.
4. I made progress in learning this week because of my own efforts and choices.
5. I felt I was part of a community of learners this week.

The first two questions address the balance between *challenge* and *support.* This balance is at the heart of my teaching philosophy and I want to know on a regular basis how students are experiencing this balance (or lack thereof). As I was thinking up these questions, it occurred to me that you can put these two variables into a 2x2 matrix to create four quadrants, each of which describes a student's potential situation in the course. Students in quadrant I (high challenge/high support) are in something like a state of **flow**, which is what I want. Students in quadrant II (high support/low challenge) are being **coddled** (not a fan of that term but I couldn't come up with anything better). In quadrant III (low support/low challenge) students are **disengaged**. And if you're in quadrant IV (high challenge/low support) you are **stressed** (originally I described it as "screwed").

What I hope to see week after week is a majority of the students, if not all of them, in the **flow** quadrant; and if there are students elsewhere, I can track them down, talk with them, and hopefully see them migrate quadrants.
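The quadrant classification is simple enough to express in a few lines of code. Here's a minimal sketch (my own helper, not from the post); treating the scale midpoint of 3 and above as "high" is an assumption:

```python
def quadrant(challenge: int, support: int, cutoff: int = 3) -> str:
    """Classify one week's (challenge, support) ratings into a quadrant.

    Ratings are on the 1-5 Likert scale; counting the midpoint (3) and
    above as "high" is an assumption, not part of the original post.
    """
    high_challenge = challenge >= cutoff
    high_support = support >= cutoff
    if high_challenge and high_support:
        return "flow"         # quadrant I: high challenge, high support
    if high_support:
        return "coddled"      # quadrant II: high support, low challenge
    if high_challenge:
        return "stressed"     # quadrant IV: high challenge, low support
    return "disengaged"       # quadrant III: low on both

print(quadrant(5, 4))  # flow
```

With a helper like this, finding the students to reach out to is just a matter of filtering the weekly responses for anything other than "flow".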

The other three questions come from Self-Determination Theory (SDT), which I have written about here before (for instance). SDT says that there are three main components of the motivation to complete a task: **competence** (the knowledge that you are gaining mastery over the task), **autonomy** (the belief that your improvements are the results of your efforts, not something external), and **relatedness** (the feeling that you belong to or are connected with others as you learn). Those last three questions are aimed directly at those three concepts. According to the theory, students who possess all three characteristics are "autonomous" and have high levels of intrinsic motivation; students who have competence and relatedness but not autonomy are said to be "controlled" (they can get the job done but believe it's not coming from within themselves); and students who lack either competence or relatedness are said to be "impersonal" (some key aspect of motivation is missing and they are not fully engaged).

The five-question summary allows me to gather data on a regular basis — I'm going with weekly right now — on how students experience support/challenge balance, and on the key levers for their motivation in the course. I feel like the five-question summary is what my course evaluations would look like if I stripped them down to the bare minimum. They are easy to ask, and best of all contain a surprising amount of information. Each week I am having students fill out a Google Form with these questions (and a few others) on it. This "weekly report" is a credit-bearing required part of their weekly coursework. The responses come back in a spreadsheet, and I can then do stuff with the data.

What kinds of stuff? I'm still playing with how I want to analyze my data, but for now I use the data to create three pictures:

- The 2x2 matrix for challenge and support that I described above;
- Another 2x2 matrix for the questions on competence and relatedness; and
- A boxplot for the question on autonomy.
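Since the Form responses land in a spreadsheet, getting them into analyzable shape is a short pandas job. This is only a sketch with made-up sample ratings (in practice you'd read the exported response sheet with `pd.read_csv`), and the short column labels are my own, not the actual Form questions:

```python
import pandas as pd

# Made-up sample standing in for one week's Google Form export; in practice,
# df = pd.read_csv("weekly_report.csv") on the downloaded response sheet.
df = pd.DataFrame({
    "challenge":   [3, 4, 5, 5, 5, 4],
    "support":     [3, 4, 4, 4, 4, 5],
    "competence":  [4, 4, 5, 3, 4, 5],
    "relatedness": [3, 4, 4, 2, 5, 4],
    "autonomy":    [4, 5, 4, 4, 5, 3],
})

# Frequency of each (challenge, support) pair -- these counts drive the
# "hotter color = more frequent" coloring in the matrix plots.
pair_counts = df.groupby(["challenge", "support"]).size()
print(pair_counts)
```

The same `groupby` works for the competence/relatedness pairs, so one script can produce all three pictures from a single DataFrame.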

The idea is that the challenge/support matrix is information in its own right, as I said earlier; the competence/relatedness matrix gives me a sense of whether students are (or aren't) at least in a "controlled" state of motivation; and if they make it to "controlled", the boxplot for autonomy tells me whether they made it all the way to the "autonomous" zone this week.

I make these visualizations with Python, using pandas and matplotlib — but you could just as easily use just a spreadsheet or, if your class is small enough, hand-plot the data on a piece of paper or a whiteboard. Here's what I got from my students (n = 11) from this past week. First, the challenge/support matrix:

The plot is set up so that individual responses are circles, and the "hotter" the color, the more frequent the response. Here, the light blue is 1 response (like the challenge = 3, support = 3) and the bright purple (challenge = 5, support = 4) happened 3 times. (The other purple is 2 times. I'm not a fan of this particular color map.) What this tells me at a glance is that students are mostly experiencing the right kind of balance of challenge and support in the course — there are no respondents who are coddled, disengaged, or stressed. If there were, I could reach out to them directly (these aren't anonymous, although you could make them so if you wanted) and see what's up. But at a glance there are no worries on this front.
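A plot like this can be sketched in a few lines of matplotlib. This is not the author's actual code; the responses below are made up, and the color map and cutoff lines are my own choices:

```python
from collections import Counter

import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Made-up responses as (challenge, support) pairs; color encodes frequency.
responses = [(3, 3), (4, 4), (5, 4), (5, 4), (5, 4), (4, 5), (5, 5)]
counts = Counter(responses)

xs, ys = zip(*counts.keys())
freqs = list(counts.values())

fig, ax = plt.subplots()
points = ax.scatter(xs, ys, c=freqs, cmap="cool", s=200)
ax.axvline(3, color="gray", linewidth=1)  # challenge cutoff between quadrants
ax.axhline(3, color="gray", linewidth=1)  # support cutoff
ax.set_xlim(0.5, 5.5)
ax.set_ylim(0.5, 5.5)
ax.set_xlabel("Challenge")
ax.set_ylabel("Support")
fig.colorbar(points, label="Number of responses")
fig.savefig("challenge_support.png")
```

Swapping in a different `cmap` argument is an easy fix if, like the author, you're not a fan of the default colors.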

Here's the competence/relatedness matrix:

This time, what I see is that there are a couple of students who, while feeling a sense of growing competence with the course material, feel somewhat disconnected from their classmates. Generally speaking, most students are in quadrant I or on its boundaries, meaning that they are at least at a "controlled" state of motivation, which I will accept as a good result; but there's room for improvement, and I need to talk with those two students in the lower half-plane (which by itself might improve their sense of relatedness) because their motivation is possibly at risk here.

Are the students who are in the "controlled" state of motivation actually at a higher level, in the "autonomous" state? That's where the autonomy plot comes in:

It's kind of hard to see, but the green line for the median is at 4.00, at the same place as the 25th percentile. So this is saying that 75% of the class rates themselves at a 4 or 5 on a scale of 5 on autonomy. At a glance, this is a good sign that students feel in charge of their own learning and not at the mercy of others.
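A sketch of how the autonomy boxplot can be produced, using made-up ratings constructed so that (as in the week described) the median sits at 4, on top of the 25th percentile:

```python
import statistics

import matplotlib
matplotlib.use("Agg")  # render without a display
import matplotlib.pyplot as plt

# Hypothetical ratings for n = 11 students; not the actual class data.
autonomy = [3, 4, 4, 4, 4, 4, 4, 5, 5, 5, 5]

fig, ax = plt.subplots()
ax.boxplot(autonomy)
ax.set_xticklabels(["This week"])
ax.set_ylabel("Autonomy rating (1-5)")
fig.savefig("autonomy_boxplot.png")

print(statistics.median(autonomy))  # 4
```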

Those three plots, which I can re-run at the click of a button next week once I have a new set of data, give me a kind of dashboard for taking the temperature of my class and real data with which to make decisions about what to do in following weeks. And I can keep asking this week after week — it's a short enough assignment that students aren't put off by it but informative enough that I can do something with the results. Because these are quantitative data, I can set concrete goals for my teaching (e.g. have every student in quadrant I of the challenge/support matrix from week 6 onward, or have a median of 4 on each metric, etc.) and measure my progress toward that goal.

This definitely isn't a perfect set of measures. For example I would need to do more analysis to see if the students who rated themselves low on autonomy are the same who rated themselves low on relatedness. Or maybe some students who have high self-ratings on relatedness have correspondingly low ratings on autonomy because they are *so* connected with others that they're not developing their own skills enough. Also these are self-reported ratings, and they have all the validity issues that self-reporting brings. And so on.

But as a simple, repeatable, lightweight instrument that generates informative data visualizations of what's important to me as a teacher, I'm pretty happy with these.

The code for generating all these reports is below (try not to laugh at my Python, please). Leave your own thoughts and suggestions in the comments.

In case you hadn't heard, my Twitter friend Josh Eyler has written a beautiful and practical book called *How Humans Learn: The Science and Stories Behind Effective College Teaching*:

First of all, I want to encourage all college faculty to read this book. It's unique in how it combines the best of evidence-based instructional practice with a deep respect for the humanity of learners. Some teaching-and-learning books are all about technique and research and miss the human element of learning; others conversely handle the human and emotional side well but are light on the underlying psychology and neuroscience of how the ideas translate into classroom practice. This book gets the balance exactly right, and just like my interactions with Josh over the years, I came away from reading it not only having learned a lot about teaching, but also with a palpable sense of the enormity of what it means to be in charge of a small part of the intellectual development of college learners.

I interviewed Josh about the book and other items back in January before I went into the hospital for heart surgery. I had not actually read his book yet; that didn't happen until *after* the surgery, while I was still in the hospital. Josh's book is really getting a lot of traction, as it should, and I noticed he's been on the road a lot lately to talk about it. This morning we had a quick Twitter interaction in which I congratulated him:

I read it while I was in the hospital back in February. It deserves to be widely read and taken to heart. (See what I did there?)

— Robert Talbert (@RobertTalbert) May 2, 2019

When I was reading Josh's book in the hospital, it was two days after my surgery. I had brought with me to the hospital my Kindle, the paper copy of Josh's book, and my phone, along with some personal items. I had intended to do a lot of reading, which I did – but what I didn't realize was how hard *reading* was going to be so soon after surgery. My body was worn out from the strain of the multiple-hour open-heart procedure, and I was on a constant infusion of major-league painkillers. The painkillers, obviously, had an effect on my brain functioning; I remember having to stop and re-read paragraphs in Josh's book multiple times just to get the symbols on the page to resolve into words, and then to process those words into meaning. It was *work*.

Then twice a day, a nurse or physical therapist would come by the room, help me get out of bed – because when your sternum has been split in half and your chest cracked open, it turns out you can't just hop out of bed a few days later – and take me for a walk. By "walk", I mean a slow walk, one foot at a time, of about 100 feet around the perimeter of the seventh floor of the heart hospital, while being attached to the nurse or PT person by a strap so they can catch me if I fall, and using a walker. I needed the strap and the walker, too, because standing up and walking would spike my heart rate and make me dizzy. This, too, was exhausting work. And for someone who just a year prior was doing 5K races and Spin classes, it was extraordinarily humbling work.

Thinking back this morning about the context in which I read Josh's book, it seemed to me that being laid up and basically helpless in the hospital was the perfect situation in which to think about the fundamental nature of teaching and learning.

As faculty, it is terribly easy to forget about the humanity of students. We can all come to objectify students and treat them as nuisances. *Students aren't prepared enough. They don't do their reading. They're entitled snowflakes. They have no work ethic and act like customers. My job would be a lot easier without them!* Or, possibly even worse, we can come to treat students as pure abstractions, just lines of a spreadsheet or faces in an audience, with no backstory and no personal stake in what's happening. We come in (or go online), teach our classes cheerfully, and never connect with students – forgetting that teaching and learning is a fundamentally human-to-human undertaking, and not realizing that failing to connect is failing to teach.

In reality, college students are a lot more like I was, while I was in the hospital. They have survived a process and are mostly willing to get better, but it's *work* – long-term arduous work that requires not only effort on their part, but more importantly assistance and guidance on our part. When I was in the hospital, as I said, I couldn't walk 100 feet at 1 mph unassisted. It was not because I lacked the work ethic or was "unprepared"; if I had merely exerted more effort, I probably would have just failed worse. Instead, what I needed was *guidance* in the form of doctors, nurses, and physical therapists who could, with great skill and patience, create a structure around me inside of which I could focus that effort and eventually get better.

Every patient in that hospital also had a *story*. For example, I was the college professor who one day started passing out while running on the treadmill, and I got a valve replacement so that I could go back to an active lifestyle. (And the only patient on the floor who watched Premier League soccer.) The person down the hall was a grandfather who was in for heart bypass surgery *and* a lung transplant, and was going to be there for a while. Another was suffering from depression in addition to recovering from heart surgery, and was refusing to do her walks or breathing exercises because she just wanted to die. Each of us in that unit had a backstory and a vision of our futures, with the hospital and our treatment as a critical juncture.

Do we always remember that students are like this as well? They are human beings in a particular stage of intellectual development. They cannot be lumped together as a single mass. Teaching them is an enormous responsibility, and each student needs personal attention to know exactly how we can help them get from "before" to "after". We can talk and write about this pedagogical technique or another as much as we want, but at the end of the day there is no "technique" for teaching – only patience, skill, and attention focused on the unique stories of individual students.

And at the center of this kind of practice is a radical *humility *that has to be in place inside every instructor. You simply cannot focus on students when you are full of yourself. Hospitals are good places to learn humility. I am grateful that the early days of recovery from surgery for me were like a school for humility – you simply can't do what you want to do, and instead you have to subordinate yourself to the needs (or doctor's orders) of someone else.

So I hope that those of us teaching in the summer, and the rest of us who will be thinking about Fall semester soon enough, can take these ideas to heart (haha!) and come into our next round of classes as we should: ready to respect the unique humanity of students and work with them to get them ready for the rest of their lives. It's an enormous responsibility and an amazing privilege.

*This is a throwback post that first appeared on the Chronicle of Higher Education version of this blog back on April 28, 2014. I've made a few updates and edits for 2019, and added some new thoughts at the end.*

Many of the comments on my recent posts about flipped learning are principled skepticisms of flipped learning and the flipped classroom, and rather than bury my responses in an already crowded comment thread, I thought they deserved to be brought up point by point for discussion. *[2019 Note: "Recent" = January-March 2014.] *

Here’s the first one to bring up, and it’s a tough one. This (and many of the other topics I’ll be bringing up) come directly from Manda Caine’s comment on one of those earlier posts. She said:

When my colleagues and I have [taught with a flipped classroom], students do not perceive that a professor is teaching them at all, so we have comments such as, "We could just do this at home" or "Why am I paying all this tuition to just teach myself?" or "She doesn't teach. She just expects you to do it all yourself. The class is pretty much pointless" or "If I wanted to learn on my own, I'd just take an online class or get a book out of the library."

I wrote something about this issue a couple of years ago. A lot of what I am about to say is a recap and updating of those thoughts.

Thought of as a *problem* with the flipped classroom, this has two sides: A problem that *students* encounter (adjusting to the design of a non-lecture oriented class), and a problem that *faculty* encounter (dealing with negative student reactions). It’s tempting to say that once we get the students fully on board with what we are trying to do with the flipped classroom, the faculty problem solves itself. But it’s not that simple – because it’s possible the students have a point.

Let me explain. I loved this recent article about the shortcomings of lecture based on the author’s experience trying to teach her brothers how to play a card game. In short, she gave them a clear, engaging, and brief lecture on the rules of the game; they nodded along, indicating understanding the entire time; and when it came time to play, *nobody except her had any idea what to do*. I have a version of that story too. Once when I was a kid, I had a friend over and he wanted to play a board game. He didn’t know how, so I “taught” him by handing him the rules and telling him to take a few minutes to read them, and then I’d answer his questions. That went over about as well as in the first article.

Neither of these two approaches to learning works, because **neither of them takes into account the needs of the learner**. Both are one-sided. (Nodding along enthusiastically doesn't count as audience interaction.) Neither one involves the sort of partnership between students and instructor that the best learning experiences have. There are things the instructor does that help learners, and things that learners are responsible for because they are the ones learning. The best learning experiences strike a balance.

The flipped classroom does **not** automatically provide those sorts of outstanding learning experiences. What it provides is *space* and *time* for instructors to design learning activities and then carry them out, by relocating the transfer of information to outside the classroom. But then the instructor has the responsibility of using that space and time effectively. And sometimes that doesn’t work. In particular, if there’s no real value in the class time, then the students are not mistaken when they say they are teaching themselves the subject, and they are not wrong to resent it.

So what this means is that with the flipped classroom model, the instructor has the responsibility of designing what we might call *crucial learning experiences* – experiences without which you can't honestly say you've learned the subject. Here are a few examples of what I would consider crucial learning experiences in first-semester calculus (your mileage may vary):

- Computing the derivative of a second- or third-degree polynomial at a point, using the limit definition of the derivative.
- Taking a relatively complicated function ($y = e^{-x}(x^2 - 1)$ would be an example) and executing a complete analysis of its critical points, inflection points, intervals of increase and decrease, and intervals of concavity, and then hand-sketching a graph based on that information.
- Solving a nontrivial continuous optimization problem completely, by hand.
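To make the first task concrete, here is what that computation looks like for the polynomial $f(x) = x^2$ at the point $x = 3$ (my example, not one from the post):

```latex
f'(3) = \lim_{h \to 0} \frac{f(3+h) - f(3)}{h}
      = \lim_{h \to 0} \frac{(3+h)^2 - 3^2}{h}
      = \lim_{h \to 0} \frac{6h + h^2}{h}
      = \lim_{h \to 0} (6 + h)
      = 6.
```

A student who can carry out each of those algebraic steps, and explain why the limit exists, has had direct contact with what the derivative actually *is* rather than just the shortcut rules.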

Whether you have a traditional or flipped classroom for calculus, if a student had no experience whatsoever of successfully completing these three kinds of tasks, then you would be justified in having reservations about the completeness (and hence the validity!) of that student’s calculus education. In the traditional classroom, such experiences are usually reserved for homework. In the flipped classroom, the expectation is that problems like these will be done during class time under the supervision of the instructor.

*Under the supervision of the instructor* – there’s the rub. I don’t mean a kind of aloof, checking-your-Facebook-while-students-work kind of “supervision” but rather the kind of interactive engagement that a coach might have with his or her players while they practice. The coach doesn’t do the exercises *for* the players, but neither does s/he stand off to the side and let them flail around the entire time. There is interaction between the coach and the player, between different players, and between different groups of players. And through that interaction, questions get answered, others get raised – and things get learned, if it’s done right.

So, getting back to the original issue, if students are voicing the opinion that they are having to teach themselves their subject, it’s well worth looking into just what exactly is going on during class. If the answer is that we’re handing students the rulebook and telling them to learn how to play the game this way, then students have a legitimate beef. In this case, it’s time to give class time a makeover, of sorts, so that students are actively involved *with you* while working *with each other* (or by themselves, or some combination) on crucial learning experiences.

One of the reasons I support the use of a dual set of learning objectives, one “Basic” and one “Advanced”, is that it communicates to students and to the instructor that there are some things that are simple enough to learn before class (Basic) but other things that are best learned together (Advanced). Therefore there’s value in the class meetings, because mastery of only the Basic objectives is not sufficient evidence to indicate mastery of the subject. In student-speak, there’s no way they will get a decent or passing grade unless they can demonstrate mastery of the Advanced objectives, which are *primarily* dealt with in class with *your* (the instructor’s) help.

If we’re doing class this way, this should be enough to keep any reasonable student from saying that the flipped classroom is equivalent to an online course or that they are teaching themselves the subject. But not all students are reasonable, alas. The best I can do here is tell you what I’ve done in similar situations where it’s worked out well.

First, I give numerous opportunities for students to give me feedback on the course throughout the semester. I usually do this at the end of every 4-week period. (In fact my students sometimes get tired of being asked for feedback.) They have the opportunity – anonymously in some cases – to say whatever they need to say about the course, and they are specifically prompted in lower-level courses that if they have concerns about the flipped model to voice them. If a student responds with concerns, I will usually invite him or her to a meeting in the office to hear them out. When they come – and I’ve never had a student who didn’t – I just listen and take notes. When they are done, I ask for *specifics*. For example, one student said that she didn’t think it was fair that her grade should suffer because I run the class flipped. I asked her for specific instances where she lost credit on an assignment *directly because of* the flipped class structure. She couldn’t name one. So, I said, how can we be sure that the flipped structure *caused* your grade to go down? How do we know it might not now be *higher* than what it would be in a traditional class? Where’s the causality that’s being claimed?

I’m not trying to browbeat the student or defend my course design, but if you’ve done all you can to make the class meetings effective, in all likelihood students are complaining because of vague feelings that they cannot support. For their sake they either need to figure out exactly what’s bothering them, or else accept what’s happening as just another way to learn.

Even so, students still may not be able to think of “lecture” and “teaching” as two different things. For me what’s been helpful is to use the results of formative assessments — little assessments that take place in the moment while students are working on these in-class tasks or through their out-of-class prep work — as public evidence of learning. For example, if 90% of my students answer a clicker question correctly, I’ll hold it up to them as evidence of their learning. “You folks seem to really understand the concept of concavity so far.” Convince them that they are learning. Evidence is helpful for this.

It’s also important to take every opportunity – and to make opportunities where there are none to take – to celebrate student successes in the flipped classroom and explain to students how this might help them later on. It doesn’t hurt to talk about how employers like new hires who are self-motivated, don’t need a lecture to learn how to do something, and who – quite frankly – know how to teach themselves things and have a *taste* for teaching themselves things. So in a perfect world, the very thing that students are railing against here – teaching yourself a subject – is actually one of the end goals of “higher” (read: “meta-”) education, namely education about educating oneself.

**2019 Additional thoughts:** Ever since this post, I've used my writing and workshops about flipped learning to explore the idea of flipped learning as self-teaching. I've come to the conclusion that flipped learning should embrace, not explain away, the concept that it is built significantly on top of the idea of students teaching themselves things. In fact, in recent talks, I've been tweaking my definition of flipped learning to say that

Flipped Learning is a pedagogical approach in which **first contact with new concepts** moves from the group learning space to the individual learning space **through structured self-teaching**, and the resulting group space is transformed into a dynamic, interactive learning environment where the educator guides students as they apply concepts and engage creatively in the subject matter. [Tweaked phrases in bold]

That is, yes, in flipped learning we most definitely do expect students to teach themselves certain topics in the course. Not *all* topics (a lot of student complaints come from misunderstanding that point) and not without guidance. In fact, in my process for flipped course design, a lot of meticulous – some might say obsessive – planning takes place to identify exactly which topics will be covered by students in their individual spaces and what sort of structured guidance they will receive. This is what makes flipped learning different from what some people say they've been doing for centuries, where what they mean is that they hand students a book and tell them to read it for class.

But in the end, absolutely yes, we expect students to teach themselves things, and this is a feature not a bug. The main things to know about this fact are that (1) in almost every situation, there will be some complaints about the approach and (2) it's on us as the instructors of flipped courses to be aware of *why* we are embracing that approach, communicate that "why" to the students, provide the support that students need as they adjust to it, and – most of all – stick to the plan and don't cave in to pressure.

Most instructors are at least a little familiar with Bloom's Taxonomy, which is usually depicted as a pyramid:

Bloom's Taxonomy is a way of categorizing cognitive tasks in terms of their difficulty or complexity. It is most definitely old-school, invented by Benjamin Bloom in the 1950s and updated in the 2000s. The taxonomy in the pyramid above is actually just one of three parts to Bloom's Taxonomy, the other two addressing affective and psychomotor skills. It's a handy diagrammatic way of thinking about how "hard" different kinds of learning tasks are, and how they (literally) stack up against each other. And it should be stressed that Bloom's Taxonomy is just that: A *taxonomy*, a naming system, and nothing more. It's not a *hierarchy* that suggests that the lower levels are less important than the upper ones; nor is it a *timeline* that insists that one level must be mastered by learners before moving on to the next level.

I've used Bloom's Taxonomy countless times in talks, workshops, my book, and in my teaching practice to think about how to sequence learning activities, especially with flipped learning. Specifically: In traditional lecture-based instruction, we assume that learners encounter new ideas for the first time in their group meetings, where we assume they have zero knowledge of the new topic and focus class time on the **bottom half** of Bloom's Taxonomy. And then students work out the more complicated extensions of that basic knowledge, the **upper half** of Bloom's Taxonomy, through work done after class (homework, projects, etc.).

This model, with its emphasis on spending class time exclusively on the bottom half of Bloom, has numerous issues, which I describe at length in my book. The foremost of those issues is that it aligns the difficulty of student work inversely with student access to help --- learners hit the hardest material when the instructor is least available and group collaboration is possibly outlawed by the syllabus. Traditional instruction also fails to take advantage of opportunities to help students become self-sufficient learners, because the material most amenable to self-teaching --- the bottom half of Bloom --- is co-opted by the instructor.

Flipped learning, on the other hand, literally inverts the focus in terms of Bloom's taxonomy, so that the bottom parts of Bloom are reserved for student self-instruction through structured activities, and class time is focused on the upper parts of the taxonomy --- the most complex tasks, which are best served by having a rich social environment in which to work (i.e. class time).

Framing flipped learning in terms of Bloom's Taxonomy has, for me and the people in my workshops and talks, been a helpful way to visualize the concept. But recently, while preparing for a webinar on time management in flipped learning, I took this a step further, and I think the result has even better explanatory power.

Traditional instruction still assumes zero knowledge of new topics coming into class, and then the class meeting focuses on the bottom half of Bloom and post-class work on the upper half. But in applying Bloom's Taxonomy to flipped instruction, rather than just flipping the halves, it's more helpful to break Bloom's Taxonomy up into *thirds*.

- **Pre-class work** in flipped learning focuses on the **bottom 1/3** of Bloom's Taxonomy, that is, tasks related to "remembering" and "understanding". We insist students learn these tasks on their own prior to class through structured activities.
- **In-class work** then picks up where the pre-class work ends, focusing on the **middle 1/3** of Bloom, that is, tasks related to "applying" and "analyzing". These are the simplest extensions of the basics and will be the place where students need the most immediate help.
- **Post-class work** uses the basics and the extensions to address the **top 1/3** of Bloom, that is, tasks related to "evaluating" and "creating". These are the most complex tasks students can perform with a topic, and they often do not fit neatly into a 50- or 75-minute class period and are thus not good targets for using class time (although parts of those tasks can be addressed).

This division of labor seems to accomplish at least three important things for flipped learning:

- There are "cleaner edges" on the three phases of student work (pre-class, in-class, post-class) and therefore more focus and more perspective on what we should be doing in those phases. For example, thinking of learning activities in this way, we'd avoid putting tasks from the middle or top third of the taxonomy into students' pre-class assignments because this would be too difficult for many students and would steal energy better suited for mastering tasks in the lower third. Likewise, we'd steer clear of spending class time on tasks in the lower third of the taxonomy --- for example, giving students worksheets to do in class that merely rehearse rather than extend the basics --- because the basics have already been covered and now we need to spend time extending them.
- Students will be clearer on what they *are* responsible for learning prior to class and what they are *not* responsible for. Students need to be told what *not* to do just as much as what *to* do in order to focus their limited energy and time on the right things.
- By having the division of tasks like this, time management both outside and inside class meetings becomes easier. When you know that your class meeting should be focused only (or at least overwhelmingly) on application and analysis tasks, and not on basics or very-high-level tasks, there's less of a temptation to write a class activity that tries to do both, and so there's less opportunity to waste time on tasks that are inappropriate for the moment.

I'm teaching our "Functions and Models" course (a 5-credit course for students preparing for calculus) this summer and next year. It uses Matt Boelkins' free textbook. Here's what this split-into-thirds concept might look like as I prepare the lesson for Section 1.5, Quadratic Functions.

First, according to my Seven Steps philosophy, I'd come up with a list of learning objectives for the lesson and then reorder them in terms of Bloom's Taxonomy (simplest first). Here's a possible take on that reordered list, with the objectives that map into the lower third of Bloom ("Remember" and "Understand") highlighted:

- **State the definition of a quadratic function.**
- **Define what it means for a function to be concave up or concave down on an interval.**
- **Write a quadratic function in vertex form.**
- **Find the coordinates for the vertex of a parabola given by the quadratic function $f(x)=ax^2+bx+c$.**
- **Use the quadratic formula to find solutions to the equation $ax^2+bx+c=0$.**
- **Given a quadratic function $f(x)=ax^2+bx+c$, explain the effects of each of the parameters $a$, $b$, and $c$ on the shape of its graph.**
- **State whether a quadratic function is concave up or concave down.**
- Find the average rate of change in a quadratic function over an interval.
- Explain the specific behavior of the average rate of change in a quadratic function over equally-sized intervals.
- Apply basic computations involving quadratic functions in real-world problems.
- Given a data set, determine whether the data follow a quadratic pattern and if so, fit a quadratic function to the data.
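(As a quick illustration of why the vertex-related objectives belong in the lower third: they come down to one algebraic move, completing the square, which any student can rehearse on their own. For $f(x) = ax^2 + bx + c$,

```latex
f(x) = ax^2 + bx + c = a\left(x + \frac{b}{2a}\right)^2 + \left(c - \frac{b^2}{4a}\right),
```

so the vertex form falls out immediately and the vertex of the parabola is at $\left(-\frac{b}{2a},\ c - \frac{b^2}{4a}\right)$.)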

Assuming that this is correct for my situation, *those seven items would be the sole focus of students' pre-class work*. I'd want to make sure that each of those objectives has at least one of the exercises I give students before class focused on it. **These basic objectives are all things that students can, and should, be learning on their own through structured activities.** To teach them for the first time in class is to misuse the class time and rob students of the opportunity to build their autodidactic skills.

Of the four remaining objectives, I'd say the first three are middle-third tasks ("Apply" and "Analyze") --- definitely the first two are, and the third (applying to real world problems) could involve simple applications in class and more complex ones later. So *those three items are the sole focus of in-class work*. We will not spend time in class on any other objectives, except maybe a recall activity in the first 5 minutes of class to jog students' memory of the basics --- because those basics have already been covered, by students in their pre-class time. My in-class activity should then have an item about finding average rates of change and discovering that the average rates on equally-sized intervals change at a constant rate, plus some basic applications to real world problems.
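(For the record, the fact students are meant to discover is straightforward to verify once you have it: on an interval $[x, x+h]$, the average rate of change of $f(x)=ax^2+bx+c$ is

```latex
\frac{f(x+h) - f(x)}{h} = \frac{a\left[(x+h)^2 - x^2\right] + bh}{h} = a(2x + h) + b,
```

and replacing $x$ with $x+h$ for the next equally-sized interval gives $a(2x+3h)+b$, an increase of exactly $2ah$. So the average rates themselves change at a constant rate, which is the discovery the in-class activity is built around.)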

Finally, the last objective (modeling with a quadratic) is something I would leave for post-class activity, mostly because it involves using (and therefore learning) technology for doing this and perhaps gathering authentic data to analyze. That's a worthy goal, and it needs more time and space than my class sessions will allow.
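As a sketch of what that post-class technology work might look like (this is just an illustration with made-up data, not the actual assignment or the tool the course uses), fitting a quadratic to a data set takes only a few lines in Python with NumPy:

```python
import numpy as np

# Hypothetical data: heights (m) of a thrown ball at times t (s).
# These numbers are invented for illustration, not course data.
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5])
y = np.array([1.0, 5.8, 8.2, 8.1, 5.6, 0.7])

# Least-squares fit of a degree-2 polynomial y ≈ a*t^2 + b*t + c;
# np.polyfit returns coefficients from highest degree to lowest.
a, b, c = np.polyfit(t, y, 2)

# One simple way to judge whether the data "follow a quadratic
# pattern": the coefficient of determination of the fit.
residuals = y - (a * t**2 + b * t + c)
r_squared = 1 - residuals.var() / y.var()

print(f"f(t) = {a:.2f} t^2 + {b:.2f} t + {c:.2f},  R^2 = {r_squared:.4f}")
```

Students could compare the $R^2$ of a quadratic fit against a linear fit to argue whether the quadratic model is actually warranted, which gets at the "determine whether the data follow a quadratic pattern" half of the objective.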

In terms of my Seven Steps process, I would split those learning objectives into "Basic" and "Advanced". Under this rubric, *the Basic Objectives would be the ones targeting the bottom third of Bloom* and *the Advanced Objectives are everything else.* In the pre-class assignment, it would all come together to look like this:

BASIC objectives: Each student will be responsible for learning and demonstrating proficiency in the following objectives *prior* to the class meeting:

- State the definition of a quadratic function.
- Define what it means for a function to be concave up or concave down on an interval.
- Write a quadratic function in vertex form.
- Find the coordinates for the vertex of a parabola given by the quadratic function $f(x)=ax^2+bx+c$.
- Use the quadratic formula to find solutions to the equation $ax^2+bx+c=0$.
- Given a quadratic function $f(x)=ax^2+bx+c$, explain the effects of each of the parameters a, b, and c on the shape of its graph.
- State whether a quadratic function is concave up or concave down.

ADVANCED objectives: The following objectives should be mastered by each student DURING and FOLLOWING the class session through active work and practice:

- Find the average rate of change in a quadratic function over an interval.
- Explain the specific behavior of the average rate of change in a quadratic function over equally-sized intervals.
- Apply basic computations involving quadratic functions in real-world problems.
- Given a data set, determine whether the data follow a quadratic pattern and if so, fit a quadratic function to the data.

Long story short, I'm liking this way of splitting Bloom's Taxonomy into thirds and using these as a tool for sequencing learning activities in a flipped environment. It helps give both focus and perspective to our work and our students' work, which saves time and makes that work more effective.

*Due to being gone for a speaking gig over the weekend and then celebrating Easter, I'm sharing a throwback post with you today. This article first appeared on the Chronicle of Higher Education's blog network on January 30, 2012. It was in my first year at Grand Valley State University, which explains some of the context.*

Since moving to west Michigan in July, my family and I have been living in an apartment while our house in Indiana sits on the market. This is the first time since 2001 that we’ve spent longer than six months in a rental property. Sunday morning, as we woke up to find that we’d been buried in snow overnight (as per usual in west Michigan), I realized that the home ownership habit runs pretty deep with me.

When I looked out the door and saw the image you see in the photo, I naturally grabbed the snow shovel, walked out the door, and started clearing off the walkway and the van. I got some curious looks from my neighbors, as if to say: *What are you doing? We are paying rent not to have to do stuff like this*. And it’s true: The apartment manager usually comes through shortly after a snowfall and clears off the walkways. *Usually*. But who knows? Maybe he won’t come today. And anyhow, even though I don’t technically own the apartment, I do have a sense of **ownership** about it, and it just seems the right thing to clear off not only my walkway but also my neighbor’s. Some of my neighbors, on the other hand, take the **renters’** approach and prefer to let the guy they’re paying do the job — whether or not it actually gets done.

The difference between an **ownership** and a **rental** mindset is one that we educators encounter all the time with our students. Students engage in one kind of mindset or the other in our classes. The rental mindset says, *I am paying the rent, and as long as I pay, I expect the management to take care of my needs*. The ownership mindset says, on the other hand, *I am invested in this, and although some things are not my responsibility (like plowing the city streets or running the fire department), I choose to take responsibility for myself because it matters to me*.

**We want students to own, not rent, their education**. Ultimately, which kind of mindset students adopt is a choice that only they can make. But while we can’t make students take ownership (just like you can’t make people move out of an apartment into a house), we *can* make the decision to choose ownership easier or harder through the choices *we*, as instructors, make when we design classes and learning experiences.

For example, rather than dictating what content students learn in a course, we can instead design courses around clearly-stated learning objectives and give students some latitude as to how they will show us they’ve mastered those objectives. For instance, in the computer workshop in Calculus 3 this past week, I wanted students to use *Mathematica* to investigate how parameter values affect the behavior of a curve in 3-space. I could have done this by giving students a single vector function with two parameters to study. But instead, I gave three options and had each working group choose one. By giving just a little bit of free choice, students gained a little bit of a stake in the process and thus a little bit of ownership. The object of the course is not to “cover material” — it’s to *meet learning objectives*, and by letting students choose how they want to do this, they are shoveling their own walkways in an educational sense.

We can also encourage ownership by moving away from instructional designs and methods that promote dependency on the “manager” for “services”. I’m thinking primarily of lecture. Lecture has its uses in certain cases, but it’s clear that a lecture-based course wants the learner-instructor relationship to be primarily one-way. The instructor is paid to produce, and the students consume — it’s a renter’s paradise. But it’s an unsustainable practice for students, who will all be moving into positions in life where *they* are supposed to be the producers. Some students come into a lecture course with an ownership mindset — like I came into my apartment — and so they naturally take ownership of their learning. But if the goal is to get *all* students incrementally closer to ownership, a model that disincentivizes ownership is not going to succeed. However, if we choose instructional models that promote student responsibility — like the inverted classroom, peer instruction, or project-based learning — even in small amounts, then we are moving in the right direction.

*The article below originally appeared in the* Faculty Focus *newsletter on March 25, 2019. Here it is in its entirety; at the end, I'll add some new information to this article that appeared in the webinar I did on this subject last week.*

In a flipped learning model of teaching, students get first contact with new ideas not during class time but in structured independent activities done prior to class time. This frees up class time to be used for more active work, digging more deeply into advanced ideas. This inversion of the use of time is a key difference between the flipped and traditional models of instruction—and when instructors flip, it brings up issues about time management for both instructors and students that require special attention.

Let’s take a look at three specific questions about time management that often arise when planning and teaching a flipped course.

**1. How can instructors best manage time when building a flipped course for the first time?** Preparing a flipped learning environment, especially if you’ve never done it before, is complex and time-consuming. It requires detailed enumeration of learning objectives, careful planning and sequencing of activities, and (especially) the creation of high-quality materials for students to learn with outside of class. Flipped learning doesn’t require the use of video, but many flipped learning instructors do use video, and this alone can be immensely time-consuming even if you are just curating existing content rather than creating it.

The best way to manage time in preparing for a flipped course is to make sure you start early, so you have plenty of time to manage. I recommend starting **one calendar year out** from the start of the class you intend to flip. Assuming that class is in the fall term, here’s how this might look:

- In the fall term one year before, focus on building good habits of active learning in the classes you teach. Start by using simple classroom activities and frequent low-stakes formative assessments to gauge student learning and guide instruction. Also try to completely flip a couple class meetings during that semester to get the flavor of how it works, and take notes and gather student feedback on how it went.
- In the spring term, keep doing all of the above, and flip the last one-third to one-half of the course. Building up to these flipped lessons by gradually releasing responsibility to students will ease the students into it.
- During the summer, go to work on the fully-flipped class: Decide on the learning objectives, sequence the activities for the class, and prepare the materials up to about one month into the term. This way you won’t be scrambling to finish the materials for week 2 during week 1.

By giving yourself breathing room and having a plan for building your learning environment, you’ll avoid stress and make the transition to flipped learning in a calm and orderly manner.

**2. How can instructors best manage time while teaching a flipped course?** What makes flipped learning truly effective is its focus on using class time almost exclusively for active work on advanced ideas. But a lot can go wrong during that time: What if students don’t come prepared? What if the in-class activity takes too long?

The key to managing flipped instructional time well and avoiding in-class disasters is in the *learning objectives* you set for your lesson. It’s well known that writing good learning objectives benefits students in a number of ways. Having good learning objectives anchors student activities and provides boundaries within which students can focus their efforts, and it makes the use of time in class more purposeful.

The above is true for any class, flipped or otherwise. But in a flipped setting, the learning objectives become even more useful when you divide them up. Once you’ve made your list of learning objectives, split the list into the objectives that students can learn in their pre-class activities (the “basic” objectives) and the ones they will learn during and after class (the “advanced” objectives). Basic objectives are those that live on the lower two levels of Bloom’s Taxonomy—recall of simple facts and explanation or categorization of basic concepts. In a flipped learning environment, students learn those objectives through their pre-class work and *not* during class time. Class time, instead, is focused on the middle third of Bloom’s Taxonomy—applying basic knowledge to new ideas and drawing connections among ideas. Class time is *not* to be used on re-teaching the basics, just like pre-class time is not to be used on advanced ideas.

This division of labor provides useful constraints on how time should be used both before and during class. By restricting pre-class work to just the basics, students are given work that they can be reasonably expected to complete, along with permission not to fully understand the advanced material yet. Likewise, by insisting that class time only be focused on higher-level concepts, time isn’t wasted on redoing something students did before class.

**3. How can we help students manage their time effectively while they are taking a flipped course?** Finally, we should always remember that the purpose of flipped learning is to improve student learning, and students are our partners in this process. But flipping a class changes the rules of engagement for learning, and students need our help in navigating this new environment. Here are two things we can do:

- *Make expectations clear.* Instructors can help by giving clear explanations of expectations for student work, the outcomes of that work, and why they are doing that work. Writing clear learning objectives and splitting that list into basic and advanced objectives as we described above is one way to do this; it tells students what they can expect to learn and when they can expect to learn it. We should also have conversations, early and often, in which we talk with students about what their roles are in a flipped environment and what our roles are, as well as the reasons why the class is flipped in the first place.
- *Make a calendar with suggested activities on each day.* Although students sometimes say flipped classes are too much work outside of class, a recent study indicates that students in flipped environments don’t work any more outside of class than students in a traditional environment. What matters is how that time is used, and students often need pointers on how best to use that time. In my classes, students get a biweekly calendar that gives recommended study plans for each day. Here’s an example from a recent class. This reinforces the expectations for out-of-class work while promoting healthy work-life balance (it includes scheduled breaks and arranges work so that students don’t need to work on the weekends).

Learning to manage time well itself takes time. But the payoffs go well beyond the classroom. Having a more organized, orderly course design helps students, particularly those who need help the most, and it makes the path clearer to a truly transformative learning experience.

Here are a couple of additional thoughts on this article that happened after I wrote it, especially during the webinar:

- David Allen makes this great point in his book *Making It All Work*: **there's really no such thing as time management**. We cannot "manage" time in the sense of creating it, destroying it, or making it run faster or slower. Instead **what we manage is ourselves,** and how we focus our attention on things that happen to us in the moment. So when we talk about *managing time* in any classroom situation, what we really mean is managing our plans.
- Someone asked on the discussion thread, *what do you have students do during the time in class?* I responded with an idea that came to me recently: Just as we can use Bloom's Taxonomy to map out good learning objectives, the Bloom pyramid is also a kind of roadmap for how to use pre-, in-, and post-class activities in a flipped environment. Namely: The pre-class work should focus on the bottom 1/3 of Bloom's Taxonomy ("Remember" and "Understand"), class time focuses on the middle 1/3 ("Apply" and "Analyze"), and post-class work on the top 1/3 ("Evaluate" and "Create"). In flipped learning, students encounter new ideas for the first time in their individual spaces before group activities, and that's best focused on the simplest tasks, hence the bottom 1/3. Then, crucially, we trust students to learn those things and then focus class time on building upon them — the middle 1/3. Then we address the highest-level tasks after class, since these often take up more time and space than 50-75 minutes allows. By dividing up work like this and hardening the edges on the kinds of work you will and won't address in each context – i.e. don't spend time in class on basics, and don't put lots of advanced tasks in the students' pre-class first-contact work – you end up with both focus and perspective in each phase. And the combination of focus and perspective (another David Allen/GTD idea) is the master key to anything we call "time management".

*Welcome to another installment of the 4+1 Interview, where I track down someone doing cool and interesting things in math, technology, or education and ask them four questions along with a special bonus question at the end. This time I caught up with Kate Owens, a professor in the Department of Mathematics at the College of Charleston. Kate is an innovative and effective teacher whose work with students is well worth paying attention to, and she's someone I've enjoyed interacting with for several years on Twitter and elsewhere.*

*You can find more 4+1 interviews here.*

**1. What's your origin story? That is, how did you get into mathematics, what led you to earn a Ph.D. in the subject, and what led you to the College of Charleston?**

As a kid, I was often bored in math class at school because I didn’t find it particularly challenging or engaging. My dad has a Ph.D. in mathematics and he was always happy to give me new mathematical ideas to think about. In seventh grade, we were supposed to design posters featuring our favorite number, and I picked 43,252,003,274,489,856,000 -- the number of permutations of the Rubik’s cube. I had no idea how to solve the cube, but I was really interested in things like combinatorics and math that wasn’t the “boring stuff” they were making me do in algebra class.

In high school, my plan was to study astrophysics or aerospace engineering. Inspired by images coming from the Hubble telescope, my dream job was to work for NASA. During my first few semesters of college, I was an astrophysics major. One day I realized that I was much happier in calculus than in physics; I spent most of my physics courses feeling confused. More than once I went to my calculus professor asking for physics insight. I got the sense that I spoke mathematician and not physicist, and I changed majors. Eventually I finished my degree in Pure Mathematics from U.C. San Diego. I decided to pursue graduate school in mathematics and I was accepted into the Mathematics Ph.D. program at the University of South Carolina. I finished my M.A. there in 2007 and completed my Ph.D. in 2009.

While in graduate school at the University of South Carolina, I fell in love with another graduate student. He finished his Ph.D. in 2007 and we married in 2009, right as I wrapped up my own dissertation. We spent a long time talking about how we could achieve both our family goals and our career goals, and eventually decided that we would follow his career path -- even if it meant giving up my own job search. My husband accepted a postdoc position in Texas; after a year, he transitioned to an industry job and we moved to Charleston, South Carolina. I had contacts from graduate school and spent a few years at the College of Charleston as a Visiting Assistant Professor before a permanent Instructor position became available. I’ve been teaching here since 2011.

**2. One of the innovations you've championed is the use of mastery-based grading. In your view, what is the purpose of mastery grading, and how well does it work with your students?**

Before I switched to mastery-based grading, I had concerns about how well grades were correlated with student learning. Grades, even those given on assignments early in the semester, always seemed like a final judgement since my students didn’t have a way to demonstrate growth in their understanding. Also, I realized that I couldn’t always diagnose knowledge gaps among my students; many students might earn the same grade on a test for very different reasons. After handing back their assignments, I wouldn’t know how to advise them on what topics they should review or how they could improve. I wanted my gradebook to reflect exactly what content a student knew at this particular time, instead of what percentage of topics they knew at some point in the past.

Now that I’ve switched to mastery-based grading, my gradebook reflects what each student presently knows and what topics they still need to work on. Additionally, it gives me an overview about what the entire class knows already, what they’re still struggling with, and what ideas are most appropriate for us to tackle together next.

The reasons I switched to mastery-based grading are still there, but the two big reasons I won’t switch back to traditional grading are something different. First, mastery-based grading has changed the kinds of conversations I have with students in a fundamental way. I no longer have conversations that begin with questions like, “Why did I get only 8 out of 13 points on this problem?” or “What percent do I need to make on Test 3 to have an average of 88% in the class?” Instead, conversations more often begin with things like, “I don’t know how a quadratic equation can tell me if its parabola has *x*-intercepts or not, can you help?” Students are able to track what they’ve mastered and what they haven’t. Second, my system allows for students to improve old scores, so students are incentivized to learn old material that they didn’t quite get the first time. I believe in the importance of having a growth mindset. Mastery-based grading is built on the belief that grades should reflect demonstrated knowledge and that providing many opportunities for the demonstration of newly gained knowledge is important.

**3. College of Charleston is one of the oldest higher education institutions in the United States, founded in 1770. Have you perceived any tension between the history and tradition of the institution on the one hand, and your innovation in the classroom on the other? (If so, how do you make innovation work for you? If not, how does the culture of CofC support both tradition and innovation?)**

You’re right -- the College of Charleston is the 13th-oldest institution of higher education in the United States. We are a public, liberal arts college with an undergraduate enrollment around 11,000. The Math Department has over 30 faculty members, whose research areas encompass algebra, numerical methods, logic, number theory, statistics, and more. I believe that our differences in background, research, and instructional methods give us strength as a department. Since CofC is a small liberal arts college, a lot of our mission is about delivering quality undergraduate instruction. Although each faculty member makes different choices in their courses, we have a supportive Department that allows each of us to make our own academic judgments about our courses.

In the Math Department, I’ve helped pilot a program turning traditional, lecture-based college algebra courses into emporium-style classes. In our program, students work only on topics they haven’t yet mastered, and they have the opportunity to get one-on-one help on a daily basis. Over the last several years, our data have shown that students are more successful in these college algebra courses than in the traditional format, both in terms of course grades and their raw scores on our department-wide final exam for the course. We are now researching longer-term trends of a student’s path through several linked courses (college algebra -> precalculus -> calculus I -> …), and we hope to find ways to raise student success through this course sequence. I’m also piloting an emporium-style approach in precalculus and gathering data about how it’s impacting students.

Outside of the Math Department, one way that CofC supports innovation is through our “Teaching and Learning Team (TLT) for Holistic Development” division. Part of TLT’s mission is to provide support and professional development for faculty interested in cultivating a culture of innovation on campus and in their courses. More than once, I have participated in Professional Learning Clubs about mastery-based grading. They were both a reading group -- we read Linda Nilson’s book *Specifications Grading: Restoring Rigor, Motivating Students, and Saving Faculty Time* -- and a support group, offering each other instructional ideas about ways to implement mastery-based tasks or non-traditional grading schemes into our courses. I’ve also been a panelist talking about mastery-based grading at TLTCon, CofC’s “Teaching, Learning, and Technology Conference.” There are several faculty members here at CofC who are using non-traditional grading schemes, and I hope our group will continue to grow.

**4. What's something with your teaching and your students right now that you are excited about?**

Our semester is almost over here at CofC. Our last day of classes is April 23 -- only a couple of weeks from now! On the last day of my precalculus course, we have what I call a “Re-Assessment Carnival.” On this day, each student may choose to re-try as many problems as they can complete in the 50-minute class. This gives them a last opportunity to demonstrate knowledge of our course standards before the final examination. It’s an exciting thing to watch: Students are *thrilled* that they’re allowed to take an extra 6 quizzes. From my viewpoint as the instructor, I am thrilled to give out high-fives as they finally master those tricky problems we’ve seen all semester. Mastery-based grading means students can’t get by relying on partial credit, and so they really have to re-visit the tricky topics several times -- but it’s a really great moment when students realize everything has finally clicked.

**+1: What question should I have asked you in this interview?**

What are some projects you’re involved in outside of the classroom?

- I’m very involved with our “Science and Mathematics for Teachers (SMFT)” Master of Education (M.Ed.) program. This is an interdisciplinary program designed for in-service middle school and high school teachers. At the end of this semester, two of our students will present their Capstone Projects and officially complete their degrees. I’m excited to see how their projects turn out and how what they’ve learned will impact their classrooms and students.

- Although most of my time is spent on teaching-related tasks, one of the best parts of my job is when I get to be a learner instead of an instructor. Graduate student Colin Alstad is defending his masters thesis (“Categorifications of Dihedral Groups”) later this month. Serving on Colin’s thesis committee has given me a great excuse to keep learning more math -- in this case, some category theory.
- Since 2015 I’ve been the co-Director for the College of Charleston’s “Math Meet,” an annual event held each February. The Math Meet attracts hundreds of students from the region -- this year was the 41st annual Math Meet and we hosted over 450 middle school and high school students from South Carolina, North Carolina, and Georgia. In one day, we offer more than a dozen different events, including three levels of a Written Test, a Math Team Elimination, a Math Team Relay, several Math Timed Sprints, a Physics Brainstorm, a Chemistry Brainstorm, and a trophy presentation in the afternoon. While it seems like the 2019 Math Meet just wrapped up, we have already begun planning for Math Meet 2020.
- Lastly, I’m a parent of three fantastic kids (ages 8, 5, and 3), so I spend a lot of time juggling work-related tasks with gymnastics practice, soccer games, swim lessons, playing outside, laundry, etc. I’m excited for the summer months since it means I’ll have more time to spend with my family. In particular, it’ll mean more time to share some mathematics with my 8-year-old son -- he has decided he wants to become a mathematician when he grows up!

I've been continuing to think about the role of habits in teaching, as I first wrote about here. In that post, I wrote that:

- Habits are the building blocks of behavior, or as James Clear put it in his book *Atomic Habits*, our behaviors can be seen as lagging indicators of our habits.
- Teaching is a kind of behavior, so we should be able to trace our teaching practices --- both the good ones and the not-so-good ones --- back to the habits that we've adopted.
- In fact it seems to be the case that many, if not most college faculty teach purely motivated by their ingrained habits, rather than careful and scholarly reflection on different methods and their pros and cons. (To be fair, this is probably all faculty some of the time, and some faculty all the time.)
- But, this means that insofar as our teaching may reflect bad or lazy habits and insofar as we need to adopt better ones, there are behavioral tricks we can perform that can help us get there. I highlighted Clear's formula of *When I do x, then I will do y* for creating habits and gave some examples.

This connection between habits and teaching seems quite rich in terms of explanatory power for how, practically speaking, we might promote better teaching in the college ranks. Here are a couple of additional thoughts on that (which were going around in my head when I wrote the first post, but it was already long enough).

First, **thinking about teaching as an expression of our habits explains a lot about the negativity and pushback we often see from faculty when we advocate for better teaching**. When you ask someone to improve their teaching, even indirectly (for example by bringing up this PNAS meta-analysis of active learning studies), you are often asking them to break old, unproductive habits and replace them with better ones. And as anyone who's a friend of someone stuck in a bad habit and who needs to change --- or anyone who's the parent of a teenager --- can tell you, getting a person entrenched in a bad habit to just *listen* and think about changing is serious work that leads through a morass of denial, defensiveness, and outright hostility, and possibly goes nowhere in the end.

Consider for example those "in defense of the lecture" articles that pop up in the media at regular intervals. They are almost always written by faculty who are entrenched in a mode of teaching Derek Bruff has called "continuous exposition" --- i.e. all lecture all the time --- and are either in denial of the research on active learning or completely ignorant of it, and the essays themselves tend strongly toward being defensive, closed-minded, and self-centered. They focus on what "works" *for the faculty member* but give almost no thought to the needs of others, particularly diverse populations of students who are in these lectures. Quite often there are no real arguments in these pieces at all, but when there are, they tend to be preposterous --- for example this old Atlantic piece I wrote about where the argument was that research shows active learning is better for students than lecture but that's only because we haven't gotten enough training on effective public speaking.

Compare this kind of behavior with that of a person who's entrenched in a habit that's objectively bad, like smoking, biting one's nails, or not exercising. Defensiveness? *Check*. Denial? *Check*. Focus on the self rather than others? *Check*. Responding to interventions with illogical or downright crazy-making "arguments"? *Check*. So, I don't think the authors of these essays are necessarily being obtuse on purpose. I think they're reacting at a visceral, even unconscious level at having to confront the inadequacy of their existing habits and what it will take to change.

So when we advocate for improved teaching practice, we'd do well to remember that while some will respond to this call with enthusiasm, others will burrow in and take a defensive posture, and we have to deal with such people as we would with anybody who's decided to double down on a bad but comfortable and familiar habit rather than accept the challenge to change --- with caution, patience, care, and a focus on small wins, while at the same time giving up no ground on what the research says works best for students.

That gets me to my other thought on this issue, and that's **the role of habit formation when we talk about faculty development**. As most readers know, I go around a few times a year and give faculty workshops on topics like flipped learning and teaching with technology. I used to focus these workshops (if you can call it that) on deep dives into big-picture items --- getting faculty to be emergent experts on flipped learning by giving extensive looks at the history, research, and practice of that subject. I've realized this isn't an effective approach because it rarely seemed to result in real behavioral change among the participants. The faculty *liked* these workshops and were enthusiastic about them, but I had only scattered evidence that the faculty in attendance actually adopted different/better teaching practices as a result.

I think that's because the big-picture approach is just too much to translate into concrete action for the average faculty member. So these days, my approach has been to keep things minimal. I talk about the history and basic practice of flipped learning and some of the research behind it, but **only enough to support the formation of basic habits** that will support flipped learning later on --- and focus the workshop on building those habits. That's the essence of my One Year Plan. You don't immediately go and try to flip a class once you hear about flipped learning. You instead focus the first semester on building in habits of active learning and get those firmly ingrained, and *then* you think about redesigning a class.

I've only taken this approach with the most recent few workshops I've done, but I think it has a much bigger impact because it acknowledges the Pareto principle --- that 80% of our results come from 20% of our actions --- and focuses on that crucial 20% instead of a big picture that's too big for most faculty to handle. The results have been pretty promising. Although we cover less in the workshops, we get more done and what we get done has more staying power. (Cue the comparisons with teaching.)

I wonder if faculty development everywhere wouldn't benefit from the same approach: present a big idea to work toward --- active learning, peer instruction, inclusive teaching practices, whatever --- for the *first* 1/5 or 1/4 of a workshop, then spend the remaining 3/4 to 4/5 of the event identifying and strategizing how to build daily habits of mind and practice that will eventually lead to the big results. You could even have faculty track this kind of thing using habit tracker apps or some kind of bullet journal setup, and use the results to quantify (or at least visualize) their progress. Maybe some faculty give up on developing as professionals not because they feel their current practices can't be improved upon, but because they are quietly ashamed of the daily failures and setbacks that come with adopting new and better habits.
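
The tracking itself doesn't need a fancy app; a log of dates and a streak count is enough to make progress (and setbacks) visible. Here's a toy sketch --- the habit name and dates are invented for illustration:

```python
from datetime import date, timedelta

# Hypothetical log: the dates on which the new habit (say, "built an
# active-learning break into today's class") was actually kept.
log = {date(2019, 4, 1) + timedelta(days=d) for d in (0, 1, 2, 4, 5, 6, 7)}

def current_streak(log, today):
    """Count consecutive kept days ending at `today`."""
    streak = 0
    while today - timedelta(days=streak) in log:
        streak += 1
    return streak

print(current_streak(log, date(2019, 4, 8)))  # 4: the miss on April 4 reset it
```

The point of tracking is data, not shame: a reset streak is just information about where the habit needs shoring up.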

It seems like that kind of approach might work especially well with training graduate students (whom we in the faculty ranks do a notoriously bad job of preparing for teaching) and new faculty. Being in a new work context, they're in a perfect position to start fresh with their habits; and habits --- being at the atomic scale --- are simple to think about and measure, and therefore easy to talk about with faculty mentors or center-for-teaching staff. And again, it's satisfying to see behaviors change and improve over time through the adoption of better habits. In fact, the behavior of cultivating good habits is itself a habit ("When I hear about some new teaching approach that seems good, I will set up the habits needed to learn it"), and faculty at the start of their careers are well-situated to immerse themselves in it.

Any further thoughts on this? Leave them in the comments or on social media.

This week in *Forbes* magazine, Tom VanderArk renewed an ongoing and often intense debate about the role of second-year algebra in math curricula in high schools and colleges. It's far from the first time that people have questioned whether Algebra 2 should be required of all students in a school district or college setting. There was pushback in 2013 from the state of Florida and more recently in California. In fact right around 2012-2013, when the Common Core State Standards were emerging (and largely because of those standards, which mandated some items from Algebra 2), resistance to "algebra 2 for all" also started emerging both in these specific cases and in more general op-eds, like this article in Slate and this one in the New York Times, and the issue doesn't seem to be losing any steam.

I think this is an important issue because it addresses a fundamental question about math education in this country, namely: *What will the quantitative education of school kids and college students consist of?* In that sense, the question of Algebra 2 morphs into a battle for the very definition and purpose of mathematics itself. No wonder it draws out strong responses from all sides. Because this is such a vital issue, I wanted to write this post, in which I will probably make everyone who reads it at least a little upset: while I agree with much of what Tom wrote, there are significant issues in his article, and more broadly I think all arguments about this issue are starting to veer dangerously away from the people who can contribute the most: **math educators**. So in the spirit of an *amicus* brief, I want here to affirm some things and call other things out, to try to guide this debate to a more productive place.

The *Forbes* article in a nutshell says the following:

- Algebra 2 is currently a gatekeeper course, used as a requirement for graduation in many school districts and colleges.
- Its role as gatekeeper is a holdover from a bygone time, starting with concerns raised by "A Nation at Risk" that later morphed into the Common Core standards. But whatever needs once justified requiring Algebra 2 are today far less valid, if not completely irrelevant.
- In fact there is reason to believe that Algebra 2 perpetuates inequities among some student demographics, and anyway the "vast majority" (Tom's words) of people don't actually use the rote symbolic manipulation skills of Algebra 2.
- So instead, students should study coding and *computational thinking*. And there are some emerging examples in the real educational world where this is happening.

Before I get into the specifics, just a note: I met Tom VanderArk at the Steelcase Active Learning Symposium last fall, and I was really impressed by him. He's a smart guy with interesting ideas about the role of technology in our lives, including education, and I learned a lot from his keynote address. I was also grateful he took some time out to talk with me about my sabbatical at Steelcase, and he was gracious and generous, despite dealing with the usual last-minute chaos of getting ready for a big talk.

So as I point out some issues in the article, this isn't personal. In fact I agree with many of his points. But if we are going to try to make a push for better mathematical education for young kids and college students today, we have to get the details right. Passion isn't enough. So I'll first point out the issues, then talk about what I agree with and where we should go from here.

In my view there are three major issues with the *Forbes* article.

**Issue #1: Defining one's terms and using facts.** The article leads off with an unfortunate quote from venture capitalist Ted Dintersmith:

The tragedy of high school math. Less than 20% of adults ever use algebra. No adult in America still does integrals and derivatives by hand - the calculus that blocks so many from career paths. It remains in the curriculum because it’s easy to test, not important to learn. https://t.co/DS52yevTbR

— ted dintersmith (@dintersmith) January 20, 2018

"Less than 20% of adults ever use algebra" is not a fact. It is a made-up figure without supporting evidence. (If there *is* research out there that supports this number, I am more than happy to be corrected.) Insofar as that "20%" number *does* have any basis, it's probably in Dintersmith's personal experience, so it's more likely to represent confirmation bias than anything else, a reflection of the people he's around rather than "adults" in general. If we're going to make a compelling argument, we can't lead off with opinions disguised as facts.

Of greater concern to me is the use of the term *computational thinking*. I have written a lot about computational thinking in the past (here and here for starters), but despite this and how much I've worked computational thinking into my teaching, the fact is that what we actually mean by this term is very fuzzy today. Jeannette Wing's paper is the usual go-to reference for computational thinking basics, and various folks (including Google) have built on this. But as Lorena Barba --- an honest-to-goodness computational scientist --- has pointed out, many of these recent formulations have drifted away from the original intention of "computational thinking" which is due to Seymour Papert. She writes:

The contemporary message takes an orthogonal direction to Papert, ignoring his Power Principle. It has happened to many powerful ideas when taught in schools (like probability) that they become disempowered: “reduced to shallow manipulation that seldom connects to anything the student experiences as important.” [...] Combing through dozens of articles on computational thinking, the emphasis is on problem-solving (low on the problem–project dimension), understanding (low on power), an operational perspective (low on the object–operation dimension), and content priming over media.

The "Power Principle", deriving from a 1996 paper by Papert, is in her words: "What comes first, ‘using’ or ‘understanding’? The natural mode of learning is to first *use*, leading slowly to understanding. New ideas are a source of power to do something."

When the *Forbes* article speaks about "computational thinking", what does it have in mind? Is it Papert's idea of "progressively deepening understanding" --- and if it is, what exactly do we want students to understand? Or is it just teaching kids how to make animated cats in Scratch? Or something in between? VanderArk writes:

That starts with problem finding--spotting big tough problems worth working on. Next comes understanding the problems and valuables [sic] associated--that’s algebraic reasoning. But rather than focusing on computation (including factoring those nasty polynomials), students should be building data sets and using computers to do what they’re good at--calculations. [...] A little coding can be useful to set up big tough problems. A basic coding class or two can be helpful in this regard. The new approach, exhibited at Olin College and signaled by the launch of the Schwartzman school at MIT, is just-in-time coding, a computational resource available across the curriculum--learn the right coding to apply the right tools at the right time to solve the right problem.

To me, this feels poorly-defined and impoverished. Solving important, relevant problems is of course very important and criminally underused in many math classes today. But there is a *lot* more to learning mathematics than just having the ability to pick the "right tool at the right time to solve the right problem". Ironically, being able to choose the right tool at the right time for the right problem is exactly the same goal that some teachers have for learning the very rote symbolic manipulation that this article denounces. (Think "techniques of integration" in Calculus 2.) Do we really want a "computational thinking" that merely swaps one form of rote symbolic manipulation for another? Where's the four-part foundation of decomposition, pattern recognition, abstraction, and algorithm design in other formulations of this idea? And also, where's the *beauty* that is threaded throughout mathematics? Does everything have to be *useful* in order to justify keeping it around? What a dull world that would be.

The article mentions Olin College and the MIT Schwarzman College of Computing. This website at Olin gives a little more insight on what "computational thinking" might look like. But the Schwarzman College example seems hardly informative. Long story short, MIT is spending $1 billion to create a new college which will "lead the world in preparing for the rapid evolution of computing and AI." Fifty (!) new faculty positions will be created and it promises to "educate students in every discipline to responsibly use and develop AI and computing technologies to help make a better world". But how this will be done, and whether it's an externally valid approach to teaching "computational thinking" (whatever this means), is unclear. For example, those faculty --- will they be actually teaching, or just doing more research? And students --- what will they be actually learning, when will they learn it, and how will they learn it? A $1 billion price tag doesn't give the answer, and throwing that much money into an already-rich institution for unclear learning outcomes doesn't strike me as a decisive blow toward making math education better in this country.

*To sum up*: If we are going to make a convincing case for a better approach to math education, it has to start with facts, clear definitions, and transparent strategies and learning outcomes. That gets me to the next issue:

**Issue #2: Conflating "Algebra 2" with symbol manipulation.** What exactly *is* Algebra 2? Usually it is a big course with a lot going on inside it. The most memorable (for better or worse, mostly worse) thing that happens in Algebra 2 is a lot of rote symbolic manipulation, like factoring cubic polynomials or other actions which can, I will readily agree, be charitably categorized as party tricks (for very special kinds of parties) and nothing more. In the article, and in many fellow-travelling articles, Algebra 2 is equated with symbol manipulation.

The problem is that this is too simple. Take a look around at people's Algebra 2 syllabi, and you will find Algebra 2 tends to be a mix of topics, some of which are neither clearly interesting nor especially useful (e.g. factoring cubics), along with others that many Algebra 2 detractors would say are vitally important for people to know today, such as:

- Probability and statistics
- Basic function types (linear, power, exponential, etc.) and transformations of, and modeling with those basic types
- Linear programming and linear systems
- Matrices
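
To make the last two bullets concrete: linear programming is one of the few Algebra 2 topics that working adults meet almost unchanged, usually with a computer doing the arithmetic. Here's a quick sketch using SciPy --- the numbers are a made-up, textbook-style production problem, not anything from the article:

```python
from scipy.optimize import linprog

# Made-up problem: maximize profit 3x + 5y subject to resource limits.
# linprog minimizes, so we negate the objective coefficients.
c = [-3, -5]
A_ub = [[1, 0],   # x       <= 4
        [0, 2],   # 2y      <= 12
        [3, 2]]   # 3x + 2y <= 18
b_ub = [4, 12, 18]
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)  # the optimal plan and the profit it earns
```

Notice that setting up `A_ub` correctly *is* the algebra: the matrix form in the last bullet above is literally how the solver wants the problem stated.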

If you get rid of Algebra 2 to remove the burden of symbolic manipulation skill, you also get rid of the above ideas, unless you create something else in its place. But I don't see these ideas coming up in what's called "computational thinking" here, which, remember, is characterized as "understanding problems and valuables [*sic*] associated" along with "just in time coding". How is a student supposed to "understand" the structure of a problem without some formal study, at least self-study, of these Algebra 2 topics?

The quote from Ted Dintersmith at the beginning of the article said that very few people "use algebra", but what does "using algebra" mean? If I think critically through a complex problem in life, am I "using algebra"? If I am looking at a data set and trying to decide what kind of regression model to use --- linear vs. power vs. exponential etc. --- am I "using algebra"? Because I learned to think critically through complex problems and learned about function types in algebra.
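
That regression question, by the way, is a perfectly concrete instance of "using algebra". Here's a minimal sketch --- the data and numbers are invented for illustration --- of choosing among linear, power, and exponential models by fitting each and comparing goodness of fit, a task that presumes exactly the function-type literacy Algebra 2 teaches:

```python
import numpy as np

# Invented data for illustration: measurements that actually grow
# exponentially, though we pretend not to know that yet.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 3.0 * np.exp(0.5 * x)

def r_squared(y, y_hat):
    """Coefficient of determination, computed on the original scale."""
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot

# Linear model y = a*x + b, fit directly.
a, b = np.polyfit(x, y, 1)
r2_linear = r_squared(y, a * x + b)

# Power model y = c*x^p, fit as a line in log-log coordinates.
p, log_c = np.polyfit(np.log(x), np.log(y), 1)
r2_power = r_squared(y, np.exp(log_c) * x ** p)

# Exponential model y = c*e^(k*x), fit as a line in semi-log coordinates.
k, log_c2 = np.polyfit(x, np.log(y), 1)
r2_exp = r_squared(y, np.exp(log_c2) * np.exp(k * x))

best = max([("linear", r2_linear), ("power", r2_power),
            ("exponential", r2_exp)], key=lambda pair: pair[1])
print(best[0])  # the exponential model fits this data best
```

Knowing *why* the log-transform tricks work --- that logarithms turn power and exponential relationships into linear ones --- is Algebra 2 content, whoever ends up typing the code.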

To sum up: Algebra 2 is not just one thing, or as the article puts it, "a course on regurgitated symbol manipulation (Algebra 2)". It *can* be just mindless symbol manipulation, but it often isn't, because there are teachers out there who have the vision to take it to another level.

And that gets me to the third and perhaps most important issue:

**Issue #3: The lack of math educators' voices.** I haven't read every single article about Algebra 2 in the news, but in the ones I have reviewed for this post, there is a distinct and alarming absence of the opinions and voices of *actual math educators* in these public debates --- i.e. people who are now and have been in the trenches of education, working with students and dealing with the on-the-ground issues of teaching mathematics. The closest we get to talking to an actual mathematics teacher in the Forbes article is a quote from Dan Meyer (which was misattributed before it was fixed); venture capitalist Ted Dintersmith, who has to the best of my knowledge never set foot in a math classroom as a teacher, is quoted twice as much as Dan in the article. The main character in the Slate article is a professor emeritus of political science with zero math teaching experience. And so on.

This doesn't mean that Tom VanderArk or Ted Dintersmith have nothing useful to say about math education. (Far from it; keep reading.) But it does signal a frustrating and dangerous trend where the opinions of those who know best about the nuances of mathematics education in this country and who work daily on the human level to work those nuances out, are being ignored in favor of the opinions of venture capitalists, Silicon Valley types, and sometimes even celebrities.

To sum this up: If you want to make a compelling case for improving math education in this country, *talk to math teachers* and see what they are doing, what's working, and what's true. Linking to news articles about schools doesn't count. Get in there, up close with teachers and let them inform you.

And now that I've called out numerous aspects of the *Forbes* article, let me express how much I agree with its basic message. (I told you, this post will make *everyone* at least a *little* mad.)

Algebra 2 enjoys a privilege in our school and sometimes college curricula that it didn't earn and largely doesn't deserve. While it can contain some ideas that are vitally important (see above for a list), and while you can make an argument that the symbolic manipulation portions of the course are good for building critical thinking skills, or willpower or character or grit or whatever you want to call it, the fact remains that in many people's experiences, Algebra 2 is all about rote symbolic manipulations, and whatever else of value may be in the course is drowned out by the sheer pointlessness of having to factor cubics, simplify $(x^4)^{100} (x^6)^{1/2}$, and so on.
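
(For anyone whose Algebra 2 is rusty, that second exercise collapses under two applications of the power-of-a-power rule, taking $x \ge 0$ so the fractional exponent behaves:

```latex
(x^4)^{100}\,(x^6)^{1/2} \;=\; x^{4 \cdot 100}\, x^{6 \cdot \frac{1}{2}} \;=\; x^{400}\, x^{3} \;=\; x^{403}, \qquad x \ge 0.
```

Which is exactly the kind of manipulation that's easy to test and hard to love.)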

Algebra 2 enjoys this privilege because Calculus enjoys a similar state of unearned privilege, and in many cases Algebra 2 is a requirement because Calculus is seen as the apotheosis of the high school math curriculum; it's also usually the core of the college math major, although courses like linear algebra are probably more important today. David Bressoud among others has written powerfully about the drive to calculus that has reached a fever pitch at the high school level and consequently put college calculus at a crisis point. Calculus has a central position that is a holdover from the Space Race days; we math teachers tend to approach calculus primarily as symbol manipulation; neither of these two defaults are often questioned, and since Algebra 2 is considered the place where kids are supposed to learn the basic symbol-pushing skills "needed" for calculus, guess what Algebra 2 typically looks like.

But, there's hope: Bressoud also has been involved in changing calculus at his home institution of Macalester College so it's now a modeling-centered course with an increased focus on conceptual understanding and applications and a reduced emphasis on symbolic manipulation. It's possible, in other words, to dial back the rote mechanics of Calculus, reframe the subject as a *way of understanding the world* through modeling in context, and still learn all the things we say that symbolic skills teach --- critical thinking, attention to detail, grit, whatever.

If it can be done with Calculus, it can also be done with Algebra 2.

It's also very important to note Tom's reference to Conrad Wolfram and the injunction to "teach as if computers existed". I actually wrote about this over eight years ago, and reprised the idea in this post which eventually turned into a TEDx talk last year. No biology teacher would dream of teaching that subject without a microscope out of the belief that students need to learn critical thinking by doing observations without technology. Similarly, no astronomer would teach her courses without a telescope. And yet when it comes to the computer (and particularly tools like SageMath, Python, and Jupyter notebooks), which is a basic tool for exploration and analysis in the study of mathematics --- the microscope/telescope of the math world --- we fear its use and worry that if we teach with it, if we teach students how to use the tool like a professional, they will somehow be worse off. This is crazy and unprofessional and possibly unethical, and we need to turn it around and be braver and more creative than that.
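
To make the microscope metaphor concrete, here is the sort of thing I have in mind --- a sketch of my own, not an example from Wolfram's talk --- using SymPy in a Python session or Jupyter notebook:

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 6*x**2 + 11*x - 6

# The machine handles the rote symbol-pushing: this cubic factors as
# the product of (x - 1), (x - 2), and (x - 3), no guess-and-check needed.
factored = sp.factor(f)

# ...which frees attention for the interesting question: where does the
# function change direction? Solve f'(x) = 0 symbolically.
critical_points = sp.solve(sp.diff(f, x), x)
print(factored, critical_points)
```

The point isn't that students never learn to factor; it's that the tool, like a microscope, lets them examine far more examples than hand computation ever would.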

The way forward from here on this debate can proceed like this:

- Let's be clear about what Algebra 2 is and is not, and also be clear about what we would like to replace it with, if anything --- or what parts of Algebra 2 should be phased out and what parts should remain.
- Let's focus our decisions based on sound research, enriched by the lived professional experiences of math educators who are getting it done with real students every day. Bring others into the mix in this debate but resist the temptation to be over-awed by rich people and thought leaders.
- Let's avoid buzzwords like the plague and insist on concepts that mean something.
- Let's explore how other people have solved this problem. At my university, for example, students have to complete a quantitative reasoning unit to graduate; College Algebra (basically Algebra 2) is *one* way to satisfy it, but there are *eleven* other courses that do, including courses in logic or computer cartography. So perhaps there's a middle ground between forcing all students to take Algebra 2 and forcing them to learn to code.
- Finally, let's try to get *students* a little more involved. Missing from much of this debate is a diverse collection of student voices, some of whom might want math completely gone while others find something meaningful in that which we want to discard. Because if you center the debate on "What serves students the most?" then you will never go *too* wrong.

After doing speaking engagements and workshops for a few years now, I've decided that there is really only one criterion for deciding if what I'm doing with faculty is successful: **Behavior change**. If I fly into a campus and give a workshop that generates tons of enthusiasm, but then all the faculty's teaching goes back to the *status quo ante* as soon as I board the plane back home, it doesn't matter how well-received it was: I failed. I merely perturbed the system; I didn't actually change it. On the other hand, when I get an email from a workshop participant 2-3 months later who has been trying out the ideas of the workshop in their teaching and is curious about or stuck on some particular point, I know that the workshop was a success, because someone's behavior has changed, and that will start a cascade of other changes that will make higher education better.

If individual behavioral change is the building block, then individual **habits** are the atoms that those blocks are made out of. In fact, I recently finished a book with this very idea in its title: *Atomic Habits* by James Clear. Clear's idea in the book is that behavioral change begins with small habit changes that lead to outsized results. Or as he puts it, our behaviors are lagging indicators of our habits. So if we want to improve teaching and learning in higher education, let's start with the habits of teaching that higher education instructors have. How do we do that? Clear, in his book, lays out some helpful frameworks for thinking about habits and how to alter them.

First, habits follow a consistent cycle of **cue -> craving -> response -> reward**. Some kind of cue sets off a craving; we respond to that craving with a behavior that will change our state to one of satisfaction; and the reward is that satisfaction. Habits are very efficient this way; Clear points out that we wouldn't have evolved the capacity for habit formation in the first place if habits weren't exceptionally effective at solving certain kinds of problems. But thankfully, that cycle admits the possibility of a hack. Clear suggests a formula for changing up how we respond to a craving, namely the simple sentence: **When I do x, I will do y.** To change a habit, or start one, identify the relevant cue, the response you *wish* to have, and write it down, like a contract. Then follow that rule until it sticks better than the one you replaced.

I think a lot of teaching at the college level today is done not because professors have looked at a variety of techniques, tried some out, and then made a rational and student-centered decision about it --- but merely because professors get stuck in habits. We walk into a classroom, or start preparing a lesson; that cue makes us crave the path of least resistance because we're overworked or have many other responsibilities; we then teach as we were taught. These habits are so strong that it takes purposeful effort to replace them with something better. This is why in my One Year Plan for flipping a classroom, Step 1 is to start a year in advance simply replacing unproductive habits of teaching with ones that better support active learning. What are those? Here are three suggestions that I think get us in the right direction.

In the literature review I wrote with Anat Mor-Avi on active learning classrooms, one of the things we discovered was the power that the physical layout of a space can exert over teaching and learning. When you walk into a classroom arranged in traditional row-by-column seating all focused forward on a large space only inhabited by an instructor, it creates an expectation of what's going to happen in that space: Students will sit and face forward while the instructor lectures. Anything else, while possible, is not the default and therefore produces a sense of wrongness.

So the first habit instructors can get into is simply **changing the space around to support active learning**. In our formula above:

> When I enter my classroom, I will rearrange the furniture to get students into small groups.

This is often a really easy habit to enact. We don't have active learning classrooms in my building, but the furniture in the room I usually use consists of fairly lightweight chairs and rectangular tables; the tables can be put side-by-side to form a square. A couple of years ago I started off the first day of classes by having students rearrange these tables into eight squares with four students at each. The resulting configuration took about 30 seconds to attain (when we all pitched in) and made active learning a lot easier and more effective, because students could communicate with each other better and because I could reach students without having to squeeze down an aisle or row.

If the professor who uses the classroom after you doesn't like this kind of configuration, just end the class a minute early and move things back. If you teach in a room where stuff doesn't move, you can instead have a plan for getting students into groups by zones or proximity.

For those whose main teaching technique is lecture, it might feel like I and others just want you to jettison the lecture and go all-in on active learning right now. I actually recommend *against* that. The evidence is clear that active learning helps students, but I think the best argument for active learning is its own success, which you can realize through small, intentional changes that are manageable from where you currently are.

Jim Lang's book *Small Teaching* is the pre-eminent source for these simple, inexpensive modifications to traditional pedagogy; this handout and chart from the University of Michigan CRLT are handy as well. Whatever approach you choose to take for "small active learning", it takes intention to build it in. So, using our habit formula:

> When I plan out my lecture, I will build in breaks every 15 minutes for active learning, and also plan out the activities.

That 15-minute figure is arbitrary, so change it if needed. But, intentionally stop lecturing at reasonable intervals to let students *do something* with what you've lectured about. Don't just stick with the habit of writing a lecture outline and that's that; break the habit up by breaking your classroom time up. Also, give the activity the same level of planning (if not more) as the lecture, instead of just telling students "Break up into groups and discuss".

Another habit to go along with this is:

> When I start my lecture in class, I will also start a timer for 15 minutes; and when the timer goes off, I will transition to the activity I planned.

Because it's one thing to *plan* an activity and another to discipline yourself to actually *do* it. It's too easy to just decide in the moment to keep lecturing and skip the activity. Make yourself stop.

One great way to implement small active learning in your class is through frequent, low-stakes quizzing, either on paper or electronically. The benefits of frequent low-stakes assessment are becoming clearer every day: it provides meaningful and effective practice for students, and just as importantly, it provides data for you on how well your teaching is going. And that leads me to the final suggestion:

> After class is over, I will spend 15 minutes as soon as is practical to review the results of students' active work, reflect on the results, and decide what to do about those results.

In other words, make a habit not to rely on felt measures of enthusiasm but on what students actually know or don't know based on their work --- and then make real plans that affect your future instruction based on the results.

For example, let's say I teach a 50-minute calculus class that consists of 15 minutes of lecture, followed by a 5-minute ungraded quiz over the derivative rules of the day, followed by that same 15+5 minute block followed by a 5-minute exit ticket activity. (I know that doesn't add up to 50 minutes; I have slack time built in.) Immediately following the class --- or in the next available 15-minute block --- I am going to sit down, look through the results, and see how things went. (In some cases you might even be able to stop in the middle of a class and evaluate these data, for example if you do the low-stakes quizzing using clickers.) If students did well on the quizzes and seemed to articulate the concepts well at the end, I know my lectures were OK. If most students did horribly on those quizzes, then I know I need to do something about it --- change the lecture schedule, build in a practice day... *something* besides just moving on to the next lecture.

This is teaching like a scholar because we are using *evidence* to guide what we do, not feelings or faith. Feelings and faith have their own place in teaching, but they are horrible metrics for teaching effectiveness.

By focusing on building habits that line up with the identity we want to have as teachers --- and hopefully that means we want to be teachers who care about students and act accordingly --- I think meaningful change can come to higher education one faculty member at a time.

*Any other ideas for habits to form? Leave them in the comments.*

One of the most complex issues in teaching mathematics is how to handle examples. On the one hand, examples are important in mathematics because their construction is usually how we make sense of abstract ideas. On the other hand, students can get the wrong idea about examples. They can think that just by seeing enough examples done by a teacher, they will gain understanding of the subject; or that course assessments will be about completing examples just like the teacher did, and so they focus on replicating the teacher's examples instead of learning from them.

Last Fall semester I really felt this struggle as I taught my Modern Algebra 1 class. It's an upper-level course focused on number theory, ring theory, and fields. So there's a lot of abstraction, and the only way to truly grasp the subject is to work with a *lot* of examples. Where I fell short in this class, and what I learned from the experience, is something obvious: *When I'm the one doing all the examples, the students aren't learning the math as well as they could.* Or as a colleague of mine put it, *the one doing the math is the one learning the math.* Whenever students asked to "see" more examples, I worked them out, and students watched. But watching someone do a thing is not the same as learning the thing.

Why don't we have students generating their own examples more often? And how might such a strategy of student-generated examples look in practice? After my Fall teaching experience, I set out to see what research has been done on these questions, and I found this paper that I wanted to break open here today:

Anne Watson & John Mason (2002). Student-generated examples in the learning of mathematics. Canadian Journal of Science, Mathematics and Technology Education, 2:2, 237-249. DOI: 10.1080/14926150209556516

Link to paper: https://bit.ly/2SZ5CtX

This paper is a little different than other research articles I've written about here, in that it's a *qualitative* study. Qualitative research focuses not so much (or at all, in this case) on numerical measures of variables and their statistical differences as it does on making careful observations of phenomena and then systematically analyzing what's observed. You'll often see qualitative research aimed at exploring questions that are difficult or impossible to operationalize, through anthropological-style observations, interviews, and surveys, with the aim of simply asking questions and making sense of the answers.

Some people think this makes qualitative research less rigorous than quantitative research. That's not the case. In my own research experience, doing qualitative research *well* is a lot harder than doing quantitative research; and doing quantitative research poorly is just as easy as doing qualitative research poorly. They're just complementary practices (and you'll often see them mixed together) and sometimes one is simply a better tool for the job.

Back to this study: The authors worked with kids (the study mentions 11- and 12-year-olds in a few of the observations) on in-class exercises where students were asked to generate examples of five different kinds:

- **Experiencing structure**: Examples that involve executing and reversing processes, and "doing and undoing".
- **Experiencing and extending the range of variation**: Examples that elicit different kinds of examples of the same concept from different learners, or multiple representations of the same idea, or different questions that give the same answer.
- **Experiencing generality**: Examples that result in seeing a pattern in the examples that are produced.
- **Experiencing the constraints and meanings of conventions**: Examples asking learners to illustrate new concepts and invent notation or terminology to explain a phenomenon and then compare to standard mathematical notation and conventions.
- **Extending example-spaces and exploring boundaries**: Examples that satisfy some conditions but not others, or those that exemplify "what is and what is not" or what cannot be done within specified constraints.

Each kind of example accesses different cognitive aspects of the example-making process, so it's worth looking at an instance of each kind here.

The *experiencing structure* example given was about solving linear equations. Students were asked to start with a value of $x$ stated as an equation (like $x = 5$) and then build up a linear equation by repeatedly doing the same operation to both sides. So start with $x=5$, then add 4 to both sides to get $x + 4 = 9$, then subtract $10x$ from both sides to get $x + 4 - 10x = 9 - 10x$, and so on. At some point they stopped and presented their equations, and the rest of the class was asked to solve them. The question came up --- if you were given this final equation and asked to solve for $x$, how would you do it? For the group that created the example, it was easy --- just reverse the steps that were used to build up the equation in the first place. For the rest of the class, the process was about figuring out what those steps were. Both groups experienced a structural process that generalizes to solving other linear equations.
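The "doing and undoing" structure is easy to write out explicitly. Here's a sketch of that build-up and its reversal (my own working, using the equations above, not a figure from the paper):

```latex
\begin{align*}
\text{Doing:} \quad x &= 5 \\
x + 4 &= 9 && \text{add } 4 \text{ to both sides} \\
x + 4 - 10x &= 9 - 10x && \text{subtract } 10x \text{ from both sides} \\[1ex]
\text{Undoing:} \quad x + 4 - 10x &= 9 - 10x \\
x + 4 &= 9 && \text{add } 10x \text{ to both sides} \\
x &= 5 && \text{subtract } 4 \text{ from both sides}
\end{align*}
```

The builders already know the steps in the right-hand column; the rest of the class has to discover them.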

The *experiencing and extending the range of variation* task had students give examples of multiplying multi-digit numbers together and making visible their thought processes for how this might work. I've seen this example myself in my own kids' school work. When asked to compute $89 \times 4$, some will compute $80 \times 4 + 9 \times 4$. Others will compute $90 \times 4$ and subtract another $1 \times 4$. By letting kids make the rules and then making their work visible, the entire group is exposed to a multiplicity of examples, some of which might "click" with a student who wouldn't have thought of it otherwise.
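Written out symbolically, the two strategies are just different decompositions of the same product (my own rendering of the arithmetic described above):

```latex
\begin{align*}
89 \times 4 &= (80 + 9) \times 4 = 80 \times 4 + 9 \times 4 = 320 + 36 = 356 \\
89 \times 4 &= (90 - 1) \times 4 = 90 \times 4 - 1 \times 4 = 360 - 4 = 356
\end{align*}
```

Both are instances of the distributive law, which is exactly the kind of pattern that putting multiple student examples side by side can surface.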

In *experiencing generality* tasks the researchers used a method they called "particular-peculiar-general". The specific task they gave students was:

- Write down a particular number that leaves a remainder of 1 when divided by 7.
- Write down a number that leaves a remainder of 1 when divided by 7 which is peculiar in some way.
- Write down a general form of a number that leaves a remainder of 1 when divided by 7.

Students mostly contributed small numbers like 8 or 15 for the first task, then weirder ones like 700001 and 1 for the second. The authors said that discussion ensued about how to handle negative numbers, and that "the third request followed easily from these contributions". (That last claim I have to admit I'm skeptical about, because forming the right generalization isn't easy.) This is an instance of using examples in a different direction --- not constructing particular instances from general concepts but using particular instances to *arrive at* the general concept, which is something that would be right at home in my Modern Algebra class.
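For the record, here is one way the generalization might be written out, including checks on the students' contributions and a negative example (my own working, not taken from the paper):

```latex
\begin{align*}
n &= 7k + 1 \quad \text{for some } k \in \mathbb{Z} \\
8 &= 7(1) + 1, \qquad 15 = 7(2) + 1 \\
700001 &= 7(100000) + 1, \qquad 1 = 7(0) + 1 \\
-6 &= 7(-1) + 1 && \text{a negative number with remainder } 1
\end{align*}
```

Note that handling $-6$ correctly already requires committing to a convention about what "remainder" means for negative numbers, which is part of why I think the generalization is harder than the authors let on.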

In *experiencing the constraints and meanings of conventions* tasks, the teacher gave students a general idea --- in this case, to represent a function whose output is equal to the input plus 3 --- and had students generate their own representation of the idea. Students came up with some wildly different ways to think about these functions; the paper shows one result involving a sequence of nested boxes that eventually results in the graph of the function $y = x + 3$. After looking at student examples, the teacher debriefs the activity by comparing representations, discussing their pros and cons, and then comparing student representations to mathematically standard ones. The idea is that

> If students have had to develop notations for themselves, and compared their usefulness, they are more likely to understand and accept the strengths and idiosyncrasies of conventional notations.

And who among us math teachers has not had to deal with students struggling to understand the "idiosyncrasies of conventional notations"? I'm looking at you, inverse functions and logarithms.

Finally, the *extending example-spaces and exploring boundaries* tasks are what I've had my students do in the past: Build a sequence of examples that satisfy increasingly strict constraints. In the paper, students were asked to draw a quadrilateral, then a quadrilateral with a pair of equal sides, then a quadrilateral with a pair of equal sides and a pair of parallel sides, then a quadrilateral with all these features and a pair of equal opposite angles. The idea with this kind of example is to explore the space of possible examples of a concept and discern what's possible and what's not.

So, what did the researchers find when they gave these kids all these example-generation tasks? Again, while no quantitative data were collected, the researchers uniformly observed that

> Students were actively, noisily, and verbally struggling with attempts to reorganize what they knew to fit the kind of example the teacher was seeking. Students were led away from limited perceptions of concepts and towards wider ranges of objects. They restructured their ways of seeing and experienced the creation of mathematical objects and notations.

Some caveats are in order here: This is great, but it is a long way from a systematic analysis of student observations, and unfortunately it's pretty much the only general conclusion that the researchers draw. They also tend to align their observations with their own experiences as mathematicians: *In our own mathematical training, example-generation helped us, and look! It made these kids better too* --- which sounds to me like confirmation bias. I'd like to see this kind of study done again with tighter controls on the observations and analyses, and in fact Watson and Mason went on to write an entire book about this subject.

For me, the main importance of this article is that it validates the idea that while instructors may need to give examples to students at times --- and there is some reason to believe that instructor-led examples can be helpful in reducing cognitive load for students --- there is also great value in placing the main work of example generation into the hands of the students. It also sparks ideas for how we might do this on a regular basis in our teaching. Watch this space for some future posts on specific activities for different classes; and leave your own ideas in the comments.

*Welcome to another 4+1 interview. This time, we're talking with Bonni Stachowiak, director of the Institute for Faculty Development at Vanguard University of Southern California and host of the popular and influential Teaching in Higher Ed Podcast. Bonni is someone in higher education who leads by example, and with the human element always at the forefront. I've been on her podcast a couple of times and I am really honored to return the favor now.*

**What got you started with the Teaching in Higher Ed podcast? What gave you the idea for doing the podcast, and what was it like getting the podcast up and running?**

My husband Dave had been running his Coaching for Leaders podcast for three years by that point. There really weren’t podcasts that focused exclusively on teaching in the context of higher education at that point, except a couple that also addressed other audiences like parents and students. Some of the big higher education news organizations had podcasts that talked about policy, but I was interested in conversations about teaching.

I had no idea what I was doing at first.

Except that I could go off of some of what had worked for Dave with Coaching for Leaders. One big boost in the beginning came from people who accepted invitations to be on a podcast that they hadn’t ever heard of before.

The early guests were also generous about recommending other people to be on the show. I had no idea what was in store and am eternally grateful for the transformation it has provided me.

**What are some things you've learned about teaching and learning from your interactions with the guests on TiHE that really stand out to you?**

While much has changed about the podcast over the years, one thing that has remained constant is that at the start of each show, I talk about the show being a space to explore the art and science of facilitating learning.

What continues to stand out to me is just how much teaching is both an art and a science.

I enjoy the guests who have helped me dive deeper into the scholarship of teaching and learning (SoTL). Yet it is also invigorating to get to talk to someone who is playfully experimenting in their teaching and reveling in what Amy Collier (and Jen Ross) refer to as not-yetness.

When Amy was a guest on episode #070, she stressed:

> When you embrace not-yetness, you are creating space for things to continue to evolve. – Amy Collier

Each student and every class community is different. We can build up a repertoire of approaches that have been demonstrated to improve learning (such as retrieval practice). This is a practice worth pursuing. Yet sometimes we will be playing the role of artist and creating something unique to a particular set of circumstances, without having the assurances that it is going to work.

**Are there any funny or embarrassing "outtake" moments from TiHE that you can share?**

The most memorable one comes from my conversation with Ken Bain. What the Best College Teachers Do was the first book I ever read about teaching in the context of higher education. I was incredibly nervous to speak with him.

At the end of the conversation, he mentioned that he wished he could have said more. I joked that through the “magic” of podcasting, we could make that happen. Before I had a chance to press record again, he started sharing the specifics of what he would like to add.

Fortunately, I type pretty quickly. I captured the bullet points of what he wanted to add and prepared to hit the record button again. One big topic he wanted to add had to do with Eric Mazur, who had won the Minerva Prize for excellence in teaching.

I was unfamiliar with Mazur and also with the Minerva Prize.

Thus, when I pressed record and confidently resumed my inquiry, I started by asking him about the Manure Prize. I said it three times before Ken gently let me know that it was actually called the Minerva Prize. Autocorrect had changed Minerva to Manure and as I looked back over my notes, I didn’t know any different.

I could have just left that part of the interview out entirely when doing the edits. However, it was so illustrative of what I have always wanted to model about teaching with the podcast. I left it in and, eventually, created the Manure Award to recognize people who have been vulnerable enough to experiment in their teaching and have experienced the inevitable failures that result from those endeavors.

**If you could have any one person, living or deceased, on the podcast to interview about teaching and learning, who would it be and why?**

Because the necessity for vulnerability is so central to what I believe about teaching, I would treasure the opportunity to speak with Brené Brown on the podcast. I just finished reading her latest book, Dare to Lead, and consider it to be essential reading for people who want to maximize the potential of people on a team.

**BONUS +1: What question should I have asked you in this interview?**

I mentioned earlier that I had wanted to start a podcast about teaching, yet I also have decided to include a focus on productivity. Some people might wonder why I considered it important to include that topic in some of the episodes, in addition to talking about teaching.

Robert, you wrote so powerfully about conducting your trimesterly review after getting the news about your heart health issues. As we corresponded about my guest posting during your time of healing after the surgery, you said your friends were going to think you were nuts for spending time making sure stuff keeps getting posted.

When we regularly reflect on what is most important to us and have systems in place to continually be moving forward toward those aims, I truly believe we have more peace in our lives. It is a regular practice of aligning our sense of purpose with how we invest our time, energy, and attention.

**If you enjoyed this interview, check out these other ones from the past:**

- Josh Eyler on what he learned about teaching while writing a book on teaching and learning.
- Andrew Kim on the role of space and design in education.
- Lorena Barba on Jupyter notebooks and open science.
- T.J. Hitchman on inquiry-based learning.
- Linda Nilson on specifications grading and grading in general.

*I'm happy to reintroduce 4+1 interviews after a few months' hiatus. In these interviews, I find someone who is doing something interesting in the world and ask them four questions, plus a special bonus question at the end. This time I'm pleased to interview Josh Eyler, Director of the Center for Teaching Excellence at Rice University and author of the new book How Humans Learn: The Science and Stories Behind Effective College Teaching.*

**What was the catalyst for your book How Humans Learn?**

When I first moved into the world of teaching and learning centers, I did a lot of reading about pedagogical strategies, both general and discipline-specific techniques. There are some really wonderful resources out there about these methods --- what works and how to implement them in your classroom. But one question continued to haunt me: *why* do some strategies work and others don't? I started doing some of the research that led to this book as a way to answer this question, and it took me into fascinating fields that I had never explored before --- fields like developmental psychology and biological anthropology.

**Were there any issues or questions that you wanted to address, or wish you addressed, in the book but weren't able to?**

There is a point in my chapter on curiosity where I start to talk about our educational systems and why curiosity might fade into the background as students proceed through their educational careers. That section was originally much longer and explored all of the factors that might play into this, including teacher education programs and standardized testing. The section was eventually cut because of length, but I wish I had been able to keep it, because I think these are conversations we need to have as educators.

**What were some important things that you learned while writing How Humans Learn?**

I learned *so much,* which was one of the joys of writing the book. It took me 5 years to write this book, because I needed to dive deeply into these scientific disciplines that were largely unfamiliar to me and to become acquainted enough with their methods and findings that I could make claims that scientists felt were credible. That was really important to me. They didn't have to agree with me necessarily, but they did need to find what I was saying to be credible. In a sense, then, I was in a position similar to a student in a series of introductory courses --- I needed to build frameworks and connections that were not yet in place, and that took quite a while. The things I was learning about were so interesting too! If I had to single anything out, I would have to say that learning more about evolutionary biology gave me a greater appreciation for the natural world and our fellow animals than I had before I started studying it. I'm grateful for that, and for everything else I learned along the way.

**Your "day job" is Director of the Center for Teaching Excellence at Rice, but you're also listed as an adjunct associate professor of humanities. What are some concrete ways that the lessons of your book have made their way into your own classroom teaching, and what are some aspects of your teaching you'd like to improve as a result of your research?**

Two big things have changed in my own teaching and lots of smaller things too. The big things, though, include even more prioritization of student agency. In the past, I've had students collaborate to design exam questions and things of that sort, but now we do an exercise at the beginning of the semester where we work together on a classroom compact --- a document that lays out ground rules for discussions, guidelines for technology use, etc. Students generate the document together, and I give some feedback along the way. This gives them a sense of ownership over the class climate, and it sets the tone from the very beginning that I am interested in what they have to say and in their own contributions to the course. The other major thing that has changed has been my strategies for grading. My research on failure had a tremendous impact on me, and now all of my undergraduate courses are graded using the portfolio model (which aligns well with the writing courses I teach), and all of my graduate courses utilize contract grading. I want to continue pushing even more toward ungrading over the next few years. As for what I'd like to continue improving on, well, the answer is pretty much everything! I have so many ideas percolating now after finishing the book, but I'd like to pay some immediate attention to the assignments I'm developing and the ways in which they help my students engage their curiosity and take intellectual risks.

**BONUS +1: What question should I have asked you in this interview?**

How about: **what is the best part about writing a book?** My answer to that would be that the best part is being finished with it and getting the opportunity to have conversations with others about it. I've already learned a lot from those who have read it and talked to me about the extent to which the ideas fit in their own courses and teaching philosophy.

**If you enjoyed this interview, check out these other ones from the past:**

- Lorena Barba on Jupyter notebooks and open science.
- Eric Mazur (audio interview) on peer instruction, Montessori education, and Super Bowl picks.
- Diette Ward on how libraries support higher ed.
- Derek Bruff on directing a Center for Teaching and classroom response systems.

The longer I use specifications grading, and the more I see how differently students experience college courses that use mastery grading compared to courses that don't, the more I believe that the reform of our grading practices is an urgent ethical imperative. Like I said on Twitter last week:

> Not just less important - it's clearer every year to me that grades are increasingly corroding education and student well being. The alarm bells are getting louder.
>
> — Robert Talbert (@RobertTalbert) January 18, 2019

I switched from traditional, points-based, no-revision grading a few years ago to specifications grading because I had a strong sense that traditional grading was not only uninformative (large numbers of false positives and false negatives, and no clear link between the grade and what students can do) but actively harmful to many students in many ways, one of the biggest being *motivation*. When I used traditional grading, students always seemed motivated not by the promise of learning the subject but by the inner game of scoring enough points in the right ways to get the grade they needed to move on --- or else they had no motivation at all.

This intuition that traditional grading is demotivating was just that: an intuition. But a study I came across recently gives results about the real effects of traditional grading on motivation.

Chamberlin, K., Yasué, M., & Chiang, I. C. A. (2018). The impact of grades on student motivation. Active Learning in Higher Education, 1469787418819728.

Link to paper: https://journals.sagepub.com/doi/pdf/10.1177/1469787418819728

The authors in this study investigate how "multi-interval" grades (read: the A/B/C/D/F system) affect the basic psychological needs and academic motivation of students when compared with "narrative evaluation", where the instructor gives students verbal feedback both instead of, and in addition to, multi-interval grades.

The theoretical basis of the study is self-determination theory (SDT). This framework is where we get the concepts of *extrinsic* and *intrinsic* motivation, where people are motivated to complete a task either by an external reward or for the sake of the task itself, respectively. (For more background, I wrote about SDT and flipped learning in this post.) According to SDT, there are three basic psychological needs that learners have while they are involved in a learning process: **competence** (the need to be good, or at least feel that they are good, at what they are learning), **autonomy** (having choice and agency), and **connectedness** or "relatedness" (being psychologically connected to others while doing the task). Essentially, the more these three needs are met in a learning process, the more intrinsic motivation the learner will experience; the lack of satisfaction of these needs leads to less intrinsic motivation, either in the form of extrinsic motivation or no motivation at all.

The authors studied 394 students at three different universities. One of those universities gave exclusively multi-interval grades in its classes; another had institutionally eschewed multi-interval grades and used only **narrative evaluations** in its courses. This is where, instead of a grade, students get verbal feedback (that is honest, detailed, constructive, and actionable) on what they did and what they need to do next. The third used a mix of narrative evaluation and letter grades. The students were given two surveys on academic motivation, and a subset of them underwent semi-structured interviews to dig deeper.

The results are a sobering indictment of traditional grading. Here are just a few that stood out.

Students were asked, among other things, about what information (if any) they got from their grades, whether their grades affected their decisions about what classes to take, and whether their relationship with grades had changed since high school. The prevailing opinion was that grades do *not* convey "competence-enhancing feedback" that can be used to improve; most students could not give any examples of how they used grades to improve their learning. Worse, the information that grades *did* give students tended to consist of negative signals about the students' self-worth. High-achieving students experienced pressure to keep achieving high grades; low-achieving students felt condemned by their low grades. All students associated the word "stress" with grades far more frequently than any other concept.

Moreover, traditional grades actively eroded students' sense of autonomy, because many times the grade students get and what they have learned seem unrelated. As one student said:

> And it was actually pretty frustrating because it felt like even in classes where I was really into the content and worked really hard I came out with a B+. And in classes that I didn’t care about and didn’t work very hard I still got a B+.

Grades worked against relatedness as well, as expressed by some students who described how their relationships with their parents suffered when their grades were poor.

The authors also noted that when discussing traditional grades, students readily adopted capitalist-style business language, for example referring to "cost-benefit analysis" and "payoffs" in describing how they approach class. That's strategic learning and extrinsic motivation taking hold.

The results from students who experienced narrative evaluation were almost completely the opposite of the results from multi-interval grading. Every "narrative evaluation" student interviewed said that narrative evaluations gave them usable information about their competence and were more useful than multi-interval grades. The study found strong links between narrative evaluation and enhanced competence, autonomy, and connectedness, and many of the students commented on how narrative evaluation built *trust* between student and instructor --- even when the feedback was largely negative.

These results came not just from the interviews but also from the quantitative results of the surveys, with statistically significant differences in measures of academic motivation between students from traditional grading backgrounds and students from narrative backgrounds (with narrative evaluation leading to higher indicators of motivation). Students from the university that used mixed grading methods experienced some of the benefits of narrative evaluation, but also some of the drawbacks of traditional grading --- and although the study didn't say this directly, it seems clear to me that the drawbacks come from the letter grades. (If you put a student in a "mixed" environment and give them good narrative evaluations followed by a "B+" grade, guess what the student will tend to focus on?)

So what do we do about this? For me, the course of action is clear: **We need to walk away from traditional grading** --- by which I mean not only multi-interval letter grades but also grades based on statistical point accumulation. We've seen enough. Grades are harmful to students' well-being; they do not provide accurate information for employers, academic programs, or even students themselves; and they steer student motivations precisely where we in higher education do *not* want those motivations to go. There is no longer a coherent argument that traditional grading is the best approach to evaluating student work, in terms of what's best for *students*. If we value our students, we'll start being creative and courageous in replacing traditional grading with something better.

Cue the objections about how this can't be done because of transfer credit issues, making non-traditional grading work at scale, etc. I agree partially, in the sense that this move is a long sequence of small steps. The article here is similarly pragmatic and gives some good advice:

> Few universities are likely to abolish grades. However, universities should question the conventional use of multi-interval grades and consider their advantages and disadvantages in different departments, years of study, courses and learner types. For example, there may be specific courses or programs [...] in which cultivating deep learning and motivation may be more important than standardized communication of performance to external audiences. For such courses, greater use of narrative evaluations (as opposed to multi-interval grades) may be warranted. In addition, withholding grades from students or providing narrative/written feedback several days prior to the grades may help students focus on mastery-related learning goals rather than extrinsic rewards.

I'd add the following ideas that I've learned from using specifications grading and hearing about how others use this and other forms of mastery grading:

- It's possible to keep the A/B/C/D/F system for reporting *semester* grades, but use narrative evaluation and mastery grading, instead of points and statistics, to determine students' grades. Here's an example.[^1]
- Do what the article suggests and start changing your grading practices over to something less focused on letters and points in those courses where narrative evaluation and mastery grading make the most sense: graduate courses, seminars, proof-oriented upper-level math courses, honors sections of courses, and so on.
- I think you could also make a strong case that introductory courses are fertile ground for trying out narrative evaluation and/or mastery grading, because these are where student motivation tends to be at its lowest point.
- Treat student work --- as much of it as possible --- like submissions to a journal. When we academics submit articles to a journal (or tenure portfolios, etc.), we don't get a point value or letter grade attached. We get verbal feedback with a brief summary --- "Accept", "Reject", "Major revision", "Minor revision" --- followed by details. Then assign course grades, if you must have them, based on how much acceptable work the student was able to produce.

There are some practical issues at work here that can't be minimized, especially large sections. Scaling is a tough problem, but not an insoluble one. In my experience with specs grading, narrative evaluation takes no more time per student than traditional grading (which involves endless hair-splitting over how many points to award a response), so I don't think there's any reason to believe that nontraditional grading can't scale up.

Moving away from traditional grading could be one of those Pareto principle concepts where focusing intently on this one idea could usher in outsized improvements in many other areas of student learning. I think it could be a fulcrum for bringing about wholesale, even revolutionary change in higher education. Let's give it a try.

[^1]: Although I have to admit that recently, I've noticed that students in my specs-graded classes tend to focus laser-like on their grading checklist, where they keep track of how many Learning Targets they've passed, rather than on what those Targets represent. In other words, the specs end up becoming a proxy for letter grades and students fixate on them accordingly. I'm still thinking about how to handle this.

Over the last few years, I think I've finally learned the obvious fact that *communication with students is the most important ingredient for a successful course*. Most of my failures in teaching can be traced back to failures in communication. Conversely most of the time when something goes right in my teaching, it's because I gave students the opportunity to speak their minds, and I listened.

For me, then, one of the most important aspects of designing a course is deciding on the tools students and I will use to communicate with each other. For a long time, this was just email. But as I started wanting more robust, full-group communication among students outside of class, I outgrew email and started using discussion boards like Piazza. But soon even the best discussion board technologies weren't cutting it for me. The "discussions" were too *pro forma*, too much like something the teacher was telling you to use and not enough like real conversation. So a couple of years ago, I tried something different: I set up a Slack workspace for my class.

If you're not familiar with Slack, here's a video that covers the basics:

I introduced Slack in my Discrete Structures for Computer Science course, where the students are upper-level CS majors, with lots of tech savvy. Many of them had used Slack before in internships, jobs, and other classes, so it was no big deal for me to introduce it. Although I don't think I fully knew what I was doing and definitely didn't probe Slack's full potential, students liked it and used it, and it was a success.

This past fall, I taught a hybrid section of Calculus and two sections of Modern Algebra. I set up two Slack workspaces, one for Calculus and one shared by the two Modern Algebra sections. This time, however, the results were much less favorable.

I introduced Slack by setting up an "intro to Slack" assignment to be done in the first week that included watching tutorials on how to use Slack, posting a "hello world" message to a specially-created channel, replying to another person's message, and sending me a direct message (DM). Those are the absolute basics; students can learn more as they go, I decided.

Here's the channel setup that I used in Calculus (Modern Algebra was similar):

The #assignments channel is for discussion of assignments; #course is for discussing the course and the syllabus; #f2fmeetings is for follow-up discussion of what we did during the face-to-face meetings; and so on. Technology questions are to be asked on the #tech channel, and #updates is for course announcements and calendar events. I posted course announcements to the #announcements channel as they occurred. (More on that below.)

The tools we had set up in the course were Blackboard for grades and files; Slack; and a Google Calendar. Calculus also used WeBWorK for online homework. Everything was linked on Blackboard so that all tools in the course were 3 clicks or fewer from the Blackboard main page.

So we were all set up for a very productive semester, right?

Although students seemed curious, if not enthusiastic, about using Slack and did use it initially, we ran into problems right away.

- Some students skipped the "intro to Slack" assignment. It only counted for participation credit, so if a student just didn't want to do it, they could make up the credit later. One student finally admitted in *week 10* that he had never signed up for Slack.
- A lot of students were not seeing the announcements for the course. There are several reasons for this, but one of them is inherent in Slack: whereas Blackboard announcements are not only posted but also pushed to student emails, students had to proactively go to the #announcements channel on Slack to find their announcements. This one extra step was one step too many for a lot of students, which resulted in missed announcements, and therefore missed assignments, and therefore a lot of angst. I call this the "Yet Another Inbox Problem".
- Also, with multiple announcements per day being posted, students needed to be logged into Slack constantly to catch them all. This is actually the way Slack wants you to use their product: be logged in on all devices at all times and listen for the ping. But that's not how students operate. (It might not be the best way for *anybody* to operate.) Since students weren't always on, they'd miss announcements.
- Even those who went to the #announcements channel might have missed announcements because of overcrowding. I used a plugin for Google Calendar that pushed calendar events to the #announcements channel, giving students a daily digest of events for the day. Nice idea, but when the plugin drops 5-6 calendar items into the channel, the actual course announcements get pushed off the screen. Yes, there is a little alert telling you how many unread messages there are, and an "All Unreads" aggregator that you can turn on. But it's easy to miss the unread messages alert, and the All Unreads feature is turned off by default.
- Students were confused by my channel structure. Many weren't sure where to post things and so put them in the "wrong" channels (e.g., technology questions in the #general channel), which meant I had to contact students and ask them to delete the original post and re-post it in the "right" channel. This got confusing and annoying enough to some students that they just disengaged from Slack altogether.
- Students were also confused by where things were located. I introduced the mantra *if it's a file or a grade, it's on Blackboard; if it's a message, it's on Slack; if it's a date, it's on the Calendar*, but this wasn't enough for many students who were used to having everything just on Blackboard.

By weeks 4-5, we had settled into a habit of using Slack just for direct messages to the professor --- a feature students did use widely, and enjoyed using --- but nothing else. I didn't run the numbers, but I'd estimate that well over 85% of the posts on Slack originated with me and got no replies.

I give informal course surveys in weeks 2, 6, and 9 to ask students what we need to start, stop, and continue doing. I knew from week 2 that Slack was an issue. In the week 6 survey, I asked the Calculus class whether we should keep announcements on Slack or move them elsewhere. They said:

Only 1 in 9 students wanted announcements to be on Slack! In response, I started putting announcements on Blackboard and not Slack and changed the name of the channel from #announcements to #updates, because I kept the Google Calendar plugin active. I thought I was making the right move here, but what this ended up doing was causing even more students to disengage from Slack.

After relocating announcements to Blackboard, student use of Slack (apart from DM's) dropped to essentially zero, even when I put activities on Slack that counted for credit or were requested by students. In the last two weeks of the semester, because some students were still using Slack while many others had dropped it, I needed to be sure that critical course announcements were being seen, so I was cross-posting announcements to both Slack and Blackboard. The way I had handled it, I couldn't be sure that students were using *either* of the communications methods I had set up. That's not good.

This issue was weighing on my mind all semester, so near the end, I gave a survey to collect student data about what they *did* want from a course communication tool. I got a surprisingly good response rate (46 students out of 72 total) and their replies were pretty illuminating.

When asked *How important is it to you, generally speaking, that your courses include an official electronic communications tool for communicating with the professor and classmates in between classes?*, 76.1% responded "important" or "very important". So it's not a case where students are just not interested in using these tools.

I asked which specific communications tools students used regularly, before taking my course. The top three were their university emails (98% of respondents), the Blackboard discussion board feature (56%), and personal emails (43.5%). Fourth and fifth place were Facebook (41.3%) and group text apps like WhatsApp and GroupMe (39%). Twitter was used by 26% and all other tools --- including Slack, Discord, Instagram, and Stack Exchange --- came in under 7% each. I think maybe I wasn't clear in my question here, because I wanted to know what tools students are using in *any* part of their lives to communicate with others. I think the students thought I meant in a classroom context only. I'm pretty sure Instagram is used by more than 7% of my students.

I then asked students about the kinds of features they'd want in a communications tool used for an academic course, and which of those features were their top three most important. They said:

- Notifications are pushed to my email (rated in the top 3 by 78.3% of respondents)
- The tool lets me send direct messages to my instructor (56.5%)
- The tool has mobile apps for my phone or tablet (50%)

I thought this was surprising because some of the features I thought would be must-haves, like the ability to share files easily, didn't crack the top 5. My students' top features are those that *actually facilitate communication* and not other bells and whistles.
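Tallies like these are easy to reproduce if you export the raw survey responses. Here's a minimal sketch in Python of how the "top-3 features" ranking above could be computed; the feature names and votes below are invented for illustration, not the actual survey data:

```python
from collections import Counter

def top_features(responses, n=3):
    """Tally 'top-3 feature' votes and return the n most-chosen
    features with the percentage of respondents choosing each."""
    votes = Counter(feature for picks in responses for feature in picks)
    total = len(responses)
    return [(feature, round(100 * count / total, 1))
            for feature, count in votes.most_common(n)]

# Hypothetical responses: each student names their top-3 features.
responses = [
    ["email notifications", "direct messages", "mobile apps"],
    ["email notifications", "mobile apps", "file sharing"],
    ["email notifications", "direct messages", "search"],
    ["direct messages", "mobile apps", "file sharing"],
]
print(top_features(responses))
```

With real data, `responses` would come from the survey export, but the shape of the computation is the same.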

The next question and its result are, I think, at the heart of my issues with Slack. I asked students to vote for which of the following situations they would prefer if they had the choice: (1) *Using an electronic communications tool that I already use or which is easy to learn and use, but which lacks many or most of the features that I find important*; or (2) *Using a tool that has most or all of the features I find important, but which is not something I already use and would take effort to learn and use.*

The results tell you everything you need to know:

This is a very economical way of expressing the problems we had with Slack. It's a tool that has a ton of excellent features but requires students to adopt a different paradigm of communicating and to check Yet Another Inbox. Half my students preferred this; the other half preferred the opposite. No wonder we couldn't figure out how to integrate it into the course.

The last question I asked had to do with just how many communications tools students would be willing to tolerate in the course. I asked students to vote on which of the following situations they would prefer if they had the choice:

- Having one tool for all course communication
- Using email for personal messages but then one tool for everything else
- Using Blackboard for announcements but then one tool for everything else
- Using Blackboard for announcements, email for personal messages, and a communications tool for everything else

Here are the results:

I was a bit surprised that the plurality of students, all things considered, would prefer to have one tool for everything, even if that tool requires effort to learn and use, rather than split the work of communication among multiple tools, some of which may be very simple and familiar. Only a handful of students wanted to see three separate tools being used, even if two of them are email and Blackboard. It's as if they're saying, *if you're going to use a communication tool like Slack, go the whole way*.

I don't regret having moved to Slack from straight-up discussion boards. I think Slack facilitates a more organic and free conversation, and it does make it easier for my students and me to communicate, thanks to its immediacy and informality. But it's also a tool that requires a different way of thinking. That's not bad; but it does produce cognitive load for students that is not necessarily germane to their learning in the course.

In addition to everything I've presented, many of my students found Slack to be confusing, not in terms of ease of use, but in the chaotic nature of the communication that Slack tends to promote. Yes, there is threaded discussion on Slack, but even with threads, discussion topics tend to pile on each other with no clear way to disentangle them. This isn't helpful for students in an online or hybrid course, in a flipped learning environment, or with learning disabilities, all of whom need structure to have a successful experience.

What the students are telling me is that they do see the value in course communications tools, but there's a balance to be struck:

- The tool needs to be simple and/or familiar, so students don't spend more time figuring out the tool than they do the math in the course.
- But it also has to have a good feature set, and many students are willing to put in the work to learn the software if there's a clear payoff in terms of better communication.
- The rule about low cognitive load does not distribute across multiple tools. That is, it's not OK to have several tools to use, all of which are simple; it's better to have one or two tools that have moderate complexity and a good feature set.

Slack *might* strike the right balance for some courses and not for others. Or, in some courses, Slack is the right fit but it will take guidance and modeling to help students learn how to use it; you can't just drop it into any class and expect success. Learning a new tool requires cognitive resources; you have to gauge your audience to know how much is reasonable to ask.

My next course is an asynchronous online Precalculus class in the summer, and I'm already thinking about the communications infrastructure. I might use Slack. I am also looking more closely at similar platforms like Discord or Twist, a promising new product called Campuswire, or even something like WhatsApp. Whatever I choose, I'm going to make sure that I will:

- Have students learn the platform and start using it early and often, through a week-1 assignment, this time with a lot more value and weight to it, to help ensure students don't treat it lightly. I'm considering capping students' course grades at a B if they don't complete it within the first 10 days of the course. Communication is *that* important.
- Make the structure of channels (or whatever the analogue is) much simpler. If you can't tell what a channel is for just by reading its name, it's too complicated.
- Provide lots of support by finding excellent tutorials for it, and making the tutorials myself if I can't find any; and model the behavior I want to see by using the platform myself, also early and often.
- Lay out clear expectations for how and when to interact on the platform, and insist students abide by those expectations and make the platform a central, integrated part of the course experience.

I did some of these reasonably well in the fall, some poorly, and some not at all. I have a lot to fix myself when it comes to these tools.
