Tuesday 7 June 2022

2nd-year biochemistry in winter 2021

Introduction

This is the fifth and final instalment of my reflections on my experience with online teaching during the COVID-19 pandemic in the 2020/21 academic year. This reflection considers AUBIO/AUCHE 280 - Biochemistry: Proteins, Enzymes & Energy, which I taught in the winter term of 2021. Links to my previous reflections may be found here.

I have taught Biochemistry: Proteins, Enzymes & Energy a total of 23 times since 1994 on the Augustana Campus of the University of Alberta.  The Augustana Campus is the rural undergraduate liberal arts and sciences campus of a large research university whose primary campus (among five) is in Edmonton, the capital of Alberta. Edmonton is an hour northwest of Camrose where the Augustana Campus is located. Augustana has a small student population (1100) relative to the rest of the university (40,000).

AUBIO/AUCHE 280 is the first of two biochemistry courses that we teach on the Augustana campus. This first biochemistry course reviews the chemical properties of water upon which students are able to build an understanding of amino acid behaviour within cells and as components of proteins. This is critical for students to understand the properties of enzymes and how they are able to regulate metabolism. The course then uses these principles of enzyme regulation to explore the central pathways of cellular metabolism: glycolysis, citric acid cycle, and oxidative phosphorylation. The second biochemistry course we teach at Augustana is AUBIO/AUCHE 381 - Biochemistry: Intermediary Metabolism which picks up where AUBIO/AUCHE 280 ended by completing the exploration of cellular metabolism: gluconeogenesis, pentose phosphate pathway, and the metabolism of lipids, amino acids, and nucleotides. 

My reflection on teaching biochemistry during the winter 2021 term relies on my own personal experience, students' feedback from the end of term student ratings of instruction (SRI), the SoTL literature I have read, and advice that I have received from my colleagues. These are the four lenses that Stephen Brookfield (2017) advocates should inform any critical reflection of teaching:

  • students' eyes
  • colleagues' perceptions
  • personal experience of the instructor
  • theory (the SoTL literature)

Methods & Materials

The course syllabus for the winter 2021 iteration of AUBIO/AUCHE 280 is available here. Briefly, students' grades were based on two midterm exams (20% each), a final exam (35%), a video assignment (10%), and homework (15%) assigned through the Achieve website that accompanies the required textbook for this course, Biochemistry, 9/e by Berg et al. (2019). For the video assignment, students recorded a voice-over for soundless animations, explaining the particular biochemical phenomena being studied in the course. The Achieve homework was a new course requirement implemented for the first time in 2021.

In this blog post, I use the SRIs (student ratings of instruction) that I received that term as the lens of students' experiences placing them in the context of others I have received over the years. I have posted the details of Augustana's SRIs in a previous blog post linked here. Note that the student comments below in the Results section are in response to four open-ended questions inviting students to type their comments into our online SRI survey:

  • What aspects of the course and/or instructor did you find most valuable?
  • What aspects of the course and/or instructor did you find least valuable?
  • How useful were the course textbook(s) and/or other learning support materials?
  • Please add any other comments that you would like to make about the course and/or instructor.

Analysis of variance (ANOVA) was used to test for significant differences among the cohorts of students. If ANOVA detected a statistically significant difference (α = 0.05) in the responses to an SRI prompt, then the Tukey-Kramer post-hoc test was used to determine which cohort pairs differed significantly.
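
For readers who want to reproduce this kind of analysis, here is a minimal sketch using SciPy. The cohort ratings below are simulated, not my actual SRI data, and the cohort names are purely illustrative:

```python
import numpy as np
from scipy import stats

# Simulated 1-5 SRI ratings for three hypothetical cohorts
# (illustrative only -- not the actual survey data).
rng = np.random.default_rng(42)
w2018 = np.clip(rng.normal(3.9, 0.5, 30), 1, 5)
w2020 = np.clip(rng.normal(4.6, 0.4, 25), 1, 5)
w2021 = np.clip(rng.normal(4.2, 0.5, 28), 1, 5)

# One-way ANOVA across the cohorts
f_stat, p_val = stats.f_oneway(w2018, w2020, w2021)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.4f}")

# If ANOVA is significant, follow up with Tukey's HSD, which
# reduces to the Tukey-Kramer procedure when group sizes differ.
if p_val < 0.05:
    print(stats.tukey_hsd(w2018, w2020, w2021))
```

Because the cohorts have unequal numbers of respondents, `scipy.stats.tukey_hsd` applies the Tukey-Kramer adjustment automatically, reporting which pairs of cohorts differ at α = 0.05.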

Results

Over the years, students have fairly consistently rated this course and my teaching of it very well. There are a couple of statistically significant differences for the winter 2018 and 2021 cohorts that I note in the results below.

Students' perceptions of the instructor

Instructor overall

Students in my biochemistry class have consistently rated my overall excellence as an instructor highly (mean = 4.4). ANOVA did not find any significant differences among the different cohorts of students (α = 0.05).

Related student comments:
  • Dr. Haave was always willing to answer questions related to the course material.
  • The instructor overall was excellent
  • Dr. Haave is a great instructor and very knowledgeable! I enjoy taking his classes!

Instructor preparedness
Students in my biochemistry class have consistently rated my preparation of the course highly (mean = 4.7). ANOVA did not find any significant differences among the different cohorts of students (α = 0.05).

Related student comments:
  • [I found most valuable to be the] course video
  • I liked that all the asynchronous lectures were posted at the start of the semester so that if I had a busy week coming I could get ahead before hand.
  • I found this course to be very well planned and the instructor was very helpful.

Instructor's effective use of contact time
Students have consistently rated my use of class time (the prompt changed to contact time starting in 2021) as very effective (mean = 4.4). However, ANOVA did detect differences, with the Tukey-Kramer post-hoc test indicating that W2021 is significantly lower than cohorts W2006 through F2012 and W2020 (α = 0.05).

Related student comments:
  • [The team apps] can help fix loose ends you may have or point out important concepts.

The instructor communicated effectively
Biochemistry students have consistently rated my communication as highly effective (mean = 4.6). ANOVA did detect significant differences among the cohorts. Tukey-Kramer indicates that W2018 is significantly lower than all other cohorts except W2006, W2017, W2019 & W2021 (α = 0.05). This prompt was worded differently prior to fall 2020 ("the instructor spoke clearly").

Related student comments:
  • One of the problems I personally had (might be a me problem exclusively) is that trying to make correlations to other fields was sometimes difficult, but the breakdown Dr Haave would give made it easy to understand in hindsight.

Instructor's constructive feedback
Students in AUBIO/AUCHE 280 consistently rate my constructive feedback highly (mean = 4.2). ANOVA did not find any significant differences among the different cohorts of students (α = 0.05).

Related student comments:
  • I liked the use of achieve and the homework assignments
  • [I found most valuable the] Achieve website quizzes.
  • [I found most valuable the] 2 stage exam
  • I liked the homework assignments and thought they were good practice.
  • The textbook and achieve website were very useful in this course.

Instructor's respectful treatment of students
Students have consistently appreciated my respectful treatment of them (mean = 4.6). ANOVA did detect significant differences among the cohorts, with the Tukey-Kramer post-hoc test finding that the F2008 cohort rated me significantly lower (mean = 4) than all other cohorts except W2006 & W2018 (α = 0.05).

One student commented:
  • Dr. Haave was always willing to ask questions and never made you feel stupid.

Students' perceptions of the course structure & material

Quality of course content
Most student cohorts have rated the quality of this biochemistry course's content highly, with SRIs greater than 4. ANOVA did detect differences among the cohorts, with the Tukey-Kramer post-hoc test detecting paired differences between the extremes: W2018 is significantly lower than the W2007 and F2010 cohorts (α = 0.05).

One student commented:
  • I found every aspect of the course valuable

Clarity of course goals and objectives
Biochemistry students consistently rate the clarity of my goals and objectives for the course highly (mean = 4.3). ANOVA did not find any significant differences among the different student cohorts (α = 0.05).

There were no student comments related to the clarity of the course goals and objectives.


Course workload and difficulty
Biochemistry students consistently rate the workload and difficulty of the course as high. ANOVA did detect significant differences among the cohorts. The Tukey-Kramer post-hoc test detected significant differences for workload between the W2020 cohort and all other cohorts except W2006 & W2021 (α = 0.05). Note that data is missing for the W2007 and W2018 cohorts due to an administrative error.

ANOVA did not detect significant differences among the cohorts regarding the difficulty of the course (α = 0.05).

Related student comments:
  • The team apps, though quite hard sometimes, are great for applying your learning.
  • the course was demanding and took up more time than all my other courses
  • It is a tough course overall

Students' perceptions of their own experience

The course was a good learning experience
Students typically find biochemistry to be a good learning experience (mean = 4). ANOVA did not detect significant differences among the student cohorts (α = 0.05). Note that due to an administrative error, data was not collected for this prompt in 2007 or 2018.

Related student comments:
  • I found the group applications most valuable because they allowed for group interaction and discussion about the material being learned.
  • I really enjoyed working on the video assignment (even though I'm not great with video editing).
  • a fun learning experience

Motivation to learn more
Most students in this course are motivated to learn more about biochemistry (mean = 3.6). ANOVA indicated significant differences among the student cohorts (α = 0.05) but differences between pairs of cohorts were not detected by the Tukey-Kramer post-hoc test (α = 0.05).

There were no student comments related to students' motivation to learn more.

Students increased their knowledge
Biochemistry students rate their acquisition of biochemical knowledge in this course highly (mean = 4.4). ANOVA did not detect significant differences among the student cohorts (α = 0.05).

Related student comments:
  • After taking this course I found my knowledge in a field that I had zero knowledge on expanded greatly as the course was streamlined and the videos provided gave a lot of insight on what we needed to know.

Student concerns

A couple of students noted that they felt there was insufficient time for the online exams (1 hour for the MT exams and 1.5 hours for the final exam) delivered through ExamLock. In addition, a couple of students indicated that they would have preferred live lectures over Zoom rather than the pre-recorded video lectures I had prepared before the course began. One student indicated their dissatisfaction with two-stage exams (implemented for the two MT exams), preferring to write the entirety of an exam on their own. A few students were dissatisfied that the many Zoom meetings were used for answering student questions, with few opportunities to apply their learning as there had been in the previous semester in Molecular Cell Biology with TBL Apps.

Discussion

Although students' responses to my teaching and the course I had prepared for them were somewhat weaker than in previous terms, the course was still, overall, well received by students in the winter 2021 term. I knew that the winter 2021 offering of AUBIO/AUCHE 280 - Biochemistry: Proteins, Enzymes & Energy was not going to be as good a learning experience as in previous years because I did not have sufficient time to solve the problem of academic dishonesty during online exams, quizzes, and applications of learning that I had observed in the previous fall 2020 term with AUBIO 111 - Integrative Biology I and AUBIO 230 - Molecular Cell Biology. Please do not misunderstand me: there were only a handful of students who unfairly took advantage of the online learning situation in fall 2020; most students learned with academic integrity. Still, it is incumbent upon the instructor to ensure that students are assessed equitably in an environment where no student receives an unfair advantage. The only way I was able to ensure equitable assessment for all students in the winter 2021 term was to pre-Google every online question that contributed to students' final grades. I was able to do that for the MT and final exams but not for in-class (via Zoom) applications of students' learning. As a result, I was unable to implement TBL as the instructional strategy, as students had experienced in courses that I had taught in previous terms. In addition, I used our university's in-house exam proctoring software, ExamLock, which takes periodic snapshots of students' desktops and sends an alert to the online proctor when students browse to another window on their desktop.

What did I learn? What will I do differently?

Students appreciate and benefit from the online homework websites that many publishers make available with the adoption of their textbooks for a course. I tried this for the first time in the previous fall 2020 term, using Mastering Biology for AUBIO 111 and Smartwork5 for AUBIO 230. For AUBIO/AUCHE 280 I used the Achieve homework website that accompanies the textbook I had adopted for the course (Biochemistry, 9/e by Berg et al., 2019); Achieve has been shown to improve student learning outcomes and is viewed favourably by students (McWilliams & Bergin, 2020; McWilliams et al., 2020). I decided to implement these online homework websites for my courses in the 2020/21 academic year to increase the feedback that students would receive as they practised their learning, feedback I was concerned would otherwise be lacking in the online learning environment that existed during the COVID-19 pandemic. Many of my students commented that they appreciated the practice of applying their learning through Achieve. I will use publishers' homework websites again in the future.

Unlike the previous term, in which I implemented TBL as the instructional strategy for AUBIO 111 - Integrative Biology I and AUBIO 230 - Molecular Cell Biology, I did not do the same for the winter 2021 offering of AUBIO/AUCHE 280 - Biochemistry. As I stated above, this was because any online question that contributed to students' final grades had to be pre-Googled, and I simply did not have the time to do that for each and every App I had prepared for F2F teaching. Instead, I used class time to meet with students online over Zoom to answer any of their questions. Many students used that time to either read the textbook pages I had assigned or view the video-recorded minilectures I had prepared during the preceding summer of 2020. But many students showed up without questions, hoping that others would ask questions for them. After a couple of weeks, online attendance over Zoom declined, though a core of approximately 10 students continued to show up for Zoom classes. I offered to put students into random breakout rooms to attempt the Apps I had prepared using Google Forms, but without the marks being recorded. This turned out to be a good learning experience for the students who did show up. But a couple of students did comment on the SRI that they felt little incentive to continue showing up for class when the marks did not contribute toward their final grade. In the future, when I again implement TBL for this course, I will have students complete a couple of Apps each class that do not contribute toward their final grade and then end the class meeting with one that does, for which I have pre-Googled the exercise to ensure that it is not possible to copy and paste an answer after a simple Google search (I can usually complete 2-3 Apps in a one-hour class).

A study found that implementing Apps without the marks contributing to students' final grades does not negatively impact student learning outcomes and is preferred by students (Deardorff et al., 2014), yet some students indicate that they will not show up for class if the work completed in class does not contribute to their final grade. I wish there had been time for me to prepare to mix it up like this (most Apps not for marks, one per class for marks) for biochemistry in winter 2021.

Two-stage tests during MT exams

Two-stage testing has been found to improve student learning outcomes (Mahoney & Harris-Reeves, 2019). However, in this course the response from students toward this testing strategy was mixed. This is an interesting result because when I have used two-stage testing for the Readiness Assurance Tests (RATs) with TBL as the teaching strategy, the student response has been predominantly positive. I believe the difference is that in the winter 2021 term, two-stage testing was used for high-stakes MT exams instead of for low-stakes RATs. Also, unlike two-stage testing for RATs, in which students spend most of their time on the team portion of the quiz, for higher-stakes exams it is the individual portion of the two-stage quiz for which students require more time. My experience suggests that the time allocation differs depending upon whether the test is formative or summative: if it is formative, the allocation is 1 part individual to 2 parts team; if it is summative, it is 2 parts individual to 1 part team. This is my sense based on winter 2021 versus the preceding terms that implemented TBL in biochemistry. My colleagues have suggested this may be because students are less prepared for the formative quizzes (i.e., students are still learning the material - they have only just encountered the new course material in the assigned pre-class reading or video) versus being better prepared for the summative exams (i.e., students have spent more and repeated time with the material being examined). Thus, more time is required during the team portion of a two-stage exam/quiz that is formative in nature because there is more student discussion. In contrast, during a summative two-stage exam, students are more sure of their understanding and thus the team discussion does not require as much time. I have not yet found a peer-reviewed paper that analyzes this issue.
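
To make that rule of thumb concrete, here is a trivial sketch. The 1:2 and 2:1 ratios are my own impression from these terms, not values from the literature:

```python
def two_stage_split(total_minutes, formative):
    """Split a two-stage test period into (individual, team) minutes.

    Rule of thumb from my own experience (not from the literature):
    formative quizzes run roughly 1 part individual to 2 parts team;
    summative exams reverse the ratio.
    """
    individual = total_minutes * (1 / 3 if formative else 2 / 3)
    return individual, total_minutes - individual

# A 60-minute formative RAT: ~20 min individual, ~40 min team.
print(two_stage_split(60, formative=True))

# A 60-minute summative midterm: ~40 min individual, ~20 min team.
print(two_stage_split(60, formative=False))
```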

What happened in 2018 and 2021?

This biochemistry course is one of my favourite courses to teach and is typically very well received by students despite its perceived difficulty and workload. So what happened in 2018 and 2021, when the SRIs were lower for some of the survey prompts (i.e., contact time was used effectively, instructor communicated effectively, quality of course content)?

Winter 2018 was the 2nd term of Augustana's first year of implementation of our new term structure in which students completed one course during the first three weeks of the term and then completed their other four courses during the subsequent 11 weeks of the term. I think students (and instructors!) were still adjusting to that change and it impacted some of the ratings for that term though perhaps not as much as it affected the SRIs for Molecular Cell Biology in the preceding fall 2017 term.

I did not use TBL as the instructional strategy in 2021 for AUBIO 280 because I had not yet solved the problem of academic dishonesty in an online environment. The solution required pre-Googling all online assignments, quizzes, and exams and I only found the time to do that for the exams. As a result, many students did not avail themselves of the practice time that I provided them to apply their learning because the in-class (via Zoom) Apps were no longer for marks that contributed toward their final grade. I simply did not have adequate time to prepare for this in the transition from F2F to online teaching.

But why did the W2021 cohort rate my effective use of class time significantly differently from cohorts W2006 to F2012 and W2020, but not from the cohorts in W2017 through 2019? I think the answer is that I started using TBL consistently as the instructional strategy after F2012 and that, although students' learning outcomes improve with TBL (Liu & Beaujean, 2017), students may not perceive the instructional time (i.e., in-class or over Zoom) to be as well utilized as when class time is used for lecturing (Lane, 2008), similar to what has been found in other active learning classes (Deslauriers et al., 2019; Van Sickle, 2016; Smith & Cardaciotto, 2011). However, when implemented well, most students respond well to active learning (Finelli et al., 2018).

Based on the graphs in the Results section above, it appears to me (i.e., a nonsignificant trend) that my ability to implement TBL in AUBIO/AUCHE 280 - Biochemistry: Proteins, Enzymes & Energy was improving prior to the COVID-19 pandemic. What might be improving students' reception of TBL in this biochemistry course? Some of the things I was starting to do are well-articulated suggestions in the article by Finelli et al. (2018). My method of canvassing students' team responses during their in-class Apps when we were meeting F2F was to use QuickKey, which uses QR codes on cards that are quickly scanned by the instructor's smartphone or tablet. It is an ingenious system (note that the app also enables smartphones and tablets to grade MCQ tests, similar to a Scantron) but requires instructors to remain visible at the front of the class in order to scan the QR cards. This limits instructors' ability to walk among student teams to offer encouragement and advice while the teams work on the assigned applications of learning. Between 2012 and 2021 I became better at interacting with student teams for a specified time before returning to the front of the class to scan the QR codes. A simple thing to do in principle, but something that I need to be intentional about - it took practice for me to do well.

A final consideration is that the room I teach in (the physical space) can impact how students learn (Cotner et al., 2013; Park & Choi, 2014). I started teaching in Augustana's active learning classroom (ALC) in 2017. In contrast to a traditional lecture theatre, our ALC is designed around pods of tables, each with a large computer screen/whiteboard. This allows students to work in teams facing each other across a common table while viewing what I am presenting on the screen. In addition, the whiteboard capability of the screens at each pod enables students to work on group problems that I assign during class. This is an excellent design for active learning and teamwork. A drawback of Augustana's ALC is that two wings floating above the main floor of the classroom each house 4 pods of students (6 students to a pod). When the number of students seated on the main floor exceeds its limit (approximately 48), the overflow goes upstairs into these wings, where the sight lines between students and instructor are poor. It took me a couple of years to learn that the best way to deal with enrolments greater than 48 in this ALC was to rotate the student teams through the different learning pods so that no one team was relegated to the weaker learning environment in the upstairs wings for the entire term. When the situation is explained to students along with its pros and cons, students come on board with the learning environment. When well implemented, this is an excellent learning space.

Resources

Berg, J. M., Tymoczko, J. L., Gatto, G. J. J., & Stryer, L. (2019). Biochemistry (9th ed.). W. H. Freeman and Company, MacMillan Learning.

Brookfield, S. D. (2017). Becoming a critically reflective teacher (2nd ed.). Jossey-Bass.

Cotner, S., Loper, J., Walker, J. D., & Brooks, D. C. (2013). It’s not you, it’s the room - Are the high-tech, active learning classrooms worth it? Journal of College Science Teaching, 42(6), 82–88. 

Deardorff, A. S., Moore, J. A., McCormick, C., Koles, P. G., & Borges, N. J. (2014). Incentive structure in team-based learning: graded versus ungraded Group Application exercises. Journal of Educational Evaluation for Health Professions, 11, art 6. 

Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences, 116(39), 19251–19257. 

Eguchi, H., Sakiyama, H., Naruse, H., Yoshihara, D., Fujiwara, N., & Suzuki, K. (2020). Introduction of team-based learning improves understanding of glucose metabolism in biochemistry among undergraduate students. Biochemistry and Molecular Biology Education, 49(3), 383–391.

Finelli, C. J., Nguyen, K., DeMonbrun, M., Borrego, M., Prince, M., Husman, J., Henderson, C., Shekhar, P., & Waters, C. K. (2018). Reducing student resistance to active learning: Strategies for instructors. Journal of College Science Teaching, 47(5), 80–91.

Lane, D. R. (2008). Teaching skills for facilitating team-based learning. New Directions for Teaching and Learning, 2008(116), 55–68.

Liu, S.-N. C., & Beaujean, A. A. (2017). The effectiveness of team-based learning on academic outcomes: A meta-analysis. Scholarship of Teaching and Learning in Psychology, 3(1), 1–14. 

Mahoney, J. W., & Harris-Reeves, B. (2019). The effects of collaborative testing on higher order thinking: Do the bright get brighter? Active Learning in Higher Education, 20(1), 25-37. 

McWilliams, K., & Bergin, J. (2020). Achieving student success: Using indicators of college readiness to measure the efficacy of Achieve. MacMillan Learning.

McWilliams, K., Bergin, J., Black, A., Baughman, M., & Runyon, B. (2020). Achieve more: The learning engineering of Achieve and insights into instructor implementations and instructor and student outcomes. MacMillan Learning.

Park, E. L., & Choi, B. K. (2014). Transformation of classroom spaces: traditional versus active learning classroom in colleges. Higher Education, 68(5), 749–771. 

Smith, C. V, & Cardaciotto, L. (2011). Is active learning like broccoli? Student perceptions of active learning in large lecture classes. Journal of the Scholarship of Teaching & Learning, 11(1), 53–61. 

Van Sickle, J. R. (2016). Discrepancies between student perception and achievement of learning outcomes in a flipped classroom. Journal of the Scholarship of Teaching and Learning, 16(2), 29–38. 

Friday 3 June 2022

confusing the teaching strategy with the professor and course content

At the Augustana Campus, a number of us have been using team-based learning (TBL) for years, and it is interesting how differently students and colleagues respond to its implementation. Most students, when confronted with TBL for the first time, are open to it but unsure. At midterm, many students are frustrated with the course but seem to confuse the difficulty of the course content with the teaching strategy, blaming the strategy rather than the difficulty of the content. By the time the course ends, most students appreciate the incremental and developmental nature of TBL, realizing that the daily/weekly requirement to attend to the course material makes studying for the final exam more efficient: TBL has structured their learning such that they are studying for the final exam throughout the course rather than cramming during the week before the exam.

But there is a very vocal minority who are frustrated with TBL as an instructional strategy and are convinced that their instructor has abandoned them to learn the material on their own. They miss the point that, ultimately, learning does occur on one's own, but that TBL structures class time so students can practice their learning, thereby revealing those areas that still need their studious attention. It can be heartbreaking to receive student evaluations of instruction at the end of the term that harshly denigrate the course, instructor, and instructional strategy after working hard to develop appropriate in-class assignments (Apps, or applications, in the language of TBL) for students to learn the course material and accompanying skills through practice under the guidance of both peers and instructor.

I think some of the frustration students experience with TBL is misplaced and should actually be directed at the course content itself. I use TBL to teach biochemistry, molecular cell biology, and first-year functional biology. Each of these courses was a challenge for students before I began implementing TBL. It is just that students' frustration with learning difficult course content has shifted from blaming the nature of the content to blaming the nature of the instructional strategy used to teach it.

Now, don't misinterpret what I am saying here. Most of my students learn to appreciate what TBL does for them. But the minority who passionately dislike TBL as a learning and teaching strategy are incredibly vocal about it, assuming that most students think as they do, when the data from my student evaluations of teaching clearly show this assumption to be false.

The other interesting response is how colleagues react to my use of TBL as an active learning strategy in my classrooms. Many are very interested, some are a little sceptical, and a very few are very annoyed that this teaching and learning strategy persists on our campus. My sense is that these annoyed colleagues take the vocal dislike of the passionate few students at face value and accept their opinion as the common judgement of TBL's inability to promote student learning outcomes. I find it interesting that colleagues who are rigorous about ensuring that the conclusions in their own research are based upon evidence end up making vocal judgements about a teaching strategy on their campus based on hearsay. And when introduced to the vast literature providing evidence of its efficacy, they dismiss the entire published body of evidence on the basis of a few poor studies.

Part of the issue of a few colleagues responding negatively to TBL being used on their campus is, I am sure, that those who first implemented it on our campus, followed by myself a few years later, were rather vocal about its efficacy, making it seem as if those who were not using TBL to teach their courses were somehow teaching with an inferior instructional strategy. There is nothing as infuriating as the zealousness of the recent and naive convert, and I confess to being a TBL zealot after I experienced TBL on the road to Damascus back in 2010.

In order to promote active learning on my campus in all of its marvellous and effective forms, I need to rebuild some bridges after unleashing the rhetoric of TBL. Teaching and learning is a wondrous activity to be engaged in. To be part of someone's learning journey and see the lightbulb come on when a concept, principle, or skill finally clicks into place and becomes integrated with their mental model of the world ... that is a wondrous thing to behold. And as instructors, when we experience our own "aha" moment upon realizing that a different instructional approach works for a student who was previously struggling to understand, that is our own lightbulb experience - one I am so grateful to have again, and again, and again.

Resources


Carmichael, J. (2009). Team-based learning enhances performance in introductory biology. Journal of College Science Teaching, 38(4), 54–61.

Cooper, K. M., Ashley, M., & Brownell, S. E. (2017). Using expectancy value theory as a framework to reduce student resistance to active learning: A proof of concept. Journal of Microbiology & Biology Education, 18(2).

Felder, R. M., & Brent, R. (1996). Navigating the bumpy road to student-centered instruction. College Teaching, 44(2), 43–47.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8410–5.

Huggins, C. M., & Stamatel, J. P. (2015). An exploratory study comparing the effectiveness of lecturing versus team-based learning. Teaching Sociology, 43(3), 227–235.

Mezeske, B. (2004). Shifting paradigms? Don’t forget to tell your students. The Teaching Professor, 18(7), 1.

Michael, J. (2006). Where’s the evidence that active learning works? Advances in Physiology Education, 30(4), 159–167.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.

Prince, M., & Weimer, M. (2017, November 2). Understanding student resistance to active learning.

Schwegler, A. F. (2013). From lessons learned the hard way to lessons learned the harder way. InSight: A Journal of Scholarly Teaching, 8, 26–31.

Seidel, S. B., & Tanner, K. D. (2013). “What if students revolt?”—Considering student resistance: Origins, options, and opportunities for investigation. CBE-Life Sciences Education, 12(4), 586–595.

Spence, L. (2004). “The professor made us do it ourselves.” The Teaching Professor, 18(4), 6.

The Team-Based Learning Collaborative.

Weimer, M. (2013). Responding to resistance. In Learner-centered teaching: Five key changes to practice (2nd ed., pp. 199–217). San Francisco, CA: Jossey-Bass, a Wiley imprint.

Weimer, M. (2014, September 10). “She didn’t teach. We had to learn it ourselves.” Faculty Focus - The Teaching Professor Blog.

Wieman, C. E. (2014). Large-scale comparison of science teaching methods sends clear message. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8319–20.

Thursday 2 June 2022

meeting student resistance to learning with resilience

The interview with Michael Ungar (Bethune, 2019) and Ungar's Globe and Mail article (2019) have made me rethink this issue of student resistance to active learning. It is interesting what our 2019 Faculty Learning Community stumbled across in our exploration of this issue: it is a chicken-and-egg problem, a cycle in which students need to be resilient to engage in active learning, yet active learning promotes the development of student resilience. However, we have been thinking of resilience as an internal issue, assuming it is a capacity or skill or attribute that can be developed within students. What Ungar has got me thinking about now is that the ability to be resilient is also a matter of social networks and institutional systems. Ungar's work has found that people are more resilient when support measures are in place for those under stress. These could be welfare policies and structures, or they could be a network of family and friends. Actually, it is not either-or; it is both. Resilience is fostered in people when they have a solid social network (friends and family) and when their community has structures and systems in place to ensure that people are taken care of in times of crisis. Ungar cites a number of these social structures, such as ensuring that people can access insurance and welfare benefits quickly when needed, or that systems exist to provide food and shelter to those who lose their housing. The idea that resilience is entirely an internal quality is false. I don't think, however, that it is a dualistic situation. There is an internal aspect of resilience, and some people have developed more of it than others. But what is interesting from Ungar's work is that, for most people, social systems seem to play a greater role in producing resilience than internal capability alone.

So what does this have to do with students' resistance to learning?

It has got me thinking that if we as educators wish to promote resilience in our students such that they are able to engage in active learning, then we also need to consider the classroom policies and course structures we have implemented. My question is: what sort of classroom policies and course structures will develop students' resilience? A good place to start, I think, is the AAC&U's high-impact practices. Which of them will develop social structures in our courses? First-year seminars make sense, I think. So do learning communities, collaborative assignments and projects, undergraduate research, and service-learning. Why these five of the 11? I think these five will promote students' resilience because they are all structured to develop relationships among learners, between students and teachers, and between learners and their community.

I also think that active learning itself, when properly structured, can promote resilience, because, based on Ungar's research, learning activities that promote social interaction among students will develop their resilience. I think this is why team-based learning (TBL) can be such a powerful active learning instructional strategy: the stable teams established at the beginning of the term develop into a learning community as a result of students working together on the two-stage tests and on the in-class applications of learning. Think-pair-share does the same thing by getting students to interact with each other; portions of the class become a transient community. Personal response systems (PRS), such as clickers, can do the same, provided there is sharing among students after their initial response. Used properly, PRSs end up functioning as a kind of two-stage exam that fosters student interaction.

However, active learning may or may not explicitly develop resilience, depending upon how it is implemented. If it is just think-pair-share, or personal response systems with peer discussion, it may or may not develop resilience in students, because that depends on whether students develop relationships with the peers they interact with. This will only occur if students interact consistently with the same students. In a class of 500, students may not interact with the same peers on an ongoing basis if they are always changing where they sit. If, on the other hand, students are creatures of habit (and I would argue that the vast majority are), then they will likely sit in the same seat and get to know their neighbours. The nice thing about TBL is that it explicitly facilitates this relationship-building within its educational system by forming stable teams at the beginning of a course.

I think this is key. If students have a learning community, they may be more willing to take risks in their learning and not be devastated by the occasional low-stakes failure. Learning will then be more robust, because active learning provides the interleaved retrieval practice that we know enhances learning.

Resources

Beri, N., & Kumar, D. (2018). Predictors of academic resilience among students: A meta analysis. I-Manager’s Journal on Educational Psychology, 11(4), 37.

Bethune, B. (2019). The real key to bouncing back. Maclean’s, 132(5.4), 1–5. (available online as When it comes to resilience, the self-help industry has it all wrong)

Fink, L. D. (2016). Five high-impact teaching practices: A list of possibilities. Collected Essays on Learning and Teaching, 9, 3–18.

Holdsworth, S., Turner, M., & Scott-Young, C. M. (2018). … Not drowning, waving. Resilience and university: a student perspective. Studies in Higher Education, 43(11), 1837–1853.

Kuh, G., O’Donnell, K., & Schneider, C. G. (2017). HIPs at ten. Change: The Magazine of Higher Learning, 49(5), 8–16. 

Lemelin, C., Gross, C. D., Bertholet, R., Gares, S., Hall, M., Henein, H., Kozlova, V., Spila, M., Villatoro, V., & Haave, N. (2021). Mitigating student resistance to active learning by constructing resilient classrooms. Bioscene: Journal of College Biology Teaching, 47(2), 3–9. 

Liu, S.-N. C., & Beaujean, A. A. (2017). The effectiveness of team-based learning on academic outcomes: A meta-analysis. Scholarship of Teaching and Learning in Psychology, 3(1), 1–14.

Swanson, E., McCulley, L. V., Osman, D. J., Scammacca Lewis, N., & Solis, M. (2019). The effect of team-based learning on content knowledge: A meta-analysis. Active Learning in Higher Education, 20(1), 39–50.

Ungar, M., & Liebenberg, L. (2013). Ethnocultural factors, resilience, and school engagement. School Psychology International, 34(5), 514–526.

Ungar, M. (2019, May 25). Resilience: Our ability to bounce back depends more on what’s around us than what’s within us. The Globe and Mail.