Saturday 18 February 2017

learning is impacted by how we mentally organize our knowledge

Last week our SoTL Teaching Circle met to discuss the second chapter of How Learning Works, which considers how the way students organize their knowledge impacts their learning. Basically, the chapter contrasts how experts organize their knowledge with how novices do. The bottom line is that expertise is accompanied by a dense network of mental connections among nodes of knowledge. In contrast, novices at best have a linear chain linking their different points of knowing. Most students, however, hold islands of knowledge, with no connections between their different courses even when those courses are within the same major.
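
As a purely illustrative aside (my own toy model, not from the chapter), the contrast can be made concrete by treating knowledge as a graph: concepts are nodes and a learner's mental connections are edges. The novice's islands then show up as disconnected components, while expertise shows up as density of connections. A Python sketch with five hypothetical concepts:

```python
from collections import deque

def connected_components(graph):
    """Count the 'islands' in a knowledge graph (adjacency dict of sets)."""
    seen, components = set(), 0
    for start in graph:
        if start in seen:
            continue
        components += 1
        queue = deque([start])
        while queue:
            node = queue.popleft()
            if node in seen:
                continue
            seen.add(node)
            queue.extend(graph[node])
    return components

def density(graph):
    """Fraction of possible pairwise connections actually made."""
    n = len(graph)
    edges = sum(len(nbrs) for nbrs in graph.values()) / 2
    return edges / (n * (n - 1) / 2)

# Novice: isolated islands -- courses that never touch each other.
novice = {"A": {"B"}, "B": {"A"}, "C": set(), "D": {"E"}, "E": {"D"}}

# Expert: a richly cross-linked network of the same five concepts.
expert = {"A": {"B", "C", "E"}, "B": {"A", "C", "D"},
          "C": {"A", "B", "D", "E"}, "D": {"B", "C", "E"},
          "E": {"A", "C", "D"}}

for label, graph in [("novice", novice), ("expert", expert)]:
    print(label, "islands:", connected_components(graph),
          "density:", round(density(graph), 2))
# novice islands: 3 density: 0.2
# expert islands: 1 density: 0.8
```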

A few years ago Kimberly Tanner was the keynote speaker for a series of workshops at the UofA for the AIBA annual conference. The title of the conference was Mind the Gap, which was meant to highlight the difference between thinking like an expert and thinking like a novice. She explained that one of the things that makes it difficult for experts to teach novices is that much of our expertise is unarticulated even to ourselves. Experts (e.g. holders of PhDs) are unaware of the way they organize their knowledge that makes them experts. This makes it difficult to help novices transition to expert thinking, because experts do not know what novices need to change in order to become experts. I know I have this difficulty when teaching many of my courses. Something that is obvious to me, and thus seemingly not worth mentioning, turns out to be critical for students to hear in order to progress in the discipline. This is particularly true for those of us who suffer from academic fraud syndrome: that nagging thought that really, I am not that smart, and someone is going to realize their mistake and revoke my PhD. Thus, university and college instructors may tend to keep some aspects of their expert thinking to themselves, because articulating it might reveal that what the expert thinks is worth teaching is actually common knowledge and inappropriate for discussion in the classrooms of higher education.

But as we tell many of our students: if you have a question, it is likely that many others in the classroom have the same question. This is what makes teaching difficult - having the courage to be intellectually humble in front of both our peers and our students.

On the other hand, I think the work being done to identify threshold concepts in different disciplines is a good step toward understanding those key points that we ourselves grasped on our way to developing expertise. As instructors in higher education, we need to understand what the stumbling blocks were for us and our colleagues when we were developing our expertise. Once they are identified, we can ensure that our own students know where to concentrate their attention in order to understand the depth and breadth of the discipline. And I think this can be readily facilitated by helping students make links within their own knowledge structure so that their mental models of the world become robust.

This is one of the reasons why I advocate for students to develop an e-portfolio: it provides a platform for them to reflect on their education in a way that cuts across disciplinary boundaries and even the boundaries between the courses within their major. Students need to understand that knowledge is a whole rather than a series of separate islands. We want our students to understand the world, not just what is currently in front of their noses.

Resources

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How does the way students organize knowledge affect their learning? In How learning works: Seven research-based principles for smart teaching (pp. 40–65). San Francisco, CA: Jossey-Bass.

Haave, N. (2016). E-portfolios rescue biology students from a poorer final exam result: Promoting student metacognition. Bioscene: Journal of College Biology Teaching, 42(1), 8–15.

Krieter, F. E., Julius, R. W., Tanner, K. D., Bush, S. D., & Scott, G. E. (2016). Thinking like a chemist: Development of a chemistry card-sorting task to probe conceptual expertise. Journal of Chemical Education, 93(5), 811–820.

Loertscher, J., Green, D., Lewis, J. E., Lin, S., & Minderhout, V. (2014). Identification of threshold concepts for biochemistry. CBE-Life Sciences Education, 13(3), 516–528.

Smith, J. I., Combs, E. D., Nagami, P. H., Alto, V. M., Goh, H. G., Gourdet, M. A. A., … Tanner, K. D. (2013). Development of the biology card sorting task to measure conceptual expertise in biology. CBE-Life Sciences Education, 12(4), 628–644.

Tuesday 7 February 2017

what students don't know can hurt them

This term, four of my colleagues and I have formed a teaching circle to discuss SoTL. We are interested in understanding how to engage in SoTL research and how to use the SoTL literature to improve our own teaching praxis. For this term (Winter 2017) we decided to work through How Learning Works by Ambrose et al (2010). Last week we met to discuss the first chapter, which considers how students' prior knowledge can affect their learning. The chapter makes a clear distinction between declarative and procedural knowledge, but we noted that there are other types of knowledge, such as knowledge of application and context: knowing when, or in which context, to apply one's declarative or procedural knowledge. However, at times in this first chapter it seemed that Ambrose et al were implying that procedural knowledge encompasses contextual knowledge. I think that greyness in my own understanding is apparent below.

People who can do but not explain how or why have procedural but not declarative knowledge, and they run the risk of being unable to apply a procedure in a new context or to explain to someone else how to do the task. Many instructors are like this about their teaching: able to teach but not to articulate why their teaching is effective, nor to teach as effectively in another context. Conversely, students may know facts (grammar, structures, species' names) but not know how to solve problems with that knowledge. These students have declarative but not procedural knowledge. This is what I am trying to move my second-year molecular cell biology students toward - being able to use their molecular cell biology knowledge to solve problems. The issue I have is that I do not know how to teach procedural knowledge effectively other than to have students practice and to model problem-solving myself.

Ambrose et al note that if students' prior knowledge is fragmentary or incorrect, it will interfere with their subsequent learning. In addition, even if their prior knowledge is sound, students are not always able to activate it in order to integrate it with new learning. Instructors need to be able to assess whether students' prior knowledge is sound and active for effective learning to occur. Ambrose et al also note that students need to activate knowledge appropriate to the learning context, and this is not always the case. Thus, instructors need to monitor the appropriateness of students' knowledge and make clear the appropriate connections for the context. Students need to learn contextual knowledge. For procedural knowledge, I think this simply requires numerous examples and opportunities for practice.

This first chapter suggests that a good way to correct inaccurate knowledge is to give students an example or problem that exposes misconceptions and sets up cognitive dissonance in students' thinking. Ambrose et al suggest using available concept inventories to probe students' inaccurate knowledge. This is what my physics colleague, Ian Blokland, is doing in his classes using iClickers, and what I am attempting to do with 4S apps in my classes, which are taught using team-based learning. This is a great idea, but it is time-consuming work to produce plausible wrong answers/distractors. I have found that most textbook testbanks do not do this well.

Something the authors suggest, and that I attempt to do in my courses, is to help students make connections in their learning: to material from earlier in the same course, from previous prerequisite courses, and from supporting courses I know they are taking at the same time. Students have commented on end-of-term evaluations of teaching that they appreciate this. In the language of Ambrose et al, I am activating students' prior knowledge and signalling to them that this is an appropriate context in which to integrate that prior knowledge. Students are not always capable of doing this themselves.

Something else this first chapter suggests for activating prior knowledge, and that I attempt to do in my courses, is to have students consider their own everyday context for how things work. The classic example I often use is that the increased frequency of washroom trips after drinking alcohol is a direct result of alcohol inhibiting the release of ADH from the posterior pituitary. I consciously try to offer everyday examples of the applicability of students' new knowledge.

One example of being explicit about context is the style of writing expected. In the biological sciences, concise, clear writing is necessary, as opposed to the more narrative style possible in English - although a good science paper also tells a good story.

Brian Rempel, my organic chemistry colleague, highlighted a paper (Hawker et al 2016) at our teaching circle which investigated the Dunning-Kruger effect in a first-year general chemistry class. Generally speaking, students are poor at assessing, once an exam has been written, how well they performed on it. As others have shown, better-performing students are generally better at assessing their performance. I think this is a case of "you don't know what you don't know": if students do not know the material, then they are unable to assess whether or not they knew the answers on the exam - they thought they knew!

In the context of the first chapter of How Learning Works, it seems to me that students' prior knowledge impacts their ability to assess their performance on their exam. Is there a link? I guess the point in this chapter is that students do not always know what they don't know and this impacts their ability to integrate new knowledge and to assess how to apply that knowledge. 

What is interesting in the Hawker et al (2016) study is that there is a significant improvement in postdiction accuracy between the first and second exams but not between subsequent exams (in this study there were five, with the fifth being the comprehensive final). The authors suggest that the first exam is an abrupt corrective to students' expectations of what a university exam demands (this is a first-term general chem course). Thus it may be that the effect is not specific to chemistry but is simply a result of students' transition to university. However, their analysis of first-time chemistry students in the second semester found the same significant difference between the first two exams for university chemistry neophytes, but not for students who had already completed the first chem course. So there may be something about general chem in particular, which is what prompted the authors to study this in the first place: there is some suggestion in the SoTL literature that chemistry is different in terms of students' ability to monitor their performance, perhaps because of the difficult nature of the subject. Other STEM disciplines have reported similar results (Ainscough et al 2016; Lindsey & Nagel 2015).
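
For readers unfamiliar with the measure, postdiction accuracy boils down to the gap between the score a student estimates immediately after writing an exam and the score actually earned. A minimal sketch (the numbers here are hypothetical, not taken from Hawker et al) of the kind of first-to-second-exam improvement the study reports:

```python
def postdiction_errors(postdicted, actual):
    """Signed errors: positive = overconfidence, negative = underconfidence."""
    return [p - a for p, a in zip(postdicted, actual)]

def mean_absolute_error(postdicted, actual):
    """Average size of the miss, ignoring direction."""
    errors = postdiction_errors(postdicted, actual)
    return sum(abs(e) for e in errors) / len(errors)

# Hypothetical class of four students, scores out of 100.
exam1_postdicted, exam1_actual = [85, 70, 90, 60], [62, 55, 88, 40]
exam2_postdicted, exam2_actual = [70, 60, 90, 50], [65, 57, 89, 45]

print("Exam 1 MAE:", mean_absolute_error(exam1_postdicted, exam1_actual))  # 15.0
print("Exam 2 MAE:", mean_absolute_error(exam2_postdicted, exam2_actual))  # 3.5
```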

Resources

Ainscough, L., Foulis, E., Colthorpe, K., Zimbardi, K., Robertson-Dean, M., Chunduri, P., & Lluka, L. (2016). Changes in biology self-efficacy during a first-year university course. CBE-Life Sciences Education, 15(2), ar19.

Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How does students' prior knowledge affect their learning? In How learning works: Seven research-based principles for smart teaching (pp. 10–39). San Francisco, CA: Jossey-Bass.

Hawker, M. J., Dysleski, L., & Rickey, D. (2016). Investigating general chemistry students’ metacognitive monitoring of their exam performance by measuring postdiction accuracies over time. Journal of Chemical Education, 93(5), 832–840.

Lindsey, B. A., & Nagel, M. L. (2015). Do students know what they know? Exploring the accuracy of students' self-assessments. Physical Review Special Topics - Physics Education Research, 11(2), 020103.