Ten years ago (2009), we proposed that we could “spice up” a traditional graduate course on the “Principles of Molecular Biology” by introducing student-delivered experimental-design chalk talks. This form of assessment challenges students to synthesize material presented in class, namely content on biochemistry and molecular biology as well as classical and contemporary techniques used in these fields, to devise and present a solution to an open-ended question in the field. The value of the assignment lies in giving students practice with experimental design and with a form of science communication many have not yet encountered: presenting without pre-prepared visual aids. In this paper, we share an example of an open-ended experimental-design question relevant to one module of our course (specifically “DNA Repair”); two example student presentations, labeled “good” and “great” because they represent the typical types of presentations we observe; an annotated guide to these videos that highlights the level of detail that typically separates a “good” presentation from a “great” one; and, perhaps most importantly, our grading rubric. The rubric was refined over several years, to the point that we have used it unchanged for the past three years, and it standardizes the student experience, streamlines grading, and provides a framework for structured feedback.
Figure 1: A graduate student presenting an experimental-design chalk talk. (Photo credit: Kate Harding and Madhvi Venkatesh.)
Since its inception, the exercise has gone through a series of enhancements and is now fully integrated into our course, where it has replaced traditional “journal clubs” as the core activity of our student-centered discussion sections. Much of this improvement was driven by three consecutive Curriculum Fellows (CFs) - Johanna Gutlerner, me, and Madhvi Venkatesh - in collaboration with the faculty Course Directors, most recently Joe Loparo. Curriculum Fellows are life sciences PhD-trained scientists who come to Harvard Medical School (HMS) for an immersive training experience directed toward careers in teaching and higher-education administration. The Course Directors' buy-in and support were key to implementing, refining, and sustaining these assessments; most importantly, they were receptive to opening their graduate classrooms to experimentation in education.
Since the introduction of this exercise, more than 700 students have taken our course, more than 50 graduate students have served as section leaders, and more than 20 faculty have been involved as lecturers or section leaders. We were motivated to put this paper together by the feedback we received from colleagues in professional societies and after presentations at conferences. We were equally motivated to share these resources in light of new requirements from the National Institute of General Medical Sciences (NIGMS), which call for increased emphasis not only on skills training at the graduate level but also on evaluation of educational outcomes. Taken together, these calls recommend sharing evidence of impact and tools for best practices. A Careers and Recruitment article in Nature Biotechnology was an ideal place to share our tools not only with others interested in graduate education and its interface with career preparation, but also with scientist-teachers who may be motivated to incorporate this exercise into their traditional bioscience classrooms.
The formalization of our rubric and the collective set of improvements to our course have laid the foundation for a number of research questions. First, how do students perform on this assessment format, and how does their performance change over the course of a semester? Are there categories on our rubric on which students, in general, perform better or worse? How do students think this form of assessment advances their learning in experimental design and science communication? Are they right? Are real learning gains correlated with increased self-confidence (or self-efficacy) in these areas? Does our anecdotal evidence that students enjoy the exercise track with metrics from course evaluations? These questions form the basis of our ongoing efforts.
This article was recently published in Nature Biotechnology: Heustis, R.J., Venkatesh, M.J., Gutlerner, J.L. et al. Embedding academic and professional skills training with experimental-design chalk talks. Nat Biotechnol 37, 1523–1527 (2019). https://doi.org/10.1038/s41587-019-0338-1