Graduate- and undergraduate-student perceptions of and preferences for teaching practices in STEM classrooms

Abstract

Despite positive evidence for active learning (AL), lecturing dominates science, technology, engineering, and mathematics (STEM) higher education. Though instructors acknowledge AL to be valuable, many resist implementing AL techniques, citing an array of barriers including a perceived lack of student buy-in. However, few studies have explored student perceptions of specific AL teaching practices, particularly the perceptions of graduate students. We explored student-reported instructional strategies and student perceptions of and preferences for a variety of teaching practices in graduate and undergraduate classrooms across three STEM colleges at a large, public, research university. We found that both graduate and undergraduate students desired more time for AL and wanted less lecturing than they were currently experiencing. However, there was no single universally desired or undesired teaching practice, suggesting that a variety of AL teaching practices should be employed in both graduate and undergraduate courses.

Introduction

Science, technology, engineering, and math (STEM) higher education is undergoing rapid change, driven by an increase in the number and diversity of students, digitalization and globalization, and shifting demands from policymakers and society at large (Brewer & Smith, 2011; Graham et al., 2013; Olson & Riordan, 2012; Shin & Harman, 2009). Simultaneously, research into evidence-based pedagogy has revealed that traditional, lecture-based teaching is not only ineffective overall, but disproportionately disadvantages women, first-generation students, and students from underrepresented minority groups (Ballen et al., 2017; Haak et al., 2008; Theobald et al., 2020). As a result, instructors are encouraged to teach using evidence-based approaches that increase student motivation, collaboration, and metacognition, all of which influence students’ learning and course performance in STEM (Council, 2003; Glynn et al., 2011; Tanner, 2013). These challenges and expectations directly impact instructors, who may lack the time, funds, and extrinsic motivators to think deeply and scientifically about teaching (Gormally et al., 2016; Miller & Metz, 2014; Patrick et al., 2016).

Evidence-based teaching is an umbrella term that includes active learning and other teaching practices shown to positively impact student learning (Felder et al., 2000; Owens et al., 2017). Active learning is itself a catch-all phrase, derived from constructivism, a learning theory that proposes that students learn by constructing their own knowledge (Freeman et al., 2014). Within a constructivist framework, learning is an active process and builds on experience, instruction, and the foundations of prior knowledge (Bransford & Schwartz, 1999; Prince, 2004). In practice, active learning can include small-group discussion (Tanner, 2013), classroom-response systems (e.g., “clickers”; Cotner et al., 2008), one-minute papers and worksheets completed individually or in groups, and collaborative group work (e.g., via case studies, problem-based learning, or process-oriented guided inquiry learning [POGIL]; Eberlein et al., 2008). Through engagement in active learning practices, students remain an integral part of the learning process by building meaning and constructing knowledge (Prince, 2004).

Despite the evidence in support of active learning (Freeman et al., 2014), lecturing remains a pervasive feature of STEM teaching (Akiha et al., 2018; Stains et al., 2018). Some faculty may choose to lecture because they are not convinced that active learning is effective (Michael, 2007; Silverthorn et al., 2006). Other instructors value active learning (Patrick et al., 2016), but refrain from integrating active learning techniques, citing an array of barriers including the time needed to prepare “activities” (Brownell & Tanner, 2012), lack of training in effective teaching techniques, lack of time for content coverage (or, the loss of lecture time), perceived lack of student buy-in (Cavanagh et al., 2016; Deslauriers et al., 2019; Owens et al., 2017), or the concern that their classes are prohibitively large for active learning (Patrick et al., 2016; Silverthorn et al., 2006). In this work, we focus on one of these perceived barriers—a lack of student buy-in to active learning pedagogies.

A significant predictor of active learning implementation and of engagement with teaching practices is faculty and student buy-in (Cavanagh et al., 2016; Madson et al., 2017). Faculty often fear that student resistance to active learning (Seidel & Tanner, 2013; Silverthorn et al., 2006) will harm the classroom environment and their evaluations by students (Henderson et al., 2018). Although these evaluations are flawed (Carpenter et al., 2020; Stroebe, 2020; Uttl et al., 2017; Wang & Williamson, 2020), they remain consequential for merit pay, promotion, and tenure. We also recognize that student preferences do not always mirror the practices that yield the greatest learning gains (Deslauriers et al., 2019), a finding echoed by work at our own institution (Cotner et al., 2008; Walker et al., 2008). Perceived student resistance can be lowered with evidence, and previous studies have examined student perceptions of active learning in individual undergraduate (Bransford & Schwartz, 1999; Brazeal et al., 2016; Brigati, 2018; Brown et al., 2017; Cavanagh et al., 2016, 2018; Cooper et al., 2017; England et al., 2017; Machemer & Crawford, 2007; Mcmillan et al., 2018; Owens et al., 2017; Patrick et al., 2016; Smith & Cardaciotto, 2011) or graduate (Jones et al., 2010; Lopez & Gross, 2008; Miller & Metz, 2014; Tune et al., 2013) courses. Although the opinions of individual students may differ, as a whole, both graduate and undergraduate students in individual STEM courses reported neutral to very positive perceptions of and preferences for active learning teaching practices (Patrick, 2020). These studies provide valuable insight into how students view these teaching practices. However, they examined specific courses in which active learning implementation was controlled by or known to the researchers; it therefore remains unknown whether these findings represent student perceptions and preferences in a broader context.

Few studies have examined student perceptions of and preferences for teaching practices across STEM disciplines, where active learning remains limited and student resistance is often a perceived barrier (Patrick et al., 2016; Patrick et al., 2018). Patrick et al. (2016) and Patrick et al. (2018) examined student perceptions of active learning and other teaching practices across the science departments of a large research-intensive university in the southeastern United States. Using a modified survey (Miller & Metz, 2014), these studies asked students to estimate the amount of their science class time devoted to active learning and the amount of time they thought should be dedicated to active learning. The studies also prompted students to rank six broad categories of teaching practices by effectiveness for their learning. Compared to graduate students, undergraduates reported less active learning in their classes. Nevertheless, both groups wanted more active learning than they currently experienced (Patrick, 2020; Patrick et al., 2016, 2018). Because these studies used broad categories of teaching practices, they cannot speak to student perceptions of and preferences for specific teaching practices such as think-pair-shares (Kaddoura, 2013). Yet student attitudes toward particular teaching practices could be leveraged to increase faculty willingness to adopt such activities. Gauging the perceptions of different student populations is also essential for learning how active learning and other teaching practices generalize across contexts (Patrick, 2020).

As noted above, similar work in other higher-education contexts has not explored student attitudes toward particular teaching practices. To address this gap in knowledge, we surveyed students in three STEM-focused colleges at one large university in the midwestern United States to determine their experiences and perceptions of specific teaching practices. Specifically, we compared student perceptions of the teaching strategies employed in undergraduate- and graduate-level courses to detect whether students valued different pedagogies at different stages of their education (i.e., undergraduate or graduate). We also compared the alignment between experienced and desired teaching practices at both levels. Our results can help faculty and other stakeholders better understand instructional choices and student preferences throughout the STEM curriculum.

The main questions guiding this work were:

  • How do undergraduate and graduate students perceive the teaching practices in their curricula? Specifically, which teaching practices do undergraduate and graduate students experience and which do they prefer?

  • Are there notable differences in undergraduate and graduate perceptions regarding the implementation of active learning in their courses?

Theoretical framework

Active learning is a broad group of teaching practices informed by constructivism and socio-constructivism (a variant of constructivism), which view learning as an active process (Dewey, 1966). In a constructivist approach, learners internally build knowledge structures from experience, instruction, and the foundations of prior knowledge (Bransford & Schwartz, 1999; Prince, 2004). As such, teaching practices informed by constructivism require learning to begin from a student’s prior knowledge. For instance, in an undergraduate learning space, instructors can use pre-lecture questions or clicker questions to investigate students’ prior knowledge of the topic at hand. Once prior knowledge is known, it serves as the foundation for designing instruction from which students begin their learning process (Handelsman et al., 2004). Building on Dewey’s work, a constructivist approach, including active learning, rejects the notion that students are “empty vessels” needing teachers to fill them with knowledge. Instead, students actively engage in their own learning process (Bransford & Schwartz, 1999). By definition, active learning comprises “instructional activities involving students in doing things and thinking about what they are doing” (Bonwell & Eison, 1991). At one end of the spectrum, active learning can be as simple as pausing lecture so that students can clarify and organize their ideas through discussion with a neighbor. At the other end are more complex activities, such as using case studies as a focal point for decision making (Brame, 2016).

In the higher education context, “active learning” is a term encompassing a diverse assortment of teaching practices in which students engage actively with the course content, instructor, and each other using various activities. Additionally, active learning practices are characterized by students involved in solving problems, reading, writing, and discussing (Prince, 2004). Overall, such methods have a greater emphasis on students’ explorations of their attitudes and values than traditional ways of teaching (Bonwell & Eison, 1991). Specifically, some teaching practices with these attributes include group discussions, clicker questions, debates, and projects (Miller & Metz, 2014). These practices aim to involve students in higher-order thinking tasks, which lead to knowledge construction.

Materials and methods

Institution and study participants

Our institution is a large land-grant university in the Midwest, serving 32,000 undergraduate and 16,000 graduate students. There are three central STEM Colleges: the College of Biological Sciences (CBS), the College of Science and Engineering (CSE), and the College of Food, Agriculture, and Natural Resource Sciences (CFANS). There are 27 departments in these Colleges, with an aggregate enrollment of 12,096 students at the time of survey distribution. We were interested in comparing STEM undergraduate and graduate student experiences with active learning. Accordingly, any student enrolled in a degree program in any of the three STEM Colleges was in our target population.

Survey instrument

Because we were interested in student perceptions of and preferences for specific teaching practices and active learning in general, we combined items from several existing survey instruments (DeMonbrun et al., 2017; Miller & Metz, 2014; Patrick et al., 2016). We asked all students to identify the largest STEM course they had taken in the preceding semester and to estimate the number of students enrolled. We asked this to encourage students to think about a single large-enrollment course and to make the response sets more similar in the types of courses evaluated. For undergraduates, such large courses are often considered “gateways” to STEM majors and have received considerable attention from discipline-based education researchers (Barr et al., 2008; Witherspoon et al., 2019; Xie et al., 2015). For graduate students, a question about their largest course is likely to prevent them from considering a seminar or dissertation-credit course in their responses. For the identified course, we asked students how often the course instructor used specific teaching practices (Table 1). Most of the teaching practices included in the instrument were taken directly from the Student Response to Instructional Practices (StRIP) instrument (DeMonbrun et al., 2017). We modified two practices for clarity and added three teaching practices known to be used at our institution (Table 1). For each teaching practice, students responded to the prompt: “Please indicate how often each activity was done in the largest science course you took this semester.” Frequency options were: Never or almost never (0–10% of the time) (scored as 1); Seldom (11–30% of the time) (scored as 2); Sometimes (31–50% of the time) (scored as 3); Often (51–70% of the time) (scored as 4); Very often (71–100% of the time) (scored as 5). These options differed from those in the original StRIP to better align with our study’s question about how much class time students think is, and should be, devoted to active learning.
For each teaching practice, students were asked, “How often would you like to do each activity in an ideal course you would take as a student?” Response options were: Much less (scored as 1); Slightly less (scored as 2); About the same (scored as 3); Slightly more (scored as 4); Much more (scored as 5).
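The two response scales above can be summarized as simple ordinal score mappings. The sketch below is our own illustration of how such responses might be scored for analysis (the dictionary and function names are hypothetical, not taken from the authors' analysis code):

```python
# Hypothetical illustration of mapping raw survey responses to the
# 1-5 ordinal scores described in the text.

FREQUENCY_SCORES = {
    "Never or almost never": 1,  # 0-10% of the time
    "Seldom": 2,                 # 11-30%
    "Sometimes": 3,              # 31-50%
    "Often": 4,                  # 51-70%
    "Very often": 5,             # 71-100%
}

DESIRE_SCORES = {
    "Much less": 1,
    "Slightly less": 2,
    "About the same": 3,
    "Slightly more": 4,
    "Much more": 5,
}

def mean_score(responses, scale):
    """Average ordinal score for a list of raw survey responses."""
    scores = [scale[r] for r in responses]
    return sum(scores) / len(scores)
```

For example, a practice rated "Often" by one student and "Sometimes" by another would average a score of 3.5 on the frequency scale.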

Table 1 Teaching practices included in the survey instrument and their source. Students were asked how often each teaching practice occurred in their largest STEM course and how often they would like each teaching practice to occur

We also provided the students with a definition of active learning (Miller & Metz, 2014). After reading the definition, students estimated the percentage of class time typically devoted to active learning and how much time they think should be devoted via an open-ended response. Students also reflected on their experiences with active learning: “Please describe your experiences with active learning in the classroom.” Finally, we asked students to report their status (graduate or undergraduate).

Our study design and survey instrument were approved by the University of Minnesota’s IRB (approval #STUDY00002261).

Survey distribution

During the Spring 2018 semester, we contacted college administrators to obtain lists of current graduate and undergraduate students enrolled in the STEM colleges. We used the Qualtrics platform to distribute the survey and collect responses. We offered the first 100 respondents a $5 coffee card as an incentive to complete the survey. The survey was open for a total of 2 weeks. Two reminders were sent to students who had not yet completed the survey: 1 week and 1 day prior to the close of the survey.

Data analysis-quantitative

Responses were downloaded to Microsoft Excel from Qualtrics and de-identified by a researcher not otherwise affiliated with this project. We removed incomplete de-identified responses and those reported by individuals under the age of 18. Graduate and undergraduate student responses were analyzed separately. Significant differences between graduate and undergraduate students for activities desired and experienced were determined using the Mann-Whitney U test, which is appropriate for ordinal data that deviate from a normal distribution (MacFarland et al., 2016). All data were analyzed using SigmaPlot 14.0, the ggplot2 package for R, and RAWGraphs (Mauri et al., 2017; R Core Team, 2013; Wickham, 2016).
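The Mann-Whitney U comparison used above can be illustrated with a minimal from-scratch sketch. This is not the authors' analysis code (they used SigmaPlot and R); it is a pure-Python version, written here for illustration, that computes the U statistic with tie-corrected average ranks:

```python
def ranks_with_ties(values):
    """Assign 1-based average ranks to a combined sample, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        # Extend j over the run of tied values starting at position i.
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def mann_whitney_u(sample_a, sample_b):
    """Return the Mann-Whitney U statistic (smaller of U1 and U2)."""
    combined = list(sample_a) + list(sample_b)
    ranks = ranks_with_ties(combined)
    n1, n2 = len(sample_a), len(sample_b)
    r1 = sum(ranks[:n1])            # rank sum of sample A
    u1 = r1 - n1 * (n1 + 1) / 2
    u2 = n1 * n2 - u1
    return min(u1, u2)
```

With completely separated samples such as `[1, 2, 3]` versus `[4, 5, 6]`, the statistic reaches its extreme value of 0; in practice one would also compute a p-value, e.g. via a normal approximation or an exact test.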

Data analysis-qualitative

Seven hundred sixty-eight students responded to the open-ended prompt, “Please describe your experiences with active learning in the classroom.” Two coders, trained in in-vivo coding (Saldaña, 2009), read student responses without knowing whether a graduate- or undergraduate-level student wrote each response. In an initial meeting, the coders identified consensus categories; afterward, they identified the following emergent categories (along with associated sub-categories) from the student responses: Positive About Active Learning (active learning a) makes the class engaging, b) is beneficial, c) helps with content retention, d) builds community, and e) prepares for real-world work environments); Negative About Active Learning (active learning a) limits individual thinking and learning, b) professor does not implement active learning approach correctly, c) should only be used in particular fields, and d) current active-learning methods are a “waste of time”); and Constructive (active learning a) works when people are prepared to collaborate, b) only works in smaller classes, c) effective active learning is desirable, and d) works well when supplemented with other methods). Once a codebook of sub-categories was generated, each coder coded the same randomly selected 10% of the comments to establish interrater reliability. Cohen’s Kappa (κ) is a robust statistical approach for testing reliability while accounting for chance agreement between two raters. The raters achieved κ = 0.87, considered strong agreement on the Kappa scale (Cohen, 1960). Once reliability was established, we divided the remaining responses between the two coders. When the coding was completed, the sub-categories were tallied, and the responses were decoded to allow comparison between graduate and undergraduate student responses.
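Cohen's Kappa corrects raw percent agreement for the agreement two raters would reach by chance. A minimal sketch of the computation (our own illustration, not the authors' code) is:

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two raters' categorical codes (equal-length lists)."""
    n = len(codes_a)
    # Observed agreement: fraction of items both raters coded identically.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance agreement: probability both raters independently pick the
    # same category, estimated from each rater's marginal frequencies.
    freq_a = Counter(codes_a)
    freq_b = Counter(codes_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Perfect agreement yields κ = 1, while agreement no better than chance yields κ = 0, which is why a value of 0.87 is considered strong.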

Results and discussion

Participant attributes

In total, 1274 undergraduate (n = 1113) and graduate (n = 161) students completed the survey (Table 2). Nearly equal numbers of first-year, second-year, third-year, and senior undergraduates responded, which together greatly outnumbered the graduate respondents. Respondents who identified as female outnumbered respondents who identified as male. The mean class sizes reported by graduate and undergraduate students were 39 and 178 students, respectively, for their largest science course.

Table 2 Student Attributes

Teaching and learning practices experienced by students

Of the 23 teaching practices included in our survey, all students identified instructor lecturing (Q1) as the most common teaching practice in their courses, which occurred, on average, very often (Fig. 1a). This finding is consistent with other studies, which demonstrate that STEM classrooms are dominated by teacher-centered pedagogy with lecturing as the primary mode of instruction (Akiha et al., 2018; Stains et al., 2018). Graduate and undergraduate students also identified four other practices that, on average, happened often in their largest STEM courses: assuming responsibility for learning the material (Q8), getting homework information from the instructor (Q10), taking the initiative in deciding what is necessary to know (Q20), and watching the instructor demonstrate how to solve problems (Q21; Fig. 1a). These results suggest that all students experienced teaching and learning practices that were dominated by direct faculty-to-student, teacher-centered instruction.

Fig. 1
figure1

Box and whisker plots of the most and least common instructional practices reported by graduate (n = 161) and undergraduate (n = 1113) students in STEM classrooms. a Most frequently experienced activities, and b Least frequently experienced activities. Triangles indicate mean values. *indicates p-values < 0.05. Activities displayed are the five most and least frequently experienced. A reported sixth activity in panel b reflects the ranking misalignment between graduate and undergraduate students

The teaching practices experienced by students least often were all active learning techniques, but these differed between undergraduate and graduate students. Undergraduates identified individual student presentations (Q5) as the least performed instructional practice (Fig. 1b). This finding is not surprising given the time constraints of large-enrollment courses. Teaching activities that entail direct feedback from instructors were also infrequently experienced. For example, students seldom experienced answering questions using student response systems—either directly or following a consultation with a classmate (Q17 and Q18; Fig. 1b). However, studies have demonstrated that students using student response systems in large classrooms are more engaged than those who do not use clickers (e.g., Mayer et al., 2008). Moreover, students retain more exam material from units covered in lessons that incorporate clicker activities (Crossgrove & Curran, 2008). Hands-on group activities and answering questions posed by the instructor were also uncommon practices in undergraduate courses (Q23 and Q15; Fig. 1b).

Graduate students reported the use of student response systems (i.e., Top Hat, clickers), either in isolation or following consultation with a classmate (think-pair-share), as the least common activities experienced in their courses (Q17 and Q18; Fig. 1b). We suspect that graduate-level instructors may be less inclined to use a classroom response system given the smaller class sizes of such courses. Giving individual presentations (Q5), answering questions verbally in the classroom (Q14), and solving problems that have more than one correct answer (Q22) were also infrequent in graduate courses (Fig. 1b).

Our findings for how often all 23 teaching practices were experienced by students in our sample can be found in the lower panel of Fig. S1.

Teaching practices preferred by students

Undergraduate and graduate students desire significantly more class time for active learning pedagogies than they are experiencing (Fig. 2). On average, undergraduate students reported 31% of class time was currently being devoted to active learning and that 36% of class time should be devoted to active learning. Graduate students reported that significantly less class time, 25%, was currently devoted to active learning and that 36% of class time should be devoted to active learning (Fig. 2). These findings suggest that both groups of students want more active learning in their classrooms than currently experienced and desire a similar amount (~ 36% of class time) dedicated to active learning. Overall, most students have positive perceptions of active learning and perceive the benefits and/or utility of these practices. However, both student populations still valued listening to lecture (Fig. 4).

Fig. 2
figure2

Box and whisker plots of the percent of class time graduate and undergraduate students think is currently (“Current”) and should be (“Best”) devoted to active learning. Students were provided a definition of active learning (Miller & Metz, 2014) and via an open-ended response, asked to estimate the amount of class time typically devoted to active learning and how much time they think should be devoted to active learning. Triangles indicate mean values. *indicates p-values < 0.05

Specifically, undergraduate students preferred instructional practices that involve peer-assisted learning and direct feedback from instructors (Fig. 3a). For undergraduate students, the top five most desired teaching practices were watching the instructor demonstrate how to solve problems (Q21), getting homework help directly from the instructor (Q10), brainstorming different solutions (Q2), studying course content with classmates outside of class (Q7), and asking the instructor questions during class (Q19). These preferences reflect that students value a variety of teaching strategies in their classroom. For instance, students’ understanding of conceptual questions increases after discussion with classmates regardless of students’ initial knowledge of the answer (Smith et al., 2009). Undergraduate students also valued peer-assisted learning outside of the classroom and discussing course concepts with peers. These meaningful peer interactions outside of the classroom lead to gains in students’ cognitive development (Jones et al., 2008). Similarly, graduate students’ most desired forms of instruction included watching the instructor demonstrate how to solve problems (Q21), brainstorming solutions (Q2), asking the instructor questions during class (Q19), discussing concepts with classmates (Q9), and getting help from the instructor with their homework (Q10; Fig. 3a).

Fig. 3
figure3

Box and whisker plots of the most and least desired instructional practices reported by graduate (n = 161) and undergraduate (n = 1113) students in STEM classrooms. a Most preferred activities b Least preferred activities. Triangles indicate mean values. *indicates p-values < 0.05. Activities displayed are the top five most and least desired. Reported additional activities reflect the ranking misalignment between graduate and undergraduate students

For undergraduate students, the five least desired forms of instruction were finding additional information not provided by the instructor (Q3), being graded based on the performance of a group (Q11), assuming individual responsibility for learning the material (Q8), making individual presentations to the class (Q5), and being graded on class participation (Q6; Fig. 3b). The least desired instructional practices were similar for graduate students: being graded on group performance (Q11) or class participation (Q6), finding additional information not provided by the instructor to complete assignments (Q3), working in assigned groups (Q4), and listening to the instructor lecture during class (Q1; Fig. 3b). Other studies have also found that undergraduate students felt unprepared to evaluate the value and importance of information and the work of others (Owens et al., 2017) and were often resistant to working collaboratively when their grades were on the line (Machemer & Crawford, 2007; Owens et al., 2017; Patrick et al., 2016). It is noteworthy that graduate students also disliked these teaching practices, because in many ways they are vital elements of modern scientific practice. Using transparent grading rubrics, making expectations clear, and following best practices when assigning group work may help to increase student buy-in. The results for all 23 teaching practices can be found in the upper panel of Fig. S1. Although the mean values vary, the median desired level of most teaching practices centered on about the same.

The summary statistics above highlight the trends in the data, but they also mask important variation that lends insight into student perceptions of active learning. Figure 4 illustrates the variation in responses for three example teaching strategies (responses for the remaining teaching strategies can be found in Figs. S2–S21). As reflected in Fig. 2, most students experienced lecturing (Q1) the majority of the time. Very few students reported experiencing courses in which 70% or less of the time was devoted to lecture. Overall, students desired lecture about as much as they were currently getting (Fig. 4). This interest in lecture may seem counter to the documented benefits of active learning teaching practices (Freeman et al., 2014); however, a deeper look at the data gives the story more nuance.

Fig. 4
figure4

Alluvial plots of how often students reported experiencing a teaching practice and how much that teaching practice is desired for graduate students (left column) and undergraduates (right column). The vertical bars are the experienced frequency (left) and the vertical height of each bar is proportional to the number of respondents who chose that response. The “alluvia,” or lines between columns, indicate how often students reported experiencing a teaching practice (left side of each graph) and how much that teaching practice was desired (right side of each graph) and are proportional to the number of respondents. NA = respondents who did not respond to a particular question; 1 = Never or almost never (0–10% of the time); 2 = Seldom (11–30% of the time); 3 = Sometimes (31–50% of the time); 4 = Often (51–70% of the time); 5 = Very often (71–100% of the time). Less = a combination of Much less and Slightly less desired; Same = About the same; More = a combination of Slightly more and Much more desired

For example, approximately equal numbers of students reported that discussing concepts with classmates during class (Q9) happened never or almost never through very often (Fig. 4). Despite a median desired value of same (Fig. 3a), this particular active learning teaching practice was desired more often by a substantial number of students; very few students who never or almost never experienced this practice wanted less of it (Fig. 4). Answering questions posed by the instructor using a student response system after consulting with a classmate (Q18) was experienced never or almost never by the vast majority of the students in our sample (Fig. 4), and while the majority of students desired this teaching practice about the same amount, a sizeable proportion wanted more and fewer still wanted less (Fig. 4). Similar trends are evident for most of the remaining 20 teaching practices (Figs. S2–S21).

The data above demonstrate that while many students value lecture, they also desire active learning practices, suggesting that the students in our study “buy in” to active learning—that is, they perceive it as valuable. However, we also found that no single teaching practice was universally desired or undesired, suggesting that educators should employ a variety of active learning teaching practices in graduate and undergraduate classrooms alike.

Student perceptions of active learning based on free-response items

Seven hundred sixty-eight students answered the prompt, “Please describe your experiences with active learning in the classroom.” We analyzed all the responses, which included 104 graduate and 664 undergraduate student responses, coding them using the categories and subcategories described in the methods. Sample comments are included, along with the total number of responses in each category, in Table 4. Each student response was coded line by line, so some responses fell under more than one of the three categories. For example, 64% (422/664) of the undergraduate responses were in a single category, 29% (193/664) in two, and 7% (49/664) in three categories. Likewise, 72% (75/104) of graduate responses were in a single category, 21% (22/104) in two, and ~ 7% (7/104) in three categories. For both graduate and undergraduate students, most responses were in a single category. In Table 3, we provide an example of a response coded for more than a single category.
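The tallying step behind these percentages can be sketched in a few lines. This is our own illustration (the function name and toy data are hypothetical): each coded response is represented as the set of categories it received, and we count how many responses span one, two, or three categories.

```python
from collections import Counter

def category_breakdown(coded_responses):
    """Tally the fraction of responses coded into one, two, or three categories.

    `coded_responses` is a list of sets, each holding the categories
    ("Positive", "Negative", "Constructive") assigned to one response.
    """
    counts = Counter(len(categories) for categories in coded_responses)
    total = len(coded_responses)
    return {k: counts[k] / total for k in sorted(counts)}
```

For instance, with three single-category responses and one two-category response, the breakdown is 75% single-category and 25% two-category.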

Table 3 Student Response Coding Template/Example

The three overarching categories are Positive (in which the student is voicing an appreciation for active learning); Negative (in which the student is unambiguously negative); and Constructive (in which the student suggests the conditions under which active learning can be useful). We identified overwhelmingly positive codes for both graduate and undergraduate student responses, followed by negative and constructive codes (Table 5).

Five Positive subcategories emerged: active learning (a) makes the class engaging, (b) is beneficial, (c) helps with content retention, (d) builds community, and (e) prepares students for real-world work environments (Table 4). Over half of both the graduate (62/104) and the undergraduate (380/664) students expressed a positive impression of active learning (subcategory "active learning is beneficial") (Table 4). One graduate student reported, "I felt I got to use the material rather than passively take it in, and I got to know my classmates better, which made coming to class much more enjoyable." An undergraduate said, "Active learning is a very effective method to get students to really understand their curriculum since questions are encouraged as well as predictions/guesses." Many of the responses spoke to a general appreciation for active learning. However, some responses included disclaimers such as, "I think a traditional lecture has its place explaining basic concepts and helping students get comfortable with the material, but activities can help engage and apply the material" (undergraduate student) and "Sometimes [active learning] is effective, and sometimes it is patronizing" (graduate student).

Table 4 Categories (Positive, Negative, and Constructive) and subcategories identified in student responses to the prompt "Please describe your experiences with active learning in the classroom." The number of responses in each subcategory is included in parentheses after a sample comment

Four Negative subcategories were identified: (a) active learning limits individual thinking and learning, (b) the professor does not implement active learning approaches correctly, (c) active learning should only be used in certain fields, and (d) current active learning methods are a "waste of time" (Table 4). Among these, the largest number of comments belonged to "current active learning methods are a 'waste of time'" for both graduate (n = 14 responses) and undergraduate (n = 98 responses) students (Table 4). According to one graduate student, "Some [professors] are awesome at using active learning for its positive, intended uses while others are totally off-base and waste class time without any benefit to the students." Similarly, an undergraduate student opined, "The material has to be dumbed down so any idiot can figure it out and working in groups means you work at the slowest pace of anyone there. It also doesn't help that there's barely any time for the professor to talk about a new concept."

Lastly, four Constructive subcategories emerged: (a) active learning works when people are prepared to collaborate, (b) active learning only works in smaller classes, (c) effective active learning is desirable, and (d) active learning works well when supplementing other methods (Table 4). The largest number of comments in this category belonged to the subcategory "effective active learning is desirable," evident in 116 undergraduate and 18 graduate student responses (Table 4). Student suggestions varied, from "Sometimes they [active learning] feel[s] like a hassle. Good active learning should be integrated smoothly, but with a clear goal in mind" (undergraduate), and "Usually there is some lecture portion, then the active learning with worksheets or games, etc. What is most vital is that we actually have the information to be able to do the activity before we do the activity. I found it happens a lot where we have no idea what we are doing or the instructors have given no examples so we sit and play with our thumbs because we haven't learned anything. Learning first. Activities next to reinforce concepts and maybe expand upon them" (undergraduate), to "it's helpful when it is done correctly. sometimes it's more distracting than helpful" (graduate), and "Often spent at least half of lecture time going over a paper in groups. Mostly effective for learning the paper if students actually read it, but there was no system in place for ensuring accountability, leading to some group sessions suffering. Also, while a lot was learned during these times, what was discussed in class did not usually help for questions on the exams" (graduate).

Notably, no clear differences between the two populations were evident: the themes most common among undergraduate students were also the most common among graduate students (Table 4).

Are there notable differences between undergraduate and graduate students in the amount or type of active learning experienced or desired?

Undergraduate and graduate students reported similar patterns in the most common instructional activities (Fig. 1a). Some differences emerged in the least common teaching practices (indicated by * in Fig. 1b), the most and least desired teaching practices (* in Fig. 3), and how much time was currently devoted to active learning (* in Fig. 2). While all students rarely experienced individual presentations (Q5), such practices occurred significantly more often in graduate than in undergraduate courses. Undergraduate students also reported significantly fewer opportunities for verbally answering questions in class (Q15) compared to graduate courses. For graduate classes, the use of student response systems (i.e., Top Hat, clickers), either independently (Q17) or following consultation with a classmate (think-pair-share; Q18), was significantly less common than in undergraduate classrooms (Fig. 1b). Overall, graduate students reported slightly, though significantly, less class time devoted to active learning practices than their undergraduate counterparts (Fig. 2).

Although both groups of students wanted more direct engagement with instructors, undergraduates had a greater desire for such activities than their graduate counterparts (Figs. 3a and 4; S9 and S17 Figs). Specifically, undergraduate students sought more opportunities to ask their instructors questions and to seek information directly from instructors for homework (S9 and S17 Figs). Compared to graduate students, undergraduates also wanted more opportunities to study with classmates outside of class time (Fig. 3a, S7 Fig). Furthermore, there was a significant difference between undergraduate and graduate students in the desire for less lecture (Q1): graduate students identified lecturing as one of the top five least desired activities in their classrooms and wanted significantly less of it than undergraduate students did (Figs. 3 and 4). Finally, undergraduate students had significantly less desire to assume responsibility for learning material on their own, listing it as one of their top five least desired classroom activities (Fig. 3b, S8 Fig).

Overall, graduate student views are similar to those of undergraduates. Both groups of students want more active learning than they are currently getting. These results are similar to recent studies examining perceptions of active learning at a different university (Patrick et al., 2016, 2018), suggesting that these findings indicate a larger trend in higher education. Additionally, graduate and undergraduate courses implemented similar teaching strategies, dominated by lecturing; in fact, significantly less class time was devoted to active learning in graduate courses compared to undergraduate courses.

Conclusions and implications

Any conclusions from these findings should be tempered by the limitations of this work. We hesitate to ascribe differences between our two populations to anything beyond their status as either graduate or undergraduate students. Nonetheless, it is reasonable to assume that graduate students are characterized by different levels of, for example, intrinsic motivation than their undergraduate counterparts. This differential motivation may lead to different expectations of, or demands from, their instructors. Future work that aligns student perceptions of teaching strategies with aspects of student affect (e.g., motivation, mindset, self-efficacy) would provide additional clarity. Further, we recognize that graduate courses are likely to have a different culture than undergraduate courses, informed by factors beyond course level and class size. For example, instructors may perceive graduate students as closer to being colleagues than undergraduate students; these differences likely change in-class behaviors of both students and instructors. It would be helpful, in follow-up work, to combine student perceptions with those of their professors.

Regardless of these and other unidentified limitations, our work contributes to the relatively small body of literature exploring the use of evidence-based, active learning techniques in undergraduate and graduate-level STEM courses. Similarly, we contribute to understanding student buy-in to active learning in the curriculum. Critically, our approach is sufficiently fine-grained to isolate which evidence-based techniques are in place and which of these techniques are desired by STEM students.

We found that graduate and undergraduate students want to experience more active learning in their STEM classrooms than they currently do (Fig. 2). Additionally, responses to our open-ended prompt indicate overwhelmingly positive student experiences with active learning (Tables 4 and 5). Though some differences were identified in the specific types of active learning preferred, both graduate and undergraduate populations wanted more direct feedback (i.e., formative assessment) and a chance to learn in small groups. Further, all students wanted to learn through student presentations and direct engagement with the instructor using student response systems.

Table 5 Summed Total of Identified Categories

On average, graduate students wanted to engage in more individual learning than undergraduates. Our study also suggests that graduate students experienced low levels of active learning, significantly less than their undergraduate counterparts. Collectively, our findings provide evidence for educators, especially those wary of student resistance to change, that students buy in to active learning. These findings, along with student input from the open-ended responses, suggest that most students would like to experience more active learning instructional practices in their STEM classrooms.

Our results confirm that evidence-based teaching remains relatively scarce in graduate courses. However, this part of the STEM curriculum remains insufficiently explored from the perspectives of who uses active learning, which pedagogies graduate students prefer, and whether student preferences are in line with the evidence for what works best in the classroom. Few studies have investigated active learning practices in graduate classrooms. Those that exist suggest that graduate students hold perceptions of active learning similar to those of undergraduates, that is, neutral through positive (Patrick et al., 2016, 2018). For example, graduate students in a flipped classroom performed better on exams than those in traditional lectures but disliked the extra time necessary to prepare for class meetings (Tune et al., 2013). Graduate students also value courses that implement a variety of active learning practices, especially when familiarized with the activities (Jones et al., 2010; Lopez & Gross, 2008; Miller & Metz, 2014). Combined with previous studies, our findings indicate a need to integrate active learning throughout all levels of the curriculum. While it may be essential to use active learning in first-year STEM courses, time and resources should also be allocated to innovative teaching practices in upper-division and graduate-level courses.

Finally, we find no evidence that students, on average, are resistant to the implementation of techniques such as student response systems, opportunities for hands-on group work, and opportunities for direct interaction with the instructor. Instead, we find evidence that students would prefer more active learning in their courses. Fortunately, instructors can incorporate many of these preferred active learning techniques into their existing courses with relative ease. For example, several excellent sources are designed to help educators develop a toolkit of in-class assessment techniques such as classroom polling, short written reflections, and think-pair-share activities (Angelo & Cross, 1993; Fink, 2013; Tanner, 2013). These suggestions, combined with an awareness of student preferences, may help instructors teetering on the brink of adoption to leap into active learning.

Availability of data and materials

Due to the need to protect individual student data, the datasets used for the current study are available, in de-identified, aggregate form, from the corresponding author on reasonable request.

References

  1. Akiha, K., Brigham, E., Couch, B. A., Lewin, J., Stains, M., Stetzer, M. R., … Smith, M. K. (2018). What types of instructional shifts do students experience? Investigating active learning in science, technology, engineering, and math classes across key transition points from middle school to the university level. Frontiers in Education, 2, 68. https://doi.org/10.3389/feduc.2017.00068.


  2. Angelo, T. A., & Cross, K. P. (1993). Classroom assessment techniques: A handbook for college teachers. Jossey-Bass Publishers.


  3. Ballen, C. J., Wieman, C., Salehi, S., Searle, J. B., & Zamudio, K. R. (2017). Enhancing diversity in undergraduate science: Self-efficacy drives performance gains with active learning. CBE—Life Sciences Education, 16(4), ar56. https://doi.org/10.1187/cbe.16-12-0344.


  4. Barr, D. A., Gonzalez, M. E., & Wanat, S. F. (2008). The leaky pipeline: Factors associated with early decline in interest in premedical studies among underrepresented minority undergraduate students. Academic Medicine, 83(5), 503–511. https://doi.org/10.1097/ACM.0b013e31816bda16.


  5. Bonwell, C. C., & Eison, J. A. (1991). Active learning: Creating excitement in the classroom. ASHE-ERIC Higher Education Reports. ERIC Clearinghouse on Higher Education, The George Washington University, One Dupont Circle, Suite 630, Washington, DC 20036–1183.


  6. Brame, C. (2016). Active learning. Vanderbilt University Center for Teaching.


  7. Bransford, J. D., & Schwartz, D. L. (1999). Chapter 3: Rethinking transfer: A simple proposal with multiple implications. Review of Research in Education, 24(1), 61–100. https://doi.org/10.3102/0091732X024001061.


  8. Brazeal, K. R., Brown, T. L., & Couch, B. A. (2016). Characterizing student perceptions of and buy-in toward common formative assessment techniques. CBE Life Sciences Education, 15(4), ar73. https://doi.org/10.1187/cbe.16-03-0133.


  9. Brewer, C. A., & Smith, D. (2011). Vision and change in undergraduate biology education: A call to action. American Association for the Advancement of Science.


  10. Brigati, J. (2018). Student attitudes toward active learning vs. lecture in cell biology instruction. The American Biology Teacher, 80(8), 584–591. https://doi.org/10.1525/abt.2018.80.8.584.


  11. Brown, T. L., Brazeal, K. R., & Couch, B. A. (2017). First-year and non-first-year student expectations regarding in-class and out-of-class learning activities in introductory biology †. Journal of Microbiology & Biology Education, 18(1). https://doi.org/10.1128/jmbe.v18i1.1241.

  12. Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: Lack of training, time, incentives, and...tensions with professional identity? CBE Life Sciences Education, 11(4), 339–346. https://doi.org/10.1187/cbe.12-09-0163.


  13. Carpenter, S. K., Witherby, A. E., & Tauber, S. K. (2020). On students’ (Mis) judgments of learning and teaching effectiveness. Journal of Applied Research in Memory and Cognition, 9(2), 137–151. https://doi.org/10.1016/j.jarmac.2019.12.009.


  14. Cavanagh, A. J., Aragón, O. R., Chen, X., Couch, B. A., Durham, M. F., Bobrownicki, A., … Ledbetter, M. L. (2016). Student buy-in to active learning in a college science course. CBE—Life Sciences Education, 15(4), ar76. https://doi.org/10.1187/cbe.16-07-0212.


  15. Cavanagh, A. J., Chen, X., Bathgate, M., Frederick, J., Hanauer, D. I., & Graham, M. J. (2018). Trust, growth mindset, and student commitment to active learning in a college science course. CBE—Life Sciences Education, 17(1), ar10. https://doi.org/10.1187/cbe.17-06-0107.


  16. Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational and Psychological Measurement, 20(1), 37–46. https://doi.org/10.1177/001316446002000104.


  17. Cooper, K. M., Ashley, M., & Brownell, S. E. (2017). Using expectancy value theory as a framework to reduce student resistance to active learning: A proof of concept. Journal of Microbiology & Biology Education, 18(2). https://doi.org/10.1128/jmbe.v18i2.1289.

  18. Cotner, S. H., Fall, B. A., Wick, S. M., Walker, J. D., & Baepler, P. M. (2008). Rapid feedback assessment methods: Can we improve engagement and preparation for exams in large-enrollment courses? Journal of Science Education and Technology, 17(5), 437–443. https://doi.org/10.1007/s10956-008-9112-8.


  19. Council, N. R. (2003). BIO2010: Transforming undergraduate Education for future research biologists. The National Academies Press. https://doi.org/10.17226/10497.


  20. Crossgrove, K., & Curran, K. L. (2008). Using clickers in nonmajors- and majors-level biology courses: Student opinion, learning, and long-term retention of course material. CBE Life Sciences Education, 7(1), 146–154. https://doi.org/10.1187/cbe.07-08-0060.


  21. DeMonbrun, M., Finelli, C. J., Prince, M., Borrego, M., Shekhar, P., Henderson, C., & Waters, C. (2017). Creating an instrument to measure student response to instructional practices. Journal of Engineering Education, 106(2), 273–298. https://doi.org/10.1002/jee.20162.


  22. Deslauriers, L., McCarty, L. S., Miller, K., Callaghan, K., & Kestin, G. (2019). Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom. Proceedings of the National Academy of Sciences of the United States of America, 116(39), 19251–19257. https://doi.org/10.1073/pnas.1821936116.


  23. Dewey, J. (1966). Democracy and education (1916). In J. A. Boydston (Ed.), The Middle Works of John Dewey, (vol. 9, pp. 1899–1924).


  24. Eberlein, T., Kampmeier, J., Minderhout, V., Moog, R. S., Platt, T., Varma-Nelson, P., & White, H. B. (2008). Pedagogies of engagement in science. Biochemistry and Molecular Biology Education, 36(4), 262–273. https://doi.org/10.1002/bmb.20204.


  25. England, B. J., Brigati, J. R., & Schussler, E. E. (2017). Student anxiety in introductory biology classrooms: Perceptions about active learning and persistence in the major. PLoS One, 12(8), e0182506. https://doi.org/10.1371/journal.pone.0182506.


  26. Felder, R. M., Woods, D. R., Stice, J. E., & Rugarcia, A. (2000). The future of engineering education II. Teaching methods that work. Chemical Engineering Education, 34(1), 26–39.


  27. Fink, L. D. (2013). Creating significant learning experiences: An integrated approach to designing college courses. Wiley.


  28. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences of the United States of America, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111.


  29. Glynn, S. M., Brickman, P., Armstrong, N., & Taasoobshirazi, G. (2011). Science motivation questionnaire II: Validation with science majors and nonscience majors. Journal of Research in Science Teaching, 48(10), 1159–1176. https://doi.org/10.1002/tea.20442.


  30. Gormally, C., Sullivan, C. S., & Szeinbaum, N. (2016). Uncovering barriers to teaching assistants (TAs) implementing inquiry teaching: Inconsistent facilitation techniques, student resistance, and reluctance to share control over learning with students. Journal of Microbiology & Biology Education, 17(2), 215–224. https://doi.org/10.1128/jmbe.v17i2.1038.


  31. Graham, M. J., Frederick, J., Byars-Winston, A., Hunter, A. B., & Handelsman, J. (2013). Increasing persistence of college students in STEM. Science, 341(6153), 1455–1456. https://doi.org/10.1126/science.1240487.


  32. Haak, D. C., HilleRisLambers, J., Pitre, E., & Freeman, S. (2008). Increased structure and active learning reduce the achievement gap in introductory biology. Science, 332(6034), 1213–1216. Retrieved from http://science.sciencemag.org/content/332/6034/1213.abstract.


  33. Handelsman, J., Ebert-May, D., & Beichner, R. (2004). Scientific teaching. Science, 304(5670), 521–522. Retrieved from http://science.sciencemag.org/content/304/5670/521.short


  34. Henderson, C., Khan, R., & Dancy, M. (2018). Will my student evaluations decrease if I adopt an active learning instructional strategy? American Journal of Physics, 86(12), 934–942. https://doi.org/10.1119/1.5065907.


  35. Jones, M. H., Estell, D. B., Alexander, J. M., Jones, M. H., Estell, D. B., & Alexander, J. M. (2008). Friends, classmates, and self-regulated learning: Discussions with peers inside and outside the classroom. Metacognition Learning, 3(1), 1–15. https://doi.org/10.1007/s11409-007-9007-8.


  36. Jones, N. L., Peiffer, A. M., Lambros, A., Guthold, M., Johnson, A. D., Tytell, M., … Eldridge, J. C. (2010). Developing a problem-based learning (PBL) curriculum for professionalism and scientific integrity training for biomedical graduate students. Journal of Medical Ethics, 36(10), 614–619. https://doi.org/10.1136/jme.2009.035220.


  37. Kaddoura, M. (2013). Think pair share: A teaching learning strategy to enhance students' critical thinking. Education Research Quarterly, 36(4), 3–24.


  38. Lopez, R. E., & Gross, N. A. (2008). Active learning for advanced students: The Center for Integrated Space Weather Modeling graduate summer school. Advances in Space Research, 42(11), 1864–1868. https://doi.org/10.1016/J.ASR.2007.06.056.


  39. MacFarland, T. W., & Yates, J. M. (2016). Mann–Whitney U test. In Introduction to nonparametric statistics for the biological sciences using R (pp. 103–132). Springer, Cham.

  40. Machemer, P. L., & Crawford, P. (2007). Student perceptions of active learning in a large cross-disciplinary classroom. Active Learning in Higher Education, 8(1), 9–30. https://doi.org/10.1177/1469787407074008.


  41. Madson, L., Trafimow, D., & Gray, T. (2017). Faculty members' attitudes predict adoption of interactive engagement methods. The Journal of Faculty Development, 31(3), 39–50. Retrieved from https://www.ingentaconnect.com/content/nfp/jfd/2017/00000031/00000003/art00005#Supp.


  42. Mauri, M., Elli, T., Caviglia, G., Uboldi, G., & Azzi, M. (2017). RAWGraphs: A visualisation platform to create open outputs. In Proceedings of the 12th Biannual Conference on Italian SIGCHI Chapter, (p. 28). ACM.


  43. Mayer, R. E., Stull, A., Deleeuw, K., Almeroth, K., Bimber, B., Chun, D., … Zhang, H. (2008). Clickers in college classrooms: Fostering learning with questioning methods in large lecture classes. Contemporary Educational Psychology, 34(1), 51–57. https://doi.org/10.1016/j.cedpsych.2008.04.002.


  44. McMillan, C., Loads, D., & McQueen, H. A. (2018). From students to scientists: The impact of interactive engagement in lectures. New Directions in the Teaching of Physical Sciences, 13(1). https://doi.org/10.29311/ndtps.v0i13.2425.

  45. Michael, J. (2007). Faculty perceptions about barriers to active learning. College Teaching, 55(2), 42–47. https://doi.org/10.3200/CTCH.55.2.42-47.


  46. Miller, C. J., & Metz, M. J. (2014). A comparison of professional-level faculty and student perceptions of active learning: Its current use, effectiveness, and barriers. Advances in Physiology Education, 38(3), 246–252. https://doi.org/10.1152/advan.00014.2014.


  47. Olson, S., & Riordan, D. G. (2012). Engage to excel: Producing one million additional college graduates with degrees in science, technology, engineering, and mathematics. Report to the President. Executive Office of the President. https://doi.org/10.1080/10668921003609210.


  48. Owens, D. C., Sadler, T. D., Barlow, A. T., & Smith-Walters, C. (2017). Student motivation from and resistance to active learning rooted in essential science practices. Research in Science Education, 1–25. https://doi.org/10.1007/s11165-017-9688-1.

  49. Patrick, L. (2020). Faculty and student perceptions of active learning. In J. J. Mintzes, & E. M. Walter (Eds.), Active learning in college science-the case for evidence-based practice. Springer Nature. https://doi.org/10.1007/978-3-030-33600-4_55.


  50. Patrick, L. E., Howell, L. A., & Wischusen, W. (2016). Perceptions of active learning between faculty and undergraduates: Differing views among departments. Journal of STEM Education: Innovations and Research, 17, 55–63.

  51. Patrick, L., Howell, L. A., & Wischusen, E. W. (2018). Roles matter: Graduate student perceptions of active learning in the STEM courses they take and those they teach. BioRxiv, 502518. https://doi.org/10.1101/502518.

  52. Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x.


  53. Saldaña, J. (2009). The coding manual for qualitative researchers. Sage.


  54. Seidel, S. B., & Tanner, K. D. (2013). “What if students revolt?”—Considering student resistance: Origins, options, and opportunities for investigation. CBE—Life Sciences Education, 12(4), 586–595. https://doi.org/10.1187/cbe-13-09-0190.


  55. Shin, J. C., & Harman, G. (2009). New challenges for higher education: Global and Asia-Pacific perspectives. Asia Pacific Education Review, 10(1), 1–13. https://doi.org/10.1007/s12564-009-9011-6.


  56. Silverthorn, D. U., Thorn, P. M., & Svinicki, M. D. (2006). It’s difficult to change the way we teach: Lessons from the integrative themes in physiology curriculum module project. Advances in Physiology Education, 30(4), 204–214. https://doi.org/10.1152/advan.00064.2006.


  57. Smith, C. V., & Cardaciotto, L. (2012). Is active learning like broccoli? Student perceptions of active learning in large lecture classes. Journal of the Scholarship of Teaching and Learning, 11(1), 53-61.

  58. Smith, M. K., Wood, W. B., Adams, W. K., Wieman, C., Knight, J. K., Guild, N., & Su, T. T. (2009). Why peer discussion improves student performance on in-class concept questions. Science, 323(5910), 122-124.

  59. Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, R., DeChenne-Peters, S. E., … Young, A. M. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470. Retrieved from http://science.sciencemag.org/content/359/6383/1468.abstract.


  60. Stroebe, W. (2020). Student evaluations of teaching encourages poor teaching and contributes to grade inflation: A theoretical and empirical analysis. Basic and Applied Social Psychology. https://doi.org/10.1080/01973533.2020.1756817.


  61. Tanner, K. D. (2013). Structure matters: Twenty-one teaching strategies to promote student engagement and cultivate classroom equity. CBE Life Sciences Education, 12(3), 322–331. https://doi.org/10.1187/cbe.13-06-0115.


  62. R Core Team (2013). R: A language and environment for statistical computing. R Foundation for Statistical Computing.


  63. Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Nicole Arroyo, E., Behling, S., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences of the United States of America, 117(12), 6476–6483. https://doi.org/10.1073/pnas.1916903117.


  64. Tune, J. D., Sturek, M., & Basile, D. P. (2013). Flipped classroom model improves graduate student performance in cardiovascular, respiratory, and renal physiology. Advances in Physiology Education, 37(4), 316–320. https://doi.org/10.1152/advan.00091.2013.


  65. Uttl, B., White, C. A., & Gonzalez, D. W. (2017). Meta-analysis of faculty’s teaching effectiveness: Student evaluation of teaching ratings and student learning are not related. Studies in Educational Evaluation, 54, 22–42. https://doi.org/10.1016/j.stueduc.2016.08.007.


  66. Walker, J. D., Cotner, S. H., Baepler, P. M., & Decker, M. D. (2008). A delicate balance: Integrating active learning into a large lecture course. CBE Life Sciences Education. https://doi.org/10.1187/cbe.08.


  67. Wang, G., & Williamson, A. (2020). Course evaluation scores: Valid measures for teaching effectiveness or rewards for lenient grading? Teaching in Higher Education. https://doi.org/10.1080/13562517.2020.1722992.

  68. Wickham, H. (2016). ggplot2: Elegant graphics for data analysis. Springer. https://doi.org/10.1007/978-3-319-24277-4.


  69. Witherspoon, E. B., Vincent-Ruz, P., & Schunn, C. D. (2019). When making the grade Isn’t enough: The gendered nature of premed science course attrition. Educational Researcher, 48(4), 193–204. https://doi.org/10.3102/0013189X19840331.


  70. Xie, Y., Fang, M., & Shauman, K. (2015). STEM Education. Annual Review of Sociology, 41(1), 331–357. https://doi.org/10.1146/annurev-soc-071312-145659.



Acknowledgements

The authors would like to thank Kendall Edstrom, Monica Osterbauer, Maryam Karaborni and Ryan Laffin for their help in coding the qualitative responses of the participants. Thanks also to Michael Waltonen, Michael Balak, Michael White, John Ward, and Paul Strykowski for connecting us to their students. We appreciate the helpful input of #therealcotnerlab on an earlier version of this manuscript. A Cotner Lab Winter Writing retreat at Lake Itasca Biological Station and Laboratories was critical for initial data interpretation and discussion.

Funding

This work was not externally funded.

Author information

Contributions

SC and LP developed the survey and survey protocol. SC secured approval from the University’s Institutional Review Board. NG analyzed the data. NG and LP created the data visualizations. All authors conceived of the study and participated in writing the manuscript. The author(s) read and approved the final manuscript.

Corresponding authors

Correspondence to Ngawang Gonsar or Sehoya Cotner.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1: Fig. S1.

Box and whisker plots of all surveyed instructional practices (Q_1 through Q_23) reported by graduate (n = 161) and undergraduate (n = 1113) students in STEM classrooms. Upper row (Desired) details levels of desire for each instructional activity. Lower row (Experienced) details levels of each instructional activity reported as actually experienced in the classroom. Levels range from 1 (Never or almost never; 0–10% of the time) to 5 (Very often; 71–100% of the time). Triangles indicate mean values.

Additional file 2: Fig. S2.

Alluvial plots of how often students reported experiencing the teaching practice in Q_2 (“Brainstorm different possible solutions to a given problem”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). In each graph, the left vertical bars show the experienced frequency and the right vertical bars show the desired frequency; the height of each bar is proportional to the number of respondents who chose that response. The “alluvia,” or lines between the bars, connect experienced frequencies (left side of each graph) to desired frequencies (right side of each graph), and their widths are proportional to the number of respondents. NA = respondents who did not answer a particular question. 1 = Never or almost never (0–10% of the time); 2 = Seldom (11–30% of the time); 3 = Sometimes (31–50% of the time); 4 = Often (51–70% of the time); 5 = Very often (71–100% of the time).

Additional file 3: Fig. S3.

Alluvial plots of how often students reported experiencing the teaching practice in Q_3 (“Find additional information not provided by the instructor to complete assignments”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 4: Fig. S4.

Alluvial plots of how often students reported experiencing the teaching practice in Q_4 (“Work in assigned groups to complete homework or other projects”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 5: Fig. S5.

Alluvial plots of how often students reported experiencing the teaching practice in Q_5 (“Make individual presentations to the class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 6: Fig. S6.

Alluvial plots of how often students reported experiencing the teaching practice in Q_6 (“Be graded on class participation”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 7: Fig. S7.

Alluvial plots of how often students reported experiencing the teaching practice in Q_7 (“Study course content with classmates outside of class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 8: Fig. S8.

Alluvial plots of how often students reported experiencing the teaching practice in Q_8 (“Assume responsibility for learning material on own”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 9: Fig. S9.

Alluvial plots of how often students reported experiencing the teaching practice in Q_10 (“Get most of the information needed to solve the homework directly from the instructor”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 10: Fig. S10.

Alluvial plots of how often students reported experiencing the teaching practice in Q_11 (“Be graded based on the performance of a group”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 11: Fig. S11.

Alluvial plots of how often students reported experiencing the teaching practice in Q_12 (“Preview concepts before class by reading, watching videos, etc.”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 12: Fig. S12.

Alluvial plots of how often students reported experiencing the teaching practice in Q_13 (“Solve problems in a group during class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 13: Fig. S13.

Alluvial plots of how often students reported experiencing the teaching practice in Q_14 (“Solve problems individually during class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 14: Fig. S14.

Alluvial plots of how often students reported experiencing the teaching practice in Q_15 (“Verbally answer questions posed by the instructor during class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 15: Fig. S15.

Alluvial plots of how often students reported experiencing the teaching practice in Q_16 (“Verbally answer questions posed by the instructor during class after consulting with a classmate (think-pair-share)”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 16: Fig. S16.

Alluvial plots of how often students reported experiencing the teaching practice in Q_17 (“Answer questions posed by the instructor during class using a student response system (clickers, TopHat, etc.)”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 17: Fig. S17.

Alluvial plots of how often students reported experiencing the teaching practice in Q_19 (“Ask the instructor questions during class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 18: Fig. S18.

Alluvial plots of how often students reported experiencing the teaching practice in Q_20 (“Take initiative for identifying what is necessary to know”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 19: Fig. S19.

Alluvial plots of how often students reported experiencing the teaching practice in Q_21 (“Watch the instructor demonstrate how to solve problems”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 20: Fig. S20.

Alluvial plots of how often students reported experiencing the teaching practice in Q_22 (“Solve problems that have more than one correct answer”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Additional file 21: Fig. S21.

Alluvial plots of how often students reported experiencing the teaching practice in Q_23 (“Do hands-on group activities during class”) and how much that teaching practice was desired, for graduate students (left column) and undergraduates (right column). Format and response scale as in Fig. S2.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Gonsar, N., Patrick, L. & Cotner, S. Graduate- and undergraduate-student perceptions of and preferences for teaching practices in STEM classrooms. Discip Interdscip Sci Educ Res 3, 6 (2021). https://doi.org/10.1186/s43031-021-00035-w