One of the earliest Open Educational Resources (OER) is the Massachusetts Institute of Technology (MIT) OpenCourseWare (OCW), launched in 2001 to share MIT course materials with the public for free (Bonk, 2009; MIT News, 2001). As of August 2019, more than 2,400 courses were available, and the website had received approximately 170 million visitors (MIT OpenCourseWare, 2019b), nine percent of whom self-reported as educators (MIT OpenCourseWare, 2019a). The majority of the courses are related to science, technology, engineering, and mathematics (STEM) subjects such as Electrical Engineering (EE) and Computer Science (CS).
OER have previously been examined with regard to design, development, dissemination methods, and quality (Kimmons, 2015); cost savings (Wiley, Hilton, Ellington, & Hall, 2012); and impact (d’Oliveira, Carson, James, & Lazarus, 2010). However, STEM pedagogical strategies have not yet been considered. Thus, the purpose of this study is to examine MIT OCW STEM instructor insights in order to inform other STEM educators regarding effective pedagogical strategies, assessment methods, and possible challenges.
The following three research questions guided this study:
UNESCO (2002) defined OER as “the open provision of educational resources, enabled by information and communication technologies, for consultation, use and adaptation by a community of users for non-commercial purposes” (p. 24). OER dates to MIT’s 2001 initiative to share learning materials with the public for free on the Internet (Goldberg, 2001). The OER movement has been gaining attention around the world (Guttenplan, 2010), attracting a large international audience. Not surprisingly, some institutions have translated MIT OCW into local languages such as Chinese, Spanish, and Portuguese (Abelson, 2008).
Subsequently, MIT published 2,466 courses and hosted 285 million site visits, according to the MIT OCW report of August 2019 (MIT OpenCourseWare, 2019a). Over 600 tenured or tenure-track MIT faculty members (approximately 60% of the total) participated in the OCW movement. As a result, MIT OCW influences people worldwide. Forty-four percent of site visitors were from North America, followed by East Asia (20%), Europe (17%), South Asia (9%), Latin America (4%), the Middle East (4%), and Africa (2%) (MIT OpenCourseWare, 2019a). Based on MIT OCW site statistics, the audience of OCW includes self-directed learners (43%), students (42%), educators (9%), and others (6%). Among the educators, the stated aims were improving personal knowledge (31%), learning about innovative teaching (23%), leveraging OCW materials for their own courses (20%), finding reference materials (15%), and developing curriculum for their departments (8%) (MIT OpenCourseWare, 2019a).
The National Science Foundation (NSF) and related professional societies have focused on basic and applied research in STEM and on improving the quality of STEM education (Fairweather, 2008). This attention to STEM was a response to the declining number of students selecting STEM majors and the growing need for STEM-related employees (Center for Science Mathematics and Engineering Education, Committee on Undergraduate Science Education, 1999; National Science Foundation, 1996). The Committee on Science, Technology, Engineering, and Math Education (CoSTEM), a branch of the National Science and Technology Council, was formed to reform STEM education from K-12 through higher education in order to build a pipeline of jobs for the development of the economy (CoSTEM, 2013).
Low-quality college teaching in STEM courses is a critical issue in higher education (Seymour & Hewitt, 1997), prompting the search for effective pedagogical practices, including instructors’ professional development programs for teaching innovation (Wulff & Austin, 2004). Fisher, Zeligman, and Fairweather (2005) indicated that pedagogical reforms and innovations in engineering courses significantly enhanced student learning outcomes, including ill-structured problem-solving skills.
Active learning is potentially more effective than traditional teacher-centered instruction in enhancing student learning and increasing student retention in STEM education (Freeman et al., 2014; Lund & Stains, 2015; Michael, 2006; Prince, 2004). Active learning can address student learning needs and promote critical thinking (Kim, Sharma, Land, & Furlong, 2012). For example, Freeman et al. (2014) conducted a meta-analysis comparing traditional lecturing with active learning approaches in STEM courses and found that active learning was frequently accompanied by increased student performance.
Interactive lectures increase student attention and motivation through discussions or question-and-answer sessions (Allen & Tanner, 2005; Steinert & Snell, 1999). They also enhance student problem-solving and communication skills (Scott et al., 2018) and improve student learning outcomes (Ernst & Colthorpe, 2007). Molinillo, Aguilar-Illescas, Anaya-Sánchez, and Vallespín-Arán (2018) found that social presence and teacher-student interaction positively influence students’ active learning.
Blended learning is another active learning strategy in which both online and face-to-face instruction or learning materials are used (Bonk & Graham, 2012; Güzer & Caner, 2014). It enables students to control the time, location, and pace of their learning to some extent (Güzer & Caner, 2014). The flipped classroom is a type of blended learning model used to promote student-centered and active learning (Pierce & Fox, 2012). It was originally used to provide videos or screencast instruction to students who were absent from class (Hamdan, McKnight, McKnight, & Arfstrom, 2013). Class time could then shift from lecturing to enriched activities that promote problem-solving skills (Tucker, 2012). Flumerfelt and Green (2013) found that using screencast videos in a flipped classroom could promote interaction between students and instructors, further fostering active learning (Leicht, Zappe, Messner, & Litzinger, 2012).
However, active learning strategies are not yet widely adopted in classrooms (Hora, Ferrare, & Oleson, 2012). Several barriers hinder instructors’ adoption of active learning strategies (Finelli, Daly, & Richardson, 2014; Froyd, Borrego, Cutler, Henderson, & Prince, 2013; Lund & Stains, 2015; Shadle, Marker, & Earl, 2017), such as instructors’ concerns about their effectiveness, the time required to prepare courses, student resistance (Tharayil et al., 2018), and instructors’ limited understanding of the theoretical background (Borda et al., 2020).
Personalized instruction and learning have a theoretical base in learner-centered and constructivist learning (Reigeluth, Myers, & Lee, 2017; Watson & Watson, 2017). Personalized instruction customizes teaching to individual learners’ needs by providing learning resources, technologies, and activities (Kelly, 2016). Its learner-centered perspective can address learners’ diverse backgrounds, competencies, and requirements (Green, Facer, Rudd, Dillon, & Humphreys, 2005). One way to personalize instruction is through technologies such as social bookmarking, blogs, and collaborative tools (Haworth, 2016). Besides technology-enabled personalized learning environments, social interaction and participatory learning also support personalization (Haworth, 2016; McLoughlin & Lee, 2010).
Document analysis, a systematic approach to deriving meaning, gaining understanding, and developing empirical knowledge by examining existing documents (Corbin & Strauss, 2008; Rapley, 2007), was the basis of the research design in this study. Documents for review can include texts, images, and videos generated without a researcher’s intervention (Bowen, 2009), and document analysis serves as both a social research method and a research tool (Bowen, 2009). This study adopted document analysis because the documents on the MIT website presented instructors’ pedagogical strategies.
The documents reviewed were the instructor insights of 15 MIT OCW courses (Table 1) from the Department of Electrical Engineering and Computer Science, published on the MIT OCW site (https://ocw.mit.edu/courses/instructor-insights/#electrical-engineering-and-computer-science). In general, each instructor insights page included seven sections: (1) course overview: general information about the course; (2) course outcomes: overall goals and learning objectives; (3) instructor insights: instructors’ thoughts on the effective teaching strategies they used; (4) curriculum information: the semester of the course and other related courses; (5) assessment: the detailed assessment methods and the percentage weight of each element; (6) student information: the number of students, their grade levels, and majors; and (7) how student time was spent: the estimated time students spent on learning the course content in and out of class.
Table 1. MIT OCW courses reviewed

| Course | Level |
| --- | --- |
| Introduction to Electrical Engineering and Computer Science I (Spring 2011) | Undergraduate |
| Computation Structures (Spring 2017) | Undergraduate |
| Signals, Systems and Inference (Spring 2018) | Undergraduate |
| Computer System Engineering (Spring 2018) | Undergraduate |
| Artificial Intelligence (Fall 2010) | Undergraduate |
| Design and Analysis of Algorithms (Spring 2015) | Undergraduate |
| Creating Video Games (Fall 2014) | Undergraduate |
| Principles and Practice of Assistive Technology (Fall 2014) | Undergraduate |
| Engineering Innovation and Design (Fall 2012) | Undergraduate |
| Cognitive Robotics (Spring 2016) | Graduate |
| Geometric Folding Algorithms: Linkages, Origami, Polyhedra (Fall 2012) | Graduate |
| Advanced Data Structures (Spring 2012) | Graduate |
| Algorithmic Lower Bounds: Fun with Hardness Proofs (Fall 2014) | Graduate |
| Teaching College-Level Science and Engineering (Fall 2015) | Graduate |
| Electric Machines (Fall 2013) | Graduate |
Nine of the courses were at the undergraduate level, and the remaining six were at the graduate level. Six courses provided text insights on the website, and eight provided video insights with verbatim transcripts; one graduate-level course did not include detailed instructor insights. All the courses were taught by a team of educators (instructors, lab staff, and teaching assistants). However, not all of the educators shared their insights; the majority of the insights came from one or more of the instructors of each course.
The data were analyzed using thematic analysis (Braun & Clarke, 2006; Braun, Clarke, & Rance, 2014), following Braun and Clarke’s (2006) procedures: (1) becoming familiar with the data; (2) inductive open coding; (3) identification of themes; (4) review of themes; (5) refining and defining themes; and (6) report writing. Following a review of the 15 courses’ instructor insights, the data for each course were coded, with the meaning unit as the unit of analysis. Themes were then identified across all the courses. To increase the trustworthiness of this study, a debriefing was held with an expert in the OER field.
A variety of pedagogical strategies for teaching EE and CS courses were shared by the faculty members, including active learning, personalizing instruction, engaging learners, providing feedback, building learning communities, clarifying learning objectives, and integrating teaching and research.
Active learning is one of the most popular pedagogies used by the MIT OCW instructors. It means that learners are actively involved in learning through hands-on activities and interactions, rather than just passively listening to lectures. Active learning is an umbrella term covering a variety of strategies such as hands-on experience, authentic problems, flipped classrooms, discussions, think-pair-share, and debates. One active learning approach shared by MIT OCW instructors was practice-theory-practice, in which instruction exposes students to practice first, followed by presenting theory and then providing practical problems to solve. Dr. Dennis Freeman, an instructor of Introduction to Electrical Engineering and Computer Science I, explained that the entire practice-theory-practice process emphasizes hands-on experience in solving authentic problems. Along the same lines, Dr. Erik Demaine, who taught Design and Analysis of Algorithms, encouraged his students to solve open problems:
So one of the exciting parts of this class is that we ran an optional session where whoever is interested in doing the research side of the material could solve open problems together. So we call this a problem-solving session. But it’s all the problems [that] are unsolved in the field.
Similarly, Dr. Blade Kotelly, an instructor for Engineering Innovation and Design, also emphasized hands-on experience in his course. As he said:
The lectures are interspersed with activities. So students will do some hands-on activities every, let’s say, half hour. Probably at the limit is about halfway through, about an hour through, they’ll have to do something no matter what. Because you want to keep students’ attention up.
A virtual learning environment was also used for hands-on activities. Dr. Chris Terman, an instructor of Computation Structures, shared these thoughts on providing a virtual lab for hands-on activities:
These virtual labs actually… takes courses from being a listening experience with maybe some pencil P-sets to your hands are active. So hands-on, brain on, right? And when people’s brains turn on, it’s amazing what they remember.
Another active learning strategy was flipping the classroom. Dr. Sanjoy Mahajan, a visiting professor at MIT, stated, “we flip the classroom in the sense that we have students do as much as they can outside of class to climb the active learning hierarchy so that when we’re together in class we can focus on constructive and interactive learning.” A common active learning method used by MIT OCW instructors is discussion, which is also an important element of the flipped classroom. Dr. Katrina LaCurts, the instructor of Computer System Engineering, said:
Each recitation focuses around a technical paper that the students have read beforehand. And the goal for those discussion sections, those recitations, is for them to be largely discussion based… A large part of that instruction happens in the recitation, where we’re having these discussions. And they go and apply those skills to their design projects.
The detailed discussion formats varied. One commonly used discussion format, which is also an active learning method, is think-pair-share. Dr. Janet Rankin, the Interim Director of the Teaching and Learning Lab at MIT at the time of the interview, explained:
In general, you first give students time to think about a question or a situation or some other scenario that you want them to think about. And you give them time to think actively but alone about that. And often you can ask them to write down their thoughts. But then at the end of that short period of time, whether it’s three minutes or five minutes, you have them pair up. And if it’s a really big class or the numbers of students in your class warrant it, you can have them triple up, it doesn’t have to be a pair.
Another important pedagogy was personalizing instruction, which engages learners with diverse knowledge levels and backgrounds. It may be implemented through customized language use, optional learning materials, and offering learning materials online. Dr. Dennis Freeman, based on his experience of addressing students with diverse prior knowledge, concluded:
When I’m talking to a student in lab, I adjust what I’m saying to match the student’s current level of understanding. If the student has never seen programming before, I don’t use advanced programming comments in the way I talk to them. If they’ve never seen a circuit before, I don’t use jargon. I’ve learned how to say things without using jargon.
Personalized instruction requires optional learning materials. For example, Dr. Chris Terman provided a range of materials to benefit students with different backgrounds and learning preferences:
So I create a huge-- I think of it as a buffet. There’s lots of dishes. And you can start at the beginning of the buffet and sort of pick it up from scratch. Or you can say, I’ll skip the first couple courses, and I’m ready to dive in sort of in the middle of the conversation somewhere.
Short asynchronous materials available online offer students the opportunity to study at any time and in any place. Dr. Chris Terman noted:
It’s keeping things short and sweet. So you have a huge-- because now you have a bunch of short bites. Now, the MITx platform lets you organize those with questions that let you sort of continue to test your learning. So it’s actually worked out to be a very nice way of making a fairly organized tour through the material that the students can start, and stop, and come back to. Plus, it’s asynchronous. In other words, they get to choose their time and place.
Another theme was learner engagement. Relevant strategies include storytelling, joking, showing passion, and making class fun. Dr. Chris Terman emphasized the importance of engaging students, especially after a long talk on technical topics. Dr. Patrick Winston, who taught Artificial Intelligence, used stories to provide a big picture and inspire learners:
I think stories are an important element of education, and if you strip them out, you don’t have much left that can possibly be inspiring… I call them powerful ideas. If all you’re teaching is skills, the educational experience you offer students is okay, but if you can accompany the skills with some big-picture, powerful ideas, the educational experience becomes more impactful, more important.
Another way to engage learners is to convey passion, as Dr. Winston stated:
There are things people need to know, but you can’t say they’re very exciting. You have to pretend they’re exciting. Somehow. Otherwise, your passion won’t come across and your teaching won’t be inspiring…
One of my colleagues told me that he always ends his lectures with something fun so that people feel like they’ve enjoyed the class the whole time. It could be a joke, or an historical anecdote, or an intriguing demo. I do that now, too. I always try to end with something fun.
Guest speakers provided diverse perspectives and engaged learners. Dr. Joel Schindall, who taught Engineering Innovation and Design, said:
In some cases, we’ll bring in someone from the Engineering faculty who is particularly gifted at communicating mechanical engineering design skills or electrical or chemical, because we want to give the students-- there tend to be some discipline unique ways of thinking, and we want to give the students an idea of what the broad range is.
Providing systematic support and feedback to learners was also important. The courses usually had an education team including instructors, lab staff, and teaching assistants (TAs). Dr. Chris Terman indicated the advantages of having a hierarchical feedback system:
But the students actually prefer the other thing, which is actually asking an TA is not very intimidating. The students, maybe they just took it last semester… And then you sort of work the chain up, work up the hierarchy to get an answer of people below. And that way you’re only asking questions of the more intimidating people when you’re pretty sure that no one else has the answer.
In addition, TAs provided optional tutorials, for which interested students could sign up. A third of the students in Dr. George Verghese’s course, Signals, Systems and Inference, attended the weekly tutorials. He noted:
The teaching assistants go prepared with a small set of basic problems, simpler than those on homework, and illustrating points that have come up in lecture. However, the tutorials are also teaching assistant office hours, and students are encouraged to come with questions they may have.
Several courses used a discussion forum as a platform for questions and answers, where the instructors and TAs provided timely feedback on students’ questions. When a student asked a question in the forum, other students with similar questions benefited from the response. The online discussion forum thus scaled up feedback and enhanced the efficiency of supporting students. Dr. Terman said:
For the first time I’m able to make a thoughtful answer to a question and have 180 people look at the answer instead of one. And then the next person who has the same question, you say, well, I just spent 10 minutes. And with a large class, you can’t spend 10 minutes for each of 300 people.
Learning communities, whether physical or virtual, also supported student learning. Dr. Verghese created a formal learning community by providing a physical learning space and TA support:
We reserve a classroom for the three or four evenings that precede the day homework is due, and guarantee that at least one of the staffs will be present there for 1.5-2 hours; usually we have the lecturer or a recitation instructor, as well as a teaching assistant. We find students working individually as well as collaboratively, and periodically interacting with the staff, either at the board or at their desk—very immersed and engaged in the homework problems, and in sorting out ideas and misconceptions related to these. The staff will typically respond to student questions with other (well chosen!) questions or hints that guide them along, rather than with answers—and that makes for a very fruitful dynamic.
Some instructors proposed sharing learning objectives with learners at the beginning of class. Dr. Patrick Winston said: “you want to tell them what they’ll be able to do at the end of the lecture that they couldn’t do in the beginning. I try to start every lecture with a promise, every time.” Dr. Philip B. Tan, who taught Creating Video Games, shared a similar idea:
So one thought that I have for educators who are running those classes is be very clear to yourself and then to the students about whether you are running a class about game design, or game programming, or project management. When we started this class, we were trying to be as clear as possible to the students. This is a class project management. You will do all those other things in the process of this class, and many of you are here for that reason.
Integrating research into teaching stimulates instructor motivation and engages learners. Some instructors used research results in their teaching and, in turn, were inspired to pursue related research topics. For instance, Dr. George Verghese, who taught Signals, Systems and Inference, noted:
I routinely discuss with my class such examples originating in research. I also bring in application examples from other fields, as opportunity arises. These various examples are motivating for the students, as they illustrate the relevance of the course material. It is also almost invariably the case that each time I lecture the subject, I encounter new questions and ideas to carry back to my research!
Similarly, Dr. Erik Demaine, an instructor of Design and Analysis of Algorithms, expressed his passion for algorithms in both research and teaching:
And all of my research is also around algorithms. So this is me living the dream, teaching the topic that I love. And it’s an exciting class… I try to add in new topics that I don’t know so well, so I learn them even better. And that, in turn, influences my research.
Formative assessment included quizzes and oral exams for just-in-time teaching, as well as an automated online environment for student self-assessment. Summative assessments were primarily final exams and projects.
Formative assessments were widely used by MIT OCW instructors. Some used in-class formative assessments to gauge students’ learning levels and adjust their teaching accordingly. For instance, Dr. Blade Kotelly stated, “we administer a quiz, we swap all the quizzes, we review all the answers… So we try to do a diagnostic to see what’s happening.” Similarly, Dr. Dennis Freeman used formative assessment for just-in-time teaching:
Since implementing the practice-theory-practice approach, I’ve become more careful to assess students’ understanding during lectures. I do this by asking a concept question every 15 minutes or so… Students work in pairs to answer the questions. I show them five possible answers, and they raise their hands showing some number of fingers that corresponds with their answer choice. I look at their responses. If everybody gets the question right, I know I don’t need to explain the concept again. I keep going. If some students get the question wrong, I provide more explanation. If everyone gets the question wrong, then I know I didn’t explain the concept well and I start from the beginning.
The instructors also used technology to enhance the efficiency of providing feedback. Dr. Sanjoy Mahajan noted that students in the design lab put their questions in an electronic help queue visible to all the students and TAs. TAs could use their mobile phones to respond, provide check-off points, and record the results online.
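The electronic help queue described above can be pictured as a simple first-in, first-out structure visible to students and staff. The sketch below is an illustrative assumption, not the actual MIT tool; the class and method names are hypothetical.

```python
from collections import deque

class HelpQueue:
    """A minimal FIFO help queue: students post questions visible to
    everyone; a TA claims the oldest question and records the outcome."""

    def __init__(self):
        self.waiting = deque()   # (student, question) pairs, oldest first
        self.log = []            # (student, question, checked_off) records

    def ask(self, student, question):
        """A student adds a question to the back of the queue."""
        self.waiting.append((student, question))

    def next_question(self):
        """A TA takes the oldest unanswered question, or None if empty."""
        return self.waiting.popleft() if self.waiting else None

    def resolve(self, student, question, checked_off=True):
        """Record the outcome, e.g. a check-off point, online."""
        self.log.append((student, question, checked_off))

# Hypothetical usage: questions are answered in the order they arrived.
q = HelpQueue()
q.ask("alice", "Why does my filter oscillate?")
q.ask("bob", "How do I scope the output?")
student, question = q.next_question()  # alice's question comes first
q.resolve(student, question)
```

The FIFO discipline mirrors the fairness property Dr. Mahajan highlights: because the queue is public, students can see their place in line and TAs always serve the longest-waiting question first.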
One method of checking whether students had mastered the learning content was oral exams. However, their scalability was limited by the large number of evaluators required. Dr. Freeman noted:
We asked very open-ended questions, from which we learned a lot. We would start with an easy question, a question that we expected everybody would get. If the student got it easily, then we would ask a harder question. If the student didn’t get it easily, then we’d ask another easier question. If he or she sailed through the easy question, we went straight to a difficult question. In other words, we adjusted to the student’s level, and it was really quick. In 10 minutes, and sometimes fewer, we had a good feeling for the student’s level of understanding.
Self-assessment took place in an online tutorial environment in which students checked their code and received feedback; if the code was incorrect, detailed feedback was provided. Dr. Dennis Freeman said: “Checking test cases offers much richer feedback. ‘Your code passes tests 1, 2, 4, and 5 but fails test 3.’ This message contains a wealth of information about not only the problem at hand but also about how to construct effective test cases, which is essential to becoming an expert programmer.”
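The kind of per-test feedback Dr. Freeman describes can be sketched as a minimal grading loop. This is a hypothetical illustration, not MIT's actual tutoring system; the function names and test cases are assumptions.

```python
def run_tests(student_fn, test_cases):
    """Run a submission against numbered test cases and report which
    pass and which fail, enabling feedback such as
    'Your code passes tests 1 and 3 but fails test 2.'"""
    passed, failed = [], []
    for i, (args, expected) in enumerate(test_cases, start=1):
        try:
            ok = student_fn(*args) == expected
        except Exception:
            ok = False  # a crash counts as a failed test
        (passed if ok else failed).append(i)
    return passed, failed

# Hypothetical buggy submission: absolute value that mishandles negatives.
def student_abs(x):
    return x  # bug: negative inputs are returned unchanged

tests = [((3,), 3), ((-3,), 3), ((0,), 0)]
passed, failed = run_tests(student_abs, tests)
# passed == [1, 3], failed == [2]
```

Reporting which specific tests failed, rather than a bare pass/fail verdict, is what gives students the "wealth of information" the quote refers to: the failing case itself hints at the missing behavior.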
Final exams, projects, and presentations formed the basis of summative assessment. Projects were used in Dr. Erik Demaine’s graduate-level course Algorithmic Lower Bounds: Fun with Hardness Proofs:
So with every sort of advanced class that I teach there’s a final project. And the goal of the final project is for students to somehow get their feet wet with the material and sort of experience it at a more researchy level. In general, this can be things like surveying papers that I didn’t cover in the class because there’s only so much you can fit in one semester. So they’ll go and read other material and kind of aim to teach that to the students. So there’s a written project part, and then there’s also a presentation in class. So this is an opportunity for students to learn more.
Challenges ranged from assessing students’ learning to changing other instructors’ pedagogical beliefs. Dr. Erik Demaine explained the challenge of assessing whether students could apply the algorithms creatively:
But it’s hard to measure a student’s understanding because it’s like, did you get the creative trick that we had in mind, or find another one that’s just as good? Students may find a different creative trick that doesn’t end up with as good an algorithm in the end. So we penalize that someone, but for the most part we are happy when people get correct algorithms.
Regarding colleagues’ pedagogical beliefs, Dr. Freeman stated:
In fact, it’s been a process to shift the teaching model toward a hands-on approach, such that most of the learning happens in the lab. It’s been hard to get the faculty on board with this shift. Some of our most effective lecturers have commented, “I don’t want to teach this course because there’s no teaching,” which is completely wrong. Usually by the time they’ve done it a few times, faculty realize that they’re actually imparting a lot more knowledge by facilitating hands-on learning than they would solely by lecturing. There’s a misconception on the part of “broadcast” lecturers that if they say it, students will understand it. That’s so wrong.
The purpose of this study was to examine MIT OCW STEM instructor insights, particularly their pedagogical strategies, assessment methods, and perceived challenges, in order to inform other STEM educators regarding effective pedagogical strategies and assessment methods. Fifteen MIT OCW instructor insights from the Department of Electrical Engineering and Computer Science were reviewed. A variety of pedagogical strategies were identified, such as active learning, personalizing instruction, engaging learners, providing feedback, building learning communities, clarifying learning objectives, and integrating teaching and research. Both formative and summative assessment methods were used. Challenges such as finding effective ways to assess learners and changing instructors’ pedagogical beliefs were identified.
Active learning is one of the primary themes of the instructor insights. The flipped classroom is one of the effective active learning strategies used by the instructors, which concurs with previous research (e.g., Freeman et al., 2014; Lund & Stains, 2015; Michael, 2006; Prince, 2004). The flipped classroom manifests the core pedagogical shift from lecture-centered teaching to learner-centered instruction that uses activities to engage learners and solve problems (Tucker, 2012). However, this pedagogical shift also encountered barriers, as identified in this study: some instructors doubted the effectiveness of active learning strategies and felt uncomfortable using them. This finding is in line with previous research indicating that barriers hinder the adoption of active learning strategies in the classroom (Finelli et al., 2014; Froyd et al., 2013; Lund & Stains, 2015; Prince, Borrego, Henderson, Cutler, & Froyd, 2013; Shadle et al., 2017) and that some instructors do not believe in the effectiveness of active learning (Tharayil et al., 2018). Thus, one possible way to address this problem is to provide professional development that shows instructors successful examples of using active learning.
Another important pedagogical strategy reported by MIT OCW instructors is personalized instruction. This study found that a common practice of personalization is providing optional materials to learners. Such personalized instruction addresses the needs of individual learners (Kelly, 2016) with diverse backgrounds, competencies, and requirements (Green et al., 2005). As blended learning is increasingly used in higher education, providing optional learning materials online could be one way to personalize instruction.
There are several limitations to this study. First, the researcher reviewed only the MIT OCW instructor insights. Reviewing the courses’ learning materials, such as videos and quizzes, to triangulate the data would likely have increased the trustworthiness of the study; future researchers could combine instructor insights with MIT OCW course learning materials for data triangulation. In addition, the pedagogical strategies were identified from the educators’ perspective; future research might interview learners who have taken MIT OCW STEM courses about their perceptions and experiences of effective instructional strategies.
This study contributes to both research and practice. For research, it indicated that a variety of pedagogical strategies could be used for STEM education, such as active learning and personalized instruction. In addition, formative assessments such as quizzes and in-class evaluations could be used for just-in-time teaching to improve teaching quality, while final exams and projects could serve as summative assessments in STEM education. The study indicated that the majority of the pedagogical strategies and assessment methods were aligned with the pedagogy of student-centered learning. This study could be an initial step in research on the impact of OER on STEM educators' teaching pedagogies.
For practice, STEM instructors and instructional designers could leverage the existing experience of MIT OCW STEM educators to improve teaching and student learning. Given the challenges identified, such as effectively and efficiently evaluating learning and changing faculty members' teaching beliefs, practitioners could keep these issues in mind while designing and delivering instruction and develop strategies to address them.
This research was completed as part of an OER Fellowship awarded to the first author by the Open Education Group. I thank Dr. John Hilton III for his support with this research. In addition, I would like to thank the MIT OCW team for sharing the course resources and instructor insights with the public. In particular, I truly appreciate the advice from Sarah Hansen, the OCW Educator Project Manager, on the available resources.
Abelson, H. (2008). The creation of OpenCourseWare at MIT. Journal of Science Education and Technology, 17(2), 164–174. https://doi.org/10.1007/s10956-007-9060-8
Allen, D., & Tanner, K. (2005). Infusing active learning into the large-enrollment biology class: Seven strategies, from the simple to complex. Cell Biology Education, 4(4), 262–268. https://doi.org/10.1187/cbe.05-08-0113
Borda, E., Schumacher, E., Hanley, D., Geary, E., Warren, S., Ipsen, C., & Stredicke, L. (2020). Initial implementation of active learning strategies in large, lecture STEM courses: Lessons learned from a multi-institutional, interdisciplinary STEM faculty development program. International Journal of STEM Education, 7(1), 4. https://doi.org/10.1186/s40594-020-0203-2
Bowen, G. A. (2009). Document analysis as a qualitative research method. Qualitative Research Journal, 9(2), 27–40. https://doi.org/10.3316/QRJ0902027
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
Braun, V., Clarke, V., & Rance, N. (2014). How to use thematic analysis with interview data. In A. Vossler & N. Moller (Eds.), The counselling & psychotherapy research handbook (pp. 183–197). London: Sage.
Center for Science, Mathematics, and Engineering Education, Committee on Undergraduate Science Education (1999). Transforming undergraduate education in science, mathematics, engineering, and technology. Washington, D.C.: National Academy Press.
Committee on STEM Education (CoSTEM) (2013). Federal science, technology, engineering, and mathematics (STEM) education: 5-Year strategic plan. Retrieved from https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/stem_stratplan_2013.pdf
d’Oliveira, C., Carson, S., James, K., & Lazarus, J. (2010). MIT OpenCourseWare: Unlocking knowledge, empowering minds. Science, 329(5991), 525–526. https://doi.org/10.1126/science.1182962
Ernst, H., & Colthorpe, K. (2007). The efficacy of interactive lecturing for students with diverse science backgrounds. Advances in Physiology Education, 31(1), 41–44. https://doi.org/10.1152/advan.00107.2006
Fairweather, J. (2008). Linking evidence and promising practices in science, technology, engineering, and mathematics (STEM) undergraduate education. Board of Science Education, National Research Council, The National Academies, Washington, DC.
Finelli, C. J., Daly, S. R., & Richardson, K. M. (2014). Bridging the research-to-practice gap: Designing an institutional change plan using local evidence. Journal of Engineering Education, 103(2), 331–361. https://doi.org/10.1002/jee.20042
Fisher, P., Zeligman, D., & Fairweather, J. (2005). Self-assessed Student Learning Outcomes in an Engineering Service Course. International Journal of Engineering Education, 21, 446–456. Retrieved from https://www.ijee.ie/articles/Vol21-3/IJEE1595.pdf
Flumerfelt, S., & Green, G. (2013). Using lean in the flipped classroom for at risk students. Educational Technology and Society, 16(1), 356–366. Retrieved from https://pdfs.semanticscholar.org/9100/8b1349b6fb6329727a2fa3c2d5960856fd9d.pdf
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. https://doi.org/10.1073/pnas.1319030111
Froyd, J., Borrego, M., Cutler, S., Henderson, C., & Prince, M. (2013). Estimates of use of research-based instructional strategies in core electrical or computer engineering courses. IEEE Transactions on Education, 56(4), 393–399. https://doi.org/10.1109/TE.2013.2244602
Goldberg, C. (2001, April 4). Auditing classes at M.I.T., on the web and free. The New York Times. Retrieved from http://web.mit.edu/ocwcom/MITOCW/Media/NYTimes_040301_MITOCW.pdf
Guttenplan, D. D. (2010, November 1). For exposure, universities put courses on the web. The New York Times. Retrieved from http://www.nytimes.com/2010/11/01/world/europe/01iht-educLede01.html?pagewanted=all&_r=0
Güzer, B., & Caner, H. (2014). The past, present and future of blended learning: An in-depth analysis of literature. Procedia - Social and Behavioral Sciences, 116, 4596–4603. https://doi.org/10.1016/j.sbspro.2014.01.992
Haworth, R. (2016). Personal learning environments: A solution for self-directed learners. TechTrends, 60, 359–364. https://doi.org/10.1007/s11528-016-0074-z
Kelly, R. (2016, July 14). 7 universities receive grants to implement adaptive learning at scale. Campus Technology. Retrieved from https://campustechnology.com/articles/2016/07/14/7-universitiesreceive-grants-to-implement-adaptive-learning-at-scale.aspx
Kim, K., Sharma, P., Land, S. M., & Furlong, K. P. (2012). Effects of active learning on enhancing student critical thinking in an undergraduate general science course. Innovative Higher Education, 38, 223–235. https://doi.org/10.1007/s10755-012-9236-x
Kimmons, R. (2015). OER quality and adaptation in K-12: Comparing teacher evaluations of copyright-restricted, open, and open/adapted textbooks. International Review of Research in Open and Distributed Learning, 16(5), 39–57. https://doi.org/10.19173/irrodl.v16i5.2341
Leicht, R. M., Zappe, S. E., Messner, J. I., & Litzinger, T. (2012). Employing the classroom flip to move “lecture” out of the classroom. Journal of Applications and Practices in Engineering Education, 3(1), 19–31. Retrieved from https://www.researchgate.net/publication/243458004_EMPLOYING_THE_CLASSROOM_FLIP_TO_MOVE_LECTURE_OUT_OF_THE_CLASSROOM
Lund, T. J., & Stains, M. (2015). The importance of context: An exploration of factors influencing the adoption of student-centered teaching among chemistry, biology, and physics faculty. International Journal of STEM Education, 2(13), 1–21. https://doi.org/10.1186/s40594-015-0026-8
McLoughlin, C., & Lee, M. J. (2010). Personalised and self-regulated learning in the Web 2.0 era: International exemplars of innovative pedagogy using social software. Australasian Journal of Educational Technology, 26(1), 28–43. https://doi.org/10.14742/ajet.1100
Michael, J. (2006). Where’s the evidence that active learning works? Advances in Physiology Education, 30(4), 159–167. https://doi.org/10.1152/advan.00053.2006
MIT News. (2001, April 4). MIT to make nearly all course materials available free on the world wide web. Cambridge, MA: MIT News Office. Retrieved from http://web.mit.edu/newsoffice/2001/ocw.html
MIT OpenCourseWare (2019a, August). Dashboard report. Retrieved from https://ocw.mit.edu/about/site-statistics/monthly-reports/MITOCW_DB_2019_08_v1.pdf
MIT OpenCourseWare (2019b, December). Site statistics. Retrieved from https://ocw.mit.edu/about/site-statistics/
Molinillo, S., Aguilar-Illescas, R., Anaya-Sánchez, R., & Vallespín-Arán, M. (2018). Exploring the impacts of interactions, social presence and emotional engagement on active collaborative learning in a social web-based environment. Computers & Education, 123, 41–52. https://doi.org/10.1016/j.compedu.2018.04.012
National Science Foundation (1996). Shaping the future: New expectations for undergraduate education in science, mathematics, engineering, and technology. Washington, D.C.: National Science Foundation.
Pierce, R., & Fox, J. (2012). Vodcasts and active-learning exercises in a “flipped classroom” model of a renal pharmacotherapy module. American Journal of Pharmaceutical Education, 76(10), 1–5. https://doi.org/10.5688/ajpe7610196
Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x
Prince, M., Borrego, M., Henderson, C., Cutler, S. L., & Froyd, J. (2013). Use of research-based instructional strategies in core chemical engineering courses. Chemical Engineering Education, 47(1), 27–37.
Reigeluth, C. M., Myers, R. D., & Lee, D. (2017). The learner-centered paradigm of education. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-Design theories and models (pp. 5–32). Routledge.
Scott, P. H., Veitch, N. J., Gadegaard, H., Mughal, M., Norman, G., & Welsh, M. (2018). Enhancing theoretical understanding of a practical biology course using active and self-directed learning strategies. Journal of Biological Education, 52(2), 184–195. https://doi.org/10.1080/00219266.2017.1293557
Shadle, S. E., Marker, A., & Earl, B. (2017). Faculty drivers and barriers: Laying the groundwork for undergraduate STEM education reform in academic departments. International Journal of STEM Education, 4(8), 1–13. https://doi.org/10.1186/s40594-017-0062-7
Steinert, Y., & Snell, L. S. (1999). Interactive lecturing: Strategies for increasing participation in large group presentations. Medical Teacher, 21(1), 37–42. Retrieved from https://www.mcgill.ca/medicinefacdev/files/medicinefacdev/InteractiveLecturingStrategies-MedicalTeacher1999.pdf
Tharayil, S., Borrego, M., Prince, M., Nguyen, K. A., Shekhar, P., Finelli, C. J., & Waters, C. (2018). Strategies to mitigate student resistance to active learning. International Journal of STEM Education, 5(7). https://doi.org/10.1186/s40594-018-0102-y
Tucker, B. (2012). The flipped classroom. Education Next, 12(1), 82–83. Retrieved from http://www.msuedtechsandbox.com/MAETELy2-2015/wp-content/uploads/2015/07/the_flipped_classroom_article_2.pdf
UNESCO (2002). Forum on the impact of open courseware for higher education in developing countries: Final report. Retrieved from www.unesco.org/iiep/eng/focus/opensrc/PDF/OERForumFinalReport.pdf
Watson, W. R., & Watson, S. L. (2017). Principles for personalized instruction. In C. M. Reigeluth, B. J. Beatty, & R. D. Myers (Eds.), Instructional-design theories and models (pp. 93–120). Routledge.
Wiley, D., Hilton, J., Ellington, S., & Hall, T. (2012). A preliminary examination of the cost savings and learning impacts of using open textbooks in middle and high school science classes. The International Review of Research in Open and Distance Learning, 13(3), 262–276. https://doi.org/10.19173/irrodl.v13i3.1153