In their seminal Seven Principles for Good Practice in Undergraduate Education, Chickering and Gamson (1987) articulated the following principles:
This set of principles is likely the best-known set of engagement factors (Kuh, 2009), having been cited almost 7,000 times. The seven principles identified factors that influence students’ engagement, success, and persistence during their undergraduate education. The principles have been used in multiple ways over the years, including as a lens for integrating technology into the classroom (Chickering & Ehrmann, 1996) and for evaluating online courses (Graham, Cagiltay, Lim, Craner & Duffy, 2001). They also informed the design of the National Survey of Student Engagement (NSSE) (Kuh, 2009). What is discussed far less often is how the principles were created (Chickering & Gamson, 1999). The principles were not derived solely from a systematic review of the literature. At a conference, Chickering and Gamson invited a group of experienced postsecondary educators to share what they knew about good practice in undergraduate education. Although the term did not exist at the time, Chickering and Gamson essentially used a crowdsourcing approach to co-construct the seven principles of good practice in undergraduate education.
Crowdsourcing—a portmanteau of “crowd” and “outsourcing” coined by Jeff Howe in a June 2006 Wired magazine article—is “the process by which the power of many can be leveraged to accomplish feats that were once the province of a specialized few” (Howe, 2008). Howe (2010) further defined crowdsourcing as “the act of taking a job traditionally performed by a designated agent (usually an employee) and outsourcing it to an undefined, generally large group of people in the form of an open call.” Also referred to as citizen science and citizen social science (e.g., Procter et al., 2013; also see https://scistarter.com/), this participatory process has recently gained popularity due to the increasing volume of scientific data and the limited centralized support available to process that data efficiently (Ranard et al., 2014). Although used extensively in the biomedical domain as a way to harness the computational power of many people to process large-scale biomedical data—such as to support genome sequence analysis (Kawrykow, Roumanis, Kam & Kwak, 2012; Rallapalli et al., 2015) and protein structure prediction (Cooper et al., 2010)—it is now also being used as a field-based research methodology in a wide range of disciplines. For example, crowdsourcing approaches have been used to classify distant galaxies (Lintott et al., 2008; also see https://www.galaxyzoo.org/), create digital geographic maps (Whitmeyer & De Paor, 2014), collect more representative data in forensic psychology research (Baker, Fox & Wingrove, 2016), tackle complex architectural design needs (Newton & Backhouse, 2013), validate assessments of interventions for speech disorders (Byun, Halpin & Szeredi, 2015), and support new product development (Schemmann, Hermann, Chappin & Heimeriks, 2016). The growing demand for crowdsourcing in research and development has led to social networked spaces such as Amazon Mechanical Turk (https://www.mturk.com/), CloudFactory (https://www.cloudfactory.com/), CrowdFlower (https://www.crowdflower.com/), and clickworker (https://www.clickworker.com/)—online, distributed sources of available workers. Social network platforms continue to change the way people connect and engage online; people no longer solely consume online content, but are empowered to actively participate and contribute. With the advent of online social networked spaces and platforms, crowdsourced content and opportunities for contribution are ubiquitous via YouTube, Twitter, Quora, Pinterest, TripAdvisor, Wikipedia, Kickstarter, and so on.
There are four types of crowdsourcing: collective intelligence, crowd creation, crowd voting, and crowd funding (Howe, 2008). All four types go far beyond divide-and-conquer approaches to goal achievement and research; they are true collaborations between and among members of the crowd, leading to much more than individual, isolated contributions. Crowdsourcing has fundamentally changed and enhanced how data, content, resources, solutions, and computing power are collected and shared, and it has proven effective for rapid and efficient data collection, especially where expert-level knowledge of a topic or discipline is not a necessity (Whitmeyer & De Paor, 2014). [To view a wide range of crowdsourcing projects, see https://en.wikipedia.org/wiki/List_of_crowdsourcing_projects.]
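To make the crowd-voting type concrete, consider the simplest mechanism of collective intelligence: aggregating many independent judgments into a consensus answer. The following Python sketch is ours and purely illustrative; the galaxy-classification votes are hypothetical, offered in the spirit of Galaxy Zoo (Lintott et al., 2008), where aggregated non-expert judgments approximate expert classification.

```python
from collections import Counter

def crowd_consensus(labels, min_votes=3):
    """Return the majority label and its vote share, or None if the
    'crowd' is still too small to trust a consensus."""
    if len(labels) < min_votes:
        return None
    label, votes = Counter(labels).most_common(1)[0]
    return label, votes / len(labels)

# Hypothetical votes from volunteers classifying one galaxy image
votes = ["spiral", "spiral", "elliptical", "spiral", "spiral"]
print(crowd_consensus(votes))  # -> ('spiral', 0.8)
```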
Online courses are part of the postsecondary teaching and learning landscape. Online education has grown from a fringe activity to something millions of people take part in (Allen, Seaman, Poulin, & Straut, 2016; Ginder & Stearns, 2014). Despite the popularity of online education, online educators are in many ways still trying to figure out the best ways to design and facilitate online learning experiences (Everson, 2009; Motte, 2013; Tallent-Runnels et al., 2006; Ubell, 2017). Fortunately, instructional design models have emerged to help designers and educators consider the critical instructional decisions inherent in designing and teaching online courses. One model in particular has gained considerable traction and has significantly influenced our work in online education: the Community of Inquiry (CoI) model. Garrison, Anderson, and Archer (2000) developed the model to describe how the interplay among teaching presence, social presence, and cognitive presence is foundational to the development of deep and meaningful educational experiences in online courses (see Figure 1). The CoI model emphasizes balanced instructional attention to teaching, social, and cognitive presence in order to cultivate an engaged online learning community (Lowenthal & Dunlap, 2014):
Because the CoI model is a descriptive model that does not provide much prescriptive guidance on how to intentionally design for and facilitate student learning and engagement in online courses (Garrison & Arbaugh, 2007), online educators continue to experiment with different ways of establishing a Community of Inquiry in their online courses (Dunlap & Lowenthal, 2014; Lowenthal & Dunlap, 2014). Online educators can make some inferences from the indicators of teaching presence developed by Anderson et al. (2001), but even these indicators lack sufficient detail (Dunlap, Verma & Johnson, 2016). There is also literature suggesting strategies for establishing social presence (Dunlap & Lowenthal, 2014; Lowenthal & Dunlap, 2014) and cognitive presence (Dunlap, Furtak & Tucker, 2009; Dunlap, Sobel & Sands, 2007; Sobel, Sands & Dunlap, 2009) in online courses; however, these strategies represent recommendations from a few as opposed to the many. Therefore, in much the same way Chickering and Gamson used a crowdsourcing approach to illuminate success factors for undergraduate education, we broadened the online-teaching conversation by crowdsourcing the specific recommendations online educators have for teaching online. Through this process we curated prescriptive strategies for actualizing the CoI model in the design and teaching of online courses.
We were interested in co-constructing a list of recommendations for online educators, using a crowdsourcing approach similar to the one Chickering and Gamson used to derive the seven principles for good practice in undergraduate education. Because crowdsourcing is a participative activity in which “an individual, an institution, a non-profit organization, or company proposes to a group of individuals of varying knowledge, heterogeneity, and number, via a flexible open call, the voluntary undertaking of a task” (Estellés-Arolas & González-Ladrón-de-Guevara, 2012, p. 197), we invited online educators from a variety of disciplines and with a range of experiences to share their recommendations for online teaching, knowing that experienced online educators—regardless of discipline or experience level—would be able to contribute relevant recommendations. We defined “experienced online educators” as educators who had taught at least one online course in the last three years, whether they designed the course themselves or inherited it from another educator. Although there are situational factors—such as disciplinary differences, course size, course length, and the preparation and dispositions of students and faculty—that make every online course unique (Dunlap, Furtak & Tucker, 2009; Dunlap, Verma & Johnson, 2016; Sobel, Sands & Dunlap, 2009), we believed many recommendations for online teaching would transcend situational factors in the same way Chickering and Gamson’s seven principles do. Our inquiry was shaped by the belief that there is value in exploring the day-to-day practice of online educators who have amassed recommendations for teaching online.
The recommendations were crowdsourced from online educators who attended our presentation sessions for special interest groups focused on online education at seven professional education conferences over a two-year period:
During each of our presentations, audiences collaborated on a shared Google Doc, with each participant anonymously contributing one or two recommendations about teaching online. We displayed the growing list of recommendations on screen as the audience added to it. As part of each session, we then opened up the conversation so the audience could discuss similarities, surprises, and future actions. In this way we used crowdsourcing to create an increasingly robust list of recommendations from online educators in the trenches; crowdsourcing audiences at professional conferences allowed us to tap into the collective intelligence of educators with online teaching experience.
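Because contributions arrived as free text in a shared document, some light curation is needed before analysis, for example collapsing stray whitespace and dropping duplicate entries. The following minimal Python sketch is ours and purely illustrative of what such cleanup might look like; the sample entries are hypothetical, not drawn from our data.

```python
def curate(contributions):
    """Normalize free-text contributions and drop duplicates,
    preserving the order in which recommendations were added."""
    seen, curated = set(), []
    for entry in contributions:
        text = " ".join(entry.split())   # collapse stray whitespace
        key = text.lower().rstrip(".")   # case/punctuation-insensitive key
        if text and key not in seen:
            seen.add(key)
            curated.append(text)
    return curated

# Hypothetical entries typed into a shared Google Doc
raw = [
    "Respond to student emails within 24 hours.",
    "respond to student emails within 24 hours",
    "  Use short videos to explain difficult concepts. ",
]
print(curate(raw))  # two unique recommendations survive
```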
Individually, we examined the curated recommendations for common themes. Although our own work as online educators is significantly influenced by the Community of Inquiry (CoI) model, we intentionally set the model aside as we analyzed and sorted the recommendations; we wanted themes to emerge organically from the data. After we each sorted the recommendations into general categories (e.g., learning, teaching, design, support), we worked together to define and describe specific themes. Through our collaborative analysis we found that the recommendations consistently fell into four themes: (a) supporting student success, (b) providing clarity and relevance through content structure and presentation, (c) establishing presence to encourage a supportive learning community, and (d) being better prepared and more agile as an educator. Below is a sampling of recommendations to illustrate each theme.
Experienced online educators shared strategies for supporting students in online courses so that students are positioned to succeed. For example, some of the recommendations referred to the need to:
Experienced online educators also shared lessons learned regarding the structure of online courses and the presentation of content within them. Their recommendations include:
Interestingly, the largest number of recommendations shared by experienced online educators fell into the “presence” theme. Online educators commented on the importance of connecting with students, helping students connect with each other, and helping students feel they are members of a supportive learning community. Recommendations include:
Experienced online educators also pointed to being better prepared and more agile as useful lessons they had learned, sharing recommendations such as:
The four themes that emerged from our analysis of the recommendations resonated well with the Community of Inquiry (CoI) model:
This alignment with the CoI model—arguably the most popular framework for the research and practice of online learning—has reinforced for us that using crowdsourcing to curate recommendations from experienced online educators is a sound approach to broadening the conversation and taking advantage of online educators’ collective intelligence. It has also reinforced the soundness of the recommendations online educators shared, and the appropriateness of heeding their advice; the themes and associated recommendations have the potential to help faculty new to teaching online courses start out on solid footing, and to help continuing online educators consider alternatives and enhancements to their course design and facilitation.
We found that crowdsourcing online educators during live professional conference sessions was fruitful, yielding many insightful recommendations. Although crowdsourcing as a research methodology has limitations (see Khare, Good, Leaman, Su & Lu, 2016), our experience in this project clearly illustrated the central principle of crowdsourcing: that the collective intelligence of a group generally leads to more valuable results than the limited contributions of a few (Howe, 2008). The benefit of this approach is that the results are authentic and credible because they come from experienced online educators. The recommendations ring true to people learning how to be effective online educators because they are derived from people just like them: educators who care about the quality of the online-learning experiences they design and facilitate, and who face similar professional pressures, opportunities, and constraints.
Through our analysis of experienced online educators’ recommendations, we identified four themes related to effective online course design and facilitation: (a) supporting student success, (b) providing clarity and relevance through content structure and presentation, (c) establishing presence to encourage a supportive learning community, and (d) being better prepared and more agile as an educator. These themes and associated recommendations are relevant for faculty new to online teaching, as well as for those already in the trenches. The work is significant because it captures the lessons experienced online educators have learned about designing and facilitating online courses—lessons based on their experimentation, assessment, revision, and reflection. In addition, the work is an example of how professional conferences can serve as opportunities for crowdsourcing; this participatory approach recognizes the expertise of our colleagues and demonstrates that we value it. Finally, the work offers an additional data point in the larger scholarly quest to provide online educators with prescriptive guidance on how best to design and facilitate online courses. Through this work—which we continue to add to, especially in light of the increasing use of synchronous communication and collaboration tools and spaces in online courses—we hope to inspire our colleagues and students to (a) consider their own unique lessons learned, (b) explore different ways to attend to those lessons in their online courses, and (c) consider crowdsourcing as a research methodology.
Allen, I. E., Seaman, J., Poulin, R., & Straut, T. T. (2016, February). Online report card: Tracking online education in the United States. Babson Survey Research Group and Quahog Research Group. Retrieved from https://www.onlinelearningsurvey.com/reports/onlinereportcard.pdf
Baker, M. A., Fox, P., & Wingrove, T. (2016). Crowdsourcing as a forensic psychology research tool. American Journal of Forensic Psychology, 34(1), 37-50. https://doi.org/10.13140/RG.2.1.1308.3129
Byun, T. M., Halpin, P. F., & Szeredi, D. (2015). Online crowdsourcing for efficient rating of speech: A validation study. Journal of Communication Disorders, 53, 70-83. https://doi.org/10.1016/j.jcomdis.2014.11.003
Chickering, A. W., & Gamson, Z. F. (1999). Development and adaptations of the seven principles for good practice in undergraduate education. New Directions for Teaching and Learning, 80, 75-81. https://doi.org/10.1002/tl.8006
Cooper, S., Khatib, F., Treuille, A., Barbero, J., Lee, J., Beenen, M., Leaver-Fay, A., Baker, D., Popović, Z., & Foldit players. (2010). Predicting protein structures with a multiplayer online game. Nature, 466(7307), 756-760. https://doi.org/10.1038/nature09304
Dunlap, J. C., Furtak, T. E., & Tucker, S. A. (2009). Designing for enhanced conceptual understanding in an online physics course. TechTrends, 53(1), 67-73. https://doi.org/10.1007/s11528-009-0239-0
Dunlap, J. C., & Lowenthal, P. R. (2014). The power of presence: Our quest for the right mix of social presence in online courses. In A. A. Piña & A. P. Mizell (Eds.), Real life distance education: Case studies in practice (pp. 41-66). Greenwich, CT: Information Age Publishing.
Dunlap, J. C., Sobel, D. M., & Sands, D. (2007). Supporting students’ cognitive processing in online courses: Designing for deep and meaningful student-to-content interactions. TechTrends, 51(4), 20-31. https://doi.org/10.1007/s11528-007-0052-6
Dunlap, J. C., Verma, G., & Johnson, H. (2016). Presence+Experience: A framework for the purposeful design of presence in online courses. TechTrends, 60, 145-151. https://doi.org/10.1007/s11528-016-0029-4
Estellés-Arolas, E., & González-Ladrón-de-Guevara, F. (2012). Towards an integrated crowdsourcing definition. Journal of Information Science, 38(2), 189-200. https://doi.org/10.1177/0165551512437638
Everson, M. (2009, September). 10 things I’ve learned about teaching online. eLearn Magazine. Retrieved from http://elearnmag.acm.org/featured.cfm?aid=1609990
Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105. https://doi.org/10.1016/s1096-7516(00)00016-6
Garrison, D. R., & Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education, 10(3), 157-172. https://doi.org/10.1016/j.iheduc.2007.04.001
Ginder, S., & Stearns, C. (2014, June). Enrollment in distance education courses, by state: Fall 2012. Washington, DC: National Center for Education Statistics, U.S. Department of Education. Retrieved from http://nces.ed.gov/pubs2014/2014023.pdf
Howe, J. (2010). Crowdsourcing: A definition. Retrieved from http://crowdsourcing.typepad.com/
Kawrykow, A., Roumanis, G., Kam, A., & Kwak, D. (2012). Phylo: A citizen science approach for improving multiple sequence alignment. PLoS ONE, 7(3), e31362. https://doi.org/10.1371/journal.pone.0031362
Khare, R., Good, B. M., Leaman, R., Su, A. I., & Lu, Z. (2016). Crowdsourcing in biomedicine: Challenges and opportunities. Briefings in Bioinformatics, 17(1), 23-32. https://doi.org/10.1093/bib/bbv021
Kuh, G. D. (2009). The National Survey of Student Engagement: Conceptual and empirical foundations. New Directions for Institutional Research, 141, 5-20. https://doi.org/10.1002/ir.283
Lintott, C. J., Schawinski, K., Slosar, A., Land, K., Bamford, S., Thomas, D., Raddick, M. J., Nichol, R. C., Szalay, A., Andreescu, D., Murray, P., & Vandenberg, J. (2008). Galaxy Zoo: Morphologies derived from visual inspection of galaxies from the Sloan Digital Sky Survey. Monthly Notices of the Royal Astronomical Society, 389(3), 1179-1189. https://doi.org/10.1111/j.1365-2966.2008.13689.x
Motte, K. (2013). Strategies for online educators. Turkish Online Journal of Distance Education, 14(2), 258-267. Retrieved from http://dergipark.ulakbim.gov.tr/tojde/article/view/5000102223
Newton, C., & Backhouse, S. (2013). Competing in architecture: Crowdsourcing as a research tool. FormAkademisk, 6(4), 1-13. Retrieved from https://journals.hioa.no/index.php/formakademisk/article/view/746/702
Procter, R., Housley, W., Williams, M., Edwards, A., Burnap, P., Morgan, J., Rana, O., Klein, E., Taylor, M., Voss, A., Choi, C., Mavros, P., Smith, A. H., Thelwall, M., Ferne, T., & Greenhill, A. (2013). Enabling social media research through citizen social science. In M. Korn, T. Colombino, & M. Lewkowicz (Eds.), The 13th European Conference on Computer Supported Cooperative Work (ECSCW) Adjunct Proceedings. Retrieved from http://mkorn.binaervarianz.de/pub/ECSCW_2013_Adjunct_Proceedings-web.pdf#page=59
Rallapalli, G., Saunders, D. G., Yoshida, K., Edwards, A., Lugo, C. A., Collin, S., Clavijo, B., Corpas, M., Swarbreck, D., Clark, M., Downie, J. A., Kamoun, S., & MacLean, D. (2015). Cutting edge: Lessons from Fraxinus, a crowd-sourced citizen science game in genomics. eLife, 4, e07460. https://doi.org/10.7554/eLife.07460
Ranard, B. L., Ha, Y. P., Meisel, Z. F., et al. (2014). Crowdsourcing—harnessing the power of online communities to study medicine and public health. Journal of General Internal Medicine, 29(1), 187-203. https://doi.org/10.1007/s11606-013-2536-8
Schemmann, B., Hermann, A. M., Chappin, M. M. H., & Heimeriks, G. J. (2016). Crowdsourcing ideas: Involving ordinary users in the ideation phase of new product development. Research Policy, 45(6), 1145-1154. https://doi.org/10.1016/j.respol.2016.02.003
Tallent-Runnels, M. K., Thomas, J. A., Lan, W. Y., Cooper, S., Ahern, T. C., Shaw, S. M., & Liu, X. (2006). Teaching courses online: A review of the research. Review of Educational Research, 76(1), 93-125. https://doi.org/10.3102/00346543076001093
Ubell, R. (2017, January). Why faculty still don’t want to teach online. OLC Insights. Retrieved from https://onlinelearningconsortium.org/faculty-still-dont-want-teach-online/
Whitmeyer, S. J., & De Paor, D. G. (2014, November). Crowdsourcing digital maps using citizen geologists. Eos, Transactions, American Geophysical Union, 95(44), 397-408. https://doi.org/10.1002/2014EO440001