High Structure Renewable Assignments: A Design Study

Authors:

Peter Daniel Wallis, The University of Washington, US
Jennifer Mae White, The University of Washington, US
Stephen Kerr, The University of Washington, US

Abstract

We seek to guide the design, development, and adoption of Renewable Assignments by testing ways learners can contribute to Open Educational Resources (OER). To this end, we design, test, and iterate four assignment structures. Testing was completed in an upper-division undergraduate endocrinology course, taught via emergency remote instruction due to COVID-19.

Using mixed methods (surveys, focus groups, and design iterations), we assessed the assignment structures and created design guidance for renewable assignments and open pedagogy. We find that, in a remote course, these assignments were effective in advancing learning goals, and both students and teachers favored their inclusion in the course. Analysis revealed six design principles for maximizing the effectiveness of renewable assignments and courses and for empowering teachers and learners to contribute to open knowledge. These principles also provide insight into praxis related to theories of open pedagogy, scaffolding, peer interaction, and active learning.

How to Cite: Wallis, P. D., White, J. M., & Kerr, S. (2022). High Structure Renewable Assignments: A Design Study. Open Praxis, 14(1), 39–53. DOI: http://doi.org/10.55982/openpraxis.14.1.146
Published on 25 Nov 2022. Accepted on 13 Feb 2022. Submitted on 14 Sep 2021.

Introduction: Nature of the Problem

This project was motivated by some of the broader problems in higher education. College costs have risen consistently and considerably (College Board, 2019), and textbook costs have increased sharply (Weissmann, 2013). Open textbooks save students money, reduce dropout, and increase student engagement (Colvard et al., 2018), while opening the door to promising practices of open pedagogy (Paskevicius & Irvine, 2019); yet adoption remains low (Biswas-Diener, 2017). The efficacy of college teaching, and the worth of college degrees, have been called into question (Bennett & Wilezol, 2013). Active learning designs have been shown to be clearly effective (Freeman et al., 2014), yet their adoption also remains limited (Miller & Metz, 2014; Eickholt et al., 2019).

We seek to address these problems through the design of high-structure renewable assignments, creating pathways through which teachers and learners may simultaneously reduce costs and increase learning. We build upon the conceptual framework laid out by Wiley and Hilton (2018), which proposes renewable assignments: assignments that invite, empower, and guide students to contribute to open knowledge. We hope to expand this doubly beneficial pathway by answering the following research questions:

  1. How might we design active, structured, renewable assignments?
  2. What is the feedback of teachers and learners regarding these assignment designs?
  3. What practices and design guidance are revealed as we test and iterate renewable assignment designs in an authentic teaching context?

To answer these questions, we implemented, iterated, and gathered feedback on renewable assignments our students used to edit an open course pack. To design these assignments, we drew on research concerning open textbooks, high structure active learning, tagging, and peer review. Our literature review briefly covers the developing research in these areas.

Literature Review

Open Textbooks, Open Pedagogy, and OER-enabled pedagogy

Open Textbooks: Over the past 20 years, researchers have developed an understanding of the adoption and use of open textbooks. Overall, they find open textbooks save students money (Hilton, 2016) and increase desirable academic outcomes, especially persistence among students from marginalized groups (Colvard et al., 2018), while not negatively impacting other academic outcomes. No clear drawbacks have emerged in research on open textbooks.

Faculty and student perceptions of open textbooks are generally positive, with some concerns about quality and the difficulty of adopting a new textbook (Weller et al., 2017). Concerns also center on the lack of ‘supporting materials’ such as quiz/review questions, labs, and LMS/online supplements. In part because of these concerns, adoption of open textbooks has remained low despite their benefits (Dastur, 2017). Another main barrier to adoption is the lack of relevant open materials for specific courses or sub-disciplines (Seaman & Seaman, 2017). We hypothesize that empowering learners and teachers to co-create and improve textbooks and supporting materials is a key method by which educators and institutions can address this limitation.

High Structure Active Learning

Learners and teachers co-creating open materials is a form of active learning. High structure active learning practices have been shown to improve outcomes for all students, and particularly for students from marginalized backgrounds (see Freeman et al., 2014, for a meta-analysis). We are not aware of a framework or research that directly measures ‘structure’ in assignments, so we propose a comparative definition: high structure is characterized by shorter periods of time (assignments completed in hours and due in weeks, as opposed to days and semesters) and greater scaffolding (detailed instructions and steps).

The up-front costs of low-structure active learning are lower for teachers: a simple term paper assignment, for example, requires less construction than multiple assignments walking students through each step of thesis, outline, and paper construction and revision. However, the greater benefits of higher structure fall in line with well-researched theories, particularly extraneous cognitive processing (Mayer, 2017), scaffolding (Doo et al., 2020; Ninio & Bruner, 1978), and Kirschner et al.’s (2006) criticism of the lack of evidence for the results of lower structure active learning.

Given the empirical and theoretical support for higher structure, we adopted higher structure designs like tagging and peer review in our renewable assignments. What is unclear in the literature, and is a key design question, is whether high structure active learning reduces metacognition – learners’ thinking about their own thinking and the structure of the field.

Tagging, Social Annotation, and Peer Review

Tagging, social annotation, and peer review have been relatively well studied, both as specific modes of high structure active learning and as social media. See Ghadirian et al. (2018) and Krouska et al. (2018) for literature reviews of social annotation, Macgregor and McCulloch (2006) for a review of tagging, and Double et al. (2020) for a meta-analysis of peer review. Although these structures have generally not been used in OER-enabled learning environments, tagging, peer review, and social annotation have been found effective in engaging students and increasing their learning in other environments.

In general, social annotation refers to any work done to ‘mark up’ a text by adding explanation or marginalia. Within academic studies, annotation usually takes the form of a sentence or paragraph of commentary next to the text. By contrast, tagging generally refers to adding metadata to texts or images, usually in the form of single-word markers. Peer review generally involves students evaluating (often through annotation) other students’ contributions. Ghadirian et al. (2018) note that there has been little design research on tagging and social annotation, and we note that design studies are generally lacking in peer review assignment research as well.

Methods

Study system & Limitations

“Design research studies problems in their inherent messiness” (Sandoval & Bell, 2004). Our research (Figure 1) took place during the COVID-19 pandemic. The University of Washington moved all courses to emergency remote teaching late in Winter Quarter 2020, and Spring Quarter 2020 was conducted almost entirely online. During that quarter, the teacher in this teacher-researcher partnership taught this upper-level undergraduate course in endocrinology (Biology) for the first time, using an open pedagogy model and working with students to begin adapting the previous teacher’s course pack into an open text. In the last weeks of Spring Quarter, we formed our teacher-researcher partnership, determined initial assignment designs, and planned to test and evaluate high structure renewable assignments in Summer Quarter.

Figure 1

Timeline of research activities, showing stages of research feedback and changes in learning technologies and assignment designs over the academic quarter and the three preceding weeks. The main emphasis is on the ongoing nature of feedback, analysis, and iterative change.

Population

The summer quarter course had 24 students enrolled, of whom 16 agreed to participate in data gathering and focus groups. The course was taught remotely from Seattle. Student demographics approximated the general demographics of the University of Washington, which in Summer 2020 were 56% female and 44% male; 4.2% of students identified as African-American, 1.2% American Indian, 25% Asian, 40.3% Caucasian, 0.9% Hawaiian/Pacific Islander, 7.4% Hispanic or Latino, and 18% “international,” while 3% did not indicate an ethnicity (https://studentdata.washington.edu/quick-stats/).

The course was taught by one teacher and one teaching assistant (TA). Throughout the course, as well as in a final focus group, the TA and teacher gave feedback and input that informed iterations and research direction.

Assignment Designs

Before the start of the course, the instructor and researcher met three times (approximately three hours total) to select and define assignment designs. We discussed the desired outcomes of the course in terms of specific knowledge and conceptual shifts, sharing the goal that students learn practices of knowledge generation and critique and think about how knowledge is created and learned. We wrote and edited four assignment designs based on the literature summarized above; brief descriptions follow. To our knowledge, this is the first study to present tagging and peer review used in open textbook revision.

  1. Tagging: An exercise in which individual students highlighted and tagged (socially annotated with tags) chapter-length pre-existing course pack materials, initially with the following four tags:
    Core – This section (length selected by students, generally about a sentence) is core to, or feels really important to, the chapter or to endocrinology as a whole.
    Unclear – This section is unclear to me. (If you don’t understand something, that may not always be your fault!)
    Connect – This section feels disconnected from the rest of the text.
    Incorrect – This section contains an error, is incorrect, or is out of date.
    In class, we briefly reviewed and summarized students’ tags of the chapters in question following their tagging.
  2. Peer Review: Student groups reviewed paragraph-length suggested additions to the course pack/open text generated by students in the previous quarter. We provided a detailed rubric for evaluating these sections.
  3. Working group assignments: Student groups reviewed and re-wrote small sections (one paragraph or image) of the course pack materials for conversion into an open textbook.
  4. Chapter rewrite: Students individually annotated and re-wrote larger sections of the course pack materials.

We included 11 working group assignments (with 4 drops, meaning students’ lowest scores, including any assignments not turned in, did not affect their grades), 5 tagging and 5 peer review assignments (1 drop each), and one final chapter rewrite which included elements of tagging, peer review, and revision. The course also included 4 low-stakes quizzes (in total accounting for less than 10% of students’ grades).

In-Class Opportunistic Data Collection

During class sessions, conducted over Zoom, we gave students opportunities to comment on the assignments. Several students were open with their experiences, feedback, and concerns, and we cautiously integrated their thoughts into iterations, cross-checked against a mid-quarter survey. Generally, the same students spoke up throughout the quarter, and this opportunistic participation peaked around the introduction of new assignments. During group work sessions in class time, the teacher, TA, and researcher would often discuss assignments and plan iterations.

Mid-course Survey

In week 6 (of 10), we surveyed students about the assignment designs, particularly asking questions about tagging and peer review as novel designs. 22 of 24 students completed the survey. Students who did not consent to the full research protocol had the option to participate in the survey anonymously.

Assignment Design Iterations

Throughout the course, we used feedback from opportunistic conversations and surveys to make changes to assignment designs. Student and instructor feedback centered on tagging. In hindsight, we believe this was because this design was novel, and novelty sparked additional conversation and questions. For more discussion, see the Future Directions section.

Tagging iterations: We made changes to which tags were available and to how and when students could see other students’ tags or tagging patterns. We also changed the system we used for tagging, from Google Drive comments to hypothes.is (a social annotation system).

Changes to tags were mostly additions to the words used to tag, in response to students’ requests. We split “Core” into important concepts and “ah-ha moments.” After students expressed hesitance to tag problems or errors, for fear doing so would expose their ignorance, we added detailed tags for potential problems in the text (“repeated,” “unclear,” and “needs more context”), as we hoped these would help students use more critical tags.

In our first iteration of tagging, students tagged a shared document in which each student could see all other students’ tags. While all students successfully completed the assignment of adding five tags, those who read last struggled to find novel tags. Students also reported wanting to tag independently and then see each other’s tags, which aligns with prior research (see the mid-quarter survey and design guidance). Later iterations of tagging therefore assigned each student a separate document, which they tagged independently.

For the last tagging assignment, we moved from PDFs in Google Drive to Pressbooks, where students tagged using hypothes.is, and we allowed comments alongside tags. This change informed students’ comments on usability and on the inclusion of comments (Figure 2).

Figure 2

Anonymized screenshot of later tagging in Pressbooks and hypothes.is, showing highlighted text about brain regions and, in a sidebar, a student comment tagging it as ‘core.’
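As an illustration of how such in-class tag summaries could be automated, a minimal sketch follows. This script is ours, not part of the study protocol; the API token, group ID, and chapter URL are hypothetical placeholders. It uses the public hypothes.is search API to count how often each tag appears on one chapter.

    # Illustrative sketch only: summarize students' hypothes.is tags for one
    # chapter. The token, group ID, and chapter URL below are placeholders.
    from collections import Counter

    import requests

    API_URL = "https://api.hypothes.is/api/search"
    API_TOKEN = "YOUR-DEVELOPER-TOKEN"  # hypothetical hypothes.is developer token
    GROUP_ID = "abc123pq"               # hypothetical private course group

    def tag_counts(chapter_uri: str) -> Counter:
        """Count how often each tag appears on annotations of one chapter."""
        counts: Counter = Counter()
        offset = 0
        while True:
            response = requests.get(
                API_URL,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                params={"uri": chapter_uri, "group": GROUP_ID,
                        "limit": 200, "offset": offset},
            )
            response.raise_for_status()
            rows = response.json()["rows"]
            if not rows:
                break
            for annotation in rows:
                counts.update(annotation.get("tags", []))
            offset += len(rows)
        return counts

    if __name__ == "__main__":
        # Hypothetical Pressbooks chapter URL
        chapter = "https://uw.pressbooks.pub/endocrinology/chapter/hypothalamus/"
        for tag, n in tag_counts(chapter).most_common():
            print(f"{tag}: {n}")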

Peer Review Iterations: In contrast to tagging, students gave less feedback on peer review during course sessions. Lacking learner feedback, the teaching team worked to assign particularly high-quality submissions for students to peer review. We hoped this change would increase student learning (as students had to focus more closely to find any issues) and focus peer reviews on the submissions likely to contribute most. In the final weeks of the quarter, in parallel with the tagging exercise, we assigned peer reviews in Pressbooks and hypothes.is, as opposed to the earlier Google Drive and Canvas. We made this change both for comparison and to begin use of the open Pressbooks platform.

Group Work Iterations: As with peer review, we completed little iteration based on student feedback about group work. We did offer students the opportunity to change group membership; only one group did so, and most reported satisfaction with their group assignments and their groups’ contributions. The teaching team reported that student satisfaction with groups and contributions to group work was generally higher in this OER-enabled context than in most disposable assignment designs.

End-of-Course Data Collection (Survey & Focus Groups)

Post-course survey: At the end of the course, we distributed a survey to students, focused on experiences with and perceptions of tagging and peer review. We included questions about experience with the iterations. For the full survey, see Appendix A.

Focus Groups with learners: Within two weeks of the course ending, we conducted three focus groups with learners (8 participants in total). All focus groups were conducted virtually: two synchronously through videoconferencing software, and one asynchronously through the messaging system in the LMS. Synchronous and asynchronous groups used the same questions, and answer length was similar across modalities. Each focus group represents approximately one hour of spoken time. We asked participants to speak to questions centering on their experiences with the assignment designs (see Appendix B for focus group questions).

Focus Group with the instructional team: Following the course, we held one focus group with the instructional team (the course instructor [an author] and the graduate student TA). We asked about the assignment designs, difficulties administering or grading assignments, and feedback on the structure of the course. The researcher conducted this focus group after completing the student focus groups and their analysis, to center students’ input and to avoid letting teacher perspectives drive the questions for, or analysis of, student responses.

Analysis

We conducted a thematic analysis of the open text response fields in the mid-quarter survey, the final survey, the three student focus groups, and the one teacher focus group. After anonymizing focus group notes and survey responses, we reviewed open-ended answers to extract themes, categorized key quotes, and re-reviewed answers to ensure that the extracted themes and conclusions represented student comments and foci. Finally, we counted the appearance of each theme within each instance of feedback (six instances in total: two surveys and four focus groups).
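To make the counting step concrete, a minimal sketch follows. The coded data here is entirely hypothetical (theme labels and counts are invented for illustration): each feedback instance maps to a tally of the themes coded in it, and per-theme totals like those in Table 1 come from summing across instances.

    # Minimal sketch of the theme-counting step. The coded data below is
    # hypothetical; in the study, themes were extracted by hand from surveys
    # and focus group notes.
    from collections import Counter

    coded = {
        "mid_q_survey": Counter({"tagging increased careful reading": 3}),
        "final_survey": Counter({"tagging increased careful reading": 2,
                                 "ease of use": 1}),
        "fg1": Counter({"compare only after contributing": 1}),
        "fg2": Counter(), "fg3": Counter(), "fg4_teachers": Counter(),
    }

    totals: Counter = Counter()
    for themes in coded.values():
        totals.update(themes)  # sum each theme's mentions across instances

    for theme, n in totals.most_common():
        print(f"{theme}: {n}")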

Results

Thematic Analysis

Table 1 contains short descriptions of each theme and the number of times learners mentioned it, excluding themes with two or fewer mentions.

Table 1

Thematic analysis of teacher and learner comments.


THEME                                                        SUM   MENTIONS BY INSTANCE
                                                                   (MID-Q SURVEY, FINAL SURVEY, FG1, FG2, FG3, FG4 [TEACHERS])

Positive – all assignment designs                            112   0, 39, 0, 6, 8, 18, 5
Positive general re: OER-enabled pedagogy                      8   1, 1, 1, 3, 2
Positive – tagging increased:                                 88   38, 1, 2, 9, 2
    careful reading                                           28   21, 1, 2, 4
    critical thinking                                         21   19, 1, 1
    focus                                                     18   16, 1, 1
    memory                                                     5   4, 1
    reading                                                    7   5, 1, 1
    understanding of lecture                                   6   6
    (was simple)                                               3   2, 1
Positive – peer review                                        13   1, 1, 4, 6, 1
Positive – working group                                       8   2, 2, 3, 1
Positive – final exercise                                      3   2, 1

Negative – all assignment designs                             22   0, 13, 0, 1, 4, 4, 0
Negative – tagging (general)                                  18   13, 1, 4
    tagging disconnected paragraphs from the overall text      4   2, 2
    tagging distracted from reading for content               12   10, 1, 1
Negative – peer review
Negative – working group                                       1   1
Negative – final exercise                                      3   3

Neutral (or positive to teachers, negative to students)       14
    Tagging resulted in students re-reading                   11   5, 2, 2, 2
    Tagging was time consuming                                 3   2, 1

Design guidance – structure within assignments
    Integrate individual skills                                6   1, 4, 1
    Compare only after contributing                            7   1, 4, 2

Design guidance – structure across assignments
    Create opportunities to dive deeper                       19   5, 3, 6, 3, 2
        by creating glossary                                   2   1, 1
    Design for depth and breadth                               8   2, 2, 1, 1, 2

Technology guidance
    Allow tags + comments                                     14   6, 1, 4, 2, 1
    Ease of use                                               12   0, 8, 4

Quantitative Survey Analysis

In the mid-quarter survey, in addition to open-ended questions, we asked several quantitative, Likert-style questions with a range of 1–6, where 6 indicates strong agreement and 1 strong disagreement. Table 2 provides descriptive statistics for the results of the quantitative questions in this survey, and Table 3 provides the choice counts for each option on each prompt.

Table 2

Descriptive statistics of Likert-style questions in mid-quarter survey.


DESCRIPTIVE STATISTICS

                                                           N    MIN   MAX   MEAN   STD. DEVIATION
Tagging is a useful exercise                               22    2     6    4.55       1.011
I’d like to see other students’ tags while I’m tagging     22    1     5    2.82       1.097
I’d like to see other students’ tags after I’m done        22    1     6    4.18       1.708
I’d like access to others’ tags for the final project      22    2     6    5.27       1.120
Valid N (listwise)                                         22

Table 3

Counts on scale of Likert-style questions in the mid-quarter survey.


COUNTS ON LIKERT SCALE

                                                           6 (“STRONGLY AGREE”)   5   4   3   2   1 (“STRONGLY DISAGREE”)
Tagging is a useful exercise                                        4             7   9   1   1   0
I’d like to see other students’ tags while I’m tagging              0             1   5   8   5   3
I’d like to see other students’ tags after I’m done                 9             0   4   5   3   1
I’d like access to others’ tags for the final project              14             2   5   0   1   0
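The statistics in Table 2 follow arithmetically from the counts in Table 3. As a worked check (our sketch, not part of the study materials; prompt labels are abbreviated), the script below expands each prompt’s response counts into individual 1–6 ratings and recomputes the mean and sample standard deviation reported in Table 2.

    # Recompute Table 2's descriptive statistics from Table 3's counts.
    from statistics import mean, stdev

    # Response counts from Table 3, ordered from 6 ("strongly agree")
    # down to 1 ("strongly disagree"); each prompt had N = 22 responses.
    counts = {
        "Tagging is a useful exercise": [4, 7, 9, 1, 1, 0],
        "See others' tags while tagging": [0, 1, 5, 8, 5, 3],
        "See others' tags after tagging": [9, 0, 4, 5, 3, 1],
        "Access to others' tags for final project": [14, 2, 5, 0, 1, 0],
    }

    for prompt, row in counts.items():
        # Expand counts into the individual 1-6 ratings they represent
        ratings = [score for score, n in zip(range(6, 0, -1), row)
                   for _ in range(n)]
        # statistics.stdev uses the n - 1 denominator, matching Table 2
        print(f"{prompt}: N={len(ratings)}, "
              f"mean={mean(ratings):.2f}, sd={stdev(ratings):.3f}")

Running this reproduces, for example, N = 22, mean = 4.55, and standard deviation = 1.011 for the first prompt.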

Final Survey

In the final survey, we focused on the designs for tagging and peer review. We wanted questions that allowed students to respond to the assignment iterations, so we asked about the added tag definitions and the usefulness of tagging and peer review in the final project. Questions included responses to themes from mid-course feedback and iterations, and a Likert-style question about the usefulness and timing of seeing other students’ tagging. Results are summarized in Tables 4, 5, 6 and 7.

Table 4

Student response counts to number of tags students would prefer to have when tagging a chapter and using others’ tags to revise a chapter.


# OF TAGS STUDENTS WOULD PREFER TO HAVE IF THEY WERE…

                                                3–5   5–7   7–9
Tagging a chapter                                3     6     3
Revising a chapter with someone else’s tags      3     8     1

Table 5

Student perceptions of learning with tags, as compared to without.


READING WITH TAGS, I LEARNED:

A lot more than without   A little more than without   About the same
           5                          2                       5

Table 6

Student responses to the usefulness of new tags in learning.


OVERALL, THE ADDITION OF NEW TAGS:

Helped me learn   Made no difference   Made it harder to learn
       9                   2                      1

Table 7

Counts for Likert-style question about how helpful other students’ peer review and tagging contributions were when revising the final project.


LIKERT-STYLE COUNTS: HELPFULNESS OF OTHER STUDENTS’ ANNOTATION AND PEER REVIEW RESPONSES

         “EXTREMELY HELPFUL”                  “NOT HELPFUL AT ALL”
Count             6            4     1     0            1

Discussion

Overall, these assignment designs were successful. According to surveys and focus groups, students appeared to learn a similar if not greater amount than in the previous iteration of the course, and they were able to create improvements to key parts of the textbook. The designs required minimal additional effort to adopt. Student feedback was overwhelmingly positive, with 112 positive comments compared to 22 negative. Most negative comments related to aspects of the designs or their application that could be improved rather than to inherent ineffectiveness; for example, students critiqued the lack of opportunity to return to tagged sections and improve them, or expressed a preference to focus on the same topic across several successive peer reviews.

To our surprise, in focus groups and surveys, students gave us feedback not only on the assignment designs but also on the course structure, particularly the connections between assignment designs. They asked for more opportunities to dive deep into a topic by repeating work on the same section, or to contribute to a glossary overview of the subject. As such, our design guidance covers both the design of individual assignments and the bringing of those assignments together into a course. We take this as an indication (among others) of metacognition enabled by high structure renewable assignments.

Below we discuss the application of results to each research question:

1. Can we design and implement high structure active learning, open pedagogy assignments?

The answer to this research question is yes. In one academic quarter, we were able to design and implement four high structure, active, renewable, OER-enabled assignment designs. All four assignment types were easily adopted by teachers and students, presenting no major confusion or disruption in the class, even though several were novel.

2. What are the perceptions of teachers and learners of high structure open pedagogy assignment designs?

Teachers and learners were positive about the assignments we developed, and all assignment types prompted positive student comments. Most students said the assignment structures helped them achieve their goals in the class. Of the 12 students who completed our final survey, 10 (83.3%) said they planned to use similar tagging in other classes, even when not assigned. In one focus group, a student said of the working group assignment: “updating sections was one of the most useful activities. We had to understand what the paragraph was saying, applying it to whatever you were going to create, and the specific section.” Another student spoke about their experiences as a learner with dyslexia, finding tagging particularly useful to their reading process and focus. Other students without similar diagnoses also spoke highly of tagging’s ability to help them focus while reading, and several commented that in later non-tagged readings, they noticed themselves paying less attention.

The assignment designs had weaknesses, such as tagging’s tendency to over-focus learners on details, and the lack of structured roles in peer review and working groups. However, both learners and teachers suggested these weaknesses could be addressed by continued iteration of the designs (see Design Guidance for examples).

A few students reported neutral impact, and a minority reported having to spend more time to maximize their outcomes. None said they learned less due to these designs.

3. What design guidance emerges as we test and iterate those assignment designs in a real teaching context?

We were surprised that the student participants thought critically and creatively not only about the structure of individual assignments, but also about the context of those assignments within the course. For example, they often addressed potential drawbacks of one assignment by suggesting pairing it with another: learners suggested that tagging assignments, which they were concerned over-focused them on details (mentioned 4 times), be combined with a ‘glossary’ assignment or one that would outline the text (mentioned twice).

Other students suggested they be given the same segment of text for multiple assignments, so groups could own a section through multiple iterations. They understood and integrated the aims of renewable assignments and sought to maximize the meaning of those assignments for their own learning and for others’.

Overall, six themes emerged from our data collection, summarized as follows (Table 8).

Table 8

Student-Driven Guidance for Renewable Assignments Within an OER-Enabled Course.


ACROSS-ASSIGNMENT GUIDANCE

1. Create Opportunities to Dive Deeper

Create opportunities for an excellent contribution to a smaller part of the open resource, and to learn a part of the subject in depth. Have multiple assignments contribute to the same part of the text, or have several steps in an assignment iteratively improve an open resource.

Learner quote: “Overall, the most helpful thing for learning in the course was the working group assignments. (Another student nodding) One for each chapter, had to collaborate and discuss. With WG assignments, assigned a working group on each chapter. Know one thing in that chapter very well. (In-depth contributions) Could work well.”

2. Design for Depth and Breadth

Use a mix of assignments as opportunities to contribute to open knowledge about both specific topics and the scope and structure of the field, empowering students to learn about the field’s breadth. “Broad” co-constructions can include glossaries, chapter listings, summaries, and more.

Learner quotes: “I did enjoy these assignments and feel that the concept of an open text like this is wonderful for many reasons, but I do think it’s worth noting that I definitely learned more about the sections my group was assigned than other topics presented in class.” “Make a glossary hyperlinking words. Could we make a glossary when doing edits? Would be really helpful for an online textbook, to have that.”

WITHIN-ASSIGNMENT GUIDANCE

3. Integrate Student Skills

Students appreciate opportunities to use the skills they bring to the class to contribute to the open resources through role-based groups.

Learner quote: “(My) group had an English minor, had a lot of grammatical changes. Mostly biology students – writing isn’t the biggest skill. A lot of edits were grammatical. Could be used with role-based peer review.”

4. Compare only after Contributing

Learners and educational theory agree that opportunities for learners to compare their contributions with others come best after, not during or before, they contribute; for example, showing other students’ tags only after students have completed their own tagging. In a survey question, no students strongly agreed with wanting to see each other’s tags while tagging, and 16 of 22 disagreed or strongly disagreed; 9 strongly agreed with seeing tags after they were done tagging, and 14 strongly agreed that others’ tags are useful when they edit a section.

TECHNOLOGY GUIDANCE

5. Enable Layered Contributions

Most learning technologies focus on one-time viewings, contributions, and submissions. Learning technologies in general, and renewable learning technologies in particular, will benefit from enabling students to return, compare, revise, and recontribute.

Learner quote: “It can be helpful to have already thought through the good areas and potential problem areas of the reading before group work so I already have some ideas of what could be improved upon (or can look to the well-done areas for inspiration of how to make things better).”

6. Maximize Ease of Use

Despite the priority given to ease of use and design in technology construction, learners still feel systems fall short, and appreciate technologies that are very easy and straightforward to use.

Learner quote: “I found both platforms to be easy to use, and don’t have any major grievances with either…. google would occasionally lag behind, and we couldn’t see each other comments that were made on separate computers for a few minutes…. Pressbooks had a different issue where some people’s comments were hiding/reappearing…”

Structure Across Assignments

We were struck by how completely students understood and spoke to the dual goals of OER-enabled pedagogy in course organization. When prompted with questions about assignment designs, students suggested not only changes to the assignments but also changes to the structure of the course: adding or pairing assignments to produce open materials while helping them achieve their learning goals of deep and broad knowledge.

1. Create Opportunities to Dive Deeper

Most student guidance for renewable assignments (n = 19) related to students’ desire to create more complete contributions to parts of the text – and dive deeper into knowledge of specific topics. Students spoke of the opportunity to maximize both their own learning and their contribution to the open resource over multiple assignments.

Students provided examples of how assignments could be aligned to maximize the depth of student knowledge and contributions. They suggested assigning tagging of a section early, so they could pay particular attention to that section of the text, then following with peer review and completing work group suggestions in that same section.

Students also spoke to the importance of assignment scale. Should an assignment cover the breadth of the subject, or a few particular facts? Should it be one paragraph, or five pages? Learners generally said that about two paragraphs of textbook, at least within endocrinology, is a good ‘chunk’ to try to improve over the course of a week. A larger chunk could be handled in a quarter-long project, or deeper improvements (such as checking all facts or creating illustrations) could be completed across a quarter for a smaller segment, in combination with other learning across the quarter.

Several students suggested that the final project be structured across the whole course to build up to a greater contribution in one area. They suggested a sequence of tagging, then peer review, then work group, and then final project to maximize the depth of contributions to a section of text. The teaching team confirmed that this structure would work well from their perspective, and along with students suggested assignment types to combine this depth with breadth. We cover those in the next section.

2. Design for Depth and Breadth

While students asked for opportunities to dig deeper, develop expertise, and polish topics, they also remained aware of their goal to learn the overarching principles and concepts of endocrinology. Eight times, particularly in focus groups, they spoke of a desire for balance: they wanted to contribute to the text in depth but understand it in breadth.

Both renewable and disposable assignments create focus, which is both a positive and a negative. As one learner said, “I did enjoy these assignments and feel that the concept of an open text like this is wonderful for many reasons, but I do think it’s worth noting that I definitely learned more about the sections my group was assigned than other topics presented in class.” Students expressed this concern with the tagging assignment: it over-focused some students on reading for problems in the text, rather than for the broader concepts.

Suggestions to resolve this issue focused on assignment designs that help students learn the breadth of the subject while creating open resources that cover or organize often-used concepts. For example, by contributing to a glossary or chapter summaries, or by suggesting organizational schemas for the text, learners both contribute and learn context. Learners and teachers suggested that, for maximum effect, a course go back and forth between ‘breadth’ and ‘depth.’

Structure Within Assignments

3. Integrate Student Skills

Students bring existing skills to their work, especially in group contexts. In focus groups, students expressed a desire to integrate existing skills into group work. Suggestions for cultivating student skill development included assigned roles, such as editor or fact-checker. Students showed awareness that, both in learning and producing open resources, prior skills and talents help.

Learners suggested we intentionally pair groups to review each other’s work: if one group did not have a particularly strong editor, for example, their work might be reviewed by a group with particularly strong editing skills. Meanwhile, teachers emphasized the content-learning needs of students: while role- and skill-oriented designs can work, care must be taken to balance content learning with students using their skills to improve open texts.

Building on this, teachers suggested opportunities for collaboration across departments, using renewable assignments to connect classes. For example, a course on editing could copyedit STEM courses, or students learning graphic design or science illustration could contribute illustrations.

4. Compare only after Contributing

Students were clear that they only wanted to see other students’ work after they had read the material and, ideally, contributed themselves. Seven students mentioned this theme in our focus groups and surveys. Students also answered a mid-quarter survey question asking whether they’d like to see other students’ tags, and they generally (59.1%, N = 13) said they would, but only after completing tagging themselves. In contrast, responding to a Likert-style question on a scale of 1–6, students were generally negative about seeing others’ tags while they were tagging (n = 22, µ = 2.82, st. dev. = 1.10).

This design guidance is in alignment with other high structure active learning findings, like those of peer instruction (Crouch & Mazur, 2001), and with experiments in physics education that find better retention of correct information when students have first committed to an incorrect answer (Muller et al., 2007). This design principle is interesting given the tendency of many learning technologies to focus either on the presentation of the teacher’s answers or on student contributions and responses, but not on a contribution-comparison conversation, in which questions are answered and re-answered and answers are compared. Such a conversation is much more in line with practices of scientific consensus building, and with learning research.

Technology Feedback

5. Enable layered contributions

Within the minority of negative feedback we received, a considerable number of students (14) mentioned wanting to add comments as well as tags. They saw the value of tags for further open text production, but often wanted to comment on and explain their tags in more depth.

Given how rarely students ask to create additional explanations, and the potential of tags to represent group thoughts and contributions, we believe this indicates a need for designs that support collaborative writing in layers of work: tags, peer review, and document-level feedback. Future researchers and designers may benefit from efforts to allow multiple layers of contribution, such as tags alongside comments, or the ability to see several peer reviews of a segment of open text once a learner has reviewed it. In the final survey, students indicated a desire for a wealth of resources as they update or edit text.

6. Maximize Ease of Use

In some ways, this is an obvious principle of design, but given the issues our students encountered, it bears repeating. Twelve students across our focus groups and surveys mentioned the difficulty of highlighting and annotating text, particularly in Google Docs: text often wouldn’t select naturally, and highlighting an additional word would cause the interface to select the whole following sentence as well. User experience enables, or disables, all collaborative, computer-based learning.

Conclusion & Future Directions

In this research we demonstrated effective, easy-to-implement, high structure renewable assignments that were appreciated by students and teachers. We analyzed data from surveys and focus groups to extract design principles for improving these designs, the courses that implement them, and future OER-enabled pedagogy efforts. We found evidence throughout that students thought deeply about the structure of their learning in renewable assignments. We conclude with one encouraging finding: high structure OER-enabled pedagogy, engaging undergraduate students in creating materials that save themselves and other students money while giving them structured practice in being knowledge creators, is possible, practical, and generative.

The extent to which student feedback on this pedagogy suggested further, very practical improvements has likewise been encouraging and enlightening. In future work, we hope to make good on these suggestions by structuring renewable assignments for deeper dives, combining breadth and depth across the course, and integrating student skills. We are interested in continued work on structure not only within assignments but across a course, or courses, to guide students in maximizing learning and contributions to open texts.

We look forward to future research in this subject area. We hope that we, or others, will have opportunities to conduct causal experiments, especially relating to student outcomes in OER-enabled pedagogy courses. These will help resolve open questions in the field about the effectiveness of these and similar assignment designs in helping students to achieve their educational goals.

In our findings, teachers and learners indicated that learning equivalent to non-renewable assignments is possible while students contribute to free and open knowledge. Exploring these learning designs, we found key principles for making these and other assignments more effective, maximizing both student learning and contributions. The principles derived from this study may be applicable well beyond the bounds of renewable assignments; ethically, it makes sense to continue pursuing the double benefit of student learning and contribution to open knowledge.

Additional Files

The additional files for this article can be found as follows:

Appendix A

Mid-course Survey. DOI: https://doi.org/10.55982/openpraxis.14.1.146.s1

Appendix B

End-of-course Survey. DOI: https://doi.org/10.55982/openpraxis.14.1.146.s2

Competing Interests

The authors have no competing interests to declare.

References

  1. Bennett, W. J., & Wilezol, D. (2013). Is College Worth It?: A Former United States Secretary of Education and a Liberal Arts Graduate Expose the Broken Promise of Higher Education. Thomas Nelson Inc. 

  2. Biswas-Diener, R. (2017). You Can’t Sell Free, and Other OER Problems. In R. Jhangiani & R. Biswas-Diener (Eds.), Open: The Philosophy and Practices that are Revolutionizing Education and Science. Ubiquity Press. DOI: https://doi.org/10.5334/bbc.u 

  3. College Board. (2019, June 7). Trends in College Pricing 2020 – Research – College Board. https://research.collegeboard.org/trends/college-pricing 

  4. Colvard, N. B., Watson, C. E., & Park, H. (2018). The Impact of Open Educational Resources on Various Student Success Metrics. International Journal of Teaching and Learning in Higher Education, 30(2), 262–276. 

  5. Crouch, C. H., & Mazur, E. (2001). Peer Instruction: Ten years of experience and results. American Journal of Physics, 69(9), 970. DOI: https://doi.org/10.1119/1.1374249 

  6. Dastur, F. (2017). How to Open an Academic Department. In R. Jhangiani & R. Biswas-Diener (Eds.), Open: The Philosophy and Practices that are Revolutionizing Education and Science. Ubiquity Press. https://www.ubiquitypress.com/site/chapters/e/10.5334/bbc.m/. DOI: https://doi.org/10.5334/bbc.m 

  7. Doo, M. Y., Bonk, C., & Heo, H. (2020). A Meta-Analysis of Scaffolding Effects in Online Learning in Higher Education. International Review of Research in Open and Distributed Learning, 21(3), 60–80. DOI: https://doi.org/10.19173/irrodl.v21i3.4638 

  8. Double, K. S., McGrane, J. A., & Hopfenbeck, T. N. (2020). The Impact of Peer Assessment on Academic Performance: A Meta-analysis of Control Group Studies. Educational Psychology Review, 32(2), 481–509. DOI: https://doi.org/10.1007/s10648-019-09510-3 

  9. Eickholt, J., Gandy, L., Seeling, P., & Johnson, M. (2019). Advancing Adoption of Active Learning Pedagogy via New Avenues of Research and Training. 2019 IEEE Frontiers in Education Conference (FIE), 1–5. DOI: https://doi.org/10.1109/FIE43999.2019.9028603 

  10. Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. DOI: https://doi.org/10.1073/pnas.1319030111 

  11. Ghadirian, H., Salehi, K., & Ayub, A. F. M. (2018). Social annotation tools in higher education: A preliminary systematic review. International Journal of Learning Technology, 13(2), 130. DOI: https://doi.org/10.1504/IJLT.2018.092096 

  12. Hilton, J. (2016). Open educational resources and college textbook choices: A review of research on efficacy and perceptions. Educational Technology Research and Development, 64(4), 573–590. DOI: https://doi.org/10.1007/s11423-016-9434-9 

  13. Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why Minimal Guidance During Instruction Does Not Work: An Analysis of the Failure of Constructivist, Discovery, Problem-Based, Experiential, and Inquiry-Based Teaching. Educational Psychologist, 41(2), 75–86. DOI: https://doi.org/10.1207/s15326985ep4102_1 

  14. Krouska, A., Troussas, C., & Virvou, M. (2018). Social Annotation Tools in Digital Learning: A Literature Review. 2018 9th International Conference on Information, Intelligence, Systems and Applications (IISA), 1–4. DOI: https://doi.org/10.1109/IISA.2018.8633609 

  15. Macgregor, G., & McCulloch, E. (2006). Collaborative tagging as a knowledge organization and resource discovery tool. Library Review, 55(5), 291–300. DOI: https://doi.org/10.1108/00242530610667558 

  16. Mayer, R. E. (2017). Using multimedia for e-learning. Journal of Computer Assisted Learning, 33(5), 403–423. DOI: https://doi.org/10.1111/jcal.12197 

  17. Miller, C. J., & Metz, M. J. (2014). A comparison of professional-level faculty and student perceptions of active learning: Its current use, effectiveness, and barriers. Advances in Physiology Education, 38(3), 246–252. DOI: https://doi.org/10.1152/advan.00014.2014 

  18. Muller, D. A., Bewes, J., Sharma, M. D., & Reimann, P. (2007). Saying the wrong thing: Improving learning with multimedia by including misconceptions. Journal of Computer Assisted Learning, 24(2), 144–155. DOI: https://doi.org/10.1111/j.1365-2729.2007.00248.x 

  19. Ninio, A., & Bruner, J. (1978). The achievement and antecedents of labelling*. Journal of Child Language, 5(1), 1–15. DOI: https://doi.org/10.1017/S0305000900001896 

  20. Paskevicius, M., & Irvine, V. (2019). Open Education and Learning Design: Open Pedagogy in Praxis. Journal of Interactive Media in Education, 2019(1), 10. DOI: https://doi.org/10.5334/jime.512 

  21. Sandoval, W. A., & Bell, P. (2004). Design-Based Research Methods for Studying Learning in Context: Introduction. Educational Psychologist, 39(4), 199–201. DOI: https://doi.org/10.1207/s15326985ep3904_1 

  22. Seaman, J. E., & Seaman, J. (2017). Opening the Textbook: Educational Resources in U.S. Higher Education, 2017. In Babson Survey Research Group. Babson Survey Research Group. https://eric.ed.gov/?id=ED582411 

  23. Weissmann, J. (2013, January 3). Why Are College Textbooks So Absurdly Expensive? The Atlantic. https://www.theatlantic.com/business/archive/2013/01/why-are-college-textbooks-so-absurdly-expensive/266801/. DOI: https://doi.org/10.5334/bbc.e 

  24. Weller, M., de los Arcos, B., Farrow, R., Pitt, R., & McAndrew, P. (2017). What Can OER Do for Me? Evaluating the Claims for OER. In R. Jhangiani & R. Biswas-Diener (Eds.), Open: The Philosophy and Practices that are Revolutionizing Education and Science. Ubiquity Press. http://oro.open.ac.uk/49041/ 

  25. Wiley, D., & Hilton, J. L., III. (2018). Defining OER-Enabled Pedagogy. International Review of Research in Open and Distributed Learning, 19(4). DOI: https://doi.org/10.19173/irrodl.v19i4.3601 
