Open Praxis, vol. 9 issue 1, January–March 2017, pp. 93–108 (ISSN 2304-070X)

Measures of student success with textbook transformations: the Affordable Learning Georgia Initiative

Emily Croteau

University of Kentucky (USA)

emily.croteau@uky.edu

Abstract

In 2014, the state of Georgia’s budget supported a University System of Georgia (USG) initiative: Affordable Learning Georgia (ALG). The initiative was implemented via Textbook Transformation Grants, which provided grants to USG faculty, libraries and librarians, and institutions to “transform their use of textbooks and other learning materials into using lower cost options”, in other words to use open educational resources (OER) in lieu of a traditional bound textbook. The Round One Textbook Transformation Grants have already been shown to be successful in that they saved students approximately $760,000. What is not known is the collective impact on student learning. This study examines the learning gains or losses pre- and post-transformation in ALG Round One courses where traditional resources were replaced with OER. It estimates differences between pre- and post-textbook transformation across the following outcomes: 1) Drop Fail Withdraw (DFW) rates, 2) rates of completion, 3) numbers of students receiving a final grade of A or B, C and D, 4) numerical final grades as a percent, 5) final exam grades as a percent, and 6) course-specific assessment grades measured in percent. Twenty-four data sets were analyzed for DFW rate, eight data sets for completion rate, fourteen data sets for grade distribution, three data sets for final exam grades, three data sets for course-specific assessment and one data set for final grades. The null hypothesis that there would be no differences between pre- and post-transformation rates in these learning outcomes was supported. Thus, this study demonstrates that the USG’s ALG initiative helped students save money without negatively impacting learning outcomes. In addition, it is the first of its kind to measure some of these learning outcomes (e.g. final exam grade, assessment grade, and distribution of letter grades) at this scale.

Keywords: Open Educational Resources; Affordable Learning Georgia; textbook transformation; learning outcomes; higher education

Reception date: 16 December 2016 • Acceptance date: 14 March 2017

Introduction

Successful teaching involves many components such as the knowledge and capabilities of both teachers and students, as well as curriculum materials and other available resources (Charalambous & Hill, 2012). One of the key pieces of curriculum materials in post-secondary education is the textbook (Altbach, Kelly, Petrie & Weis, 1991). Textbooks synthesize information on a particular subject, making them an invaluable reference for any curriculum. Although textbooks are valuable learning resources, their costs have risen dramatically, in some instances making them cost-prohibitive for many students. Hilton, Robinson, Wiley and Ackerman (2014) found that the average textbook price across seven colleges and multiple general education classes was $90.00. Furthermore, during the 2015–2016 academic year, textbook and supplies costs for a college student ranged from $1,249 to $1,364 (College Board, 2016). In addition, research done by the National Association of College Stores (NACS) shows that average “new” textbook prices have increased steadily since the 2009–2010 academic year, from $62 to $82 (NACS, 2016).

The perceived high cost of textbooks combined with other costs of higher education may negatively impact students from lower socioeconomic backgrounds (Paulsen & St. John, 2002). For example, those with lower incomes are more prone to delay college enrollment than their wealthier peers (Provasnik & Planty, 2008). High costs, which include textbook costs, can also result in students taking fewer classes, delaying graduation (Buczynski, 2007). Moreover, many students do not purchase textbooks, which weakens their learning opportunities. One survey suggested that 23% of students regularly forego purchasing required textbooks due to their high cost (Florida Virtual Campus, 2012). One method to circumvent the high cost of textbooks is to replace commercial textbooks with Open Educational Resources (OER).

The following definition of OER was offered by Saul Fisher from the Andrew W. Mellon Foundation in 2002 at the Forum on the Impact of Open Courseware for Higher Education in Developing Countries convened by UNESCO: “The open provision of educational resources, enabled by information and communication technologies, for consultation, use and adaptation by a community of users for non-commercial purposes” (UNESCO, 2002, p. 24). According to the William and Flora Hewlett Foundation, OER can include full courses, course materials, modules, textbooks, streaming videos, tests, software, and any other tools, materials, or techniques used to support access to knowledge (Hewlett, 2013). OER eschew traditional copyright in favor of licenses that allow others to retain, reuse, revise, remix, and redistribute the materials (Hilton, Wiley, Stein & Johnson, 2010; Wiley, Bliss & McEwen, 2014).

By far the simplest way to implement OER in a college course is to replace the traditional textbook with an “open” textbook. There are scores of high-quality open textbooks available for students and faculty to freely use (Open Textbook Library, 2016; OpenStax, 2016), many of which go through professional peer review and publishing processes. Furthermore, many open textbooks are available in print in addition to being online. Many students prefer to purchase a printed copy regardless of whether the online version is free (Hilton & Wiley, 2011). Printed versions of open textbooks cost substantially less than traditional textbooks.

The biggest concerns that faculty have about adopting OER are (1) whether the open resource is of similar quality to the traditional resource and (2) how students will perform using open resources (Allen & Seaman, 2014). To address the first concern, Bliss, Hilton, Wiley and Thanos (2013) surveyed the experiences of fifty-eight teachers and 490 students across eight colleges in their utilization of open texts. Bliss et al. (2013) found that approximately 50% of students said that the OER textbooks were of the same quality as traditional textbooks and nearly 40% said that they were better. Additionally, 55% of teachers adopting OER reported that the open materials were of the same quality as the materials they had previously used, and 35% felt that they were better. A recent study by Allen and Seaman (2014) found that of 2,144 surveyed college professors, 34% were aware of OER; of that 34%, 61.5% indicated OER had about the same “trusted quality” as traditional resources, 26.3% said that traditional resources were superior, and 12.1% said that OER were superior. Similarly, 68.2% said that the “proven efficacy” was about the same, 16.5% said that OER had superior efficacy and 15.3% said that traditional resources had superior efficacy. Hilton (2016) examined an additional eight studies of perceptions of OER in higher education and found similar results, namely that a strong majority of teachers who had adopted OER felt that they were as good or better than commercial resources. Based on these studies, OER appear on average to be of similar quality to traditional texts.

To address how students will perform utilizing open resources, several studies have examined how using OER influences student performance measures. Lovett, Meyer and Thille (2008) measured the efficacy of an OER statistics module in comparison with the traditional educational model at Carnegie Mellon University during the fall 2005 and spring 2006 semesters. Their results showed no significant difference between test scores (three midterms and one final exam) of students utilizing OER and students in the traditional class. Bowen, Chingos, Lack and Nygren (2014) also compared the use of a traditional textbook in a face-to-face lecture class with a blended approach utilizing Carnegie Mellon’s open statistics module. Bowen and colleagues found that, while students who utilized OER scored slightly higher than their peers on standardized exams, the difference was not statistically significant. Allen, Guzman-Alvarez, Molinaro and Larsen (2015) studied 478 students who used ChemWiki, an OER, as their primary textbook, and 448 students who utilized a commercial textbook. Pre-tests combined with final exams showed no significant differences in individual learning gains between the two groups. These studies show that utilizing OER results in cost savings without sacrificing student learning outcomes.

A few studies have shown that student learning increased in OER classes in comparison to courses that used traditional resources. First, Pawlyshyn, Braddlee, Casper and Miller (2013) found that when OER material was integrated into the math courses at Mercy College, student learning significantly increased. The pass rates of math courses increased from 63.6% in fall 2011 (when traditional learning materials were employed) to 68.9% in fall 2012, when all courses were taught with OER. Similarly, students who were enrolled in OER versions of a reading course performed better than their peers who enrolled in the same course using non-OER materials. Second, Hilton and Laman (2012) compared the performance of 690 students using an open textbook in an introductory psychology class to the performance of 370 students who used a traditional textbook in a previous semester. They concluded that students who used the open textbook achieved better grades, had a lower withdrawal rate, and scored better on the final examination. Lastly, Feldstein et al. (2012) found that students in courses using open textbooks typically had higher grades and lower failure and withdrawal rates than those in courses with traditional textbooks. However, it is important to note that the authors pointed out significant limitations in the latter two studies and stressed that these results were not generalizable. Given this, there is not enough evidence to say that OER will unequivocally increase student learning gains.

In contrast, in one instance, OER were found to be associated with lower outcomes. Robinson (2015) examined OER adoption at seven different institutions of higher education. In the 2012–2013 academic year, 3,254 students across the seven institutions enrolled in experimental versions of eight different courses that utilized OER and 10,819 enrolled in equivalent versions of the courses that utilized traditional textbooks. Robinson (2015) found that there were no statistically significant differences between the two groups in terms of final grades or completion rates in five of the eight classes. However, students in two courses performed significantly worse, receiving one-half to a full grade lower than their peers. Students in one class were significantly more likely to complete the course, although there were no statistically significant differences between groups in the overall course grades. Across all classes there was a small but statistically significant difference between the two groups in terms of the number of credits they took, with students enrolled in OER versions of the course taking on average 0.25 credits more than their counterparts in the control group. This study demonstrates the confounding factors that need to be taken into account when specific measures of performance are analyzed.

Hilton (2016) synthesized the above studies, as well as some additional ones, and found that when students use OER in their classes, student outcomes are the same or better than when a traditional textbook is used. While these results are collectively interesting, they are far from comprehensive. Given the paucity of studies that have measured student performance using OER, much more research needs to be done to determine what relationship (if any) exists between the use of OER and student performance in higher education. In addition, performance measures like distribution of letter grades or performance on course-specific assessments are an important part of general course assessment. These types of data are specifically lacking in the OER literature and should be included in evaluations of OER efficacy. The purpose of the present study is to add to the body of literature by examining the effectiveness of several OER adoptions that occurred in connection with Affordable Learning Georgia. As described in further detail below, I examined the results of 4950 students across 36 classes in 18 universities. My specific research question was as follows: is using OER associated with a change in student learning outcomes?

Context of the Present Study

In 2014, the state of Georgia decided to include funding in the state budget to support a University System of Georgia (USG) initiative: Affordable Learning Georgia (ALG). ALG’s focus was on reducing the cost of textbooks and enhancing GALILEO (GeorgiA LIbrary LEarning Online), Georgia’s virtual library. The initiative was implemented via Textbook Transformation Grants, which provided grants to USG faculty, libraries and librarians, and institutions to “transform their use of textbooks and other learning materials into using lower cost options” (Affordable Learning Georgia, 2016). ALG’s Textbook Transformation Grants program has a three-fold objective: 1) pilot different approaches in USG courses for textbook transformation, including adoption, adaptation, and creation of Open Educational Resources (OER) and/or identification and adoption of materials already available in GALILEO and USG libraries; 2) provide support to faculty, libraries, and their institutions to implement these approaches; and 3) lower the cost of college for students and contribute to their retention, progression, and graduation.

Two levels of funding are available for award. The Standard-Scale Transformation covers one or more courses with fewer than 500 students enrolled on average per academic year and funds a maximum of $10,800. The Large-Scale Transformation involves one or more courses/sections or department-wide adoptions with 500 or more students enrolled on average per academic year and funds a maximum of $30,000. Proposals can be submitted to one of four categories: No-or-Low-Cost-to-Students Learning Materials; OpenStax Textbooks; Interactive Course-Authoring Tools and Software (which replaced the Course Pack Pilots category available in Rounds 1 and 2); and the Top 100 Undergraduate Courses. Proposals submitted for funding through the Textbook Transformation Grants must follow certain guidelines, and certain activities are required to receive full funding (see https://affordablelearninggeorgia.org for more information on proposal submission and submission categories). To date, there have been eight calls for proposals for textbook transformation, with the latest call addressing courses for Fall 2017.

The Round One Textbook Transformation Grants have already been shown to be successful in that they saved students approximately $760,000 (Affordable Learning Georgia, 2015). What is not known, however, is the collective impact on student learning. Yes, students saved $760,000, but did they obtain positive learning outcomes? In the present study I examine the learning gains or losses pre- and post-transformation for ALG Round One grantees.

Methods

ALG’s Round One call for proposals for Textbook Transformation Grants yielded the funding of 29 proposals encompassing 36 courses set to take place during the Spring 2015 academic semester. The types of data reported across projects varied: some projects reported qualitative data, some reported quantitative data, and some reported both. Quantitative data reporting consisted of 1) Drop Fail Withdraw (DFW) rates, 2) rates of completion, 3) numbers of students receiving a final grade of A or B, C and D, 4) numerical final grades as a percent, 5) final exam grades as a percent, and 6) course-specific assessment grades measured in percent. These measures were not consistently reported across all groups. Some groups provided pre-transformation values for these measures. Not all groups reported qualitative data, but those that did collected those data via surveys, focus groups or student quotes. Survey questions were not consistent across groups.

In terms of quantitative data, one project did not report any quantitative data and seven projects did not make comparisons to pre-transformation data, leaving 21 projects and 27 courses viable for pre/post-transformation data analysis (Table 1). In each of the 21 projects, a faculty member (or members) created or utilized pre-existing OER to substitute for the traditional resources they had used in previous semesters. For example, faculty members replaced the traditional bound textbook with a complete open textbook (online textbook or ebook), with individual subject-specific open-source documents, or with subject-specific websites. The goal of the transformation grants was to replace the costly resources with free versions, not to transform content or learning activities. Each faculty member taught both the pre- and post-transformation course and supplied the data for comparison. The information gathered provided paired data sets for analysis without instructor bias within each set.

In terms of qualitative data reported, all projects reported three quotes from their students for their respective projects. Twenty of the 29 projects provided results from surveys administered for their project. Survey questions varied, but those of interest for the present study included whether students thought the quality of the text was comparable to a traditional textbook and whether they thought their learning experiences were enhanced.

Table 1: Courses, Universities/Colleges and Number of Students affected in ALG’s Round 1 Textbook Transformation Grants

Course Title | University/College | Number of Students Enrolled
Calculus I, Calculus II, Calculus III (MATH 1161, MATH 2072, MATH 2083) | Armstrong State University | 300
Principles of Biology (BIO 1215K) | Columbus State University | 188
Anatomy and Physiology I & II (BIO 2212, BIO 2213) | Dalton State College | 71
General Psychology (PSYC 1101) | East Georgia State College | 204
Legal Environment of Business (LENB 3135) | Georgia College and State University | 124
College Algebra (MATH 1111) | Georgia College and State University | 159
Human Factors in Design (ID 2320) | Georgia Institute of Technology | 68
Introduction to Computing (CSCI 1100) | Georgia Perimeter College | 925
Introduction to Psychology (PSYC 1101) | Georgia Southwestern State University | 34
Issues in African and African Diaspora Studies (AADS 1102) | Kennesaw State University | 37
Principles of Chemistry I (CHEM 1211) | Kennesaw State University | 70
Introduction to Web Development (IT5302) | Kennesaw State University | 62
Calculus II (MATH 2254) | Kennesaw State University | 70
Nursing Research for Evidence Based Practice (NURS 4402) | Kennesaw State University | 56
American Government (POLS 1101) | Middle Georgia State College | 210
Introduction to Biology II (BIO 1020K) | South Georgia State College | 34
Evolution and Biodiversity, Organismal Biology (BIOL 1010, BIOL 1030) | Valdosta State University | 959
Mathematics and Technology in Early Childhood Education (ECED 3300) | Valdosta State University | 43
Principles of Logic and Argumentation (PHIL 2020) | Valdosta State University | 39
Exploring Socio-Cultural Perspectives on Diversity (EDUC 2120) | University of Georgia | 99
Introduction to Algebra, Intermediate Algebra, College Algebra (MATH 0097, MATH 0099, MATH 1111) | University of North Georgia | 95
Total: 27 Courses | 14 Institutions | 3847 Students

Data Analysis

I estimated differences between pre- and post-textbook transformation across the following outcomes: 1) DFW rates, 2) rates of completion, 3) numbers of students receiving a final grade of A or B, C, and D, 4) numerical final grades as a percent, 5) final exam grades as a percent, and 6) course-specific assessment grades measured in percent. Since the data accumulated in this study come from different populations of students, it is necessary to check whether each data set conforms to a normal distribution in order to direct subsequent statistical analyses. Additionally, it is possible that variance across projects is not homogeneous (Glass, 1966), so equality of variances should be assured prior to performing statistical tests. A Levene test (Levene, 1960; Brown & Forsythe, 1974) was used to check for equality of variances. A Shapiro-Wilk test was used to check for data normality (Shapiro & Wilk, 1965); this test was chosen over other tests of normality because it works best with smaller sample sizes and has been shown to be more powerful than similar statistical tests (Razali & Wah, 2011). If the Shapiro-Wilk test shows that the data are normally distributed, parametric statistics are performed; otherwise, nonparametric statistics are performed. Since the data are paired (i.e. pre- and post-transformation), a paired t-test (data normally distributed) or a Wilcoxon signed-rank test (data not normally distributed; Wilcoxon, Katti & Wilcox, 1970) was used to test the null hypothesis that there is no difference between pre- and post-transformation student learning outcomes. Lastly, the Bonferroni correction was applied to the paired tests to adjust for multiple comparisons and control for Type I errors (presumably dividing α = 0.05 across the six outcome measures, which yields the adjusted threshold of approximately 0.008 used in the Results).
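
The decision procedure described above can be expressed compactly in code. The following is a minimal sketch in Python with SciPy, an assumption made purely for illustration (the paper does not state which statistical software was used); it applies the same logic to one paired data set, with the Bonferroni-adjusted threshold taken to be 0.05 divided across the six outcome measures.

    # Minimal sketch of the test-selection logic described above.
    # Assumes Python with SciPy; the paper does not specify the software used.
    from scipy import stats

    ALPHA_SCREEN = 0.05      # threshold for the Shapiro-Wilk and Levene tests
    ALPHA_PAIRED = 0.05 / 6  # ~0.008: Bonferroni correction across six outcomes

    def compare_pre_post(pre, post):
        """Test one outcome measure for a pre-/post-transformation difference."""
        # Shapiro-Wilk test for normality, run here on the paired differences.
        diffs = [b - a for a, b in zip(pre, post)]
        _, p_normal = stats.shapiro(diffs)

        # Levene test for equality of variances between the two groups.
        _, p_levene = stats.levene(pre, post)

        # Paired t-test if the data look normal, Wilcoxon signed-rank otherwise.
        if p_normal >= ALPHA_SCREEN:
            test_name = "paired t-test"
            _, p_value = stats.ttest_rel(pre, post)
        else:
            test_name = "Wilcoxon signed-rank test"
            _, p_value = stats.wilcoxon(pre, post)

        return test_name, p_value, p_levene, p_value < ALPHA_PAIRED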

Results

I measured results based on the data provided by the individual reports. Sixteen projects reported information on DFW rates, seven projects reported completion rate data, seven projects reported grade distribution data, three projects reported final exam grades, three projects reported course-specific assessment data and one project reported final grade data. In sum, there are twenty-four data sets for DFW rate, eight data sets for completion rate, fourteen data sets for grade distribution, three data sets for final exam grades, three data sets for course-specific assessment and one data set for final grades. The Shapiro-Wilk test indicated that most of the data were normally distributed (α = 0.05), although the paired data for completion rate and distribution of D grades were not at that same alpha level (Table 10). That being said, the Levene’s test indicated that there was equality of variance across all the data (α = 0.05; Table 10). As a result, a paired t-test was performed for all analyses except for completion rate and distribution of D grades, for which a Wilcoxon signed-rank test was performed. All analyses, parametric and non-parametric alike, were not significant after Bonferroni correction (α = 0.008; Table 10). Hence, the null hypothesis that there was no difference pre- and post-transformation was supported.

Pre- and post-transformation data sets compiled for DFW rate resulted in the analysis of 24 courses/sections of courses (Table 2). Twenty-four data sets were included, affecting 3133 students. DFW rate was provided for paired courses/sections of courses. Inspection of the data showed some individual variation from course to course: some courses showed changes in DFW rate in favor of pre-transformation (N=11) and others showed changes in favor of post-transformation (N=12); in one case, there was no change (N=1). A Shapiro-Wilk test indicated that the data were normally distributed and a Levene’s test indicated that there was equality of variance across the data (α = 0.05; Table 10). A paired t-test showed that the results were not statistically significant (α = 0.008; Table 10). Hence, the null hypothesis that there was no difference pre- and post-transformation was supported.

Table 2: Drop Fail Withdraw (DFW) Rate Pre- and Post- OER Transformation for 10 Georgia Colleges and Universities

College/University | Number of Students | DFW Rate Pre-Transformation (percent) per section | DFW Rate Post-Transformation (percent) per section | Favors Pre- or Post-OER?
Columbus State University | 188 | 6.59 | 14.89 | Pre
Dalton State College | 71 | 35, 42 | 8, 37 | Post, Post
Georgia College and State University | 159 | 17.4 | 21.7 | Pre
Georgia College and State University | 124 | 1 | 9 | Pre
Georgia Perimeter College | 925 | 4 | 2 | Post
Georgia Southwestern State University | 34 | 9 | 8 | Post
Kennesaw State University | 56 | 0 | 0 | Neither
Kennesaw State University | 37 | 17 | 36 | Pre
Kennesaw State University | 70 | 25.5, 49 | 34.3, 45.7 | Pre, Post
Kennesaw State University | 62 | 14.2 | 11.1 | Post
Kennesaw State University | 70 | 41.67 | 55.7 | Pre
Middle Georgia State College | 210 | 38, 32, 9, 18, 18, 17 | 27, 19, 18, 24, 13, 16 | Post, Post, Pre, Pre, Post, Post
South Georgia State College | 34 | 4 | 9 | Pre
University of North Georgia | 95 | 31.2, 32.3 | 48.3, 66.2 | Pre, Pre
Valdosta State University | 39 | 28.3 | 11.2 | Post
Valdosta State University | 959 | 32.97 | 29.25 | Post
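
As a usage example, the per-section DFW values in Table 2 can be passed directly to the compare_pre_post sketch from the Data Analysis section. The pairing below follows the table's row order; the report does not state the exact ordering used in the original analysis.

    # Per-section DFW rates (percent) transcribed from Table 2, in row order.
    pre_dfw = [6.59, 35, 42, 17.4, 1, 4, 9, 0, 17, 25.5, 49, 14.2,
               41.67, 38, 32, 9, 18, 18, 17, 4, 31.2, 32.3, 28.3, 32.97]
    post_dfw = [14.89, 8, 37, 21.7, 9, 2, 8, 0, 36, 34.3, 45.7, 11.1,
                55.7, 27, 19, 18, 24, 13, 16, 9, 48.3, 66.2, 11.2, 29.25]

    name, p_value, p_levene, significant = compare_pre_post(pre_dfw, post_dfw)
    # The paper reports a paired t-test with p = 0.51: not significant at 0.008.
    print(name, round(p_value, 3), significant)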

Pre- and post-transformation data sets compiled for completion rate resulted in the analysis of eight courses/sections of courses (Table 3). Eight data sets were included, affecting 329 students. Completion rate was provided for paired courses/sections of courses. These data also showed some individual variation from course to course: two courses showed changes in completion rate in favor of pre-transformation and four courses showed changes in favor of post-transformation; in two cases, there was no change. A Shapiro-Wilk test indicated that the data were not normally distributed (α = 0.05; Table 10) but a Levene’s test indicated that there was equality of variance across the data (α = 0.05; Table 10), so a Wilcoxon signed-rank test was performed and was found to be not significant (α = 0.008; Table 10). Hence, the null hypothesis that there was no difference pre- and post-transformation was supported.

Table 3: Completion Rate Pre- and Post- OER Transformation for four Georgia Colleges and Universities

College/University | Number of Students | Completion Rate Pre-Transformation (percent) per section | Completion Rate Post-Transformation (percent) per section | Support of Pre- or Post-?
Georgia Southwestern State University | 34 | 94 | 97 | Post
Kennesaw State University | 56 | 100 | 100 | Neither
Kennesaw State University | 62 | 85.8 | 88.9 | Post
University of Georgia | 99 | 88.03 | 98 | Post
University of North Georgia | 95 | 68.8, 67.7 | 51.7, 41.2 | Pre, Pre
Valdosta State University | 39 | 71.7 | 88.8 | Post
Valdosta State University | 43 | 100 | 100 | Neither

Pre- and post-transformation data sets compiled for grade distribution resulted in the analysis of 14 courses/sections for A/B grades (Table 4, affecting 828 students), 12 courses/sections for C grades (Table 5, affecting 733 students), and eight courses/sections for D grades (Table 6, affecting 403 students). Grade distribution data were provided for paired courses/sections of courses in each table. Variation from course to course was evident. For A/B grades, six courses showed changes in favor of pre-transformation, seven courses showed changes in favor of post-transformation and there was no change in two courses. For C grades, seven courses showed changes in favor of pre-transformation, four courses showed changes in favor of post-transformation and there was no change in one course. For D grades, three courses showed changes in favor of pre-transformation, three courses showed changes in favor of post-transformation and there was no change in two courses. Separate Shapiro-Wilk tests for A/Bs, Cs and Ds indicated that the paired data for A/Bs and Cs were normally distributed but that the paired data for Ds were not (α = 0.05; Table 10). However, separate Levene’s tests for numbers of A/Bs, Cs and Ds all indicated that there was equality of variance across the data (α = 0.05; Table 10). As a result, paired t-tests were performed for A/Bs and Cs and a Wilcoxon signed-rank test was performed for Ds. All tests were found to be not significant (α = 0.008; Table 10). Hence, the null hypothesis that there was no difference pre- and post-transformation was supported.

Table 4: Distribution of A&Bs Pre- and Post- OER Transformation for six Georgia Colleges and Universities

College/University | Number of Students | A & B Pre-Transformation (percent) | A & B Post-Transformation (percent) | Support of Pre- or Post-?
East Georgia State College | 204 | 51 | 69 | Post
Georgia College and State University | 159 | 64.9 | 71.1 | Post
Kennesaw State University | 56 | 100 | 96 | Pre
Kennesaw State University | 70 | 14, 19 | 12, 16 | Pre, Pre
Middle Georgia State College | 210 | 38, 35, 65, 38, 36, 34 | 50, 39, 68, 59, 25, 34 | Post, Post, Post, Post, Pre, Neither
South Georgia State College | 34 | 83 | 73 | Pre
University of North Georgia | 95 | 40.9, 22.6 | 37.9, 27.5 | Pre, Post

Table 5: Distribution of Cs Pre- and Post- OER Transformation for five Georgia Colleges and Universities

College/University | Number of Students | C Pre-Transformation (percent) | C Post-Transformation (percent) | Support of Pre- or Post-?
East Georgia State College | 204 | 34 | 20 | Pre
Georgia College and State University | 159 | 17.7 | 13.2 | Pre
Kennesaw State University | 56 | 0 | 4 | Post
Kennesaw State University | 70 | 6, 7 | 5, 7 | Pre, Neither
Middle Georgia State College | 210 | 25, 32, 21, 35, 24, 20 | 20, 16, 12, 15, 25, 34 | Pre, Pre, Pre, Pre, Post, Post
South Georgia State College | 34 | 13 | 18 | Post

Table 6: Distribution of Ds Pre- and Post- OER Transformation for three Georgia Colleges and Universities

College/University | Number of Students | D Pre-Transformation (percent) | D Post-Transformation (percent) | Support of Pre- or Post-?
Georgia College and State University | 159 | 5.7 | 5.7 | Neither
Middle Georgia State College | 210 | 3, 0, 6, 9, 21, 29 | 3, 26, 33, 38, 16 | Neither, Post, Pre, Pre, Post, Pre
South Georgia State College | 34 | 0 | 6 | Post

Pre- and post-transformation data sets compiled for final exam grades (Table 7, affecting 186 students) and assessment grades (Table 8, affecting 328 students) resulted in the analysis of three courses each. All courses showed changes in favor of pre-transformation (Tables 7 and 8). A Shapiro-Wilk test indicated that the data were normally distributed and a Levene’s test indicated that there was equality of variance across the data (α = 0.05; Table 10). Although the raw scores were higher pre-transformation, these results were not statistically significant (α = 0.008; Table 10). Hence, the null hypothesis that there was no difference pre- and post-transformation was supported.

Table 7: Final Exam Grades Pre- and Post- OER Transformation for two Georgia Universities

College/University | Number of Students | Final Exam Grade Pre-Transformation (percent) per section | Final Exam Grade Post-Transformation (percent) per section | Support of Pre- or Post-?
Georgia Institute of Technology | 68 | 84 | 78 | Pre
Kennesaw State University | 56 | 95.37 | 92.78 | Pre
Kennesaw State University | 62 | 89 | 77 | Pre

Table 8: Assessment Grades Pre- and Post- OER Transformation for two Georgia Universities

College/University | Number of Students | Assessment Grade Pre-Transformation (percent) per section | Assessment Grade Post-Transformation (percent) per section | Support of Pre- or Post-?
Columbus State University | 188 | 64 | 58 | Pre
Kennesaw State University | 70 | 78 | 68 | Pre
Kennesaw State University | 70 | 74 | 64 | Pre

Data analysis was not performed within the final grade data category since only one paired course data set was provided (Table 9). However, in the sample of 68 students, final grades pre-transformation were favored.

Table 9: Final Grades Pre- and Post- OER Transformation for one Georgia University

College/University | Number of Students | Final Grade Pre-Transformation (percent) per section | Final Grade Post-Transformation (percent) per section | Support of Pre- or Post-?
Georgia Institute of Technology | 68 | 91 | 89 | Pre

The Shapiro-Wilk test and Levene test were evaluated at α = 0.05, and the paired t-test and Wilcoxon signed-rank test were evaluated at α = 0.008 (Table 10).

Table 10: Descriptive statistics for each set of data used in the analysis

Data Set | Sample Size (N = number of paired data sets) | Shapiro-Wilk Test p-value | Levene Test p-value | Paired t-Test or Wilcoxon Signed-Rank Test p-value
DFW rate | 24 | 0.054 | 0.25 | 0.51
Completion rate | 8 | 0.008 | 0.39 | 1
Number of A's & B's | 14 | 0.142 | 0.36 | 2.16
Number of C's | 10 | 0.436 | 1.1 | 2.20
Number of D's | 8 | 0.005 | 0.14 | 0.528
Final exam grade | 3 | 0.52 | 0.082 | 4.30
Course specific assessment | 3 | 0.819 | 0.68 | 4.30

The qualitative data that were provided varied. Quotes provided by students were generally uninformative with regard to their perception of the quality of the text; the vast majority of comments were about textbook cost (or lack thereof). Responses to survey data were more informative; however, since questions differed for each project, standardizing responses is impossible. That being said, general insight can be gleaned from these data. Of the 20 projects that provided survey data, 16 (80%) were on average positive or neutral with regard to OER quality and perceived learning, three (15%) conveyed an overall negative perception of OER, and one (5%) was uninformative with regard to OER quality and enhancement of learning. In the three projects with negative survey data, either specific chapters of the OER or the entire book was rated lower in quality than the traditional textbook. In these surveys, quality was generally perceived as organization, helpfulness with coursework, or visual appeal.

Discussion

The null hypothesis that there would be no differences between pre- and post-transformation rates of DFW, rates of completion, distribution of letter grades, final exam grades and course-specific assessment grades was supported (p-values ranged from 0.51 to 4.30). Thus, this study demonstrates that the USG’s ALG initiative helped students save money without negatively impacting learning outcomes. Non-significant results are important to report (Polanin, Tanner-Smith & Hennessy, 2016) and in this case support the utility of OER because they indicate that students did as well using an open resource as they did using a traditional resource. Furthermore, Polanin et al. (2016) suggested that not reporting non-significant results can create dissemination biases that affect which programs or policies are continued, whether or not they are effective. Additionally, the perpetuation of these biases may inhibit the growth of new research.

This study is the first of its kind to measure some of these learning outcomes (e.g. final exam grade, assessment grade, and distribution of letter grades) at this scale. Fischer, Hilton, Robinson and Wiley (2015) focused on course completion, final grade, and enrollment intensity measures in a multi-institution study but indicated that more replicative studies were necessary and suggested that questions pertaining to the grades individual students receive when using OER vs. traditional resources would be of value.

The overall results are not statistically significant even though some measures of student learning outcomes show small gains or decreases in student learning when OER are adopted. These results suggest a consistent level of student performance pre- and post-transformation and underscore the quality of each chosen OER. The survey data that were provided generally support the notion that students did not perceive a difference in quality or understandability when using the OER, and the demonstration that students performed equally well with the OER supports the perception of high quality.

This study indicates that the individual project investigators chose appropriate OER to substitute for the traditional text(s) and aligned their course objectives with them well. The differences between pre- and post-transformation might have been more pronounced, with different overall results, had the OER not been chosen and developed carefully. While the overall results are not statistically significant, there were individual instances in which students did better (or worse) when OER were implemented. Future studies should examine more carefully what factors coincide with higher or lower efficacy results. For example, it is possible that the change in resources resulted in instructor anxiety, lack of confidence or disorganization relating to the alignment of teaching materials with the new resources. Furthermore, it is possible that curriculum materials have a relatively small overall impact on student outcomes, and that the small differences observed here simply reflect that fact.

Moreover, further studies should examine whether there are connections between students’ utilization of curriculum materials and their overall scores. While explicit quantitative data on student use was not gathered, I have implicitly assumed that had utilization decreased significantly, it would have had a significant negative impact on student measures. However, it is conceivable that curriculum materials matter less than we think, or that the relative use of materials would need to be dramatically different in order to significantly influence student outcomes.

Limitations

Selecting projects that performed pre/post-transformation analysis, and further selecting for specific measures, whittled down the sample size for each data point, even though the overall sample size is large. This was a result of an inconsistent rate of reporting of specific data measurements among researchers (e.g. some reported only DFW rates while others reported only assessment grades), a lack of pre-transformation data reporting, and limited reporting of informative data (perhaps researchers were not sure what to report). Far more data should be collected by future Textbook Transformation Grant awardees to clearly address whether students are succeeding with OER. Additionally, the data that are collected should be consistent across grants: all grantees should collect the same types of data to form a more robust data set, and this data collection should be explicitly requested by ALG in the call for proposals and outlined in final reports. Additionally, identical surveys should be employed across grants to ensure consistency of qualitative data.

The overall non-significant differences between pre- and post-transformation may have come from the overall re-design of courses and not from the OER alone. In some cases, the OER may have necessitated a reexamination of the course, so it is possible that course objectives aligned better with the OER than with the traditional text. In addition, a fresh look at course material may have clarified objectives or alignment issues that were previously undetected. However, both of these factors are positive occurrences in terms of teaching and education.

Future Directions

To date, relatively little is known about the efficacy of OER, and additional large-scale studies are needed. With so many institutions now using OER, there is an opportunity to conduct research on many aspects, including those that focus on differences in outcomes between traditionally taught and OER-taught courses. Furthermore, some individual courses are taught by multiple professors, which would lend itself to studying learning outcomes based on pedagogical differences. Identifying differences in pedagogy may provide insight into the instructional design measures that may enhance OER learning outcomes.

The results of this study showed no difference in expected learning outcomes, which is satisfactory. However, most teachers are looking to improve student learning. It would be important to identify whether there are certain types or platforms of delivery of OER that assist in learning, or whether there are specific improvements that could be made to OER to augment learning.

Acknowledgements

I’d like to thank J. Gallant at ALG for providing the data for analyses. J. Hilton III deserves special recognition for providing comments on and direction of the manuscript. Thanks to L. Fischer and C.L. Mott for help with statistical analyses. Additional thanks to R. Bodily and several anonymous reviewers of previous versions of this manuscript. This work could not have been completed without the support of the OER research fellowship program provided by the Open Education Group (openedgroup.org) and additional support provided in part by the William and Flora Hewlett Foundation. The Foundation did not see or influence this work prior to its publication.

References

Affordable Learning Georgia (2015). Textbook Transformation Grants Round One: Spring Semester 2015 Final Report Summary. Retrieved from: http://affordablelearninggeorgia.org/documents/ALG_R1_Final_Report.pdf

Affordable Learning Georgia (2016). https://affordablelearninggeorgia.org

Allen, G., Guzman-Alvarez, A., Molinaro, M., & Larsen, D. (2015). Assessing the Impact and Efficacy of the Open-Access ChemWiki Textbook Project. Educause Learning Initiative Brief. Retrieved from https://library.educause.edu/resources/2015/1/assessing-the-impact-and-efficacy-of-the-openaccess-chemwiki-textbook-project

Allen, I. E., & Seaman, J. (2014). Opening the Curriculum: Open Educational Resources in US Higher Education, 2014. Babson Survey Research Group.

Altbach, P. G., Kelly, G. P., Petrie, H. G., Weis, L. (Ed.). (1991). Textbooks in American society: Politics, policy, and pedagogy. New York: State University of New York Press.

Bliss, T. J., Hilton III, J., Wiley, D., & Thanos, K. (2013). The cost and quality of online open textbooks: Perceptions of community college faculty and students. First Monday, 18(1). https://doi.org/10.5210/fm.v18i1.3972

Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2014). Interactive Learning Online at Public Universities: Evidence from a Six-Campus Randomized Trial. Journal of Policy Analysis and Management, 33(1), 94-111. https://doi.org/10.18665/sr.22464

Brown, M. B. & Forsythe, A. B. (1974). Robust Tests for the Equality of Variances. Journal of the American Statistical Association, 69, 364-367.

Buczynski, J. A. (2007). Faculty begin to replace textbooks with “freely” accessible online resources. Internet Reference Services Quarterly, 11(4), 169-179.

Charalambous, C. Y. & Hill H. C. (2012). Teacher knowledge, curriculum materials, and quality of instruction: Unpacking a complex relationship. Journal of Curriculum Studies, 44(4), 443-466.

College Board (2016). https://collegeboard.org

Feldstein, A., Martin, M., Hudson, A., Warren, K., Hilton III, J., & Wiley, D. (2012). Open Textbooks and Increased Student Access and Outcomes. European Journal of Open, Distance and E-Learning. Retrieved from http://www.eurodl.org/index.php?article=533

Fischer, L., Hilton III, J., Robinson, T. J. & Wiley, D. (2015). A multi-institutional study of the impact of open textbook adoption on the learning outcomes of post-secondary students. Journal of Computing in Higher Education, 27(3), 159-172. https://doi.org/10.1007/s12528-015-9101-x

Florida Virtual Campus. (2012). 2012 Florida Student Textbook Survey. Tallahassee, FL. Retrieved from http://www.openaccesstextbooks.org/%5Cpdf%5C2012_Florida_Student_Textbook_Survey.pdf

Glass, G. V. (1966). Testing homogeneity of variances. American Educational Research Journal, 3(3), 187-190.

Hewlett (2013). Open Educational Resources. Retrieved from http://www.hewlett.org/programs/education-program/open-educational-resources

Hilton III, J. L. (2016). Open educational resources and college textbook choices: a review of research on efficacy and perceptions. Educational Technology Research and Development, 64(4), 573-590. https://doi.org/10.1007/s11423-016-9434-9

Hilton III, J. L., & Laman, C. (2012). One college’s use of an open psychology textbook. Open Learning: The Journal of Open, Distance and e-Learning, 27(3), 265-272. http://dx.doi.org/10.1080/02680513.2012.716657

Hilton III, J. L., Robinson, T. J., Wiley, D., & Ackerman, J. D. (2014). Cost-savings achieved in two semesters through the adoption of open educational resources. The International Review of Research in Open and Distributed Learning, 15(2). http://dx.doi.org/10.19173/irrodl.v15i2.1700

Hilton III, J. L., & Wiley, D. (2011). Open access textbooks and financial sustainability: A case study on Flat World Knowledge. The International Review of Research in Open and Distributed Learning, 12(5), 18-26. http://dx.doi.org/10.19173/irrodl.v12i5.960

Hilton III, J. L., Wiley, D., Stein, J., & Johnson, A. (2010). The four ‘R’s of openness and ALMS analysis: frameworks for open educational resources. Open Learning: The Journal of Open, Distance and E-Learning, 25(1), 37-44. Retrieved from http://www.tandfonline.com/doi/full/10.1080/02680510903482132

Levene, H. (1960). Robust tests for equality of variances. In Contributions to probability and statistics (pp. 278-292). Stanford University Press.

Lovett, M., Meyer, O., & Thille, C. (2008). The open learning initiative: Measuring the effectiveness of the OLI statistics course in accelerating student learning. Journal of Interactive Media in Education, 2008(1), Art-13. http://doi.org/10.5334/2008-14

National Association of College Stores (NACS) (2016). https://nacs.org

OpenStax (2016). https://openstax.org

Open Textbook Library (2016). http://open.umn.edu/opentextbooks/

Paulsen, M. B., & St. John, E. P. (2002). Social class and college costs: Examining the financial nexus between college choice and persistence. The Journal of Higher Education, 73(2), 189-236.

Pawlyshyn, N., Braddlee, D., Casper, L., & Miller, H. (2013). Adopting OER: A Case Study of Cross- Institutional Collaboration and Innovation. Educause Review. Retrieved from http://er.educause.edu/articles/2013/11/adopting-oer-a-case-study-of-crossinstitutional-collaboration-and-innovation

Polanin, J. R., Tanner-Smith, E. E. & Hennessy, E. A. (2016). Estimating the difference between published and unpublished effect sizes: A meta-review. Review of Educational Research, 86(1), 207-236. https://doi.org/10.3102/0034654315582067

Provasnik, S., & Planty, M. (2008). Community Colleges: Special Supplement to The Condition of Education 2008. National Center for Education Statistics. Retrieved from https://nces.ed.gov/pubs2008/2008033.pdf

Razali, N. M. & Wah, Y. B. (2011). Power comparisons of Shapiro-Wilk, Kolmogorov-Smirnov, Lilliefors and Anderson-Darling tests. Journal of Statistical Modeling and Analytics, 2, 21-33.

Robinson, T. J. (2015). The Effects of Open Educational Resource Adoption on Measures of Post-Secondary Student Success. Unpublished doctoral dissertation, Brigham Young University, Provo, UT, USA.

Shapiro, S. S., & Wilk, M. B. (1965). An analysis of variance test for normality (complete samples). Biometrika, 52, 591-611.

UNESCO (2002). Forum on the impact of open courseware for higher education in developing countries: Final report. Retrieved from www.unesco.org/iiep/eng/focus/opensrc/PDF/OERForumFinalReport.pdf

Wilcoxon, F., Katti, S. K., & Wilcox, R. A. (1970). Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test. Selected tables in mathematical statistics, 1, 171-259.

Wiley, D., Bliss, T. J., & McEwen, M. (2014). Open Educational Resources: a review of the literature. In Handbook of research on educational communications and technology (pp. 781-789). New York: Springer.

Papers are licensed under a Creative Commons Attribution 4.0 International License
