Research articles

Student support service excellence evaluation: Balancing the Iron Triangle of accessibility, cost-effectiveness and quality?

Authors:

Asteria Nsamba, University of South Africa, ZA

Angie Bopape, University of South Africa, ZA

Bongi Lebeloane, University of South Africa, ZA

Laetitia Lekay, University of South Africa, ZA

Abstract

Recently, the University of South Africa widened access to academic facilities and services at one of its study centres. Although this is laudable and demonstrates the university's commitment towards its students, it raises three concerns: (1) What is the occupancy rate of the facilities? (2) To what extent are these improved facilities cost-effective? (3) What is the quality of the services at these facilities? A modified iron triangle was employed to analyse and determine the accessibility, cost-effectiveness and quality of the facilities. Data mining techniques involving descriptive analysis indicated that the most utilised service facilities were the computer laboratories and the least utilised was the study space. Moreover, the perceived service quality of the facilities was rated good to excellent by the majority of the respondents. The modified iron triangle was found to be useful in helping us understand the Student Support Excellence Project's (SSEP) improvements at the identified study centre.

How to Cite: Nsamba, A., Bopape, A., Lebeloane, B., & Lekay, L. (2021). Student support service excellence evaluation: Balancing the Iron Triangle of accessibility, cost-effectiveness and quality?. Open Praxis, 13(1), 37–52. DOI: http://doi.org/10.5944/openpraxis.13.1.1168
Submitted on 20 Sep 2020 · Accepted on 28 Feb 2021 · Published on 31 Mar 2021

Introduction

Many open distance learning (ODL) universities, such as the University of South Africa (UNISA), the Open University in the UK and Central Queensland University in Australia, have established learning spaces known as study centres to give students access to a variety of support services. Student support services, defined as a cluster of facilities and activities that make the learning process easier and more interesting for students (Krishnan, 2012), form an integral part of ODL. Both earlier research (Robinson, 1995; Tait, 2003) and more recent studies (Zuhairi et al., 2019; Ouma & Nkuyubwatsi, 2019; Makoe & Nsamba, 2019) emphasize the importance of support services in ODL. Zuhairi et al. (2019) note that support services motivate students to engage in their learning, and to learn autonomously and independently (p. 2).

Support services for undergraduate and postgraduate UNISA students include student registration; technical support; counselling and career development support; assignment management; face-to-face and online tutorial classes; as well as academic literacies, which include academic writing and numeracy. These support services are available online and face-to-face, and are provided at all study centres, located in different provinces of South Africa (Eastern Cape, Gauteng, KwaZulu-Natal, Limpopo, Mpumalanga, Free State, North West and the Western Cape), with one in Ethiopia. Facilities found in these centres include libraries, computer laboratories and study spaces, which are rooms that students use for discussion and study purposes. These centres are equipped with digital and information technologies, including assistive technologies for students with disabilities. Most of them are crowded on most days, especially on weekends and during examination periods (Nsamba & Makoe, 2017). While on a visit to one of these centres, we observed that students utilise all available spaces: unoccupied rooms, passages and open lobbies. Most users come from under-resourced communities and schools, and they are in dire need of these facilities to support their learning. Equally important is the fact that these centres give the students a sense of belonging to a community of higher education (Higher Education Quality Committee (HEQC), 2010).

As demographics shift to student communities comprising school-leavers and young adult learners, changing the landscape of distance education, study centres are becoming platforms for establishing collaborations, study groups and networks. Of interest to us as researchers are the utilisation and quality of these student support facilities and services, because little research has been conducted in this area. The study's focus is the Rustenburg study centre in North West Province, which serves approximately 12,000 registered students from Rustenburg town and the surrounding villages.

In 2017, UNISA commissioned a project titled the "Student Support Excellence Project" (SSEP) at the Rustenburg study centre, with the purpose of improving access to academic facilities and services in response to students' demands for resource accessibility. The project provided more study spaces and longer opening hours for three computer laboratories (Labs) and the Library, and introduced Saturday studies. Visiting hours were extended to 07h45–20h00 on weekdays and 08h00–16h00 on Saturdays. The Library, which has a seating capacity of 40, offers services such as information literacy training, electronic information resources and support on information searches; the Labs, which offer services such as digital skills training, technological support, and access to the LMS for facilitation of online assessments and online modules, have a combined capacity of 59. Additionally, four classrooms for study purposes, each with a seating capacity of 35, were provided.

Widening access to learning facilities and resources through longer opening hours is laudable and demonstrates a commitment by the university towards its students. Power and Gould-Morven (2011) refer to this as the student-administration interface, resulting in "a pull response to a student-initiated accessibility push" (p. 27). However, this raises three concerns, summarized in these questions: (1) What is the occupancy rate of the facilities? (2) To what extent are these improved facilities cost-effective? (3) What is the quality of the services at these facilities? The purpose of this study, therefore, was to analyse the SSEP improvements in order to assess the extent to which the support service facilities were utilised, and to determine their quality and cost-effectiveness. Data mining techniques were used to uncover information related to service facility utilisation and quality. Power and Gould-Morven's (2011) iron triangle perspective was employed to analyse the accessibility, cost-effectiveness and quality of the facilities.

The Iron Triangle Concept

The Higher Education Iron Triangle is a visual triangular model representing the three factors of access, cost and quality. The iron triangle concept, whose origins can be traced to the field of project management, theorises the relationship and interactions among access, cost and quality. The model argues that these three factors are bonded and interdependent, and that any change in one of them affects the other two, either individually or collectively. In higher education (HE), the factors of the triangle are considered to represent three key areas of university course delivery, namely access, cost and quality (Daniel et al., 2009). The triangle can be used as a set of criteria for managing access, costs and quality in HE. However, this concept is not seen as workable in its original form. Lane (2014, p. 2) cautions that "there is little scope to alter these factors advantageously", as improving one will worsen the others. This is corroborated by Immerwahr et al.'s (2008) study exploring the perspectives of college and university presidents in the United States (US). The majority of the participants indicated that any improvements to accessible and high-quality HE would escalate costs.

It is our observation that the unbreakable nature of the HE iron triangle factors has led to the development of several versions in the last decade. Daniel et al.'s (2009) proposed version is an iron triangle whose vectors (sides) can be altered for improvements, thus "breaking out of the iron triangle" (Power & Gould-Morven, 2011). Their proposition is that the economies of scale of the ODL teaching and learning model can help break the economic iron triangle that applies to campus-based institutions. In addition, this model deals with tradeoffs in the allocation of resources in an economically minded approach, thus minimising any conflicts among the three factors of the triangle. The concept of tradeoffs is regarded as a central operations strategy that forms the foundation of managers' approach to process improvements within organisations (Da Silveira & Slack, 2001). To Power and Gould-Morven (2011), these tradeoffs should be acceptable to all stakeholders of ODL: administrators, staff and students.

Extensions of Daniel et al.'s (2009) ODL triangle of access, cost and quality have been proposed in the literature (Power & Gould-Morven, 2011; Lane, 2014; Mulder, 2013). Of interest to this study is Power and Gould-Morven's (2011) modified triangular concept of three priorities. These authors modified the ODL triangle by replacing the term "vectors" with the term "priorities" at the corners of the triangle. Further modification included renaming two of the factors, access and cost, and associating each with a stakeholder group. Cost was renamed the cost-effectiveness priority and associated with the administrative staff stakeholder group. Access was renamed accessibility, defined as increasing access to courses, and associated with the student stakeholder group, because students are said to be the most concerned about accessibility to educational resources. The only factor that corresponds to Daniel et al.'s (2009) triangle is quality. However, it too was modified by being associated with faculty, because faculties are said to be defenders of quality (Power & Gould-Morven, 2011, p. 25).

The priorities in this version indicate how stakeholder groups interact as they “advance their agendas” (p. 26). Each group possesses the “push” and “pull” power. The push power is understood to mean putting forward some demands; and the pull power refers to responding to those demands. For example, when one stakeholder group pushes for improvements in teaching or learning, other stakeholder groups may respond favourably, if all priorities are aligned. Put succinctly,

A situation is created whereby one stakeholder group will respond to the priority of another, but only insofar as such a response does not impede the pursuit of their own priority. Ideally, this dynamic would lead to the state of equilibrium and the balancing of priorities between two stakeholder groups. However, should increasing accessibility lead to a state of worsening quality, then these two stakeholder groups would have overtly non-aligned priorities, resulting in a lower probability of pull at the faculty end (Power & Gould-Morven, 2011, p. 26).

The intention of Daniel et al.'s (2009) and Power and Gould-Morven's (2011) triangular perspectives is to strike a balance that will not affect any one of the factors or priorities negatively. However, in contrast to Daniel et al.'s (2009) ODL triangle, Power and Gould-Morven's (2011) version deals with the behaviours of specified stakeholder groups with the greatest stake in accessibility, cost-effectiveness and quality, as they relate to the specific provision of ODL programs.

Figure 1 shows the HE iron triangle of equal sides. Figure 2 shows Daniel et al.’s (2009) triangle that is flexible and can be adjusted; and Figures 3a and 3b depict Power and Gould-Morven’s (2011) modified triangle. Figure 3a shows student push and staff-pull alignment; and Figure 3b shows student push and staff push-back non-alignment.

Figure 1: Iron Triangle

Figure 2: Daniel et al.'s (2009) Triangle

Figure 3a: Power & Gould-Morven's (2011) triangle (student push and staff pull alignment)

Figure 3b: Power & Gould-Morven's (2011) triangle (student push and staff push-back non-alignment)

Power and Gould-Morven’s (2011) triangle was used in this study to analyse and determine accessibility, cost-effectiveness and the quality of the improved service facilities at the identified study centre. This version is appropriate for this study because it highlights interactions among the priorities and their stakeholder groups. ODL is a high involvement service system, with multiple service interactions of students, administrators and staff (Makoe & Nsamba, 2019). Understanding stakeholder group interactions and their priorities is a means to improved service delivery.

Literature Review

There are limited empirical studies in the literature focusing on the interactions of the original Iron Triangle's three factors (access, cost and quality) or on Power and Gould-Morven's (2011) version. Earlier research (Immerwahr et al., 2008), which explored the views of university presidents and the general public about educational demands in US public colleges and universities, has helped illuminate the nature of these factors. The study highlighted conflicting views regarding issues of access, cost and quality. The college and university presidents believed that the bond among these three factors was unbreakable and that any change in one of them would impact the other two. They suggested that HE costs were the responsibility of governments and parents. Conversely, the public surveys indicated that institutions could make HE accessible to "more students without compromising quality or increasing tuition" (p. 33). Contrary to the presidents' view, the public participants indicated that there was no unbreakable relationship among the three factors: 56% believed that quality could be maintained at a low cost, and four in ten believed that mismanagement and waste were driving up costs.

A more recent qualitative study employed the Iron Triangle to understand access to quality postgraduate ODL education offered by the Indira Gandhi National Open University (IGNOU) in Addis Ababa, Ethiopia (Woldeyes, 2016). The study's premise was that access to distance education was cost-effective; therefore, it was imperative to understand aspects of the quality of distance education courses and support services. The study's findings indicated that the quality of these services was perceived as satisfactory by the majority of the student participants. However, some students were dissatisfied with the quality of support services such as feedback. Woldeyes (2016) further observed that the cost of reproducing and distributing e-learning study material was minimal, which made ODL cost-effective and accessible for students. This observation corroborated earlier studies (Hulsmann, 2004; Gaba, 2004; Rumble, 2003) that found the distance education system cost-effective. However, we are beginning to see rapid increases in higher education costs, which affect ODL institutions as well. Gaba and Li (2015) noted that ODL systems in countries like India and China are experiencing decreases in government funding, thereby shifting the responsibility to students.

Woldeyes's (2016) research has highlighted the bond that exists among quality, cost-effectiveness and accessibility, and has also indicated the importance of these factors in ODL. The discussion that follows is based on studies that have not applied the iron triangle, because there is limited research in this area. Instead, they have examined access/accessibility, cost/cost-effectiveness and quality separately (Apuke & Iyendo, 2018; Mawere & Sai, 2018; Onifade et al., 2013; Salubi et al., 2018; Becker et al., 2017; Olajide & Adio, 2017). We believe that this discussion will help the reader visualise a triangle representing access/accessibility, cost/cost-effectiveness and quality. In addition, the findings of these studies will help us understand how various stakeholder groups behave in their sphere of influence. The primary participants in these studies were the student stakeholder group. This group visits libraries for personal study and research, and to access the computer laboratory, Wi-Fi and other resources. Reported library use ranges from high through moderate to low, with most studies reporting low use. A contrast was found in Becker et al.'s (2017) study, which observed an increase in library visits in recent years. Goodall and Pattern (2011) define low use as: (1) making fewer than five visits to the library; (2) logging in to the university's electronic resources collection fewer than five times; or (3) borrowing fewer than five books, during an academic year.
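To make Goodall and Pattern's (2011) thresholds concrete, the short sketch below encodes them as a simple classifier. This is our own illustration: the function name and structure are hypothetical, not taken from the original study.

```python
# Illustrative encoding of Goodall and Pattern's (2011) low-use
# thresholds; the helper itself is our construction, not theirs.
def is_low_use(library_visits: int, e_resource_logins: int,
               books_borrowed: int) -> bool:
    """A student counts as a low user if, in an academic year, they make
    fewer than five library visits, log in to the electronic resources
    collection fewer than five times, OR borrow fewer than five books."""
    return (library_visits < 5
            or e_resource_logins < 5
            or books_borrowed < 5)

# Example: three visits, six logins, one loan -> low use (visits < 5)
print(is_low_use(3, 6, 1))  # True
```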

A study on the utilisation of library support services and resources by postgraduate students (Onifade et al., 2013) revealed that the library was used only occasionally by the majority of these students (47%), with a mere 10.5% using the library daily; the most utilised resource was the Internet facility, and only 14% of the students visited the library to study for examinations. The authors observed that the majority of postgraduate students at the institution were full-time workers who had to combine work and study, hence their occasional library use.

Similarly, a study involving 390 students from two South African universities (Salubi et al., 2018) indicated that the majority of the respondents rarely or never used the library databases and e-resources, including e-books, and did not utilise information literacy training, which was recorded as the least used service; the most used was the Wi-Fi. The study showed that 44.3% of the students visited the library occasionally, 27.3% almost daily and 15.7% never. A further revelation was that 83.5% of the students visited the library to access Wi-Fi, not e-resources or databases. The study also indicated that 63% of the respondents always used computer labs, and 31.3% used the discussion room.

Studies have also evaluated levels of satisfaction in using library services and resources. Becker et al. (2017) evaluated students' use of library facilities, as well as service satisfaction and accessibility, at a South African university of technology. Data indicated that 72% of the students were satisfied with library facilities and 62% with the computer facilities, "despite the long queues often experienced by students waiting to use the computer facilities" (p. 17); 80% perceived the library as comfortable and inspiring. On the contrary, a more recent quantitative study (Mawere & Sai, 2018) found that students were dissatisfied with most of their university's library facilities and resources. Dissatisfaction was caused by lack of access to e-resources and relevant materials, inadequate reading space and an unpredictable power supply. A poor library staff-student relationship due to untrained staff, together with low bandwidth, was also cited as a contributory factor to non-utilisation of library resources. Similar results were reported by Olajide and Adio (2017). In addition, Gathoni and van der Walt's (2019) study uncovered service quality gaps between library users' expectations and perceptions.

Dissatisfaction regarding inadequate internet access in three universities in Nigeria was also reported by Apuke and Iyendo (2018). This study found that 86.8% of students had inadequate Internet access, while 13.2% reported having adequate access. This is in contrast with India and China, whose citizens "enjoy" good infrastructure, such as the Internet, due to fast industrial development (Gaba & Li, 2015).

It is worth noting that the student stakeholder group is usually seen “as a more disembodied influence” (Power & Gould-Morven, 2011, p. 25) than other groups. In ODL, this situation could be worse due to distance. Power and Gould-Morven (2011) have indicated the importance of integrating all the groups into an overall strategy. Makoe and Nsamba (2019) assert that this could be achieved by evaluating the quality of offerings or support services from students’ perspective.

Quality is critical in ODL; as noted by Gaba and Li (2015), courts in India have declared that ODL is not on par with conventional universities due to deterioration in quality assurance practices. ODL universities are making efforts to quality-assure their offerings and services, and research conducted on service quality includes Uppal et al. (2017), Makoe and Nsamba (2019) and Dursun et al. (2014). In addition, service quality dimensions have been tested and recommended by researchers such as Gathoni and van der Walt (2019) and Makoe and Nsamba (2019). Gathoni and van der Walt (2019) suggest reliability, access and collection as appropriate dimensions for assessing library quality, whilst Makoe and Nsamba (2019) proposed the following modified dimensions for ODL support service quality:

  • Tangibles: adequate and appropriate physical facilities: study centres, equipment; friendly personnel
  • Reliability: the ability to perform the desired service dependably, accurately, and consistently; keeping promises to match to the goals; handling complaints; solving problems and understanding users’ needs
  • Delivery: feedback; guidance on learning; guidance on assignments; access to academic and administrative staff
  • Assurance: the knowledge and competence of the staff; possession of necessary skills; staff courtesy and their ability to inspire trust and confidence (p. 4)

Quality has also been linked to satisfaction as in Hsu et al. (2014) and in a more recent study (Gathoni & van der Walt, 2019) which examined students’ perceptions of library service quality dimensions (Parasuraman et al., 1988) and found gaps in the services.

The final important concept in this review is the cost-effectiveness priority. As suggested by Power and Gould-Morven (2011), cost-effectiveness is a more significant indicator of quality in the Iron Triangle than the cost factor. There are limited studies on the cost-effectiveness of ODL student support facilities and services. Research in this area has focused on online learning (Jung, 2005), e-learning (Hulsmann, 2004) and the cost of the ODL system (Gaba & Li, 2015; Gaba, Panda & Murthy, 2011; Rumble, 2003). When addressing the issue of cost-effectiveness evaluation, Gaba (2004), who has written extensively on issues of costs in ODL (Gaba & Li, 2015; Gaba et al., 2011), suggested that cost-effectiveness analysis is appropriate because it addresses inputs in terms of the level of achievement of the objectives. In addition, this technique is used to compare the cost of a programme or project relative to its expected benefits when it is difficult to monetise the outcomes (Cellini & Kee, 2015; Johnson, 2014). This gives credence to Power and Gould-Morven's (2011) use of the term cost-effectiveness.
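For reference, the generic cost-effectiveness ratio that underlies such analyses (a standard textbook formulation we add here; it is not a formula reported in this article) divides total programme cost by the units of outcome achieved:

$$\text{CE ratio} = \frac{\text{total programme cost}}{\text{units of effectiveness achieved}}$$

The lower the ratio, the more cost-effectively a given outcome is produced; in this study, facility utilisation (occupancy) serves as the outcome measure.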

In this study, the cost-effectiveness priority is critical because it examines whether the outcomes of the SSEP were achieved and whether the administrators received their money's worth. This is imperative because the SSEP was borne out of a push by the student stakeholder group.

Research Processes

Data Collection

This study used data mining and descriptive analysis to understand the accessibility, cost-effectiveness and quality of the improved student support facilities and services. Quantifiable information representing the quality and utilisation of facilities and student support services was extracted from the students' information dataset.

The target population consisted of all students, regardless of age, gender, field and level of study, who visited the Library, Computer Laboratories and Study Space after working hours. Normal working hours in South Africa are between 07:00 and 16:00. Data were collected from 01 July 2017 to 31 May 2018.

Three data collection and analysis processes were followed:

  1. Data showing the occupancy of the facilities were mined and analysed. Each student who utilized the services of any of the three facilities signed an attendance register. The data (from the attendance register) represented accessibility priority.
  2. Service quality data were mined from a questionnaire that was distributed to the whole population to evaluate the service facilities. The questionnaire measured the following three attributes on a five-point rating scale ("Excellent", "Good", "Average", "Poor", "Very Poor"), with an additional "No Answer" option:
    • The level of knowledge of the staff that assisted you.
    • Friendliness of the staff that assisted you (including security and cleaning staff).
    • The cleanliness of the area you were visiting.
  3. To understand and determine the cost-effectiveness of the three facilities, the occupancy rate was calculated. In this study, occupancy rate means the extent to which services and facilities were utilised. It should be highlighted that utilisation of facilities and services was dependent upon the facilities' accessibility.

Data Analysis

The first part of the analysis is based on visits to the three facilities: the Labs, the Library and the Study Space. This is intended to capture monthly use and average occupancy of the service facilities. The second part of the analysis focuses on the service quality of the facilities. The three stakeholder groups relevant in this analysis are administrators, staff (faculty) and students.

Service Facility Use and Occupancy

Data indicated that over the data collection period, from 01 July 2017 to 31 May 2018, during which each facility was in use for eight months, students' use of the different support facilities varied. The Labs had the highest number of total attendances (451), the Study Space had 326, while the Library had the lowest (217). Average occupancy for the Labs was 96%, for the Library 68% and for the Study Space 39%. Occupancy is the number of students who attended in relation to the number of spaces available. Table 1 presents the breakdown of service facility monthly use; the sketch after the table reproduces the occupancy calculations. A dash (-) indicates that the facility was unoccupied during that month.

Table 1

Service Facility Monthly Use

          2017                           2018
          Jul   Aug   Sep   Oct   Nov   Jan   Feb   Mar   Apr   May   Total   Average per month   Capacity   % Average occupancy
Lab       110   125    32    70     4     -    55    28     -    27     451          56                59           96%
Study      72    54    32    32     -     -    28    46    42    20     326          41               105           39%
Library     1    82     -    10     -    10    41    10     3    60     217          27                40           68%
Total     183   261    64   112     4    10   124    84    45   107
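The occupancy figures in Table 1 can be reproduced directly from the monthly attendance counts. The following minimal sketch (our own illustration in Python, not the authors' actual tooling) recomputes the averages and occupancy rates:

```python
# Recompute Table 1: occupancy = average monthly attendance / capacity.
# Attendance counts are the eight occupied months per facility.
monthly_attendance = {
    "Labs":        [110, 125, 32, 70, 4, 55, 28, 27],
    "Study Space": [72, 54, 32, 32, 28, 46, 42, 20],
    "Library":     [1, 82, 10, 10, 41, 10, 3, 60],
}
capacity = {"Labs": 59, "Study Space": 105, "Library": 40}

for facility, counts in monthly_attendance.items():
    avg_per_month = sum(counts) / len(counts)
    occupancy = avg_per_month / capacity[facility]
    print(f"{facility}: total={sum(counts)}, "
          f"avg/month={avg_per_month:.0f}, occupancy={occupancy:.0%}")

# Output: Labs 96%, Study Space 39%, Library 68% -- matching Table 1.
```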

The Computer Labs

The occupancy data show that the Labs were the most frequented and most utilised of the three facilities, at 96% occupancy. An average monthly facility utilisation of 56 was recorded against an available capacity of 59, which gives an average occupancy of approximately 96% (451 visits/8 months = 56.4; 56.4/59 ≈ 96%). Services offered in these facilities include digital skills training, technological support and the LMS, which makes this area the busiest of the whole study centre.

The Library

The second most utilised facility is the Library, with 68% average occupancy. An average monthly facility utilisation of 27 was recorded against an available capacity of 40, which gives an average occupancy of approximately 68% (27/40 ≈ 68%). The Library offers services such as information literacy training, electronic information resources and support on information searches. The largest attendances were 82 visits in August 2017 and 60 in May 2018.

Study Space

Data show that the least utilised facility is the Study Space, consisting of three classrooms. The average facility utilisation per month is 41, against an available capacity of 105, which gives an average occupancy of 39% (41/105 ≈ 39%). This implies that, on average, two of the classrooms remained unoccupied over the eight-month period. Data also indicate that during September and October 2017, and February and May 2018, average attendance was as low as 28 students per month ((32 + 32 + 28 + 20)/4 = 28), roughly 27% occupancy. This result is largely consistent with Salubi et al. (2018) and Onifade et al. (2013), who indicated that 31.3% and 14% of students used study rooms, respectively.

Quality

The Computer Labs Quality

Data, as summarised in Table 2, show that an average of 97% of responses ((970 + 337)/1350) over the eight-month period rated the quality of the Labs as good or excellent. 72% of respondents rated the level of knowledge of the staff excellent, and 25% rated it good; 70% rated the friendliness of the staff excellent, and 27% rated it good. The cleanliness of the Labs was rated excellent by 74% of respondents and good by 23%.

Table 2

Computer Labs Quality

                                                 Excellent   Good   Average   Poor   Very poor   No Answer   Total
Level of knowledge from staff                        324      112         9      2           2           2     451
Friendliness of staff member                         315      121         8      2           2           3     451
The cleanliness of the area you were visiting        331      104         8      1           2           2     448
Total                                                970      337        25      5           6           7    1350
Average Total Percentage                             72%      25%        2%     0%          0%          1%
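The "Average Total Percentage" row can be derived by dividing each rating's column total by the grand total of responses. The brief sketch below (ours, using the Table 2 counts) shows the arithmetic; Tables 3 and 4 follow the same pattern.

```python
# Derive the percentage row of Table 2 from the "Total" row counts.
ratings = ["Excellent", "Good", "Average", "Poor", "Very poor", "No Answer"]
column_totals = [970, 337, 25, 5, 6, 7]   # bottom "Total" row of Table 2
grand_total = sum(column_totals)           # 1350 responses

for rating, count in zip(ratings, column_totals):
    print(f"{rating}: {count / grand_total:.0%}")

# Excellent (72%) + Good (25%) yields the ~97% "good or excellent"
# figure quoted in the text above.
```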

The Library Quality

Data, as summarised in Table 3, show that an average of 95% of responses ((460 + 183)/676) over the eight-month period rated the quality of the Library as excellent or good. 68% of respondents rated the level of knowledge of the staff excellent, and 27% rated it good; 67% rated the friendliness of the staff excellent, while 29% rated it good. The cleanliness of the Library was rated excellent by 70% of respondents and good by 27%.

Table 3

Library Quality

                                                 Excellent   Good   Average   Poor   Very poor   No Answer   Total
Level of knowledge from staff                        153       57         8      3           2           5     228
Friendliness of staff member                         149       65         7      0           1           1     223
The cleanliness of the area you were visiting        158       61         5      0           0           1     225
Total                                                460      183        20      3           3           7     676
Average Total Percentages                            68%      27%        3%   0.5%        0.5%          1%

The Study Space Quality

The data, as summarised in Table 4, show that an average of 83% of responses ((444 + 347)/951) over the eight-month period rated the quality of the Study Space as excellent or good. 47% of respondents rated the level of knowledge of the staff excellent, whereas 36% rated it good; 43% rated the friendliness of the staff excellent and 38% rated it good. The cleanliness of the Study Space was rated excellent by 49% of respondents and good by 32%.

Table 4

Study Space Quality

                                                 Excellent   Good   Average   Poor   Very poor   No Answer   Total
Level of knowledge from staff                        151      124        24      1           3          14     317
Friendliness of staff member                         137      120        31      5           6          18     317
The cleanliness of the area you were visiting        156      103        34      5           7          12     317
Total                                                444      347        89     11          16          44     951
Average Total Percentages                            47%      36%        9%     1%          2%          5%

Discussion

The purpose of this study was to analyse the SSEP improvements implemented by UNISA at one of its study centres. Three concerns were raised: (1) What is the occupancy rate of the facilities? (2) To what extent are these improved facilities cost-effective? (3) What is the quality of the services at these facilities? The study used Power and Gould-Morven's (2011) iron triangle perspective, consisting of the accessibility, cost-effectiveness and quality priorities, as criteria to understand how these priorities were managed at the following facilities: Labs, Library and Study Space. The results show that satisfactory levels of accessibility, as demanded by the students, were achieved, and perceived service quality was found to range from good to excellent. The study also found that two facilities have low levels of occupancy, which suggests that they are not cost-effective. These results support Power and Gould-Morven's (2011) model. As depicted by this model, stakeholder groups interact to advance their priorities, and these interactions can lead to alignment or non-alignment of priorities, depending on the direction of their push or pull powers. The results support this logic. The study found that the student stakeholder group's push for improved accessibility to facilities and services led to high levels of accessibility to services. In addition, the quality of these services was highly rated, which is an indication of students' satisfaction with the services. Consistent with Power and Gould-Morven's (2011) model, certain priorities are aligned and others are not. The discussion below is based on the three questions asked in this study.

The occupancy rate and the cost-effectiveness of the facilities

The Computer Laboratories

The occupancy data show that the Labs were the most frequented and most utilised of the three facilities, at 96% occupancy. This high percentage also indicates the Labs' accessibility. Services offered in these facilities include digital skills training, technological support and the Learning Management System (LMS), which makes this area the busiest of the whole study centre. Although the occupancy rate looks satisfactory, the Labs at this study centre are not operating at full capacity and there are no long queues, in contrast to the first author's observations of long queues at computer labs in other study centres she visited. (The first author facilitates workshops at regional centres.) In support, Nsamba and Makoe's (2017) research indicated that computer rooms are the most utilised facilities at study centres, and Becker et al. (2017) reported the long queues experienced by university students waiting to use their computer facilities. A more recent study (Salubi et al., 2018) reported that 63% of respondents always used computer labs. These data are consistent with earlier research (Saadon & Liong, 2011) that found that more than 95% of respondents used the computer lab at least once a week.

We find this utilisation acceptable and cost-effective due to the high occupancy; we can conclude that this is an acceptable threshold (equilibrium). However, what is unclear and surprising are the non-attendances in January and April 2018, given that these facilities are the busiest; the Library was occupied in both months, and the Study Space in April. Even more surprising is that the quality data show no indication of dissatisfaction from the student stakeholder group in this regard.

The Library

According to the data, the Library is the second most utilised facility, with 68% average occupancy. The Library offers services such as information literacy training, electronic information resources and support on information searches. There was an average of 27 visits per month, which translates to about 12% of the Library's total visits. The highest number of visits was 82, in August 2017, which constituted 37% of the total. Although these results contrast with those of Onifade et al. (2013), Olajide and Adio (2017) and Salubi et al. (2018), our view is that these visits are still low. Onifade et al. (2013) indicated that only 10.5% of the students were using the library daily, and 47% occasionally; and Salubi et al. (2018) indicated that the majority of the respondents rarely or never used the library e-resources and information literacy training.

The low library use suggests that students do not borrow library books or access electronic resources as expected. It also suggests that they make fewer than five visits to the library, log into the electronic resources fewer than five times, or borrow fewer than five books (Goodall & Pattern, 2011). This is worrying because this study centre serves approximately 12,000 undergraduate and postgraduate students. The likelihood is that many of these students may not be reading and preparing well enough for their studies. As Salubi et al. (2018) indicated, students rarely or never use library resources. This is not cost-effective for UNISA because the library operates for 12 hours on weekdays and 8 hours on Saturdays. Again, these low levels of facility usage are in contrast to students' demand for increased access to facilities and services.

It would be interesting to see the nature of administrators' pushback now that a non-alignment of priorities has emerged. This could bring the three stakeholder groups (students, administrators and staff) into conflict because, firstly, low occupancy is not cost-effective and, secondly, less reading may suggest lower quality of students' work. To reiterate, concerns have been raised over the years that ODL students' performance is far lower than that of students at conventional institutions.

Study Space

Data show that the Study Space, consisting of three classrooms, is the least utilised facility. This occupancy rate is very low and indicates that, on average, two of the classrooms remained unoccupied over the eight-month period. This result is largely consistent with Salubi et al. (2018) and Onifade et al. (2013), who indicated that 31.3% and 14% of their participants used study rooms, respectively.

This is a concern because students' push for increased access to study spaces resulted in a 'pull' of three additional classrooms for study purposes and discussions, which now stand largely idle. These facilities are not cost-effective because of the low levels of occupancy and long operating hours. This could be another source of conflict among the three stakeholder groups due to non-alignment of priorities. We expect some form of staff and administrator pushback regarding this situation. At this stage, the staff have the right to demand good quality work from students because students have access to adequate facilities. The administrators, on the other hand, should be concerned about this low occupancy and should push for adequate occupancy.

The quality of the improved support service facilities and cost-effectiveness

Power and Gould-Morven's (2011) triangular perspective indicates that the quality priority is the responsibility of university faculty/staff. Different service quality models suggest that service quality should be evaluated by the users themselves. In line with this, the SSEP quality was evaluated from the students' perspective. The quality attributes that measured perceived quality for the three facilities were "knowledge of staff", "friendliness of staff" and "cleanliness of the area visited". The overall results indicated that the three facilities were highly rated, which means that the students were satisfied with their service. These results support Power and Gould-Morven's (2011) assertion that students pull quality when it promotes accessibility. Nsamba and Makoe (2017) also found that students award excellent ratings to study centres that provide excellent service. This is corroborated by Woldeyes (2016), who found that good quality student support services lead to student satisfaction.

In these results, we have observed a point of equilibrium whereby all the three stakeholder groups attain acceptable levels of satisfaction of their priorities. This indicates that acceptable threshold levels can be attained if stakeholders understand their priorities. The results suggest that quality seems to be the only priority that has led to the desired outcomes among the three stakeholder groups. Therefore, thus far, there is an appropriate pull by all the stakeholders.

Recommendations

The results of the study suggest that some stakeholder groups may not have a clear understanding of their priorities and may not work hard enough to promote or protect them.

The students have access to improved support facilities, and the quality surveys indicated their satisfaction with the services. However, the facilities are not adequately utilised. Are students taking these improvements for granted? To prevent non-alignment of priorities, we recommend that each stakeholder group understand what their priorities are, in order to achieve acceptable levels of alignment of these priorities. In this case, administrators are responsible for directing the activities of the University and must help students and staff achieve their objectives. Students, on the other hand, have a responsibility towards their studies; therefore, they should utilise the services provided to them. In the same breath, the staff should organise more face-to-face tutoring support.

The results also indicate that students do not engage in required reading, despite having resources in the library. We recommend that the staff should provide more activities on reading to promote the use of the library and to improve the quality of students’ work. We also recommend more training sessions on how to access library resources.

Our reflections should also touch on concerns raised over the years that ODL students are lonely and unsupported. Our observation is that students visit these centres in the hope of receiving academic support from the staff. We recommend the use of teleconferencing technology to support the students; the idle facilities could be used for this purpose.

Lastly, a limitation found in the data is that the questionnaire administered in this project did not include many important items that would have revealed more about the quality of the facilities. We recommend that more items be included in this questionnaire to capture other aspects of service quality.

Conclusion

Over the years, scholars have been vocal about the provision of quality support services in ODL institutions. Two variables can be added to this discourse, namely cost-effectiveness and satisfactory levels of accessibility to ODL facilities and services. Power and Gould-Morven's (2011) model was found appropriate for understanding the management of these three priorities in an ODL environment. We recommend the application of this model in ODL because it recognises tradeoffs and emphasizes the attainment of acceptable levels of satisfaction of the priorities of students, staff and administrators, thus balancing this iron triangle, unlike the notion of breaking the iron triangle. Daniel et al.'s (2009) iron triangle is more suited to comparing campus-based learning with ODL.

The focus of this study was on students’ utilisation of service facilities and their perceptions of these facilities, leaving out other stakeholders such as academic staff and administrators. We suggest that future research should examine the other two stakeholders.

References

  1. Apuke, O.D., & Iyendo, T. O. (2018). University students’ usage of the internet resources for research and learning: Forms of access and perceptions of utility. Heliyon, 4(12), e01052. https://doi.org/10.1016/j.heliyon.2018.e01052 

  2. Becker, H., Hartle, H., & Mhlauli, G. (2017). Assessment of use and quality of library services, accessibility and facilities by students at Cape Peninsula University of Technology. South African Journal of Libraries and Information Science, 83(1), 11–24. http://dx.doi.org/10.7553/83-1-1642 

  3. Cellini, S., & Kee, J. (2015). Cost-Effectiveness and Cost-Benefit Analysis. In K.E. Newcomer, H.P. Hatry, & J.S. Wholey (Eds.), Handbook of practical program evaluation (4th ed, pp. 636–672). Wiley. https://dx.doi.org/10.1002/9781119171386.ch24 

  4. Daniel, S. J., Kanwar, A., & Uvalic-Trumbic, S. (2009). Breaking higher education’s iron triangle: Access, cost, and quality. Commonwealth of Learning. http://hdl.handle.net/11599/1442 

  5. Da Silveira, G.J.C., & Slack, N. (2001). Exploring the tradeoff concept. International Journal of Operations & Production Management, 21(7), 949–964. https://dx.doi.org/10.1108/01443570110393432 

  6. Dursun, T., Oskaybas, K., & Gokmen, C. (2014). Comparison of quality of services of distance education universities. The Online Journal of Science and Technology, 4(3). https://files.eric.ed.gov/fulltext/EJ1105551.pdf 

  7. Gathoni, N. & Van der Walt, T. (2019). Evaluating library service quality at the Aga Khan University library: Application of a total quality management approach. Journal of Librarianship and Information Science, 51(1), 123–136. https://doi.org/10.1177/0961000616679725 

  8. Gaba, A.K. (2004). Cost analysis in Open and Distance Learning. Indira Gandhi National Open University. 

  9. Gaba, A., Panda, S., & Murthy, C. R. K. (2011). Costing distance learning: A study of the Indian mega university. International Journal of Instructional Technology and Distance Learning, 8(6), 59–75. 

  10. Gaba, A.K., & Li, W. (2015). Growth and development of distance education in India and China: A study on policy perspectives. Open Praxis, 7(4), 311–323. http://dx.doi.org/10.5944/openpraxis.7.4.248 

  11. Goodall, D., & Pattern, D. (2011). Academic library non/low use and undergraduate student achievement: A preliminary report of research in progress. Library Management, 32(3), 159–170. https://doi.org/10.1108/01435121111112871 

  12. Higher Education Quality Committee (HEQC) (2010). UNISA HEQC audit report, 2009 (Report No. 24). Council for Higher Education (CHE). 

  13. Hsu, M.K., Cummings, R.G., & Wang, S. W. (2014). Business students’ perception of university library service quality and satisfaction. Contemporary Issues In Education Research, 7(2), 137–144. 

  14. Hulsmann, T. (2004). Low cost distance education strategies: The use of appropriate information and communication technologies. The International Review of Research in Open and Distributed Learning, 5(1), 1–14. https://doi.org/10.19173/irrodl.v5i1.175 

  15. Immerwahr, J., Johnson, J., & Gasbarra, P. (2008). The Iron Triangle: College presidents talk about costs, access, and quality. The National Center for Public Policy and Higher Education and Public Agenda. 

  16. Johnson, J. (2014). Cost-effectiveness and cost-benefit analysis of governance and anti-corruption activities. Issues Paper 10, CMI/U4, Bergen. https://www.u4.no/publications/cost-effectiveness-and-cost-benefit-analysis-of-governance-and-anti-corruption-activities.pdf 

  17. Jung, I. (2005). Cost-effectiveness of online teacher training. Open Learning, 20(2), 131–146. https://doi.org/10.1080/02680510500094140 

  18. Krishnan, C. (2012). Student support services in distance higher education in India: A critical appraisal. International Journal of Research in Economics & Social Sciences, 2(2), 459–472. 

  19. Lane, A. (2014). Placing students at the heart of the Iron Triangle and the Interaction Equivalency models. Journal of Interactive Media in Education, 2(5), 1–8. http://doi.org/10.5334/jime.ac 

  20. Makoe, M., & Nsamba, A. (2019). The gap between student perceptions and expectations of quality support services at the University of South Africa. American Journal of Distance Education, 33(11), 1–10. https://doi.org/10.1080/08923647.2019.1583028 

  21. Mawere, T., & Sai, K.O.S. (2018). An investigation on e-resource utilisation among university students in a developing country: A case of Great Zimbabwe University. South African Journal of Information Management, 20(1), a860. https://doi.org/10.4102/sajim.v20i1.860 

  22. Mulder, F. (2013). The LOGIC of national policies and strategies for Open Educational Resources. International Review of Research in Open and Distance Learning, 14(2), 96–104. https://doi.org/10.19173/irrodl.v14i2.1536 

  23. Nsamba, A., & Makoe, M. (2017). Evaluating quality of students’ support services in open distance learning. Turkish Online Journal of Distance Education, 18(4), 91–103. https://files.eric.ed.gov/fulltext/EJ1161816.pdf 

  24. Olajide, O., & Adio, G. (2017). Effective utilisation of university library resources by undergraduate students: A case study of Federal University Oye-Ekiti, Nigeria. Library Philosophy and Practice (e-journal). http://digitalcommons.unl.edu/libphilprac/1503 

  25. Onifade, N.F., Ogbuiyi, S.U., & Omeluzor, S.U. (2013). Library resources and service utilization by postgraduate students in a Nigerian private university. International Journal of Library and Information Science, 5(9), 289–294. 

  26. Ouma, R., & Nkuyubwatsi, B. (2019). Transforming university learner support in open and distance education: Staff and students perceived challenges and prospects. Cogent Education, 6(1), https://doi.org/10.1080/2331186X.2019.1658934 

  27. Parasuraman, A., Zeithaml, V. A., & Berry, L. L. (1988). A multi-item scale for measuring consumer perceptions of service quality. Journal of Retailing, 64(1), 12–40. 

  28. Power, M., & Gould-Morven, A. (2011). Head of gold, feet of clay: The online learning paradox. The International Review of Research in Open and Distance Learning, 12(2), 19–39. https://doi.org/10.19173/irrodl.v12i2.916 

  29. Robinson, B. (1995). Research and pragmatism in learner support. In F. Lockwood (Ed.), Open and distance learning today (pp 221–231). Routledge. 

  30. Rumble, G. (2003). Modeling the costs and economics of distance education. In M. G. Moore & W. G. Anderson (Eds.), Handbook of distance education. Lawrence Erlbaum Associates. 

  31. Saadon, S., & Liong, C.Y. (2011). Perception of students on services at the computer laboratory: A case study at the School of Mathematical Sciences, Universiti Kebangsaan Malaysia. Procedia - Social and Behavioral Sciences, 59(2012), 117–124. https://doi.org/10.1016/j.sbspro.2012.09.254 

  32. Salubi, O.G., Okemwa, E.O., & Nekhwevha, F. (2018). Utilisation of library information resources among Generation Z students: Facts and fiction. Publications 2018, 6(16). https://doi.org/10.3390/publications6020016 

  33. Tait, A. (2003). Reflections on student support in open and distance learning. International Review of Research in Open and Distance Learning, 4(1), 1–9. https://doi.org/10.19173/irrodl.v4i1.134 

  34. Uppal, M.A., Ali, S., & Gulliver, S.R. (2017). Factors determining elearning service quality. British Journal of Educational Technology, 49(3), 412–426. https://doi.org/10.1111/bjet.12552 

  35. Woldeyes, M.M. (2016). Breaking the Higher Education Iron Triangle through Distance Education: The case of IGNOU in Addis Ababa, Ethiopia. International Journal of Education, 8(3), 31–49. https://doi.org/10.5296/ije.v8i3.9771 

  36. Zuhairi, A., Karthikeyan, N., & Priyadarshana, S.T. (2019). Supporting students to succeed in open and distance learning in the Open University of Sri Lanka and Universitas Terbuka Indonesia. Asian Association of Open Universities Journal, 1(1), 13–35. https://doi.org/10.1108/AAOUJ-09-2019-0038 
