Research articles

A MOOC approach for training researchers in developing countries


Ravi Murugesan,

Andy Nobes,

Joanna Wild


We report on an online course in research writing offered in a massive open online course (MOOC) format for developing country researchers. The concepts of cognitive presence, teacher presence, and social presence informed the design of the course, with a philosophy of strong social interaction supported by guest facilitators. The course was developed with low-bandwidth elements and hosted on a Moodle site. It was offered twice as a MOOC and 2830 learners from more than 90 countries, mainly in the developing world, took part. The average completion rate was 53%. Female learners and learners who were active in the forums were more likely to complete the course. Our MOOC approach may be a useful model for continuing professional development training in the developing world.

How to Cite: Murugesan, R., Nobes, A., & Wild, J. (2017). A MOOC approach for training researchers in developing countries. Open Praxis, 9(1), 45–57. DOI:
  Published on 01 Jan 2017
 Accepted on 14 Feb 2017            Submitted on 7 Nov 2016


Knowledge advances through scholarly research, and communication is essential for this advancement to happen. However, researchers in developing countries face multiple challenges in publishing their work in peer-reviewed journals: they often lack access to mentors, have limited opportunities for research funding, have poor access to literature (Nchinda, 2002), and lack training in research writing skills (Langer, Díaz-Olavarrieta, Berdichevsky & Villar, 2004). There is a basic “information inequality” in knowledge of publishing practices (such as open access), international ethical standards, and exploitative practices such as so-called “predatory journals” (Jones, 2015). Researchers in the Global South may be detached from the information-rich dialogues on scholarly publishing that tend to take place in the North (Nobes, 2016a). Despite lacking access to continuing professional development (CPD) training in research communication, developing country researchers are driven by the same “publish or perish” culture as in developed countries, and they remain acutely aware of the importance of scholarly research to their countries’ development.

In 2006, the AuthorAID concept was developed to bridge the publishing gap between the developed and developing world (Freeman & Robbins, 2006). Since then, the AuthorAID project at INASP, an international development charity in the UK, has supported developing country researchers through mentoring, training workshops, online courses, e-resources, and institutional partnerships (Nobes, 2016b). In 2011, our pilot online course in research writing was successfully offered to a group of 28 Rwandan researchers with a 90% completion rate (Murugesan, 2012). During 2012 and 2013, we saw a steady increase in the demand for this course. In 2014 and 2015, we ran the course for 267 and 356 learners, respectively. Over this period, we began to notice that the gender balance among learners in our online courses was consistently better than what is typical at face-to-face workshops we support in developing countries, in which female researchers are often underrepresented. A participant in an AuthorAID course in 2013 commented that online training may be especially accessible for women: “Women have multiple responsibilities with work and family. Sometimes they can’t think of taking time away from their job and their children” (Owens, 2013).

In 2015, seeing the rapidly increasing demand from developing countries for training in research writing, and the accessibility of online courses to both male and female researchers, we were encouraged to develop a massive open online course (MOOC) approach.

There is already some evidence that researchers are more likely than the general population to take part in online courses and MOOCs. For example, Aboshady et al. (2015) found a high awareness of MOOCs in a survey of Egyptian medical students: 30% had enrolled in at least one MOOC. Research on MOOCs in developing country contexts is, however, sparse (Castillo, Lee, Zahra & Wagner, 2015) and mostly limited to the participation of developing country users in western MOOCs (Garrido et al., 2016; Christensen et al., 2013; Ho et al., 2015), where there tends to be low participation from Asia and especially Africa (Liyanagunawardena, Williams & Adams, 2013). We know, however, that developing country participants are generally more likely to complete MOOCs than those from developed countries (Garrido et al., 2016), especially when learning specific job skills (Christensen et al., 2013).

There have been some efforts to adapt western MOOCs to developing country audiences, following the realisation that high-bandwidth elements such as video lectures are not practical and that existing platforms need to be “open-sourced” for adaptation (Lieber, 2013). However, MOOC usage can be restricted by poor infrastructure, as highlighted by Liyanagunawardena et al. (2013), who note that even where there is good Internet connectivity, poor digital literacy skills pose a barrier. The problem is intensified when online courses and MOOCs use high-end technology (Warusavitarana, Lokuge Dona, Piyathilake, Epitawela & Edirisinghe, 2014).

The early promise of MOOCs centred on their potential to democratise education and reach people with limited learning opportunities. However, it soon became obvious that this ambition could not be realised with a “one-size-fits-all” MOOC approach: one that (1) fails to take into account the learners’ circumstances and the context in developing countries; (2) provides little or no guidance and support for the learners (Patru & Balaji, 2016); and (3) ignores language challenges (Liyanagunawardena et al., 2013). MOOC providers need to adapt to the cultural contexts and needs of users (Castillo et al., 2015; Daniel, Vázquez & Gisbert, 2015), but it is not yet known what pedagogical approach is most effective—we are very much in an “experimental phase” for MOOCs in developing countries (Wildavsky, 2015).

To design the AuthorAID research writing MOOC for audiences in developing countries, we were guided by the pedagogical model developed by Garrison (2007) and our organisational knowledge of the socio-political and cultural contexts of researchers in developing countries in Africa and Asia. In this paper, we explain (1) the approach that guided course design and development; (2) the rationale behind the choice of technology, content, and methods to support interaction; (3) how we implemented the course; and (4) the evaluation method and results. Finally, we reflect on the strengths and limitations of our approach.

Course design and development

Given the lack of recognised pedagogical models for designing MOOCs specifically for developing countries, we reviewed existing models of online pedagogy to guide our work. Because our MOOC had to support low-bandwidth connections, we could not rely on multimedia and video content to motivate and attract learners. We had to provide high-quality content in a largely textual format. Therefore, we needed a strong pedagogy to support the learners’ interaction with the content, facilitators, and peers, and thus to create an engaging learning experience.

There are several models of online pedagogy that have strong social elements. Mayes’s Conceptual Learning model (Mayes & Fowler, 1999) is based on stages of “conceptualisation” and “construction” through the content first, before the stage of “dialogue” is introduced where deep learning takes place via student/teacher and student/student interaction. Laurillard’s (2002) Conversational Framework looks at higher level learning through dialogue at a theoretical and practical level: through discussion, adaptation, interaction, and reflection. Salmon’s (2011) Five Stage Model is based on increasingly high social interaction and reflective dialogue between students and facilitators, through multiple stages of tasks. However, based on our experiences with previous online courses, the model that seemed to best fit our situation was Garrison’s (2007) Community of Inquiry model.

According to Garrison, Anderson and Archer (1999), “presence” is the key in online interactions, which makes learning “deep and meaningful” (Garrison & Cleveland-Innes, 2005) and leads to high student engagement, and therefore achievement (Oblinger, 2014). In the Community of Inquiry model (Garrison, 2007), the learners’ experience is shaped by three elements: cognitive presence, social presence, and teacher presence. Cognitive presence is the learner’s ability to construct knowledge and negotiate meaning, with the learner actively engaging with the content through reflection and discourse with others. Social presence refers to connecting with others in the course, relationship building, and being able to engage in purposeful activities together despite the lack of face-to-face interaction. Finally, teacher presence comes across both in the design, through explicit signposting and guidance, and in the facilitation activities and support offered to the learners.

Throughout the course design and development process, we referred to Garrison’s model to support our decision-making. In the next section, we outline the rationale behind the pedagogic choices we made with respect to the course content, facilitation, and opportunities for peer-learning and peer-interaction.

Cognitive presence – simple design and engagement with content

The content we developed for the course was deliberately text-based and low-bandwidth, considering the challenges related to Internet connectivity and digital literacy skills in developing countries. Then we focused on ways to engage learners with the topic and create opportunities for discussion, reflection, and practice. As most learners who take our courses have full-time responsibilities, we designed the course to not take more than three to four hours of study time per week.

The course was divided into sections: (1) course induction, with introductory information and activities to help participants make a strong start; (2) discussion forums containing thematic forums related to the course topics; (3) one section for each of the five key topics covered in the course; and (4) a wrap-up section containing the participant feedback form among other things. The five key topics were literature review, research ethics, writing a research paper (parts 1 and 2), and publishing a paper, and each topic was allocated one week in the course schedule (see Figure 1).

Figure 1 

Screenshot of a key topic in the course

The course content was in the form of 12 lessons, with two to three lessons for each of the five key course topics. Reflective questions embedded in the lessons served to check learners’ understanding, support meaning-making, and reinforce learning; learners were provided with instant feedback and commentary on the questions. Cognitive presence can often be initiated through “triggering” events (Garrison, 2007). The lessons engaged the learners by introducing controversial topics such as research ethics and paper authorship, which organically led to lively discussions in the forums.

Each course topic had a check-your-understanding quiz made up of multiple-choice questions. Unlimited attempts were allowed on each quiz, and the answer key was revealed at the end of the course. Learners were required to pass each quiz by scoring at least 80% in order to receive a course completion certificate.

There were two writing activities in the course which included peer assessment. Peer assessment helps to develop cognitive presence through higher-level learning (Nagel & Kotzé, 2010) and social presence through co-learning with other participants. Learners benefit when they spend time on the “cognitively demanding” exercise of reviewing other learners’ work (Nagel & Kotzé, 2010), which, in turn, should cause them to reflect on their own approach to the activity—this is a form of metacognitive knowledge construction through collaborative learning (Akyol & Garrison, 2011), which we have also noted in our recent work (Wild, Murugesan, Schaeffler & Powell, 2016). Furthermore, in a MOOC, it is usually not possible for the teacher to give personalised feedback on the learners’ work. Through peer assessment, learners have the opportunity to get feedback on their work without depending on the teacher. In the MOOCs reported in this paper, the learners were asked to write a short essay on research ethics in the first writing activity and a research abstract in the second activity. After the submission phase, the activity moved to the assessment phase: every learner who submitted their work was randomly allocated three of their colleagues’ submissions for assessment and given an assessment form.
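The paper does not specify how the random allocation was computed by the learning platform, but the constraints described (each submitter assesses three colleagues’ submissions, never their own) can be met with a simple shuffled round-robin. The Python sketch below is purely illustrative, with hypothetical learner names:

```python
import random

def allocate_peer_reviews(submitters, reviews_per_person=3):
    """Assign each submitter three colleagues' submissions to review.

    Shuffle the submitters into a random circular order, then give
    reviewer i the submissions of the next `reviews_per_person`
    people in that order. Nobody reviews their own work, and every
    submission receives exactly `reviews_per_person` reviews.
    Requires len(submitters) > reviews_per_person.
    """
    order = submitters[:]
    random.shuffle(order)
    n = len(order)
    assignments = {}
    for i, reviewer in enumerate(order):
        assignments[reviewer] = [
            order[(i + k) % n] for k in range(1, reviews_per_person + 1)
        ]
    return assignments

# Example with five hypothetical learners
peers = ["Amina", "Bashir", "Chen", "Devi", "Esi"]
plan = allocate_peer_reviews(peers)
```

A circular scheme like this also keeps the reviewing load balanced, since each submission is assessed exactly as many times as each learner assesses.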

Teacher presence – guest facilitation and focused use of video

We considered teacher presence to be an important aspect of the course. Only two members of the AuthorAID team were in charge of running the MOOCs, yet the topic of research writing lends itself to deep discussion that two people could not sustain at scale. To meet this challenge, we formed a team of “guest facilitators” drawn from (1) the AuthorAID network of voluntary mentors from our mentoring scheme; (2) researchers in developing countries who had attended AuthorAID train-the-trainers workshops; and (3) high achievers from previous online courses. Their main role was to respond to questions and take part in conversations in the discussion forums, and they were provided with detailed guidelines on how to do this. Guest facilitators’ posts were marked with an AuthorAID logo to distinguish them from participants’ posts. In our weekly announcements, we encouraged the learners to engage in the forums, reminding them that a team of guest facilitators was always on hand.

This approach has been tried in MOOCs elsewhere: for example, a team of 800 volunteers with basic subject knowledge encouraged learning connections in a MOOC with over 20,000 participants (Ramirez, 2014), and academic staff or “student ambassadors” have served as MOOC facilitators to intervene in and guide discussions (Padilla Rodríguez, Armellini & Cáceres Villalba, 2016). Similarly, Lee & Rofe (2016) included completers of a previous course to create a team of “associate tutors”. This model was also used by Redfield (2015), who called returning volunteer students “community TAs”. The difference in our approach is that our guest facilitators are mostly experts in the course topic (research writing), which further enhances the depth of conversations in the forums.

Although we tried to keep the course as light on bandwidth as possible, we were influenced by studies showing that even short videos are effective at establishing the instructor’s teaching presence with students (Jones, Naugle & Kolloff, 2008; Pan et al., 2012) and could potentially contribute to all three types of presence in Garrison’s model (Garrison & Arbaugh, 2007). We therefore experimented in our first MOOC with a short introductory video featuring the two AuthorAID team members in charge of the course, in order to “humanise” their online interaction with the course participants. This was extended in our second MOOC to include a recorded discussion between the two AuthorAID team members and five guest facilitators.

Social presence – structured and facilitated forum interaction

We took care to create social presence early in the course by initiating personal introductions and peer-to-peer interaction in the forums. Social presence is achieved when participants are able to project their personal characteristics into the community, as “real people” (Garrison et al., 1999). Forum participation is vital in creating a sense of community and social presence, which if done successfully can lead to higher student performance (Hostetter & Busch, 2013). As Salmon (2011) points out, facilitators also have an important role in managing a learning community and socialisation. By engaging guest facilitators who were mostly subject-matter experts, we hoped to strengthen both the cognitive and the social aspects of the course.

Asynchronous online discussions can actually achieve higher cognitive value than face-to-face communication (Meyer, 2003), but it is important for the online discourse to be “structured and cohesive” (Garrison & Cleveland-Innes, 2005). In our MOOCs, we released the course materials in a staggered fashion: the materials for the first two weeks opened only at the start of those weeks, and the remaining materials opened in the third week. This was to keep the learners focused on one set of related topics at a time. However, in the first MOOC, we decided to ignite discussion by opening all the main discussion forums (one for each course topic) at the start of the course. Unfortunately, this made the forums somewhat chaotic, with many questions about topics covered in lessons not yet available. Hence, in the second MOOC, we structured the forums in the same way as the weekly course topics, providing access to only one forum per week until the third week, by which time an etiquette of forum interaction had emerged.

Course implementation and evaluation

The course was implemented on INASP’s Moodle site (Moodle version 2.6). Moodle is open-source educational technology widely used in both developed and developing countries. The course lessons were developed using eXeLearning (eXeLearning version 2.0), an open-source content authoring tool. The content in the lessons is almost entirely text-based and therefore suitable for low-bandwidth connections common in developing countries. The content is licensed under Creative Commons and can be downloaded and viewed offline. Quizzes and peer assessment activities were developed using the Moodle “quiz” and “workshop” tools, respectively. Google Hangouts was used to record the video discussions.

The first MOOC (MOOC #1 henceforth) ran from October to November 2015, and MOOC #2 ran from April to May 2016. Both were six weeks long. For both the MOOCs, an open call for enrolment was publicised over a month before the start date through the AuthorAID website (visited by over 50,000 users annually), social media accounts (Facebook and Twitter), and email to the INASP network of institutions and partner organisations (INASP, 2016).

To evaluate the learners’ experience of the MOOCs, we collected both qualitative and quantitative data—the former through a participant feedback survey at the end of each course and the latter through learning analytics. The feedback survey, made with the Moodle “feedback” tool, contained several questions with answer choices on a 5-point Likert scale plus a “not applicable” option, as well as open-ended questions. The participants were asked to give feedback on (1) the course structure and overall relevance of the course; (2) the course administrators and facilitators; (3) the lessons, activities, and discussion forums; (4) logistics and technology—for example, the device they used to study and the factors that disrupted their work on the course; and (5) overall matters.

Moodle offers learning analytics to help teachers understand participation dynamics and performance levels. These data allowed us to explore associations between learner characteristics and course outcomes. We used MS Excel and the statistical computing application R for quantitative analysis covering the whole population of learners (without any sampling).

Results and discussion

In reporting the results below, we focus on the three types of presence (cognitive, social, and teacher) and evidence of (1) the learners’ ability to deeply engage with the content and negotiate meaning through the learning platform; (2) relationship building and learning together and from each other through peer interaction; and (3) facilitation presence encouraging learner engagement and supporting learning in the course. We also report on results from a follow-up survey and the course completers’ perceptions of the role that the course played in helping them get published.

Typically in MOOCs there is a high dropout rate right at the beginning of the course, and completion often averages around 3% to 15% (Hollands & Tirthali, 2014). One of the key indicators of success of our approach would be, therefore, a healthy completion rate. In both our MOOCs, the learners had to complete a background information form in order to access the course material. Completion of the background information form was the first step towards actual participation in the course, so in our analysis we focused on this base of “actual learners”.

In MOOC #1 there were 1275 actual learners among the 1752 people who enrolled. In MOOC #2, these numbers were 1555 and 3033, respectively. The total of 2830 actual learners came from 95 countries predominantly in the developing world: a little over 50% from Africa, another 40% from Asia, and most of the remainder from the Middle East and Latin America. Highly represented countries—with more than 100 learners per country—were Sri Lanka, India, Nigeria, Kenya, Ghana, Nepal, Egypt, Somalia, the Philippines, and Uganda. Table 1 shows the completion rate in the two MOOCs; the average rate was 53%.
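The 53% figure is the learner-weighted average of the two completion rates rather than a simple mean. A quick check in Python (completer counts are reconstructed here from the rounded rates in Table 1, so they are approximate; elsewhere the paper reports 596 completers for MOOC #1):

```python
# Figures from Table 1: (no. of actual learners, completion rate) per MOOC
cohorts = [(1275, 0.47),   # MOOC #1
           (1555, 0.58)]   # MOOC #2

completers = sum(round(n * rate) for n, rate in cohorts)  # ~599 + ~902
learners = sum(n for n, _ in cohorts)                     # 2830
overall_rate = completers / learners

print(f"{overall_rate:.0%}")  # prints 53%
```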

Table 1

Actual learners and completion rates

Course | No. of actual learners (% women) | % of course completers (% women) | No. of developing countries represented by the completers (women)
MOOC #1 | 1275 (45%) | 47% (49%) | 51 (40)
MOOC #2 | 1555 (46%) | 58% (61%) | 52 (42)

In addition to the high completion rates, the results by gender are worth noting. The participation rate of 45%–46% women would seem to buck the trend of research showing that female participation from developing countries in MOOCs is low: Christensen et al.’s (2013) study of Coursera MOOCs found that only 37% of participants from developing countries were female, in contrast with US participants, who were 49% female. Ho et al. (2015) found that as few as 31% of developing country participants on HarvardX and MITx courses were female.

The completion rate for female learners was higher than that for male learners in both the MOOCs (Table 1), and to check whether this difference is statistically significant we ran Pearson’s chi-squared test in R using the data shown in Table 2.

Table 2

Contingency table for gender and course completion (both the MOOCs combined)

Gender | Participants who did not complete the course | Participants who completed the course
Female | 571 | 712
Male | 761 | 786

Because the p value obtained (0.014) is less than the significance level (0.05), we can reasonably conclude that there is an association between gender and course completion: female learners were more successful at completing the course. This result is consistent with Garrido et al. (2016), who found that female MOOC participants in certain developing countries (Colombia, South Africa, and the Philippines) are more likely than men to be completers.
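The analysis above was run in R, whose chisq.test() applies Yates’ continuity correction to 2×2 tables by default. For readers who want to reproduce the result, the closed-form 2×2 version of the test can be written in a few lines of standard-library Python (for df = 1 the survival function is p = erfc(√(χ²/2))):

```python
from math import erfc, sqrt

def chi2_yates_2x2(a, b, c, d):
    """Pearson's chi-squared test with Yates' continuity correction
    for the 2x2 contingency table [[a, b], [c, d]].
    Returns (chi2 statistic, p-value); for df = 1 the survival
    function is p = erfc(sqrt(chi2 / 2))."""
    n = a + b + c + d
    chi2 = n * (abs(a * d - b * c) - n / 2) ** 2 / (
        (a + b) * (c + d) * (a + c) * (b + d))
    return chi2, erfc(sqrt(chi2 / 2))

# Table 2: female (not complete, complete), male (not complete, complete)
chi2, p = chi2_yates_2x2(571, 712, 761, 786)
print(round(p, 3))  # prints 0.014
```

The same function applied to the forum-participation data in Table 3 (1157, 665, 175, 833) yields a p-value many orders of magnitude below 2.2e-16, matching the result reported later for that comparison.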

In general, feedback from the participants suggests the course content was engaging: 90% of the 1475 respondents (in the two MOOCs combined) completely agreed that the course was relevant to their learning needs and 83% completely agreed that the interactive e-learning format used for the lessons was better than plain text or static content. Further, 93% completely agreed that the weekly quizzes helped them validate their understanding of the course content. Participants also appreciated that the lessons were downloadable and could be reviewed in the future, and the fact that they were written with non-native speakers in mind: “The lessons were straight-forward with simplified English which makes understanding very easy”. We must add a caveat here that nearly everyone who gave feedback was a course completer.

In our choice of technology we took great care to ensure that the course was accessible to researchers in countries with weak Internet connectivity. Although 32% of feedback respondents still experienced significant disruption to their participation because of Internet connectivity, it is worth mentioning that among the course completers were 115 researchers from countries considered fragile: the top 12 countries on a formal index (List of countries by Fragile States Index, n.d.), plus Sierra Leone, Liberia, Myanmar, Burundi, and West Bank and Gaza/Palestine.

It is worth noting that when it came to peer assessment, 48% of respondents named this activity as one of the most useful elements of the course and 71% either completely or partly agreed that they received useful feedback from their peers. Several learners expressed interest in having more peer assessment activities. Some commented on why doing the review was helpful to them: “It gives me the experience to be the student and the teacher as well. It validates what I have learned and I share that learning to my colleagues”. Others said they benefited from the feedback they received: “I am satisfied with the assessments and I feel energised and motivated to complete my journal paper”.

The social component of the course was driven by active forums. Of the 2830 learners in the two MOOCs, 1088 (36%) made at least one post in the forums, and there were 9456 posts in all. This compares favourably to other MOOCs. Hill (2013) has argued that centralised discussion forums do not scale in MOOCs, using examples such as the first edX MOOC where forums were used only by 3% of the learners (Breslow et al., 2013) and the first Vanderbilt MOOCs where forum participation ranged from 4% to 22% (Bruff, 2013).

Feedback for the social component was positive: 63% completely agreed that they learnt new things from peers on the forums, with 23% partly agreeing; 45% thought that the forums were one of the most useful elements of the course; 54% planned to stay in touch with other participants after the course; and 19% reported that they had already started discussions with other participants about possible research collaboration, showing the strong social connections that had been established. The level of participation and interaction was praised: “The large volume of participation in itself was a very good thing” and “I thoroughly enjoyed how interactive the course was, particularly the discussion forums”.

To see how forum participation influenced success on the course, we put together the data shown in Table 3.

Table 3

Contingency table for forum posts and course completion (both the MOOCs combined)

No. of forum posts | Participants who did not complete the course | Participants who completed the course
Zero | 1157 | 665
One or more | 175 | 833

Only 36% of the learners who made zero posts (665 out of 1822) completed the course, in stark contrast to 83% of those who made at least one post (833 out of 1008). Unsurprisingly, this difference is statistically significant (p < 2.2e-16; Pearson’s chi-squared test). Men and women were almost equally likely to participate in the forums: 35% of female and 36% of male learners made at least one post.

It seems reasonable to attribute the success of the discussion forums to the strong teacher presence provided by the team of guest facilitators. In both the MOOCs, the ratio of the number of learners’ posts to the guest facilitators’ posts was encouraging (Table 4).

Table 4

Count of forum posts in each MOOC

Course | No. of guest facilitators (% women) | No. of countries represented by guest facilitators (women) | No. of guest facilitators’ posts (% women) | No. of learners’ posts (% women) | Ratio of learners’ to guest facilitators’ posts
MOOC #1 | 18 (39%) | 13 (7) | 796 (22%) | 4002 (44%) | 5.03
MOOC #2 | 20 (50%) | 16 (8) | 1073 (45%) | 5454 (39%) | 5.08

Feedback on the guest facilitator input was positive: 61% completely agreed that guest facilitator posts in the forums “contributed significantly to my knowledge” and 23% partly agreed. One participant commented that “It doesn’t seem like self-learning at all, thanks to the easy to use features, design and amazing level of interactions. The replies by guest facilitators is a course in itself”.

The facilitator panel discussion video we included in the middle of MOOC #2 was also well received: “It was nice to put faces and voices to some of the facilitators which gave me a sense of nearness”.

Early results from a follow-up survey

We administered a follow-up survey to the 596 completers of MOOC #1 ten months after the course ended to find out if they had achieved success in what the course had trained them for, that is, writing and publishing papers. We report below the results from an initial analysis of the data.

The 284 survey respondents reported having published 148 journal articles after the course, and 74% of the respondents who managed to publish a paper in the time elapsed felt that the course “helped a lot” with their publishing endeavours. Some expanded on their responses, revealing that the course provided a boost in confidence to write up and submit their paper:

  • “The course has helped me to have self-confidence in scientific writing. After the course, I wrote two papers and both are already published. Now, I am preparing for another one!”
  • “Before the online training, I had carried out my fieldwork but had been unable to consolidate it into a report or publication. However, after the training I acquired confidence in writing and within two months, I was able to submit a manuscript. So far I have been able to publish 2 papers in peer refereed journals.”

Although female learners were more successful in completing the course, better performance on a training programme does not necessarily translate into better post-training success: 46% of male respondents had published at least one paper after the course, but only 36% of female respondents had done so.

We plan to do an in-depth analysis in the near future to compare the participants’ post-course publication success with baseline data from the background information form and to understand the reasons for the gendered difference in success.


Conclusions

Although MOOCs are now commonplace, the literature review we conducted revealed a lack of recognised models for developing such courses specifically for the context and needs of developing countries. We used Garrison’s Community of Inquiry model and our experience of working in Africa and Asia to guide our design decisions and, hopefully, create an engaging and social learning experience. Below, we reflect on the strengths, challenges, and limitations of our work.

Strengths of our approach

The completion rate of 58% in MOOC #2 can be considered high in the MOOC context. This may be partly because developing country learners are more likely to complete MOOCs (Garrido et al., 2016), but we also believe that two critical factors contributed to the high completion rate in both the MOOCs we offered: (1) we designed the course for a specific target audience, keeping in mind their expectations and constraints; and (2) the pedagogical model we adopted helped us develop learning activities in such a way that deep learning and meaning-making were facilitated through structured social interaction and driven by the strong presence of expert facilitators. These factors may be largely responsible for the following positive outcomes:

  1. The evaluation results from both the MOOCs show that female researchers were not only well represented among the course participants but also more successful at completing the course. Therefore, this MOOC approach to CPD training is encouraging from a gender equity perspective.
  2. By tapping into our network of experts, providing them with clear guidelines that made guest facilitation an interesting activity (without excessive responsibility on any single guest facilitator), and incentivising the work of facilitation through certificates and badges, we were able to provide a healthy level of teacher presence on the MOOCs without incurring any financial cost.
  3. The more structured social interaction in MOOC #2 may have contributed to its higher completion rate compared with MOOC #1. Because the forum for each course topic opened in sync with the relevant course material, the forum posts appeared to be of higher quality than in MOOC #1. Guest facilitators who served on both MOOCs observed that in MOOC #2 learners asked more thoughtful questions after reading the lessons, rather than posting to the forums without prior preparation.
  4. A substantial number of participants were engaged enough in the course to commit to peer assessment activities, which contained cognitive and social learning aspects. The task was designed to relate directly to CPD in that it modelled a real-life episode participants would be likely to face: journal peer review. Some participants reported that the activity actually gave them confidence to submit their research to a journal.

Challenges and limitations

Taking stock of our MOOC approach and results, we have identified some key challenges and limitations.

  1. It has been difficult to collect feedback from those who did not complete the course, so we do not have a good understanding of why 40% to 50% of learners dropped out of the course after making a start on it. Perhaps following up with them in a personalised way and arranging phone interviews will provide useful information.
  2. The peer assessment activities did not suit everybody. The majority of the course completers (79%) in the two MOOCs participated in both the submission and assessment phases of at least one peer assessment activity (and thus received a “merit” grade instead of the basic “pass”), but there was often confusion about how this activity worked. The out-of-the-box functionality of the Moodle “workshop” tool is not as intuitive as it could be. There were many requests for deadline extensions which we could not offer because of the timed phases of the activity (submission, assessment, closure). Others have commented that strict deadlines in MOOCs fail to account for developing country problems such as intermittent Internet and electricity (Bali, 2014). Hopefully, our decision to make this an optional but recommended activity partly addresses this issue. We are also looking into other tools for peer assessment that can be integrated in a Moodle course.
  3. Initial analysis of the follow-up survey of MOOC #1 suggests that women have not achieved as much success as men in terms of papers published. This indicates that, at least in the contexts in which we work, some form of gender-targeted intervention beyond training may be needed to support women in achieving impact.
  4. Certain learner groups may be less successful in completing our MOOC than others. We plan to carry out a detailed analysis to identify these learner groups and to ascertain what can be done to improve the course outcomes on the whole and for different learner groups.

We intend to continue using the pedagogical model described in this paper for our MOOCs in the future, and we shall use data from learning analytics, learner feedback, and impact evaluation to improve our course design and delivery. We believe that the MOOC approach reported in this paper may be suitable for providing CPD training in other topics that are in high demand in the developing world.


Acknowledgements

The authors thank Ruth Bottomley and Julie Walker of INASP, who helped develop the strategy behind the MOOC approach reported here, and all the guest facilitators who played a vital role in these courses. We also thank the UK’s Department for International Development (DFID) and the Swedish International Development Cooperation Agency (Sida) for funding INASP’s AuthorAID project, which includes the MOOCs described here.


References

  1. Aboshady, O. A., Radwan, A. E., Eltaweel, A. R., Azzam, A., Aboelnaga, A. A., Hashem, H. A., Darwish, S. Y., Salah, R., Kotb, O. N., Afifi, A. M., Noaman, A. M., Salem, D. S. and Hassouna, A. (2015). Perception and use of massive open online courses among medical students in a developing country: Multicentre cross-sectional study. BMJ Open 5(1): e006804.

  2. Akyol, Z. and Garrison, D. R. (2011). Understanding cognitive presence in an online and blended community of inquiry: Assessing outcomes and processes for deep approaches to learning. British Journal of Educational Technology 42(2): 233–250.

  3. Bali, M. (2014). MOOC pedagogy: Gleaning good practice from existing MOOCs. Journal of Online Learning and Teaching 10(1): 44.

  4. Breslow, L., Pritchard, D. E., DeBoer, J., Stump, G. S., Ho, A. D. and Seaton, D. T. (2013). Studying learning in the worldwide classroom: Research into edX’s first MOOC. Research & Practice in Assessment 8.

  5. Bruff, D. (2013, August). Lessons learned from Vanderbilt’s first MOOCs [Blog post].

  6. Castillo, N. M., Lee, J., Zahra, F. T. and Wagner, D. A. (2015). MOOCs for development: Trends, challenges, and opportunities. Information Technologies & International Development 11(2): 35.

  7. Christensen, G., Steinmetz, A., Alcorn, B., Bennett, A., Woods, D. and Emanuel, E. J. (2013). The MOOC phenomenon: Who takes massive open online courses and why?

  8. Daniel, J., Vázquez, E. and Gisbert, M. (2015). The future of MOOCs: Adaptive learning or business model? International Journal of Educational Technology in Higher Education 12(1): 64–73.

  9. Freeman, P. and Robbins, A. (2006). The publishing gap between rich and poor: the focus of AuthorAID. Journal of Public Health Policy 27(2): 196–203.

  10. Garrido, M., Koepke, L., Andersen, S., Mena, A. F., Macapagal, M. and Dalvit, L. (2016). An examination of MOOC usage for professional workforce development outcomes in Colombia, the Philippines, & South Africa. Seattle: Technology & Social Change Group, University of Washington Information School.

  11. Garrison, D. R. (2007). Online community of inquiry review: social, cognitive, and teaching presence issues. Journal of Asynchronous Learning Networks 11(1): 61–72.

  12. Garrison, D. R., Anderson, T. and Archer, W. (1999). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education 2(2–3): 87–105.

  13. Garrison, D. R. and Cleveland-Innes, M. (2005). Facilitating cognitive presence in online learning: Interaction is not enough. The American Journal of Distance Education 19(3): 133–148.

  14. Garrison, D. R. and Arbaugh, J. B. (2007). Researching the community of inquiry framework: Review, issues, and future directions. The Internet and Higher Education 10(3): 157–172.

  15. Hill, P. (2013, September). MOOC discussion forums: Barrier to engagement? [Blog post].

  16. Ho, A. D., Chuang, I., Reich, J., Coleman, C., Whitehill, J., Northcutt, C. and Petersen, R. (2015). HarvardX and MITx: Two years of open online courses fall 2012–summer 2014. HarvardX Working Paper No. 10.

  17. Hollands, F. M. and Tirthali, D. (2014). MOOCs: expectations and reality. Full report. NY: Center for Benefit Cost Studies of Education, Teachers College, Columbia University.

  18. Hostetter, C. and Busch, M. (2013). Community matters: Social presence and learning outcomes. Journal of Scholarship of Teaching and Learning 13(1): 11–86.

  19. INASP (2016). INASP network of countries.

  20. Jones, P. (2015, July). Predatory publishing isn’t the problem, it’s a symptom of information inequality. Perspectives.

  21. Jones, P., Naugle, K. and Kolloff, M. (2008, March). Teacher presence: Using introductory videos in online and hybrid courses. Learning Solutions Magazine.

  22. Langer, A., Díaz-Olavarrieta, C., Berdichevsky, K. and Villar, J. (2004). Why is research from developing countries underrepresented in international health literature, and what can be done about it? Bulletin of the World Health Organization 82(10): 802–803.

  23. Laurillard, D. (2002). Rethinking university teaching: A conversational framework for the effective use of learning technologies. London: Routledge.

  24. Lee, Y. and Rofe, J. S. (2016). Paragogy and flipped assessment: Experience of designing and running a MOOC on research methods. Open Learning: The Journal of Open, Distance and e-Learning 31(2): 116–129.

  25. Lieber, J. (2013). In the developing world, MOOCs start to get real. MIT Technology Review.

  26. List of countries by Fragile States Index (n.d.). In Wikipedia.

  27. Liyanagunawardena, T. R., Williams, S. and Adams, A. A. (2013). The impact and reach of MOOCs: A developing countries’ perspective. eLearning Papers: 38–46.

  28. Mayes, J. T. and Fowler, C. J. (1999). Learning technology and usability: a framework for understanding courseware. Interacting with Computers 11(5): 485–497.

  29. Meyer, K. A. (2003). Face-to-face versus threaded discussions: The role of time and higher-order thinking. Journal of Asynchronous Learning Networks 7(3): 55–65.

  30. Murugesan, R. (2012). Promising outcomes of an online course in research writing at a Rwandan university. European Science Editing 38(3).

  31. Nagel, L. and Kotzé, T. G. (2010). Supersizing e-learning: What a CoI survey reveals about teaching presence in a large online class. The Internet and Higher Education 13(1–2): 45–51.

  32. Nobes, A. (2016a). Open access plays a vital role in developing-country research communication. Eon 9(2): 25.

  33. Nobes, A. (2016b). AuthorAID – supporting early career researchers in developing countries. The Biochemist 38(5).

  34. Nchinda, T. C. (2002). Research capacity strengthening in the South. Social Science & Medicine 54(11): 1699–1711.

  35. Oblinger, D. (2014). Designed to engage. Educause Review 49(5).

  36. Owens, B. (2013). AuthorAID to add online courses for social scientists. SciDevNet.

  37. Patru, M. and Balaji, V. (2016). Making sense of MOOCs: A guide for policy-makers in developing countries. Commonwealth of Learning Guides & Toolkits.

  38. Padilla Rodríguez, B. C., Armellini, A. and Cáceres Villalba, V. C. (2016). Massive open online courses (MOOCs) behind the scenes. In: Kirby, P. and Marks, G. (Eds.), Proceedings of Global Learn 2016: 359–366.

  39. Pan, G., Sen, S., Starrett, D. A., Bonk, C. J., Rodgers, M. L., Tikoo, M. and Powell, D. V. (2012). Instructor-made videos as a scaffolding tool. Journal of Online Learning and Teaching 8(4): 298.

  40. Ramirez, M. S. (2014). Training strategies in team teaching to facilitate the connection of learning in MOOC courses. In: EDULEARN14 Proceedings: 3052–3060.

  41. Redfield, R. J. (2015). Putting my money where my mouth is: the Useful Genetics project. Trends in Genetics 31(4): 195–200.

  42. Salmon, G. (2011). E-moderating: The key to teaching and learning online (3rd ed.). New York: Routledge.

  43. Warusavitarana, P. A., Lokuge Dona, K., Piyathilake, H. C., Epitawela, D. D. and Edirisinghe, M. U. (2014). MOOC: a higher education game changer in developing countries. In: Hegarty, B., McDonald, J. and Loke, S.-K. (Eds.), Rhetoric and Reality: Critical perspectives on educational technology. Proceedings ascilite Dunedin 2014: 359–366.

  44. Wild, J., Murugesan, R., Schaeffler, V. and Powell, A. (2016). Bringing learning closer to the workplace: An online course for librarians in developing countries. Pan-Commonwealth Forum 8 (PCF8) Working Paper.

  45. Wildavsky, B. (2015). MOOCs in the developing world: Hope or hype? International Higher Education 80: 23–25.
