Evaluation of Student Feedback Within a MOOC Using Sentiment Analysis and Target Groups
Lundqvist, Karsten Ø. · Liyanagunawardena, Tharindu · Starkey, Louise

Published: September 2020
Journal: International Review of Research in Open and Distributed Learning
Volume 21, Issue 3, Pages 140-156
Country: Australia, Oceania

ABSTRACT
Many course designers trying to evaluate the experience of participants in a MOOC will find it difficult to track and analyse the online actions and interactions of students because there may be thousands of learners enrolled in courses that sometimes last only a few weeks. This study explores the use of automated sentiment analysis in assessing student experience in a beginner computer programming MOOC. A dataset of more than 25,000 online posts made by participants during the course was analysed and compared to student feedback. The results were further analysed by grouping participants according to their prior knowledge of the subject: beginner, experienced, and unknown. In this study, the average sentiment expressed through online posts reflected the feedback statements. Beginners, the target group for the MOOC, were more positive about the course than experienced participants, largely due to the extra assistance they received. Many experienced participants had expected to learn about topics that were beyond the scope of the MOOC. The results suggest that MOOC designers should consider using sentiment analysis to evaluate student feedback and inform MOOC design.
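The approach the abstract describes, scoring each forum post for sentiment and averaging the scores within prior-knowledge groups, can be sketched in a few lines. This is an illustrative toy, not the authors' code: the tiny word lexicon, the `post_sentiment` and `group_sentiment` helpers, and the sample posts are all hypothetical stand-ins for a real sentiment tool applied to the study's ~25,000 posts.

```python
# Minimal sketch of lexicon-based sentiment averaged per learner group.
# The lexicon and posts below are invented for illustration only.
from collections import defaultdict

LEXICON = {"love": 1.0, "great": 1.0, "helpful": 0.5,
           "confusing": -0.5, "disappointed": -1.0, "missing": -0.5}

def post_sentiment(text):
    """Mean lexicon score over the words in a post (0.0 if none match)."""
    words = text.lower().split()
    scores = [LEXICON[w] for w in words if w in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

def group_sentiment(posts):
    """Average post sentiment per prior-knowledge group."""
    totals, counts = defaultdict(float), defaultdict(int)
    for group, text in posts:
        totals[group] += post_sentiment(text)
        counts[group] += 1
    return {g: totals[g] / counts[g] for g in totals}

posts = [
    ("beginner", "great course the mentors were helpful"),
    ("beginner", "love the weekly exercises"),
    ("experienced", "disappointed the advanced topics were missing"),
]
print(group_sentiment(posts))
```

In practice a validated tool (e.g., a general-purpose sentiment analyser) would replace the toy lexicon, but the grouping-and-averaging step that yields the beginner versus experienced comparison is the same.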

Keywords MOOC · teaching programming · sentiment analysis · target group · feedback · learner analytics

Language: English
ISSN: 1492-3831
Refereed: Yes
Rights: CC BY
DOI: 10.19173/irrodl.v21i3.4783



AVAILABLE FILES
4783-Article Text-33596-2-10-20200914.pdf · 361.6KB









SIMILAR RECORDS

Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies
Means, Barbara; Toyama, Yukie; Murphy, Robert; Bakia, Marianne; U.S. Department of Education, Office of Planning, Evaluation, and Policy Development
A systematic search of the research literature from 1996 through July 2008 identified more than a thousand empirical studies of online learning. Analysts screened these studies to find those that (a) contrasted an ...
Match: evaluation

MOOCs: A systematic study of the published literature 2008-2012
Liyanagunawardena, Tharindu; Adams, Andrew; Williams, Shirley; McGreal, Rory; Conrad, Dianne
Massive open online courses (MOOCs) are a recent addition to the range of online learning options. Since 2008, MOOCs have been run by a variety of public and elite universities, especially in North America. Many ...
Match: liyanagunawardena, tharindu; mooc

Enhanced peer assessment in MOOC evaluation through assignment and review analysis
Alcarria, Ramón; Bordel, Borja; de Andrés, Diego Martín; Robles, Tomás
The rapid evolution of MOOCs in recent years has produced a change in the education of students and in the development of professional skills. There is an increasing pressure on universities to establish procedures for ...
Match: evaluation; mooc

OERu context evaluation
Murphy, Angela
A survey was developed and launched in May and June 2012 to assess the extent to which OERu members are ready for participating in the pilot of the OERu model and determine which challenges organisations are currently ...
Match: evaluation

Critical evaluation of quality criteria and quality instruments in OER repositories for the encouragement of effective teacher engagement
Connell, Marina; Connell, John
This paper offers a short evaluation of the variety of quality criteria used in Open Educational Resources and some of the methods and practices in use to ensure quality. The paper surveys and reviews effective ...
Match: evaluation

Meaningful learner information for MOOC instructors examined through a contextualized evaluation framework
Douglas, Kerrie; Zielinski, Mitchell; Merzdorf, Hillary; Diefes-Dux, Heidi; Bermel, Peter
Improving STEM MOOC evaluation requires an understanding of the current state of STEM MOOC evaluation, as perceived by all stakeholders. To this end, we investigated what kinds of information STEM MOOC instructors ...
Match: evaluation

2010-2011 African health OER network phase 2 evaluation: Consolidation and sustainability
Harley, Ken
As part of the Hewlett Foundation grant for the African Health OER Network, Professor Ken Harley (University of KwaZulu-Natal) conducts an annual external evaluation of the project. For his 2009 evaluation, Prof Harley ...
Match: evaluation

Massive open online courses: A review of usage and evaluation
Sinclair, Jane; Boyatt, Russell; Rocks, Claire; Joy, Mike
The massive open online course (MOOC) has seen a dramatic rise in prominence over the last five years and is heralded by some as disrupting existing pedagogy and practices within the education sector, while others are ...
Match: evaluation

Evaluation of free platforms for delivery of massive open online courses (MOOCs)
Zancanaro, Airton; Nunes, Carolina Schmitt; de Domingues, Maria Jose Carvalho Souza
The hosting, management, and delivery of massive open online courses (MOOCs) require a supporting technological infrastructure. Various educational institutions do not have or do not wish to invest in ...
Match: evaluation; mooc

Scaffolding-informed design of open educational resources in Chinese secondary school mathematics: insights from multi-cycle formative evaluation
Huang, Xiaowei; Lo, Chung Kwan; He, Jiaju; Xu, Simin; Kinshuk
In the post-pandemic world, open educational resources (OER) have the potential to ensure educational equity by providing all students with access to learning materials and by supporting teachers’ instructional ...
Match: evaluation