Doctor in Waiting


Year Two: Methods for Research in Main Study

February 25, 2016 · hrgore
  4. Conceptual Design of Methodology

4.1 Methods and Instruments Applied for Main Study

Though there is a range of methods that can be used in research, such as interviews, focus groups, and observations, the volume of learners undertaking open courses means the most suitable method is an online survey, as a survey can collate both quantitative and qualitative large-scale data for analysis. This became evident in the execution of the initial study due to the length of time it took to contact learners, arrange interview times, carry out interviews, and transcribe notes. There are a number of attractions in using a survey: one-shot data gathering, wide population representation, the ability to ascertain correlations, accurate data capture, and statistical data processing (Morrison, 1993: 38-40).

The self-completion survey is to be hosted online as the sample population of learners enrol on and study open courses online, so the required demographic is suitably targeted. In a change to the original strategy for the main study, learners who have completed start-of-course surveys for courses hosted on both FutureLearn and OpenLearn will be targeted. The response data for individual courses can then be analysed for comparison, and also collated for collective comparison of open online courses hosted on OpenLearn and FutureLearn. As the platform functionality and features differ, this will aid in developing an understanding of whether the presence or absence of such features impacts on engagement and learning design.

The survey includes a combination of nominal data (for comparison with ‘traditional’ MOOC data (Jordan, 2014) to ascertain whether this open course community of informal learners is different) and scaled questions to establish participants’ attitudes towards course engagement and learning design. Capturing large-scale data through an online survey will help determine factual information: preferences, attitudes, behaviour, experiences, and beliefs (Weisberg et al., 1996).

The design of the survey has taken into consideration Hoinville and Jowell’s (1978) three prerequisites of survey design: purpose of inquiry, population specification, and resources available. The survey questions strongly address the research questions of engagement, disengagement, and learning design. Three populations of learner strategically aligned to the JIFL journey have been identified (addressed in the Participants and Samples Chosen for Main Study section below), and the survey is to be hosted in the three said locations online within the research timescales. Concern for participants (Sapsford, 1999: 34-40) has been taken into consideration by ensuring the anonymity of participants, and the fourteen-stage process identified by Cohen et al. (2009: 209) was followed.

The main study will also include the development of interview questions to be conducted on an individual 1:1 basis or as part of a focus group. For this purpose, and to confirm, clarify, and question any commonalities, trends, and anomalies, the final survey question allows participants to submit their personal details for further contact.

The evaluation report data from the first ten MOOCs presented by The OU on FutureLearn will also be reviewed in conjunction with the demographic data collected by the survey as historical documentary research to ascertain whether non-MOOC (therefore JIFL prospective) learners are being successfully targeted.

4.2 Participants Chosen

Due to the range of research being undertaken by various academics at The Open University with regards to FutureLearn MOOC data, careful consideration and discussion was required as to how the participants for this study were to be selected. It was agreed that a random sample of 500 participants from each of the courses currently hosted on FutureLearn would be selected. Where multiple presentations existed, the random sampling would incorporate all cohorts to ensure that the level of repeat contact by other research studies would be kept to a minimum. As data is continually collected in new and repeating presentations, the pool of participants will continue to expand, further reducing the likelihood of repeat contact.
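As a minimal sketch of this sampling step, assuming enrolment records can be exported as a table with course, presentation, and learner identifiers (the file and column names below are hypothetical, not the actual extraction pipeline), the per-course random sample could be drawn along the following lines:

```python
# Minimal sketch of the per-course sampling described above. Assumes enrolment
# records are exported to a CSV with hypothetical columns 'course_id',
# 'presentation' and 'learner_id'.
import pandas as pd

SAMPLE_SIZE = 500  # random sample of 500 learners per course

enrolments = pd.read_csv("enrolments.csv")

samples = []
for course_id, course_df in enrolments.groupby("course_id"):
    # Pool all presentations (cohorts) of the course before sampling, so the
    # chance of repeat contact with learners already approached elsewhere is reduced.
    pooled = course_df.drop_duplicates(subset="learner_id")
    n = min(SAMPLE_SIZE, len(pooled))
    samples.append(pooled.sample(n=n, random_state=42))

survey_sample = pd.concat(samples, ignore_index=True)
survey_sample.to_csv("survey_sample.csv", index=False)
```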

Within the main study survey there is a further opportunity for participants to submit their personal contact information for further research. It is from this that participants will be selected for follow-up interviews or focus groups.

4.3 Data Collection

The data from the survey will be collected via Qualtrics and held securely within the account for analysis via SPSS (Statistical Package for the Social Sciences); both have single account holder access. The survey allows participants to remain anonymous unless contact details are submitted in the final question. If contact details are submitted and then used for follow-up telephone or Skype interviews, the same process of storage and number allocation is followed as in the initial study.
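The number-allocation process carried over from the initial study is not reproduced here, so the following is only an illustrative sketch of one way the exported responses could be pseudonymised before analysis (the file and column names are assumptions):

```python
# Illustrative pseudonymisation sketch only; not the study's actual procedure.
# Assumes a hypothetical 'email' column in the exported survey responses.
import pandas as pd

responses = pd.read_csv("qualtrics_export.csv")

# Allocate a sequential participant number to each respondent who supplied
# contact details, and keep the email-to-number lookup in a separate,
# access-restricted file so the analysis data set itself stays anonymous.
contactable = responses[responses["email"].notna()].copy()
contactable["participant_no"] = range(1, len(contactable) + 1)

contactable[["email", "participant_no"]].to_csv("contact_lookup.csv", index=False)
responses.drop(columns=["email"]).to_csv("responses_anonymised.csv", index=False)
```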

The current blog ‘Doctor in Waiting’ will be utilised further as a research journal (Burgess, 1984b) documenting the decision-making processes of the content analysis. Grounded theory (Strauss, 1987; Strauss and Corbin, 1990; 1997) will be applied, using thematic analysis to fragment and isolate data from the qualitative responses within the surveys and interviews. If a higher number of interviews than expected is conducted within the main study, the software package QSR NVivo will be utilised to aid the extrapolation of attitudes and themes from the qualitative data. Following the methodology of Stroh (2000), the text will be chunked, labelled, and coded to organise the data for analysis.
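As a toy illustration of that chunk/label/code sequence (in practice the coding frame would be developed iteratively, and NVivo used for anything at scale), a first-pass keyword coding might look like the sketch below; the codes and keywords are invented examples, not the study’s actual coding frame:

```python
# Toy first pass at chunking, labelling, and coding free-text responses.
import re
from collections import defaultdict

CODING_FRAME = {
    "social_engagement": ["discussion", "comment", "other learners"],
    "video_content": ["video", "audio", "clip"],
    "time_pressure": ["time", "busy", "deadline"],
}

def code_response(text):
    """Chunk a free-text response into sentences and label each chunk."""
    coded = defaultdict(list)
    for chunk in re.split(r"(?<=[.!?])\s+", text.strip()):
        for code, keywords in CODING_FRAME.items():
            if any(keyword in chunk.lower() for keyword in keywords):
                coded[code].append(chunk)
    return dict(coded)

print(code_response("The videos were great. I had no time for the discussion."))
```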

Though the data collected from the OpenLearn and FutureLearn OOCs are stored separately, they will be analysed for prevalent themes and patterns both within the individual platform and course data sets and collectively. Different levels of analysis will be applied to ascertain whether commonalities occur with regards to engagement, disengagement, preference for learning design elements, etc., and also to clarify whether demographic data and technology use have an effect. Themes will be drawn together to denote whether there is a stronger narrative within the data, and reviewed for recurrence to denote emerging patterns. This analysis is undertaken throughout the stages of research (Bryman, 2001) to ensure that the research questions are being considered and answered.

4.4 Variables and Factors Affecting Study

Creating samples from across the two platforms broadens the respondent demographic, addressing the representation of informal learners interested in open courses, which may produce similar or different results. From these results, themes across all samples may be determined, or it may be found that each sample approaches informal learning differently and therefore has different needs for building and sustaining engagement to completion.

It is also possible that each of the samples may have a very different response rate, meaning that themes may be detected in larger response sets and not in smaller ones, or that it may be difficult to compare a small sample with a much larger one. In this situation it is proposed that a random sample of equal size is taken from each of the sets and the data from this is analysed also.
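A minimal sketch of that equal-size comparison, assuming the combined responses carry a hypothetical numeric group label for the platform surveyed, would be to downsample every group to the smallest group’s size:

```python
# Sketch of the proposed equal-size comparison: downsample every group's
# responses to the size of the smallest set before re-running the analysis.
# The file name and the numeric 'group' column are assumptions for illustration.
import pandas as pd

responses = pd.read_csv("responses_anonymised.csv")

smallest = responses["group"].value_counts().min()
balanced = responses.groupby("group").sample(n=smallest, random_state=1)
print(balanced["group"].value_counts())  # every group now the same size
```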

As part of the survey there will be an opportunity for participants to enter an email address if they wish to participate in further research. The provision of contact information enables further exploration, through a secondary survey or a telephone interview, of the themes or anomalies that become apparent from the findings. At present this research is an unknown entity, as platforms have not previously been surveyed in regard to learning engagement, either individually or collectively for comparison. The surveys will be hosted through Qualtrics, from which the numerical data will be analysed through SPSS and the narrative data analysed using the protocols of content analysis.

Hosting separate (yet identical) surveys on Qualtrics will ensure that the samples are kept separate for comparison studies, with each set labelled numerically to give the platform surveyed anonymity as well as the participants within it. As noted previously, it is expected that the goals for engaging with the courses will be extrinsically different. It is therefore hypothesised that the level of engagement with the individual elements of the course, and with the course as a whole with a view to completion, will differ, with learners on different completion trajectories dependent on external pressures (such as work and study related deadlines). The data groups can then be combined for additional collective analysis to determine whether other factors, such as age or the types of technology used to access the courses, affect levels of engagement.

The limitation in using both surveys and interviews is that there is little flexibility in relating the questions directly to the respondent’s personal circumstances, which may limit the answers given. Through the use of natural language (Kvale, 1996), ‘stimulus equivalence’ (Oppenheim, 1992) may be achieved, whereby each respondent can understand the questions set before them even if they are unable to relate them to their personal circumstances.

However, the use of interview questions, first in the pilot study and then in the main study, can result in unanticipated answers that lead to further connections in data relationships and to the addressing or creation of hypotheses (Cohen et al., 2007). These responses could potentially be categorised within Tuckman’s (1972) seven modes, of which ‘filled in response’ is most likely given the type of questions to be posed to the respondents.

The final limitation of the main study to consider is the current absence of a survey conducted within a presentation of a MOOC. Instead, random samples of participants are being selected from contact information entered into start- and end-of-course surveys of the courses presented. It must be taken into consideration that these learners are already engaged to a degree, having completed optional surveys.

  5. Ethical Considerations

Before conducting any research, an application outlining the research proposal was made to HREC (Human Research Ethics Committee) at The OU. To ensure full compliance, an enquiry about a further submission to SRPP (Student Research Project Panel) was also made; the response was that, as OU students would not be specifically targeted, additional SRPP ethical approval would not be required. Upon HREC approval, a further ethics application was made to, and granted by, the Open Media Unit to research and release data on The OU’s open educational resources.

In line with the guidelines of BERA and the Association of Internet Researchers, the moral duty to respect privacy, confidentiality, and anonymity is adhered to. An introductory page on all surveys displays the ethical research statement detailing the purpose of the research, how the research will be used, how to exit the survey at any time, and contact details for further information, and states that clicking to enter the questionnaire confirms acceptance of the ethical statement (information on how to withdraw is also given).

The list of email addresses for survey use is kept in a password-protected spreadsheet that will form part of the use, storage, and disposal requirements of the other data gathered for the research. Only completed papers, reports, and publications will be published, in which participants are anonymised to protect their privacy, addressing Bryman’s (2001) ethical principles and Bassey’s (1999) ethical values, whilst adhering to the guidelines set by BERA (1992).

  6. Implementation

The main study survey has been finalised and replicated securely on Qualtrics for each of the OOCs hosted on FutureLearn and OpenLearn. The emails containing the individual survey link specific to the relevant course will be sent through Qualtrics. Learners will be given a completion deadline of 28 February 2016, with Qualtrics generating reminders and thank-you emails upon successful completion of the survey.

The study will take into account surveys for 23 FutureLearn-hosted OOCs and 10 OpenLearn-hosted OOCs (which are also hosted on FutureLearn). The OOCs provide a balance across the subject categories in FutureLearn, so there is the opportunity to demonstrate engagement patterns based on platform, subject category, and individual OOC.

After the deadline date, analysis will aim to identify learners as potential interview and focus group participants. The analysis of the data will be provided in PR07, and the methodology and analysis of the interviews and focus groups will be provided in PR08 and PR09 respectively.

 

  7. Structure of Data Analysis

As both quantitative and qualitative data will be collated for the main study via surveys and interviews, different approaches to data analysis will be taken.

For the quantitative data from the survey, nominal scales will be used for questions relating to demographics and ordinal scales operating on Likert principles for attitudinal questions; the resulting data are therefore considered non-parametric.

In the analysis of the data a one-tailed test may be applied to test the hypothesis that learners who associate engagement with open courses with an extrinsic professional or academic goal are more likely than leisure learners to engage with the course until completion, with leisure learners being more succinct and sporadic in their engagement strategy. The analysis of the data should define whether variables such as current academic and professional positioning and future goals bear any relation to learners’ perception of, and engagement with, the open courses, and what linear or non-linear relationships can be drawn from this.
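As the data are treated as non-parametric, one way to run such a one-tailed comparison outside SPSS would be a one-sided Mann-Whitney U test; the sketch below is illustrative only, and the file name, the ‘goal’ grouping, and the ‘engagement_score’ variable are assumptions rather than the study’s actual variables:

```python
# Illustrative one-tailed comparison using a non-parametric Mann-Whitney U test
# (suited to ordinal Likert data). Column names are hypothetical.
import pandas as pd
from scipy.stats import mannwhitneyu

responses = pd.read_csv("responses_anonymised.csv")

extrinsic = responses.loc[responses["goal"] == "professional_or_academic", "engagement_score"]
leisure = responses.loc[responses["goal"] == "leisure", "engagement_score"]

# One-sided alternative: extrinsically motivated learners score higher on
# engagement to completion than leisure learners.
stat, p_value = mannwhitneyu(extrinsic, leisure, alternative="greater")
print(f"U = {stat:.1f}, one-tailed p = {p_value:.4f}")
```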

To ensure reliability in the data analysis SPSS will be used, especially considering the large-scale data expected to be received via the surveys. It is proposed that the split-half technique be applied in conjunction with SPSS. Data will then be tabulated for expression within the text, with data visualisations in the form of graphs, pie charts, etc. displayed only where they add greater value to the analysis beyond the use of frequency and percentage tables.
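A minimal sketch of the split-half technique, assuming the Likert items are exported with a common (hypothetical) column prefix, is to correlate odd- and even-numbered item halves and apply the Spearman-Brown correction:

```python
# Sketch of split-half reliability with the Spearman-Brown correction.
# The 'likert_' column prefix is a hypothetical naming convention.
import pandas as pd

responses = pd.read_csv("responses_anonymised.csv")
items = responses[[c for c in responses.columns if c.startswith("likert_")]]

odd_half = items.iloc[:, 0::2].sum(axis=1)
even_half = items.iloc[:, 1::2].sum(axis=1)

r = odd_half.corr(even_half)        # correlation between the two halves
reliability = (2 * r) / (1 + r)     # Spearman-Brown corrected estimate
print(f"Split-half reliability = {reliability:.2f}")
```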

The secondary element of analysis will be of the qualitative data acquired from the interviews following the survey. To amalgamate the key issues emerging from the transcriptions, a combination of progressive focusing (Parlett and Hamilton, 1976), content analysis (Ezzy, 2002: 83; Anderson and Arsenault, 1998: 102), and grounded theory (Strauss and Corbin, 1994: 273) will be used: initially taking a wide-angle approach to gather the data from interviews across the three data sets, and then, through sorting, coding, reviewing, and reflecting upon the responses given, systematically analysing the data.

Through typological analysis (LeCompte and Preissle, 1993: 257) the data can first be ordered into the three platform groups, then reviewed and organised at the individual level to ascertain whether, through the application of secondary coding (Miles and Huberman, 1984), any themes or frequencies emerge that allow the data to be organised by issue and, plausibly, by research question. Following Brenner et al.’s (1985) steps to content analysis in conjunction with this grouping of the data should ensure reliability in its interpretation.
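To illustrate the frequency review within and across the groups, a short sketch (with hypothetical file and column names) could tally how often each code appears per platform group and collectively:

```python
# Sketch of the frequency review: tally how often each code appears per
# platform group and across all groups. Assumes one row per coded chunk with
# hypothetical 'group' and 'code' columns.
import pandas as pd

coded = pd.read_csv("coded_responses.csv")

theme_counts = coded.groupby(["group", "code"]).size().unstack(fill_value=0)
print(theme_counts)          # per-group code frequencies
print(theme_counts.sum())    # collective frequencies across the groups
```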


Year Two: Progressing from Initial Study

February 25, 2016 · hrgore
  1. Introduction

This research is a continuation of the investigation into the attraction of open online courses and which elements of learning design engage learners through to course completion. The purpose of this research is to identify which elements of open online courses learners engage and disengage with, and how these findings can influence the learning design of open online courses (OOCs).

This progress report concentrates solely on the development of methodology for conducting the main study.                                                                                                                                                                         

  2. Summary of Impact of Literature with Regards to Initial Study

In contrast to the evidence of the initial study, the literature gave strong emphasis to the need for participant engagement (Clouse and Evans, 2003; Coppola et al., 2002; Marks et al., 2005; Swan, 2002), peer influence (Yang et al., 2014), and socially conducive environments (Rosé et al., 2014), indicating the need for collaborative activities for positive learning outcomes (Gunawardena and Zittle, 1997; Rovai, 2002), in particular the types of learner interaction classified by Moore (1989) as learner-instructor, learner-learner, and learner-content. It may be that the results of the initial study, given its small sample, produced an anomaly of favour towards the anti-social learner-content interaction, which counters this literature and requires further exploration within the main study. It may be, as found by Caspi, Gorsky and Chajut (2003), that the majority of students contributed only a small number of messages, or that learners had difficulty finding interesting discussion opportunities (Yang et al., 2014), and that the small sample may simply have highlighted this pattern of activity.

There are links to be made between motivation and engagement, as motivation is multidimensional and multilevel in construct (Boekaerts, 1997). Though Tai (2008) states that strong motivation is a prerequisite for online learning, this is in the context of formal study; if learners are choosing OOCs for personal development and leisure learning, their level of motivation may differ from that of an online student studying towards a formal qualification. Whilst the teacher seems to hold a strong presence in face-to-face learning (Roth et al., 2007; Legault et al., 2006; Junco, 2012), this does not seem to translate into the findings of the initial study, with learners rating alternative features above Lead Educators and Facilitators.

Interestingly, Rienties et al. (2009) found that learners who were highly extrinsically motivated contributed less actively to what Veerman and Veldhuis-Diermanse (2001) consider to be ‘social contributions’; this would aid in explaining the ranking of the importance of social engagement within the initial study.

In researching engagement versus performance, Aguiar et al. (2014) noted that the understanding of retention has changed considerably over time and is therefore more complex than initially quantified. This is important to the understanding and answering of the research questions: given the heterogeneity of the learners (Lackner et al., 2015), the understanding of retention and completion may vary considerably within the learner community in contrast to that of the academic and the quantifying of performance (Nicol and Macfarlane-Dick, 2006).

Whilst Gibbs and Simpson (2004) have argued that assessment has a positive effect on students’ learning and engagement within traditional teaching environments, this is not a pattern depicted in the results of the initial study. Within the main study a clearer pattern is expected to develop; further analysis through post-survey interviews would be required to ascertain whether assessment is a strong requirement for the learning design of OOCs.

  3. Research Questions and Hypothesis from Initial Study

3.1 Reviewing the Research Questions

One of the themes emerging from the literature reviewed to date, and from the feedback given, is that there has been academic interest in the retention and completion figures of a range of MOOCs and OOCs; however, very little literature has been dedicated to the engagement of the learner with the content, or to how they were initially attracted to the course (much emphasis is placed on ‘free’ rather than on the content, the Lead Educator, the university facilitating the course, how it is delivered, how it can be studied, etc.).

Understanding the attraction to engage, and then the elements that maintain engagement to completion, needs addressing, as there is a distinct gap in the literature in this regard, and doing so would be of benefit to academics and learning design teams in the creation of open online courses. Hence the research questions for the main study and thesis are:

  1. Why do people engage, and remain engaged in free open online courses?
  2. What elements of the design of the free open online courses increase or maintain learner engagement?

Addressing these questions should guide the understanding of these issues. It is important to note that this title and these research questions have a wider application beyond this doctorate, as its findings and recommendations may translate through to formal offerings to aid student engagement to qualification completion.

3.2 Hypothesis of Initial Study

In the analysis of the data a one-tailed test may be applied to test the hypothesis that learners who associate engagement with open courses with an extrinsic professional or academic goal are more likely than leisure learners to engage with the course until completion, with leisure learners being more succinct and sporadic in their engagement strategy. The analysis of the data should define whether variables such as current academic and professional positioning and future goals bear any relation to learners’ perception of, and engagement with, the open courses, and what linear or non-linear relationships can be drawn from this.

Furthermore, due to recent events in the media, an additional hypothesis has emerged: that learner engagement goes beyond learning design and is also determined by the population of the course and the number of presentations of the course (Gore, 2015). Questions within the survey have been adapted to address this hypothesis, which will be explored further in follow-up interviews.
