Article
Keywords: Flipped classroom, Participation and collaboration, Assessment and feedback, Knowledge construction, Student learning, Student engagement, Student motivation, Confidence

Abstract: In this paper we report on our evaluation of the impact of a flipped classroom approach on the learning experience of students undertaking an undergraduate biology course. The flipped sessions comprised pre-recorded lectures, online quizzes and in-class group activities in the course design. The success of the approach was evaluated on the basis of perceptions held by the course coordinator and students on how the new course design influenced the student learning experience. Data were collected through a student questionnaire and structured interviews with the course coordinator. Overall, the students reported a high degree of satisfaction with some elements of the flipped approach. However, some activities were less well regarded, with concerns identified by the course coordinator and students. A key finding was that elements from the model for student learning design presented in this article were correlated with student confidence, motivation and engagement. It was concluded that refinements of components of the flipped design, such as the pre-recorded lectures and the structure of the in-class sessions, may further enhance the student learning experience in this course.
1. Introduction
The ‘flipped classroom’ is a pedagogical model in which a traditional learning environment and its activities are reformed, or at
least rearranged. For example, in a Western university setting the usual lecture and follow-up learning activities may be reversed,
with instructional lecture material delivered online prior to class time, and in-class time used for more active group learning tasks
than those undertaken in a traditional lecture. As with all new curricula designs, the rationale for flipped approaches is improved
student learning, but in these financially constrained times the promise of more efficient use of resources is also likely to be invoked,
along with the potential benefit from incorporating new digital technologies.
In this article we report on our evaluation of a particular flipped classroom innovation, focusing on the satisfaction and observations of both the students and the course coordinator (CC). We framed our evaluation on Awidi's (2006a, b) model for improving
student learning (MISL), which comprises five key scaffolds (supports or structures) for enabling and enhancing student learning,
which are described in detail in the literature review section below. In addition to the scaffolds, we assumed that ‘confidence’,
‘motivation’ and ‘engagement’ would be key drivers (and dependent variables) of the student learning experience. Thus the hypothesis for this study had two related components: 1) students engaged in the flipped approach will express high levels of satisfaction with each of the five elements of MISL; and 2) student satisfaction will be associated with high levels of confidence, motivation and engagement in their learning. As a consequence, the main thrust of our research was quantitative in nature; however,
∗ Corresponding author.
E-mail addresses: atisaiah@hotmail.com, i.awidi@ecu.edu.au (I.T. Awidi).
https://doi.org/10.1016/j.compedu.2018.09.013
Received 9 September 2017; Received in revised form 20 September 2018; Accepted 21 September 2018
Available online 29 September 2018
0360-1315/ © 2018 Published by Elsevier Ltd.
I.T. Awidi, M. Paynter Computers & Education 128 (2019) 269–283
we did add two qualitative components (described in the methodology section below). Our effort to integrate these ‘mixed methods’ is, we believe, an important part of the story of our research.
The structure of our article takes a conventional form: we commence with a fairly extensive review of the relevant literature, followed by a description of the methodology and methods, the research results, a discussion, and a conclusion.
2. Literature review
Flipped approaches for tertiary teaching and learning have typically been implemented with the aims of increasing student engagement,
enhancing the student learning experience and, ultimately, improving student outcomes (Bossaer, Panus, Stewart, Hagemeier, & George,
2016; Cavanagh, 2011; Caviglia-Harris, 2016; Chiang, 2017; Connell, Donovan, & Chambers, 2016; Day, 2018). Clearly, flipped approaches
have taken many forms, ranging from teaching and learning entirely in class to being delivered fully online (Bates, 2015).
The most common rationale for flipped classroom approaches is that they facilitate experiential learning and support the active
construction of knowledge. Kolb and Kolb (2005) defined ‘experiential learning’ as consisting of four phases: active experimentation,
concrete experience, reflective observations, and abstract conceptualisation. However, all of these learning phases could be of a
solitary nature. The theoretical perspective underpinning the ‘construction of knowledge’, called constructivism, emphasises the
social aspects of learning. In defining constructivism Crotty (1998, p.42) noted that: “all knowledge and therefore all meaningful
reality … is contingent upon human practices being constructed in and out of interaction between human beings and their world”.
However, most educationists distinguish ‘social learning’ from learning in general. For example, Wanner and Palmer (2015) and
Lai and Hwang (2016) claimed that a frequent justification for flipped classroom approaches is that they increase social learning.
Social learning is mentioned also in the framework developed by Harasim (2007, p. 282), who asserted that the generation and
linking of ideas, as well as intellectual convergence, are facilitated through social learning. Biggs (1996), who is known for his work
on ‘deep learning’, is more in line with Crotty's theorising of constructivism, emphasising the importance of learner interaction with
peers, teachers and others to deepen the meaning of their learning experience.
Thus the meanings of experiential learning and ‘construction of knowledge’ have been extended to include: learning through
interaction with others, developing the capacity to apply or transfer one's knowledge and skills to other contexts, and acquiring
deepened understandings. Day (2018), in an experimental study conducted over two semesters in Boston, USA, compared the learning of an experimental group (flipped approach) with that of a control group (traditional lecture), and found that the flipped group performed significantly better in their final grades than the traditional group in both semesters. However, focusing only on the
outcomes is like describing the inputs and outputs of a ‘black box' – without intelligence about what happened within the box.
Now for an elaboration of the aforementioned MISL, which we adopted for our study. Awidi (2006a, b), who developed his model
within a constructivist paradigm, posited that learning can be successfully acquired by students when they have the following
‘scaffolds’: (1) access to information and learning resources; (2) support and motivation; (3) participation in learning activities and
collaboration with others; (4) assessments and feedback which help them improve their learning; and (5) active engagement in their
learning and critical reflection. Thus, according to Awidi (2006a, b), scaffolds are both supporting devices and propitious student
behaviours, such as motivation, that together provide the impetus for students to be actively engaged in their learning.
Abeysekera and Dawson (2015) reviewed the literature on flipped classroom approaches and tentatively proposed that flipped
approaches might improve student motivation and help manage cognitive load.1 The focus of their review was on the following learner
motivational factors: sense of competence, sense of autonomy, and sense of security and relatedness. And after a thorough analysis,
Abeysekera and Dawson (2015, p.7) concluded that “learning environments created by the flipped classroom approach are likely to
satisfy students' needs for competence, autonomy and relatedness and thus entice greater levels of intrinsic and extrinsic motivation”. In
their view, students’ cognitive load may be reduced with the use of pre-recorded lectures and instruction designed to the expertise level
of students. Thus a feeling of competence, senses of autonomy, security and relatedness are likely to influence student engagement in the
flipped learning environment. For example, when students are given the opportunity to apply information gained from the pre-recorded
lectures to in-class activities and create new knowledge they are likely to feel more motivated and engaged.
Other researchers have investigated flipped classroom innovations in terms of the learning environment, and student self-efficacy
beliefs, intrinsic and extrinsic motivation, and self-regulation (Bhagat, Chang, & Chang, 2016; Chuang, Weng, & Chen, 2018; Hsieh, Wu, & Marek, 2017; Thai, De Wever, & Valcke, 2017; Yilmaz, 2017). Central to these studies were student satisfaction and motivation
and how the flipped learning experience affected student achievement and motivation. While these publications provide some understanding of how each flipped classroom approach worked, the authors did not shed much light on how motivation, confidence and
student engagement actually influenced the student learning experience. Consequently, our key aim was to seek an understanding of
how these variables and the MISL scaffolds contributed to student learning experiences in the flipped learning environment.
Based on our review of literature, we came to realise that any evaluation of a flipped approach is contingent on first ascertaining
the rationale and intentions of those responsible for implementing their program. We are also now more aware that measuring the
impact of flipped approaches is no simple matter (Abeysekera & Dawson, 2015; Kim, Kim, Khera, & Getman, 2014; Pierce & Fox,
2012; Thai, De Wever, & Valcke, 2017). For example, in their analysis of a quasi-experimental flipped approach in an introductory
biology course, Heyborne and Perrett (2016, p. 31) collected data on student performance and perceptions; but in their conclusion
the researchers realistically claimed only the modest finding of a “trend toward performance gains using the flipped approach
pedagogy”. However, in another evaluation of a flipped innovation in an introductory biology course, Moravec, Williams, Aguilar-Roca, and
1 ‘Cognitive load’ is the amount of mental effort demanded by a primary task (Block, Hancock, & Zakay, 2010).
O'Dowd (2010, p.473) claimed that their flipped approach resulted in “significant student performance gains”. And in his report of a
further quasi-experimental study, Peterson (2016) confidently asserted that student performance and satisfaction were higher in the
‘flipped’ group than in the ‘lecture’ group. Nevertheless, all of these researchers admitted to limitations in their research designs; yet it seems fair to say that their research has added to the growing body of knowledge about flipped approaches.
To sum up this section: the literature on flipped classroom innovations indicates that the emphasis of educators has been on
student engagement and student-centred teaching and learning processes (Bates, 2015, p. 511). Of course ‘student-centred teaching
and learning’ anywhere implies that student interests are attended to, and students are given more responsibility for deciding how
they will learn; however, in higher education this rarely entails students being given the freedom to decide what they will learn.
Nevertheless, the various flipped interventions have all attempted some aspect of a student-centred pedagogy (Bates, 2015; Bates,
Almekdash, & Gilchrest-Dunnam, 2017, pp. 3–10; Bishop & Verleger, 2013; Long, Logan, & Waugh, 2016; McNally et al., 2017). The
extent to which the flipped design and its goals reported in this paper were student centred will be discussed below.
3. Methodology
3.1. Background circumstances and motivation for our flipped classroom design
Our flipped classroom pilot study involved 117 students in Evolutionary Processes, a third-year biology course at a Western Australian university. This course was compulsory for students with a zoology major and an elective for students with other biology majors. The motivations underpinning the implementation of the flipped approach were to increase student attendance and engagement and to improve the efficiency of course administration. The new design incorporated pre-recorded lectures, online quizzes and new assessment tasks within a flipped approach, while retaining two of the four lectures and the essential content and learning goals of the original program. The
learning activities were also designed to be hands-on, flexible (to suit students' learning styles) and collaborative, with a degree of freedom for students to choose their project topics and data for in-class and post-class activities. It was anticipated that the redesign of key learning and assessment strategies would improve the learning experience for students and support them in actively constructing their knowledge.
As indicated above, in the new course two of the four one-hour weekly lectures were replaced by flipped sessions: in-class group
sessions, supported by pre-recorded lectures (uploaded to the LMS a week before each in-class session) and online quizzes; the other two
lectures were retained. However, the laboratory requirements and mid-semester assessments were also altered. The traditional laboratory report (a hard copy submitted by the student to the CC and returned with written comments) was replaced with a digital submission of the report by the student and digital feedback from the CC. And rather than relying on the lecturer to broadcast all information pertaining to the course, students were now responsible for accessing the LMS for information about the course outline, course objectives, program of work, expectations and assessment schedule. In addition, supplementary instructional material (e.g. videos) and
other resources were provided through the LMS, including the opportunity for students to contact the CC if they required assistance.
Now for a brief elaboration on the design of in-class sessions. Ideally, this was to be a time for students to interact with the CC and each
other in order to pursue their learning goals, resolve problems, and discuss matters of individual interest. Furthermore, during in-class sessions
the CC was to provide students with feedback on their assignments, and point out common misconceptions that had emerged. To maintain
student interest, lecturers endeavoured to keep pre-recorded lectures on the LMS short (between 15 and 20 min). Then, after reflecting on the video content, students were expected to tackle an online quiz, for which they received automated feedback on their answers.
To encourage student participation and collaboration, the CC placed approximately 15 students in each group for problem-solving tasks
during in-class and laboratory sessions. In their laboratory groups students analysed data that had been collected by the entire class, according
to guiding questions, and took turns to report on the group's findings and respond to questions from peers in the other groups. The online
quizzes were designed to deepen student understanding by encouraging reflection on what had been presented and supporting the application of their understanding to the problem presented. Performance in the online quizzes was summarised and graded, and replaced the
previous mid-semester written examination. The group activities also provided formative assessment opportunities, with feedback from peers
and instructors intended to improve learning and deepen understanding of concepts (Kong, 2014; Wanner & Palmer, 2015).
To examine the impact of the flipped classroom design, the CC and two researchers undertook an action research approach,
informed by the work of Willis (2007) and Seymour-Rolls and Hughes (2000), who describe four phases in an action research cycle:
reflect, plan, act and evaluate. In keeping with action research methodology, our study comprised a mixed method approach. Our
2 Evolutionary Processes: required for taking some units but elective for other students.
Table 1
Frequency of student response to categorised questionnaire items (n = 50).
Element of MISL/questionnaire item Disagree % Neutral % Agree %
Note: 1. Level of significance: *p < 0.05; **p < 0.01; ***p < 0.001.
2. Percentages have been rounded to the nearest whole and may not sum to 100.
approach was predominantly objectivist (quantitative), with a student questionnaire containing 22 closed questions as well as three
open questions. A qualitative approach was also included in our study to assist the interpretation and explanation of the students' and the CC's views about their experience of the flipped classroom.
The design of the primary research instrument for collecting quantitative data, the student questionnaire (see Table 1), was guided by the five key elements of MISL: (1) access to information and learning resources; (2) support and motivation; (3) participation in learning activities and collaboration with others; (4) assessments and feedback helping students improve their learning; and (5) critical reflection and construction of knowledge by actively engaging in their learning. The MISL construct was also expected to
provide key information about designs which engage students in deeper learning by building confidence, motivation and engagement
(McLaughlin et al., 2014; McLaughlin, White, Khanova, & Yuriev, 2016; Wanner & Palmer, 2015). Approval for the research was
gained from the human research ethics office of the university (RA/4/1/9122).
We conducted the qualitative analysis of the student views by using the research questions to group the responses from the three open questions in the student questionnaire. The student responses were examined and then coded as either satisfied or dissatisfied. We then explored whether there were relationships between the responses based on this categorisation of satisfaction with their learning experience. Themes emerging from the responses were then categorised and summarised (Appendix Tables D and E). In drawing out themes we were mindful that students' experiences and understandings differ, and hence their understanding needs to be interpreted within the context of this study.
The CC's views were collated through two face-to-face interviews conducted by the two researchers. Both interviews with the CC
were of 45 min duration and followed a semi-structured format. The interviews were recorded, fully transcribed and cleaned to ensure
that the transcribed narrative fairly represented the CC's responses during the interview. Based on the research questions, we examined
the transcripts, identified key issues and made a summary. After reflecting on the summary and making some minor adjustments, the
summary was then provided to the CC to confirm that it accurately reflected what he had intended to convey. If required, further
alterations were made to ensure the CC was satisfied with the summary of his views that would be used in the study.
All 117 students enrolled in the course were invited by the CC to respond to the online questionnaire, which was made available on the LMS over a two-week period, mid-way through the semester. In his invitation, the CC explained the purpose of the questionnaire and emphasised that participation was strictly voluntary. The CC also informed students that the data collected would be anonymised and kept confidential, and that it would not be used in the calculation of their grades for the course. During the second week that the questionnaire was available, the CC posted a reminder on the LMS. In the end, 50 students responded to the questionnaire. The response rate of 43% was considered satisfactory because it was similar to the usual response rate (40%) for end-of-semester evaluations in this course. The research team checked the respondents and found them to be representative of the student cohort within the course.
The 22 closed questions in the student questionnaire required students to express their level of agreement with a specific statement on a
five-point Likert scale (i.e. strongly disagree, disagree, neither agree nor disagree, agree, strongly agree). The researcher assigned to this task
collapsed the five-point scale into three categories: ‘disagree’ (i.e. strongly disagree and disagree), ‘neutral’ (i.e. neither agree nor disagree) and
‘agree’ (i.e. agree and strongly agree). Consistent with the overall research question, the questionnaire designer focused 17 statements on the
extent to which students agreed that the flipped approach enabled them (or otherwise) to scaffold their learning using each of the five
elements of MISL (Table 1). The remaining five statements sought student self-assessment of their confidence, motivation and engagement
in the course, their competence using digital technologies, and their previous experience of a flipped approach (Table 1).
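The collapsing of the five-point scale into three categories can be sketched as follows. This is an illustrative recoding only; the label strings and the `collapse` helper are our own, not taken from the study's analysis scripts.

```python
# Illustrative sketch: collapsing five-point Likert responses into three
# categories, as described in the text. The response labels are hypothetical.
COLLAPSE = {
    "strongly disagree": "disagree",
    "disagree": "disagree",
    "neither agree nor disagree": "neutral",
    "agree": "agree",
    "strongly agree": "agree",
}

def collapse(responses):
    """Map a list of five-point Likert responses to three categories."""
    return [COLLAPSE[r] for r in responses]

sample = ["agree", "strongly agree", "neither agree nor disagree", "disagree"]
print(collapse(sample))  # ['agree', 'agree', 'neutral', 'disagree']
```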
The statistical software IBM SPSS V23 was used to analyse the data collected from the questionnaire. Cronbach's alpha values were
calculated for each group of items associated with each of the five elements of the MISL model. For four of these elements (access to
information and learning resources, support and motivation, participation and collaboration, and critical reflection and knowledge
construction), the alpha values were in the range 0.70–0.85, which is generally considered acceptable for reliability (DeVellis, 2012). The values for the four elements were 0.77, 0.77, 0.70 and 0.85 respectively. An alpha value of 0.64, calculated for the assessment and feedback element, was also considered an acceptable level of reliability in this questionnaire. This judgement
was made based on the attitudinal construct underlying each item as well as the small number of items (four only) in the group. A
single scale (mean score) was then calculated for each of the five groups of items. These data were then tabulated to show the
proportion of students in each of the three student response groups for each of the five elements of the model (Table 1).
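Cronbach's alpha for a group of questionnaire items can be computed from the item variances and the variance of respondents' total scores. The sketch below, with invented item scores, illustrates the calculation; the study itself used SPSS.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a list of item-score lists (one list per item,
    with respondents in the same order in each list)."""
    k = len(items)
    # Total score per respondent across all items in the group.
    totals = [sum(scores) for scores in zip(*items)]
    # Sum of the sample variances of the individual items.
    item_vars = sum(statistics.variance(scores) for scores in items)
    return (k / (k - 1)) * (1 - item_vars / statistics.variance(totals))

# Hypothetical 5-point responses for a three-item group (not the study data).
items = [
    [4, 5, 3, 4, 2, 5],
    [4, 4, 3, 5, 2, 4],
    [5, 4, 2, 4, 3, 5],
]
print(round(cronbach_alpha(items), 2))
```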
To check for significant differences between the proportions of disagreeing, neutral, and agreeing students, our researcher calculated likelihood ratios for the overall score for each of the five elements. For three of the independent variables, the response
categories of disagree, neutral, and agree were renamed for levels of confidence as not confident, neutral, and confident respectively;
for levels of motivation as not motivated, neutral, and motivated respectively; and for levels of engagement as not engaged, neutral, and
engaged respectively. Likelihood ratios were calculated to check for significant differences between response groups (Table 1).
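A likelihood-ratio (G) statistic for the three response categories can be sketched as follows, assuming equal expected proportions across categories (df = 2). The counts shown are hypothetical, and the actual analysis was performed in SPSS.

```python
import math

def likelihood_ratio(observed, expected=None):
    """Likelihood-ratio (G) statistic for observed category counts.
    If no expected counts are given, equal proportions across categories
    are assumed (df = k - 1). Zero counts contribute nothing to the sum."""
    if expected is None:
        expected = [sum(observed) / len(observed)] * len(observed)
    return 2 * sum(o * math.log(o / e)
                   for o, e in zip(observed, expected) if o > 0)

# Hypothetical counts of disagree / neutral / agree responses (n = 50).
counts = [5, 15, 30]
g = likelihood_ratio(counts)  # compare against chi-square with df = 2
print(round(g, 2))
```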
To identify any associations between the responses of the three student groups to the items on confidence, motivation and engagement and the five elements of MISL, a one-way between-groups ANOVA was conducted. The ANOVA used single scores for each of the five elements on the original five-point scale. Tukey's (honestly significant difference) post hoc test was then performed to identify where within the three groups these differences occurred.
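The F statistic for a one-way between-groups ANOVA can be sketched in a few lines. The group scores below are invented, and the Tukey HSD post hoc step, which requires the studentized range distribution, would in practice be carried out in SPSS or with a statistics library.

```python
import statistics

def one_way_anova_f(groups):
    """F statistic for a one-way between-groups ANOVA.
    `groups` is a list of score lists, one list per response group."""
    all_scores = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_scores)
    k, n = len(groups), len(all_scores)
    # Between-groups sum of squares (weighted by group size).
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    # Within-groups sum of squares.
    ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical MISL element scores for not-confident / neutral / confident groups.
groups = [[2.1, 2.4, 2.0], [3.1, 3.4, 3.0, 3.2], [4.2, 4.5, 4.1, 4.4, 4.0]]
print(round(one_way_anova_f(groups), 1))
```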
As described above, the qualitative data collected in our study comprised the students' responses to the open-ended questions in
the questionnaire and the CC's responses in interviews with the researchers. The first interview was conducted in the second week of
the semester and identified the CC's aims for the new course design, as well as details about the specific flipped approach being used.
The second interview was undertaken late in the semester after the student data from the questionnaire had been summarised for the
CC. Discussion about these data and the CC's interpretation constituted the initial phase of the second interview; questions in the
second phase were directed at the CC's intended curriculum design for the next iteration of the course (Appendix B).
The open questions in the student questionnaire asked respondents to add comments about their learning experience in the course.
The researcher compared the open comments by the students against some of the closed items, such as: ‘My experience in this course has
fully engaged me as a learner’. Strongly agree and agree responses to the closed item were classified as students being satisfied with the
flipped approach, strongly disagree and disagree as dissatisfied, and neither agree nor disagree as neutral (Table 3). Furthermore,
comments drawn from the open responses were tabulated under design challenges (as perceived by students) and under student
perception of the overall learning experience. These comments were further classified as pre-class activity, in-class-activity, post-class-
activity, online quizzes and labs, group discussion, learning resources and time allocation or overall perception (Appendix Tables D and E).
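The cross-tabulation described above can be sketched as a simple tally. The satisfaction mapping follows the text, while the sample responses are hypothetical.

```python
# Illustrative sketch of the cross-tabulation described in the text:
# closed-item responses are mapped to satisfaction categories and tallied
# against whether the respondent left an open comment. Data are hypothetical.
from collections import Counter

SATISFACTION = {
    "strongly agree": "satisfied", "agree": "satisfied",
    "neither agree nor disagree": "neutral",
    "disagree": "dissatisfied", "strongly disagree": "dissatisfied",
}

# (closed-item response, left an open comment?) per hypothetical respondent.
responses = [("agree", True), ("strongly agree", False),
             ("neither agree nor disagree", True), ("disagree", True)]

tally = Counter((SATISFACTION[r], "comment" if c else "no comment")
                for r, c in responses)
print(dict(tally))
```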
4. Results
Of the 50 respondents to the online student questionnaire, 33 (66%) had not previously participated in an online course combining recorded videos with learning activities, and 17 (34%) had. On the question of digital literacy, 36 (72%) of the respondents
considered themselves highly competent users. An encouraging trend from a pedagogic point of view was that the majority of
respondents (60%) thought the flipped classroom approach fully engaged them (Table 1). Now for a more detailed breakdown of this,
the most substantial set of data.
The majority of respondents agreed that most of the strategies introduced by the flipped approach (e.g. online quizzes, pre-
recorded lectures, online laboratory report and group tasks) provided some learning benefit (Table 1). However, there were three
exceptions: only 11 (22%) of respondents agreed that pre-recorded lectures motivated them to read further about the subject; 23 (46%)
Table 2
Comparative measures - association between the five elements of MISL and each of confidence, motivation and engagement (n = 50).

Element of MISL | Confidence: likelihood ratio (df = 2) | Confidence: ANOVA F | Motivation: likelihood ratio (df = 2) | Motivation: ANOVA F | Engagement: likelihood ratio (df = 2) | Engagement: ANOVA F
Access to information and learning resources | 23.2*** | 12.9*** | 12.2** | 6.5** | 29.8*** | 14.0***
Support and motivation | 14.5** | 7.2** | 8.2** | 4.1* | 20.0*** | 11.4***
Participation and collaboration | 10.3** | 4.4* | 2.1 | 1.0 | 16.5*** | 8.44***
Assessment and feedback | 8.8** | 5.0** | 1.9 | 1.0 | 6.1* | 3.34*
Knowledge construction | 24.8*** | 14.4*** | 22.0*** | 10.9*** | 35.0*** | 28.64***
Note: Level of significance *p < 0.05; **p < 0.01; ***p < 0.001.
agreed that pre-recorded lectures motivated them to engage in planned learning activities; and 21 (42%) agreed that they enjoyed
participating in the flipped lecture group discussions (Table 1).
For the MISL element ‘access to information and learning resources’ the overall score showed a significantly higher proportion of
respondents (81%) agreeing than disagreeing (9%). The ‘assessment and feedback’ element also showed that significantly more students
(73%) agreed than disagreed (12%). These proportions were similar for ‘knowledge construction’, with significantly more students
agreeing (74%) than disagreeing (11%). For ‘support and motivation’, significantly more students agreed (52%) than disagreed (23%),
but a sizable number (25%) of students were neutral in their view. Likewise for ‘participation and collaboration’, significantly more
students agreed (51%) than disagreed (22%), but 27% expressed a neutral view. With regard to perceived confidence, motivation and engagement, more respondents agreed with the statements (54%, 54% and 60% respectively) than disagreed (14%, 14% and 18% respectively); and again the proportions of respondents with neutral views were quite high (32%, 28% and 22%).
The likelihood ratio confirmed a significant relationship between ‘access to information and resources’ and ‘student confidence’ (χ2 (2) = 23.2; p = 0.000); ‘student motivation’ (χ2 (2) = 12.2; p = 0.014); and ‘student engagement’ (χ2 (2) = 29.8; p = 0.000) (Table 2). The
ANOVA post hoc analysis confirmed that there was a significant difference between students who felt confident and those who did not feel confident (p = 0.000), as well as between the confident and neutral groups (p = 0.007). Those who felt motivated and those who did not feel motivated were also significantly different (p = 0.004). In terms of student engagement, a significant difference was detected between those who felt engaged and those who did not feel engaged (p = 0.000), and between the engaged and neutral groups (p = 0.000) (see Appendix A Table C).
The results showed a significant relationship between ‘support and motivation of students’ and ‘student confidence’ (χ2
(2) = 14.5; p = 0.001), ‘student motivation’ (χ2 (2) = 8.2; p = 0.016) and ‘student engagement’ (χ2 (2) = 20.0; p = 0.000)
(Table 2). The post hoc analysis revealed a significant difference between students who felt confident and those who did not feel confident (p = 0.002), but not between confident and neutral students. A significant difference was also observed between students who felt engaged and those who did not feel engaged (p = 0.000), and between the engaged and neutral groups (p = 0.000). There was also a significant difference between those who were motivated and those who were not. A significant relationship was identified also
between ‘participation and collaboration’ and ‘student confidence’ (χ2 (2) = 10.3; p = 0.006) and ‘student engagement’ (χ2
(2) = 16.5; p = 0.000), but not with ‘student motivation’ (χ2 (2) = 2.1; p = 0.346) (Table 2). Post hoc analysis showed a significant
difference between those who felt confident and those who were not confident (p = 0.019). For engagement, there was a significant
difference between all groups except for students who did not feel engaged and neutral students (p = 0.622) (Appendix A Table C).
A significant relationship was detected between ‘assessment and feedback’ and ‘student confidence’ (χ2 (2) = 8.8; p = 0.012) and
‘student engagement’ (χ2 (2) = 6.2; p = 0.047), but not with ‘student motivation’ (χ2 (2) = 2.0; p = 0.384). The post hoc analysis
showed a significant difference between those who felt confident and those who were not confident (p = 0.014). And a significant
difference was found between students who were engaged and those who were not engaged (p = 0.046) (Appendix A Table C).
For ‘knowledge construction’, there was a significant relationship with ‘student confidence’ (χ2 (2) = 24.8; p = 0.000), ‘student motivation’ (χ2 (2) = 22.0; p = 0.000), and ‘student engagement’ (χ2 (2) = 35.0; p = 0.000) (Table 2). The post hoc analysis showed a significant difference between confident and not confident (p = 0.000), but not between the confident and neutral aggregations. For motivation, there was a significant difference between the motivated and not motivated (p = 0.000), but not between the motivated and neutral. For engagement, a significant difference was found between the engaged and not engaged (p = 0.000) and the neutral (p = 0.004) (Appendix A Table C).
In summary, our analysis found positive associations between student respondents' confidence levels in the course and their levels
of satisfaction with each of the five elements of MISL. This same positive association with each of the five elements was evident for
engagement levels. However, motivation levels were positively associated with levels of satisfaction for only three elements of MISL:
'access to information and resources', ‘support and motivation' and ‘knowledge construction'.
As noted above, only 23 (46%) respondents provided comments in the open-response item. These responses did not lend themselves to cross-tabulation in any strict quantitative sense, but through their responses the students indicated a degree of satisfaction, which we categorised roughly as follows: 6 (26%) dissatisfied, 5 (22%) neutral, and 12 (52%) satisfied (Table 3).
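The row percentages reported in Table 3 can be recomputed directly from the raw counts, as the short sketch below shows (figures are rounded to the nearest whole percent).

```python
# Recompute the row percentages shown in Table 3 from the raw counts,
# given in the order (dissatisfied, neutral, satisfied).
def row_percentages(counts):
    total = sum(counts)
    return [round(100 * c / total) for c in counts]

all_respondents = row_percentages([9, 11, 30])   # closed item, n = 50
commenters = row_percentages([6, 5, 12])         # open comments, n = 23
```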
I.T. Awidi, M. Paynter Computers & Education 128 (2019) 269–283

Table 3
Cross-tabulation of open-item responses against a closed item.

Category                                              Dissatisfied n (%)   Neutral n (%)   Satisfied n (%)   Total n (%)
Experience in the course fully engaged the student    9 (18)               11 (22)         30 (60)           50 (100)
Responses with no comments                            3 (11)               6 (22)          18 (66)           27 (100)
Responses with general comments                       6 (26)               5 (22)          12 (52)           23 (100)

The particular aspects of the learning design which respondents singled out for praise in their open-question responses were the
pre-class activities, the in-class activities (including group discussions), and the online activities. Respondents considered that the pre-class activities (pre-recorded lecture and online quiz) were effective in preparing them well for the in-class sessions (11) and in helping them understand the content better (7). These affirmations were represented in the following student response: “The flipped lectures were a great way to learn and understand the content better. It meant you were more inclined to watch the lecture and it gave you the opportunity to fully understand it in the class discussions.” The in-class activities (including group discussions) were considered helpful from a peer learning perspective (6) and motivational (3). One student remarked that, “This was the most organised unit I have ever completed; it was refreshing. Plus, all the lecturers were enthusiastic about teaching which gave me motivation to want to learn.” Quizzes, as a component of the online activities, were favourably commented upon (8), and the online lab report was regarded as useful (3). Comments were also made in favour of the in-class activities being recorded and posted on the LMS for the benefit of students who could not attend (13) (Appendix A Table E).
On the other hand, some respondents (5) expressed dissatisfaction with the design of in-class activities and considered that the
first few in-class sessions were not effective. There was also concern about the lack of time to cover topics. According to some
respondents (6), the discussion groups were not very helpful to their learning, because the groups were too big (i.e. with 15–20
students). One student remarked, “If you want to do that, then just run a tutorial, not a lecture – where only the people sitting at the
front on the right [of the room] get any help”. Some felt the in-class sessions and pre-recorded lectures distracted them from the
content of the course. Furthermore, according to one student, “the in-class activity didn't always align with the lecture recordings in
the first part of the semester”. Another student thought that critical information had been missed because she/he was unable to attend
the in-class sessions due to a timetable clash, stating that “Flipped sessions [in-class sessions] should either be properly recorded or
run as traditional tutorials”. In this context, the student's reference to ‘traditional tutorials’ meant a tutorial session for about 20
students, facilitated by the CC (or another teaching staff member) in which there is an emphasis on interaction between students as
well as interaction between students and the CC.
Other students (7) were critical of the recording of sessions, with some suggesting that a dedicated lecture capture system would be more effective in supporting student learning than the in-class sessions. Some respondents noted that they were not in favour of the in-class sessions and would prefer more traditional lectures. Finally, several respondents (8) reported that the in-class session notes and recordings were too brief or otherwise insufficient, whereas others acknowledged that attending in-class sessions was beneficial (Appendix A Table E). One student remarked:
I know that most of the flipped sessions involved group discussion, but it would have been beneficial to hear the lecturer go
through the questions at the end of each flipped session. Instead we had to try and contact those who had attended, which for
some people was not possible.
Technical issues also contributed to students' dissatisfaction. Three students pointed out that some pre-recorded lectures were difficult to navigate on the LMS, and some found it confusing that lectures were embedded in different locations. Other students reported that the auto-save function in the online laboratory assignment was unreliable and that they had to rewrite large sections of the assignment as a consequence. As noted above, student digital competence overall was high; it was therefore unsurprising that digital competence was not considered a limitation on the student learning experience in this flipped approach.
In his first interview the CC reiterated what was noted above: although most respondents to the student questionnaire were
positive about some elements of the flipped classroom design, such as the online quizzes, respondents were more ambivalent about
the pre-recorded lectures and participation in the in-class sessions. However, the CC described the in-class sessions as follows:
My initial interaction with the students, I thought, was positive. It was worthwhile. In the beginning, I had a very good turnout at
the flipped classes; 70%–80% of the class attended, which is really, really good. And it all seemed to be really, really positive. I
introduced some flipped sessions and the numbers died off a bit, but students who were present again generally seemed very
positive about it. So, the feedback from the questionnaire comments is at odds with those impressions.
The discrepancy between the respondents' and the CC's views of the in-class sessions is not surprising; our decision to seek qualitative input in our evaluation was driven by the wish to explore any such differences in member realities.
Before the course redesign, the CC had observed that in previous iterations of the course student attendance tapered off towards
the end of the semester. But even with the flipped classroom approach, attendance towards the end of the semester declined.
Competing interests, such as demands from other subjects and employment commitments, appeared to affect student participation in
sessions. Indeed, on some occasions up to 70% of students did not attend the in-class sessions. Despite the low attendance, the CC
was encouraged by his students’ end-of-semester achievement. The CC also mentioned that LMS analytics indicated a high proportion
of students were accessing pre-recorded lectures and there was a high level of compliance by students in meeting online submission
times for required tasks. Thus overall the CC had a positive view of the flipped approach.
Prior to the introduction of digital assessment tasks, the CC spent a large amount of time grading and providing feedback on individual student scripts. Digital completion and submission, together with digital assessment and feedback, reduced this commitment markedly. The CC commented: “From our perspective, the digital task took a lot of time to put together, but it was much easier to mark and so it was reassuring that it was positively received by students.” However, the CC expressed concern about reports
of student collusion on quizzes, commenting, “The feedback of student collusion on quizzes flagged a need to change the LMS
settings, making it more difficult for this to occur.” The CC reflected that he would benefit from further training in the design of
online assessments and rubrics for grading student assessments.
5. Discussion
A key aim guiding this study was to understand how student motivation, confidence and engagement relate to the student
learning experience in the flipped learning environment. Correspondingly, its hypotheses were 1) students engaged in the flipped
approach will express high levels of satisfaction with each of the five elements of MISL; and 2) students’ satisfaction will be associated
with high levels of confidence, motivation and engagement in their learning.
The overall perception from respondents was that the flipped classroom approach provided a beneficial learning experience. High levels of satisfaction were reported in closed items associated with three of the five MISL elements: ‘access to information’, ‘assessment and feedback’, and ‘knowledge construction’. Lower levels of satisfaction were expressed by respondents for the two remaining elements: ‘support and motivation’ and ‘participation and collaboration’ (Table 1).
The students reported that they were provided with access to the necessary resources and information in order to successfully complete
the course. The criteria for assessment were clearly communicated and the recent change from a traditional laboratory report to a digital
submission with feedback was preferred by students overall. In addition, respondents agreed that the assessments contributed to their
understanding of course concepts and the online assessments enabled them to demonstrate their learning. Most students also expressed
satisfaction with the responsiveness of teaching staff to their learning needs. However, the pre-recorded lectures motivated only a minority
of respondents to engage in planned activities or additional reading. Similarly, the group discussions within the in-class sessions were
enjoyed by only a minority of students. These views were expanded upon by the 23 students who responded to the open-ended question in
the questionnaire, and there was a fairly close correspondence in the items and aspects for which there was high satisfaction (online
quizzes and the flipped approach generally) and for which there was less satisfaction (the group discussions within the in-class sessions).
In his interview responses, the CC contested the view that the in-class sessions were less than successful, citing first-hand class observations and LMS analytics to support this opinion. Of course there is bound to be some discrepancy in views both between the
CC and the students as well as within the student group. With the latter, there are always some students who are more or less diligent
in the way they apply themselves to a particular learning task, so some individual differences occur in their actual experience and in
how these experiences are perceived. Another point of difference between the views of the CC and the students may reside in when they were surveyed. The student questionnaire was administered mid-semester, when the flipped approach had been newly introduced and in operation for only about five weeks, whereas the second interview with the CC was conducted at the end of semester; his views encompassed more than 12 weeks of classes, a much longer period than for the students. In relation to timing, Davenport's (2018) recent study of a small group of graduate students' perceptions of a flipped approach is relevant. Interestingly, Davenport found that students expressed increased negativity towards certain aspects of the program at the end of semester, when they were likely to be under stress with assignments.
Considering all the relevant data, we concluded that there was a high level of correspondence between the perceptions of the students and those of the CC: both indicated a high degree of satisfaction with three aspects of the flipped classroom learning experience, ‘access to information’, ‘assessment and feedback’ and ‘knowledge construction’. These tentative findings provide partial support for the first hypothesis.
As for the second hypothesis – that student satisfaction with the elements of MISL would correlate with student confidence, motivation to learn, and engagement in the learning process – we found that a high proportion of respondents expressed satisfaction with three of the MISL elements and a smaller proportion with the remaining two. We now discuss the association of each of these elements with student confidence, engagement and motivation in the course.
The MISL proposes that when students can access learning resources and information and also feel supported, they will be motivated to participate in learning activities. Ideally, this in turn encourages collaboration with peers in further learning opportunities and builds additional confidence in themselves as learners. With feedback from formative assessment, knowledge construction is supported, enabling its application to new contexts. Motivation, confidence building and consistent engagement are viewed as interdependent and together are interwoven with the five elements of MISL (Fig. 1).
Table 2 shows that in our study confidence and engagement were associated with respondents' satisfaction with all five elements
of MISL. Motivation, however, was associated in our study with only three of the five elements, ‘access to information and resources’,
‘support and motivation’ and ‘knowledge construction’. This suggests that the second hypothesis was only partially supported.
Responses to the open-ended question by students and comments by the CC in the interview provide some insight into why
Fig. 1. Theorized interaction between the model for improving learning design and learner characteristics.
motivation was not associated with ‘participation and collaboration’. For example, the students' responses cited insufficient time for the in-class activities, inadequate recordings of some in-class sessions, and scheduling of the in-class sessions clashing with other classes. Interview comments by the CC bear on why student motivation in this study was not associated with ‘assessment and feedback’: the CC attributed problems with assessment to the limited support he received for his online task design and, indeed, for the overall design of the flipped approach. The CC acknowledged that he had not received any training in developing online assessments, rubrics, grading or feedback.
Notwithstanding the reservations expressed above, the student respondents were generally satisfied with the flipped approach
overall (including pre-recorded lectures) and made the constructive suggestion that in-class activities be recorded for the benefit of
those unable to attend, and as a resource for those who did attend. The CC's recognition of ‘teething problems’, willingness to
improve, and enthusiasm for the flipped experiment bode well for its future iterations (more on this below).
The flipped approach was implemented by the CC ostensibly to enhance the student learning experience, increase student attendance at in-class sessions, and ease course administration issues. Although the initial design process was time consuming, the flipped approach, once implemented, did reduce the teaching required for traditional lecture presentations. The in-class problem-solving activities, intended to give students a deeper understanding of course content, were not fully appreciated by students; and whereas attendance at in-class activities was high soon after implementation of the flipped approach, it waned later in the semester, to some extent due to timetabling clashes. The online submission of assessments eliminated the more time-consuming task of managing and marking written scripts, enabling grading and meaningful feedback to students wholly online. The CC observed, however, that a high proportion of students still completed the online submissions.
The aspects of their learning experience regarded positively by students (access to information and resources, assessment and feedback, and knowledge construction), together with the positive associations found between student confidence and all five elements of MISL, between student engagement and all five elements, and between student motivation and three of the elements (access to information, support and motivation, and knowledge construction), provide practical guidance for the CC and others with an interest in improving course design.
The second and third elements of MISL, ‘support and motivation’ and ‘participation and collaboration’, were identified from the data as two focus areas for design improvement. Consequently, the in-class activities are being re-designed and students will be placed in smaller groups (from the current groups of 15 students to groups of 8), where they can interact and discuss problems more effectively. Furthermore, the CC is seeking strategies to encourage stronger participation by students in the LMS discussion forums, thereby increasing their collaboration and the support they receive. In addition, the CC has arranged for support from the university teaching and learning centre in the preparation of additional pre-recorded lectures, again with the intention of increasing support and motivation for students.
6. Conclusion
In our study we aimed to examine the impact of a flipped approach on aspects of the student learning experience in a university biology course. In addition, we aimed to identify aspects of the course design that were associated with student confidence, engagement and motivation. It was intended that these insights would enable the CC to identify which aspects of the course design to target for improvement in the next iteration of the course.
The flipped classroom approach has been given much attention in Australian higher education and beyond. Consistent with the
findings of previous researchers (e.g. Morevec, Williams et al., 2010; Heybourne & Perett, 2016; Petersen, 2016), we have found positive signs that a flipped classroom approach can enhance the student learning experience and outcomes. However, we must be cautiously realistic about the extent of our claims concerning the success of the innovation in this study, and about the validity of our research findings.
We make no apology for the specificity of our study, because evaluations of curriculum improvement are always constrained by the specifics of time and place. Nevertheless, the theoretical work of other scholars in this field has been a useful resource for planning and design, and we expect that our study will provide something of value for future researchers. Our study was undertaken with a small sample of participants, and our evaluation measures were taken very early in the life of the new curriculum innovation. We had intended the research to be a full-fledged action research study, but since it covered only the first cycle (reflect, plan, act and evaluate), it might better be described as a pilot study.
Methodologically, we followed the objectivist convention of seeking to establish causal relations between variables and factors identified in the literature. However, we take satisfaction from the value of including limited qualitative data in our study, which arguably has enabled us to sketch tentatively the dual perspectives of the students and the CC on ‘what happened’ with the flipped classroom innovation. We recommend that future researchers consider the potential of this inclusion in the evaluation of curriculum design initiatives.
Finally, we acknowledge again our reliance on Awidi's (2006a,b) model for improving student learning (MISL), whose five elements enabled us to identify benefits of the flipped approach for students and positive relationships between the elements and student confidence, engagement and motivation. We therefore close with the optimistic prospect that our research will be a useful stepping stone for CCs pursuing further iterations of the flipped approach in their course designs.
Acknowledgement
Our thanks go to the Human Research Ethics Office of UWA for granting approval for the study (RA/4/1/9122), and to the Unit Coordinator and students of the Evolutionary Processes course who participated in the study. Thanks also to all staff of the Centre for Education Futures at UWA who provided professional support during the research period.
Appendix A
Table A
Chi-square (χ2) likelihood ratio (n = 50)
Table B
One-way ANOVA (n = 50)

                                        Confidence          Motivation          Engagement
MISL element                            F        p          F        p          F        p
Access to information and resources     12.894   0.000      6.477    0.003      13.955   0.000
Support and motivation                  7.155    0.002      4.132    0.022      11.364   0.000
Participation and collaboration         4.409    0.018      0.999    0.376      8.414    0.001
Assessment and feedback                 5.037    0.010      0.964    0.389      3.320    0.045
Knowledge construction                  14.436   0.000      10.887   0.000      28.637   0.000
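The F values in Table B come from one-way ANOVAs comparing satisfaction scores across learner groups (computed in the study with IBM SPSS). As an illustration only, the sketch below computes the one-way ANOVA F statistic from scratch; the groups and scores are hypothetical, not study data.

```python
# Illustrative one-way ANOVA F statistic for k independent groups.
def one_way_anova_f(groups):
    """Return (F, df_between, df_within) for a list of sample groups."""
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # between-group and within-group sums of squares
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# Hypothetical satisfaction scores: not confident / neutral / confident
f, dfb, dfw = one_way_anova_f([[2, 3, 2, 3], [3, 4, 3, 4], [4, 5, 5, 4]])
```

Comparing F against the F distribution with (df_between, df_within) degrees of freedom gives the p values shown alongside each F in Table B.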
Table C
Tukey's Honestly Significant Difference (HSD) post hoc multiple comparisons (p values)

                                        Confidence      Motivation      Engagement
Access to information and resources
  Not vs. neutral                       0.093           0.009*          0.986
  Not vs. high                          0.000*          0.004*          0.000*
  Neutral vs. high                      0.007*          0.999           0.000*
Support and motivation
  Not vs. neutral                       0.093           0.071           0.146
  Not vs. high                          0.002*          0.018*          0.000*
  Neutral vs. high                      0.159           0.927           0.044*
Participation and collaboration
  Not vs. neutral                       0.306           0.800           0.622
  Not vs. high                          0.019*          0.369           0.002*
  Neutral vs. high                      0.231           0.727           0.023*
Assessment and feedback
  Not vs. neutral                       0.012*          0.607           0.569
  Not vs. high                          0.014*          0.355           0.046*
  Neutral vs. high                      0.943           0.923           0.373
Knowledge construction
  Not vs. neutral                       0.004*          0.043*          0.002*
  Not vs. high                          0.000*          0.000*          0.000*
  Neutral vs. high                      0.076           0.093           0.004*

Note: * The mean difference is significant at the 0.05 level. ‘Not’, ‘neutral’ and ‘high’ denote the not confident/neutral/confident, not motivated/neutral/motivated and not engaged/neutral/engaged groups respectively.
Table D
Students' perceptions of their learning experience in their open responses – satisfied, categorised (n = 23)

Pre-class activities:
- Good way to solidify understanding of concepts (3)
- Great way to learn and understand content better (4)
- Provides students the opportunity to fully reflect on and understand the subject in the class discussions (6)
- Pre-recorded and flipped lectures were really good (3)
- Fantastic and prepares students well for the in-class discussions (5)

In-class activities and group discussions:
- A good way to interact with other students
- Provided the opportunity to contribute and fully understand in-class discussions
- Lecturers were enthusiastic about teaching, which gave students motivation to want to learn
- Recording of flipped lectures would help enormously (13)
- More inclined to watch the lecture before in-class activities (2)
- Helps to get the opinions of others about the subject (5)

Online activities:
- Online quizzes are okay, not really bad (3)
- Really liked the online components (quizzes, online report and essay) (4)
- Loved the online quizzes; having online quizzes instead of a mid-semester assessment was better (5)
- The unit and labs were useful in students' understanding of what was happening
- Content provided good practice for online quizzes and later exams
- Online lab report should be complemented with the writing of an actual report
- Lab assignment was nicely laid out
- Great to introduce the flipped lectures and the online quizzes

Overall experience:
- Unit was a great idea and a good learning experience (7); a nice and fun unit
- Learnt better about the processes and methods of the unit, rather than memorising for a temporary amount of time (2)
- Overall, really enjoyed the flipped classroom sessions (12); the flipped classroom worked very well in the course (3)
- Flipped class is one of the better methods of teaching (2); the most organised unit, and refreshing
- They help to solidify ideas (5) and enhance understanding of the subjects (3)
- Flipped lectures are a great way to allow students more time to interact with their lecturers and peers
- It was a good way of answering any questions about the unit (3)
- Great concept that helps student understanding of the subject; great way to learn and understand the content better
Table E
Design challenges perceived by students in their open responses in the questionnaire – dissatisfied, categorised (n = 23)

Pre-class activities:
- Not properly recorded
- Recorded lectures difficult to engage with
- None of the pre-recordings seemed relevant to what was being discussed in the groups
- Felt underprepared to participate in class activities

In-class activities:
- Sessions not recorded (7)
- Class discussions not uploaded on the LMS (Blackboard)
- In-class activity did not always line up with the lecture recording
- Issues not properly explained; questions not properly answered
- Inadequate support from lecturers (3) to reflect on questions
- Inadequate explanation to understand calculations and answers
- Expectation that the lecturer would go through questions at the end of each flipped lecture session was not met (2)

Post-class activities:
- Difficulty in the writing of essays
- Renting a tutor for support
- Relying on peers and others for support

Online quizzes and lab:
- Online report should be complemented with an actual report
- Preferred to read and write on paper over screens (2)
- Online lab report was a new and strange experience
- Online lab assignment was very unreliable
- Insufficient time to complete online quizzes

Group discussion:
- Felt like a tutorial (2); more tutors needed
- Large group sizes (working with 30–40 people in the class)
- Inadequate/lack of support; attention focused on one side of the class
- Poor coordination of discussions (not planned)
- Not very helpful in learning (2)
- Feelings of frustration in group discussion and flipped lecture
- Lack of interest in the unit took away the ability to fully engage with and enjoy the unit

Learning resources and time allocation:
- Lecture notes brief and insufficient
- Lecture notes short and difficult to understand
- Insufficient time to go over topics with groups
- Insufficient time to complete the online quiz
- Extra time needed to discuss flipped lecture questions with group and lecturer
- Difficult to navigate the LMS; problems with saving an assignment document
- Had to rewrite large parts of an assignment that had previously been saved before submission dates

Overall perception:
- Really didn't work well; it did not translate (2)
- Timetable clashes (5); students unable to make it physically (3)
- Specific needs to have an individual effective learning experience
- Flipped classroom not integrated into the discussion groups
- Not a big fan of flipped lectures
- Difficulty in writing essays; comprehension of subject
- Unit was not a good learning experience; unit was difficult; concepts could be taught without maths equations
Interview 1
Participants: The course coordinator (CC) and research team. Time allocated: 45 minutes.
Purpose: To interview the CC about the course re-design that had been implemented recently.
Questions: Response/Comments
1 Can you please say your name, role within the university and faculty?
2 What change are you implementing this semester?
3 What motivated your need for change?
4 What do you hope to achieve with this change?
5 How do you expect students to respond to the change?
6 When is the change scheduled to take place?
7 Do you perceive any obstacles or barriers to implementing the change? If so, what are they?
8 How do you plan to overcome possible obstacles?
9 What support can we offer you in collecting student data for this research project?
10 Do you have any questions?
11 Is there anything else you wish to add?
Interview 2
Participants: The course coordinator (CC) and research team. Time allocated: 45 minutes.
Purpose: This interview aims to record the response of the CC to the course re-design as well as his/her response to the student
data summarised from the questionnaire. The second part of the interview is about the CC's plans for any further action in response to
this information.
Process: Before the formal recording begins, time will be made available for the CC and researcher to explore and discuss all the data that have been prepared.
These data will include the transcript of the initial interview (providing the purpose and nature of the innovation as well as how it was implemented), a summary of the student responses prepared from the online questionnaire using IBM SPSS Ver. 23, and a draft case study summary.
It is intended that the discussion of the data will lead to an agreed interpretation by both the CC and the researchers. Once this
point has been reached then a more formal aspect of the interview will commence.
Questions: Response/
Comments
1 Please describe your own response to the re-design that you implemented in your unit.
2 How do you interpret the responses of the students to your re-design?
3 What are your plans in response to this information - next time the unit runs?
References
Abeysekera, L., & Dawson, P. (2015). Motivation and cognitive load in the flipped classroom: Definition, rationale and a call for research. Higher Education Research and
Development, 34(1), 1–14. https://doi.org/10.1080/07294360.2014.934336.
Awidi, I. T. (2006a). ICT-enabled flexible learning systems for higher education in Ghana (Master of Science thesis, Technology Application in Education and Training Research). Enschede, The Netherlands: University of Twente.
Awidi, I. T. (2006b). Elements for improving students' learning experience in a digital environment. Workshop presentation on models for improving learning and learning design in e-learning implementation, University of Twente (November 2005); Universities of Ghana (May/June 2006).
Bates, A. W. (2015). Teaching in a digital age: Guidelines for designing teaching and learning for a digital age. Retrieved from https://opentextbc.ca/teachinginadigitalage/.
Bates, J. E., Almekdash, H., & Gilchrest-Dunnam, M. J. (2017). The flipped classroom: A brief, brief history. In The flipped college classroom. Springer.
Bhagat, K. K., Chang, C.-N., & Chang, C.-Y. (2016). The impact of the flipped classroom on mathematics concept learning in high school. Educational Technology & Society,
19(3), 134–142.
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32(3), 347–364. https://doi.org/10.1007/BF00138871.
Bishop, J. L., & Verleger, M. A. (2013). The flipped classroom: A survey of the research. Paper presented at the 120th ASEE Annual Conference and Exposition, Atlanta.
Block, R. A., Hancock, P. A., & Zakay, D. (2010). How cognitive load affects duration judgments: A meta-analytic review. Acta Psychologica, 134, 330–343.
Bossaer, J. B., Panus, P., Stewart, D. W., Hagemeier, N. E., & George, J. (2016). Student performance in a pharmacotherapy oncology module before and after flipping
the classroom. American Journal of Pharmaceutical Education, 80(2), 31.
Cavanagh, M. (2011). Students' experiences of active engagement through cooperative learning activities in lectures. Active Learning in Higher Education, 12(1), 23–33.
Caviglia‐Harris, J. (2016). Flipping the undergraduate economics classroom: Using online videos to enhance teaching and learning. Southern Economic Journal, 83(1),
321–331.
Chiang, T. H.-C. (2017). Analysis of learning behavior in a flipped programing classroom adopting problem-solving strategies. Interactive Learning Environments, 1–14.
Chuang, H., Weng, C., & Chen, C. (2018). Which students benefit most from a flipped classroom approach to language learning? British Journal of Educational
Technology, 49, 56–68. https://doi.org/10.1111/bjet.12530.
Connell, G. L., Donovan, D. A., & Chambers, T. G. (2016). Increasing the use of student-centered pedagogies from moderate to high improves student learning and
attitudes about biology. CBE—Life Sciences Education, 15(1), ar3.
Crotty, M. (1998). Foundations of social research. Australia: Allen and Unwin.
Davenport, C. E. (2018). Evolution in student perceptions of a flipped classroom in a computer programming course. Journal of College Science Teaching, 47(4), 30–35.
Day, L. J. (2018). A gross anatomy flipped classroom effects performance, retention, and higher-level thinking in lower performing students. Anatomical Sciences
Education. https://doi.org/10.1002/ase.1772.
DeVellis, R. F. (2012). Scale development: Theory and applications (Vol. 26). New York, NY: SAGE Publications.
Harasim, L. (2007). Assessing online collaborative learning: A theory, methodology. Flexible learning in an information society.
Heyborne, W. H., & Perrett, J. J. (2016). To flip or not to flip? Analysis of a flipped classroom pedagogy in a general biology course. Journal of College Science Teaching,
45(4), 31.
Hsieh, J. S. C., Wu, W.-C. V., & Marek, M. W. (2017). Using the flipped classroom to enhance EFL learning. Computer Assisted Language Learning, 30(1–2), 1–21.
Kim, M. K., Kim, S. M., Khera, O., & Getman, J. (2014). The experience of three flipped classrooms in an urban university: An exploration of design principles. The
Internet and Higher Education, 22, 37–50.
Kolb, A. Y., & Kolb, D. A. (2005). Learning styles and learning spaces: Enhancing experiential learning in higher education. The Academy of Management Learning and
Education, 4(2), 193–212.
Kong, S. C. (2014). Developing information literacy and critical thinking skills through domain knowledge learning in digital classrooms: An experience of practicing
flipped classroom strategy. Computers & Education, 78, 160–173.
Lai, C., & Hwang, G. (2016). A self-regulated flipped classroom approach to improving students' learning performance in a mathematics course. Computers & Education,
126–140.
Long, T., Logan, J., & Waugh, M. (2016). Students' perceptions of the value of using videos as a pre-class learning experience in the flipped classroom. TechTrends,
60(3), 245–254.
McLaughlin, J. E., Roth, M. T., Glatt, D. M., Gharkholonarehe, N., Davidson, C. A., Griffin, L. M., et al. (2014). The flipped classroom: A course redesign to foster
learning and engagement in a health professions school. Academic Medicine, 89(2), 236–243.
McLaughlin, J. E., White, P. J., Khanova, J., & Yuriev, E. (2016). Flipped classroom implementation: A case report of two higher education institutions in the United
States and Australia. Computers in the Schools, 33(1), 24–37. https://doi.org/10.1080/07380569.2016.1137734.
McNally, B., Chipperfield, J., Dorsett, P., Del Fabbro, L., Frommolt, V., Goetz, S., et al. (2017). Flipped classroom experiences: Student preferences and flip strategy in a
higher education context. Higher Education, 73(2), 281–298. https://doi.org/10.1007/s10734-016-0014-z.
Moravec, M., Williams, A., Aguilar-Roca, N., & O'Dowd, D. K. (2010). Learn before lecture: A strategy that improves learning outcomes in a large introductory biology
class. CBE-life Sciences Education, 9(4), 473–481.
Peterson, D. J. (2016). The flipped classroom improves student achievement and course satisfaction in a statistics course: A quasi-experimental study. Teaching of
Psychology, 43(1), 10–15.
Pierce, R., & Fox, J. (2012). Vodcasts and active-learning exercises in a “flipped classroom” model of a renal pharmacotherapy module. American Journal of
Pharmaceutical Education, 76(10), 196.
Seymour-Rolls, K., & Hughes, I. (2000). Participatory action research: Getting the job done. Action Research e-reports, 4.
Thai, N. T. T., De Wever, B., & Valcke, M. (2017). The impact of a flipped classroom design on learning performance in higher education: Looking for the best “blend”
of lectures and guiding questions with feedback. Computers & Education, 107, 113–126. https://doi.org/10.1016/j.compedu.2017.01.003.
Wanner, T., & Palmer, E. (2015). Personalising learning: Exploring student and teacher perceptions about flexible learning and assessment in a flipped university
course. Computers & Education, 354–369.
Willis, J. W. (2007). Foundations of qualitative research: Interpretive and critical approaches. Sage.
Yilmaz, R. (2017). Exploring the role of e-learning readiness on student satisfaction and motivation in flipped classroom. Computers in Human Behavior, 70,
251–260.