
Chapter II

REVIEW OF LITERATURE

2.1 Introduction

The purpose of a literature review is to place each work in the context of its contribution to understanding the research problem being studied and to describe the relationship of each work to the others under consideration.

2.2 Studies Conducted in School-Based Assessment

According to Broadbent and Poon (2015), the pervasive integration of digital technology into higher education (HE) influences both teaching and learning practices, and allows access to data, mainly available from online learning environments, that can be used to improve students' learning. Online learning, which facilitates asynchronous and synchronous interaction and communication within a virtual environment, has "succeeded in becoming an integral part of HE, and now it needs to turn its focus, from providing access to university education, to increasing its quality". To this end, HE institutions are implementing Learning Analytics (LA) systems to better understand and support student learning. Their study presents a literature review with the objective of mapping the current landscape of contemporary LA research in HE.

Schumacher and Ifenthaler (2018) note that there is an evolving interest in LA among not only practitioners but also researchers in Technology-Enhanced Learning (TEL). LA is emerging as a fast-growing, multi-disciplinary area of TEL that forms its own domain (Strang, 2016). In LA, information about learners and learning environments is used to "access, elicit, and analyse them for modelling, prediction, and optimization of learning processes".

Vengroff and Bourbeau (2006) compared the achievement and participation of students taking an introductory comparative politics class in the traditional format, completely online, and in a hybrid format. The online and hybrid courses utilized the same materials as the traditional format for consistency, with the online and hybrid courses implemented via an online platform. Student performance was measured with examinations, short research papers, and discussions (online and in class). The results indicate that exam scores in the online and traditional formats were not significantly different, but that traditional students did considerably better on the research papers than online students. However, students in the hybrid classes did significantly better on both examinations and research papers. It should be noted that the hybrid classes consisted of honors students, which potentially skewed the results. However, the authors indicated that subsequent testing of hybrid classes with regular students yielded similar results. The traditional classes in this investigation, unlike traditional courses in other studies, had access to online materials to complement the lectures.

Oh and Park (2009) describe one relatively new method of instruction at the tertiary level that helps to minimize the drawbacks of a purely online environment: learning analytics, also known as blended online learning. Learning analytics is an educational approach that combines the evaluation and interpretation of results in an elaborative manner. It was initially designed to ease students into the online transition and overcome anxieties related to the online learning environment. In learning analytics, instructors utilize technologies, specifically online instructional components, to replace and augment portions of classroom instruction. One of the biggest drawbacks is the lack of personal interaction and face-to-face communication that exists in a traditional classroom. In one study, students enrolled in an online course noted that, with the elimination of face-to-face time with their teacher and peers, the learning usually gained from assimilating information and interacting with other students and faculty was diminished.

Sun, Joy, and Griffiths (2007) presented a new approach to incorporating learning-style theory into the development of an adaptive e-learning system, improving its adaptation to education systems. Following this approach, in the hybrid e-learning model developed in their study, adaptive e-learning was realized by providing customized e-learning content tailored to learners' individual learning styles.

In a related study, Collopy and Arnold (2009) examined student learning in hybrid and completely online formats for teacher education students. Surveys were administered to determine how well the students learned the content and to gauge their overall satisfaction with the course. The results illustrated that participants showed greater learning gains in both hybrid classes than in the completely online class. In addition, although the two hybrid classes varied in the amount of traditional class time, they reported similar levels of content learned, which emphasizes the importance of the classroom experience.

2.3 Studies Conducted in Learning Analytics

Ghavifekr, Afshari, and Amla Salleh (2012) stated that in the 21st century, technology is an important issue in many fields, including education, because technology has become the knowledge-transfer highway in most countries. Technology integration has gone through successive innovations and has transformed our societies, totally changing the way people think, work, and live. As part of this, schools and other educational institutions, which are supposed to prepare students to live in "a knowledge society", need to consider ICT integration in their curricula.


Arnseth and Hatlevik (2012) argued that the integration of Information and Communication Technology (ICT) in education refers to the use of computer-based communication incorporated into the daily classroom instructional process. In conjunction with preparing students for the current digital era, teachers are seen as the key players in using ICT in their daily classrooms, owing to ICT's capability to provide a dynamic and proactive teaching-learning environment. While the aim of ICT integration is to improve and increase the quality, accessibility, and cost-efficiency of the delivery of instruction to students, it also encompasses the benefits of networking learning communities to face the challenges of current globalization. The adoption of ICT is not a single step but an ongoing, continuous process that fully supports teaching, learning, and information resources.

In a qualitative case study, Berry (2018) drew on interviews with 13 faculty members in an online doctoral program to find out how professional development offerings strengthened distance instructors' technical, pedagogical, and content knowledge. Findings suggest that guided practice sessions in the virtual classroom strengthened newer faculty members' technical knowledge. Biweekly meetings evolved into a community of practice where newer and more experienced faculty could build content knowledge. Faculty in the distance program desired more professional development in the area of online pedagogy.

According to the Malaysian Examinations Syndicate (2012), the assessment component in the SBA system consists of two categories, namely academic and non-academic. For the academic category, SBA assesses theoretical knowledge according to content knowledge in multiple ways, such as quizzes, tests, and assignments; the non-academic category focuses on using multiple approaches to assess students' generic skills, including communication skills, technical skills, and team skills. Moreover, SBA involves physical, emotional, spiritual, and intellectual aspects, in contrast to the existing assessment, which is more focused on academic achievement. In SBA, teachers are the main personnel implementing the continuous assessment process during teaching and learning in the classroom. The system has been implemented in all subjects, starting with Form One students in 2012.

Purvin (2011) suggests that the biggest challenge of SBA implementation is dealing with teachers' and students' perceptions: in Bangladesh, students were skeptical about whether assignments given by their teachers contributed to their learning and final grade.

In Hong Kong, Qian (2014) studied a new SBA implementation in the English language; the findings indicated that SBA was beneficial to students but that there was much room for improvement in terms of students' and teachers' perceptions. Similarly, in a study by Yip, the results indicated that the implementation lacked materials and training; thus, the fairness of SBA was the main concern of teachers. Similar findings, in which fairness of SBA implementation was the main concern, have been reported elsewhere. Also, in South Africa, SBA serves as a platform for students preparing for the final external examination.

Despite these discouraging findings, a qualitative study of an English subject by Tong (2011) revealed encouraging results. This study focused on students' views and experiences with regard to the SBA system implementation. The results indicated that students were happy with a multifaceted approach to the concept of assessment and perceived the feedback given during the assessment process as a source of motivation for learning.

It is important to note that there will always be new challenges and resistance when implementing a new system. According to Cheung (2000), a successful new system implementation must follow a proper process of educational change and show concern for the individuals and organisations involved. In this case, several stages must be considered when adopting SBA into the existing system; for instance, the modified five-stage Stages of Change model would be useful for at least avoiding the failure of a new system implementation, and thus saving resources (money, effort, time, etc.).

The SBA system stands in contrast to the examination-oriented system, in which students were stressed due to negative perceptions of assessment (Brown and Wang, 2011); the main cause of stress for students is the requirement to sit a national examination, which determines their next education level. Furthermore, a successful SBA implementation and education transformation requires a thorough understanding of the students in the schools; there are several reasons for the failure of transformation, including failure to understand that each school is unique and different. Based on these premises, it is important to assess students' perception of, and readiness for, the SBA system implementation.

One of the latest reviews (Ferguson & Clow, 2017) explored the evidence of whether LA improve learning practice in HE based on four propositions: (1) LA improve learning outcomes, (2) LA support learning and teaching, (3) LA are deployed widely, and (4) LA are used ethically. Based on these propositions, the authors pinpointed that many studies did not contain strong evidence for or against any of them. Most of the evidence, based on the analysis of 28 papers, relates to the proposition that LA improve learning support and teaching, comprising retention, completion, and progression, and has been categorised as evidence that LA improve teaching in universities. The weaknesses of the present research include a lack of geographical spread, gaps in knowledge (e.g., in terms of informal learning and a lack of negative evidence), little evaluation of commercially available tools, and little attention to ethics. Some other studies examined both LA and EDM research in HE, confirming several of the findings presented above.

One of the earlier reviews (Ferguson, 2012) investigates the technological, educational and

political factors driving the development of LA in education. This review elaborates on the
discussion about the relationships between LA, EDM, and academic analytics. Dawson,

Gašević, Siemens, and Joksimovic (2014) performed a citation network analysis to identify

the emergence of trends and disciplinary hierarchies that influence the field's development.

The results show that the most commonly cited papers are conceptual and review-based,

which implies a need for scholars to define a space for LA.

With the establishment of TEL, a new research field, called Learning Analytics, is emerging (Elias, 2011). This research field borrows and synthesizes techniques from different related fields, such as Educational Data Mining (EDM), Academic Analytics, Social Network Analysis, and Business Intelligence (BI), harnessing them to convert educational data into useful information and thereby to motivate actions, such as self-reflection on one's previous teaching or learning activities, in order to foster improved teaching and learning. The main goal of BI is to turn enterprise data into useful information for management decision support. Learning Analytics, Academic Analytics, and EDM, however, focus more specifically on tools and methods for exploring data coming from educational contexts. While Academic Analytics takes a university-wide perspective, including, for example, organizational and financial issues, Learning Analytics and EDM focus specifically on data about teaching and learning.

Siemens (2010) defines Learning Analytics as "the use of intelligent data, learner-produced data, and analysis models to discover information and social connections, and to predict and advise on learning." It can support teachers and students in taking action based on the evaluation of educational data. However, the technology to deliver this potential is still very young, and research on understanding the pedagogical usefulness of Learning Analytics is still in its infancy.

Many teachers are motivated to evaluate their courses and already have research questions related to their teaching in mind. For example, a teacher who offers weekly online exercises intends to help her students prepare for an exam, but she is not sure whether the currently available exercises are helpful enough for this purpose. The teacher would therefore like to know whether students who practice with her online exercises on a continual basis perform better in the final exam than students who do not use them. A Learning Analytics toolkit could help her investigate this hypothesis by automatically collecting, analyzing, and visualizing the right data in an appropriate way. Yet most monitoring and reporting tools found in current VLEs are designed to collect, analyze, and visualize data in a static tabular form predefined by system developers. Teachers face the difficulty that appropriate and usable Learning Analytics tools that help them answer their individual questions continuously and efficiently are missing in prevalent VLEs, since most of the work in the area of Learning Analytics is conceptual (Johnson et al., 2011b).

Glahn (2009) argued that such tools only indicate certain facts about the usage and properties of the learning environment and try to visualize them appropriately. We therefore revert to the concept of indicators, which can be described as specific calculators with corresponding visualizations, tied to a specific question. For example, if the teacher's question is "Do students who practice with online exercises on a continual basis perform better in the final exam than students who do not use them?", the corresponding indicator could show a chart that quickly facilitates a visual data comparison. Indicator concepts have been used before; Glahn, for example, introduced the concept of smart indicators, which he defined as "a context aware indicator system, which dynamically aligns data sources, data aggregation, and data presentation to the current context of a learner". However, in our case the target group differs: the eLAT indicators collect and visualize student data in order to present it to teachers.
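
To make the idea of such an indicator concrete, the following is a minimal sketch of the underlying computation, assuming a hypothetical gradebook export; the column names, sample values, and the practice threshold are illustrative assumptions, not part of eLAT or any cited system:

    import pandas as pd

    # Hypothetical gradebook export: one row per student, with the number
    # of weekly online exercises completed and the final exam score (0-100).
    records = pd.DataFrame({
        "student_id":       [1, 2, 3, 4, 5, 6],
        "exercises_done":   [10, 2, 9, 0, 8, 1],
        "final_exam_score": [82, 61, 78, 55, 88, 64],
    })

    # Label students as "continual" practicers if they completed at least
    # 8 of the 10 weekly exercises (an illustrative threshold).
    records["group"] = records["exercises_done"].apply(
        lambda n: "continual" if n >= 8 else "occasional/none"
    )

    # The indicator itself: mean final exam score per group, which a toolkit
    # would then render as a bar chart for the teacher.
    indicator = records.groupby("group")["final_exam_score"].agg(["mean", "count"])
    print(indicator)

In a toolkit such as eLAT, the data collection and chart rendering would happen automatically and continuously; the sketch only illustrates the kind of calculation an indicator encapsulates behind its visualization.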

As a means to gauge stakeholder expectations of a possible service, Oster, Lonn, Pistilli, and Brown (2016) have encouraged the use of psychometric instruments during different stages of implementation. Within the context of LA, a measure is available to assess an institution's readiness for LA, but no preexisting scale is available to gauge student expectations of LA services. Even though Arnold used a survey to understand student perceptions of data handling, the reported findings can be questioned on the basis of using an ad hoc scale. Schumacher and Ifenthaler do, however, present an exploration of expected LA dashboard features from the perspective of students. Although these authors ground their work in expectations, the distinction between expectations and perceptions is not completely conceptualized. As a great majority of the student population is unlikely to have experienced institutional LA services, measures of experience are not always appropriate, particularly given that the majority of students are not acquainted with LA services. Expectations, however, can be measured prior to implementation and are an important determinant in the acceptance of systems.

As indicated above, although systematically gathering university students' expectations about LA is of paramount importance for the success of the service (Davis & Venkatesh, 2014), little has been done in this regard and no adequate tool is yet available. In the present research, we have attempted to close this gap by developing and validating a descriptive questionnaire to collect students' expectations of LA services. Throughout the development of this instrument, the accessibility and comprehensibility of the items from the student perspective were always considered. Put differently, although students are largely unaware of LA services, the phrasing of each item had to balance providing an institution with an informative understanding of what students expect against being general enough for all students to understand. In doing so, the university can identify particular areas of focus for its LA implementation, which can then inform direct engagement strategies with its students.


Measuring student expectations of LA services is a fundamental step towards the success of future implementations (Arnold & Sclater, 2017; Schumacher & Ifenthaler, 2018). Although others have offered solutions, the use of inconsistent terminology, limited scope, and methodological limitations leaves a lot to be desired. Using the identified expectation themes (ethics and privacy, agency, intervention, and meaningfulness) and expectation types (ideal and predicted), we aim to develop and validate a descriptive questionnaire that offers a robust and methodologically sound solution to measuring student expectations of LA services. An overview of the steps taken in the current work is presented in Figure 1. This figure provides a breakdown of each of the three studies undertaken, a description of how the items were generated or how the data were analysed, the number of items retained or dropped, and how many responses were collected at each stage. Furthermore, to illustrate the utility of the instrument in measuring students' expectations of LA services, we present a brief overview of how beliefs towards certain features vary in accordance with the two expectation types (ideal and predicted). It is anticipated that being able to gauge and measure student expectations of potential LA services will promote further engagement with these stakeholders in the implementation process, with a view to understanding the specific requirements of the student population.
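
As a rough illustration of how responses across the two expectation types could be summarised per theme, the following is a minimal sketch; the four themes come from the text above, but the response values, item groupings, and the 7-point scale are hypothetical assumptions, not taken from the actual instrument:

    from statistics import mean

    # Hypothetical Likert responses (1-7) from one student, keyed by
    # expectation theme; each theme is rated for what the student ideally
    # wants ("ideal") and for what they believe will happen ("predicted").
    responses = {
        "ethics_privacy": {"ideal": [7, 6, 7], "predicted": [5, 4, 5]},
        "agency":         {"ideal": [6, 6, 5], "predicted": [4, 4, 3]},
        "intervention":   {"ideal": [5, 4, 5], "predicted": [4, 4, 4]},
        "meaningfulness": {"ideal": [7, 7, 6], "predicted": [5, 5, 4]},
    }

    # Summarise each theme: mean ideal score, mean predicted score, and the
    # gap between them; a large gap flags features to prioritise or manage.
    for theme, scores in responses.items():
        ideal, predicted = mean(scores["ideal"]), mean(scores["predicted"])
        print(f"{theme:15s} ideal={ideal:.2f} predicted={predicted:.2f} "
              f"gap={ideal - predicted:+.2f}")

The gap between the two expectation types is what makes the ideal/predicted distinction actionable: it separates what students would like from what they realistically expect an institution to deliver.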

Ideal and predicted expectations have both been examined in earlier research (Brown et al., 2012; Brown et al.; Davis & Venkatesh, 2014). In the case of Bowling et al., the researchers explored patients' ideal and predicted expectations because doing so allowed for both an upper and a lower reference point with regard to knowing what service elements to focus on. Put differently, the responses present an idealized perspective of a service as well as a realistic profile of what users believe is most likely. This approach would be advantageous for LA service implementation decisions, as it can differentiate between what features students would like and what should be a priority (i.e., what is realistically expected). In addition to providing a deeper understanding of stakeholder perspectives, both research streams have shown that failure to gauge user expectations can lead to dissatisfaction and low adoption of the implemented service. Thus, by measuring stakeholder expectations towards a service early in the implementation process, the provider can proactively identify main areas of focus and manage expectations.

These student views resonate with the concerns about the obligation to act raised by Prinsloo and Slade (2017). Within their discussion of this topic, Prinsloo and Slade state that the analysis of student data should be guided by a view of providing improved support, but at no point should it undermine the students' own responsibility to learn. This view has further been captured in concerns that intervention-centric LA services create a culture of passivity. Put differently, LA services that are designed to intervene when students are struggling ignore students' ability to be self-directed learners who continually evaluate their progress to set goals. The importance of viewing students as active agents in their own learning should be a central tenet of LA services. Therefore, institutions should be considerate of this and not implement LA services that remove students' ability to make their own decisions based on the data.

Introducing new forms of feedback as a result of implementing LA services should, theoretically, promote positive changes in student behaviour, such as motivating learning (Park & Jo, 2015; Verbert et al., 2013). However, if meaningful inferences about learning progress cannot be drawn from the information received through LA services (i.e., how visual representations of performance relate to personal learning goals), then that information is unlikely to be incorporated into any decisions made. An example of information that was found not to be meaningful for students was the provision of login metrics in Park and Jo's (2015) LA dashboard, which was perceived as unhelpful for the purposes of reflecting upon their learning. In other words, although resource-use metrics continue to be used in LA service implementations, their utility, from the perspective of students, can be questioned. It has been shown that usefulness expectations are an important determinant in the future success of a technology. This is also true of LA services, where beliefs towards the utility of certain features (e.g., visualizations and the level of detail provided) affect adoption rates; together, this reinforces the importance of gauging what stakeholders in a service want, with a focus on the type of information and its relevance to learning.

Learning analytics is often understood as 'the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs' (Dawson, Heathcote, & Poole, 2010; van Barneveld, Arnold, & Campbell, 2012). While emerging learning analytics practices hold some promise to improve higher education, they are morally complicated and raise ethical questions, especially around student privacy. Since learning analytics often rely on aggregating significant amounts of sensitive and personal student data from a complex network of information flows, an important question arises as to whether students have a right to limit data analysis practices and express their privacy preferences as a means of controlling their personal data and information.

Long and Siemens (2011) suggest that new pathways for higher education policy and the learning sciences are opening up due to the growth of interconnected databases in data warehouses. Many learning analytics advocates believe that capturing, archiving, and analyzing student profiles and behaviors will lead to improved institutional decision making, advancements in learning outcomes for at-risk students, greater trust in institutions due to the disclosure of data, and significant evolutions in pedagogy, among other things. To support these ends, universities are actively aggregating student data to support an array of learning analytics initiatives.

According to Norris (2011), the most common application of learning analytics technology is in the context of an institution's learning management system (LMS). LMSs are traditionally used to support online or hybrid teaching environments, within which students interact with various learning objects and work collaboratively. For example, students take quizzes; submit assignments; read assigned materials, such as journal articles and other electronic texts (eTexts or eBooks); and interact with their peers in discussion forums and wikis. Learning analytics systems capture student behaviors, commonly referred to as the 'digital breadcrumbs' students leave within LMSs as they navigate and interact with their peers and the digital space. In the recent past, exporting LMS data for analysis was a 'slow and cumbersome' process, but it is increasingly the case that common LMS systems include data extraction tools alongside their analytic products.

Brown, Dehoney, and Millichamp (2015) note that the analytics can descriptively detail the date, time, and duration of students' digital movements, including if, when, and for how long they read an electronic text (e.g., an eBook or PDF article) or took an online quiz. Other statistics detail a student's overall completion rate for a course and whether a student is predicted to succeed in the course, and map the strength of a student's peer-to-peer and peer-to-instructor network using social network analysis. LMSs embedded with learning analytics tools use data visualization techniques to create information dashboards from which instructors can infer how to intervene in a student's education, while other systems allow students themselves to monitor their own progress using similar dashboards. Some systems automatically intervene with algorithms, which send status updates or e-mails to students and instructors alike, notifying both parties of potential problems. LMS-based learning analytics are informed by student data from other campus systems, including commonly used student information systems (SISs). SISs hold the majority of the information students disclose on their applications for admission, their enrollment records, and their academic history. Over time, these digital records may be augmented with other information, including financial aid awards, involvement on campus, disciplinary and criminal reports, and personal health information.
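
As a rough sketch of how such descriptive detail can be derived from LMS clickstream data, consider the following; the event-log format, field names, and the open/close pairing rule are hypothetical assumptions rather than the interface of any particular LMS:

    from datetime import datetime

    # Hypothetical LMS clickstream: (student_id, action, timestamp) tuples,
    # where an "open"/"close" pair brackets one reading session of an eText.
    events = [
        ("s01", "open_etext",  "2015-03-02 10:00:00"),
        ("s01", "close_etext", "2015-03-02 10:42:00"),
        ("s01", "open_etext",  "2015-03-03 09:15:00"),
        ("s01", "close_etext", "2015-03-03 09:35:00"),
    ]

    fmt = "%Y-%m-%d %H:%M:%S"
    total_minutes = 0.0
    open_time = None

    # Pair each open event with the following close event to measure how
    # long the student spent reading, mirroring the date/time/duration
    # detail described above.
    for student, action, stamp in events:
        t = datetime.strptime(stamp, fmt)
        if action == "open_etext":
            open_time = t
        elif action == "close_etext" and open_time is not None:
            total_minutes += (t - open_time).total_seconds() / 60
            open_time = None

    print(f"Total eText reading time for s01: {total_minutes:.0f} minutes")

Aggregating such per-session durations across students and resources is what allows dashboards to report reading time, completion rates, and similar descriptive metrics.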

Glass (2013) observes that while learning analytics applications typically focus attention on individual courses and learners, there is a growing market for institution-wide analytic applications; all prominent educational technology companies offer learning analytics solutions that allow institutional researchers and other administrators access to data and dashboards that compare student activity and learning metrics within and between courses, departments, and colleges across a university. Institution-wide learning analytics afford administrators the ability to drill down into segmented and longitudinal student data. Doing so helps an institution develop reports concerning student performance with respect to learning outcomes, departmental performance measures, and instructor performance over time. These measures and more, some argue, help an institution and its individual departments respond to stakeholder pressures to demonstrate institutional effectiveness and more easily meet government reporting requirements.

As institutions continue to develop data analytics projects and infrastructures in order to capture sensitive, comprehensive student data, the obligation to do so responsibly will increase as well (Johnson, Adams Becker, Estrada, & Freeman, 2015). Even with noble and good ends in mind, namely improving learning (however defined), learning analytics practices surveil and intervene in student lives. Consequently, learning analytics, like many Big Data practices, are rife with privacy problems and ethical quandaries, which continue to grow in complexity. The question, then, is whether those who design learning analytics systems and support their ends will provide students with privacy protections. Evidence in the literature suggests that learning analytics highlight 'blind spots' in institutional policy and 'poses some new boundary conditions' around student data and privacy, which may negatively affect the future success of learning analytics.


Andrejevic and Gates (2014) note that, due to the predictive capabilities and direct influence Big Data practices have on daily life, these emerging data-based technologies present real threats to individual autonomy. Many Big Data practices aim to capture as much of the human experience as possible, including physical, mental, and emotional activity. In doing so, individuals are taken from a corporeal whole and transformed into binary code as 'data doubles', with the purpose of changing 'the body into pure information, such that it can be rendered more mobile and comparable'. The problem is that the data double fails to be a 'comprehensive or representative' reflection of human life, yet powerful actors use it to influence a person's behavior. Where autonomy is concerned, organizations and institutions who analyze data doubles rarely promote autonomy, failing to describe the construction of the algorithm, the information on which it relies, and how and when analytic technologies nudge humans to accomplish specific ends, which may not be in an individual's best interests. When we know that such practices are occurring in digital spaces that help us develop intellectually (e.g., when we search for information or read eBooks), we may 'guard our words [and] our thoughts'; thus, the surveillance minimizes our autonomy.

2.4 Conclusion

Reviewing the related literature requires the researcher to read in depth both the research literature and the conceptual literature, which pertain to published reports of actual research studies done previously. The studies cited above guide the researcher towards the intended outcomes of the present research.
