Evaluation of the e-Learning developed for casemix and clinical coding: Quality of the material and usability of the system
Azam Rahimi
Iran Social Security Organization; United Nations University - International Institute for Global Health
Aqdas Rahimi
Ministry of Education, Iran
Argos Special Issue 2, 2015, pp. 130-143
Abstract: An e-learning program for the casemix system has been developed by the International Training Centre for Casemix and Clinical Coding. In developing any learning program, the use of suitable, relevant materials and a usable system is undeniably important. Thus, the main objective of this study was to evaluate the e-learning material, determining learners' perceived satisfaction with the quality of the material and the usability of the system in order to gauge the success of the program. The early evaluation of the e-learning material and system looks promising and provides a framework for further developments in the field.
This descriptive study analyzed end-of-course evaluations completed by participants who attended the first module of the case-mix online course. A questionnaire comprising a variety of questions gathering end-users' opinions about the e-learning materials and the usability of the system was sent to each e-learner's email address after completion of the course, with a request to fill it in and return it to the researcher's email address within one week. Each evaluation form provided a Likert scale for rating statements related to the content and delivery of the course.
A total of 57 learners from around the world participated in the study. The mean age of subjects was 34.70±8.66 years. The analysis of the questionnaire revealed that, in terms of quality of the material, the mean score was 48.4 out of 60 (80.7%), indicating a good quality level. The mean score for the usability items was 60.16 out of 100, indicating a need for further improvement.
The findings of the evaluation of the case-mix e-learning program indicated that e-learners found the educational performance of the case-mix online program to be successful. At the same time, students reported that they achieved their learning objectives to a great degree.
1- Introduction
The technology of the World Wide Web is changing the way people learn, work, and socialize (Manyika & Roxburgh, 2011). More and more adults are turning to the Web for their learning needs because of its flexible delivery in both education and training settings. Therefore, in response to the high demand for training on the casemix system, the International Training Centre for Casemix and Clinical Coding (ITCC) designed an e-learning educational program to provide initial and preparatory knowledge for the development and implementation of the casemix system. This training program gives trainees and participants background information on the casemix system in order to prepare them for more advanced casemix training programs. Establishing a new method of teaching and learning requires assessment and evaluation in order to determine the success of the program. Evaluation provides the information needed to decide whether or not to use the product (Phillips & Gilding, 2003). Reeves and Hedberg (2003) state that evaluation can improve training or learning quality by reflecting the quality and effectiveness of the learning material. Aquaro and DeMarco (2008) pointed out that when evaluating a website, there are three main areas to focus on: the content, the overall structure, and accessibility to all. Thus, the aim of this study was to assess users' evaluation of the quality of the resources implemented in the casemix learning program and the usability of the system. Evaluation can also show how to improve future programs and help determine whether a program should be continued or dropped.
The use of suitable and relevant materials is undeniably important in the process of teaching and learning (Oliva, 2005). Learners will more often want to access good-quality educational material than poor material, and will gain a better understanding of the topics covered by high-quality material. However, choosing suitable materials that fit students' needs is not easy, and teachers need to make the necessary considerations before using them (Rashidah et al., 2011). Therefore, in order to ensure wide acceptance of educational material by different students, it is necessary to evaluate the material continually and to insist on instructors' and authors' responsibility to keep updating the material according to the evaluation results (Devedzic, 2006). According to Baker and Papp (2004), content quality measures the quality of the course content and concerns the accuracy, authenticity, accessibility, design, and appropriateness of the course content.
Usability is the quality attribute that assesses how easily users can use an application to accomplish their specified goals effectively, efficiently, and with a high level of satisfaction. In addition to ease of use, a usable e-learning system should be useful to learners in accomplishing their learning tasks (Venkatesh et al., 2003). According to ISO 9241-11, usability may be defined as the extent to which a product (such as software) can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use. On the Web, usability is a necessary condition for survival. If a website is difficult to use, people leave. If the homepage fails to clearly state what a
company offers and what users can do on the site, people leave. If users get lost on a website, they leave. If a website's information is hard to read or doesn't answer users' key questions, they leave. Note a pattern here? There is no such thing as a user reading a website manual or otherwise spending much time trying to figure out an interface (Nielsen, 2012). Usability plays an imperative role in the success of e-learning applications. If an e-learning system is not usable, the learner is forced to spend much more time trying to understand the software's functionality rather than understanding the learning content (Wong et al., 2003). The need for usability has been recognized in the website design literature as a crucial quality in determining user satisfaction with such systems. Therefore, it can be argued that the usability of e-learning applications can significantly affect learning (Coffield et al., 2004).
A self-administered questionnaire was used in the study to collect the data. The target population for this study was participants of any background who attended the casemix introductory e-learning program. The study sample consists of learners enrolled in the first module of the case-mix e-learning course developed by ITCC, across several sequential administrations of the course over one year (June 2012 to June 2013). Geographically, the learners were scattered all over the world.
The questionnaire used for this study contained three parts: a demographic part, a Quality of e-Learning Material instrument (12 items), and a Usability of the System instrument (10 items).
To investigate how users evaluated the quality of the resources implemented in the casemix learning program, this study used, with permission, a questionnaire for the evaluation of web-based learning programs modified by Nam (2003) and originally developed by Kennedy. The questionnaire included two parts: 3 questions regarding objectives and directions and 9 questions about content, in a Likert-scale format (1 = strongly disagree; 5 = strongly agree for positive statements, and 1 = strongly agree; 5 = strongly disagree for negative statements), measuring e-learners' views of the quality of the material used in the case-mix online program. The score on the End-User Evaluation of Quality of e-Learning Material was obtained by summing the participants' responses, giving a minimum score of 12 and a maximum score of 60. A quality score of 36 or below represented weak quality, a score between 37 and 44 fair quality, a score between 45 and 52 good quality, and a score between 53 and 60 very good quality. The measurement tool had a Cronbach's alpha reliability coefficient of 0.855.
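As a rough illustration of this scoring scheme, the sketch below totals one learner's twelve Likert responses and assigns the quality band defined above; the function and the sample responses are hypothetical, not taken from the study data.

```python
def quality_score(responses):
    """Sum twelve Likert responses (each coded 1-5) and label the quality band
    used in the study: <=36 weak, 37-44 fair, 45-52 good, 53-60 very good."""
    if len(responses) != 12:
        raise ValueError("expected 12 item responses")
    total = sum(responses)  # possible range: 12 to 60
    if total <= 36:
        band = "weak"
    elif total <= 44:
        band = "fair"
    elif total <= 52:
        band = "good"
    else:
        band = "very good"
    return total, band

# Example: a hypothetical learner who answers "agree" (4) on every statement
print(quality_score([4] * 12))  # (48, 'good')
```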
2-5- Usability
To investigate how users evaluated the usability of the case-mix learning program, this study used the System Usability Scale, which was developed by Brooke as part of the usability engineering program in integrated office systems development at Digital Equipment Co Ltd., Reading, United Kingdom (Brooke, 2013). The System Usability Scale (SUS) is a simple, ten-item scale giving a global view of subjective assessments of usability. It has been made freely available for use in usability assessment and has been used for a variety of research projects and industrial evaluations. To calculate the SUS score, first sum the score contributions from each item; each item's score contribution ranges from 0 to 4. For items 1, 3, 5, 7, and 9 (positive statements), the score contribution is the scale position minus one. For items 2, 4, 6, 8, and 10 (negative statements), the contribution is five minus the scale position. The sum of the contributions is then multiplied by 2.5 to obtain the overall value of system usability. SUS scores range from 0 to 100; the higher the SUS score, the higher the perceived usability. The SUS is intended only to measure perceived ease of use. A usability score below 50 represented a poor level, a score between 50 and 70 an OK level (meaning an acceptable product), a score between 70 and 80 a good level, a score between 80 and 90 an excellent level, and a score between 90 and 100 the best imaginable level. The usability scale had a Cronbach's alpha reliability coefficient of 0.813. In addition, the SUS score was compared with a global benchmark of 68, which indicates whether the score is above or below average; this benchmark was suggested by Sauro (2011) based on 446 studies and over 5000 individual SUS responses.
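The SUS scoring rule described above can be expressed compactly as follows; this is a minimal sketch, and the example responses are hypothetical rather than taken from the study.

```python
def sus_score(responses):
    """Compute one respondent's SUS score from the ten raw scale positions (1-5).
    Odd items (1, 3, 5, 7, 9) are positive statements: contribution = position - 1.
    Even items (2, 4, 6, 8, 10) are negative statements: contribution = 5 - position.
    The summed contributions (0-40) are multiplied by 2.5 to give a 0-100 score."""
    if len(responses) != 10:
        raise ValueError("expected 10 item responses")
    contributions = []
    for item, position in enumerate(responses, start=1):
        if item % 2 == 1:            # positive statement
            contributions.append(position - 1)
        else:                        # negative statement
            contributions.append(5 - position)
    return 2.5 * sum(contributions)

# Example: agreeing (4) with every positive item and disagreeing (2) with every negative item
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```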
2-6- Procedure
The research process began after the researcher received written approval to conduct the research from the head of the Department of Community Health, UKM. Before the study, a pilot study with 10 subjects was carried out to obtain useful information for improving the questionnaire and to test its reliability. Respondents were informed of the study purpose and of their right to participate voluntarily, and were asked to comment on items they could not understand easily. The questionnaire was modified based on the respondents' comments. The reliability of the questionnaire was examined using Cronbach's alpha; reliability values for both instruments (Quality of the Material and Usability of the System) were above the suggested minimum of 0.7. In order to establish content validity, the questions were reviewed and approved by expert professional educators in the area of e-learning.
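For reference, Cronbach's alpha follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). The sketch below uses hypothetical pilot responses, not the study's data, purely to illustrate the calculation behind the reported coefficients of 0.855 and 0.813.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a respondents-by-items matrix of Likert scores."""
    scores = np.asarray(item_scores, dtype=float)
    k = scores.shape[1]                              # number of items
    item_var = scores.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_var / total_var)

# Hypothetical pilot data: 5 respondents answering 4 Likert items
pilot = [[4, 5, 4, 4],
         [3, 3, 4, 3],
         [5, 5, 5, 4],
         [2, 3, 2, 3],
         [4, 4, 4, 5]]
print(round(cronbach_alpha(pilot), 3))
```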
After registration on the e-learning platform, a user name and password were sent to each user's email address so they could sign in to the program. Learners could then take the course at any time within a two-week time frame for completing the program. For support, the instructors' email addresses were provided, and one research assistant helped keep track of the records. Upon completion of the course, a questionnaire was sent to every learner, who was asked to fill it in and return it to the researcher's email address within one week of receiving it. An approval letter from the UKM research centre was attached to the email to reassure participants.
For this study, descriptive analyses such as frequencies, means, and standard deviations were used to analyze participants' demographics. All data were analyzed using the Statistical Package for the Social Sciences version 20 (SPSS).
The study was approved by the Universiti Kebangsaan Malaysia Research Ethics Committee (UKM 1.5.3.5/244/SPP3, Code: FF-173-2012). All subjects gave written informed consent.
3- Results
The demographic data show that 35 subjects, or 61.4% of the participants in the e-learning course, were male and 38.6% were female. The mean age of the participants was 34.70±8.66 years, ranging from 22 to 52 years. In terms of level of education, Bachelor's degree holders made up the highest percentage of participants. The mean length of job experience was 6.4 years. Regarding profession, 40.4% of the participants belonged to the academic sector and 40.4% to the healthcare sector. For 70.2% of the participants, this was also their first experience with the casemix system.
Variables                                      Frequency n (%)
Gender
    Male                                       35 (61.4)
    Female                                     22 (38.6)
Age (years)
    20-29                                      21 (36.8)
    30-39                                      16 (28.1)
    40-49                                      19 (33.3)
    50 and above                               1 (1.8)
Educational Level
    Diploma                                    3 (5.3)
    Bachelor Degree                            19 (33.3)
    Master                                     18 (31.6)
    PhD                                        11 (19.3)
    M.D.                                       6 (10.5)
Profession
    Healthcare                                 23 (40.4)
    Financial                                  2 (3.5)
    Information Technology and Medical Coding  4 (7.0)
    Academic                                   23 (40.4)
    Management                                 5 (8.8)
Years of Employment
    < 1 year                                   8 (14.0)
    1-3 years                                  24 (42.1)
    4-6 years                                  9 (15.8)
    7-9 years                                  1 (1.8)
    ≥ 10 years                                 15 (26.3)
Casemix Experience
    Yes                                        17 (29.8)
    No                                         40 (70.2)
Participants in the case-mix e-learning course were asked to evaluate various aspects of the learning material used for the course and to rate them on a scale from one to five, where one was the most negative score and five the most positive. There were twelve positive statements in this section of the questionnaire. The minimum possible score for this
questionnaire was 12 and the maximum possible score was 60. Scores ranged from 35 to 60, and the mean score for this part was 48.4±5.9 (80.7% of the total score). The highest-rated statement, at 4.2 out of 5, was "Generally the content was clear", followed by 4.1 for "The information presented in the program was relevant to my learning goals"; "Logical organization of the contents" and "Easy to identify key concepts" also scored an average of 4.1.
The lowest score (3.7) was for the statement "I knew enough about the content area to get the most out of this program", followed by 3.8 for "The case-mix course learning material reinforced what I had learned elsewhere". Details of the respondents' evaluation of the quality of resources in the e-learning program are presented in Figure 1. The highest percentage of e-learners strongly agreed that from the start it was clear what they were going to do in the program, and the highest percentage of participants answered "Neutral" to "I knew enough about the content area to get the most out of this program". The highest percentage of participants answered "Agree" to the rest of the statements, which concerned objectives/direction, content/structure, interactivity, and navigation of the course.
[Figure 1. Respondents' evaluation of the quality of resources in the e-learning program: frequency of Likert ratings (1-5) for each statement.]
The respondents' total scores regarding the quality of the e-learning material were grouped as weak (≤36), fair (37-44), good (45-52), and very good (53-60). A total of 26.3% of e-learners' scores represented a very good quality level (53 or above on a scale of 12 to 60), 49.1% of respondents' scores represented good quality (45-52), and 21.1% of e-learners rated the quality of
the e-learning material as fair (37 to 44); finally, 3.5% of e-learners' scores represented weak quality.
In general, responses about the quality of the e-learning material were favorable, with between 74.4% and 83.2% (mean of maximum percentage) agreeing that the objectives were clear, the content was logical, and the navigation was easy. The results for the quality level of the e-learning material, grouped as weak, fair, good, and very good, are listed in Table 3. Overall, 75.4% of the e-learners rated the quality of the material as very good or good.
[Figure. Quality level of the e-learning material: good 49.1%, very good 26.3%, fair 21.1%, weak 3.5% of e-learners.]
There were ten statements in this section of the questionnaire assessing the usability of the e-learning program: five positive statements and five negative statements. Participants in the case-mix e-learning course were asked to rate each statement; each item's score contribution ranged from zero to four, with zero the most negative and four the most positive contribution, and with the scoring reversed for negative statements. The minimum possible overall score for this questionnaire was zero and the maximum possible score was 100.
The highest score, 3.0 out of 4, was for the statement "I felt very confident using the system", followed by 3.0 for "I thought the system was easy to use". "I think that I would like to use this system frequently" and "I needed to learn a lot of things before I could get going with this system" scored an average of 2.9, and "I think that I would need the support of a technical person to be able to use this system" scored an average of 2.7. The lowest score (1.5) was shared by the statements "I think that I would need the support of a technical person to be able
to use this system" and "I found the system very cumbersome to use", followed by 1.9 for "I found the system unnecessarily complex". Details of the usability evaluation of the e-learning program are presented in Figure 3.
[Figure 3. Usability evaluation of the e-learning program: frequency of responses (Strongly disagree to Strongly agree) for each statement.]
The highest percentage of participants answered "Disagree" to both statements "I needed to learn a lot of things before I could get going with this system" and "I thought there was too much inconsistency in this system". The highest percentage of participants answered "Agree" to the rest of the statements.
The respondents' total scores on the usability section of the questionnaire were ranked as awful (scores of 30 and below), poor (31 to 50), OK (51 to 70), good (71 to 80), excellent (81 to 90), and best imaginable (91 to 100). Of the e-learners' usability scores, 14.0% were grouped as poor, 63.2% fell within our definition of "OK", which means an acceptable product, and about 19.3% fell within our definition of "good". Finally, 3.5% of e-learners' usability scores were grouped as excellent (between 81 and 90). The results for the usability level of the online program, ranked as awful, poor, OK, good, excellent, and best imaginable based on the usability score, are illustrated in Figure 4.
[Figure 4. Usability level of the online program: frequency of respondents in each usability band.]
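The banding used above can be expressed as a small lookup; the cut-offs follow the ranking just described, and the function name is illustrative only.

```python
def sus_band(score):
    """Map a 0-100 SUS score to the adjective bands used in this study:
    awful (<=30), poor (31-50), OK (51-70), good (71-80),
    excellent (81-90), best imaginable (91-100)."""
    if score <= 30:
        return "awful"
    elif score <= 50:
        return "poor"
    elif score <= 70:
        return "OK"
    elif score <= 80:
        return "good"
    elif score <= 90:
        return "excellent"
    return "best imaginable"

print(sus_band(60))  # 'OK' -- where the course's overall mean SUS score falls
```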
The overall mean usability score for the casemix online course on the SUS was 60. A one-sample t-test was used to compare the overall usability score with the global benchmark value of 68. The overall usability score of the casemix e-learning system was significantly below the benchmark (p < .001).
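The benchmark comparison can be reproduced in outline as below. This is a sketch only: the scores are simulated around the reported mean and spread because the individual responses are not available here, and scipy's one-sample t-test stands in for the test the authors ran in SPSS.

```python
import numpy as np
from scipy import stats

# Simulated SUS scores for illustration (57 respondents, mean ~60, SD ~10.7);
# these are NOT the study's actual responses.
rng = np.random.default_rng(1)
sus_scores = rng.normal(loc=60, scale=10.7, size=57)

# One-sample t-test against the global SUS benchmark of 68 (Sauro, 2011)
t_stat, p_value = stats.ttest_1samp(sus_scores, popmean=68)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A small p-value with a sample mean below 68 indicates perceived usability
# significantly below the benchmark, as reported in the study.
```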
4- Discussion
The demographic data of the study show that the mean age of the e-learner group was 34.70±8.66 years. The largest group of e-learners was in the 20-29 age range, at 36.8%. Males made up 61.4% of the e-learners. The largest share of participants were Bachelor's degree holders, at 33.3%. The most common length of employment in their current job position was 1-3 years. The largest shares of e-learners belonged to the healthcare sector and the academic sector, each at 40.4%.
One purpose of this study was to determine the perceived quality of the casemix e-learning material from the e-learners' point of view. Students were asked to evaluate the resources available in the casemix e-learning course. The mean score for the content items was 48.42±5.91 out of 60, with the highest-rated areas related to the clarity of the content, followed by the relevance of the course to the e-learners' learning goals. Many students found the quality of the resources valuable and appropriate, and most e-learners found the course useful for revision, with a mean percentage score of 82.2%. However, the item regarding sufficient course information at the beginning of the course, which received the lowest score, suggests room for
improvement. The overall results from the users' evaluation of the casemix e-learning material were very favorable, and participants expressed a positive attitude toward the learning material implemented in the casemix online course, with the majority of participants' scores falling in the good level (49.1%), followed by very good at 26.3%, and only 3.5% of participants' total scores falling in the weak level.
In this study, among the most highly rated evaluation items was "I found the casemix learning material useful for revision", which means that the casemix e-learning resources effectively support users' independent learning. This is valuable, since e-learners are not dependent on the instructor as the primary source of information. The mean score for the evaluation statement "The important information or key concepts were easy to identify" was 4.1 out of 5.0, which suggests that the location of the information on the screen, the attributes of the screen (color, graphics, size of text, etc.), the pacing of the information, and the mode of delivery (audio, visuals, animations, video) are appropriate. These allow the information to transfer to the working memory of learners (Ally, 2004).
The positive perception of e-learners about the casemix e-learning resources is not surprising, since the principles of learning and how students learn were considered in developing the casemix online course. Ally (2004) has suggested that learning theory is one of the aspects that should be considered for an educational program to be successful; this is especially true for online learning, where the instructor and the learner are separated. In developing the casemix e-learning material, various aspects of learning theory were considered. For example, the learning materials include examples that relate to learners, so that they can make sense of the information; and to facilitate the transfer of learning and to encourage its application in different and real-life situations, the use of real-life cases is part of the casemix e-learning course. In the casemix e-learning course, the target audience and the learning objectives are clearly defined. Moreover, the material is broken into manageable modules (chunks) such as chapters and lessons, enabling the learners to grasp the overall structure of the material and map it to the course objectives, as well as to follow details in chapters and lessons more easily. Each module typically takes a learner about 15-30 minutes to cover. In addition, the e-learners were satisfied with the way the information was organized on the screen, as well as the different interactivity options.
This finding is important, because the quality of an e-learning system is one of the important determinants of e-learning effectiveness (Wu et al., 2010). The structure and the learning material are a major factor in facilitating meaningful learning. Learning is influenced more by the content and instructional strategy of the learning materials than by the type of technology used to deliver instruction (Jati, 2013). Course organization and instructional materials are more important to online students than to face-to-face students, since e-learners rely more on the learning material.
Another purpose of this study was to determine the perceived usability of the casemix e-learning system from the e-learners' point of view. Students were asked to evaluate the casemix e-learning system. The mean score for the usability items was 60.61±10.72 out of 100. Each category of usability of the system showed high scores, with the highest areas related to feeling confident about using the system, followed by the ease of using the system. In contrast, the areas scored lowest by e-learners related to self-efficiency and the convenience of using the system. The largest percentage of e-learners' usability scores fell within our definition of "OK", which indicates an acceptable usability level for the system. However, 11.2% of respondents' usability scores fell within our definition of "poor", which shows that the site should be improved in terms of usability. This finding is important, since several empirical studies have argued that system functionality affects the effectiveness of computer-mediated learning (Pituch & Lee, 2006). System functionality has the potential to directly affect the perceived usefulness of an information system (Hong et al., 2002; Pituch & Lee, 2006), and these are thought to be similar concepts in performance expectation.
Too much focus on developing the application and not enough focus on implementation is a problem, because people tend to think the job is done once the program is developed. Therefore, an e-learning program of high quality can still be low in usability (Kobbenes & Folkman, 2003). An e-learning program that has a high quality level to support learning may not be usable enough to support the need for quick help. The challenge in an information-rich world is not only to make information available to people whenever and wherever they need it and in any form, but specifically to say the right thing at the right time in the right way (Fallman, 2003).
5- Conclusion
The quality and reliability of an e-learning system, as well as easy access to appropriate educational technologies, material content, and course-related information, are important determinants of e-learning effectiveness. The data of this study indicated a great degree of satisfaction with the quality and structure of the e-learning material developed for the study. This is important because learning in an e-learning course is based on the material and course organization, whereas learning performance in traditional learning is based on instructors. The e-learning system usability results show that the usability of the system is high enough to support web-based education, but it needs improvement in areas such as self-efficiency and the convenience of the system.
Starting from the experience reported here, the authors are confident that e-learning will continue to gain ground and evolve as an effective and appreciated educational means for the case-mix system. It offers flexibility, since it can effortlessly be accessed anytime and anywhere. This finding is valuable for ITCC in improving and fine-tuning future course offerings. It is also important to learners, because they can be confident about the
content of the course. Given the positive view of participants regarding the casemix e-learning material, this study suggests that ITCC may consider the need to improve casemix training for healthcare staff in developing countries and to evaluate the course regularly to ensure that accurate and up-to-date material is provided to the users.
References
1. Ally, M. 2004. Foundations of educational theory for online learning. In T. Anderson (2nd ed.). The theory and practice of online learning, pp. 3-31. Athabasca, AB: Athabasca University. (online) http://desarrollo.uces.edu.ar:8180/dspace/bitstream/123456789/586/1/Theory%20and%20Practice%20of%20online%20learning.pdf#page=227 (20 May 2011).
2. Aquaro, P.D. & DeMarco, M. 2008. Literature review: what makes a good school website, why build one and is there help in doing so? (online) http://www.Mariannedemarco.com/academic/graduate_ms/Research/LiteratureReview.pdf (20 June 2010).
3. Baker, R. & Papp, R. 2004. Evaluating critical success factors of distributed learning. Proceedings of the 7th Annual Conference of the Southern Association for Information Systems, pp. 256-263.
4. Brooke, J. 2013. SUS: A Retrospective. Journal of Usability Studies 8(2):29-40. (online) http://uxpajournal.org/sus-a-retrospective/ (20 September 2013).
5. Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. 2004. Learning styles and pedagogy in post-16 learning: A systematic and critical review. (online) http://www.lsda.org.uk/files/PDF/1543.pdf (25 July 2011).
6. Devedzic, V. 2006. Semantic Web and education. (online) http://www.springer.com/cda/content/document/cda_downloaddocument/9780387354163-c1.pdf?SGWID=0-0-45-346664-p173663511 (10 June 2010).
7. Fallman, D. 2003. Design-oriented Human-Computer Interaction. Department of Informatics and Umeå Institute of Design.
8. Hong, W., Thong, J. Y. L., Wong, W. M., & Tam, K. Y. 2002. Determinants of user acceptance
of digital libraries: An empirical examination of individual differences and system
characteristics. Journal of Management Information Systems 18(3):97–124.
9. Jati, G. 2013. Learning management system (moodle) and e-learning content development. (online) http://journal.fsrd.itb.ac.id/jurnal-desain/pdf_dir/issue_3_12_28_3.pdf (6 December 2013).
10. Kobbenes, H. & Folkman, K. 2003. E-learning and usability: integrating the use and user dimension. (online) http://ren.inekstranett.no/PageFiles/610/Usability,%20REN.pdf (12 June 2012).
11. Manyika, J. & Roxburgh, C. 2011. The great transformer: the impact of the Internet on economic growth and prosperity. (online) http://www.mckinsey.com/~/media/McKinsey/dotcom/Insights%20and%20pubs/MGI/Research/Technology%20and%20Innovation/The%20great%20transformer/MGI_Impact_of_Internet_on_economic_growth.ashx (20 September 2012).
12. Nam, C.S. 2003. A Theory-Based Integrated Design Process for Development and Evaluation of Web-Based Supplemental Learning Environments. Ph.D. thesis, Virginia Polytechnic Institute and State University.
13. Nielsen, J. 2012. Usability 101: Introduction to Usability. (online) http://www.nngroup.com/articles/usability-101-introduction-to-usability/ (10 November 2013).
14. Oliva, P.F. 2005. Developing curriculum. United States of America: Allyn and Bacon.
15. Phillips, R. & Gilding, T. 2003. Approaches to evaluating the effect of ICT on student learning. (online) https://www.alt.ac.uk/sites/default/files/assets_editor.../eln015.pdf (10 May 2011).
16. Pituch, K. & Lee, Y. 2006. The influence of system characteristics on e-learning use. Computers & Education 47:222–244.
17. Rashidah, R., Parilah, M.S., Rosseni, D., Sharifah, N.P., Juhaida, A.A., & Mohamed, A.E. 2011. Learners' Evaluation of an e-Learning Material. (online) http://www.wseas.us/e-library/conferences/2011/Jakarta/EACT/EACT-11.pdf (20 June 2012).
18. Reeves, T. & Hedberg, J. 2003. Interactive Learning Systems Evaluation. New Jersey: Educational Technology Pubs.
19. Sauro, J. 2011. A practical guide to the system usability scale (SUS). Measuring usability,
Denver, USA
20. Venkatesh, V., Morris, M.G., Davis, G.B., & Davis, F.D. 2003. User Acceptance of
Information Technology: Toward A Unified View. MIS Quarterly 27(3):425-478.
21. Wong, S., Nguyen, T., Chang, E., & Jayaratna, N. 2003. Usability metrics for e-learning. In R. Meersman & Z. Tari (eds.). Lecture Notes in Computer Science: On The Move to Meaningful Internet Systems 2003: OTM 2003 Workshops, pp. 235–252. Heidelberg: Springer Berlin Heidelberg.
22. Wu, J-H., Tennyson, R. D., & Hsia, T-L. 2010. A study of student satisfaction in a blended e-learning system environment. Computers & Education 55:155–164.