Paperless Examinations Project
In the rollout since March 2003, support needs have become minimal.
Pedagogical
The authoring process, bringing together the subject experts and the technical authors, proved to be one of the most valuable learning elements of this phase. It also raised upskilling and training issues that would need to be addressed if Awarding Bodies were to develop large numbers of onscreen examinations. Once confident of the facilities available, the authors were able to make good use of the technology to assess parts of the specifications in ways that were not possible with paper-based examinations. The need to convert paper-based Basic/Key Skills Tests for on-screen use confirmed the finding of the Phase One report that converting from paper to screen is not the most effective method of creating tests or of using the features available from on-screen testing. Learners found all the tests accessible and there were no apparent IT barriers to completion; in the Basic/Key Skills Tests no learners indicated that they would have preferred to undertake the paper tests. Most centres conducted the tests in rooms already fitted out with computers. Whilst this caused some environmental issues, these were not insurmountable. Markers generally liked the software now being used for marking, although there are still some issues stemming from the different marking process that need to be addressed.
Summary of Recommendations
Item and Test Development
Authors for onscreen item development should be ICT competent and should have used a range of multimedia materials, both online and on CD-ROM, in the classroom. A more freeform approach to authoring, while generating a lot of creativity, caused problems with the programming and screen creation. The Awarding Bodies should draw up guidelines covering item development, software capability and authoring procedures. The development of objective question styles for computer-based summative assessment is a professional skill; it should be offered as part of teacher training and/or within INSET provision so that item design expertise can be developed. In fact, the Scottish Qualifications Authority (SQA) is currently developing an Advanced Certificate in E-assessment. A national qualification such as this would be of great value in ensuring homogeneity of standards. Authors will not have the same technical skills and facilities available as the technology partner. Video conferencing could bridge the geographical distance between authors and item producers and allow more frequent and efficient reviewing of items and mark schemes; this would reduce costs and would ease the revision activities and the scrutiny of the near-final product. Provision for the development of the widest possible range of item styles is required so that assessment design is not compromised by limitations within the system. The method by which items were written into templates offered in Microsoft Word format should be revised; a custom HTML-based system with upload functionality and version control would be preferred. Scrutineers should experience a fully functional model of the exams, with all technical requirements in place, exactly as a candidate would.
The potential of different item types should be researched, and innovative items which make increasing use of the technology could be developed. As computer-based exams become more common, specimen discs of items could be produced. To provide Basic/Key Skills Tests more suitable for computer-based testing, the issue of item design needs to be addressed and computer-based items developed from the beginning; in particular, audio for explaining instructions should be used within onscreen Basic/Key Skills Tests. Expertise in the areas of accessibility with regard to learners with special needs, for example visual impairment, needs to be introduced from the beginning. This applies not only to the assessment itself but also to the development of teaching resources and exemplar materials; the use of audio is particularly recommended. Greater use of reference material should be made onscreen where appropriate: the periodic table in the chemistry paper would be a good example. Calculators and other tools should be incorporated where this is allowed/appropriate.
Technology
Content
The range and complexity of custom items need greater direction from the technical side, without compromising author creativity. Formal item authoring software should be provided to authors to allow them to create screens in a consistent fashion, with upload functionality. This should replace the often awkward and inconsistent use of Word documents.
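To make the recommendation concrete, the sketch below shows one way such an authoring tool might record each uploaded item revision so that earlier versions remain reviewable. It is a minimal illustration only; every name and field here is an assumption rather than the project's actual schema.

```python
# Illustrative sketch only: a minimal versioned record that a custom item
# authoring tool might keep each time an author uploads a revision.
# All names here (ItemRevision, upload_revision) are hypothetical.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ItemRevision:
    item_id: str      # stable identifier for the item
    version: int      # incremented on every upload
    author: str
    uploaded_at: str
    html_body: str    # screen content authored as HTML, not Word
    mark_scheme: str

def upload_revision(history: list, item_id: str, author: str,
                    html_body: str, mark_scheme: str) -> ItemRevision:
    """Append a new revision; earlier versions stay available for review."""
    version = sum(1 for r in history if r.item_id == item_id) + 1
    rev = ItemRevision(item_id, version, author,
                       datetime.now(timezone.utc).isoformat(),
                       html_body, mark_scheme)
    history.append(rev)
    return rev

history: list = []
upload_revision(history, "GEO-023", "author-ni-01",
                "<p>Study the river map and label two features.</p>",
                "1 mark per correctly labelled feature, max 2")
```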
Security
Further integration of the ExamBase server with the web server, to remove any potential security weaknesses, should be implemented.3 Automated ExamBase server housekeeping functionality (for example, removing old data) is needed. Full and encrypted system logging is required to allow auditing of exam centre procedures.
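The sketch below illustrates one way such logging could be made tamper-evident, by chaining each entry to the previous one with a keyed hash so that auditors can detect deleted or altered entries. It is a minimal sketch under assumed names, not ExamBase's actual logging mechanism, and encryption of the log at rest would still need to be layered on top.

```python
# Illustrative sketch only: hash-chained audit log entries so that any
# tampering with exam-centre logs can be detected during auditing.
# The names (AuditLog, SECRET_KEY) are hypothetical, not from ExamBase.
import hashlib
import hmac
import json
from datetime import datetime, timezone

SECRET_KEY = b"shared-secret-issued-by-awarding-body"  # assumption

class AuditLog:
    def __init__(self):
        self.entries = []
        self.prev_mac = b"genesis"

    def record(self, event: str, detail: dict) -> None:
        entry = {
            "time": datetime.now(timezone.utc).isoformat(),
            "event": event,
            "detail": detail,
        }
        payload = json.dumps(entry, sort_keys=True).encode()
        # Chain each entry to the previous MAC so entries cannot be
        # silently removed or reordered after the fact.
        mac = hmac.new(SECRET_KEY, self.prev_mac + payload, hashlib.sha256)
        entry["mac"] = mac.hexdigest()
        self.prev_mac = mac.digest()
        self.entries.append(entry)

log = AuditLog()
log.record("exam_started", {"centre": "NI-0042", "paper": "GCSE-SCI-3"})
log.record("candidate_login", {"candidate": "C123456"})
```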
Development
More thorough testing using structured procedures and in multiple environments is required at every stage of the development process.4 Although the administrative system worked effectively for the volume of candidates in the trial, MicroBoard needs an end-to-end redesign and development in the light of comments from users and scalability concerns.5
Installation
Specifications for centres need to be more stringent, notably no use of the NT4 server operating system, and standards for proxy servers need to be developed.6
3 This has now been fully implemented in the latest version of the software (V2.0).
4 Full development procedures have now been implemented. These include a register to ensure all centres are using the same version of the software and that any upgrades have been implemented.
5 A specification for this upgrade is under development.
6 NT4 is no longer being supported.
Smoother and more integrated installation methods are required to allow easier self-installation in centres.7 A date of installation should be agreed with a centre and a technical representative of the technology partner should visit all centres; this is costly in terms of resources and travel but saves time in the end. Ideally, combined installation and training for the technicians in the use of the software should be introduced on site.8 A self-install option of the software for LAN centres, together with rigorous monitoring and testing procedures, should be offered to centres for large-scale rollout.9 PDF versions of documentation and training materials, including forms, should be offered on a dedicated secure website; posting hard copies of documentation should be avoided.10 With a large-scale rollout, a detailed plan for installation needs to be in place. This would require documentation to be made available to all centres by the Awarding Body. In Northern Ireland, a rollout through a managed service such as Classroom 2000 (C2k)29 would help to avoid the installation difficulties encountered during the project.
Display
Centres should set ExamBase client machines to exactly 800 × 600 resolution to avoid the black border effect found with higher resolutions.
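A simple pre-flight check along these lines could be run before an exam session starts; the sketch below queries the desktop resolution and warns if it is not exactly 800 × 600. It is an illustration only and not part of the ExamBase client.

```python
# Illustrative sketch only: warn if the client machine is not running at
# exactly 800x600 before an exam session is started. Not ExamBase code.
import tkinter

root = tkinter.Tk()
root.withdraw()  # no window needed; we only query the desktop size
width, height = root.winfo_screenwidth(), root.winfo_screenheight()
root.destroy()

if (width, height) != (800, 600):
    print(f"Screen is {width}x{height}: set the display to 800x600 "
          "to avoid black borders around the exam screens.")
else:
    print("Screen resolution OK (800x600).")
```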
7 Since March 2003 over 135 centres (as at end of November 2003) have successfully self-installed the software using revised training and installation procedures.
8 As a result of the successful self-installation processes, it is not considered necessary for a technical representative to visit all centres.
9 All these recommendations have now been put into place with the rollout of the Basic/Key Skills Tests.
10 An on-site installation process is now being used successfully with the Basic/Key Skills rollout.
11 The Internet-based solution is no longer being offered. See section 4.5.2.
For future use of the system, the Awarding Bodies and the Technology Partner need to develop a clear and stringent approval process for centres that includes the way in which the technology is managed and supported. Centres need to be clear about their responsibility to ensure the system is properly managed and supported (including adhering to deadlines, for example for registration and installation procedures), in the same way that they are held responsible for the proper operation of the paper-based system. Patterns of centre support requirements (technological and learner support) need to be analysed on an ongoing basis so that future needs can be assessed.12

The software development undertaken by BTL has produced a robust solution that provides the facilities for centres to access tests and to run on-screen examinations. However, the project has shown the need for IT expertise to be available in centres. Centre staff, both technical and administrative, need to be fully trained to operate the system. In addition, it is vital that centres complete a rigorous testing programme prior to participating in live examinations. There would be great benefits if all centres were equipped with common IT platforms; this will be the case in Northern Ireland with the rollout of Classroom 2000 (C2k)29. Coupled with clear guidance from the Awarding Bodies, this would ensure that centres are able to deliver on-screen examinations in an effective way.

Adult Basic/Key Skills Tests will need a much more streamlined and automated process to move results from MicroBoard to the Awarding Bodies' legacy systems.13 A mobile test centre, for example a bus equipped with a wireless LAN, perhaps Internet access and exam delivery software, would assist centres that wish to offer tests to small numbers of candidates on an occasional basis but do not operate a LAN or have little ICT expertise available, sparing them the installation and invigilation procedures and the cost of updating their ICT systems.14
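As a concrete illustration of the kind of streamlined transfer envisaged, the sketch below batch-exports results to a flat file that a legacy system could import. MicroBoard's real interfaces are not documented here, so the record layout and field names are assumptions.

```python
# Illustrative sketch only: batch-export on-screen test results to a CSV
# file for import into an Awarding Body's legacy system. The field names
# and layout are assumptions, not MicroBoard's actual export format.
import csv

results = [
    {"candidate": "C123456", "test": "ADULT-LIT-L1", "mark": 32, "outcome": "Pass"},
    {"candidate": "C123457", "test": "ADULT-NUM-L2", "mark": 18, "outcome": "Fail"},
]

with open("results_batch.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["candidate", "test", "mark", "outcome"])
    writer.writeheader()
    writer.writerows(results)
print(f"Exported {len(results)} results to results_batch.csv")
```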
Marking
The Awarding Bodies need to develop procedures for computer-based marking processes and mark centres. The MarkerBase system would be even more popular with examiners if they could mark onscreen at home, which would provide greater flexibility. MarkerBase can be configured for secure Internet access, but it would have to undergo a testing procedure before remote marking could be offered. MarkerBase needs to be made even more user-friendly by implementing the markers' suggestions received during the GCSE marking pilot, and it needs to offer facilities for markers and supervisors (chief examiners) to communicate electronically and immediately. Furthermore, access for the supervisors to the facility for monitoring markers' work needs to be tested.

A mobile mark centre, for example a bus equipped with a wireless LAN, perhaps Internet access, and marking software, would offer savings in travel time and costs to examiners who, at any stage during the marking process, temporarily do not have secure Internet access or the technology at home. That way marking could continue under secure conditions, and delays in marking, with the resulting late release of results, would be kept to a minimum. This mobile mark centre could also be offered as a regular test centre, as well as a contingency for marking. Consideration of using wireless technology (laptops) should be included in the forward planning and implementation of computer-based marking.
12 See section Key Issues and Conclusions, page 2.
13 This has now been implemented (September 2003).
14 Trials are now being undertaken both with a mobile bus equipped with an on-board network and a portable wireless LAN based system.
1 PROJECT OVERVIEW
1.1 Background
The Paperless Examinations Project was initiated in 2000, following discussions between CCEA and Edexcel. The two Awarding Bodies recognised the growing importance of technology within assessment and the likelihood that, in the medium term, there would be initiatives to introduce the use of technology within public examinations. Furthermore, in Northern Ireland the Department of Education document A Strategy for Education Technology proposed the piloting of ICT-enriched syllabuses and examining modes in one or two subjects.15

CCEA and Edexcel were aware that, while a very limited number of public examinations were delivered using computers in the United Kingdom, there was no assessment technology being utilised in schools. None of the tests then delivered using computers could be described as involving high stakes qualifications such as GCSE. The only large-scale use of IT for testing purposes was the DSA's driver theory test. This test, together with the TTA's skills tests for newly qualified teachers, represented most of the IT testing being undertaken within public examinations in the United Kingdom. Government has shown considerable interest in the possible development of computer-based tests and onscreen examinations, and there is little doubt that, within the next few years, the awarding bodies will be expected to introduce computer-based examinations.

The CCEA/Edexcel project aimed to evaluate the technical feasibility and educational benefits of developing a system that allowed the introduction of computer-based assessment within the current practices used for high stakes examinations such as GCSE. The Paperless Examinations Project was designed to address the issues, both pedagogical and technological, that were likely to be faced by Awarding Bodies in implementing computer-based examinations. The intention was to use ICT to improve the processes associated with the designing, delivering, marking, and reporting of results
of examinations without changing the current rigour of the assessment. It was recognised that the introduction of ICT within teaching and learning would change the nature of the assessment process. Initially, a timescale of three to four years for development work was envisaged. The project was planned in phases, with the outcomes of each phase analysed prior to proceeding to the next stage. In the end, the project was undertaken in two phases and lasted from October 2000 to July 2003. A report on Phase One was produced in November 2001; this report deals with the outcomes of the second phase of the project.

Phase One was undertaken from October 2000 to September 2001. On-screen trial tests were completed by a group of fourth-year (first year GCSE) pupils in seven schools in Northern Ireland (April 2001). In parallel with the on-screen tests, a control group of pupils in the same schools completed the tests on paper, with their completed scripts being scanned to facilitate computer-based processing. The tests were based on GCSE Modular Science examinations. The pilot ran successfully in four schools; two schools experienced some technical difficulties and one school encountered major difficulties using the system. Marking of the computer-based tests was a combination of computer marking and, where judgements were required, marking by trained examiners. The examiners were based in a mark centre and marked online. The focus within this first phase was on the technical feasibility of offering computer-based GCSE-type tests and on identifying the issues arising. A full evaluation report on Phase One of the project was produced in November 2001 and is available from either CCEA or Edexcel.16
15 Department of Education Northern Ireland (1997): A Strategy for Education Technology in Northern Ireland, page 14.
1.2.1 Aims
- To provide a well-argued case for the use of ICT within high stakes examinations that would improve the processes associated with candidate registration, test development and delivery, marking, and reporting and analysis of results.
- To develop assessment that is fit for purpose where large-scale assessment is computer-based. This would include moving and manipulating large amounts of data as required by the delivery of multimedia questions and the capturing and marking of complex constructed responses.
- To pilot live computer-based Basic/Key Skills Tests and investigate the implications for a rollout of onscreen skills tests.
1.2.2 Objectives
- To develop a robust, viable and secure computer-based assessment delivery system that would ensure accessibility, authenticity and control, and would enable CCEA and Edexcel to plan the introduction of computer-based assessment on a large scale.
- To conduct six computer-based 60-minute tests adhering to GCSE Science specifications and two computer-based 75-minute tests adhering to GCSE Geography specifications in twenty secondary schools across Northern Ireland and England.
- To develop software that would enable computer-based testing and processing for Basic/Key Skills.
- To pilot live computer-based 60-minute or 75-minute Basic/Key Skills Tests in fifteen to twenty test centres (different from the GCSE test centres) located in Northern Ireland, England and Germany.
- To produce an informed report with recommendations to plan for a large-scale rollout of computer-based examinations.

16 CCEA/Edexcel (2001): Paperless Examinations Project Phase I Report. Executive Summary. Available on the CCEA website, www.ccea.org.uk
1.2.3 Scope
- Consultation and liaison with approximately thirty-five CCEA and Edexcel schools, colleges and other training providers17 in Northern Ireland, England and Germany with regard to compatibility and system requirements. In all cases, the centres involved needed to show a significant ICT ethos and a willingness to participate in this type of developmental work.
- Design and testing of the computer-based GCSE-type tests as specified by the CCEA and Edexcel Science and Geography examination teams. The test items developed had to be based on GCSE or Basic/Key Skills Tests requirements, as agreed by the Regulatory Authorities (QCA, ACCAC and CCEA)18.
- Recruitment of CCEA and Edexcel examiners and markers, and the provision of training in the design of innovative items and the use of the marking software.
- Review of the computer-based GCSE-type tests in collaboration with CCEA and Edexcel examination officers and representatives from the pilot schools with regard to quality assurance and relevance to the curriculum and assessment criteria requirements.
- Design, adaptation and development of invigilation instructions for the delivery of assessment to test centres in relation to computer-based test delivery, based on the BSI Code of Practice (BS 7988)19.
- Administration of the computer-based tests, including test centre set-up, training of test centre staff with regard to the technology, registration, test requirements, test delivery to centres, delivery of the pupils' responses to markers, and test results processing and reporting.
- The system/software was required to comply with interoperability standards20 being developed in the area of computer-based assessment and online managed learning systems where these were in place or under development.
17 Hereafter referred to as centres.
18 Refer to glossary section.
19 BS 7988:2002. Code of practice for the use of information technology (IT) in the delivery of assessments.
20 SCORM: The Sharable Content Object Reference Model (SCORM) defines a Web-based learning Content Aggregation Model and Run-Time Environment for learning objects. The SCORM is a collection of specifications adapted from multiple sources to provide a comprehensive suite of e-learning capabilities that enable interoperability, accessibility and reusability of Web-based learning content. The work of the ADL Initiative to develop the SCORM is also a process to knit together disparate groups and interests. This reference model aims to coordinate emerging technologies with commercial and/or public implementations. www.adlnet.org
Geography was chosen because questions/items in this subject would require more and longer free-response answers and certain scientific methods relevant to data handling. In addition, Geography was considered to offer further opportunities for the use of multimedia within the item development. Since the trial exams were based on the GCSE specifications offered by both Awarding Bodies, it was expected that they would be useful in the schools for revision purposes prior to the live GCSE examinations. The combination of automated and professional marking was an important feature of the pilot software.

Strand B: Piloting live Adult Basic/Key Skills Tests

The second strand of the project was the live piloting of Adult Basic/Key Skills National Tests in centres across England, Northern Ireland, and Germany.
Background information
The current (paper-based) procedure for these tests is outlined below. The four Adult Basic Skills Qualifications available are:
- Certificate in Adult Literacy (Level One);
- Certificate in Adult Literacy (Level Two);
- Certificate in Adult Numeracy (Level One);
- Certificate in Adult Numeracy (Level Two).
These tests are offered by a number of accredited Awarding Bodies; those currently accredited to offer assessments for these qualifications include AQA, City & Guilds, Edexcel, LCCI, NCFE and OCR. The only form of assessment currently approved for these qualifications is the set of National Basic Skills tests devised for the Regulatory Authorities (QCA, ACCAC and CCEA). These tests lead to certification for adult learners who wish to have their achievements in the National Adult Core Curricula for Literacy and Numeracy recognised.

The above four tests are the same tests as those for the Key Skills qualifications in Communication (Levels One or Two) and Application of Number (Levels One or Two). For the Key Skills qualification, in addition to the tests, learners are required to complete portfolios of evidence. For Key Skills, tests in Information Technology (Levels One or Two) are also available. Test questions are based around everyday scenarios, each group of questions testing different aspects of reading, writing, numeracy, or IT skills. All questions are multiple-choice, offering four answers of which one is correct (one possible representation of such an item is sketched after the list below). During each test window a different version of the tests is offered. Candidates, who can be drawn from any type of Basic/Key Skills provision, are allowed to re-sit the tests.

The set of paper-based tests converted for computer-based assessment for this project included:
- Adult Literacy/Communication (Level One)
- Adult Literacy/Communication (Level Two)
- Adult Numeracy/Application of Number (Level One)
- Adult Numeracy/Application of Number (Level Two)
- Information Technology (Level One)
- Information Technology (Level Two)
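The sketch below shows one possible representation of such a four-option item, together with the trivial automatic marking it permits. The structure and names are hypothetical and are not the format used by ExamBase.

```python
# Illustrative sketch only: one way to represent and automatically mark
# the four-option multiple-choice items described above. The structure
# is hypothetical, not the project's actual item format.
from dataclasses import dataclass
from typing import List

@dataclass
class MultipleChoiceItem:
    prompt: str
    options: List[str]    # exactly four options
    correct_index: int    # index of the single correct answer

def mark_item(item: MultipleChoiceItem, response_index: int) -> int:
    """Return 1 mark for the correct option, 0 otherwise."""
    return 1 if response_index == item.correct_index else 0

item = MultipleChoiceItem(
    prompt="A recipe needs 250 g of flour per person. How much flour is "
           "needed for four people?",
    options=["500 g", "750 g", "1 kg", "1.25 kg"],
    correct_index=2,
)
assert mark_item(item, 2) == 1  # 250 g x 4 = 1 kg
```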
For the Phase Two pilot it was essential that the learners were prepared for the assessment and that they were confident enough in their use of computers. The test window lasted for two weeks.