Dina Tsagari
University of Cyprus
Abstract: The measurement of language skills is a widespread, if not integral, practice of most language teaching programs around the world. This has resulted in the introduction of various assessment procedures carried out by teachers and used as a basis for measuring progress in a foreign language (Brindley 1997). However, we know very little about how teachers deal with assessment demands in their daily practice and the types of skills they need to cope with these demands (assessment literacy). This paper discusses the nature of assessment literacy and reviews the investigations carried out to date in the field. It also presents further research findings which illustrate the classroom assessment practices and training opportunities of in-service teachers of English in public schools in Greece and Cyprus. The paper concludes with suggestions on how teachers can successfully develop their assessment literacy.
1. Introduction
Stiggins (2001: 531) first coined the phrase 'assessment literacy' to represent the standards of professional excellence that teachers need to attain in relation to assessment. For example, the standards of teacher assessment literacy (Standards for Teacher Competence in Educational Assessments of Students, 1990) specify that teachers should be skilled in:
JAL 27 (2011/2012)
- choosing and developing assessment methods appropriate for instructional decisions;
- using assessment results when making decisions about individual students, planning teaching, developing curriculum, and institutional improvement;
- developing, using and evaluating valid student grading procedures which use student assessments;
- communicating assessment results to students, educational decision makers and other concerned stakeholders.

The importance of teacher literacy in Language Testing and Assessment (LTA) has been recognised lately (Hasselgreen 2008, Kaftandjieva 2008, Reckase 2008, Taylor 2009, Yin 2010). Vogt et al. (forthcoming) define the assessment literacy of language teachers as the ability to design, develop and critically evaluate tests, to design and monitor assessment procedures, and to grade and score them on the basis of theoretical knowledge. Gardner (2010) stresses that literacy in LTA for classroom purposes has taken on a new importance in educational systems on a global scale. However, Alderson (2005: 4) maintains that teachers do not possess the necessary levels of assessment literacy and stresses that 'tests made by teachers are often of poor quality, and the insight they could offer into achievement, progress, strengths and weaknesses is usually very limited indeed'. Research findings from the fields of general education and English as a Second Language (ESL) seem to confirm this picture: teachers' assessment competencies have been doubted and teachers are very often depicted as assessment illiterate. For instance, studies depict teachers as heavy users of tests (Gullickson 1984) with superficial knowledge of test use, especially in interpreting standardized tests (Goslin 1967, Gullickson 1984). In other studies, while teachers reported using a variety of assessment methods, their most frequently used formats corresponded to those used in formal external examinations (Rogers 1991, Wilson 1998, 2000).
In other cases, teachers were seen to do very little reflection on what was being assessed and were unaware of the assessment work of their colleagues (Black and Wiliam 1998, Harlen and Deakin-Crick 2002, 2003). Teachers have also been seen to have little knowledge of assessment frameworks, adapting whatever assessment activities are at their disposal to suit their own teaching contexts (Arkoudis and O'Loughlin 2004, Breen et al. 1997, Davison 2004). Other studies have yielded contradictory results, depicting teachers with a strong preference for informal assessment methods (Brindley 1989, Mavrommatis 1997, Stiggins and Conklin 1992). Studies have also highlighted the tensions between administrative and educational purposes in the use of assessment instruments, and the effects of state-mandated assessment policies on teacher assessment (Arkoudis and O'Loughlin 2004, Breen et al. 1997, Davison 2004, Rogers 1991, Wilson 1998, 2000). Finally, studies have shown that teachers' assessment practices vary according to teachers' experience, their views of the role of assessment in the curriculum, collegial expectations and external reporting demands (Breen et al. 1997, McCallum et al. 1995) (see Tsagari 2011 for an extensive discussion of the research literature). In recent discussions of teacher assessment literacy, Leung (2005: 879) observes that teachers' 'own professional knowledge about the content of (…) assessment schemes may not be taken for granted' and that further detailed research needs to be undertaken. Further research is also needed for another reason: to determine levels of teachers' assessment literacy in English as a Foreign Language (EFL) contexts, as the research studies available are set either in ESL contexts (Arkoudis and O'Loughlin 2004, Breen et al. 1997, Davison 2004) or in Higher Education English language programmes (Cheng, Rogers and Huiqin 2004, Cumming 2001). As a result, we know little about how familiar EFL teachers are with assessment standards such as the ones mentioned above. What we know so far is that the classroom assessment methods and procedures used by EFL teachers in small-scale, in-class tests tend to mirror those of examinations (externally set, standardised, large-scale public tests), especially before their administration, when teachers feel obliged to coach their students for such exams (Cheng 2005, Tsagari 2009, Wall 2005, Wall and Alderson 1993, Wall and Horak 2006). Thus, how EFL teachers choose and develop their classroom assessment methods, report their learners' language progress and achievement, the skills or language areas they choose to test, the purposes they have for their assessments, etc.,
and whether their training is sufficient have not yet been clearly established. These issues remain largely unproblematized and unresearched. If educators, particularly those in teacher training programs, are to help teachers use their testing time effectively, more must be learned about how teachers perceive and use LTA practices and what their training needs are. Against this background, the present study aimed to investigate the in-class assessment practices and training needs of EFL teachers in Greece and Cyprus. The number of students who study EFL and of teachers who teach in these settings, as well as the central role that assessment plays in the teaching and learning process in both contexts, provide a much needed research context for the investigation of the assessment practices and needs of EFL teachers.
The aim of the study was to systematically investigate the extent to which EFL teachers in state schools in Greece and Cyprus are assessment literate by studying, through quantitative data and analysis, the ways in which teachers create their assessment instruments for monitoring, recording and assessing learners' progress and achievement in the classroom, as well as the levels of teachers' training. The results of the present study were intended to act as a stimulus for change in the assessment procedures of the Greek and Cypriot state-school EFL contexts, e.g. to serve as the basis of pre- and in-service training programs for language teachers that could help them acquire the level of assessment literacy required to meet their assessment needs.
the fact that some of the teachers worked in two different types of schools (see Table 1).
Overall, the participants of the study were 443 state school EFL teachers: 353 participants from Greece and 90 participants from Cyprus. The majority of the teachers were female, between the ages of 31 and 50, with more than 11 years' experience on average. All of them had a university degree (in English Language and Literature), which is a requirement for employment in the state school sector in both countries. Some of the teachers also had additional qualifications, e.g. MA degrees in Applied and Computational Linguistics, Educational Management and Literature.
Table 2. Reasons for assessment (% of teachers)

Why do you assess your students?                                              Greece
Student-centred purposes
b. To obtain information about my students' progress                           92.6
d. To motivate my students to learn                                            56.9
e. To make my students work harder                                             44.2
f. To prepare my students for published tests they will take in the future     21.8
Instruction-based purposes
c. To plan my teaching                                                         72.5
Administration-based purposes
g. To determine students' final grades                                         68.8
a. To place students at appropriate levels                                     41.4
h. To provide information to central administration (e.g. school, Ministry)    24.1
LTA is also used 'to place students at appropriate levels' (purpose a) and 'to provide information to central administration' (purpose h), in accordance with the mandates of the Ministry of Education and the administrative goals that state school teachers need to meet. Interestingly, more than half of the teachers use LTA either as an external motivation for learning (purposes d and e) or, in the words of Shohamy (2001), as a disciplinary tool (purpose e), that is, as a kind of forced work. Under 'Other', teachers mentioned that they provide information about students' progress to stakeholders beyond their immediate teaching context, such as parents. Finally, a good number of teachers spend time preparing students for published tests (purpose f) even though there are no standardised tests that students need to prepare for in state schools in either country. The results also showed that teachers followed the mandates of the EFL curricula and Presidential Decrees in that they used and designed a variety of tests, such as progress tests, achievement tests and mini-quizzes (see Table 3).

Table 3. Types of tests and ownership of tests designed
What kind of tests do you use?     Greece   Cyprus
a. Mini-quizzes                      71.7     72.2
b. Progress tests                    93.8     92.2
c. Achievement tests                 79.6     74.4
d. Diagnostic tests                  40.8     47.8
e. Placement tests                   36.8     14.4
f. Other                              1.7      2.2

Which of these tests do you prepare yourself?   Greece   Cyprus
a. Mini-quizzes                                   60.3     64.4
b. Progress tests                                 83.3     81.1
c. Achievement tests                              66.3     63.3
d. Diagnostic tests                               30.6     28.9
e. Placement tests                                22.1     10.0
f. Other                                           1.4      1.1
Teachers are also kept busy administering these tests throughout the school year, with the most frequent being mini-quizzes and progress tests (see Table 4).

Table 4. Frequency of classroom test administration
How often do you use them in the school year?

                       Country   Once   Twice   Three times   More often
a. Mini-quizzes        Greece     0.8     3.7       7.4          61.5
                       Cyprus     4.4    10.0      18.9          42.2
b. Progress tests      Greece     1.7    16.7      32.0          41.6
                       Cyprus     4.4     8.9      38.9          40.0
c. Achievement tests   Greece    15.9    22.4      29.2           9.9
                       Cyprus    15.6    11.0      25.6          22.2
d. Diagnostic tests    Greece    23.8     6.2       5.1           6.5
                       Cyprus    23.3     7.8       8.9           5.6
e. Placement tests     Greece    35.1     0.6       0.3           0.3
                       Cyprus    14.4     3.3        –            1.1
With regard to the types of assessment methods teachers use to test their students' reading skills, the results showed that teachers use a variety of methods (see Table 6).
Table 6. Assessment methods for reading

What testing and assessment methods do you use to evaluate your students' reading skills?
                                                  Greece   Cyprus
Teacher-oriented methods
2. Ask oral questions on the reading text           84.4     84.4
3c. True-false items                                79.6     82.2
3d. Matching items                                  72.2     77.8
3h. Short answer questions                          60.3     57.8
3b. Sentence completion items                       53.8     57.8
3e. Multiple-choice items                           51.3     48.9
3a. Cloze items                                     39.1     52.2
3i. Editing an already written text                 33.1     35.6
3g. Fill in forms                                   32.3     52.2
3f. Interpretative items                            25.2     32.2
Student-conducted methods
1. Read text aloud                                  57.2     45.6
3j. Student summaries of what they read             34.0     55.6
7. Self-assessment                                  26.3     35.6
6. Peer-assessment                                  22.9     26.7
5. Student portfolio                                12.7     26.7
4. Student diary/journal                             6.2     20.0
Standardised testing
8. Published reading tests                          20.7     20.0
Other                                                0.3      2.2
However, teacher-oriented methods are used the most, especially objectively scored items such as true-false and matching items, while items requiring more subjective judgement, such as editing an already written text, filling in forms or interpretative items, are favoured less. By contrast, teachers generally make less frequent use of student-conducted methods, with student diaries/journals used the least. Interestingly, approximately 20% of the teachers use published reading tests even though there is no requirement for them to do so. Teachers reported using various methods for the assessment of writing, too (see Table 7). The most popular methods are teacher-oriented, especially direct items such as composition/essay writing and editing a piece of writing, followed by indirect items such as matching, multiple-choice and true/false items. On the other hand, with the exception of project work, student-conducted methods are used less frequently. Finally, as with reading, teachers use standardised tests even though there is no need for them to do so. The results further showed that four teacher-oriented methods are used for the assessment of speaking and listening (see Table 8).
The most popular method among Greek teachers is oral reading/dictation, while teachers in Cyprus prefer multiple-choice items for the assessment of listening. Twice as many student-conducted methods are used here, perhaps due to the nature of the skills examined. Of all the methods, oral interviews/dialogues and oral discussions with individual students are given priority. Interestingly, alternative forms of assessment such as peer- and self-assessment remain the least used methods in both contexts. Finally, as with the other language skills, published tests are used for the testing of listening and speaking. Despite small differences between the methods used in the two contexts (see Table 9), nine teacher-oriented methods are used for the assessment of grammar and vocabulary, with discrete-point tasks such as sentence completion, sentence transformation and true/false items used most frequently.

Table 9. Assessment methods for grammar and vocabulary
What testing and assessment methods do you use to evaluate your students' grammar and vocabulary skills?
                                                              Greece   Cyprus
Teacher-oriented methods
1b. Sentence completion items                                   81.3     82.2
1g. Identify grammatical error(s) in a sentence                 67.7     63.3
1d. Sentence transformations                                    66.0     77.8
1e. True/False                                                  65.7     82.2
1c. Multiple-choice items                                       64.6     52.2
1a. Cloze items                                                 63.7     74.4
1i. Composition or essay writing                                42.8     65.6
1h. Edit a piece of writing such as a sentence or a paragraph   39.9     44.4
1f. Translation from L1 to L2 and vice-versa                    30.3     18.9
Student-conducted methods
4. Self-assessment                                              21.2     27.8
3. Peer-assessment                                              17.3     24.4
5. Student portfolio                                            11.9     25.6
2. Student diary/journal                                         5.7     22.2
Standardised testing
6. Published grammar and vocabulary tests                       28.9     22.2
Other                                                            0.3      1.6
Student-conducted methods are used less frequently. In addition, translation from L1 to L2 and vice-versa is used by one third of the teachers in Greece, probably due to the requirements of the end-of-year exam, while published grammar and vocabulary tests are frequently used in both contexts.
Half of the teachers also use the internet as a source of test items or whole tests. Only a small percentage of teachers in Greece work together to develop their assessments, unlike in Cyprus, where there seems to be more communication and sharing of assessments among teachers. Feedback after an in-class test is most often presented to students in three ways: verbally, in the form of written comments, or as total test scores (see Table 11).
A small number of teachers hold private meetings with students (conferences with students) or use letter grades or checklists as a feedback mechanism during the course. As clarified under 'Other', some teachers also conduct follow-up discussions with their students after a test, that is, they correct the test in the classroom by going through individual test items, providing correct answers, justifying them and discussing individual errors, while others simply comment on the overall performance of the class.
The majority of teachers followed the same pattern observed for feedback, that is, they mainly provide total test scores and written comments in students' final reports (see Table 12). These findings are in conformity with the directives of the Ministry of Education, which requires teachers to present their feedback as total test scores, while in some cases (e.g. primary school reports) there is some space for written comments on students' reports.
However, only one third of the teachers had received training from such organisations. Among these institutions and organisations were various teachers' associations, the Pedagogic Institute and the Training Organisation of Educators, which offered LTA training, though this was attended by only a small number of teachers. Interestingly, teachers received LTA training from external boards, too.
in the EFL curricula (Ministry of Education and Religious Affairs 1999, 2003) and decrees (Presidential Decree 4230 1996) (see Table 14).
Teachers favour the use of teacher-oriented assessment methods in both countries. For example, objectively scored items such as true/false, multiple-choice, sentence completion and sentence transformation are among the highest-ranking task types for the assessment of most skills. A number of teachers in Greece, in particular, also favoured dictation, influenced by the guidelines for the achievement tests, and translation techniques, due to the requirements of the English paper included in the university entrance exams (called the Panhellenic Exams in Greece). On the other hand, student-conducted assessments, such as self- and peer-assessment or the student portfolio, are used to a lesser extent. An exception is that teachers prefer student-conducted methods slightly more for the assessment of listening and speaking, perhaps due to the nature of the skills tested. As mentioned earlier, teachers also used samples of external exams for their in-class assessment. Such a tendency can be attributed either to lack of teacher training in LTA or to the influence of private language schools, where preparation for external exams is a frequent practice, given that the majority of state school teachers worked for a number of years in such educational contexts before being employed in state schools. In summary, in both countries teacher-oriented methods were used the most (see Table 15). As a result, teachers mainly employ summative assessment as opposed to formative or diagnostic assessment for their in-class assessment. This runs contrary to the educational policy described in the official documents
that recommend the use of a variety of student-oriented methods for classroom assessment (Ministry of Education and Religious Affairs 1999, 2003).
Furthermore, teachers appear to rely on available print sources (e.g. test booklets) for their assessments. This does not come as a surprise, as textbooks are very often accompanied by test booklets, which teachers use quite extensively (see also Tsagari 2007). In addition, teachers' favourite methods for providing feedback after an assessment event are verbal feedback and written comments, while their end-of-course assessment feedback takes the form of a structured report, that is, total test scores, rather than a combination of qualitative and quantitative feedback, as suggested in the official documents. The research results also showed that teachers, especially in the Greek context, are not accustomed to reviewing the assessment questions or tasks they use for their in-class assessments and do not discuss them critically with peers. As a consequence, the teachers under study do very little reflection on what is being assessed and are unaware of the assessment work of their colleagues. In addition, the results showed that the teachers participating in the study are not sufficiently trained in LTA. Only half of them had received any training in areas of LTA, and this training was not extensive: it was mostly limited to topics included in courses or workshops of limited duration. This finding is in agreement with other research showing that state school teachers generally do not receive high-quality professional development opportunities (Karavas 2008). Taken together, these survey findings helped in exploring how the participating EFL teachers experienced their roles as assessors of student work. They revealed that progress toward an assessment-literate culture has been rather slow and that EFL teachers in the Greek and Cypriot contexts need to acquire higher degrees of assessment literacy.
The results also illustrate some of the complexity of assessment and evaluation practices in EFL courses in the two countries. For example, the study showed that assessment and evaluation in the present contexts of inquiry are a necessary but complex undertaking reflecting
the nature of the teaching and learning environment where assessment takes place. The tendencies in assessment and evaluation practices observed in this study seem to have been influenced by the interplay of the following internal and external factors:

Internal factors:
- Teachers' instructional beliefs and attitudes
- Teachers' views of the role of assessment
- Students' learning and assessment needs
- The teaching and learning environment (size of classes, teaching resources)

External factors:
- Mandated assessment policies (Ministry of Education, through Presidential Decrees)
- Impact of external testing (e.g. international standardised language exams, university entrance exams)
- Involvement of other stakeholders, e.g. parents

These factors seem to have collectively contributed to the preferences in the choice of assessment types used by the EFL teachers in the study.
5. Limitations
The findings described above are based on EFL instructors' self-reports of their assessment practices and attitudes, as elicited through a survey questionnaire. As with any such survey, there is the potential that not all teachers interpreted the survey questions in the same fashion or, perhaps, that they responded as they did so as not to lose face. In the case of the Cypriot teachers, the results may be slightly inflated due to the smaller number of teachers compared to their Greek colleagues. In addition, many of the Cypriot participants work on a temporary basis in the private sector, where they may have more freedom to use a variety of assessment tools. In an attempt to confirm the findings of the present study and to gain a better understanding of why the teachers implement assessment in the ways that they do, the following data are now being collected:
- follow-up interviews with teachers, school advisors and inspectors, and
- samples of teachers' tests and other official documents.
References
Alderson, J.C. (2005). Diagnosing Foreign Language Proficiency: The Interface between Learning and Assessment. London / New York: Continuum.
Arkoudis, S. and K. O'Loughlin (2004). Tensions between validity and outcomes: Teacher assessment of written work of recently arrived immigrant ESL students. Language Testing, 21/3: 284-304.
Black, P. and D. Wiliam (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy and Practice, 5/1: 7-74.
Breen, M.P., C. Barratt-Pugh, B. Derewianka, H. House, C. Hudson, T. Lumley, and M. Rohl (1997). Profiling ESL Children: How Teachers Interpret and Use National and State Assessment Frameworks (Volumes 1-3). Canberra: DEETYA.
Brindley, G. (1989). Assessing Achievement in the Learner-centred Curriculum. Sydney: National Centre for English Language Teaching and Research, Macquarie University.
Brindley, G. (1997). Assessment and the language teacher: Trends and transitions. The Language Teacher, 21/9. Available: http://jalt-publications.org/old_tlt/files/97/sep/brindley.html [28/9/2011]
Cheng, L. (2005). Changing Language Teaching through Language Testing: A Washback Study. Cambridge: Cambridge University Press.
Cheng, L., T. Rogers, and H. Huiqin (2004). ESL/EFL instructors' classroom assessment practices: Purposes, methods, and procedures. Language Testing, 21/3: 360-389.
Cumming, A. (2001). ESL/EFL instructors' practices for writing assessment: Specific purposes or general purposes? Language Testing, 18: 207-224.
Davison, C. (2004). The contradictory culture of teacher-based assessment: ESL teacher assessment practices in Australian and Hong Kong secondary schools. Language Testing, 21/3: 305-334.
Gardner, J. (2010). Developing teacher assessment: An introduction. In J. Gardner, W. Harlen, L. Hayward and G. Stobart (eds), Developing Teacher Assessment. Maidenhead: Open University Press, 1-11.
Goslin, D.A. (1967). Teachers and Testing (2nd ed.). New York: Russell Sage Foundation.
Gullickson, A.R. (1984). Teacher perspectives of their instructional use of tests. Journal of Educational Research, 77: 244-248.
Harlen, W. and R. Deakin Crick (2003). Testing and motivation for learning. Assessment in Education: Principles, Policy & Practice, 10/2: 169-207.
Hasselgreen, A. (2008). Literacy in classroom assessment (CA): What does this involve? Paper presented at the 5th Annual Conference of EALTA, Athens, Greece. [online] Available: http://www.ealta.eu.org/conference/2008/docs/sunday/panel/Literacy%20in%20classroom%20assessment.pdf [04/09/2011]
Hasselgreen, A., C. Carlsen, and H. Helness (2004). European Survey of Language Testing and Assessment Needs. Part One: General Findings. [online] Available: www.ealta.eu.org/resources [9/11/2008]
Kaftandjieva, F. (2008). Assessment literacy in Europe and beyond: Realities and prospects. Paper presented at the 5th Annual Conference of EALTA, Athens, Greece. [online] Available: http://www.ealta.eu.org/conference/2008/docs/sunday/panel/Panel.pdf [04/09/2011]
Karavas, K. (2008). How satisfied are Greek EFL teachers with their work? A study of the motivation and job satisfaction levels of the Greek EFL teacher. Paper presented at the 29th Annual TESOL Greece Convention, Hellenic American Union, Athens, Greece.
Karavas-Doukas, E. (1996). Using attitude scales to investigate teachers' attitudes to the communicative approach. English Language Teaching Journal, 50/3: 187-198.
Leung, C. (2005). Classroom teacher assessment of second language development: Construct as practice. In E. Hinkel (ed.), Handbook of Research in Second Language Teaching and Learning. Mahwah, NJ: Lawrence Erlbaum, 869-888.
Mavrommatis, Y. (1997). Understanding assessment in the classroom: Phases of the assessment process – the assessment episode. Assessment in Education, 4/3: 381-400.
McCallum, B., C. Gipps, S. McAlister and M. Brown (1995). National Curriculum assessment: Emerging models of teacher assessment in the classroom. In H. Torrance (ed.), Evaluating Authentic Assessment. Buckingham: Open University Press, 57-87.
Ministry of Education and Religious Affairs (1999). English Language Curriculum for Senior High Schools. Athens, Greece: Ministry of Education and Religious Affairs.
Ministry of Education and Religious Affairs/Pedagogic Institute (2003). Cross-thematic Curriculum Framework for Compulsory Education: English. [online] Available: http://www.pi-schools.gr/lessons/english/pdf/14depps_XenonGlossonAgglika.pdf [28/9/2011]
Newfields, T. (2007). Engendering assessment literacy: Narrowing the gap between teachers and testers assessing foreign language performances. In T. Newfields, I. Gledall, M. Kawate-Mierzejewska, Y. Ishida, M. Chapman, and P. Ross (eds), Proceedings of the 2007 KELTA International Conference, August 25, 2007. College of Education, Seoul National University, 22-42.
Presidential Decree 4230 (1996). Government Gazette.
Reckase, M. (2008). Assessment literacy. Paper presented at the 5th Annual Conference of EALTA, Athens, Greece. [online] Available: http://www.ealta.eu.org/conference/2008/docs/sunday/panel/Reckase.pdf [04/09/2011]
Rogers, T. (1991). Educational assessment in Canada: Evolution or extinction? The Alberta Journal of Educational Research, 37: 179-192.
Shohamy, E. (1992). Beyond proficiency testing: A diagnostic feedback testing model for assessing foreign language learning. Modern Language Journal, 76/4: 513-521.
Shohamy, E. (2001). The Power of Tests: A Critical Perspective on the Uses of Language Tests. London: Longman.
Standards for Teacher Competence in Educational Assessments of Students (1990). Washington, DC: American Federation of Teachers, National Council on Measurement in Education, and National Education Association. [online] Available: http://www.unl.edu/buros/bimm/html/article3.html [28/9/2011]
Stiggins, R. and N. Conklin (1992). In Teachers' Hands: Investigating the Practice of Classroom Assessment. Albany, NY: SUNY Press.
Stiggins, R.J. (1999a). Learning teams can help educators develop their collective assessment literacy. Journal of Staff Development, 20/3. [online] Available: http://www.nsdc.org/library/publications/jsd/stiggins203.cfm [28/9/2011]
Stiggins, R.J. (1999b). Teams. Journal of Staff Development, 20/3. [online] Available: http://www.nsdc.org/library/publications/jsd/stiggins203.cfm [28/9/2011]
Stiggins, R.J. (2001). Student-Involved Classroom Assessment (3rd ed.). Upper Saddle River, NJ: Prentice-Hall.
Taylor, L. (2009). Developing assessment literacy. Annual Review of Applied Linguistics, 29: 21-36.
Tsagari, C. (2007). Investigating the Washback Effect of a High-Stakes EFL Exam in the Greek Context: Participants' Perceptions, Material Design and Classroom Applications. Unpublished PhD thesis, Department of Linguistics and English Language, Lancaster University, UK.
Tsagari, D. (2009). The Complexity of Test Washback: An Empirical Study. Frankfurt am Main: Peter Lang.
Tsagari, D. (2011). Investigating the assessment literacy of EFL state school teachers in Greece. In D. Tsagari and I. Csépes (eds), Classroom-based Language Assessment. Frankfurt: Peter Lang, 169-190.
Vogt, K., D. Tsagari, S. Sahinkarakas, Q. Arifi, E. Guerin and U. Migdal (forthcoming). Assessment literacy of foreign language teachers: Findings of a European study. Language Assessment Quarterly.
Wall, D. (2005). The Impact of High-Stakes Examinations on Classroom Teaching: A Case Study Using Insights from Testing and Innovation Theory. Cambridge: Cambridge University Press.
Wall, D. and J.C. Alderson (1993). Examining washback: The Sri Lankan impact study. Language Testing, 10/1: 41-69.
Wall, D. and T. Horak (2006). The Impact of Changes in the TOEFL Examination on Teaching and Learning in Central and Eastern Europe: Phase 1, the Baseline Study. TOEFL Monograph Series, Report Number RR-06-18, TOEFL-MS-34. Princeton, NJ: Educational Testing Service. [online] Available: http://www.ets.org/Media/Research/pdf/RR-06-18.pdf [20/9/2011]
Wilson, R.J. (1996). Assessing Students in Classrooms and Schools. Toronto, Canada: Allyn & Bacon.
Wilson, R.J. (1998). Aspects of validity in large-scale programs of student assessment. Paper presented at the Conference on Measurement and Evaluation: Current and Future Research Directions for the New Millennium, Banff, Canada.
Wilson, R.J. (2000). A model of assessment-in-practice. Paper presented at the Annual Conference of the Canadian Society for the Study of Education, Edmonton, Alberta.
Yin, M. (2010). Understanding classroom language assessment through teacher thinking research. Language Assessment Quarterly, 7/2: 175-194.
Appendix
Survey on Practices and Training Needs of EFL Teachers in Language Testing and Assessment
In order to determine the LTA practices and needs of Greek EFL teachers, we kindly ask you to fill in the following questionnaire. Based on the findings, we will have a clearer picture of your testing and assessment needs and will be in a better position to propose pre-service and in-service training courses that meet the needs identified. Thank you for your co-operation!
A. Biographical information
Please tick your answers unless otherwise stated.
1. Age: (1) 21-30 o (2) 31-40 o
2. Gender: (1) Male o (2) Female o
3. Years of teaching experience: (1) 0-1 o (2) 2-5 o (3) 6-10 o (4) 11+ o
4. Professional qualifications:
(1) BA in English Language and Literature o
(2) MA in ____________________________________________
(3) Other: ________________________________________________
5. Current (main) teaching situation:
(1) State Primary School o
(2) State Secondary School (Gymnasium) o
(3) Secondary School (General Lyceum) o
(4) Secondary level (Technical & Vocational Lyceum) o
(5) Other: __________________________________________________ o
B. Purposes of LTA
1. Do you assess your students' language skills at your school?
(1) YES o (2) NO o
If no, please give reasons why:
________________________________________________________________
2. Why do you assess your students? Please tick (✓) all that apply.
a) To place students at appropriate levels o
b) To obtain information about my students' progress o
c) To plan my teaching o
d) To motivate my students to learn o
e) To make my students work harder o
f) To prepare my students for published tests they will take in the future o
g) To determine students' final grades o
h) To provide information to central administration (e.g. school, Ministry) o
i) Other: _______________________________________
3. What kind of tests do you use? Please tick (✓) all that apply.
a) Mini-quizzes o
b) Progress tests o
c) Achievement tests (end of the term/year) o
d) Diagnostic tests o
e) Placement tests o
f) Other: ______________________________________
4. Which of these tests do you prepare yourself? Please tick (✓) all that apply.
a) Mini-quizzes o
b) Progress tests o
c) Achievement tests o
d) Diagnostic tests o
e) Placement tests o
f) Other: ______________________________________
5. How often do you use them in the school year? Please tick (✓) all that apply.
                        Once   Twice   Three times   More often
a. Mini-quizzes           o      o          o             o
b. Progress tests         o      o          o             o
c. Achievement tests      o      o          o             o
d. Diagnostic tests       o      o          o             o
e. Placement tests        o      o          o             o
f. Other: ______          o      o          o             o
What assessment methods do you use to evaluate your students?

a. Reading
0. If you do not test reading, please put a tick (✓) here o and move to the next section.
1. If you do, please put a tick (✓) here o and continue below.
2. Tick (✓) the method/s you use to evaluate your students in reading.
1. Read text aloud o
2. Ask oral questions on the reading text o
3. Teacher-made tests containing:
a) cloze items o
b) sentence completion items o
c) true-false items o
d) matching items o
e) multiple-choice items o
f) interpretative items (e.g. read a passage and interpret a map or a set of directions) o
g) fill-in forms (e.g. read a passage and fill in an application form or an order form of some kind) o
h) short answer items o
i) editing an already written text o
j) writing a summary of the text o
4. Student diary/journal o
5. Student portfolio o
6. Peer-assessment o
7. Self-assessment o
8. Published reading tests o
9. Other? Please specify here:

b. Writing
0. If you do not test writing, please put a tick (✓) here o and move to the next section.
1. If you do, please put a tick (✓) here o and continue below.
2. Tick (✓) the method/s you use to evaluate your students in writing.
1. Teacher-made tests containing:
a) matching items o
b) true-false items o
c) multiple-choice items to identify grammatical error(s) in a sentence o
d) elaborating/expanding a text o
e) composition or essay writing o
2. Student diary/journal o
3. Peer-assessment o
4. Self-assessment o
5. Student portfolio o
6. Project work o
7. Published writing tests o
8. Other? Please specify here:
147
c. Speaking and listening
0. If you do not test speaking and listening, please put a tick (✓) here o and move to the next section.
1. If you do, please put a tick (✓) here o and continue below.
2. Tick (✓) the method/s you use to evaluate your students in speaking and listening.
1. Oral reading/dictation o
2. Oral interviews/dialogues o
3. Oral discussion with each student o
4. Oral presentations o
5. Public speaking o
6. Teacher-made tests requiring students to:
a) give oral directions o
b) follow directions given orally o
c) provide an oral description of an event or object o
d) prepare summaries of what is heard o
e) answer multiple-choice test items following a listening passage o
f) take notes o
g) retell a story after listening to a passage o
7. Peer-assessment o
8. Self-assessment o
9. Published speaking tests o
10. Published listening tests o
11. Other? Please specify here:

d. Grammar and vocabulary
0. If you do not test grammar and vocabulary, please put a tick (✓) here o and move to the next section.
1. If you do, please put a tick (✓) here o and continue below.
2. Tick (✓) the method/s you use to evaluate your students in grammar and vocabulary.
1. Teacher-made tests containing:
a) cloze items o
b) sentence completion items o
c) multiple-choice items o
d) sentence transformations o
e) true-false items o
f) translation from L1 to L2 and vice versa o
g) identifying grammatical error(s) in a sentence o
h) editing a piece of writing such as a sentence or a paragraph o
i) composition or essay writing o
2. Student diary/journal o
3. Peer-assessment o
4. Self-assessment o
5. Student portfolio o
6. Published grammar/vocabulary tests o
7. Other? Please specify here:
Please tick (✓) all that apply.
1. Which of the following represents your primary source(s) for test questions/items and other assessment procedures?
a. Items developed by myself o
b. Items prepared together with other teachers o
c. Items from published textbook materials o
d. Items found on the Internet o
e. Other published test items o
f. Other (please specify): _________________________________________
2. When you give feedback to your students after a test, how do you do so?
a. Verbal feedback o
b. Checklist o
c. Written comments o
d. Conference with student o
e. Total test score o
f. A letter grade o
g. Other (please specify): ______________________________________
3. When you give final evaluation feedback to your students, how do you do so?
a. Checklist o
b. Written comments o
c. Total test score o
d. A letter grade o
e. Other (please specify): _____________________________________
E. Training in LTA
1. During your pre-service or in-service teacher training, did you learn anything about LTA (theory and practice)?
0. No o
1. Yes o
2. If yes above, please tick (✓) all that apply.
a. Yes, I have completed a full course on testing and assessment. o
b. Yes, I have completed a course in which testing and assessment were covered as topics. o
c. Yes, I have completed a workshop on testing and assessment. o
3. If yes above, who organised this? Please give details:
a. Regional Training Centres (PEK) o
b. Pedagogical Institute o
c. Training Organisation of Educators (OEPEK) o
If you have any further comments you wish to make in relation to EFL testing and assessment in state school education, please use the space below: ______________________________________________________________________________ ______________________________________________________________________________ _____________________________________________________________________ ______________________________________________________________________________ ______________________________________________________________________________ _____________________________________________________________________
If you are interested in receiving the results of the survey please submit your e-mail address: Thank you!