The Linguistic Association of Korea e-Journal

The Linguistic Association of Korea

Volume 28, Issue 2 (July 2020)

A Study of High School English Listening Assessment Items Based on a Discriminant Analysis

Nayu Kim · Heechul Lee

Pages: 1-14

DOI: https://doi.org/10.24303/lakdoi.2020.28.2.1


Abstract

Kim, Nayu & Lee, Heechul. (2020). A study of high school English listening assessment items based on a discriminant analysis. The Linguistic Association of Korea Journal, 28(2), 1-14. The purpose of this study is to investigate which types of listening assessment items play an important role in identifying high school students' English listening ability, and to explore what teachers should consider when designing and modifying those item types. A total of 187 high school students participated in this research and took an English listening test made up of twenty multiple-choice items. Each student's score was analyzed by discriminant analysis to determine which item types made a difference when the students were classified into three English listening levels: high, intermediate, and low. Two item types played a key role in distinguishing high-level listeners from the others: identifying the other person's response and inferring what one speaker asks the other. In contrast, low-level listeners responded appropriately to item types such as identifying the main idea of a dialogue and predicting a character's response following a dialogue. Based on these findings, English teachers can identify high- or low-level listeners easily and accurately using the above four types of listening test items. In addition, English teachers should ensure the validity and reliability of the items, and modify them appropriately in light of students' cognitive and affective development, listening strategies, and learning motivation in order to construct an effective listening assessment system.
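The classification procedure the abstract describes can be sketched in code. The snippet below is a minimal illustration, not the study's analysis: it simulates dichotomous (correct/incorrect) responses to twenty items for three hypothetical listening levels, with made-up success probabilities, and then applies linear discriminant analysis by hand (class means, pooled covariance, and a linear discriminant score per class) to assign a response pattern to a level.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 60 simulated students per level answering 20
# dichotomous items. The per-level success probabilities are illustrative
# assumptions, not values from the study.
levels = {"high": 0.9, "intermediate": 0.6, "low": 0.3}
X = np.vstack([rng.binomial(1, p, size=(60, 20)) for p in levels.values()])
y = np.repeat(list(levels), 60)

# Linear discriminant analysis: per-class mean vectors and the
# pooled within-class covariance matrix (equal priors assumed).
classes = np.unique(y)
means = np.array([X[y == k].mean(axis=0) for k in classes])
pooled = sum(np.cov(X[y == k].T, bias=False) * (np.sum(y == k) - 1)
             for k in classes) / (len(X) - len(classes))
inv = np.linalg.pinv(pooled)  # pseudo-inverse guards against singularity

def classify(x):
    # Linear discriminant score: delta_k(x) = x' S^-1 mu_k - 0.5 mu_k' S^-1 mu_k
    scores = [x @ inv @ m - 0.5 * m @ inv @ m for m in means]
    return classes[int(np.argmax(scores))]

print(classify(np.ones(20)))   # a student answering every item correctly
print(classify(np.zeros(20)))  # a student answering every item incorrectly
```

In the study itself, the interest lies less in the classification rule than in which items carry the largest discriminant weights, since those are the items that separate high-level from low-level listeners.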

Keywords

English listening assessment item; English listening level; discriminant analysis

References

  • Alderson, J. C. (2000). Assessing reading. Cambridge, UK: Cambridge University Press.
  • Chang, A. C.-S. (2008). Listening strategies of L2 learners with varied test tasks. TESL Canada Journal, 25(2), 15-16.
  • Batty, A. O. (2015). A comparison of video-and audio-mediated listening tests with many-facet Rasch modeling and differential distractor functioning. Language Testing, 32(1), 3-20.
  • Brown, H. D., & Abeywickrama, P. (2010). Language assessment: Principles and classroom practices (2nd ed.). NY: Pearson Education.
  • Brown, J. D. (2005). Testing in language programs: A comprehensive guide to English language assessment. New York: McGraw-Hill College.
  • Buck, G. (2001). Assessing listening. Cambridge: Cambridge University Press.
  • Carr, N. T. (2011). Designing and analyzing language tests. Oxford: Oxford University Press.
  • Chang, J. (2012). The construct validity of models of L2 listening ability. Modern English Education, 13(2), 207-229.
  • Cubilo, J. (2017). Video-mediated listening passages and typed note-taking: Examining their effects on examinee listening test performance and item characteristics (Unpublished doctoral dissertation). University of Hawaii at Manoa, Honolulu, HI.
  • Dekeyser, R. (2007). Practice in a second language: Perspectives from applied linguistics and cognitive psychology. Cambridge: Cambridge University Press.
  • Field, J. (2013). Cognitive validity. In A. Geranpayeh & L. Taylor (Eds.), Examining listening: Research and practice in assessing second language listening (pp. 77-151). Cambridge: Cambridge University Press.
  • Graham, S. (2006). Listening comprehension: The learners' perspective. System, 34, 165-182.
  • Gruba, P. (1997). The role of video media in listening assessment. System, 25(3), 335-345.
  • Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2010). Multivariate data analysis. New Jersey: Pearson Education.
  • Han, M. S., & Lee, J. S. (2010). An analysis of factors hindering English listening comprehension: Based on the English listening comprehension test of high school students. Secondary English Education, 3(1), 107-135.
  • Hughes, A. (2003). Testing for language teachers (2nd ed.). Cambridge, UK: Cambridge University Press.
  • Jang, S. Y., & Kim, T. (2016). Exploring validity of the nationwide standardized English listening tests using the Rasch measurement model. Studies in English Education, 21(2), 115-145.
  • Jung, H. (2010). An analysis of spoken English test items in Korean CSAT. Studies in English Education, 15(2), 177-197.
  • Katz, A. (2012). Linking assessment with instructional aims and learning. In C. Coombe, P. Davidson, S. Stoynoff, & D. Nunan (Eds.), Voices from the language classroom: Qualitative research on language education. Cambridge, UK: Cambridge University Press.
  • Kim, D. J. (2014). Analysis on learning effects of a university English class partially-integrating reading and listening subjects. Modern Studies in English Language & Literature, 58(2), 33-65.
  • Kim, G. S. (2007). A study of a teaching method for high score strategy of listening assessment of high school English learners. STEM Journal, 8(1), 141-168.
  • Kim, I. (2006). Effects and agenda of the nationwide English listening comprehension test in Korea. The Linguistic Association of Korea Journal, 14(1), 183-202.
  • Kunnan, A. J. (2009). The U.S. naturalization test. Language Assessment Quarterly, 6, 89-97.
  • Lee, B., Lee, Y. J., & Jun, H. (2016). The validity of the new item types of the listening section of an English proficiency test. Secondary English Education, 9(4), 141-165.
  • Lee, J. H. (2001). Some comprehensive suggestions for the improvement of the College Scholastic Abilities Test (CSAT). English Teaching, 56(2), 333-364.
  • Ockey, G. J. (2007). Construct implications of including still image or video in computer-based listening tests. Language Testing, 24(4), 517-537.
  • Oxford, R. (1990). Language learning strategies: What every teacher should know. Boston, MA: Heinle & Heinle.
  • Park, T. (2019). A comparative study of audio-mediated and video-mediated English listening tests. Secondary English Education, 12(3), 133-155.
  • In'nami, Y., & Koizumi, R. (2009). A meta-analysis of test format effects on reading and listening test performance: Focus on multiple-choice and open-ended formats. Language Testing, 26, 219-231.