Test Item Formats in Finnish Chemistry Matriculation Examinations


Authors

G. Tikkanen & M. Aksela

DOI:

https://doi.org/10.51724/ijpce.v4i2.104

Keywords:

Chemistry matriculation examination questions, Classification, Constructed-response items, Selected-response items, Test item formats

Abstract

Summative assessment plays an essential role in chemistry education. This paper presents an analysis of Finnish chemistry matriculation examination questions according to test item format, together with examples of the analysis and of the examination questions. The research data consisted of 257 chemistry questions from 28 matriculation examinations administered between 1996 and 2009. A qualitative approach and a theory-driven content analysis method were employed. The research was guided by the following question: What kinds of test item formats are used in chemistry matriculation examinations? The findings indicate that summative assessment was used in diverse ways in the chemistry matriculation examinations: the tests included various test item formats and combinations of them. The majority of the test questions were constructed-response items, which were verbal, quantitative, laboratory-related, or symbol items, or combinations of these. The examinations seldom included selected-response items, which can be multiple-choice, binary-choice, or matching items. The classification framework developed in this research can be applied in chemistry and science education, as well as in educational research.
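
The two item-format families named in the abstract (constructed-response and selected-response) and their subtypes form a small coding taxonomy. As a rough illustration only, and not the authors' actual instrument, the Python sketch below models that taxonomy and tallies coded questions, counting combined formats as their own category; the names ConstructedResponse, SelectedResponse, and tally_formats are hypothetical.

```python
from collections import Counter
from enum import Enum


class ConstructedResponse(Enum):
    """Constructed-response formats named in the abstract."""
    VERBAL = "verbal"
    QUANTITATIVE = "quantitative"
    LABORATORY = "laboratory-related"
    SYMBOL = "symbol"


class SelectedResponse(Enum):
    """Selected-response formats named in the abstract."""
    MULTIPLE_CHOICE = "multiple-choice"
    BINARY_CHOICE = "binary-choice"
    MATCHING = "matching"


def tally_formats(coded_items):
    """Count how often each format, or combination of formats, occurs.

    `coded_items` is a list of frozensets of Enum members, one per
    examination question, so a question coded with several formats is
    counted as a combined category.
    """
    return Counter(frozenset(item) for item in coded_items)


if __name__ == "__main__":
    # Hypothetical coding of three questions, for illustration only.
    coded = [
        frozenset({ConstructedResponse.VERBAL}),
        frozenset({ConstructedResponse.QUANTITATIVE, ConstructedResponse.SYMBOL}),
        frozenset({SelectedResponse.MULTIPLE_CHOICE}),
    ]
    for formats, count in tally_formats(coded).items():
        names = " + ".join(sorted(f.value for f in formats))
        print(f"{names}: {count}")
```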


Published

08/19/2012

How to Cite

Tikkanen, G., & Aksela, M. (2012). Test Item Formats in Finnish Chemistry Matriculation Examinations. International Journal of Physics and Chemistry Education, 4(2), 157–172. https://doi.org/10.51724/ijpce.v4i2.104