MANAGEMENT STUDENTS’ ATTITUDE TOWARDS DIFFERENT KINDS OF EXAM FORMAT: A CASE FROM A UNIVERSITY IN NORWAY

Leiv Opstad

Abstract


Exams are an important tool for learning and for measuring students' knowledge and competence. However, there is no standard answer as to which exam format best serves these purposes, and students' preferences for different exam types vary considerably. This study focuses on students' attitudes towards the choice of exam format. The survey covers management students in Norway who are pursuing a master's degree in preparation for a career in the public sector. The results show that the students are well motivated to learn but sensitive to the choice of exam format, which affects their effort, motivation, and expected success. Views differ on which exam types are perceived as fair. The oral exam stands out in particular, as female students report considerable anxiety about this format.

 


Keywords


assessment evaluation, students’ attitudes, exam format, management students


DOI: http://dx.doi.org/10.46827/ejes.v9i10.4491


Copyright (c) 2022 Leiv Opstad

This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2015-2023. European Journal of Education Studies (ISSN 2501 - 1111) is a registered trademark of Open Access Publishing Group. All rights reserved.
