Abstract:
Over the last few years, electronic exams have received increasing attention, as they offer far more potential than simply replacing paper-based questions with electronically supported multiple-choice questions, or replacing blank paper with text fields in a web browser or PDF form. Especially in the area of software development and software engineering, both students and teachers expect to benefit from an exam environment in which students can use the same set of tools they are accustomed to from working on their homework exercises. Furthermore, teachers expect electronic exams to provide support for automatic assessment. In this paper, we investigate applying electronic exams to introductory programming courses. In these exams, students solve programming problems using an integrated development environment and hand in source code as their solution. This, however, requires a shift in how exam tasks are formulated, in order to align them with learning objectives. It also requires careful investigation of the students' ability to cope with the environment, with the rigorous supervision by a compiler and other checkers under time restrictions, and, possibly, with the nerves of an exam situation. To this end, we discuss how programming tasks can be designed to address all levels of Bloom's taxonomy of learning objectives, thereby implementing constructive alignment. Furthermore, we analyze our students' behavior during exams and their perception of this kind of exam. We also analyze the resulting solutions and identify typical error patterns.
Date of Conference: 09-12 November 2020
Date Added to IEEE Xplore: 14 October 2020
Print ISBN: 978-1-7281-6807-4
Print ISSN: 2377-570X