Beyond multiple choice exams: Using computerized lexical analysis to understand students' conceptual reasoning in STEM disciplines

6 Author(s)

Constructed response questions, in which students must use their own language to explain a phenomenon, create more meaningful opportunities for instructors to identify their students' learning obstacles than multiple choice questions do. However, the realities of typical large-enrollment undergraduate classes restrict the options faculty have for moving toward more learner-focused instruction. We are exploring the use of computerized lexical analysis of students' writing in large-enrollment undergraduate biology and geology courses. We have created libraries that categorize student responses with > 90% accuracy. These categories can be used to predict expert ratings of student responses with accuracy approaching the inter-rater reliability among expert raters. These techniques also provide insight into students' use of analogical thinking, a fundamental part of scientific modeling, and they have potential for improving assessment practices across STEM disciplines.
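
To give a concrete sense of what computerized lexical analysis of constructed responses can involve, the sketch below is a minimal, hypothetical Python example of keyword-based categorization. The category names, lexicons, and sample response are invented for illustration; they are not drawn from the paper's analysis libraries, which are far richer and are validated against expert ratings.

    # Illustrative sketch only: a minimal keyword-based lexical categorizer.
    # Category names, keyword lists, and the sample response are hypothetical,
    # not taken from the paper or its actual analysis libraries.

    import re
    from collections import Counter

    # Hypothetical category lexicons; real libraries contain many more terms.
    CATEGORY_LEXICONS = {
        "energy": {"energy", "atp", "heat", "light"},
        "matter": {"mass", "atom", "molecule", "carbon"},
        "process": {"photosynthesis", "respiration", "weathering", "erosion"},
    }

    def tokenize(text):
        """Lowercase a student response and split it into word tokens."""
        return re.findall(r"[a-z']+", text.lower())

    def categorize(response):
        """Count how many terms from each category lexicon appear in the response."""
        token_counts = Counter(tokenize(response))
        return {
            category: sum(token_counts[word] for word in lexicon)
            for category, lexicon in CATEGORY_LEXICONS.items()
        }

    if __name__ == "__main__":
        example = ("The plant converts carbon dioxide into sugar, "
                   "storing energy through photosynthesis.")
        print(categorize(example))
        # {'energy': 1, 'matter': 1, 'process': 1}

In practice, category counts like these would feed a statistical model trained to reproduce expert ratings of the responses, rather than being interpreted directly.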

Published in:

2009 39th IEEE Frontiers in Education Conference (FIE '09)

Date of Conference:

18-21 Oct. 2009