Analysis of interrater agreement in ISO/IEC 15504-based software process assessment

6 Author(s)

Hye-young Lee (Coll. of Bus. Adm., Korea Univ., Seoul, South Korea); Ho-Won Jung; Chang-Shin Chung; Jong Moo Lee; et al.

The emerging ISO/IEC 15504 standard provides a framework and a model for software process assessment and improvement. Reliable process assessment has two requirements: internal reliability and external reliability. The objective of this study is to provide an empirical case of external reliability, i.e. interrater agreement in ISO/IEC 15504-based software process assessment. Interrater agreement refers to the extent to which independent assessors agree in their ratings of software process attributes. Our dataset came from two assessments conducted using the ISO/IEC 15504 standard. The results showed "substantial" to "excellent" agreement, which implies that the two assessments achieved external reliability.
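The abstract does not say which agreement statistic produced the "substantial" to "excellent" labels, but Cohen's kappa is a common choice for two independent raters because it corrects observed agreement for chance. A minimal sketch, using hypothetical attribute ratings on the ISO/IEC 15504 N-P-L-F (Not/Partially/Largely/Fully achieved) scale:

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters rating the same items."""
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters scored identically.
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: from each rater's marginal rating distribution.
    ca, cb = Counter(ratings_a), Counter(ratings_b)
    pe = sum(ca[k] * cb[k] for k in ca) / (n * n)
    return (po - pe) / (1 - pe)

# Hypothetical ratings by two assessors of eight process attributes.
a = ["F", "F", "L", "P", "N", "L", "F", "P"]
b = ["F", "L", "L", "P", "N", "L", "F", "N"]
print(round(cohens_kappa(a, b), 3))  # → 0.667
```

On common benchmark scales (e.g. Landis and Koch), a kappa of 0.667 would fall in the "substantial" band, the lower end of the range the study reports.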

Published in:

Proceedings of the Second Asia-Pacific Conference on Quality Software, 2001

Date of Conference: