
Comparing object-oriented database systems benchmark methods

Author: Jia-Lang Seng (Dept. of Manage. Inf. Syst., Nat. Cheng-Chi Univ., Taipei, Taiwan)

Abstract:

With the increasing use of object-oriented database software, system performance has become an important issue in the system procurement and evaluation process. It is vital to understand and predict the functionality and performance characteristics of the systems to be procured in real-world settings, so as to justify the use of object-oriented systems over traditional systems. Benchmarks are the most common approach to testing and measuring the quantitative performance of database systems. A set of de facto standard benchmarks, such as OO1, HyperModel, ACOB, and OO7, has been receiving growing attention and acceptance. The intent of this paper is therefore to provide an in-depth analysis of these benchmarks and a systematic comparison of them against a multi-faceted set of comparison criteria. Through this comparison and contrast, we point out the essential features of a desirable database benchmark.

Published in:

Proceedings of the Thirty-First Hawaii International Conference on System Sciences, 1998 (Volume 6)

Date of Conference:

6-9 Jan. 1998