
Adaptive time warp simulation of timed Petri nets

Author: Ferscha, A. (Inst. für Angewandte Inf., Wien Univ., Austria)

Time warp (TW), although generally accepted as a potentially effective parallel and distributed simulation mechanism for timed Petri nets, can reveal deficiencies in certain model domains. In particular, the unlimited optimism underlying TW can lead to excessive memory consumption due to the saving of state histories, and to wasted CPU cycles due to over-optimistically progressing simulations that eventually have to be rolled back. Furthermore, in TW simulations executing in distributed memory environments, the communication overhead induced by the rollback mechanism can cause pathological overall simulation performance. In this work, an adaptive optimism control mechanism for TW is developed to overcome these shortcomings. By monitoring and statistically analyzing the arrival processes of synchronization messages, TW simulation progress is probabilistically throttled based on the forecasted timestamp of forthcoming messages. Two classes of arrival process characterizations are studied, reflecting the natural trade-off between computational and space complexity on the one hand and prediction accuracy on the other: forecasts based on measures of central tendency are computationally cheap but yield inadequate predictions for correlated arrivals (negatively affecting performance), whereas time-series-based forecast methods give higher prediction accuracy at higher computational cost.
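The abstract itself contains no code; purely as an illustrative sketch of the throttling idea described above (not the paper's actual mechanism), the Python fragment below forecasts the timestamp of the next incoming synchronization message from recent arrivals and probabilistically holds back an over-optimistic process. The class and function names, the "mean" versus exponential-smoothing forecasts, and the throttling probability formula are all assumptions made for this example.

import random
from collections import deque


class ArrivalForecaster:
    """Forecasts the virtual-time stamp of the next incoming synchronization
    message from recently observed message timestamps.

    Two forecast classes mirror the trade-off described above: a cheap
    central-tendency estimate (mean timestamp increment) and a slightly
    costlier time-series style estimate (exponential smoothing), which
    tracks correlated arrivals better. All names and formulas here are
    illustrative assumptions, not the paper's actual estimators.
    """

    def __init__(self, method="mean", window=32, alpha=0.3):
        self.method = method              # "mean" or "smoothing"
        self.increments = deque(maxlen=window)
        self.alpha = alpha
        self.last_ts = None
        self.smoothed = None

    def observe(self, ts):
        """Record the timestamp of a newly arrived synchronization message."""
        if self.last_ts is not None:
            inc = ts - self.last_ts
            self.increments.append(inc)
            self.smoothed = inc if self.smoothed is None else (
                self.alpha * inc + (1.0 - self.alpha) * self.smoothed)
        self.last_ts = ts

    def forecast_next(self):
        """Predicted timestamp of the forthcoming message."""
        if self.last_ts is None or not self.increments:
            return float("inf")           # no history yet: stay fully optimistic
        if self.method == "mean":
            inc = sum(self.increments) / len(self.increments)
        else:
            inc = self.smoothed
        return self.last_ts + inc


def should_throttle(local_virtual_time, forecaster, aggressiveness=1.0):
    """Probabilistically hold back event execution once the local clock has
    run past the forecast timestamp of the next incoming message; the larger
    the overshoot, the more likely the process is throttled."""
    predicted = forecaster.forecast_next()
    if local_virtual_time <= predicted:
        return False                      # still behind the forecast: keep going
    overshoot = local_virtual_time - predicted
    p = min(1.0, aggressiveness * overshoot / (overshoot + 1.0))
    return random.random() < p

In such a sketch, each incoming message would be passed to observe(), and should_throttle() would be consulted before executing the next local event; a throttled process simply idles until the forecast catches up, trading some optimism for fewer rollbacks.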

Published in:

IEEE Transactions on Software Engineering (Volume 25, Issue 2)