Improving Robustness of Spacecraft Downlink Schedules

Authors: A. Oddi and N. Policella, Italian National Research Council, Rome

In the realm of scheduling problems, different sources of uncertainty can invalidate planned solutions: unpredictable activity behavior, machine breakdowns, new activities to be served, and so on. In this paper, we are concerned with generating high-quality downlink schedules in a spacecraft domain in the presence of a high degree of uncertainty. In particular, we consider a combinatorial optimization problem called the Mars-Express memory dumping problem (MEX-MDP), which arose in the European Space Agency (ESA) program Mars-Express. A MEX-MDP consists of generating dumping commands that maximize the download of data sets from the satellite to the ground. The domain is characterized by several kinds of constraints - such as bounded onboard memory capacities, limited communication windows over the downlink channels, and deadlines and ready times imposed by the payload requirements - and different sources of uncertainty - such as the amount of data generated by each scientific observation or the channel data rate. In this paper, we tackle the problem by reducing the MEX-MDP to a max-flow problem: the former has a solution when the maximum flow in the latter equals the total amount of data to dump. Given this reduction, we introduce a novel definition of solution robustness based on the utilization of the onboard memory, as well as an iterative procedure to improve solution quality. The key idea behind this approach is that the lower the memory utilization, the higher the ability to cope with an unexpectedly large amount of data.
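The feasibility test described above can be illustrated with a minimal sketch: build a flow network with a source feeding each observation (capacity = amount of data produced), edges from observations to the communication windows they are allowed to use, and edges from each window to a sink (capacity = how much that window can carry); the instance is feasible exactly when the maximum flow equals the total data to dump. The instance below (the observation names, window capacities, and compatibility table) is entirely hypothetical and only demonstrates the shape of the reduction, not the paper's actual MEX-MDP model.

```python
from collections import deque, defaultdict

def max_flow(cap, s, t):
    """Edmonds-Karp max flow on a capacity dict-of-dicts (mutated in place)."""
    flow = 0
    while True:
        # BFS for a shortest augmenting path from s to t in the residual graph
        parent = {s: None}
        q = deque([s])
        while q and t not in parent:
            u = q.popleft()
            for v, c in list(cap[u].items()):
                if c > 0 and v not in parent:
                    parent[v] = u
                    q.append(v)
        if t not in parent:
            return flow
        # Reconstruct the path, push the bottleneck amount along it
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(cap[u][v] for u, v in path)
        for u, v in path:
            cap[u][v] -= push
            cap[v][u] = cap[v].get(u, 0) + push
        flow += push

# Hypothetical instance: two observations, two downlink windows.
data = {"obs1": 40, "obs2": 30}        # units of data each observation produces
windows = {"win1": 50, "win2": 35}     # units each window can carry (rate x duration)
# An observation may dump only into windows between its ready time and
# its deadline; that timing check is abstracted into this table.
compatible = {"obs1": ["win1"], "obs2": ["win1", "win2"]}

cap = defaultdict(dict)
for o, d in data.items():
    cap["src"][o] = d
    for w in compatible[o]:
        cap[o][w] = d                  # observation may route all its data here
for w, c in windows.items():
    cap[w]["sink"] = c

total = sum(data.values())
f = max_flow(cap, "src", "sink")
print(f, total, f == total)            # feasible iff max flow equals total data
```

Here obs1 fills 40 of win1's 50 units, so obs2 splits its 30 units across the remaining capacity of win1 and win2, and the flow saturates the total of 70: the instance is feasible. Slack left in the window-to-sink edges is what the robustness measure in the paper exploits: the more spare downlink capacity a schedule leaves, the better it absorbs unexpectedly large observations.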

Published in: IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews (Volume 37, Issue 5)