Our goal is to predict the output of a parameterized computer simulation code given a database of outputs at different parameter values. To do so, we investigate a particular model reduction technique that interpolates the right singular vectors in the singular value decomposition of the matrix of outputs. A common observation about these singular vectors is that they become more oscillatory as their index increases. We use this property to split the singular vectors into “signal” and “noise” regions. The model reduction then interpolates the “signal” and uses the “noise” to estimate the uncertainty in the result. This methodology requires a big-data approach because the simulations we study produce snapshots with hundreds to thousands of timesteps at thousands to millions of nodal values; each simulation output is then a vector with millions to billions of entries. We use a MapReduce-based SVD routine to compute the SVD of the snapshot matrix.
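The core idea can be sketched at small scale. The following is an illustrative toy example, not the paper's actual code: the model `f(t; p) = exp(-p*t)`, the sign-change oscillation measure, the signal cutoff `r`, and all variable names are assumptions for illustration. It builds a snapshot matrix with one column per parameter value, computes its SVD, observes that the right singular vectors grow more oscillatory with index, and predicts the output at an unseen parameter by interpolating the "signal" right singular vectors.

```python
import numpy as np

# Toy parameterized model: snapshots f(t; p) = exp(-p * t).
t = np.linspace(0.0, 1.0, 400)        # timesteps (rows)
params = np.linspace(1.0, 2.0, 20)    # parameter samples (columns)
X = np.exp(-np.outer(t, params))      # snapshot matrix, one column per run

# In the paper's setting X is far too large for this dense call and a
# MapReduce-based SVD is used instead; np.linalg.svd stands in here.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def sign_changes(v):
    """Count sign changes: a simple proxy for how oscillatory a vector is."""
    v = v[np.abs(v) > 1e-12]          # drop numerically-zero entries
    return int(np.sum(np.diff(np.sign(v)) != 0))

# Oscillation of each right singular vector across the parameter axis;
# early modes are smooth ("signal"), later modes oscillate ("noise").
osc = [sign_changes(Vt[k]) for k in range(len(s))]

# Predict at an unseen parameter by interpolating the "signal" modes.
r = 3                                 # signal cutoff, chosen by hand here
p_new = 1.55
v_new = np.array([np.interp(p_new, params, Vt[k]) for k in range(r)])
x_pred = U[:, :r] @ (s[:r] * v_new)

rel_err = (np.linalg.norm(x_pred - np.exp(-p_new * t))
           / np.linalg.norm(np.exp(-p_new * t)))
```

In this toy case the first right singular vector has no sign changes while later ones must oscillate, which is the property the split exploits; the discarded "noise" modes could instead be used to form an uncertainty estimate for `x_pred`.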