The rapid growth of high-performance supercomputing technology and advances in numerical techniques over the last two decades have provided an unprecedented opportunity to explore complex physical phenomena using modeling and simulation. Today, multi-physics simulations involving fluid flow, structural dynamics, chemical kinetics, and atomic and nuclear sciences are ubiquitous. The success of computer simulations also means that they will be increasingly relied upon as important tools for high-consequence predictions and decision making. For example, the Department of Energy Advanced Simulation and Computing (ASC) Program was established in 1995 to accelerate the development of computer simulation capabilities for analyzing and predicting the performance, safety, and reliability of nuclear weapons and for certifying their functionality. As such, "predictive science", the application of verified and validated computational simulations to predict the behavior of complex systems, has emerged as a new movement.
The flourishing of simulation-based scientific discovery has also resulted in the emergence of the verification and validation (V&V) and uncertainty quantification (UQ) disciplines. The goal of these emerging disciplines is to enable scientists to make precise statements about the degree of confidence they have in their simulation-based predictions. Here we focus on the UQ discipline which is essential for validating and verifying computer models.
What is Uncertainty Quantification?
Uncertainty quantification is defined as the
- identification (Where are the uncertainties?),
- characterization (What form are they in?),
- propagation (How do they evolve during the simulation?), and
- analysis (What are their impacts?)
of ALL uncertainties in simulation models.
In mapping from reality to computer models, many sources of uncertainties/discrepancies arise:
- Uncertainties due to variabilities in the actual design parameter values and environmental factors
- Uncertainties in initial and boundary conditions
- Uncertainties in the physics sub-models (due to imprecise and simplified physics, e.g. data-driven phenomenological models)
- Uncertainties in couplings between sub-models
- Uncertainties due to missing physics
- Uncertainties arising from model implementation, discretization errors, roundoff errors, and algorithmic errors
- Uncertainties in data due to noise and measurement errors
- Uncertainties due to lack of sufficient data
Uncertainties are often classified as:
- aleatoric (inherent variability, described by known probability distributions)
- epistemic (probability distributions unknown; use intervals or belief functions)
- mixed aleatoric/epistemic (known distribution form but unknown means/standard deviations)
- model form (each candidate model may carry its own aleatoric/epistemic uncertainties)
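The practical difference between these classes can be sketched with a toy example: an aleatoric input with a known distribution is sampled, while an epistemic parameter known only to lie in an interval is scanned over its range. The model, parameter names, and numbers below are purely illustrative.

```python
import random
import statistics

random.seed(0)

def model(a, x):
    # Toy response; the form is purely illustrative.
    return a * x + x ** 2

# Aleatoric input: known distribution, so sample it.
x_samples = [random.gauss(1.0, 0.1) for _ in range(2000)]

# Epistemic parameter: only an interval [0.8, 1.2] is known,
# so scan its endpoints (and midpoint) instead of sampling.
results = {}
for a in (0.8, 1.0, 1.2):
    ys = [model(a, x) for x in x_samples]
    results[a] = (statistics.mean(ys), statistics.stdev(ys))

for a, (m, s) in results.items():
    print(f"a = {a}: output mean = {m:.3f}, std = {s:.3f}")
```

The result is not a single output distribution but a family of distributions indexed by the epistemic parameter, which is why epistemic uncertainties are usually reported as intervals or bounds.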
Characterization of uncertainties is often the most time-consuming part of a UQ study, especially for data-driven phenomenological physics models.
Propagation of uncertainties can be performed forward or backward, and involves
- analyzing the impact parameter uncertainties have on model outputs,
- finding the major sources of uncertainties (sensitivity analysis),
- deriving parameter posterior distributions based on data (calibration/data fusion), and
- exploring "interesting" regions in the parameter space (model exploration).
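Backward propagation (calibration) can be illustrated with a minimal Bayesian sketch: given noisy observations of a hypothetical linear model y = theta * x, a posterior over the parameter theta is computed on a grid under a uniform prior. All data values and the noise level are invented for illustration.

```python
import math

# Hypothetical observed (x, y) pairs for a toy linear model y = theta * x.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
sigma = 0.2  # assumed measurement-noise standard deviation

def log_likelihood(theta):
    # Gaussian likelihood of the data given theta.
    return sum(-0.5 * ((y - theta * x) / sigma) ** 2 for x, y in data)

# Grid-based posterior with a uniform prior on [1.5, 2.5].
grid = [1.5 + 0.001 * i for i in range(1001)]
weights = [math.exp(log_likelihood(t)) for t in grid]
total = sum(weights)
posterior = [w / total for w in weights]

# Posterior mean of theta given the data.
theta_mean = sum(t * p for t, p in zip(grid, posterior))
print(f"posterior mean of theta = {theta_mean:.3f}")
```

Realistic calibration problems replace the grid with Markov chain Monte Carlo, since grids become infeasible beyond a few parameters.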
Approaches for propagating uncertainties fall into three categories:
- intrusive (e.g. polynomial chaos),
- non-intrusive (sampling-based), and
- hybrid (a mixture of both intrusive and non-intrusive methods inside a simulation).
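A non-intrusive, sampling-based propagation treats the simulator as a black box: draw inputs from their assumed distributions, run the model at each sample point, and compute statistics of the outputs. The sketch below uses a cheap stand-in function in place of a real simulator; the distributions and parameter names are assumptions.

```python
import random
import statistics

random.seed(42)

def simulator(k, c):
    # Stand-in for an expensive black-box simulation;
    # the quantity of interest here is made up for illustration.
    return k ** 2 + 0.5 * c

# Uncertain inputs with assumed distributions.
n = 5000
outputs = []
for _ in range(n):
    k = random.gauss(2.0, 0.2)     # normally distributed input
    c = random.uniform(0.5, 1.5)   # uniformly distributed input
    outputs.append(simulator(k, c))

mean = statistics.mean(outputs)
std = statistics.stdev(outputs)
print(f"output mean = {mean:.3f}, std = {std:.3f}")
```

Because no simulator internals are touched, this approach works with any code that can be run repeatedly, which is why non-intrusive methods dominate in practice for legacy multi-physics codes.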
Selecting the methods for propagating uncertainties is a critical part of planning. Care must be taken to ensure that proper methods are selected for a study, so it is useful to first identify the UQ characteristics of the given model. For example, for a complex multi-physics model, we may extract the following characteristics:
- Nonlinear relationship between the uncertain and output variables
- The uncertain parameter space is high-dimensional
- There may be some model form uncertainties
- High computational cost per simulation
- Experimental data are available at module, subsystem and full system level
Knowing these characteristics helps determine the best UQ strategies. For example, if there are many uncertain parameters, reducing the parameter dimension may be a useful step before more detailed analysis. If the model input-output relationships are nonlinear, perturbation-based or SRC (standardized regression coefficient)-based sensitivity analysis may not be sufficient. For computationally expensive models, response surfaces may be needed for quantitative analysis. If the model is highly nonlinear, classical regression-based response surface methods may not be adequate, and non-parametric methods may be needed.
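An SRC-based sensitivity analysis can be sketched as follows: sample the inputs independently, regress the output on each input, and standardize the slope by the input and output spreads. The linear model and its coefficients below are invented; for such a model the SRCs recover the relative importance of each input, but for strongly nonlinear models they can be misleading.

```python
import random
import statistics

random.seed(1)

def model(x1, x2):
    # Illustrative linear model in which x1 matters three times as much as x2.
    return 3.0 * x1 + 1.0 * x2

n = 4000
x1s = [random.uniform(0.0, 1.0) for _ in range(n)]
x2s = [random.uniform(0.0, 1.0) for _ in range(n)]
ys = [model(a, b) for a, b in zip(x1s, x2s)]

def src(xs, ys):
    # With independently sampled inputs, the regression slope reduces to
    # cov(x, y) / var(x); the SRC standardizes it by the spreads of x and y.
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    slope = cov / statistics.variance(xs)
    return slope * statistics.stdev(xs) / statistics.stdev(ys)

print(f"SRC(x1) = {src(x1s, ys):.2f}, SRC(x2) = {src(x2s, ys):.2f}")
```

The squared SRCs approximately partition the output variance among the inputs when the model is near-linear, which makes them a cheap first-pass screening tool.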
The information gathered from simulations can then be analyzed, for example, for
- assessing "anomalous" regions in the parameter space (risk analysis),
- establishing the integrity of a simulation model (validation), and
- providing information on which additional physical experiments are needed to best improve the understanding of the system (experimental guidance).
How to plan a UQ Study
The planning of a UQ study involves three components:
1. Develop a UQ Process
- define objectives of the study
- define clearly and in detail the model to be studied
- determine the computational budget
- state in detail the assumptions
2. Compile relevant UQ methods and tools such as those for
- data fusion
- dimension reduction (to reduce number of parameters)
- response surface analysis
- uncertainty assessment/sensitivity analysis
- model validation
3. Develop/secure hardware/software infrastructure to perform ensemble runs
- job management: scheduling, monitoring
- data management, visualization
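A minimal sketch of such an ensemble-run infrastructure, assuming the simulator is a command-line executable that reads a parameter and prints a result (here faked with a Python one-liner so the example is self-contained):

```python
import subprocess
import sys
from concurrent.futures import ThreadPoolExecutor

# Stand-in "simulator": a Python one-liner that squares its argument.
# In a real study this would be the path to a simulation executable.
SIMULATOR = [sys.executable, "-c",
             "import sys; print(float(sys.argv[1]) ** 2)"]

def run_one(x):
    # Launch one simulation, capture stdout, and parse the result.
    out = subprocess.run(SIMULATOR + [str(x)],
                         capture_output=True, text=True, check=True)
    return float(out.stdout)

# A tiny "ensemble": run the sample points concurrently and collect outputs.
sample = [0.5, 1.0, 1.5, 2.0]
with ThreadPoolExecutor(max_workers=4) as pool:
    outputs = list(pool.map(run_one, sample))

print(outputs)  # one output per sample point, in order
```

Production infrastructures add job scheduling on clusters, restart handling for failed runs, and archival of inputs and outputs, but the basic launch-and-collect loop is the same.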
Introduction to a UQ Software Library: PSUADE
PSUADE is an acronym for Problem Solving environment for Uncertainty Analysis and Design Exploration. It is a software toolkit that facilitates the UQ tasks described above. PSUADE has a rich set of tools for performing uncertainty analysis, global sensitivity analysis, design optimization, model calibration, and more. In particular, PSUADE supports a global sensitivity methodology for models with a large number of parameters and complex constraints.
PSUADE has three major components:
I. Sampling Methods
Monte Carlo, quasi-Monte Carlo, Latin hypercube and variants, orthogonal arrays, factorial and fractional factorial, Morris method, Fourier Amplitude Sensitivity Test (FAST), Box-Behnken, Plackett-Burman, central composite, and methods based on spatial decomposition. In addition, a few uniform and adaptive sampling refinements are supported. Basic probability distributions such as uniform, normal, lognormal, and triangular are available.
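To give a flavor of one of these designs, a basic Latin hypercube sample on the unit hypercube can be generated in a few lines. This sketch is not PSUADE code; PSUADE's implementations are more general.

```python
import random

random.seed(7)

def latin_hypercube(n, dim):
    # One point per equal-probability stratum in each dimension,
    # with the strata randomly permuted across dimensions.
    columns = []
    for _ in range(dim):
        perm = list(range(n))
        random.shuffle(perm)
        # Jitter each point uniformly within its stratum.
        columns.append([(p + random.random()) / n for p in perm])
    return [tuple(col[i] for col in columns) for i in range(n)]

pts = latin_hypercube(10, 2)
for p in pts:
    print(f"({p[0]:.3f}, {p[1]:.3f})")
```

Unlike plain Monte Carlo, every one of the ten strata in each dimension is guaranteed to contain exactly one point, which typically reduces the variance of estimated output statistics for the same sample size.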
II. A Simulator Execution Environment
Once a sample design has been created, the sample points are propagated through the simulator to generate the corresponding outputs of interest. Since PSUADE supports an integrated design and analysis environment, it also provides an automated system for launching the simulators and collecting results. There are several parallel simulation modes supported by PSUADE. Details can be found in the PSUADE user manual.
III. Analysis Methods
- basic statistical analysis (moments and correlations)
- many methods for main-, second-order, and total-order effect analyses
- several dimension reduction analysis methods
- Markov chain Monte Carlo for parameter estimation
- principal component analysis and some hypothesis testing
- response surface analysis (many different types of response surfaces)
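As a flavor of response surface analysis, the sketch below fits a quadratic polynomial surrogate to a handful of runs of an "expensive" function (here a cheap stand-in) using the normal equations. The function and sample points are invented for illustration, and real response surface tools support many more basis types.

```python
def solve3(a, b):
    # Gaussian elimination with partial pivoting for a 3x3 linear system.
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, 3):
            f = m[r][col] / m[col][col]
            for c in range(col, 4):
                m[r][c] -= f * m[col][c]
    x = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        x[r] = (m[r][3] - sum(m[r][c] * x[c] for c in range(r + 1, 3))) / m[r][r]
    return x

def fit_quadratic(xs, ys):
    # Least-squares fit of y = c0 + c1*x + c2*x^2 via the normal equations.
    feats = [(1.0, x, x * x) for x in xs]
    ata = [[sum(f[i] * f[j] for f in feats) for j in range(3)] for i in range(3)]
    aty = [sum(f[i] * y for f, y in zip(feats, ys)) for i in range(3)]
    return solve3(ata, aty)

def expensive(x):
    # Stand-in for a costly simulation; in practice each call is expensive.
    return 1.0 + 2.0 * x + 0.5 * x * x

xs = [0.0, 0.25, 0.5, 0.75, 1.0]
c0, c1, c2 = fit_quadratic(xs, [expensive(x) for x in xs])

def surrogate(x):
    # Cheap response surface evaluated in place of the expensive model.
    return c0 + c1 * x + c2 * x * x

print(f"surrogate(0.6) = {surrogate(0.6):.4f}, exact = {expensive(0.6):.4f}")
```

Once fitted from a modest number of simulator runs, the surrogate can be evaluated millions of times for sampling-based uncertainty or sensitivity analysis at negligible cost.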
PSUADE is free software released under the LGPL license. To download PSUADE, click on the button above.