METAMODELS & CALIBRATION
The use of mathematical and statistical tools to approximate, calibrate and simulate complex real-world systems is widespread across many fields. Interpolation and regression methodologies are now common in engineering, where they are also known as Response Surface Methods (RSMs). RSMs have become popular among engineers because they provide a fast surrogate model (a “metamodel”) for computer-aided engineering.
Metamodels are an effective way to speed up optimization tasks, particularly in the context of inverse problems, because they replace time-consuming simulations with simplified approximations.
Interpolation methodologies are also particularly important for model calibration. When comparing measurements from real-world experiments with a numerical model, interpolation methods are used to tune the model so that it better fits the experimental values.
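As a minimal sketch of what calibration means in practice, consider a hypothetical linear model F = k·x (say, force versus displacement) whose stiffness parameter k must be tuned to match measured data. For this simple case the least-squares estimate has a closed form, k = Σ(x·F) / Σ(x²); the measurement values below are invented for illustration.

```python
# Hypothetical calibration example: tune the stiffness k of the model
# F = k * x so that it best fits a set of measured (x, F) pairs.
# For this linear model the least-squares estimate is closed-form:
#     k = sum(x * F) / sum(x * x)

def calibrate_stiffness(displacements, forces):
    """Return the least-squares estimate of k for the model F = k * x."""
    num = sum(x * f for x, f in zip(displacements, forces))
    den = sum(x * x for x in displacements)
    return num / den

# Synthetic "experimental" measurements scattered around a true k of 2.0
x_meas = [0.5, 1.0, 1.5, 2.0]
f_meas = [1.1, 1.9, 3.1, 4.0]

k_hat = calibrate_stiffness(x_meas, f_meas)
```

Real calibration problems are rarely this simple, but the structure is the same: a parametric model, experimental data, and a misfit measure to minimize.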
The Value of Metamodeling and Calibration
In real-world applications, it is not always possible to reduce the complexity of the problem and obtain a model that can be quickly solved. When a single simulation can take hours or days, running more than a handful of analyses becomes prohibitive. This makes it difficult to optimize the solution, or even to gain a full understanding of the problem at hand.
Consequently, a more cost-effective approach is needed. A viable alternative to costly computations is for engineers to perform a reduced number of well-distributed calculations and use those results to build a metamodel that interpolates the data.
The resulting metamodel is a surrogate of the original problem and can be used to perform the optimization without running any further analyses. Insight from the metamodel allows the identification of important design variables and their effects on the response. This is necessary to understand the behavior of the model, improve it, and/or redefine the region of interest in the design space.
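The workflow described above can be sketched in a few lines: sample the expensive analysis at a few well-distributed points, fit a metamodel, and optimize the metamodel instead of the original problem. In this illustrative sketch the "expensive simulation" is stood in for by a cheap analytic function so the example actually runs, and a simple quadratic fit plays the role of the metamodel.

```python
import numpy as np

# Stand-in for an expensive analysis (hours/days in practice); here a cheap
# analytic function with a known optimum at x = 3 so the sketch is runnable.
def expensive_simulation(x):
    return (x - 3.0) ** 2 + 1.0

# Step 1: run a small number of well-distributed "simulations"
samples = np.array([0.0, 2.0, 4.0, 6.0])
responses = np.array([expensive_simulation(x) for x in samples])

# Step 2: fit a quadratic metamodel to the sampled responses
a, b, c = np.polyfit(samples, responses, 2)

# Step 3: optimize the surrogate analytically (vertex of the parabola),
# with no further calls to the expensive simulation
x_opt = -b / (2.0 * a)
```

Here the surrogate recovers the true optimum exactly because the underlying response is itself quadratic; in general the surrogate optimum is only an estimate and may be refined by adding simulations near it.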
The Numerical Aspects of Metamodels and Calibration
In the real world, engineers do not have the resources to compute an unlimited number of simulations, which is why metamodels are becoming popular in engineering simulation. Building an accurate and reliable metamodel from a reduced number of simulations is not a trivial task. For example, if the training points are not chosen carefully, the fitted model can be poor and negatively influence the final results. Inadequate approximations may lead to sub-optimal designs or inefficient searches for optimal solutions. Many different algorithms can be used to generate metamodels; neural networks and radial basis functions are probably the best known.
Important factors that should be considered when developing a meta-model are:
- mathematical and physical soundness of the final result
- computational costs of the model
- prediction errors
There are, however, other factors to consider. Because engineers strive to grasp the general trends in a phenomenon, especially when its behavior is non-linear, a metamodel should be accurate enough to capture those trends without being overly complex: an over-complex model risks fitting noise rather than the underlying behavior.
The assessment of the metamodel is a very important step in the analysis process. Engineers need to choose an appropriate validation strategy to evaluate the model’s performance. The validation process gives the engineer a way to verify the accuracy of a particular metamodel and to decide whether or not to improve its fidelity by adding further simulations.
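One common validation strategy that needs no extra simulations is leave-one-out cross-validation: each training point is held out in turn, the metamodel is refit on the remaining points, and the prediction error at the held-out point is recorded. The sketch below applies this to a simple polynomial metamodel; the data and polynomial degrees are illustrative only.

```python
import numpy as np

def loo_rmse(x, y, degree):
    """Leave-one-out RMSE of a polynomial metamodel of the given degree."""
    errors = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i            # drop the i-th point
        coeffs = np.polyfit(x[mask], y[mask], degree)
        errors.append(np.polyval(coeffs, x[i]) - y[i])
    return np.sqrt(np.mean(np.square(errors)))

# Synthetic responses with an underlying quadratic trend
x = np.linspace(0.0, 4.0, 9)
y = x ** 2 + 0.5 * x

rmse_linear = loo_rmse(x, y, degree=1)     # too simple: large error
rmse_quadratic = loo_rmse(x, y, degree=2)  # matches the trend: near zero
```

Comparing the cross-validation error across candidate metamodels gives a concrete basis for the decision described above: keep the current model, switch to a more expressive one, or add simulations where the error is largest.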