Multiscale modeling

In engineering, mathematics, physics, meteorology and computer science, multiscale modeling or multiscale mathematics is the field of solving problems which have important features at multiple scales of time and/or space. Important problems include scale linking (Steinhauser 2008,[1] Baeurle 2009,[2] de Pablo 2011,[3] Knizhnik 2002,[4] Adamson 2007[5]).

History

Horstemeyer 2009[6] and 2012[7] presented a historical review of the different disciplines (solid mechanics,[8] numerical methods,[9] mathematics, physics, and materials science) related to multiscale modeling of solid materials.

The recent surge of multiscale modeling from the smallest scale (atoms) to full system level (e.g., autos) related to solid mechanics, which has now grown into an international multidisciplinary activity, was born from an unlikely source. As the US Department of Energy (DOE) national labs began to reduce underground nuclear tests in the mid-1980s, with the last one in 1992, the idea of simulation-based design and analysis was born. Multiscale modeling was key to garnering more precise and accurate predictive tools. In essence, the number of large-scale systems-level tests that were previously used to validate a design was reduced to nothing, warranting an increased reliance on simulations of the complex systems for design verification and validation purposes.

Essentially, it was proposed that simulation results would fill the space previously occupied by systems-level “tests”. After the Comprehensive Test Ban Treaty of 1996, in which many countries pledged to discontinue all systems-level nuclear testing, programs like the Accelerated Strategic Computing Initiative (ASCI) were born within the DOE and managed by the national labs within the US. Within ASCI, the basic recognized premise was to provide more accurate and precise simulation-based design and analysis tools. Because of the requirements for greater complexity in the simulations, parallel computing and multiscale modeling became the major challenges that needed to be addressed. With this perspective, the idea of experiments shifted from large-scale complex tests to multiscale experiments that provided material models with validation at different length scales. If the modeling and simulations were physically based and less empirical, then a predictive capability could be realized for other conditions. As such, various multiscale modeling methodologies were independently being created at the DOE national labs: Los Alamos National Lab (LANL), Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), and Oak Ridge National Laboratory (ORNL). In addition, personnel from these national labs encouraged, funded, and managed academic research related to multiscale modeling. Hence, the creation of different methodologies and computational algorithms for parallel environments gave rise to different emphases regarding multiscale modeling and the associated multiscale experiments.

The advent of parallel computing also contributed to the development of multiscale modeling. Since more degrees of freedom could be resolved in parallel computing environments, more accurate and precise algorithmic formulations could be employed. This capability also drove political leaders to encourage the simulation-based design concepts.

At LANL, LLNL, and ORNL, the multiscale modeling efforts were driven from the materials science and physics communities with a bottom-up approach. Each had different programs that tried to unify computational efforts, materials science information, and applied mechanics algorithms with different levels of success. Multiple scientific articles were written, and the multiscale activities took on lives of their own. At SNL, the multiscale modeling effort was an engineering top-down approach starting from a continuum mechanics perspective, which was already rich with a computational paradigm. SNL tried to merge the materials science community into the continuum mechanics community to address the lower-length-scale issues that could help solve engineering problems in practice.

Once this management infrastructure and associated funding were in place at the various DOE institutions, different academic research projects started, initiating various satellite networks of multiscale modeling research. Technology transfer also occurred into other labs within the Department of Defense and industrial research communities.

The growth of multiscale modeling in the industrial sector was primarily due to financial motivations. From the DOE national labs' perspective, the shift away from the large-scale systems experiments mentality occurred because of the 1996 Comprehensive Test Ban Treaty. Once industry realized that the notions of multiscale modeling and simulation-based design were invariant to the type of product, and that effective multiscale simulations could in fact lead to design optimization, a paradigm shift began to occur, in various measures within different industries, as cost savings and accuracy in product warranty estimates were rationalized.

Mark Horstemeyer, Integrated Computational Materials Engineering (ICME) for Metals, Chapter 1, Section 1.3.

The aforementioned DOE multiscale modeling efforts were hierarchical in nature. The first concurrent multiscale model appeared when Michael Ortiz (Caltech) and his students took the molecular dynamics code Dynamo (developed by Mike Baskes at Sandia National Labs) and embedded it into a finite element code for the first time.[10] Martin Karplus, Michael Levitt, and Arieh Warshel were awarded the 2013 Nobel Prize in Chemistry for the development of a multiscale modeling method that uses both classical and quantum mechanical theory to model large, complex chemical systems and reactions.

Multiscale Modeling Areas of Research

In physics and chemistry, multiscale modeling aims to calculate material properties or system behavior at one level using information or models from other levels. At each level, particular approaches are used to describe the system. The following levels are usually distinguished: the level of quantum mechanical models (information about electrons is included), the level of molecular dynamics models (information about individual atoms is included), the mesoscale or nano level (information about groups of atoms and molecules is included), the level of continuum models, and the level of device models. Each level addresses a phenomenon over a specific window of length and time. Multiscale modeling is particularly important in integrated computational materials engineering, since it allows the prediction of material properties or system behavior based on knowledge of the process-structure-property relationships.
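
As a concrete illustration of passing information up from one level to another, the following is a minimal Python sketch of hierarchical (sequential) multiscale coupling: a toy one-dimensional "atomistic" model with a Lennard-Jones-type pair potential is used to estimate an effective elastic modulus, which is then handed to a small continuum finite-element bar model. The potential parameters, bar setup, and function names are illustrative assumptions, not taken from any code or program mentioned in this article.

```python
import numpy as np

# Toy "lower scale" model: estimate an effective 1-D elastic modulus from the
# curvature of a Lennard-Jones-type pair potential at its equilibrium spacing
# (nearest-neighbour bonds of an atomic chain). All parameters are illustrative.
def effective_modulus_from_atomistics(epsilon=1.0, sigma=1.0):
    r0 = 2.0 ** (1.0 / 6.0) * sigma                  # equilibrium bond length
    def bond_energy(r):
        return 4.0 * epsilon * ((sigma / r) ** 12 - (sigma / r) ** 6)
    h = 1e-4 * r0
    # second derivative of the bond energy ~ spring stiffness of one bond
    k = (bond_energy(r0 + h) - 2.0 * bond_energy(r0) + bond_energy(r0 - h)) / h ** 2
    return k * r0                                    # 1-D modulus = stiffness x spacing

# "Upper scale" model: linear-elastic bar under an end load, discretised with
# two-node finite elements, using the modulus passed up from the lower scale.
def continuum_bar_displacement(E, length=1.0, area=1.0, load=1e-3, n_elem=20):
    n_nodes = n_elem + 1
    h = length / n_elem
    K = np.zeros((n_nodes, n_nodes))
    ke = (E * area / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    for e in range(n_elem):                          # assemble element stiffness matrices
        K[e:e + 2, e:e + 2] += ke
    f = np.zeros(n_nodes)
    f[-1] = load                                     # point load at the free end
    u = np.zeros(n_nodes)
    u[1:] = np.linalg.solve(K[1:, 1:], f[1:])        # clamp the first node and solve
    return u

E_eff = effective_modulus_from_atomistics()
u = continuum_bar_displacement(E_eff)
print(f"effective modulus passed up from the atomistic model: {E_eff:.3f}")
print(f"end displacement of the continuum bar:                {u[-1]:.3e}")
```

The key design point is that only a single effective parameter crosses the scale boundary; the continuum model never sees the atomistic degrees of freedom.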

In operations research, multiscale modeling addresses challenges for decision makers that arise from multiscale phenomena across organizational, temporal, and spatial scales. This theory fuses decision theory and multiscale mathematics and is referred to as multiscale decision-making. Multiscale decision-making draws upon the analogies between physical systems and complex man-made systems.
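
The following is a minimal illustrative sketch, not drawn from the multiscale decision-making literature itself, of how a decision at a slow organizational scale can constrain decisions at a fast operational scale: a yearly capacity choice is evaluated by simulating the daily operations it permits. All cost figures and distributions are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
demands = rng.poisson(lam=100.0, size=365)            # one year of hypothetical daily demand

# Operational (fast, daily) scale: given the installed capacity, meet demand as
# well as possible; unmet demand and idle capacity both incur costs.
def daily_operating_cost(capacity, demand, shortage_cost=10.0, idle_cost=1.0):
    production = min(capacity, demand)
    return shortage_cost * (demand - production) + idle_cost * (capacity - production)

# Strategic (slow, yearly) scale: choose a capacity level by simulating the
# daily operations it permits and adding a per-unit-per-day capacity cost.
def yearly_cost(capacity, capacity_cost=0.5):
    operating = sum(daily_operating_cost(capacity, d) for d in demands)
    return capacity_cost * capacity * len(demands) + operating

candidates = range(80, 141, 5)
best = min(candidates, key=yearly_cost)
print(f"capacity chosen at the strategic scale: {best}")
```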

In meteorology, multiscale modeling is the modeling of the interaction between weather systems of different spatial and temporal scales that together produce the weather we experience. The most challenging task is to model the way these systems interact, because a model cannot resolve features smaller than its grid size. An atmospheric model with a grid fine enough (on the order of 500 m) to resolve every cloud structure over the whole globe is computationally very expensive, while a computationally feasible global climate model (GCM), with a grid size of roughly 100 km, cannot resolve the smaller cloud systems. A balance must therefore be struck so that the model remains computationally feasible without losing too much information, by representing the unresolved processes through rational approximations, a process called parameterization.
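
The idea of parameterization can be illustrated with a small, self-contained Python sketch: the coarse model only sees the grid-box mean humidity, so the sub-grid cloud cover that a fine grid would resolve directly must instead be estimated from that mean. The humidity threshold and quadratic ramp used below are purely illustrative choices, not an operational cloud scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Truth": relative humidity on a fine grid (think ~500 m cells), where a cell
# is simply declared cloudy once it reaches saturation.
n_coarse = 4                  # number of coarse (GCM-like) grid boxes
n_sub = 200                   # fine cells per coarse box, unresolved by the coarse model
fine_rh = np.clip(rng.normal(loc=0.85, scale=0.1, size=(n_coarse, n_sub)), 0.0, 1.1)
true_cloud_fraction = (fine_rh >= 1.0).mean(axis=1)

# The coarse model only sees the grid-box mean humidity ...
mean_rh = fine_rh.mean(axis=1)

# ... so the sub-grid cloud cover must be parameterized as a function of the
# resolved mean state. The quadratic ramp above a critical humidity is only an
# illustrative functional form.
def parameterized_cloud_fraction(rh_mean, rh_crit=0.8):
    x = np.clip((rh_mean - rh_crit) / (1.0 - rh_crit), 0.0, 1.0)
    return x ** 2

for i in range(n_coarse):
    print(f"box {i}: mean RH {mean_rh[i]:.2f}  "
          f"resolved cloud fraction {true_cloud_fraction[i]:.2f}  "
          f"parameterized {parameterized_cloud_fraction(mean_rh[i]):.2f}")
```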

Besides the many specific applications, one general area of research is the development of accurate and efficient mathematical and algorithmic methods for solving multiscale modeling problems.
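
As one small, self-contained example of such a method, the Python sketch below applies classical one-dimensional numerical homogenization: a diffusion problem with a rapidly oscillating coefficient is solved once on a fine grid that resolves the oscillations, and once on a coarse grid using the effective coefficient, which in one dimension is the harmonic mean of the coefficient over a period. The coefficient, grid sizes, and forcing are illustrative choices.

```python
import numpy as np

# Model problem: -(a(x) u')' = 1 on (0, 1) with u(0) = u(1) = 0 and a rapidly
# oscillating coefficient a. In one dimension the homogenized (effective)
# coefficient is the harmonic mean of a over one period.
def coefficient(x, eps=0.05):
    return 2.0 + np.sin(2.0 * np.pi * x / eps)       # oscillates between 1 and 3

def solve_fd(a_vals, n):
    # finite-difference discretisation of -(a u')' = 1 with the coefficient
    # evaluated at the n cell midpoints; returns the solution at the n+1 nodes
    h = 1.0 / n
    main = a_vals[:-1] + a_vals[1:]
    A = (np.diag(main) - np.diag(a_vals[1:-1], 1) - np.diag(a_vals[1:-1], -1)) / h ** 2
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(A, np.ones(n - 1))
    return u

# Fine solve: resolves every oscillation directly (prohibitive in higher dimensions).
n_fine = 1000
x_mid = (np.arange(n_fine) + 0.5) / n_fine
u_fine = solve_fd(coefficient(x_mid), n_fine)

# Coarse solve: replace a by its harmonic mean and use a much coarser grid.
a_eff = 1.0 / np.mean(1.0 / coefficient(x_mid))
n_coarse = 20
u_coarse = solve_fd(np.full(n_coarse, a_eff), n_coarse)

print(f"effective coefficient (harmonic mean): {a_eff:.4f}")
print(f"u(0.5), fine solve:   {u_fine[n_fine // 2]:.5f}")
print(f"u(0.5), coarse solve: {u_coarse[n_coarse // 2]:.5f}")
```

The coarse solve reproduces the large-scale behaviour of the resolved solution at a small fraction of the cost, which is the essential goal shared by most multiscale numerical methods.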

References

  1. Steinhauser 2008.
  2. Baeurle 2009.
  3. de Pablo 2011.
  4. Knizhnik 2002.
  5. Adamson 2007.
  6. Horstemeyer 2009.
  7. Horstemeyer 2012.
  8. (Citation details not recoverable.)
  9. (Citation details not recoverable.)
  10. (Citation details not recoverable.)
