Obstacles to Progress
By Dr. Barna Szabó
Engineering Software Research and Development, Inc.
St. Louis, Missouri USA
Thomas Kuhn, a professor at MIT and a highly influential philosopher of science, was interested in how science actually progresses, as opposed to how it is generally believed to progress. He found that progress occurs in fits and starts rather than through a steady accumulation of knowledge. Typically, a period of normal science is followed by a period of stagnation, which is prolonged by the tendency of professionals to develop a dogmatic adherence to the prevailing paradigm. During the period of stagnation, evidence accumulates that the methodology being developed is incapable of handling certain classes of problems. This leads to a model crisis, followed by a paradigm shift and the start of a new phase of normal science.
While Kuhn was thinking of science as a whole, his observations are particularly fitting in the applied sciences, where changing an accepted paradigm is greatly complicated by the fact that methods based on it may have been incorporated into the workflows of industrial organizations.
The development of the finite element method (FEM) follows a similar but more complex pattern, one that comprises two main branches: the art of finite element modeling and the science of finite element analysis.
The Art of Finite Element Modeling
The art of finite element modeling evolved from the pioneering work of engineers in the aerospace sector. They were familiar with the matrix methods of structural analysis and sought to extend them to the solution of elastostatic problems, initially in two dimensions. They constructed triangular and quadrilateral elements by establishing linear relationships between nodal forces and displacements.
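In the matrix notation that later became standard (shown here for illustration, not quoted from the original papers), each element contributes a linear relationship of the form

\{f\}^{(e)} = [k]^{(e)} \{u\}^{(e)}, \qquad [k]^{(e)} = \int_{\Omega_e} [B]^{T} [D]\, [B] \, dV

where \{u\}^{(e)} collects the nodal displacements of the element, \{f\}^{(e)} the corresponding nodal forces, [B] is the strain-displacement matrix, and [D] is the matrix of elastic constants. For the linear (constant-strain) triangle, [B] is constant over the element, so the integral reduces to a product of constant matrices multiplied by the element volume.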
This work was greatly accelerated by the US space program in the 1960s. In 1965 NASA awarded a contract for the development of a “general purpose” finite element analysis program, which was later named NASTRAN. NASTRAN and the other legacy codes were designed based on the understanding of the finite element method that existed in the 1960s. Unfortunately, the software architecture of legacy codes imposed limitations that prevented these codes from keeping pace with subsequent scientific developments in finite element analysis.
Legacy finite element codes were designed to support finite element modeling, which is the intuitive construction of a numerical problem by assembling elements from the library of a legacy finite element software product. Through artful selection of the elements, the constraints, and the loads, the force-displacement relationships can be estimated with reasonable accuracy. Note that a nodal force is an abstract entity, derived from the generalized formulation, and must not be confused with a concentrated force, which is inadmissible in two- and three-dimensional elasticity. This point was not yet clearly understood by the developers of legacy codes, who relied on early papers and the first book [1] on the finite element method.
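In the generalized (variational) formulation, the nodal force associated with the basis function of node i is a functional of the distributed loads. In standard notation (illustrative, not quoted from this article):

f_i = \int_{\Omega} \mathbf{F} \cdot \mathbf{N}_i \, dV + \int_{\Gamma_T} \mathbf{T} \cdot \mathbf{N}_i \, dS

where \mathbf{F} is the body force, \mathbf{T} is the traction prescribed on the boundary segment \Gamma_T, and \mathbf{N}_i is the basis function associated with node i. A nodal force is therefore a weighted average of distributed data, not a concentrated force acting at a point.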
The Science of Finite Element Analysis
Exploration of the mathematical foundations of the finite element method began in the early 1970s, well after the architecture of legacy finite element software products took shape. The finite element method was viewed as a method by which the exact solutions of partial differential equations, cast in variational form, are approximated [2]. Of interest are (a) the rate of convergence in a norm that depends on the formulation; (b) the stability of the sequence of numerical problems corresponding to an increasing number of degrees of freedom; and (c) the estimation and control of the errors of approximation in the quantities of interest.
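In the form commonly used in a priori error analysis (see, for example, [3]), the error measured in the energy norm is bounded by

\| u_{EX} - u_{FE} \|_{E(\Omega)} \le \frac{C}{N^{\beta}}

where u_{EX} is the exact solution, u_{FE} is the finite element solution, N is the number of degrees of freedom, and C and \beta are positive constants that depend on the exact solution and on the discretization (h-, p-, or hp-extension); \beta is the asymptotic rate of convergence.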
The mathematical foundations of finite element analysis were substantially established by the mid-1980s, and finite element analysis emerged as a branch of applied mathematics.
Stagnation in Finite Element Modeling
Legacy finite element codes came to be widely used in engineering practice before the theoretical foundations of the finite element method were firmly established. This led to the emergence of a culture of finite element modeling based on a pre-scientific understanding of the finite element method. There were attempts to incorporate adaptive control of the errors of approximation; however, these attempts failed because adaptive error control is possible only when the underlying mathematical problem is well defined (i.e., an exact solution exists), and in most industrial-scale finite element models this is not the case.
The primary causes of stagnation are:
- The organizations that rely on computed information have not required solution verification, which is an essential technical requirement in numerical simulation (one common verification procedure is sketched after this list).
- The vendors of legacy finite element software tools have not kept pace with the growth of the knowledge base of the finite element method.
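To illustrate what solution verification entails, the sketch below implements one widely used a posteriori procedure discussed in [3]: extrapolating the strain energy from three solutions obtained with increasing numbers of degrees of freedom, assuming algebraic convergence, and reporting the estimated relative error in the energy norm. The function name and the input values are illustrative only; they are not taken from any particular software product.

```python
import math

def estimate_error(N, U):
    """Estimate the limit of the strain energy and the relative error in the
    energy norm from three solutions with increasing numbers of degrees of
    freedom, assuming U_limit - U(N) ~ C * N**(-2*beta).

    N : (N1, N2, N3) with N1 < N2 < N3  -- numbers of degrees of freedom
    U : (U1, U2, U3)                    -- corresponding strain energies
    Returns (U_limit, beta, relative error of the finest solution).
    """
    N1, N2, N3 = map(float, N)
    U1, U2, U3 = map(float, U)
    target = (U2 - U1) / (U3 - U2)  # observed ratio of energy increments

    def ratio(beta):
        # Ratio of increments predicted by the algebraic convergence model.
        return (N1**(-2*beta) - N2**(-2*beta)) / (N2**(-2*beta) - N3**(-2*beta))

    # ratio(beta) grows from log(N2/N1)/log(N3/N2) as beta increases; bracket
    # the root by doubling, then solve for beta by bisection.
    lo, hi = 1.0e-6, 1.0
    while ratio(hi) < target:
        hi *= 2.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if ratio(mid) < target:
            lo = mid
        else:
            hi = mid
    beta = 0.5 * (lo + hi)

    # Recover the constant C, the extrapolated limit, and the error estimate.
    C = (U3 - U2) / (N2**(-2*beta) - N3**(-2*beta))
    U_limit = U3 + C * N3**(-2*beta)
    rel_error = math.sqrt(abs(U_limit - U3) / abs(U_limit))
    return U_limit, beta, rel_error

# Example with made-up strain energies from three successive discretizations.
print(estimate_error((200, 350, 560), (1.0950, 1.0982, 1.0995)))
```

With such an estimate in hand, the analyst can report not only the quantities of interest but also the estimated accuracy of the computed values, which is the essence of solution verification.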
Outlook
The knowledge base of finite element analysis (FEA) is currently much larger than what is available to practicing engineers through legacy finite element software tools. Linking numerical simulation with explainable artificial intelligence (XAI) tools will impose requirements for reliability, traceability, and auditability. To meet those requirements, software vendors will have to abandon old paradigms and implement state-of-the-art algorithms for solution verification and hierarchical modeling [3].
References
[1] Zienkiewicz, O.C. and Cheung, Y.K. The Finite Element Method in Continuum and Structural Mechanics. McGraw-Hill, 1967.
[2] Babuška, I. and Aziz, A.K. Lectures on Mathematical Foundations of the Finite Element Method. Report ORO-3443-42; BN-748. University of Maryland, College Park, Institute for Fluid Dynamics and Applied Mathematics, 1972.
[3] Szabó, B. and Babuška, I. Finite Element Analysis: Method, Verification and Validation, 2nd ed. John Wiley & Sons, Inc., 2021.