Finite Element Modeling Archives - ESRD
https://www.esrd.com/tag/finite-element-modeling/

Why Finite Element Modeling is Not Numerical Simulation?
https://www.esrd.com/why-finite-element-modeling-is-not-numerical-simulation/
Thu, 02 Nov 2023

By Dr. Barna Szabó
Engineering Software Research and Development, Inc.
St. Louis, Missouri USA


The term “simulation” is often used interchangeably with “finite element modeling” in the engineering literature and marketing materials.  It is important to understand the difference between the two.

The Origins of Finite Element Modeling

Finite element modeling is a practice rooted in the 1960s and 70s.  The development of the finite element method began in 1956 and was greatly accelerated during the US space program in the 1960s. The pioneers were engineers who were familiar with the matrix methods of structural analysis and sought to extend those methods to solve the partial differential equations that model the behavior of elastic bodies of arbitrary geometry subjected to various loads.   The early papers and the first book on the finite element method [1], written when our understanding of the subject was just a small fraction of what it is today, greatly influenced the idea of finite element modeling and its subsequent implementations.

Guided by their understanding of models for structural trusses and frames, the early code developers formulated finite elements for two- and three-dimensional elasticity problems, plate and shell problems, etc. They focused on getting the stiffness relationships right, subject to the limitations imposed by the software architecture on the number of nodes per element and the number of degrees of freedom per node.  They observed that elements of low polynomial degree were “too stiff”.  The elements were then “softened” by using fewer integration points than necessary.  This caused “hourglassing” (zero energy modes) to occur which was fixed by “hourglass control”.  For example, the formulation of the element designated as C3D8R and described as “8-node linear brick, reduced integration with hourglass control” in the Abaqus Analysis User’s Guide [2] was based on such considerations.
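The connection between reduced integration and zero-energy modes can be demonstrated directly. The sketch below is a minimal illustration (not code from any product mentioned here): it assembles the stiffness matrix of a single 4-node plane-stress quadrilateral with full (2x2) and reduced (1-point) Gauss integration, then counts the zero eigenvalues. Full integration yields exactly the three rigid-body modes; the 1-point rule yields two additional zero-energy (hourglass) modes.

```python
import numpy as np

def B_matrix(xi, eta):
    """Strain-displacement matrix for a bilinear quad whose nodes coincide
    with the natural coordinates (+-1, +-1), so the Jacobian is the identity."""
    nodes = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
    B = np.zeros((3, 8))
    for i, (xi_i, eta_i) in enumerate(nodes):
        dN_dx = 0.25 * xi_i * (1 + eta_i * eta)
        dN_dy = 0.25 * eta_i * (1 + xi_i * xi)
        B[0, 2 * i] = dN_dx
        B[1, 2 * i + 1] = dN_dy
        B[2, 2 * i] = dN_dy
        B[2, 2 * i + 1] = dN_dx
    return B

E, nu = 1.0, 0.3  # arbitrary material constants
D = E / (1 - nu**2) * np.array([[1, nu, 0],
                                [nu, 1, 0],
                                [0, 0, (1 - nu) / 2]])  # plane stress

def stiffness(points, weights):
    """K = sum over quadrature points of w * B^T D B (det J = 1 here)."""
    K = np.zeros((8, 8))
    for (xi, eta), w in zip(points, weights):
        B = B_matrix(xi, eta)
        K += w * B.T @ D @ B
    return K

g = 1 / np.sqrt(3)
full = stiffness([(-g, -g), (g, -g), (g, g), (-g, g)], [1, 1, 1, 1])  # 2x2 Gauss
reduced = stiffness([(0.0, 0.0)], [4.0])                              # 1-point rule

def zero_modes(K, tol=1e-10):
    """Count eigenvalues that are numerically zero."""
    return int(np.sum(np.linalg.eigvalsh(K) < tol))

print(zero_modes(full))     # 3: rigid-body modes only
print(zero_modes(reduced))  # 5: 3 rigid-body + 2 hourglass modes
```

The 1-point B matrix has rank 3, so the reduced stiffness matrix cannot resist the two hourglass deformation patterns; this is the deficiency that "hourglass control" was invented to patch.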

Through an artful combination of elements and the finite element mesh, the code developers were able to show reasonable correspondence between the solutions of some simple problems and the finite element solutions.  It is a logical fallacy, called the fallacy of composition, to assume that elements that performed well in particular situations will also perform well in all situations.

The Science of Finite Element Analysis

Investigation of the mathematical foundations of finite element analysis (FEA) began in the early 1970s.  Mathematicians understand FEA as a method for obtaining an approximation to the exact solution of a well-defined mathematical problem, such as a problem of elasticity.  Specifically, the finite element solution uFE has to converge to the exact solution uEX in a norm (which depends on the formulation) as the number of degrees of freedom n is increased:

‖uEX − uFE‖ → 0  as  n → ∞.

Under conditions that are usually satisfied in practice, it is known that uEX exists and is unique.
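This convergence requirement can be observed on even the simplest problem. The following sketch (a minimal illustration under stated assumptions, not production code) solves the model problem −u″ = π²·sin(πx) on (0,1) with homogeneous boundary conditions, whose exact solution is u = sin(πx), using linear elements on uniform meshes. The error in the energy norm decreases at the rate O(h), i.e., O(1/n):

```python
import numpy as np

GAUSS2 = [(-1 / np.sqrt(3), 1.0), (1 / np.sqrt(3), 1.0)]  # 2-point rule

def fe_solve(n):
    """Linear FE solution of -u'' = pi^2 sin(pi x), u(0)=u(1)=0,
    on a uniform mesh with n elements. Exact solution: u = sin(pi x)."""
    h = 1.0 / n
    x = np.linspace(0, 1, n + 1)
    K = np.zeros((n + 1, n + 1))
    F = np.zeros(n + 1)
    for e in range(n):
        K[e:e + 2, e:e + 2] += np.array([[1, -1], [-1, 1]]) / h
        for g, w in GAUSS2:  # consistent load vector
            xg = x[e] + h * (g + 1) / 2
            f = np.pi**2 * np.sin(np.pi * xg)
            F[e] += w * (h / 2) * f * (1 - g) / 2
            F[e + 1] += w * (h / 2) * f * (1 + g) / 2
    u = np.zeros(n + 1)  # homogeneous Dirichlet conditions
    u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])
    return x, u

def energy_error(n):
    """Energy-norm error ||uEX - uFE|| computed element by element."""
    x, u = fe_solve(n)
    h = 1.0 / n
    err2 = 0.0
    for e in range(n):
        du_fe = (u[e + 1] - u[e]) / h  # FE derivative: constant per element
        for g, w in GAUSS2:
            xg = x[e] + h * (g + 1) / 2
            du_ex = np.pi * np.cos(np.pi * xg)
            err2 += w * (h / 2) * (du_ex - du_fe)**2
    return np.sqrt(err2)

errors = [energy_error(n) for n in (4, 8, 16, 32)]
rates = [np.log2(errors[i] / errors[i + 1]) for i in range(3)]
print(errors)  # monotonically decreasing
print(rates)   # approach 1.0: O(h) convergence in the energy norm
```

Halving h halves the error, confirming the expected first-order rate for linear elements in the energy norm. The variational crimes discussed below destroy precisely this property: the computed sequence no longer converges to uEX.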

The first mathematical book on finite element analysis was published in 1973 [3].  Looking at the engineering papers and contemporary implementations, the authors identified four types of error, called “variational crimes”. These are (1) non-conforming elements, (2) numerical integration, (3) approximation of the domain and boundary conditions, and (4) mixed methods. In fact, many other kinds of variational crimes commonly occur in finite element modeling, such as using point forces, point constraints, and reduced integration.

By the mid-1980s the mathematical foundations of FEA were substantially established.   It was known how to design finite element meshes and assign polynomial degrees so as to achieve optimal or nearly optimal rates of convergence, how to extract the quantities of interest from the finite element solution, and how to estimate their errors.  Finite element analysis became a branch of applied mathematics.

By that time the software architectures of the large finite element codes used in current engineering practice were firmly established. Unfortunately, they were not flexible enough to accommodate the new technical requirements that arose from scientific understanding of the finite element method. Thus, the pre-scientific origins of finite element analysis became petrified in today’s legacy finite element codes.

Figure 1 shows an example that would be extremely difficult, if not impossible, to solve using legacy finite element analysis tools:

Figure 1: Lug-clevis-pin assembly. The lug is made of 16 fiber-matrix composite plies and 5 titanium plies. The model accounts for mechanical contact as well as the nonlinear deformation of the titanium plies. Solution verification was performed.

Notes on Tuning

On a sufficiently small domain of calibration any model, even a finite element model laden with variational crimes, can produce results that appear reasonable and can be tuned to match experimental observations. We use the term tuning to refer to the artful practice of balancing two large errors in such a way that they nearly cancel each other out. One error is conceptual:  Owing to variational crimes, the numerical solution does not converge to a limit value in the norm of the formulation as the number of degrees of freedom is increased. The other error is numerical: The discretization error is large enough to mask the conceptual error [4].

Tuning can be effective in structural problems, such as automobile crash dynamics and load models of airframes, where the force-displacement relationships are of interest.  Tuning is not effective, however, when the quantities of interest are stresses or strains at stress concentrations.  Therefore, finite element modeling is not well suited for strength calculations.

Solution Verification is Mandatory

Solution verification is an essential technical requirement for democratization, model development, and applications of mathematical models.  Legacy FEA software products were not designed to meet this requirement. 

There is a general consensus that numerical simulation will have to be integrated with explainable artificial intelligence (XAI) tools.  This can be successful only if mathematical models are free from variational crimes.

The Main Points

Owing to limitations in their infrastructure, legacy finite element codes have not kept pace with important developments that occurred after the mid-1970s.

The practice of finite element modeling will have to be replaced by numerical simulation.  The changes will be forced by the technical requirements of XAI.

References

[1]  O. C. Zienkiewicz and Y. K. Cheung, The Finite Element Method in Structural and Continuum Mechanics, London: McGraw-Hill, 1967.

[2]  Abaqus Analysis User's Guide. http://130.149.89.49:2080/v6.14/books/usb/default.htm

[3]  G. Strang and G. J. Fix, An Analysis of the Finite Element Method, Englewood Cliffs, NJ: Prentice-Hall, 1973.

[4]  B. Szabó and I. Babuška, Finite Element Analysis: Method, Verification and Validation, 2nd ed., Hoboken, NJ: John Wiley & Sons, Inc., 2021.

Obstacles to Progress
https://www.esrd.com/obstacles-to-progress/
Tue, 24 Oct 2023

By Dr. Barna Szabó
Engineering Software Research and Development, Inc.
St. Louis, Missouri USA


Thomas Kuhn, a professor at MIT and a highly influential philosopher of science, was interested in how science progresses as opposed to how it is generally believed to be progressing.  He found that progress occurs in fits and starts, rather than through a steady accumulation of knowledge.  Typically, a period of normal science is followed by a period of stagnation which is prolonged by the tendency of professionals to develop dogmatic adherence to a paradigm.  In the period of stagnation, evidence accumulates that the methodology being developed is incapable of handling certain classes of problems.  This leads to a model crisis, followed by a paradigm shift and the start of a new phase of normal science.

Photograph of Thomas Kuhn (via Magdalenaday.com).

While Kuhn was thinking of science as a whole, his observations are particularly fitting in the applied sciences where changing an accepted paradigm is greatly complicated by the fact that methods based on it may have been incorporated in the workflows of industrial organizations.

The development of the finite element method (FEM) follows a similar but more complex pattern consisting of two main branches: the art of finite element modeling and the science of finite element analysis.

The Art of Finite Element Modeling

The art of finite element modeling evolved from the pioneering work of engineers in the aerospace sector.  They were familiar with the matrix methods of structural analysis and sought to extend them to the solution of elastostatic problems, initially in two dimensions.  They constructed triangular and quadrilateral elements by establishing linear relationships between nodal forces and displacements.

This work was greatly accelerated by the US space program in the 1960s.   In 1965 NASA awarded a contract for the development of a “general purpose” finite element analysis program, which was later named NASTRAN.  NASTRAN and the other legacy codes were designed based on the understanding of the finite element method that existed in the 1960s.  Unfortunately, the software architecture of legacy codes imposed limitations that prevented these codes from keeping pace with subsequent scientific developments in finite element analysis.

Legacy finite element codes were designed to support finite element modeling, which is an intuitive construction of a numerical problem by assembling elements from the library of a legacy finite element software product.  Through artful selection of the elements, the constraints, and the loads, the force-displacement relationships can be estimated with reasonable accuracy. Note that a nodal force is an abstract entity, derived from the generalized formulation, not to be confused with concentrated forces, which are inadmissible in two- and three-dimensional elasticity.  This point was not yet clearly understood by the developers of legacy codes, who relied on early papers and the first book [1] on the finite element method.

The Science of Finite Element Analysis

Exploration of the mathematical foundations of the finite element method began in the early 1970s, well after the architecture of legacy finite element software products took shape.  The finite element method was viewed as a method by which the exact solutions of partial differential equations cast in variational form are approximated [2].  Of interest are: (a) the rate of convergence in a norm that depends on the formulation, (b) the stability of the sequence of numerical problems corresponding to an increasing number of degrees of freedom, (c) the estimation and control of the errors of approximation in the quantities of interest.
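A basic instance of (c) is extrapolation of a quantity of interest from a sequence of solutions. The sketch below is illustrative only: it assumes the hypothetical model Q(n) = Q∞ + C·n^(−β), typical of smooth problems, and uses synthetic data with an assumed limit of 2.0 standing in for three finite element solutions on successively refined discretizations. Aitken's delta-squared formula then recovers the limit and an estimate of the relative error of the finest solution:

```python
def aitken(Q1, Q2, Q3):
    """Aitken delta-squared extrapolation: exact when the error Q_i - Q_inf
    decays geometrically, e.g. Q(n) = Q_inf + C*n**(-beta) with n doubling."""
    d1, d2 = Q2 - Q1, Q3 - Q2
    return Q3 - d2 * d2 / (d2 - d1)

# synthetic convergence data: assumed limit 2.0, C = 0.5, beta = 1.5
Q_inf, C, beta = 2.0, 0.5, 1.5
Q = [Q_inf + C * n ** -beta for n in (100, 200, 400)]

estimate = aitken(*Q)
rel_error = abs(Q[-1] - estimate) / abs(estimate)
print(estimate)   # recovers the limit 2.0
print(rel_error)  # estimated relative error of the finest solution
```

Estimates of this kind are meaningful only when the underlying sequence actually converges, which is precisely what variational crimes destroy.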

The mathematical foundations of finite element analysis were substantially established by the mid-1980s, and finite element analysis emerged as a branch of applied mathematics.

Stagnation in Finite Element Modeling

Legacy finite element codes came to be widely used in engineering practice before the theoretical foundations of the finite element method were firmly established. This led to the emergence of a culture of finite element modeling based on the pre-scientific understanding of the finite element method. There were attempts to incorporate adaptive control of the errors of approximation; these attempts failed because adaptive error control is possible only when the underlying mathematical problem is well defined (i.e., an exact solution exists), which is not the case for most industrial-scale finite element models.

The primary causes of stagnation are:

  • The organizations that rely on computed information have not required solution verification, which is an essential technical requirement in numerical simulation.
  • The vendors of legacy finite element software tools have not kept pace with the growth of the knowledge base of the finite element method.

Outlook

The knowledge base of finite element analysis (FEA) is currently much larger than what is available to practicing engineers through legacy finite element software tools. Linking numerical simulation with explainable artificial intelligence (XAI) tools will impose requirements for reliability, traceability, and auditability. To meet those requirements, software vendors will have to abandon old paradigms and implement state-of-the-art algorithms for solution verification and hierarchical modeling [3].

References

[1] Zienkiewicz, O.C. and Cheung, Y.K. The Finite Element Method in Structural and Continuum Mechanics. London: McGraw-Hill, 1967.

[2] Babuška, I. and Aziz, A.K. Lectures on mathematical foundations of the finite element method. Report ORO-3443-42; BN-748. University of Maryland, College Park, Institute for Fluid Dynamics and Applied Mathematics, 1972.

[3] Szabó, B. and Babuška, I. Finite Element Analysis: Method, Verification and Validation, 2nd ed. Hoboken, NJ: John Wiley & Sons, Inc., 2021.

Why is Simulation Governance Essential for the Reliable Deployment of FEA-Based Engineering Simulation Apps?
https://www.esrd.com/simulation-governance-essential-for-deployment-of-fea-based-sim-apps/
Tue, 08 May 2018
SAINT LOUIS, MISSOURI – May 7, 2018

ESRD President and CEO Dr. Ricardo Actis

Finite element modeling originated in the aerospace industry over 60 years ago. Owing to the level of expertise and experience required, it has remained a practice of analysts. There are many reasons for this; getting the right mesh for a problem, and getting the mesh right, is near the top of the list of why obtaining a solution takes both an expert and much time. Add to this the expertise required to navigate the minefield of multi-purpose finite element software tools: selecting the "right" elements from an ever-expanding element library and the "right" values of tuning parameters to overcome various deficiencies in their implementations.

Yet, looking at this more closely, the focus should not be the level of experience or modeling skills of the user, but the level of intelligence in the software. Nearly all of the most popular legacy FEA software products were designed to support the practice of finite element modeling, and as such none of them can provide a simple QA dashboard advising the non-expert user whether a good solution has been obtained.

Splice joint stress contours generated by ESRD’s Multi-Fastener Analysis Tool (MFAT) Sim App

How then can the vision for expanding the use of numerical simulation by persons who do not have expertise in finite element analysis (FEA) be safely realized? The solution lies in the establishment of Simulation Governance through the development and dissemination of expert-designed Engineering Simulation Apps to ensure the level of reliability and consistency needed for widespread adoption.

The Key Ingredient for FEA-Based Simulation Apps

FEA-based Simulation Apps, which standardize and automate recurring analysis tasks and process workflows for use by persons who do not have expertise in FEA, must be designed by expert analysts to fit into existing analysis processes, capturing institutional knowledge and best practices to produce consistent results by tested and approved analysis procedures. Only by meeting the technical requirements of Simulation Governance can simulation apps have the reliability and robustness needed to support engineering decision-making processes!

Simulation Governance must be understood as a managerial function that provides a framework for the exercise of command and control over all aspects of numerical simulation through the establishment of processes for the systematic improvement of the tools of engineering decision-making over time. This includes the proper formulation of idealizations, the selection and adoption of the best available simulation technology, the management of experimental data, verification of input data and verification of the numerical solution.

Establishing the Proper Framework

Double lap joint inputs for ESRD’s Single Fastener Analysis Tool (SFAT) Smart Sim App.

In the creation of FEA-based Simulation Apps for the application of established design rules, data verification and solution verification are essential. The goal is to ensure that the data are used properly and that the numerical errors in the quantities of interest are reasonably small: the apps must have built-in safeguards to prevent use outside of the range of parameters for which they were designed; they must incorporate automatic procedures for solution verification; and they must be deployed with a detailed description of all assumptions incorporated in the mathematical model and a clear definition of the range and scope of application.
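Such safeguards amount to straightforward input and output gating. The sketch below is purely illustrative; the class, parameter names, ranges, and error threshold are hypothetical inventions for this example, not those of any ESRD product. Inputs outside the validated range are rejected, and a result is released only when its estimated relative error is within tolerance:

```python
class ValidatedRange:
    """Reject inputs outside the range for which the app was validated."""

    def __init__(self, lo, hi, name):
        self.lo, self.hi, self.name = lo, hi, name

    def check(self, value):
        if not (self.lo <= value <= self.hi):
            raise ValueError(
                f"{self.name}={value} outside validated range "
                f"[{self.lo}, {self.hi}]")
        return value

# hypothetical ranges for which the app's model was validated
DIAMETER = ValidatedRange(4.0, 10.0, "fastener diameter (mm)")
E_OVER_D = ValidatedRange(1.5, 3.0, "edge distance / diameter")

def report(quantity, est_rel_error, tol=0.05):
    """Release a result only if its estimated relative error is within tol."""
    if est_rel_error > tol:
        raise RuntimeError(
            f"solution verification failed: estimated relative error "
            f"{est_rel_error:.3f} exceeds {tol:.3f}")
    return quantity

DIAMETER.check(6.0)                  # accepted
# DIAMETER.check(12.0)               # would raise ValueError
report(123.4, est_rel_error=0.01)    # released: verification gate passed
```

The point of the design is that the gate is part of the app itself: a non-expert user cannot obtain a number without the range check and the verification check having passed.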

To ensure their proper use, Simulation Apps must incorporate estimation of relative errors in the quantities of interest, an essential technical requirement of Simulation Governance. They should not be deployed without objective measures of the approximation errors for all the reported results. The success of the vision of Democratization of Simulation depends on it!

What Bottlenecks Limit the Adoption of Simulation Governance?
https://www.esrd.com/simulation-governance-bottlenecks/
Thu, 03 Oct 2019
SAINT LOUIS, MISSOURI – October 2, 2019

ESRD Chairman and Co-Founder Dr. Barna Szabó

The idea of simulation governance is easy to understand:  The application of numerical simulation technology must be properly governed in every organization. The responsibility for simulation governance rests with board-level executives.  They exercise this responsibility through setting the goals, objectives and metrics to ensure that the economic value of numerical simulation is positive.  If numerical simulation is not managed properly then it can and often does lead to poor decisions that result in economic loss.  There are many well-documented examples of substantial economic loss that can be attributed to lack of simulation governance and management.

Recognizing the need to clarify the issues associated with the governance and management of numerical simulation, the Simulation Governance and Management Working Group of NAFEMS[1] has undertaken to develop a document on “What is Simulation Governance and Management”. An extract was published in the July 2019 issue of the NAFEMS quarterly Benchmark.

Beware the bottlenecks preventing the adoption of simulation governance.

While the concept is easily grasped, the existence of two bottlenecks must be recognized:  The first is that board-level executives generally lack the expertise to properly formulate realistic policies, expectations and metrics for numerical simulation and to assess the economic risks associated with numerical simulation activities within their organization.  Therefore it is necessary to engage outside consultants who have proper credentials and experience.   The problem is that it is extremely difficult to find consultants who are competent in this area.  This is because there are very few experts in numerical simulation and the consultant must also understand the intricacies of corporate change management.

The second bottleneck is that, with very rare exceptions, the technical staff do not have a clear understanding of what numerical simulation is.  There are historical reasons for this:  The primary tool of numerical simulation is the finite element method.  The legacy finite element codes, whose origins can be traced to the 1960s and 70s, were not designed to support numerical simulation.  In fact, numerical simulation, as the term is understood today, did not yet exist at that time.

An essential aspect of numerical simulation is that mathematical models must be formulated independently from how the numerical solution is obtained.  In contrast, legacy finite element software tools have the model definition and approximation conflated in their element libraries, making it very difficult, at times impossible, to separate model form errors from numerical approximation errors.  As a consequence, solution verification and model validation, which are essential elements of numerical simulation, cannot be reliably performed.

To eliminate this bottleneck it will be necessary to re-train the technical staff so that they will become proficient in the use of quality assurance procedures in numerical simulation.

ESRD/Revolution in Simulation Webinar on Democratization of Simulation.

Management should also seek professional advice on the benefits and risks associated with Democratization of Simulation which must be subject to the rules of simulation governance.  The term means that numerical simulation tools are made available to engineers who do not have expertise in numerical simulation.  The goal is to increase productivity without sacrificing reliability through standardization of recurring numerical simulation tasks.  These tools must be carefully designed, tested and certified by experts for safe and efficient use.

The reasons why democratization tools should not be deployed without meeting the technical requirements of simulation governance are outlined in a 2018 presentation as well as in a joint webinar hosted by ESRD & Revolution in Simulation.

[1] NAFEMS is the International Association for the Engineering Modelling, Analysis and Simulation Community.  The parent organization was the National Agency for Finite Element Methods and Standards, established in the UK in 1983 with the objective to promote the safe and reliable use of finite element and related technology.

References

NASA Standard 7009: This NASA Technical Standard provides an approved set of requirements, recommendations, and criteria with which models and simulations (M&S) may be developed, accepted, and used in support of NASA activities.

Szabó B and Actis R. Simulation governance: Technical requirements for mechanical design. Computer Methods in Applied Mechanics and Engineering.  249-252, 158-168, 2012.

S.A.F.E.R. Numerical Simulation for Structural Analysis in the Aerospace Industry Part 4: Simulation Governance
https://www.esrd.com/safer-numerical-simulation-structural-analysis-part-4/
Mon, 26 Feb 2018
SAINT LOUIS, MISSOURI – February 26, 2018

In our last post in this series we examined why finite element modeling, as it has historically been practiced, is not the same as numerical simulation. The difference is important for both users of FEA software and all those who depend on the reliability of engineering calculations, such as margins of safety from detail stress analyses that use the results of a simulation. In this article we will explore the topic of Simulation Governance, and why both simulation users and their managers on A&D programs should care.

The Importance of Verification & Validation

V&V activities as defined in ASME’s Guide to V&V in Computational Solid Mechanics

The American Society of Mechanical Engineers (ASME) published its first guideline on verification and validation (V&V) in computational solid mechanics in 2006. The main point was this: since engineering decisions are based on computed information, assurance of the quality of that information is essential.

Since then V&V has become a more timely topic across many industries, but especially so in A&D, for good reasons. Foremost is that as computational software tools have become more capable and pervasive, they have also become more complex and risky to use. This is true not just for the novice or occasional user but even for the expert or routine user. Likewise, the demands on the performance and durability of aerostructures have continued to increase at an unrelenting pace. With more engineering time spent in the design phase, a greater reliance on digital simulation and virtual testing has reduced the time available for physical prototyping and operational testing. This is a dangerous combination of trends that suggests status quo processes may no longer be sufficient.

Historically, the A&D industry has responded to the requirements of V&V through ad-hoc strategies that were largely based on proprietary methods and institutional knowledge. This was satisfactory until recently because of accommodations in the schedule for prototyping, flight testing, and fixes with every new block number. Many A&D OEMs had their own engineering methods, technical support, and process improvement groups whose function included standardizing analysis tools to assure some level of quality control. However, rarely was there a single group or well-defined role with explicit responsibility for oversight of engineering quality from a V&V perspective.

With respect to the implementation of V&V procedures, there is seldom a clear understanding of, or even agreement on, the fundamentals. Many engineering managers and even some regulatory agencies often confuse or intermix verification of codes and validation of results. Verification has a much broader reach and involves data verification, code verification and solution verification. What is almost always overlooked is that solution verification is a prerequisite to model validation, and that without uncertainty quantification it is practically impossible to define reliable validation metrics. In the absence of solution verification, it was not uncommon for the results of validation testing to be fed back to tune the finite element models and correct for errors of unknown sources.

Without more attention to the direct support of V&V it is no surprise that the results from the finite element “modeling” of the same engineering problem can vary greatly between different simulation users, idealization approaches, and codes. Without this managerial oversight most discrepancies and deliberations are delegated back to the experts to resolve where discussions almost always fall back to questions about idealizations (e.g. element types) and discretizations (e.g. the mesh). The lack of a governance framework with supporting policies, practices, and tools to assess the quality of engineering predictions left no recourse except to ask the expert who performed the work to defend its credibility.

What is Simulation Governance?

NAFEMS has named Simulation Governance a “Big Issue”. But what is it?

The concept of simulation governance when used within the context of mechanical and structural engineering was introduced in 2012 when Dr. Barna Szabo and Dr. Ricardo Actis of Engineering Software Research & Development (ESRD) published the first reference to and definition of the term. This paper was titled ‘Simulation Governance: Technical requirements for mechanical design’, and is available for download via ScienceDirect. Recently, in 2016, the authors expanded upon this paper with a NAFEMS webinar titled “Simulation Governance: Technical Requirements”.

The term simulation governance refers to procedures established for the purposes of ensuring and enhancing the reliability of predictions based on numerical simulation. Simulation governance is a managerial function concerned with the exercise of command and control over all aspects of numerical simulation. At the 2017 NAFEMS World Congress in Stockholm simulation governance was identified as the first of eight “big issues” in numerical simulation.

The application of simulation governance is concerned with (a) selection and adoption of the best available simulation technology for the intended use of the model, (b) formulation of mathematical models, (c) management of experimental data, (d) data and solution verification procedures, and (e) revision of mathematical models in the light of new information collected from physical experiments and field observations.

A plan for adopting simulation governance has to be tailored to fit the mission of each organization. If that mission is to apply established rules of design and certification then emphasis is on solution verification and standardization, which includes the creation of simulation apps, templates, or automated workflow processes. If, on the other hand, that mission is to formulate design rules then verification and validation both must be part of the plan.

When these elements are adequately addressed, the practice of simulation governance provides guidance on how best to implement V&V in everyday engineering work, especially when it is directly supported by and incorporated into the software tools employed by the expert simulation engineer, and even more so by the non-expert occasional user. With a new generation of numerical simulation software, like ESRD’s StressCheck, it is now possible for users to explicitly support the requirements and best practices of engineering verification and validation.

Why Simulation Governance is Important to the A&D Industry

Simulation governance must be implemented to achieve democratization of simulation in A&D (Click to download).

The technical and economic value of numerical simulation performed by engineers in the aviation, aerospace, and defense industries is well established. As the performance requirements and complexity of the products that engineers in these industries design and maintain have dramatically increased, this value has become even greater. This in turn has fueled the need to perform more simulation of greater complexity, with as much conducted as early in the design cycle as possible. This has created additional demands on the engineering organization to improve the volume, thoroughness, fidelity, and speed of analysis work, all without a loss in confidence or reliability regardless of the expertise of the engineer.

The importance of simulation governance stems from the acknowledgement that the practice of finite element modeling, especially in the A&D industry, is inherently a complex and error-prone activity whose results often vary greatly depending on the user, the idealizations, and the code. As the value of, and the risk from, the simulation function increases, the practice of simulation governance becomes critical for ensuring the reliability and robustness of FEA-based analysis in support of the aerostructures design function. The practice of simulation governance requires a continuous investment in the training of simulation professionals and in the maturity of software tools, engineering methods, automated processes, standards, and best practices.

Nowhere is this more evident than in the various schemes devised over the years to promote the use of FEA software earlier in the design cycle by general design engineers without expert training. The results have been disappointing, for many valid reasons. More recently there has been much discussion about the democratization of simulation to enable more simulation-led design earlier in the systems engineering “Vee”. But is this realistic in industries like A&D?

The answer lies in the practice of simulation governance which provides guardrails to ensure that the most difficult computational problems can be solved by experts with confidence, while more routine analysis in support of design decisions can be performed by engineers without expert training. The admirable vision for expanding the use of simulation by non‐experts cannot be safely realized unless a new approach to FEA based upon numerical simulation and simulation governance emerges to replace the art of finite element modeling as it has been practiced up to now.

Once this occurs, the standardization, automation, and democratization of the engineering simulation function through the adherence to the practice of simulation governance can offer many benefits to the A&D industry at the engineering, product, program and business levels. These benefits include encapsulating complexity, improving productivity, containing cost, and ensuring reliability for the expert simulation analyst and non‐expert design engineer alike.

An Example from the Application of Simulation Governance in A&D

With adherence to the principles of simulation governance it becomes feasible for more simulation processes to be standardized, then automated, and finally democratized for routine use by experts and non-experts alike. One example is a set of Smart Sim Apps that populate Digital CAE Handbooks of recurring analysis tasks.

A particular example of a Smart Sim App is shown in the 1-minute demo below, in which a standardized analysis of a laminated composite “PI-Joint” containing a delamination is performed via ESRD’s CAE Handbook:

The use of high-aspect-ratio solid elements for very thin domains (parent mesh), combined with parametric auto-lamination capabilities, makes it possible to quickly update laminated composite joint geometries, the number of plies, and the stacking sequence to produce a ply-by-ply mesh without FEA know-how. Using this enabling technology, the PI-Joint Smart Sim App was constructed, vetted, and deployed by a single expert user as a standardized solution for use by non-experts, with all of the input parameters, solution steps, and extraction criteria carefully implemented and controlled. The PI-Joint updates automatically upon valid parameter changes (as defined by the expert’s rules), solves the configuration (e.g., a linear sequence of solutions), and automatically extracts and reports the 3D energy release rate along the delamination front. Without simulation governance, such an app could not be reliably deployed and consumed.

Coming Up Next…

In our next and final S.A.F.E.R. Simulation post in this series we will profile stress analysis software such as ESRD’s StressCheck Professional, which is based on numerical simulation technology that provides intrinsic capabilities for solution verification and validation through the use of hierarchic finite element spaces and a hierarchic modeling framework. StressCheck Professional facilitates the practice of Simulation Governance by the A&D engineering organization to make simulation Simple, Accurate, Fast, Efficient, and Reliable for experts and non-experts alike.

To receive future S.A.F.E.R. Simulation posts…

S.A.F.E.R. Simulation Views: Toward Simulation-Driven Design
https://www.esrd.com/safer-simulation-views-toward-simulation-driven-design/
Mon, 19 Feb 2018

Q&A’s with colleagues, experts and industry leaders

Introducing S.A.F.E.R. Simulation Views, where we invite colleagues, simulation experts, and A&D industry leaders to address simple questions about simulation.

In this edition of S.A.F.E.R. Simulation Views, we asked ESRD President & CEO Dr. Ricardo Actis about simulation-driven design.

Q: What do you see as an important issue or challenge faced by the A&D industry to achieve the wider, more reliable use of simulation software such that the vision of simulation-driven design can be realized?

Dr. Actis responds:

ESRD President and CEO Dr. Ricardo Actis

While finite element modeling originated in the aerospace industry in the 1960s, it has unfortunately remained largely the practice of FEA specialists because of the level of expertise and experience required. There are many valid reasons for this, and they often confuse engineering managers, but getting the right mesh for a problem, and getting that mesh right, is always near the top of the list of reasons why it takes an expert, and a great deal of time, to obtain a good solution.

Yet looking at this more closely, the focus should not be the level of modeling skill of the user, but the level of intelligence in the software. Nearly all of the most popular legacy FEA packages used by the A&D industry are based on the practice of finite element modeling, and as such none of them provides a simple Q/A dashboard to advise the non-expert user whether a good solution has been obtained.

Only the new generation of numerical simulation software based on hierarchic modeling that supports the practice of simulation governance can offer this capability to make simulation by non-experts simple, accurate, fast, efficient, and reliable.

Up Next…

We will pose the same question to a user of simulation software solving problems in the A&D industry.  Stay tuned!

Share Your Views on Simulation…

S.A.F.E.R. Numerical Simulation for Structural Analysis in the Aerospace Industry Part 3: FEM is not Numerical Simulation
https://www.esrd.com/safer-numerical-simulation-structural-analysis-part-3/

SAINT LOUIS, MISSOURI – January 29, 2018

In ESRD’s last S.A.F.E.R. Simulation post, we profiled the challenges that structural engineers encounter when using legacy-generation finite element analysis (FEA) software to perform high-fidelity stress analysis of mixed metallic and composite aerostructures designed to be higher-performing and more damage-tolerant over longer lifecycles. We went so far as to question whether, because of these challenges, the democratization of simulation is a realistic or safe goal if it means putting the current generation of expert FEA software tools into the hands of the occasional user or non-expert design engineer.

In this third of our multi-part series on “S.A.F.E.R. Numerical Simulation for Structural Analysis in the Aerospace Industry” we will examine why Numerical Simulation is not the same as Finite Element Modeling and what this means to the structural analysis function within the A&D industry. We’ll start by sharing a brief history of the finite element method.

Legacy Finite Element Modeling

The finite element method (FEM) used to solve problems in computational solid mechanics is now over 50 years old. Over the decades there have been many commercial improvements in analysis capabilities, user interfaces, pre/post processors, high-performance computing, pricing and licensing options. However, what has not changed is that nearly all of the large general-purpose FEA codes in use today for mechanical analysis are built upon the same underlying legacy FEM technology base.

How can someone identify a legacy FEA code? If it has more element types than you can count on one hand, if answers are dependent on the mesh, if different analysis types (e.g. linear, non-linear, thermal, modal) require different elements and models using different computational schemes, or if there is no explicit measurement of solution accuracy, then it is based on legacy FEM.

In these older implementations of the FEM, the focus of the analyst shifts from the elegance of the underlying method to the minutiae of the model. These details include the selection of element types, the location of nodes, and the refinement of the mesh, among a long list of user-dependent judgement calls, assumptions, and decisions. It is no surprise that for many users of legacy FEA codes the task of creating the right mesh consumes most of the analyst’s time, such that once an answer is obtained there is little time or patience left to verify whether it is the right answer.

A Brief History of the Finite Element Method

The first paper on the FEM was published 62 years ago. What has changed since then?

The first paper on the finite element method was published in 1956. The next year the Soviet Union launched the first satellite (Sputnik) and the space race began. This brought significant investment in engineering projects in support of the U.S. space program.

In 1965 NASA issued a request for proposal for a structural analysis computer program that eventually led to the development of the finite element analysis software NASTRAN. This marks the beginning of the development of legacy FEA software, and of the practice of finite element modeling.

Early work on developing the FEM was performed largely by engineers. The first mathematical papers were published in 1972. This is an important milestone because mathematicians view finite element analysis very differently from engineers. Engineers think of FEA as a modeling exercise that permits joining various elements selected from a finite element library to approximate the physical response of structural components when subjected to applied loads.

Mathematicians on the other hand, view FEA as a method to obtain approximate solutions to mathematical problems. For example, the equations of linear elasticity, together with the solution domain, the material properties, the loading and the constraints define a mathematical problem that has a unique exact solution which can be approximated by the finite element method.
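The kind of mathematical problem meant here can be written down explicitly. The following is the standard textbook strong form of the linear elasticity problem, stated here for illustration rather than quoted from the original post; the symbols (displacement $u$, stress $\sigma$, strain $\varepsilon$, body force $f$, elasticity tensor $C$) are the conventional ones.

```latex
% Linear elasticity on a domain \Omega with boundary \Gamma_u \cup \Gamma_t:
-\operatorname{div}\sigma(u) = f \quad \text{in } \Omega, \qquad
\sigma(u) = C : \varepsilon(u), \qquad
\varepsilon(u) = \tfrac{1}{2}\left(\nabla u + (\nabla u)^{T}\right),
\\
u = \bar{u} \ \text{on } \Gamma_u, \qquad
\sigma(u)\, n = \bar{t} \ \text{on } \Gamma_t .
```

Under the usual assumptions on the data, this problem has a unique exact solution $u_{EX}$; the finite element solution $u_{FE}$ approximates $u_{EX}$, and the discretization error is the difference $u_{EX} - u_{FE}$.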

Two main features characterize the early implementations of the emerging technology that led to the current practice of finite element modeling: (1) large element libraries and (2) the lack of intrinsic procedures for solution verification.

Large element libraries put the burden on the analyst to make appropriate modeling decisions (e.g. why is reduced integration better?)

In 1981 it was proven, and demonstrated for a large class of structural problems, that the rate at which the discretization error is reduced when the order of the approximating functions is increased over a fixed mesh is at least twice the rate achieved by mesh refinement with the order of the approximating functions held constant.

In 1984 it was shown that exponential rates of convergence are possible when hierarchic finite element spaces are used with properly graded meshes, which is very significant in the modeling of bodies with cracks. The year 1984 marks the beginning of the development of Numerical Simulation Software (NSS) and of the engineering practice of numerical simulation.
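The two results cited above are conveniently summarized by the standard a priori error estimates in the energy norm. This is a generic textbook formulation, not a formula from the original post; $N$ denotes the number of degrees of freedom and $C$, $\beta$, $\gamma$, $\theta$ are positive constants depending on the problem and the discretization.

```latex
% Algebraic convergence (h-extension at fixed p, or p-extension on a fixed
% mesh); for problems with singular solutions the rate \beta achieved by
% p-extension is at least twice that of h-extension:
\| u_{EX} - u_{FE} \|_{E} \le \frac{C}{N^{\beta}}

% Exponential convergence (hp-extension on properly graded meshes):
\| u_{EX} - u_{FE} \|_{E} \le \frac{C}{\exp\left(\gamma N^{\theta}\right)}
```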

The implementation of hierarchic models, in which any mathematical model can be viewed as a special case of a more comprehensive model, began in 1991. This development highlighted the essential difference between numerical simulation software and legacy finite element modeling software.

In legacy FEA software, the mathematical model and the numerical approximation are combined, resulting in the development of large element libraries: for example, a QUAD element for linear elasticity, a different QUAD element for nonlinear elasticity, yet another QUAD element for plates, and several variations of each.

In numerical simulation software the mathematical model is treated separately from its numerical approximation. Through the use of hierarchic finite element spaces and hierarchic models it is possible to control numerical errors separately from modeling errors. As a result, there is no need for large finite element libraries or for different element types for different types of analyses. Solution verification is an intrinsic capability, enabling a new generation of numerical simulation software that produces results far less dependent on the mesh or the user.
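The phrase “hierarchic finite element spaces” can be made concrete with a small sketch. The construction below, based on integrated Legendre polynomials, is the classical 1D hierarchic basis described in the FEM literature; the function name and interface are illustrative assumptions, not code from StressCheck or any other product. The defining property is that raising the polynomial order p only appends new shape functions: the lower-order ones are reused unchanged, which is what allows a sequence of solutions of increasing order to be computed on a fixed mesh.

```python
import numpy as np
from numpy.polynomial import legendre as leg


def hierarchic_shape_functions(p, xi):
    """1D hierarchic shape functions on the standard element [-1, 1].

    phi_1, phi_2 are the linear nodal modes; phi_{i+1} for i >= 2 are
    internal ("bubble") modes built from integrated Legendre polynomials:
        phi_{i+1}(xi) = (P_i(xi) - P_{i-2}(xi)) / sqrt(2*(2*i - 1))
    Increasing p only appends functions; lower-order ones are unchanged.
    """
    xi = np.asarray(xi, dtype=float)
    funcs = [(1.0 - xi) / 2.0, (1.0 + xi) / 2.0]  # linear nodal modes
    for i in range(2, p + 1):
        p_i = leg.legval(xi, [0.0] * i + [1.0])          # Legendre P_i
        p_im2 = leg.legval(xi, [0.0] * (i - 2) + [1.0])  # Legendre P_{i-2}
        funcs.append((p_i - p_im2) / np.sqrt(2.0 * (2.0 * i - 1.0)))
    return funcs
```

Because each bubble mode vanishes at both element ends, inter-element continuity is unaffected when p is raised, and the stiffness matrix computed at order p is nested inside the one computed at order p+1.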

To learn more, watch ESRD’s 5-minute executive summary of the History of FEA and skim our decade-by-decade History of FEA timeline:

Finite Element Modeling Is Not Numerical Simulation

Numerical simulation, unlike finite element modeling, is a predictive computational science that can be used reliably by both the expert simulation analyst and the non-specialist engineer. Thus, a prerequisite for any numerical simulation software product is that it provide a quantitative assessment of the quality of the results. Software lacking that capability still requires an expert to judge subjectively whether the solution is valid, and it fails the most basic requirement of solution verification.
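A quantitative quality assessment can be illustrated with a minimal sketch. Suppose the same quantity of interest has been computed at three successive polynomial orders on a fixed mesh. If the sequence converges roughly geometrically, Aitken extrapolation yields an estimated limit and hence an estimated relative error. This is a simplified illustration of the principle under an assumed geometric convergence; it is not the algorithm of any specific product, which would typically extrapolate in the energy norm against the number of degrees of freedom.

```python
def estimate_error(q1, q2, q3):
    """Estimate the converged value and relative error of a quantity of
    interest from three successive solutions (e.g. orders p, p+1, p+2 on a
    fixed mesh), assuming the error decreases geometrically.

    Aitken delta-squared extrapolation:
        q_inf ~= q3 - (q3 - q2)**2 / ((q3 - q2) - (q2 - q1))
    Returns (estimated limit, estimated relative error of q3).
    """
    d1, d2 = q2 - q1, q3 - q2
    if abs(d2 - d1) < 1e-30:  # sequence already converged (or not geometric)
        return q3, 0.0
    q_inf = q3 - d2 ** 2 / (d2 - d1)
    rel_err = abs(q_inf - q3) / max(abs(q_inf), 1e-30)
    return q_inf, rel_err


# Synthetic sequence converging geometrically to 100.0:
q_inf, err = estimate_error(96.0, 98.0, 99.0)  # q_inf = 100.0, err = 0.01
```

An analyst (or an automated dashboard) would accept the result only if the estimated relative error falls below a prescribed tolerance; otherwise the order is raised and the check repeated.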

Finite element modeling is the practice of breaking the strict rules of implementation of the finite element method, such that the definition of the model and the attributes of the approximation are mixed, and the errors of idealization and discretization cannot be separated. The model is re-imagined as LEGO®-block connections at nodes, with different idealizations and elements ambiguously mixed and matched. An example of finite element modeling is connecting 3D solid elements to 1D beam elements (mixing 3D and 1D elasticity). This is a dangerous practice: the model may not have an exact solution, and the results will be mesh-dependent.

Numerical Simulation vs Finite Element Modeling: Key Facts

In numerical simulation an idea of a physical reality is precisely stated in the form of mathematical equations. Model-form errors and the errors of numerical approximation are estimated and controlled separately. Numerical simulation is the practice of employing properly implemented computational techniques and methods for solving mathematical models discretely. The mathematical model representing the problem at hand (i.e., the idealization) and the computational techniques employed to solve it (i.e., the discretization) should always be kept independent of one another. In the practice of legacy finite element modeling the idealization and the discretization are mixed, making it impossible to ascertain whether differences between predictions and experiments are due to idealization errors, discretization errors, or both.

The implementation of hierarchic finite element spaces and models in the latest generation of numerical simulation software is quite different from that of legacy FEA codes, yet there is still confusion among users and software providers alike. True numerical simulation is most definitely not achieved by adding yet another set of element types to an existing legacy FEA software framework, nor by pretending that the finite element mesh is hidden. The implementation of hierarchic finite element spaces makes it possible to estimate and control the errors of approximation in professional practice, while model hierarchies enable the assessment of the influence of modeling decisions (idealizations) on the predictions; neither is possible with legacy FEA implementations.

Numerical Simulation supports a simple, hierarchic element library based only on topology

What this means for users: degrees of freedom are no longer associated with, or locked to, nodes; design geometry can be mapped exactly without extensive defeaturing; the large, spanning 3D solid geometries of thin structures typical in aerospace can be modeled with 3D finite elements without simplification; high-fidelity solutions are continuous throughout the domain regardless of the mesh topology; and post-processing becomes live, dynamic extraction of any function, anywhere in the model, at any time, with no a priori knowledge of the solution required.

Most importantly for the engineering analysis function, numerical simulation software is inherently more tolerant of changes to geometries, loads, or other boundary conditions arising from aerostructure design changes or revised operating conditions. Numerical simulation, unlike finite element modeling, is no longer about selecting the best elements to use, or generating the right mesh and then getting that mesh right. An objective measure of solution quality for all quantities of interest, combined with hierarchic finite element spaces, enables accurate results even with low-density “ugly” meshes.

Numerical simulation means that simulation-led design becomes more realistic for aerospace engineering groups, as does the democratization and appification of simulation for non-expert usage. The lack of automatic verification procedures is the most compelling reason to rule out the practice of finite element modeling in the creation of simulation apps that attempt to democratize the use of simulation.

Coming Up Next…

In the next S.A.F.E.R. Simulation post we will describe how the practice of Simulation Governance, enabled by a new generation of engineering analysis software based on Verifiable Numerical Simulation (VNS), such as ESRD’s StressCheck, is helping engineering groups respond to an avalanche of increasing performance requirements, interdependent complexity, and technical risk in their products, processes, and tools. In the process, it makes structural analysis of aerostructures using finite element numerical simulation software more Simple, Accurate, Fast, Efficient, and Reliable for the new and expert user alike.

To receive future S.A.F.E.R. Simulation posts…
