Multiscale Analysis: A General Overview and Its Applications in Material Design

Multiscale modeling is a broadly used term to describe any situation where a physical problem is solved by capturing a system’s behavior and important features at multiple scales, particularly multiple spatial and/or temporal scales. Applications for multiscale analysis include fluid flow analysis, weather prediction, operations research, and structural analysis, to name a few. 

Multiscale Analysis of Advanced Materials

Modeling advanced materials accurately is extremely complex because of the high number of variables at play. The materials in question are heterogeneous in nature, meaning they have more than one pure constituent, e.g. carbon fiber + polymer resin or sedimentary rock + gaseous pores. 

While heterogeneity offers huge advantages in performance (making airplanes, space shuttles and lightweight cars possible), it also introduces difficulties in the engineering design. Presently, there is not enough computational power to include all the important details within a single Finite Element (FE) model, as is customary in industry, because that would require a high-resolution model too complex to be feasibly solved. 

While the simulation and analysis technology for metal structures such as car frames is quite robust, the analysis of novel “advanced materials” is lagging behind. The consensus is that using conventional techniques (standard FEA), it is not possible to accurately simulate these materials without extensive experimental and empirical “calibration” data. Thus, the introduction of new materials into a structure results in increased time to market and costs. 

The most efficient solution is to use multiscale FEA to divide and conquer the problem. To accomplish this, a local scale model of the material microstructure is embedded within the global scale FE model of the part. These different scales are analyzed simultaneously and the behavior of one level affects the behavior of the other. 

Solving each scale individually and linking their results is several orders of magnitude faster than trying to solve a single high-resolution model containing all relevant details. However, not all multiscale techniques are built alike. 

The ultimate goal of multiscale simulation is to achieve balance between simulation accuracy and computational efficiency. For this reason, there are a number of different types of structural multiscale analyses that attempt to strike that balance: 

  1. Multiscale with analytically modeled microstructure 
  2. FE2
  3. TRUE Multiscale (TM) 

Analytically Modeled Microstructures

One technique used to account for microstructural nuances is to use an analytical equation to model behavior. These equations are developed empirically: controlled experiments are run, and a relationship among the relevant variables is fitted to match the observed outcomes. 

“Semi-analytical methods can be defined as direct micro-macro procedures for which the local constitutive equations and criteria are evaluated at the local scale using explicit relations to establish the link between the macroscopic behavior and microstructural responses.” 

Some of these techniques aim to homogenize the properties of the local scale; others attempt to capture nonlinear behavior via curve fitting and progressive damage approaches. Many of the most famous techniques, such as those evaluated in the World Wide Failure Exercises, are related to the analysis of unidirectional composites. There are others that relate to spherical or oblong inclusions. The key is that the user must be very aware of the assumptions and bounds of their model when employing one of these techniques. 

These methods are certainly more accurate than their single-scale, isotropic predecessors, but they fall short when analyzing novel parts or materials for which there are no historical correlations or empirical guideposts. 

Finite Element Squared (FE2)

FE2, introduced by Feyel in 1998, describes the behavior of heterogeneous structures by using finite element models at both the global and local length scales. At each integration point at the macroscopic scale, a representative volume element (RVE) is assigned and a separate finite element computation is performed simultaneously. The macroscopic behavior is thus deduced from the nonlinearities in the behavior of the associated microstructure. The model is built using three main ingredients: 

  1. A modeling of the mechanical behavior at the lower scale (the RVE)
  2. A localization rule which determines the local solutions inside the unit cell for any given overall strain
  3. A homogenization rule giving the macroscopic stress tensor, knowing the micromechanical stress rate

Because the microstructure is a finite element mesh, all standard analysis techniques, failure theories, constitutive models, etc. can be easily applied. For this reason, FE2 is known to be flexible and highly accurate, but it is often regarded as too expensive to be practical for full component designs and optimizations. In cases where nonlinearity is modeled, it is as if the number of computations required for standard FEA were being “squared.” 
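Under strong simplifying assumptions (1D, linear elastic, and a series two-phase bar standing in for the RVE), the three FE2 ingredients can be sketched in a few lines of Python. Every name and number below is illustrative, not a reflection of any real solver's implementation:

```python
# Minimal 1D "FE2-style" loop: at each macro integration point, a tiny
# RVE (a series bar of fiber and matrix segments) is solved and the
# homogenized stress is returned to the macro scale.

def solve_rve(macro_strain, moduli, fractions):
    """Localization + homogenization for a 1D series (iso-stress) RVE.

    In a series bar the stress is uniform across phases, so the
    effective modulus is the harmonic (Reuss) average.
    Returns (macro_stress, local_strains).
    """
    e_eff = 1.0 / sum(f / e for f, e in zip(fractions, moduli))
    stress = e_eff * macro_strain                  # homogenization rule
    local_strains = [stress / e for e in moduli]   # localization rule
    return stress, local_strains

# Macro model: two integration points sharing the same microstructure.
moduli = [230e9, 3.5e9]      # fiber, matrix moduli in Pa (assumed values)
fractions = [0.6, 0.4]       # volume fractions
macro_strains = [0.001, 0.002]

for eps in macro_strains:
    sigma, local = solve_rve(eps, moduli, fractions)
    print(f"macro strain {eps:.4f} -> macro stress {sigma/1e6:.2f} MPa, "
          f"local strains {[round(x, 6) for x in local]}")
```

The key property of the loop is that the volume-averaged local strain recovers the macroscopic strain, which is exactly the consistency condition a real FE2 scheme must satisfy at every integration point.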

TRUE Multiscale

TRUE Multiscale is an approach developed and utilized exclusively at MultiMechanics. It combines the accuracy of FE2 with the computational efficiency of analytical methods. It has been shown to be as accurate as, if not more accurate than, FE2, while running up to 1,000x faster. 

Just as in FE2, this approach allows one to explicitly model microstructural details, capture local progressive damage, and show the coalescence of local scale phenomena into global scale phenomena. In other words, the approach naturally inherits the same amount of flexibility and accuracy found in the widely accepted FEA and FE2 methods. 

With this approach, engineers are able to perform component and subcomponent designs with production-quality run times, and can even perform optimization studies. 

To learn more about TRUE Multiscale technology, download our on-demand webinar by clicking here. 

MultiMechanics and Fortify Collaborate to Make 3D Printing More Predictable

MultiMechanics and Fortify, a Boston-based additive manufacturing company specializing in composite material systems, have announced a strategic partnership to improve the predictability of composite 3D printing. As part of the partnership, Fortify will use MultiMechanics’ flagship product, MultiMech, to predict the structural integrity of printed parts before printing, and to help optimize designs by controlling the fiber orientation throughout the structure. Additionally, R&D will be performed to further enhance Fortify’s print analysis software, INFORM™, and to generate more sophisticated microstructures using their Fluxprint™ process, based on microstructure analyses performed in MultiMech.

Many companies involved in 3D printing of composites struggle to determine how their printed part will behave. With the Fortify print analysis software and Fluxprint printing capabilities, MultiMech will act as a tool to provide the required feedback for closed-loop iterative design of composite parts with optimized fiber orientation. This collaboration will enable users to optimize the design and manufacturing of parts to fit specific applications. 

“We see a definite need for better predictability of 3D printing, and we believe that our application poses a unique case for the MultiMech software. The resulting printed parts take full advantage of the strength-to-weight benefits of composite materials to a degree of resolution and complexity not possible before,” stated Josh Martin, CEO of Fortify. 

“We have a few exciting projects in the pipeline that will benefit from the use of MultiMech, including end use components for industrial UAVs and injection molding tools,” stated Martin. “We are excited to partner with MultiMechanics to push the 3D printing industry forward.” 

As the two startups continue to expand, MultiMechanics and Fortify plan to integrate the MultiMech API in Fortify printers. The simulation capabilities of MultiMech would then be available to users of Fortify, giving engineers full control over the 3D printing process, from design and testing to final production. 

“We are excited to partner with Fortify because both companies offer users the ability to control microstructural design at every single point of a product. Fortify allows that in the real world, while MultiMechanics enables that virtually,” stated Leandro Castro, CEO of MultiMechanics. “This strong synergy removes design constraints to create truly optimized parts.” 

MultiMechanics Releases MultiMech 18.1


MultiMechanics is excited to announce the release of MultiMech 18.1. The new features added will deliver improved ease of use, faster speed, and more simulation capabilities, including: 

  • Brand-new GUI, designed to declutter the workspace and make composite simulation more intuitive. 
  • Tube and pressure vessel optimization tool, enabling engineers designing composite tubes and pressure vessels to predict and mitigate manufacturing-induced variations. 
  • Major ANSYS extension workflow improvements, enabling ANSYS Workbench users to automatically generate microstructures and enhance their structural simulations. 
  • Abaqus extension workflow improvements, enabling Abaqus users to easily perform dehomogenization for Abaqus/Explicit simulations. 
  • Multiscale analysis with temperature, enabling the solution of multiscale and multi-physics problems, including curing and processing of composite materials. 
  • Automated time step refinement/coarsening, for a better user experience. 
  • Simulation of fully anisotropic thermoviscoelastic material models, expanding our library of material models. 
  • Improved speed, allowing engineers to accurately solve complex material challenges even faster. 

“MultiMech 18.1 contains a 7x speed-up in our solver, in addition to several new features that will expand the simulation capabilities of our users,” stated Dr. Flavio Souza, President and CTO of MultiMechanics. “This release truly reflects the significant progress we have made towards our three foundational pillars: accuracy, speed, and ease of use.”

How Rule of Mixtures is Killing Your Composite Design

What is considered to be a “composite” is always changing. Just as there is no single definition, there is also no single analytical method that can safely predict their dynamic behavior. Just as you cannot obtain ideal performance by using a single material throughout an entire car, you can’t expect to use a single analytical method to predict the behavior of all composites. 

Rule of Mixtures is probably the best-known and most widespread method of estimating composite properties. Its ubiquity in composite design circles is also its main problem: Rule of Mixtures has been overused, and applied to cases that do not even come close to respecting its original, simplifying assumptions. If you wish to trust your analysis, it is essential to find out when it is OK, and (more importantly) NOT OK to use Rule of Mixtures. This article will describe what this rule really says, and will show some consequences of abusing this “rule of thumb” for composite behavior. 

The Rule of Mixtures actually comprises two models: Voigt (1889) and Reuss (1929). The first model is normally applied to calculate the elastic modulus in the fiber direction, while the second is used for estimates in the transverse direction. The common assumption in both models is that the composite microstructure can be approximated like this: 

[Image: idealized two-block (Voigt and Reuss) microstructures assumed by the Rule of Mixtures]

So if your composite microstructure is always exactly like those two “blocks,” then Rule of Mixtures is a great option for you, and you don’t even need to bother reading the rest of this post.

However, we all know most composites don’t look like this. Even continuous fiber composites, whose properties are often estimated using Rule of Mixtures, have a different type of microstructure. But this isn’t just about “the looks.” The geometric assumptions affect how the local field variables are estimated. Therefore, using the wrong geometry, even for “similar” composites such as continuous fibers, can bring errors to local stress/strain fields, and consequently affect their properties and failure predictions. 

It doesn’t matter how sophisticated your failure model is. Your results will still be inaccurate because their inputs – the local fields – were wrongly decomposed in the first place. It is like building a fancy house on top of a weak foundation – it is a waste of time and money. 

So let’s understand the root of these inaccuracies. 

In the first model, the assumption is that under uniaxial tension, the strains on the fibers and resins are the same, while the stresses are proportional to each constituent’s moduli. That is not quite true, but luckily, fiber mechanical properties are normally one order of magnitude larger than resin properties. Therefore, it is common for the axial direction response to be fiber-dominated, with the influence of resin being quite small. Under those circumstances, the experimental results show acceptable agreement with the first model equation. 

This explains why Rule of Mixtures can still be reasonably applied in practice to axial response of continuous fibers, despite the geometric inaccuracies of the model even for this given composite. However, the bigger problem lies in the prediction of transverse direction properties, as we discuss below. 

In the second model, the extra assumption is reversed: the stresses on the fibers and the resin are the same, while strains are now inversely proportional to each constituent’s moduli. However, this time you can’t count on luck, as this model is much more inaccurate than the first one. This is partially because the fibers will “protect” portions of the resin from stress while causing other resin areas to be over-stressed, as seen in the below image of the fringe patterns of microscale stress distribution. 
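Written out, the two Rule of Mixtures equations fit in a few lines. The carbon/epoxy values below are assumed, round textbook numbers, chosen only to show how strongly the two estimates differ:

```python
def voigt_axial_modulus(vf, e_fiber, e_matrix):
    """Model 1 (Voigt, iso-strain): E1 = Vf*Ef + (1 - Vf)*Em."""
    return vf * e_fiber + (1.0 - vf) * e_matrix

def reuss_transverse_modulus(vf, e_fiber, e_matrix):
    """Model 2 (Reuss, iso-stress): 1/E2 = Vf/Ef + (1 - Vf)/Em."""
    return 1.0 / (vf / e_fiber + (1.0 - vf) / e_matrix)

# Assumed carbon fiber + epoxy properties (GPa) and fiber volume fraction
e_f, e_m, vf = 230.0, 3.5, 0.6
e1 = voigt_axial_modulus(vf, e_f, e_m)
e2 = reuss_transverse_modulus(vf, e_f, e_m)
print(f"E1 (axial, fiber-dominated)       ~ {e1:.1f} GPa")
print(f"E2 (transverse, matrix-dominated) ~ {e2:.1f} GPa")
```

Note how the axial estimate is dominated by the fiber term while the transverse estimate collapses toward the matrix modulus: this is exactly why the first model survives in practice and the second one fails badly.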

So much for equivalent stresses among constituents – the stresses are not constant, not even throughout the same constituent! But the consequences of that go beyond simply predicting wrong moduli. 

In reality, damage initiates sooner than one would predict by using Rule of Mixtures. Therefore, your Rule of Mixtures-based design is not so safe anymore. 

So what? You can always overdesign, right? Wrong! 

In an ever-competitive market, overdesign is increasingly a last resort. With tight margins and growing competition, even small amounts of overdesign can mean the difference between winning a client and losing one to the competition. Sometimes the business case for employing composites at all is compromised by overdesign. Not surprisingly so, as its cost can be quite high. 

The solution to this problem isn’t simply switching to a more sophisticated analytical method. Again, composites are always changing, and no single analytical method, nor even a collection of them, can predict their dynamic behavior. 

[Image: fringe patterns of microscale stress distribution in a fiber-reinforced composite]

The need for flexibility has never been greater. Just as FEA freed engineers from reusing the same old shapes for their products and brought the flexibility to evaluate designs that never existed before, the method can be extended to bring that same flexibility to the material level, so engineers can best employ the materials of yesterday, today, and tomorrow. 

The Anatomy of a Composite Microstructure

Microstructural modeling is often viewed as an extraneous activity when analyzing the behavior of composites. Many engineers use the “system” properties as the inputs for their part design without considering what contributes to that overall system response. 

In this post, we will look at the anatomy of a composite microstructure to better understand how mechanical and geometric properties come together to influence overall behavior. 


For UD composites, the following items have been shown to impact the overall system response, so let’s take a close look at how to characterize each within a virtual model. 

  • Fiber volume fraction
  • Fiber size and orientation
  • Fiber strength (tension and compression) 
  • Strength and geometric distributions
  • Resin behavior
  • Interface strengths
  • Voids, defects, particles within resin

Fiber Volume Fraction

The fiber content and alignment are the most important aspects of any composite system. Depending on the alignment and orientation of your fibers, your part can behave like steel or like wood. It may even behave like both in different regions of the same part. 


Unidirectional composites are not nearly as complex as injection-molded composites, which can vary in both the alignment and orientation of the fibers. 

In a unidirectional composite, such as a laminate or filament-wound part, orientation still does play a major role. For laid-up composites, the layup pattern and drape angles need to be accounted for in the model. For filament-wound parts, the wind pattern can be imported from winding tools such as FiberGrafix, which will drive the microstructure orientation. Further, a custom specification could be imported to account for voids or residual stresses that result from manufacturing-induced variations. 

Even a 10% difference in orientation within a unidirectional composite can mean the difference between the system failing predominantly at the matrix (in shear) or at the fiber (in tension) – resulting in a 40% reduction in ultimate strength. 
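The sensitivity to orientation can be illustrated with the classical off-axis modulus transform for a UD ply. The ply properties below are assumed, typical textbook values for a carbon/epoxy ply, not measurements of any particular material:

```python
import math

def off_axis_modulus(theta_deg, e1, e2, g12, nu12):
    """Classical off-axis modulus of a UD ply loaded at angle theta:
    1/Ex = c^4/E1 + s^4/E2 + (1/G12 - 2*nu12/E1) * s^2 * c^2
    """
    t = math.radians(theta_deg)
    c, s = math.cos(t), math.sin(t)
    inv = (c**4 / e1 + s**4 / e2
           + (1.0 / g12 - 2.0 * nu12 / e1) * s**2 * c**2)
    return 1.0 / inv

# Assumed UD carbon/epoxy ply properties (GPa, dimensionless Poisson ratio)
e1, e2, g12, nu12 = 140.0, 9.0, 5.0, 0.3
for theta in (0, 5, 10, 15):
    ex = off_axis_modulus(theta, e1, e2, g12, nu12)
    print(f"{theta:2d} deg off-axis: Ex ~ {ex:.1f} GPa")
```

With these assumed numbers, a misalignment of only 10 degrees roughly halves the apparent modulus, which is why small orientation errors can flip the dominant failure mode.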

Modeling Variability and Stochasticity


There is no question that variability exists in composites – both in their geometric configurations and in their properties. Both of these variations influence the system behavior, and thus should be reflected in virtual predictions. 

To capture this variability, stochastic modeling is used. From a geometry perspective, sizes and spatial configurations can be modeled according to statistical distributions, meaning that fiber size and length variations can be captured. 

The spatial distribution of fibers can play a role in the onset of failure, because regions of high stress between inclusions cause strain gradients and premature failure. Depending on your material and application, this can either improve properties (hardening effects) or degrade them (crack onset). 
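As a sketch of how geometric stochastics can be generated, fiber centers and diameters can be sampled at random with an overlap check (the random sequential adsorption approach). All names and numbers below are assumed for illustration:

```python
import math
import random

def generate_rve_fibers(n_fibers, box=1.0, mean_d=0.07, sd_d=0.01,
                        max_tries=20000, seed=0):
    """Random sequential adsorption: drop fibers (as circles) into a
    square RVE, sampling each diameter from a normal distribution and
    rejecting any placement that overlaps an existing fiber.
    Returns (list of (x, y, r), fiber volume fraction)."""
    rng = random.Random(seed)
    fibers = []
    tries = 0
    while len(fibers) < n_fibers and tries < max_tries:
        tries += 1
        r = max(1e-4, rng.gauss(mean_d, sd_d) / 2.0)   # radius from diameter
        x = rng.uniform(r, box - r)
        y = rng.uniform(r, box - r)
        if all((x - fx)**2 + (y - fy)**2 >= (r + fr)**2
               for fx, fy, fr in fibers):
            fibers.append((x, y, r))
    vf = sum(math.pi * r * r for _, _, r in fibers) / box**2
    return fibers, vf

fibers, vf = generate_rve_fibers(60)
print(f"placed {len(fibers)} fibers, fiber volume fraction ~ {vf:.2f}")
```

Changing the seed produces a different, equally valid realization of the same statistical microstructure, which is the basis for studying scatter in predicted properties.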

P(σ) = 1 − exp[−(σ/A)^m]

In addition to fiber orientation, for some composite systems, you want to take into account the strength distribution in your fiber or matrix or at the fiber-matrix interface. These strengths often follow the well-documented Weibull distribution, governed by the equation above. Using experimental data, the Weibull modulus (m) and scaling parameter (A) can be obtained for a given phase of your material and input into the model. To obtain this data for your fibers, for instance, a single filament test could be performed to generate a Weibull distribution curve for fiber tensile or compressive strengths. 
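As a sketch, fiber strengths can be drawn from that Weibull distribution by inverting the CDF. The modulus and scale used below are assumed values, not data for any real fiber:

```python
import math
import random

def sample_weibull_strength(m, a, rng):
    """Inverse-CDF sampling of P(sigma) = 1 - exp(-(sigma/a)**m):
    solving for sigma gives sigma = a * (-ln(1 - u))**(1/m)."""
    u = rng.random()
    return a * (-math.log(1.0 - u)) ** (1.0 / m)

rng = random.Random(42)
m, a = 5.0, 3500.0   # assumed Weibull modulus and scale parameter (MPa)
strengths = [sample_weibull_strength(m, a, rng) for _ in range(20000)]
mean = sum(strengths) / len(strengths)
# Analytical mean of a Weibull distribution: a * Gamma(1 + 1/m)
print(f"sampled mean ~ {mean:.0f} MPa vs analytical "
      f"{a * math.gamma(1.0 + 1.0/m):.0f} MPa")
```

In a stochastic model, each fiber (or each interface segment) gets its own sampled strength, so some regions fail earlier than a deterministic model would predict.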

[Image: Weibull distribution curve of fiber strengths]

Another important and often overlooked feature of composites is the response of the resin. Viscoelastic materials exhibit different behavior at different loading rates and exhibit some energy dissipation and stress relaxation characteristics. Most resin systems have some viscoelastic or viscoplastic behavior. While rate dependency may not be a concern for quasi-static analysis, strain-induced stiffening or stress relaxation may be. 

The image below shows the loading of a UD composite, both along and transverse to the fiber. Notice that along the fiber, you get a primarily linear stress-strain response, as the stress is fiber-dominated. However, transverse to the fiber, we see non-symmetric behavior in loading and unloading. This is a characteristic of a viscoelastic material, where the area under this curve is the energy dissipated by the material.

[Image: stress-strain response of a UD composite loaded along and transverse to the fiber direction]

Capturing this behavior with a single analytical model is a difficult task, and most analytical methods simply ignore it. Using MultiMech, however, it is possible to generate accurate viscoelastic material cards for the different phases of your material. These models readily capture the strain-dependent and energy-dissipating characteristics of the materials. 
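One common representation of such time-dependent stiffness, which a viscoelastic material card might encode, is a Prony series relaxation modulus. The parameters below are illustrative, epoxy-like values, not fitted data:

```python
import math

def relaxation_modulus(t, e_inf, prony_terms):
    """Prony-series relaxation modulus:
    E(t) = E_inf + sum_i E_i * exp(-t / tau_i)
    where prony_terms is a list of (E_i, tau_i) pairs. The modulus
    relaxes from E(0) = E_inf + sum(E_i) down to E_inf at long times."""
    return e_inf + sum(e_i * math.exp(-t / tau) for e_i, tau in prony_terms)

# Assumed long-term modulus and two relaxation terms (GPa, seconds)
e_inf = 1.0
terms = [(1.5, 0.1), (1.0, 10.0)]
for t in (0.0, 1.0, 100.0):
    print(f"E({t:6.1f} s) = {relaxation_modulus(t, e_inf, terms):.3f} GPa")
```

The same series, evaluated under cyclic loading, is what produces the non-symmetric loading/unloading curves and the dissipated-energy loop described above.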

Interface Behavior

A major consideration when designing a composite material is how it will maintain adhesion throughout load and fatigue cycles. The mechanisms of failure in this case, debonding and pullout, can be captured in a virtual model through the use of cohesive zones between the different phases. 
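A minimal sketch of such a cohesive zone is the bilinear traction-separation law: the interface loads elastically up to a damage-onset separation, then softens linearly until full debonding. The stiffness and separation values below are arbitrary illustrative numbers:

```python
def bilinear_cohesive_traction(delta, k0, delta0, delta_f):
    """Bilinear traction-separation law for a fiber-matrix interface:
    linear loading up to delta0, then linear softening until full
    debonding at delta_f, beyond which traction is zero."""
    if delta <= delta0:
        return k0 * delta                     # intact, elastic branch
    if delta >= delta_f:
        return 0.0                            # fully debonded
    t_max = k0 * delta0
    return t_max * (delta_f - delta) / (delta_f - delta0)  # softening branch

# Assumed interface parameters: stiffness, onset and failure separations
k0, d0, df = 1.0e5, 1.0e-3, 1.0e-2
for d in (0.0005, 0.001, 0.005, 0.02):
    t = bilinear_cohesive_traction(d, k0, d0, df)
    print(f"separation {d:.4f} -> traction {t:.1f}")
```

The area under the full curve is the fracture energy of the interface, which is the quantity one would calibrate (or reverse-engineer) against experiments.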

For complex biaxial cases, where shear forces are present, debonding can be a major source of damage onset. As seen in the image below, a 45-degree test on a unidirectional coupon yields a stair-stepping crack that works its way around the fiber-matrix interface and through the matrix. 

[Image: stair-stepping crack along the fiber-matrix interface in a 45-degree coupon test]

While it may be difficult to obtain interface properties experimentally, MultiMech is able to reverse-engineer these properties, given that the other facets of your system are understood. 

Voids and Toughening Particulates

A hot area of research is the effects of voids and toughening particulates within a composite matrix phase. These microscopic features can have tremendously beneficial or detrimental effects depending on their size, distribution, adhesion, and mechanical properties. 


Using MultiMech, these inclusions or voids can be explicitly modeled and their effects can be studied using the same principles used to study the interactions of fiber and resin. 

Putting Together the Pieces

Putting all of this together yields some very interesting results. 

You can have the viscoelastic resin interacting with an orthotropic fiber. The stress gradients between the two phases cause some initial fiber debonding. That debonding then distributes stress into the matrix region, where matrix cracks start to form and propagate. Eventually, the composite system loses stability. 

Validations of models such as this have achieved experimental correlations in the 1% range. But more importantly, these models allow engineers to see the progression of damage within the microstructure. It isn’t a smearing of behaviors, but different, distinct interacting mechanisms, all of which are physically verifiable. 

Conclusion

This post focused mainly on the behavior of unidirectional composites, but a similar story could be told for other types of heterogeneous materials. If you would like more info on how to accurately model the behavior of your advanced material, subscribe to our blog for more posts like this! 

3 Reasons Why You Should Perform Multiscale Finite Element Analysis

Many companies that develop new composite materials are surprised when their product does not perform as expected during the physical testing and certification process. In addition to the many years wasted on developing the material, companies often spend more than $50M on developing and testing a single new material concept. 

Why is it so difficult to predict part behavior? Part failure often originates at the microscale; however, many simulation tools are not capable of linking microscale failure to the macroscale. To accurately predict how a part will behave, true multiscale simulation is required to determine how behavior at the microstructure impacts the global part. 


What are the advantages of performing multiscale FEA? 

  1. Higher accuracy. Performing multiscale analysis allows engineers to input different constituent material properties and get a more realistic prediction of how the composite material will behave. This is because multiscale takes into account damage at the microstructure, which is often where failure initiates, including debonding, fiber rupture, and matrix failure. Simulating the constituent materials at the microscale gives a much more realistic result of how the composite material and part will respond under structural loads and, more importantly, predicting when part failure will occur. 
  2. Predict how composites will fail. Multiscale analysis allows engineers who are performing structural analysis to observe the failure mode within a composite part, which in turn allows them to make design changes to mitigate that failure. For example, in continuous fiber composites, multiscale analysis can ensure that the material is optimally loaded, which is in the direction of the fibers. If a material is failing due to debonding then engineers can iterate the layup orientations to make the structure more conducive to the applied loadings. This ability to see where failure modes are occurring allows engineers to optimize part design for many different families of composites. 
  3. Run a virtual design of experiments. Multiscale analysis can be used to distribute different types of micromechanics throughout a material model. Finite elements can reflect the true nature of materials: elements can be inconsistent, containing flaws such as resin pockets, weakened interfaces, or misaligned inclusions, all of which will reduce the strength of a composite. Multiscale technology allows engineers to customize a wide variety of non-ideal microstructures that represent the more realistic nature of the composite. Because multiscaling can randomly distribute these defects and flaws throughout the part, a simulation can be run with the same number of flaws with the same geometry and mesh, but produce different results by changing the random distribution pattern. This will produce a variety of results, very similar to physically manufacturing these parts and testing them in a lab. This range of results is much more reliable and cost-effective because it allows engineers to gauge the variability in a certain part given the type of defect inserted within it. 
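The "same mesh, same flaw count, different random placement" idea in point 3 can be sketched with a toy Monte Carlo model. The weakest-link coupon below is an illustrative stand-in for a real finite element simulation; all numbers are assumed:

```python
import random
import statistics

def virtual_strength_test(n_elements, n_flaws, seed,
                          base_strength=100.0, knockdown=0.4):
    """Toy 'virtual coupon': identical mesh size and flaw count every
    run; only the random placement of flaws changes with the seed.
    The coupon is a chain of elements, so its strength is that of the
    weakest element (weakened further when flaws cluster)."""
    rng = random.Random(seed)
    strength = [base_strength] * n_elements
    for _ in range(n_flaws):
        i = rng.randrange(n_elements)
        strength[i] *= (1.0 - knockdown)   # each flaw degrades one element
    return min(strength)

# Same model, 200 different random flaw distributions
results = [virtual_strength_test(n_elements=50, n_flaws=5, seed=s)
           for s in range(200)]
print(f"mean strength {statistics.mean(results):.1f}, "
      f"scatter (stdev) {statistics.stdev(results):.1f}")
```

The spread of results plays the role of the test-to-test scatter one would see in a physical lab campaign, at a fraction of the cost.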

While many simulation tools claim to be multiscale, MultiMechanics offers the only TRUE multiscale analysis, meaning it solves for both the global and the local scale simultaneously. Once a microstructure starts to develop damage, the stiffness of its elements will decrease, causing stress concentrations within the adjacent elements. This is a much more realistic representation of how a material will fail and can only be performed using MultiMech. 
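The load-shedding mechanism described above, where a damaged region's reduced stiffness concentrates stress in its neighbors, can be illustrated with a toy parallel-spring model (purely illustrative, not how any commercial solver implements it):

```python
def redistribute_load(total_force, stiffnesses, damage):
    """Parallel load paths sharing a total force: each element's
    effective stiffness is (1 - d) * k, and the force splits in
    proportion to effective stiffness, so damaging one element
    raises the load carried by its neighbors."""
    k_eff = [(1.0 - d) * k for k, d in zip(stiffnesses, damage)]
    k_total = sum(k_eff)
    return [total_force * k / k_total for k in k_eff]

k = [1.0, 1.0, 1.0]                                # three equal load paths
print(redistribute_load(90.0, k, [0.0, 0.0, 0.0])) # undamaged: even split
print(redistribute_load(90.0, k, [0.5, 0.0, 0.0])) # damaged path sheds load
```

Note that the total force is conserved; damage only changes how it is shared, which is what drives progressive failure from one region to the next.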

Subscribe to our blog to learn more about multiscale simulations and MultiMech! 

How to Zoom Into Your Material Microstructure using ANSYS Workbench

Predicting failure in engineered materials is extremely difficult. In composites, damage originates at the microscale and then propagates to the global scale. While Finite Element Analysis is a powerful tool, it is typically limited to the global scale, because the mesh refinement needed to resolve the microscale is not feasible in conventional FE programs. At MultiMechanics, we consider this a true multiscale problem, since damage at the microscale needs to be assessed and relayed to the macroscale. 

What tools are available? 

Most commercial material modeling and simulation tools rely on analytical methods to estimate the stresses and strains within an element. Until recently, analytical methods were much faster than traditional FE2 because they relied on approximations to get results in a reasonable timeframe. These generalizations assume that the material is perfect: a fiber-reinforced composite, for example, is represented as a matrix with all fibers equally spaced from one another, which is far from the actual distribution of fibers. 

The MultiMech for ANSYS tool utilizes the detailed and accurate FE2 theory. With it, an engineer can build a global scale model of a part subjected to different loadings and boundary conditions. The stresses and strains within the part are transferred to a specified microscale finite element model assigned at every integration point of the global scale model. Once the microscale model is solved, that information is passed back to the global scale, enabling much more realistic results than simple approximations. The method is truly representative because microscale damage leads to stiffness reduction in global scale elements, redistributing stress concentrations to neighboring elements. 

How does it work? 

MultiMech is integrated within the ANSYS native user interface, making it very simple to convert an existing ANSYS model into a full TRUE multiscale model. After the MultiMech extension is installed, the option to designate a material as a “MultiMech Microscale RVE (UPF)” is available. After this MultiMech material is created, a user has the ability to create a microstructure model and define its constituent material properties, all within ANSYS. 


The microstructure generation is located in ANSYS Mechanical. After the extension is installed, a MultiMech toolbar will be present in ANSYS Mechanical, as seen in the image below. Selecting the “Add MultiMech Material” button will populate the model tree with a MultiMech section. 

[Image: MultiMech toolbar in ANSYS Mechanical]

This is where all the MultiMech options and utilities are located. A user has the ability to import complex fiber orientations, define the microstructural outputs, and create, edit, or import a microstructural model. 



When creating a new microstructural model or editing an existing model, the MultiMech GUI window will appear and give the user the ability to make any changes. 


Once the microstructural model is ready and the MultiMech material has been assigned to the part geometry, selecting “Solve” will run the full multiscale simulation. The part level can still be post-processed the same as any other ANSYS model, but the MultiMech extension also gives the ability to post-process the microstructure. The user can then observe how the microstructure behaves and rigorously investigate where damage originates at the microscale. From here, a structural analyst can try different geometries or orientations to improve overall part performance. 


Conclusion

MultiMech for ANSYS is a tool that gives engineers and scientists the ability to observe microstructure properties. These multiscale analyses used to be too computationally expensive, as the solve time and memory were not practical for commercial applications. With MultiMech for ANSYS, multiscale analysis is now feasible and allows engineers to make design choices based on the high fidelity multiscale simulation that pinpoints the origin of failure at the microstructure. This tool also allows material manufacturers to virtually test new materials without the cost and time necessary to manufacture physical prototypes and destructively test different loading scenarios. Finally, MultiMech, being seamlessly integrated with ANSYS, allows structural engineers to easily run multiscale analyses and quickly derive value from this simulation method. 

Solvay and MultiMechanics Partner to Reduce Material Development and Certification Cost

Improving efficiency, lowering emissions, and decreasing fuel consumption are global trends that are currently transforming the transportation industry. Lightweighting by replacing metal components with lighter composite materials is one approach to achieving these goals. However, as structural designs have become more complex and demanding, new composite material development has struggled to keep up, thus slowing the adoption of lightweighting.

For a material supplier, developing a new structural aerospace material can take five years and up to $50M, while the cost to the OEM of certifying the material can be even higher. These time and cost constraints, largely attributable to the material testing process, have stalled new material development across the entire industry. Supplementing this testing with computational analysis is a goal for many, but it is complicated by the unique behavior and challenges presented by composites. 


Problem

Scientists at Solvay knew that simulation would be a critical tool for developing new materials faster and at lower cost. They needed a platform that could process inputs such as fiber volume fraction, fiber orientation, interface effects, resin ductility, and material variability. No existing commercial simulation tool could handle the number of explicit inputs the team required to properly define and test their new materials. 

Additionally, Solvay scientists needed a tool that could quickly provide insight into how changes at the constituent material level affect the overall mechanical performance at the composite level. Physical testing of new composite materials is often cumbersome, and most material designs do not behave as expected during physical testing due to competing mechanisms and complex failure modes. Solvay was looking for a solution to shorten the feedback loop from new constituent synthesis to composite property determination and did not want to waste time and resources on developing new composites that would ultimately fail the difficult and cumbersome certification process. 

Solution

Matthew Jackson, Senior Research Scientist at Solvay, and his team spent years looking for a simulation tool that would meet their requirements. MultiMech was the one tool that gave the team a deeper understanding of their materials and the ability to go from constituent material properties to predicted part-scale composite properties. The software enabled the team to test the effect of the inputs they changed during the material development process. It accurately predicted composite failure by accounting for multiple competing damage mechanisms, including fiber rupture, resin cracking, and fiber-resin debonding.

[Figure: Realistic damage modeling]

In addition to damage mechanisms, MultiMech also captured rate dependency of materials, so the team could fine-tune a given material to perform as expected under very different loading scenarios. 

Most importantly, MultiMech gave the Solvay team insight not only into when the material would fail, but also why. While most tools simply approximated the results, MultiMech gave unique, accurate insight into exactly why damage would occur. This gave engineers the ability to not only successfully create new materials, but also to improve them over time. 


“When it comes to multiscale analysis and bottom-up prediction of complex materials, MultiMechanics has no peers,” said Jackson. 

Benefits

By implementing MultiMech, Solvay estimates that the time and cost of developing and certifying new materials could be reduced by 40%. The team can now send new material ideas to be physically tested with greater confidence that their new design will pass testing. They also have insight into exactly how, when, and where damage will occur, and what they can do to mitigate it. 

“Our Composite Materials Global Business Unit carefully reviewed all modeling solutions and, by far, MultiMechanics provided the best results,” said Nicolas Cudre-Mauroux, Chief Technology Officer at Solvay. “The accuracy and speed afforded by MultiMechanics, and its efficient integration with commonly used commercial finite element software packages, is changing the way we develop new materials and interact with our customers.” 

A Brief History of Finite Element Analysis: Looking Forward

In our first two posts (Part I and Part II), we talked about the history of FEA and hinted at what the future may hold. 

From a hardware standpoint, companies are continuing to uphold Moore’s law as they target 500-fold increases in performance by 2018. However, the speed of a processor does not necessarily improve the accuracy of a finite element solution; it only helps you get your inaccurate results faster. 

This lack of accuracy is being magnified as new, advanced materials are introduced into our toolbox. For composite materials, such as fiber-reinforced plastics, the damage initiates at the microstructural level due to the inconsistent material properties and interactions between constituent materials. Thus, in order to predict a composite’s behavior using FEA, you need to explicitly model these interactions. 

To many engineers, the thought of creating an accurate finite element model of a microstructure seems either foreign or prohibitively labor-intensive, not to mention that it would likely take days to compute even on the world’s most powerful processor. For this reason, these engineers often treat their composites as a homogenized orthotropic medium and account for the uncertainty through expensive destructive testing and/or the age-old technique of over-design. 

In very simple terms, it is easy to quantify the consequences of over-design for even a single part. Assume that a car suspension part weighs one pound when perfectly designed and that the material costs $16/lb (as high-performance carbon fiber does). Plotting the cost of the excess material for different factors of safety shows the penalty growing steadily with the number of manufactured parts. 

[Figure: Cost of over-design vs. number of units manufactured]

After manufacturing only 2,000 units, the cost of extra carbon fiber already exceeds $25,000. When you look at the average production rates for a single part in the automobile industry (sometimes over 10 million units of a single part per year), the cost of over-design is around $128 million per year.
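The arithmetic behind these figures can be sketched in a few lines. This is an illustrative script, not part of any MultiMech workflow; the factor of safety of 1.8 (0.8 lb of excess material per part) is an assumption chosen to reproduce the dollar amounts quoted above.

```python
# Cost of over-design: excess material cost vs. number of units manufactured.
# Assumed inputs (hypothetical, for illustration): a part that weighs 1 lb
# when perfectly designed, material at $16/lb, and a factor of safety of 1.8.

IDEAL_WEIGHT_LB = 1.0         # perfectly designed part weight
MATERIAL_COST_PER_LB = 16.0   # high-performance carbon fiber

def excess_cost(units, factor_of_safety):
    """Dollar cost of the extra material carried by every unit due to over-design."""
    extra_lb_per_part = IDEAL_WEIGHT_LB * (factor_of_safety - 1.0)
    return units * extra_lb_per_part * MATERIAL_COST_PER_LB

print(round(excess_cost(2_000, 1.8), 2))       # 25600.0  -> already over $25,000
print(round(excess_cost(10_000_000, 1.8), 2))  # 128000000.0 -> ~$128M per year
```

The cost is linear in unit count, which is exactly why over-design that looks harmless on a prototype becomes ruinous at automotive production volumes.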

The point is that the added cost of accurate FEA becomes negligible as the number of manufactured units increases. When it comes to improving FEA accuracy of a composite part, the three competing approaches are direct numerical simulation, rule of mixtures, and multiscale simulation. 
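Of the three approaches, the rule of mixtures is the cheapest and crudest: it reduces the microstructure to volume fractions and closed-form bounds. A minimal sketch of the classical Voigt and Reuss estimates follows; the constituent stiffnesses and volume fraction are typical textbook values for carbon fiber and epoxy, not data from any specific material.

```python
# Classical rule of mixtures: estimate composite stiffness from constituent
# properties alone. Values below are generic textbook numbers (assumptions).

E_FIBER = 230.0   # GPa, carbon fiber
E_MATRIX = 3.5    # GPa, epoxy resin
V_FIBER = 0.6     # fiber volume fraction

def voigt_longitudinal(e_f, e_m, v_f):
    """Upper bound: equal-strain assumption, loading along the fibers."""
    return v_f * e_f + (1.0 - v_f) * e_m

def reuss_transverse(e_f, e_m, v_f):
    """Lower bound: equal-stress assumption, loading transverse to the fibers."""
    return 1.0 / (v_f / e_f + (1.0 - v_f) / e_m)

print(round(voigt_longitudinal(E_FIBER, E_MATRIX, V_FIBER), 1))  # 139.4 (GPa)
print(round(reuss_transverse(E_FIBER, E_MATRIX, V_FIBER), 1))    # 8.6 (GPa)
```

Because these formulas never see the actual fiber arrangement, they cannot capture local stress concentrations or damage initiation, which is precisely the gap DNS and multiscale simulation address.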

At MultiMechanics, we believe that a multiscale simulation approach is bridging the gap between the simplified FEA of the past and the ideal-but-costly “direct numerical simulation.” For example, consider a composite bar with a single layer of cylindrical fibers, periodically placed along the bar. 

[Figure: Composite bar with a single layer of periodically placed cylindrical fibers]

When examining tension in this beam, transverse to the fiber direction, you see a clear separation between the DNS and rule of mixtures results. Additionally, as the number of cells in the DNS solution increases (i.e., as the FE model grows in size), it converges toward the computationally modest multiscale results! 

[Figure: Multiscale vs. DNS results]

Most importantly, when these simulations are run on a 3.0 GHz Dell workstation, the multiscale simulation is over 140 times faster than the comparably accurate DNS solution. 

Case                                 Time
Rule of mixtures                     40 seconds
DNS, 5 cells                         20 minutes
DNS, 25 cells                        6.5 hours
DNS, 100 cells                       6 days
MultiMech multiscale (4 processors)  53 minutes
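As a sanity check on the “over 140 times faster” claim, the run times listed above can be normalized to seconds and compared (a small illustrative script; the case labels are arbitrary):

```python
# Normalize the reported run times to seconds and compare the finest DNS case
# against the MultiMech multiscale run.

MIN, HOUR, DAY = 60, 3600, 86400

times_s = {
    "rule of mixtures": 40,
    "5-cell DNS": 20 * MIN,
    "25-cell DNS": 6.5 * HOUR,
    "100-cell DNS": 6 * DAY,
    "multiscale (4 processors)": 53 * MIN,
}

speedup = times_s["100-cell DNS"] / times_s["multiscale (4 processors)"]
print(round(speedup))  # 163 -> consistent with "over 140 times faster"
```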

While the past 50 years of Finite Element Analysis have been bright and prosperous, new and innovative approaches must be adopted to ensure its utility for the next 50. We believe that a multiscale simulation approach allows engineers to take advantage of advances in computing, accurately model their novel material microstructures, and stay within the practical DOF limits set within industry. 

A Brief History of Finite Element Analysis – Part II

As we mentioned in Part I, the history of Finite Element Analysis is deeply intertwined with the evolution of computing. It seems only fitting that the FEA software used to design the world’s most cutting-edge products should have the most cutting-edge computational techniques at its disposal. From the early punch-card days of the 1960s through the 2000s, FEA companies have found unique ways to take advantage of the ever-changing computer landscape. 

GUIs – “1984 won’t be like 1984” 

1983 – The Apple “Lisa” was released. Named after Steve Jobs’s daughter, the computer was a commercial flop, but it paved the way for the graphical user interface and the industry-changing Macintosh. 

1985 – The same year that Microsoft unveiled the Windows OS, AutoCAD 2 was released. It was designed to run on “microcomputers,” including two of the new 16-bit systems, the Victor 9000 and the IBM Personal Computer (PC). This version consisted of over 100,000 lines of C code and had a list price of $2,000. 

1985 – Altair Engineering was founded in a garage in Detroit, MI. Its first product was HyperMesh, followed by the award-winning FE-based topology optimization tool, OptiStruct. A product Altair would later acquire, the RADIOSS finite element solver, required 20 hours in 1987 to solve a 20,000-element crash simulation. Fast forward to 2018, and RADIOSS can parallelize a 15-million-element crash simulation across 128 cores and deliver results in 5 hours, a roughly 3,000-fold increase in effective throughput (from about 1,000 to 3,000,000 elements per hour of solve time). Most of this gain can be credited to the doubling of computational speed roughly every 18 months. 

1991 – NEi Software was founded as Noran Engineering, Inc. Their product, NEi Nastran was a spinoff of the original MSC/NASA codebase, but with a GUI and improved performance. 

64 bits and Parallelization

1999 – To improve simulation accuracy, Direct Numerical Simulation (DNS) was introduced as a way to capture the heterogeneity of composite structures by modeling all relevant details of a part as unique finite elements. 

2004 – ANSYS became the first simulation software company to solve a structural analysis model with more than 100 million degrees of freedom. The company used an SGI Altix server with six 64-bit Intel Itanium 2 processors to solve a 111-million-DOF structural analysis problem in just 8.6 hours of solver time. 

[Figure: Transistor count over time]

ANSYS achieved this improved computing performance by running their code on a 64-bit high performance cluster, which is now the standard for all production FEA implementations. 


ANSYS’s improvements set a new standard in performance that all other software vendors would soon follow. 

What’s Next? 

From a hardware standpoint, several vendors and even governments continue to push the limits of high performance computing. SGI, the company that supplied ANSYS with the machine for its record-breaking simulation, plans a 500-fold increase in performance by 2018, reaching one exaflop. 

But the speed and availability of processors do not necessarily improve the accuracy of an FEA solver; they only help you get inaccurate results faster. For any practical problem involving heterogeneous materials, the mesh sizes and computational resources required for an accurate direct numerical simulation would exceed the capacity of the most powerful computers available. 

In our next post, we will discuss multiscale simulation, a best-of-both-worlds alternative to brute-force DNS.