DLR Advances Crack Prediction in Ceramic Matrix Composites Using MultiMech

The German Aerospace Center (DLR) has selected MultiMechanics, a developer of multiscale modeling and simulation software for advanced materials, to perform advanced crack prediction during the manufacturing of ceramic matrix composite (CMC) components.

The simulation group at DLR’s Institute of Structures and Design in Stuttgart has recently begun to model pyrolysis, an intermediate step in the production of ceramic matrix composites. In this process, carbon fiber reinforced polymers (CFRP) are heated to temperatures of up to 1,600 degrees Celsius and then cooled down. The team wants to understand how these temperature changes affect the material’s microstructure and, consequently, its mechanical behavior. This analysis is crucial for future CMC components produced by DLR, including rocket nozzles and thermal protection systems for re-entry vehicles, among other applications.

“MultiMech allows us to model microstructural cracks and determine how they would affect the overall composite part,” states Neraj Jain, Group Leader in Simulation and Engineering at the Department of Ceramic Composites and Structures at DLR. “Thanks to it, we can actually see where a crack is developing, how the crack will change our material, and how it will affect the final microstructure of the material.”

Using MultiMech, the engineers can vary the interface strength between fiber and ceramic matrix and evaluate how this parameter would influence crack initiation and propagation. This insight enables them to optimize the material and tailor it according to microstructural mechanics – a crucial step to more accurately understanding how a part will behave as a whole.

“DLR is a global leader in aeronautics and space research and we are extremely proud to support their advancement in understanding and designing CMC parts,” stated Dr. Flavio Souza, President & CTO of MultiMechanics. “Our work at MultiMechanics for the last ten years has been dedicated to accurately connecting microstructural behavior of advanced materials to the overall part performance and its manufacturing, and we are pleased to see that our users are seeing the benefits and high value of TRUE multiscale modeling.”

“MultiMech’s efficient integration with our other FEA tools like Abaqus and ANSYS increases our productivity and brings us closer to our aim to optimize CMC material virtually,” stated Jain. “The way we are able to conduct crack modeling and multiscale simulation holds lots of opportunities in many projects to come.”

About DLR

The German Aerospace Center (DLR) is the national aeronautics and space research center of the Federal Republic of Germany. Its extensive research and development work in aeronautics, space, energy, transport, digitalization and security is integrated into national and international cooperative ventures. In addition to its own research, as Germany’s space agency, DLR has been given responsibility by the federal government for the planning and implementation of the German space program. DLR is also the umbrella organization for one of Germany’s largest project management agencies. At the Institute of Structures and Design in Stuttgart, DLR researchers investigate – among others – the development of CMC components for high-temperature and demanding environment applications.

Filament Wound Pressure Vessels & Space Shuttle Disasters

One of the most visible symbols of American ingenuity and exploration over the past 50 years has been the progress and perseverance of the National Aeronautics and Space Administration (NASA). It is the epitome of human curiosity. It is a constant reminder that we live with unknowns only as long as technology limits our ability to explore, and that sometimes we try to explore those unknowns before the technology is ready.

A total of 18 astronauts (12 of whom were Americans) have perished on space flight missions, and more have lost their lives while preparing or testing for them. Regardless of how bright your engineering team is or how much money you have, complex systems often find their demise at the hand of a simple problem.

In 2003, the Space Shuttle Columbia disintegrated on re-entry, killing all seven astronauts aboard. The accident was caused by a piece of foam insulation that broke off from the external tank and struck the Orbiter’s left wing.

After the accident, an investigation was launched into the culture of safety at NASA. The results were disturbing: 2005 estimates put the probability of a catastrophic failure over the life of an Orbiter (an estimated 100 missions) at one in five. The investigation put it succinctly:

“Organizations that deal with high-risk operations must always have a healthy fear of failure – operations must be proved safe, rather than the other way around. NASA inverted this burden of proof.” Thus, change in philosophy was mandated: the design and its operations must be proven to be safe.

One of the studies commissioned after this report was an analysis of the Orbiter’s composite overwrapped pressure vessels (COPVs). Each Orbiter contains 24 Kevlar COPVs holding helium and nitrogen at pressures up to 5,000 psi.

Starting in the 1960s and ’70s, the potential of Kevlar-reinforced vessels was recognized at the NASA Lewis Research Center. Early research into filament winding reported weight savings of 25%-30% over comparable all-metal vessels. Today, filament wound pressure vessels are essential to numerous NASA power and environmental systems. Most of the older vessel overwraps are made of Kevlar®-49/epoxy composites, while the newer vessels have carbon/epoxy overwraps. Incorporating COPVs instead of traditional all-metal tanks reduced the Orbiter’s weight by 752 pounds.

Early on, the primary failure mode of the vessel was considered to be fatigue of the thin metal liner, either in the parent material or in the welds. The design requirement for the liner was Leak Before Burst (LBB). By definition, LBB requires that material defects or fatigue cracks in the liner manifest themselves in such a way that if a crack grows through the liner, it continues growing slowly enough for the gas to escape gradually rather than burst the vessel.

However, baseline tests produced the following preliminary findings:

  • Kevlar/Epoxy (K/E) COPVs are susceptible to a stress-rupture failure mode that can result in an explosive release of stored energy
  • Risk is a strong function of fiber stress level and time at pressure (adjusted for temperature)
  • The likelihood of stress-rupture failure in Orbiter COPVs is significantly higher than previously predicted
  • Reliability models and the supporting database need to be updated to reflect new knowledge as it is gained

Given the new NASA safety regulations, COPVs underwent extensive physical testing.

The original program requirement was 100 missions in 10 years. Cycle testing was performed on test articles to show that each vessel liner could survive four times the loading anticipated over 100 missions. The full results of the study are beyond the scope of this post, but twenty tanks were experimentally cycled in fatigue. For the most part, LBB was observed; compressive (buckling) failures occasionally occurred but were deemed the result of unrealistic conditions. Several important conclusions were drawn from this work, and if you are interested in composite pressure vessels, they are worth a read.

Bonus 1:

When all design and maintenance costs are taken into account, the final cost of the Space Shuttle program, averaged over all missions and adjusted for inflation, was estimated at $1.5 billion per launch, or $27,000 per pound. This should be contrasted with the originally envisioned cost of approximately $657 per pound (adjusted for inflation to 2013). By comparison, Russian Proton expendable cargo launchers (the Atlas V rocket’s counterpart) are said to cost as little as $110 million per launch (approximately $2,300 per pound).

Bonus 2:

Most of the smoke billowing up from a space shuttle launch is not exhaust. It is water vapor from the pool of water under the shuttle designed to absorb the acoustic shock waves that could otherwise tear the shuttle apart.

Physical Testing vs Multiscale Virtual Simulation of Advanced Materials

The transportation industry is increasingly adopting advanced materials, whose higher specific strength offers improved efficiency, lower emissions, and decreased fuel consumption compared with traditional metal components. However, the transition to advanced materials has been slowed as structural analyses have become more complex and demanding.

Developing a new structural aerospace material can take a supplier five years and up to $50 million, while the cost to the OEM to certify the material can be even higher. Supplementing physical testing with computational analysis is a goal for many, but it is complicated by the unique behavior and challenges presented by composites.

Traditional Material Design Iteration

Due to limitations presented by popular commercial tools, most companies still undergo an extensive “build and destroy” process with every new material they develop. There are several issues with this approach, all of which have a negative impact on the cost and time to take a new product to market:

  • An unknown number of iterations are built and destroyed to evaluate countless manufacturing and performance scenarios
  • Each iteration results in added time and cost, wasting company resources and materials
  • Uncertainties in formulation exist due to manufacturing variability; several samples are needed
  • Design iterations are based on parametric studies and materials engineering experience

Design Iteration with Analytical Simulation Software

Virtual testing of advanced materials has existed for nearly a decade. The most popular tools were adopted quickly by transportation companies because of their fast results. However, these tools have not provided accurate enough results for companies to drastically reduce physical testing, which has slowed the adoption of advanced materials.

These tools are insufficient because they sacrifice accuracy to provide a quick, easy result. Their outputs are oversimplified and based on manufacturers’ physical testing data, leading to unexpected failures and overdesign of composite parts and systems.

Design Iteration with Multiscale Simulation Software

True multiscale simulation software extends the flexibility and robustness of the Finite Element Method down to the microstructural scale and connects behavior at that level to the overall behavior of the macroscale. Many companies can “multiscale” on some level, but MultiMechanics’ flagship product, MultiMech, is the only commercially available software platform that links micro- and macroscale behavior.

Designing composites using TRUE multiscale software allows for much more realistic damage modeling, giving engineers true visibility into the behavior of their composite products. Damage scenarios on the microstructural level such as fiber rupture, resin cracking, and fiber-resin debonding can be accurately modeled and accounted for, allowing engineers to optimize their design before producing a physical prototype.

TRUE multiscale simulation is estimated to reduce the time and cost of developing and certifying new materials by 40%. It also gives engineers insight into exactly how, when, and where damage will occur, and how they can mitigate it.

To learn how MultiMechanics customers are reducing the time and cost to develop and certify new materials by as much as 40%, click here to read a case study.

MultiMechanics and Opterus R&D Partner to Create Advanced Material Models in Response to NASA Solicitation

MultiMechanics and Opterus R&D, a Colorado-based satellite component manufacturer specializing in the design and testing of deployable space structures, have announced a collaboration in response to a NASA Solicitation seeking further exploration of the use of thin-ply High Strain Composites (HSCs) for space applications. These advanced composites have the potential to reduce weight and increase performance of space systems, but they exhibit behavior that is difficult to predict using currently available Finite Element Analysis (FEA) tools.


What do the Model T and F-150 have in common? Advanced Materials!

Henry Ford’s Model T just celebrated its 110th birthday. Introduced on October 1, 1908, the Model T was the first and most popular mass-produced automobile in the world. By 1921, Ford Model Ts accounted for over half of the world’s automobile production.


The car was lauded for the revolutionary process by which it was manufactured (the moving assembly line), but the automobile itself was an engineering marvel. Let’s take a look back at this industry-inventing automobile and the technologies and materials that made it possible. 

It Starts With A Vision


Henry Ford’s goal was to make an automobile for the great masses.  For this reason he needed to make the car as cheap, durable, rugged, and fuel-efficient as possible – truly a challenging optimization problem.

The Model T was introduced in 1908 for $825, which was a significant amount considering a teacher’s annual salary at the time was $850. Over the course of 19 years,  dramatic improvements in manufacturing efficiency and supply chain integration lowered the cost to only $280.

Material Innovation

The Model T weighed about 1,200 pounds and had a 100-inch wheelbase and a ten gallon gas tank.  It got about 13 to 21 miles per gallon of gasoline, but because gasoline was not widely available in the early days of automobiles, the car was designed to run on gas, grain alcohol, or ethanol.

Despite its nickname “The Tin Lizzie”, a Model T’s bill of materials included iron, wood, and (for structurally crucial components) steel, but no tin. The fenders and most engine components were made of high-strength iron. The body panels were composed of a wooden framework to which cheap sheet steel was fastened. But, undoubtedly, the most groundbreaking material in the car was Vanadium Steel.


Henry Ford searched the world for the best materials he could find at the cheapest cost. In 1905, during a car race in Florida, he noticed that the parts of a wrecked French sports car were lighter but stronger than what was being used in mass-produced American vehicles. This French concoction was known as Vanadium Steel.

Ford recognized that, despite its higher price tag, the steel’s high tensile strength (nearly three times greater than cheaper, lower-grade steels) and its ability to be easily machined would allow him to make a stronger, lighter, better performing car. As part of the pre-production process for the Model T, Ford imported an expert who helped the company build a steel mill and mass-produce the metal.  

Vanadium Steel was eventually used in nearly all of the Model T’s highly stressed parts, including the crankshaft, forged front axle, wheel spindles, springs, and transmission gears (which ran in an oil bath). Steel was a critical, albeit very heavy, component.

Same Problems, Different Day

Fast forward 100 years: on modern vehicles, most of the weight still comes from different varieties of steel. In 2007, the average car weighed 3,000 pounds and contained 2,400 pounds of steel.  The average truck weighed 4,000 lbs and contained 3,000 lbs of steel. In cars, steel is used to create the underlying chassis, exhaust pipes, door beams, roofs and body panels.

But due to tighter fuel-efficiency regulations, the trend is toward lighter, more expensive materials. Just as Henry Ford paid a premium for performance, engineers at today’s car companies are forced to make the same crucial decisions.

Ford at it Again


In 2015, Ford overhauled its iconic F-150 pickup, replacing the truck’s steel body panels with lightweight, corrosion-resistant aluminum.

The switch to aluminum allowed the F-150 to shed 400 pounds (a 10% weight reduction). For every 10 percent reduction in vehicle weight, fuel economy increases 5-7 percent; the 2015 F-150 exceeded that, improving fuel economy by 7.4%.
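The arithmetic above can be sketched as a quick back-of-the-envelope check. The 5-7%-per-10% rule of thumb and the truck figures come from the text; the helper function itself is hypothetical:

```python
# Quick sanity check of the weight-vs-fuel-economy figures quoted above.
# Rule of thumb from the text: every 10% weight reduction improves fuel
# economy by roughly 5-7%.

def expected_mpg_gain(weight_reduction_pct):
    """Return the (low, high) expected fuel-economy gain in percent.
    Hypothetical helper; the 5-7%-per-10% rule comes from the article."""
    factor = weight_reduction_pct / 10.0
    return (5.0 * factor, 7.0 * factor)

curb_weight_lbs = 4000   # average truck weight cited earlier in the text
savings_lbs = 400        # weight shed by the aluminum body panels

reduction_pct = 100.0 * savings_lbs / curb_weight_lbs   # -> 10%
low, high = expected_mpg_gain(reduction_pct)
print(f"{reduction_pct:.0f}% lighter -> expect a {low:.1f}-{high:.1f}% MPG gain")
# The reported 7.4% improvement for the 2015 F-150 sits at the top of
# that expected range.
```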

Because of the higher cost of aluminum, this 400 lb reduction adds about $725 to the truck’s raw material costs. However, thanks to advanced recycling practices spurred by former CEO Alan Mulally, about $280 of this additional cost is recovered through recycling.

Another pleasantly surprising outcome of the lightweighting effort is that it allowed Ford to scale back parts that had been introduced to handle the truck’s excess weight. For instance, the F-150 now offers a smaller but powerful 2.7-liter EcoBoost engine option, smaller brakes, and a lighter suspension. These updates further reduced costs and improved fuel economy.

“Black” Aluminum

Today’s cars also use tremendous amounts of plastic. Plastics make up about 50% of a new car’s construction by volume, but mostly in non-structural components. This is unsurprising, because traditional plastics are durable, cheap, and can be molded into just about anything.

Another recent trend is the use of high-cost, high-performance fiber-reinforced plastics (pejoratively referred to as “Black Aluminum”). While the F-150 shed a lot of weight, it still uses heavy steel for many components like the cage and chassis. Daring companies like Hyundai are looking into carbon fiber car frames, which would reduce frame weight by 70% and vehicle weight by 30% without compromising rigidity or safety.


Bonus Fact: Henry Ford, a bastion of efficiency, used wood scraps from Model T production to make charcoal, which he then sold. Originally named Ford Charcoal, it was later renamed Kingsford Charcoal.

MultiMechanics Joins Siemens’ Solution Partner Program to Deliver Multiscale Capabilities to Siemens’ Simcenter 3D Users

MultiMechanics is excited to announce that we will integrate our flagship product, MultiMech, with Siemens’ Simcenter™ portfolio. This partnership brings the ability to zoom into the material microstructure and connect material behavior to overall part performance directly within Siemens PLM Software’s Simcenter portfolio.


This partnership will enable Siemens’ Simcenter 3D users to easily enhance their models with MultiMechanics’ multiscale capabilities, helping to bring the ultimate level of understanding of how materials behave and fail, and how that affects part performance. This opens up new opportunities for optimization at the material and part levels.

“We have received a significant number of requests to integrate MultiMech with NX Nastran and are excited to join the Siemens PLM Software partner community and bring TRUE Multiscale technology to users,” stated Dr. Flavio Souza, President & CTO of MultiMechanics. “The strength of our two organizations working together can bring significant value to engineers.”

As part of the agreement, MultiMechanics will develop a MultiMech Connector for Simcenter 3D and couple the MultiMech solver natively with NX™ Nastran® software. The integrated solution will enable Siemens PLM Software users to easily create or convert existing projects into multiscale models and account for material microstructural design variables in the Simcenter 3D platform.

“We are pleased that MultiMechanics has joined the Siemens Partner Program as a Software & Technology Partner. MultiMech provides our customers complementary solutions that will add value to their simulation software investment,” stated Willy Bakkers, vice president of Simulation & Test Solutions for Siemens PLM Software.

MultiMechanics and Siemens PLM Software will collaborate closely to provide advanced capabilities leveraging the unique features of their respective products, as well as fully integrate with other Siemens PLM Software products such as Simcenter™ Samcef® software and the Fibersim™ portfolio of software for composites engineering.

Note: Fibersim and NX are trademarks or registered trademarks of Siemens Product Lifecycle Management Software Inc. or its subsidiaries in the United States and in other countries. Simcenter is a trademark or registered trademark of Siemens Industry Software NV or any of its affiliates. NASTRAN is a registered trademark of the National Aeronautics and Space Administration. Samcef is a registered trademark of SAMTECH S.A.

MultiMechanics Selected as Finalist for Mondial.Tech Startup Award, CAMX Award, and Award for Composites Excellence

MultiMechanics has been selected as a finalist for three industry awards: the Mondial.Tech Startup Award, the CAMX Award, and the Award for Composites Excellence (ACE). These awards recognize technologies that are shaping the future of composites and transportation.

Press Release

The Mondial.Tech Startup Awards recognize pioneering technologies in the automotive industry. MultiMechanics has been selected as a finalist for the Material & Weight Reduction category along with seven other leading startups, which will be judged by Karl-Heinz Fueller of Daimler and Jean-Claude Le Four of Renault. The pitch competition will be held in Paris, France on October 3rd and the winners will be announced shortly after. 

The CAMX and ACE Awards are presented at the CAMX event, taking place October 15-18 in Dallas, Texas. The CAMX Award recognizes cutting-edge innovations and innovators that are shaping the future of composites and advanced materials, while the ACE Award is hosted by ACMA and is given in recognition of outstanding achievement and innovation in technology, manufacturing and product development.

“We are honored to have been selected as a finalist for these three extremely prestigious awards,” said Dr. Flavio Souza, CTO of MultiMechanics. “The work we are doing at MultiMechanics is helping to speed up the adoption of advanced materials, and I am very proud to see our team’s hard work being recognized.”

Advanced Composites Spotlight: Long Fiber Thermoplastics

Much attention has been given to the advent of long fiber thermoplastics due to their desirable recycling, manufacturing, and mechanical properties. However, the question remains as to whether current analysis tools and techniques can safely capture this material’s behavior. 


A thermoplastic is a polymer (or resin material) which becomes pliable or moldable above a specific temperature and returns to a solid state upon cooling. This is in contrast to thermosetting plastics which are irreversibly cured. 

When compared to other materials, thermoplastics stack up well: 

  • 60% lighter and 600% stiffer than steel 
  • 30% lighter than aluminum
  • 200% tougher than thermoset composites
  • 60% less scrap during production than sheet goods
  • Recyclable and reusable

A downside is that they are generally more expensive: epoxy thermosets typically cost $1-10/lb, while PEEK thermoplastics can cost anywhere between $10-100/lb.

One of the most promising uses of this thermoplastic resin is as a matrix material for long, discontinuous fiber advanced composites. Compared to short fiber composites and thermosets, long fiber thermoplastics (LFTs) offer better mechanical properties in terms of elastic stiffness, strength, creep and fatigue endurance, and crashworthiness. 

When aligned along the loading axis, long fiber thermoplastics can withstand up to 70% of the load of continuous fiber composites at a fraction of the manufacturing costs. As techniques improve, the ability to “align” fibers will likely improve and manufacturing costs will continue to decrease. 


LFTs are also favorable because existing injection molding machines used in industry today can be adjusted to mass-produce LFT parts. 

For these reasons, some believe that thermoplastics could be the “missing link” between exotic advanced composite applications and the mass market. 

Whether this is just marketing hype remains to be seen, but what we do know is that with the advent of this new blend of composites, empiricists need to come up with a whole new set of theories to capture and predict LFT behavior.

It is a rarely advertised fact that most composite analysis tools rely on simplifying assumptions that don’t play nicely with long fiber composites. Analytical theories that assume inclusions are “oblong” or “spherical” in shape break down for long fiber composites, because the real material deviates too far from the idealized geometry. And don’t even get us started on theories that rely on the rule of mixtures.
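For readers unfamiliar with the rule of mixtures, a minimal sketch shows why it is both tempting and limited: it reduces a composite’s stiffness to a one-line, volume-weighted average that ignores fiber length, orientation, and fiber ends entirely. The material values below are illustrative, not from the article:

```python
# Voigt rule-of-mixtures estimate of a composite's longitudinal stiffness:
#   E_c = Vf * E_f + (1 - Vf) * E_m
# It assumes continuous, perfectly aligned, perfectly bonded fibers --
# assumptions that long discontinuous fibers violate.

def rule_of_mixtures(e_fiber_gpa, e_matrix_gpa, fiber_vol_frac):
    """Volume-weighted average of fiber and matrix moduli (an upper bound)."""
    return fiber_vol_frac * e_fiber_gpa + (1 - fiber_vol_frac) * e_matrix_gpa

# Illustrative values (not from the article): carbon fiber in a PEEK matrix.
E_f, E_m, Vf = 230.0, 3.6, 0.4
print(f"Predicted modulus: {rule_of_mixtures(E_f, E_m, Vf):.1f} GPa")
# A real long-fiber part falls short of this bound: fiber ends, misalignment,
# and voids are all invisible to the one-line formula.
```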

When a discontinuous short- or long-fiber polymer composite is subjected to monotonic or cyclic loading, matrix cracking initiates from existing micro-voids in the matrix or from fiber ends, which are sites of stress concentration. This cracking then combines with (or is sometimes the cause of) debonding at the fiber-matrix interfaces. The final cause of failure is often fiber pull-out and rupture.

Depending on the pre-treatment and manufacturing method used, an LFT composite part can have vastly different values for influential properties such as fiber orientation, material defects, matrix voids, and fiber degradation factors. 

Constructing analytical equations to account for these variables is feasible but very time-consuming. It is far better to have a realistic geometry and physics-based approach for capturing microstructural nuances and predicting material behavior. That is where a tool with advanced micromechanical modeling capabilities, such as MultiMech, can become a huge advantage. 

Long fiber thermoplastics have great potential, but whether or not they will be the material that “catapults” past steel/aluminum is uncertain. What is more probable is that engineers will continue to invent new ways to combine materials, yielding more customized and favorable mechanical properties. 

If we are to keep up with this innovation, engineers must adopt flexible physics-based modeling technologies rather than calibrated analytical equations. 

How Solvay Uses MultiMech for ANSYS to Optimize Material Performance: Live Webinar

MultiMechanics is partnering with Solvay and ANSYS to produce the webinar “How Solvay Uses MultiMech for ANSYS to Optimize Material Performance.” The webinar will take place on Thursday, August 30th at 8:00AM CST.


Improving efficiency, lowering emissions, and decreasing fuel consumption are global trends that are currently transforming the transportation industry. Lightweighting by replacing metal components with lighter composite materials is one approach to achieve these goals. However, the time and cost to a material supplier to develop a new material can take five years and up to $50M, while the cost to the OEM can be even higher. 

Join this webinar to learn: 

  • How Solvay uses MultiMech for ANSYS to virtually predict how behavior at the material microstructure level affects part- and system-level performance
  • How to use virtual testing to reduce the time and cost of developing new materials and send new ideas to be physically tested with greater confidence that the design will pass testing
  • How to gain insight into exactly how, when, and where damage will occur in the microstructure, and how to mitigate it
  • How the MultiMech and ANSYS platforms integrate to enable maximum simulation efficiency and accuracy

After the webinar, a live Q&A will be held with the panelists:

  • Matthew Jackson, Senior Research Scientist at Solvay
  • Rebekah Sweat, Research Scientist at Solvay
  • Richard Mitchell, Lead Product Marketing Manager at ANSYS
  • Hayden Cornwell, Application Engineer at MultiMechanics

Click here to download the webinar recording.

The Benefits of Optimization in Composites Engineering

The analysis of composites and other heterogeneous materials is complex for a number of known and very well-documented reasons. 

Many virtual testing techniques have been developed to help predict the behavior of composite parts; however, most tools end up relying on a great deal of physical testing of a composite specimen before virtual testing of a part becomes a viable solution. 

Is there a way we can avoid excess physical testing and instead use virtual automation tools to understand our materials and improve our end products? 

Piece of Cake

To use an analogy, let’s say you want to understand and predict the science behind baking a good cake. 

There are a number of variables that define a good cake, such as the amount of water, the quality of the flour, and the convection of your oven. You could bake 10 cakes and laboriously come up with an empirical formula to predict how various inputs affect the resulting cake. Or you could understand the cake’s ingredients well enough to predict how a change will influence the outcome.

If you can define these inputs, you can start to understand which ingredients or processes contribute to favorable or unfavorable cake characteristics. This opens the door to the true power of computers: the ability to iterate and optimize, so that for any given variation of your ingredients, you can reasonably predict how well the cake will turn out.


There are a number of optimization tools on the market. They vary in ease of use, pre- and post-processing capabilities, and methodology, but boiled down, optimization tools operate on the following steps:

  1. Inputs – Provide a set of variable parameters and their upper and lower bounds
  2. Solving – Perform some operation using those inputs that generates a single result
  3. Match – Try to match that result to a set of pre-defined target values, or try to minimize/maximize any number of resulting values
  4. Iteration – Iterate (using a number of smart parameter selection techniques) until that solution converges
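The four steps above can be sketched in a few lines. Everything here is a made-up stand-in: `run_simulation` plays the role of an expensive finite element or process solve, and a plain random hill climb substitutes for the smarter parameter-selection techniques that commercial optimizers use:

```python
# Minimal sketch of the inputs -> solve -> match -> iterate loop above.
import random

random.seed(0)  # deterministic for the example

# Step 1 (inputs): variable parameters with lower/upper bounds.
bounds = {"water": (0.0, 1.0), "flour": (0.5, 2.0)}

def run_simulation(params):
    """Step 2 (solving): produce a single result from the inputs.
    Hypothetical 'cake quality' score peaking at water=0.6, flour=1.2."""
    return -((params["water"] - 0.6) ** 2 + (params["flour"] - 1.2) ** 2)

def iterate(n_iters=2000, step=0.05):
    """Steps 3-4 (match + iterate): repeatedly perturb the best point,
    keeping changes that improve the result, until the budget runs out."""
    best = {k: random.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
    best_score = run_simulation(best)
    for _ in range(n_iters):
        cand = {k: min(max(v + random.uniform(-step, step), bounds[k][0]),
                       bounds[k][1])
                for k, v in best.items()}
        score = run_simulation(cand)
        if score > best_score:          # Step 3: maximize the result
            best, best_score = cand, score
    return best

opt = iterate()
print({k: round(v, 2) for k, v in opt.items()})  # near water=0.6, flour=1.2
```

Real tools replace the blind perturbation with gradient-based or design-of-experiments strategies, which is exactly why the per-iteration solve time discussed next matters so much.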

Just as it’s important to deconstruct cake ingredients, it’s also wise to look at the key pieces of the optimization process. Parameter selection requires an input paradigm flexible enough to take in and work with numerous variables: if your tool requires generic, vague, or boiled-down inputs, then your outputs will be equally unrevealing.

The other important, rate-limiting component is the iteration step, the key ingredient in all optimization tools. Because an optimization tool may need 1,000-5,000 iterations before it finds a suitable solution, each individual solve must run on the order of 1,000x faster than a manual search for an optimal solution. Another weakness of composite analysis tools in this space is therefore their inability to quickly generate solutions to complex problems.

Composites Optimization

There are a number of factors that can be modified to potentially improve a material’s properties. At the same time, some variables are strictly controlled, such as the presence of voids in the matrix, because they degrade a part’s performance.

Since isolating variables is often a wiser approach, typical optimization studies found in the composites industry include: 

  • Fiber manufacturers
    • This group may try to find the ideal length of fibers to meet target strength and weight while minimizing cost 
    • Alternatively, they may be interested in evaluating the ratio of glass-to-carbon fiber in a hybrid reinforcement bundle vs. other key mechanical properties 
  • Proprietors of woven composites
    • May be curious about the ideal weave geometry to hit a certain strength target 
  • Designers of mining technologies
    • Might be interested in the optimal placement of explosives to promote ideal crack propagation within a heterogeneous medium (like coal or shale rock)
  • Part manufacturers
    • May be interested in optimal adhesion characteristics of fiber/resin, which can be modified by the introduction of surface treatments and coatings
  • 3D printers
    • Looking to print optimized material microstructures in the same part, all with specific properties targeted for that part region

Holy Grail

For context, the ideal composite optimization job would be the following:

  1. Given all possible variables:
    • Manufacturing
    • Material
    • Part geometry
  2. And the costs to modify each of them:
    • Cost to control defects
    • Costs of different materials
    • Costs of different manufacturing processes
    • Costs to “model” various geometric features
  3. Find the lowest-cost option to hit a given set of targets

Tools Required:

  1. High-Powered Optimization Tool (HyperStudy, Optimus, Design Explorer)
  2. Manufacturing simulation tool (Moldex3D, FiberGraphix, FiberSim, etc.)
    • Moldex3D in particular is adept at various forms of optimization
  3. Structural/thermal analysis tool capable of: 
    • Ingesting manufacturing inputs
    • Using inputs from various sources to drive automated pre-processing at multiple scales
    • Efficiently using manufacturing inputs to minimize computational costs
    • Intelligently notifying optimization engine when a manufacturing input yields sub-par results 
    • Outputting simulation results in a useful and consolidated manner

The workflow for the optimization of a discontinuous fiber reinforced part, using available software tools, would be as follows: 

[Figure: workflow for optimizing a discontinuous fiber reinforced part]

In Conclusion: 

In composites engineering, the list of variables is long and interrelated. Whenever a problem has more input variables than favorable outputs (and the stakes for solving it are relatively high), each group that controls one variable will claim that theirs is the most important and that they have perfected its control. Often you are encountering guesswork and speculation (best case) or snake oil (worst case). It’s like the sugar producer or the oven manufacturer claiming to have engineered their product to solve the “most pressing” challenge in cake engineering without understanding how the other’s works.

In reality, it is up to the baker to understand all the ingredients and how they come together to make something the end user wants to eat.