Author: Denis Avetisyan
Researchers are harnessing the power of large language models to automatically derive physically plausible equations that govern material behavior.
![Engineering-Oriented Symbolic Regression bridges physical principles with practical application through a three-stage process: skill injection to define constraints [latex]\mathcal{S}=\{\mathcal{T},\mathcal{O},\mathcal{C}\}[/latex], constrained discovery via physics-informed genetic algorithms, and finite element validation, yielding constitutive laws that avoid the limitations of both empirically unstable models and overly idealized representations.](https://arxiv.org/html/2603.19241v1/sr_workflow_v2.png)
This work introduces Engineering-Oriented Symbolic Regression, a framework leveraging LLMs to discover accurate and numerically robust constitutive laws for advanced simulations.
Discovering accurate and numerically stable constitutive laws for complex materials remains a challenge, often caught between data-intensive methods and traditional engineering approximations. This work presents ‘Engineering-Oriented Symbolic Regression: LLMs as Physics Agents for Discovery of Simulation-Ready Constitutive Laws’, a novel framework leveraging Large Language Models to integrate physical constraints – such as thermodynamic consistency and frame indifference – directly into the symbolic regression process. This approach autonomously yields simulation-ready models, exemplified by a hybrid hyperelastic law for rubber-like materials that guarantees both predictive accuracy and unconditional convexity, even under severe deformation. Could this generalized, low-barrier pathway unlock a new era of physics-informed materials modeling and accelerate the design of advanced materials?
The Challenge of Accurate Material Representation
The reliable performance of numerous engineering systems hinges on the precise prediction of how hyperelastic materials – those exhibiting large, reversible deformations – will behave under load. However, traditional methods for modeling these materials frequently fall short when confronted with complex scenarios. These approaches often rely on simplified assumptions about material behavior, struggling to accurately capture phenomena like stress softening, Mullins effect, or rate-dependent responses. This limitation becomes particularly acute in applications involving extreme stretching, compression, or shearing, such as seals, gaskets, soft robotics, and biological tissues. Consequently, inaccuracies in material representation can lead to flawed designs, unreliable simulations, and potential structural failures, underscoring the need for more sophisticated modeling techniques capable of faithfully reproducing the intricate behavior of these materials.
While current constitutive models for hyperelastic materials have proven valuable in many engineering scenarios, their limitations become apparent under extreme deformation. These models, often relying on simplified assumptions or empirical fits, may fail to accurately predict material behavior under conditions involving large strains, high strain rates, or complex loading paths. The resulting discrepancies can manifest as inaccurate stress predictions, misjudged stability margins, or a failure to capture phenomena like the Mullins effect, the softening of materials under cyclic loading. Consequently, designs relying on these models in demanding applications such as impact absorption, sealing, or soft robotics may lack the necessary robustness or require substantial safety factors, hindering performance optimization and potentially leading to unforeseen failures. This necessitates the development of more sophisticated models capable of representing the full complexity of material response, even under the most challenging circumstances.
The pursuit of accurate material representation demands innovative strategies for characterizing the fundamental relationship between stress and strain. Traditional methods often simplify complex behaviors, leading to discrepancies between simulations and real-world performance, particularly under large or rapid deformations. Consequently, engineers are increasingly turning to advanced constitutive modeling – encompassing techniques like finite element analysis with hyperelastic models and data-driven approaches – to better capture nuanced material responses. The fidelity of these models directly impacts the reliability of designs, the efficiency of simulations, and ultimately, the safety and performance of engineered systems. Improved accuracy in defining this stress-strain relationship isn’t merely an academic exercise; it’s a critical need for advancing fields ranging from aerospace and automotive engineering to biomedical device development and beyond.
![The proposed SR model [latex]\text{(Eq 16, solid red line)}[/latex] outperforms the Yeoh model [latex]\text{(N=3, dashed blue)}[/latex] in capturing multi-axial material behavior, demonstrating accurate generalization to pure shear conditions [latex]\text{(MSE ≈ 0.0048)}[/latex] despite not being trained on them.](https://arxiv.org/html/2603.19241v1/model_comparison.png)
Data-Driven Discovery of Constitutive Laws: A New Path
Traditional constitutive modeling relies on formulating mathematical equations based on physical understanding and then fitting parameters to experimental data. Data-driven approaches, conversely, directly learn the relationship between observed variables from experimental datasets without requiring a pre-defined functional form. This is achieved by employing algorithms that analyze the data and identify patterns, effectively constructing a constitutive law directly from the observations. The benefit of this methodology is the potential to accurately represent complex material behavior that may not be adequately captured by existing, physically-based models, and to reduce the reliance on assumptions inherent in manual model development. These techniques are particularly useful when dealing with materials exhibiting highly nonlinear or complex behaviors where analytical solutions are difficult or impossible to obtain.
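As a minimal illustration of this direct-from-data approach, the sketch below fits the coefficients of a Yeoh-type strain-energy candidate to synthetic uniaxial stress-stretch data. Because the nominal stress is linear in the unknown coefficients, "discovery" reduces to a least-squares solve in this simplified setting; all data and coefficient values are hypothetical, not taken from the paper.

```python
import numpy as np

# Synthetic uniaxial stress-stretch data (a stand-in for an experimental
# set such as Treloar's; values here are illustrative, not measured).
lam = np.linspace(1.1, 6.0, 30)
I1 = lam**2 + 2.0 / lam                  # first invariant, incompressible uniaxial
kin = 2.0 * (lam - lam**-2)              # kinematic factor in the nominal stress

# "Ground truth" generated from a Yeoh-type law, plus measurement noise.
C_true = np.array([0.2, -0.002, 0.0002])                     # MPa, hypothetical
basis = np.stack([np.ones_like(I1), 2*(I1-3), 3*(I1-3)**2], axis=1)
P = kin * (basis @ C_true)
rng = np.random.default_rng(0)
P_noisy = P + rng.normal(0.0, 0.005, P.shape)

# Data-driven step: the nominal stress is linear in the coefficients,
# so fitting the candidate law is a linear least-squares problem.
A = kin[:, None] * basis
C_fit, *_ = np.linalg.lstsq(A, P_noisy, rcond=None)
mse = np.mean((A @ C_fit - P_noisy)**2)
```

Real symbolic-regression pipelines search over the functional form itself; this sketch only shows the final calibration step once a candidate form is proposed.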
Deep Learning and Symbolic Regression are increasingly utilized to establish relationships between material behavior and observed data. Deep Learning, particularly through neural networks, excels at identifying complex, non-linear correlations within large datasets, often achieving high predictive accuracy. Symbolic Regression, conversely, aims to discover explicit mathematical equations that describe the underlying physics, providing interpretable models even with limited data. Comparative analyses demonstrate that these data-driven techniques can surpass the accuracy of traditional, manually derived constitutive models in certain scenarios, while simultaneously reducing the computational cost associated with parameter identification and model calibration. The efficiency gains stem from automated learning processes that minimize the need for iterative refinement and expert intervention.
The transition from manual constitutive model development to automated discovery represents a significant change in material modeling. Traditionally, researchers would formulate hypothesized equations based on physical understanding and then fit parameters to experimental data. Automated approaches, leveraging algorithms like machine learning, directly infer constitutive relationships from data without requiring a pre-defined functional form. This allows for the identification of complex, potentially non-intuitive relationships that may be missed by human-derived models. The resulting data-driven models can improve predictive accuracy, reduce modeling time, and facilitate the discovery of novel material behaviors, especially in scenarios where underlying physical mechanisms are poorly understood or highly complex.

Ensuring Model Stability and Physical Realism: The Foundation of Reliability
Constitutive model stability, essential for reliable simulations, is formally assessed using criteria such as the Drucker Stability Criterion. This criterion requires that the incremental work done during any increment of loading be non-negative: the product of the incremental stress and the corresponding incremental strain must satisfy [latex] \Delta\sigma_{ij} \Delta\epsilon_{ij} \geq 0 [/latex], where [latex] \Delta\sigma_{ij} [/latex] is the incremental stress tensor and [latex] \Delta\epsilon_{ij} [/latex] the incremental strain tensor. Failure to meet this criterion indicates potential instability, leading to physically unrealistic predictions of material behavior, such as spontaneous softening or volume collapse even under constant loads. Verification of stability is therefore a fundamental requirement in the development and validation of any constitutive model intended for predictive simulations.
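The criterion can be probed numerically along a loading path by checking that every incremental stress-strain product is non-negative. The sketch below does this in one dimension for a neo-Hookean response; the modulus value is a hypothetical placeholder.

```python
import numpy as np

# 1-D Drucker check: along a loading path, each incremental stress
# times the corresponding incremental strain must be non-negative.
def nominal_stress(lam, mu=0.4):
    """Incompressible neo-Hookean nominal stress under uniaxial tension."""
    return mu * (lam - lam**-2)

lam = np.linspace(1.0, 4.0, 200)
eps = lam - 1.0                          # nominal strain
sig = nominal_stress(lam)

d_sig = np.diff(sig)
d_eps = np.diff(eps)
drucker_ok = np.all(d_sig * d_eps >= 0.0)   # holds for this monotone response
```

A full tensorial check would sweep all admissible strain increments; the 1-D version above only screens a single deformation mode.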
A stable constitutive model must exhibit a unique and predictable response to applied stress states. This necessitates preventing behaviors that violate fundamental physical principles, such as infinite deformation, negative volume change under compression, or the generation of energy without work input. Unpredictability arises when the model allows multiple possible responses for a given stress state, hindering accurate simulations and potentially leading to erroneous predictions of material behavior. Ensuring a single, defined response for all valid stress conditions is therefore critical for model reliability and the validity of numerical simulations relying on the model’s output.
Convexity in constitutive modeling ensures a unique and stable response to applied loading. Mathematically, a function is convex if the line segment between any two points on its graph lies on or above the graph itself; in material behavior, this translates to a predictable and physically realistic deformation. Specifically, a convex strain-energy function [latex] W [/latex] of the deformation guarantees that each strain increment corresponds to a single, well-defined stress increment. Non-convex energies can admit multiple solutions for a given stress state, resulting in instability, loss of uniqueness, and potentially unphysical predictions such as negative dissipation, a violation of the Second Law of Thermodynamics. Maintaining convexity is therefore a necessary, though not always sufficient, condition for a robust and reliable constitutive model.
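A simple numerical probe of convexity along a one-dimensional deformation path, which is necessary but not sufficient as noted above, checks the discrete second difference of the strain energy. The energy form and modulus below are illustrative.

```python
import numpy as np

# Numerical convexity probe: sample the strain energy along a deformation
# path and verify the discrete second difference is non-negative.
def W(lam, mu=0.4):
    """Neo-Hookean energy along an incompressible uniaxial path (mu hypothetical)."""
    I1 = lam**2 + 2.0 / lam
    return 0.5 * mu * (I1 - 3.0)

lam = np.linspace(0.5, 5.0, 400)         # covers compression and tension
w = W(lam)
second_diff = w[:-2] - 2.0 * w[1:-1] + w[2:]
convex_along_path = np.all(second_diff >= 0.0)
```

Passing along one path does not prove global convexity; a model such as Ogden with negative exponents can pass here yet still lose convexity under other deformation modes, as the figure below illustrates.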
![Finite element analysis of a double-edge notched specimen demonstrates that the proposed model achieves stable large deformation [latex] \lambda_{global} = 3.0 [/latex] due to its convexity-informed formulation, while the Ogden model fails under compressive loading [latex] \lambda_{3} \approx 0.157 [/latex] due to a numerical singularity arising from its negative exponent [latex] \alpha_{3} \approx -3.18 [/latex] and lack of global convexity.](https://arxiv.org/html/2603.19241v1/fig6_page-0001.jpg)
Experimental Validation and Model Refinement: The Pursuit of Accuracy
The development of accurate material models for hyperelastic materials frequently involves comparison against established formulations like the Mooney-Rivlin, Ogden, and Yeoh models, which serve as foundational references in the field. Rigorous validation of any new model relies heavily on benchmark datasets, with the Treloar Data being particularly prominent due to its comprehensive collection of experimental data on rubber-like materials under various deformation conditions. These datasets provide a standardized basis for assessing a model’s ability to predict material behavior, ensuring that improvements aren’t merely fitting noise but represent a genuine advancement in capturing the underlying physics of hyperelasticity. Consequently, consistent performance against these established models and datasets is a critical step in demonstrating the reliability and predictive power of any novel material modeling approach.
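For reference, the uniaxial nominal-stress responses of the three classical baselines named above can be written in a few lines. The material parameters below are hypothetical placeholders, not fitted Treloar values.

```python
import numpy as np

# Uniaxial nominal stress for three classical incompressible hyperelastic
# models, commonly benchmarked against Treloar's rubber data.
def mooney_rivlin(lam, C10=0.16, C01=0.04):
    return 2.0 * (lam - lam**-2) * (C10 + C01 / lam)

def yeoh(lam, C1=0.19, C2=-0.002, C3=2e-4):
    I1 = lam**2 + 2.0 / lam
    return 2.0 * (lam - lam**-2) * (C1 + 2*C2*(I1-3) + 3*C3*(I1-3)**2)

def ogden(lam, mu=(0.63, 0.0012, -0.01), alpha=(1.3, 5.0, -2.0)):
    mu, alpha = np.asarray(mu), np.asarray(alpha)
    lam = np.atleast_1d(lam)[:, None]
    return np.sum(mu * (lam**(alpha - 1) - lam**(-alpha/2 - 1)), axis=1)

lam = np.linspace(1.0, 7.0, 50)
curves = {"Mooney-Rivlin": mooney_rivlin(lam),
          "Yeoh": yeoh(lam),
          "Ogden": ogden(lam)}
```

All three correctly give zero stress at the undeformed state [latex]\lambda = 1[/latex]; they differ in how stress grows at large stretch, which is where benchmark comparisons become discriminating.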
Accurate material characterization fundamentally relies on the quality of experimental data, and Planar Biaxial Testing stands out as a crucial technique for capturing complex material behaviors. Unlike uniaxial testing which focuses on stress in a single direction, this method simultaneously applies and measures stress in two orthogonal directions, providing a more comprehensive understanding of how a material deforms under realistic, multi-axial loading conditions. This is particularly important for materials exhibiting anisotropy or non-linear elasticity, where behavior varies with orientation and stress level. The resulting data, meticulously gathered through precise displacement and force measurements, serves not only to validate existing constitutive models – such as those comparing Mooney-Rivlin, Ogden, and Yeoh formulations – but also to inform the development of new, more accurate representations of material response, ultimately leading to more reliable predictions in engineering simulations and product design.
Accurate material modeling necessitates a careful consideration of fundamental properties and potentially subtle behaviors. Incompressibility, the resistance of a material to volume change under stress, significantly impacts model predictions, particularly in rubber-like materials where even small volume changes can introduce substantial errors. Equally important is acknowledging phenomena like Gent locking, a stress-stiffening effect observed in elastomers at high strains; ignoring this can lead to underestimation of stresses and inaccurate predictions of component failure. Therefore, robust models aren’t simply mathematical approximations but must incorporate these physical realities to faithfully represent material response across a range of loading conditions and ensure predictive power in complex engineering applications.
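The Gent model is the standard way to encode this finite-extensibility locking: the stress diverges as [latex]I_1 - 3[/latex] approaches a limit [latex]J_m[/latex]. The sketch below shows the stiffening surge; both parameters are hypothetical.

```python
import numpy as np

# Gent model: finite-extensibility "locking" as I1 - 3 approaches the
# limit Jm, producing the stress-stiffening described in the text.
def gent_stress(lam, mu=0.3, Jm=80.0):
    """Uniaxial nominal stress for the incompressible Gent model (mu, Jm hypothetical)."""
    I1 = lam**2 + 2.0 / lam
    dWdI1 = 0.5 * mu / (1.0 - (I1 - 3.0) / Jm)   # diverges near the lock
    return 2.0 * (lam - lam**-2) * dWdI1

moderate = gent_stress(3.0)     # well below the locking stretch
near_lock = gent_stress(8.5)    # I1 - 3 close to Jm -> steep stress rise
```

Ignoring this effect, as the paragraph above warns, systematically underestimates stress at large strains, since a polynomial law has no such asymptote.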
The newly developed Engineering-Oriented Symbolic Regression (EO-SR) framework demonstrates a high degree of accuracy in capturing material behavior, as evidenced by its low fitting errors on standard tensile tests. Specifically, the framework achieves an error of only 0.0031 when fitting uniaxial tension data – a measure of how well the model predicts material stretching in a single direction. Furthermore, EO-SR maintains strong performance under more complex loading conditions, exhibiting a fitting error of 0.0146 on equibiaxial tension data, where the material is stretched equally in two directions. These results suggest the framework effectively captures key material characteristics, providing a robust foundation for predicting behavior in a variety of engineering applications and surpassing the accuracy of many existing models.
The predictive power of the Engineering-Oriented Symbolic Regression (EO-SR) framework extends beyond the datasets used during its training phase, as evidenced by its performance on zero-shot pure shear data. This signifies a robust generalization capability, achieving a fitting error of only 0.0048 – a remarkably low value considering the framework had not been explicitly trained on this specific deformation mode. Such performance suggests the EO-SR model isn’t simply memorizing training data, but instead, is identifying and representing the underlying constitutive relationships governing material behavior in a manner that accurately predicts response even under previously unseen conditions. This ability to extrapolate beyond the training set is crucial for real-world applications where materials are subjected to complex loading scenarios not fully captured by limited experimental data.
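A zero-shot evaluation of this kind amounts to evaluating a tension-fitted model under pure-shear kinematics ([latex]\lambda_1 = \lambda, \lambda_2 = 1, \lambda_3 = 1/\lambda[/latex]) and scoring the mean squared error. The stand-in model and "experimental" curve below are purely illustrative, not the paper's data.

```python
import numpy as np

# Zero-shot check sketch: a model calibrated only on tension data is
# scored against pure-shear response it never saw during fitting.
def shear_stress(lam, C1=0.19):
    """Pure-shear nominal stress for an I1-only incompressible model (C1 hypothetical)."""
    dWdI1 = C1                               # placeholder fitted energy derivative
    return 2.0 * (lam - lam**-3) * dWdI1

lam = np.linspace(1.0, 5.0, 40)
pred = shear_stress(lam)                     # "trained" model
truth = shear_stress(lam, C1=0.20)           # stand-in experimental curve
mse = np.mean((pred - truth)**2)
```

A low error on such held-out deformation modes is the signal that the model encodes the underlying constitutive relationship rather than memorizing one loading path.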
A critical assessment of model robustness involved subjecting both the Engineering-Oriented Symbolic Regression (EO-SR) model and a commonly used benchmark model to a finite element simulation of a double-edge notched specimen – a scenario known to challenge numerical solvers. The EO-SR model successfully converged, producing a stable solution that accurately reflected the expected material behavior under stress. In stark contrast, the benchmark model failed to converge, indicating a susceptibility to numerical instability when confronted with the geometric complexities and stress concentrations inherent in the notched specimen. This successful convergence underscores a key advantage of the EO-SR framework: its ability to generate equations that are not only accurate in fitting experimental data, but also numerically stable and reliable in complex simulations, potentially unlocking more accurate and robust predictions in engineering applications.
![The proposed SR model accurately predicts a finite extensibility limit under uniaxial tension by exhibiting an asymptotic stiffness surge at [latex]\lambda \approx 8.77[/latex], effectively preventing non-physical infinite stretching, while the Ogden model predicts unbounded polynomial growth, and the inset demonstrates that the SR model maintains stability with a minimum stiffness of approximately 0.30 MPa.](https://arxiv.org/html/2603.19241v1/drucker_stability.png)
The pursuit of accurate constitutive laws, as detailed in the framework, demands a reduction of complexity, not its amplification. The study champions a method where models are derived from first principles, guided by physical constraints, a process mirroring elegant simplicity. As Bertrand Russell observed, “The point of education is to teach people to think, not to memorize facts.” This sentiment echoes within the presented EO-SR framework; the goal isn’t simply to find a model, but to discover one born of understanding, capable of generalization and numerical robustness, a testament to the power of focused inquiry over exhaustive data fitting.
What Remains?
The pursuit of constitutive laws, once a laborious exercise in hand-wrought equations, has yielded, through this work, to a more automated inquiry. Yet, automation is not absolution. The framework presented does not solve constitutive modeling; it relocates the critical path. The challenge now lies not in generating candidate equations, but in rigorously vetting their physical plausibility beyond the initially imposed constraints. A numerically stable equation, elegantly expressed, remains insufficient if it describes a world unmoored from observation.
Future iterations should not focus on expanding the complexity of the symbolic regression itself, but rather on sharpening the criteria for acceptance. The integration of Drucker’s stability criteria, while a necessary step, is merely one facet of a much larger validation landscape. The true test will be the predictive power of these discovered laws when subjected to extreme conditions – scenarios deliberately omitted during training, and potentially revealing fundamental limitations.
Ultimately, the goal is not a proliferation of constitutive models, but a reduction – a distillation of physical principles into a minimal set of robust and verifiable laws. The elegance of a solution is not found in its intricacy, but in its capacity to explain the maximum with the minimum. This work, therefore, is not an end, but a necessary clearing of the ground.
Original article: https://arxiv.org/pdf/2603.19241.pdf
Contact the author: https://www.linkedin.com/in/avetisyan/
2026-03-23 14:13