Modeling the Unpredictable: A New Approach to System Dynamics

Author: Denis Avetisyan


Researchers have developed a novel framework that combines physics-based constraints with neural networks to create more accurate and interpretable models of complex, nonlinear systems.

Learned dynamics, when forward-integrated from a single initial condition, accurately reproduce the trajectories of a Van der Pol oscillator, mirroring ground truth across both state channels and validating the surrogate model’s predictive capability as defined by equation [latex](20)[/latex].

SOLIS leverages trajectory reconstruction and state-conditioned parameter identification to learn second-order surrogate models with improved stability and accuracy compared to existing Physics-Informed Neural Networks.

Balancing physical interpretability with modeling flexibility remains a central challenge in nonlinear system identification. The work presented in ‘SOLIS: Physics-Informed Learning of Interpretable Neural Surrogates for Nonlinear Systems’ addresses this by introducing a novel framework that learns state-conditioned, second-order surrogate models from sparse data. SOLIS recasts identification as learning a Quasi-Linear Parameter-Varying (Quasi-LPV) representation, recovering interpretable parameters without presupposing a global equation, and stabilizes training via a cyclic curriculum and localized regression anchors. By decoupling trajectory reconstruction from parameter estimation, can SOLIS unlock more robust and accurate modeling of complex dynamical systems than existing Physics-Informed Neural Networks?


Deconstructing the Limits of Prediction

Conventional system identification techniques frequently encounter difficulties when applied to systems exhibiting highly nonlinear dynamics. These methods often depend on simplifying assumptions – such as linearity within a specific operating range – to render the problem tractable. However, such approximations can introduce significant errors when the system ventures outside these limited ranges or when the nonlinearity is intrinsic to the system’s behavior. This reliance on assumptions restricts the applicability of traditional approaches to a narrow subset of real-world phenomena, particularly those involving complex interactions and state-dependent characteristics. Consequently, accurate modeling and prediction become challenging, hindering the ability to reliably analyze and control these nonlinear systems, and often necessitating the development of more sophisticated identification strategies.

These difficulties are compounded in systems with state-dependent characteristics, where properties like stiffness or damping aren’t constant but vary based on the system’s current state. This presents a significant challenge because standard methods typically assume linearity or time-invariance, struggling to accurately represent behaviors that change dynamically. For instance, a structure might become significantly stiffer under high loads, or a damper’s effectiveness might diminish at high velocities – phenomena that conventional models often oversimplify or miss entirely. Consequently, the applicability of these techniques is limited when dealing with complex systems where such state-dependent variations are prevalent, hindering accurate modeling, prediction, and control, and necessitating more sophisticated approaches capable of capturing these nuanced dynamics.

The inherent difficulties in modeling nonlinear systems are starkly illustrated when examining benchmark oscillators like the Duffing and Van der Pol systems. These oscillators, exhibiting behaviors such as jump phenomena and limit cycles, consistently challenge traditional system identification techniques. Existing methods, often predicated on linear approximations or perturbative analyses, struggle to accurately capture the full trajectory of these systems, particularly over extended prediction horizons. Attempts at reconstruction frequently reveal significant deviations between modeled and actual system behavior, highlighting the limitations of relying on conventional approaches when faced with strong nonlinearities. The persistent inaccuracies in trajectory prediction underscore the need for advanced modeling strategies capable of directly addressing and characterizing these complex dynamics, rather than relying on simplifying assumptions that compromise accuracy.
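To make the benchmark concrete, the sketch below forward-integrates a Van der Pol oscillator with a standard fourth-order Runge-Kutta scheme. This is a minimal stand-in for a ground-truth simulator, not code from the paper; the function names and settings are illustrative. Starting from a small perturbation, the state is pulled onto the oscillator’s limit cycle, exactly the kind of strongly nonlinear behavior that linear identification methods fail to capture.

```python
import numpy as np

def van_der_pol(state, mu=1.0):
    """Van der Pol vector field: y'' - mu*(1 - y^2)*y' + y = 0."""
    y, v = state
    return np.array([v, mu * (1.0 - y**2) * v - y])

def rk4_rollout(f, y0, dt, steps):
    """Classic fourth-order Runge-Kutta forward integration."""
    traj = np.empty((steps + 1, len(y0)))
    traj[0] = y0
    y = np.asarray(y0, dtype=float)
    for i in range(steps):
        k1 = f(y)
        k2 = f(y + 0.5 * dt * k1)
        k3 = f(y + 0.5 * dt * k2)
        k4 = f(y + dt * k3)
        y = y + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[i + 1] = y
    return traj

traj = rk4_rollout(van_der_pol, [0.1, 0.0], dt=0.01, steps=5000)
# Despite the small initial condition, the trajectory is drawn onto the
# limit cycle, whose amplitude for mu = 1 is close to 2.
late_amplitude = np.abs(traj[-2000:, 0]).max()
```

A linear model fitted near the origin would predict decay or pure oscillation and miss this attractor entirely, which is why such oscillators serve as stress tests for identification methods.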

A surrogate model accurately replicates the ground-truth phase portrait of a Duffing oscillator, as demonstrated by high cosine similarity between their vector fields across the state space and validated with training trajectories.

SOLIS: Reconstructing Reality from Data

The SOLIS framework addresses system identification by integrating Physics-Informed Neural Networks (PINNs) with data-driven techniques to generate second-order surrogate models. This approach moves beyond traditional PINNs by allowing for greater flexibility in model representation, which enhances accuracy in identifying system dynamics. Specifically, SOLIS focuses on constructing models that are both accurate to observed data and interpretable, allowing for analysis of the underlying system behavior. The resulting surrogate models are validated against established methods, demonstrating improved performance in identifying and reconstructing complex dynamical systems. This is achieved through a novel architecture that balances physical constraints with data-driven learning, resulting in models that generalize better than conventional approaches.

The SOLIS framework utilizes two primary neural networks: a Parameter Network and a Solution Network. The Parameter Network functions to identify a state-conditioned affine surrogate model of the system dynamics; this means it learns a linear relationship parameterized by the current state of the system. This network outputs the parameters defining the surrogate model at each state. Subsequently, the Solution Network employs Neural Ordinary Differential Equations to reconstruct continuous-time state trajectories based on the outputs of the Parameter Network. This approach allows for a continuous representation of the system’s evolution, enabling accurate prediction of future states given initial conditions and the learned dynamics.
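The state-conditioned affine structure can be sketched as follows. This is a toy, untrained network with hypothetical names and sizes, meant only to illustrate the quasi-LPV form in which a small network maps the current state to local coefficients, and the dynamics are affine in the state given those coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "parameter network": a tiny MLP mapping the state (y, v)
# to state-conditioned coefficients (k, c), so the surrogate dynamics
# take the quasi-LPV form  y'' = -k(y, v) * y - c(y, v) * v.
W1, b1 = 0.5 * rng.normal(size=(8, 2)), np.zeros(8)
W2, b2 = 0.5 * rng.normal(size=(2, 8)), np.zeros(2)

def parameter_network(state):
    h = np.tanh(W1 @ state + b1)   # hidden layer
    return W2 @ h + b2             # local coefficients (k, c) at this state

def surrogate_field(state):
    """Second-order surrogate: affine in the state, with coefficients
    that themselves depend on the state."""
    y, v = state
    k, c = parameter_network(state)
    return np.array([v, -k * y - c * v])

field_at_origin_offset = surrogate_field(np.array([1.0, 0.0]))
```

The key point is the separation of roles: the network only predicts coefficients, while the dynamics remain a readable second-order form, which is what keeps the learned model interpretable.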

The SOLIS framework utilizes Neural Ordinary Differential Equations (Neural ODEs) within its Solution Network to model system evolution as a continuous-time process. This approach differs from discrete-time methods by representing the system’s state change as a derivative, defined by [latex] \frac{dy}{dt} = f(y(t), t) [/latex], where y(t) is the state at time t and f is a neural network defining the dynamics. By integrating this differential equation, the Solution Network reconstructs continuous state trajectories, offering improved accuracy in trajectory reconstruction compared to baseline methods such as IPINN and TF, which typically rely on discrete-time approximations. This continuous representation also enhances interpretability, as it directly models the underlying physical processes without being constrained by fixed time steps.
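A minimal illustration of the continuous-time idea, with a known linear field standing in for a learned neural vector field: the trajectory is reconstructed by numerically integrating [latex] \frac{dy}{dt} = f(y) [/latex], and the reconstruction error shrinks as the step size decreases, unlike a fixed-step discrete-time map. The integrator and names here are illustrative, not the paper’s implementation.

```python
import numpy as np

def f(y):
    """Stand-in for a learned vector field; dy/dt = -y has the exact
    solution y(t) = exp(-t) for y(0) = 1."""
    return -y

def integrate(f, y0, t_end, dt):
    """Forward-Euler integration; smaller dt approaches continuous time."""
    y = float(y0)
    for _ in range(int(round(t_end / dt))):
        y += dt * f(y)
    return y

# Error against the exact solution at t = 1 for two step sizes.
coarse = abs(integrate(f, 1.0, 1.0, 0.1) - np.exp(-1.0))
fine = abs(integrate(f, 1.0, 1.0, 0.001) - np.exp(-1.0))
```

Because accuracy is controlled by the solver rather than baked into a fixed time step, the same learned field can be evaluated at arbitrary times, which is what gives Neural ODE rollouts their flexibility.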

Anchoring the Model: Robustness Through Regularization

Total Variation (TV) regularization, as implemented in SOLIS, operates by penalizing the sum of absolute differences between neighboring parameter values within the identified field. This encourages solutions where parameters change gradually across the modeled system, effectively promoting smoothness and preventing abrupt, unrealistic oscillations. Mathematically, the TV regularization term can be expressed as [latex] \lambda \sum_{i,j} \left( |p_{i,j} - p_{i-1,j}| + |p_{i,j} - p_{i,j-1}| \right) [/latex], where [latex] p_{i,j} [/latex] represents the parameter value at location [latex](i, j)[/latex] and [latex]\lambda[/latex] is a weighting factor controlling the strength of the regularization. Furthermore, the L1 norm implicit in the TV regularization promotes sparsity by driving some parameter differences to zero, leading to a more concise and interpretable model while reducing sensitivity to noise in the data.
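The penalty above can be computed in a few lines. This sketch evaluates the anisotropic TV term on a 2-D parameter field and shows why the optimizer prefers smooth fields: a gradually varying field incurs a much smaller penalty than a noisy one. The helper name and example fields are illustrative.

```python
import numpy as np

def tv_penalty(p, lam=1.0):
    """Anisotropic total-variation penalty on a 2-D parameter field:
    lam * sum over (i, j) of |p[i,j] - p[i-1,j]| + |p[i,j] - p[i,j-1]|."""
    dy = np.abs(np.diff(p, axis=0)).sum()  # vertical neighbour differences
    dx = np.abs(np.diff(p, axis=1)).sum()  # horizontal neighbour differences
    return lam * (dy + dx)

smooth = np.linspace(0.0, 1.0, 25).reshape(5, 5)  # gradually varying field
noisy = smooth + np.random.default_rng(0).normal(scale=0.5, size=(5, 5))

smooth_cost = tv_penalty(smooth)
noisy_cost = tv_penalty(noisy)
```

Adding `tv_penalty` to the training loss therefore steers the identified parameter field toward the gradual, physically plausible variation described above.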

Sliding-Window Ridge Regression is implemented during the initial stages of training to establish analytical parameter anchors. This technique involves constructing a localized regression problem using a sliding window of data, effectively constraining the parameter field to adhere to locally observed trends. By employing [latex]L_2[/latex] regularization – the Ridge penalty – the solution minimizes the sum of squared errors while simultaneously preventing excessively large parameter values. This process generates a stable, initial parameter estimate that serves as a strong prior, accelerating convergence and preventing the optimization process from becoming trapped in local minima, particularly when data is sparse or noisy.
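A rough sketch of the anchoring idea, under simplifying assumptions (fixed non-overlapping windows, a synthetic linear dataset, and illustrative names): each window yields a closed-form ridge estimate that can serve as a local anchor for the parameter field.

```python
import numpy as np

def sliding_ridge_anchors(X, y, window=20, lam=1e-2):
    """Fit a ridge regression in each window of consecutive samples,
    yielding local parameter estimates that act as analytical anchors."""
    n, d = X.shape
    anchors = []
    for start in range(0, n - window + 1, window):
        Xw, yw = X[start:start + window], y[start:start + window]
        # Closed-form ridge solution: (Xw' Xw + lam * I)^(-1) Xw' yw
        theta = np.linalg.solve(Xw.T @ Xw + lam * np.eye(d), Xw.T @ yw)
        anchors.append(theta)
    return np.array(anchors)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
true_theta = np.array([2.0, -1.0])
y = X @ true_theta + 0.01 * rng.normal(size=100)

anchors = sliding_ridge_anchors(X, y)  # one local estimate per window
```

Because each anchor has a closed form, it is cheap to compute before gradient-based training begins, which is what makes it useful as a stabilizing prior in the early epochs.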

SOLIS achieves a balance between data-driven model adaptation and physical interpretability by integrating Total Variation Regularization and Sliding-Window Ridge Regression. This combined approach results in a robust identification framework capable of accurately representing system dynamics while maintaining a degree of parsimony. Quantitative evaluations demonstrate that SOLIS consistently surpasses the performance of baseline methods, as measured by predictive rollout accuracy, indicating improved generalization and reliability in forecasting system behavior.

The Van der Pol oscillator’s trajectory [latex]y(t)[/latex] and velocity [latex]v(t)[/latex] were accurately reconstructed from in-sample data, closely matching the ground truth.

Beyond Prediction: A New Era of System Understanding

Conventional dynamic modeling often relies on discrete-time recurrent neural networks – such as RNNs, GRUs, and LSTMs – which approximate continuous physical processes as a series of snapshots in time. However, SOLIS distinguishes itself by directly embracing continuous-time dynamics, mirroring how many physical systems actually evolve. This approach avoids the inherent approximation of discretizing time, potentially leading to a more accurate and natural representation of the underlying physics. By operating in continuous time, the model can capture subtle nuances and dependencies that might be lost in discrete approximations, particularly crucial when dealing with systems exhibiting rapid or complex behaviors. The result is a framework better suited to modeling and predicting the behavior of continuous physical phenomena, offering a significant advantage over traditional sequence-based methods.

SOLIS distinguishes itself through broad applicability to nonlinear dynamical systems – those exhibiting behavior too complex for simple linear approximations. Unlike conventional methods often limited to specific system types, this framework demonstrates robust performance across a diverse spectrum of challenges. Critically, SOLIS doesn’t just model these systems; it reconstructs their trajectories with the highest fidelity among the evaluated methods across all test cases. This improved performance isn’t solely quantitative: the learned surrogate model inherent in SOLIS also provides enhanced interpretability, allowing researchers to gain deeper insights into the underlying dynamics driving complex behaviors and potentially revealing previously hidden relationships within the system.

The SOLIS framework establishes a state-conditioned surrogate model capable of accurately predicting, controlling, and analyzing the behavior of complex dynamical systems. This approach moves beyond traditional methods by learning a continuous representation of system dynamics, enabling robust performance across a range of nonlinear systems. Evaluations demonstrate SOLIS’s superiority in both reconstructing past trajectories and forecasting future states; specifically, it achieved the highest average cosine similarity when tested on the Duffing and Van der Pol systems, outperforming both Physics-Informed Neural Networks (IPINN) and Temporal Fusion (TF). This enhanced accuracy and predictive capability unlocks potential applications in diverse fields, from engineering design and robotics to scientific modeling and data-driven discovery.
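The cosine-similarity comparison can be sketched as follows: the metric averages, over sampled states, the cosine of the angle between the true and surrogate vector fields, so it rewards matching the direction of the dynamics independently of magnitude. The helper name and toy fields are illustrative, not from the paper.

```python
import numpy as np

def field_cosine_similarity(f_true, f_pred, states):
    """Average cosine similarity between two vector fields evaluated on
    a set of sample states."""
    sims = []
    for s in states:
        a, b = f_true(s), f_pred(s)
        sims.append(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return float(np.mean(sims))

def reference_field(s):
    return np.array([s[1], -s[0]])       # toy "ground-truth" dynamics

def rescaled_field(s):
    return 1.5 * np.array([s[1], -s[0]])  # same directions, larger magnitude

states = np.random.default_rng(2).normal(size=(50, 2))
score = field_cosine_similarity(reference_field, rescaled_field, states)
# A field pointing everywhere in the same direction as the reference
# scores 1.0, regardless of its magnitude.
```

A surrogate that merely rescales the true dynamics would still score perfectly here, which is why trajectory-rollout error is reported alongside this metric.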

The development of SOLIS, a framework for learning interpretable surrogate models, echoes a fundamental tenet of reverse engineering: understanding isn’t passive observation, but active interrogation. Every exploit starts with a question, not with intent. Similarly, SOLIS doesn’t merely predict system behavior; it actively reconstructs trajectories and identifies state-conditioned parameters. This echoes Kolmogorov’s insight: “The most important thing in science is not to be afraid of making mistakes, but to learn from them.” The framework’s curriculum learning approach, particularly its handling of second-order systems, embodies this principle; iterative refinement through testing (essentially, controlled ‘mistakes’) yields a more robust and accurate model. It’s a process of dismantling assumptions and rebuilding understanding, akin to dissecting a complex system to reveal its underlying logic.

Beyond the Surrogate

The construction of SOLIS, a physics-informed neural network capable of reconstructing trajectories and identifying state-conditioned parameters, feels less like a destination and more like a particularly well-engineered demolition. It dismantles the black-box problem of nonlinear system identification, yes, but exposes the rubble beneath: the inherent difficulty of truly knowing a system, rather than merely mimicking its behavior. The framework’s reliance on second-order surrogates, while demonstrably effective, raises the question: how much fidelity is truly necessary, and at what cost to computational efficiency? Future work will undoubtedly push the boundaries of surrogate complexity, but the real challenge lies in determining when increased precision yields diminishing returns.

One wonders if the very notion of ‘identifying’ parameters isn’t a fundamentally flawed approach. Systems rarely remain static; their ‘parameters’ are themselves evolving entities. Perhaps a more fruitful avenue lies in modeling not the system itself, but the process of its change – a continuous deformation rather than a fixed configuration. Curriculum learning within SOLIS offers a glimpse of this, but feels almost… quaint. A system that learns to learn is still, ultimately, constrained by the initial curriculum.

The pursuit of interpretable surrogates isn’t about achieving perfect replication; it’s about establishing a controlled point of failure. A physicist doesn’t seek to build an unbreakable machine, but one that breaks in a predictable manner. SOLIS provides a refined toolkit for this controlled dismantling, but the most interesting discoveries will likely emerge when someone inevitably forces it to fail – and reverse-engineers the reasons why.


Original article: https://arxiv.org/pdf/2604.14879.pdf

Contact the author: https://www.linkedin.com/in/avetisyan/
