Learning linear evolution of partial differential equations through Koopman-inspired implicit neural representation

Authors
Herman, Jacob, Lewis
Issue Date
2023-12
Type
Electronic thesis
Thesis
Language
en_US
Keywords
Aeronautical engineering
Abstract
In the age of big data, learning models from data is of great importance; that is, representing systems using provided or collected data rather than a representation derived from the governing equations, which may be nearly impossible to derive (e.g., soft robotics) or may not even be known (e.g., coarse-grained systems). Despite the recent success of data-driven methods such as dynamic mode decomposition, SINDy, and operator inference, these methods suffer from several pitfalls: 1) they require full measurement of the system state, which is usually unavailable in realistic settings; 2) they cannot handle spatio-temporal data on complex, dynamically changing meshes; and 3) they lack inductive biases and interpretability. To address these challenges, this work introduces the Koopman Operator Implicit Neural Representation (KoIN), which leverages recent advances in Koopman operator theory and implicit neural representations. An implicit neural representation framework is employed to reduce the spatial complexity into a latent representation whose dynamics are governed by a linear system. This framework can be viewed as learning a Koopman operator of nonlinear systems governed by partial differential equations in a mesh-agnostic way. Moreover, we leverage auto-encoding so that explicit encoding, and hence full-state measurement, is not required. The effectiveness of the framework is validated on several canonical PDE datasets with a diverse set of initial conditions, ranging from the 2D wave equation to the incompressible Navier-Stokes equations. Our framework generalizes better than state-of-the-art methods on linear PDEs, while being slightly inferior on nonlinear PDEs. Overall, it trains much faster thanks to its simple linear latent representation.
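
To make the framework described in the abstract concrete, the following is a minimal sketch, assuming a PyTorch implementation; the module names, latent dimension, and toy usage are illustrative assumptions, not the thesis's actual code. It shows the two ingredients the abstract names: a coordinate-based implicit neural representation that decodes a latent code at arbitrary spatial query points (mesh-agnostic), and a linear, Koopman-style operator that advances that latent code in time. The step of fitting latent codes to observed snapshots (the auto-encoding) is omitted.

import torch
import torch.nn as nn

class ImplicitDecoder(nn.Module):
    """Coordinate-based MLP: maps (spatial coordinate, latent code) -> field value."""
    def __init__(self, coord_dim=2, latent_dim=32, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(coord_dim + latent_dim, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords, z):
        # coords: (N, coord_dim); z: (latent_dim,), broadcast to every query point
        z_rep = z.unsqueeze(0).expand(coords.shape[0], -1)
        return self.net(torch.cat([coords, z_rep], dim=-1))

class LinearLatentDynamics(nn.Module):
    """Koopman-style linear evolution of the latent code: z_{t+1} = A z_t."""
    def __init__(self, latent_dim=32):
        super().__init__()
        self.A = nn.Linear(latent_dim, latent_dim, bias=False)

    def forward(self, z, steps=1):
        for _ in range(steps):
            z = self.A(z)
        return z

# Toy usage: roll the latent state forward, then decode at arbitrary query points.
decoder = ImplicitDecoder()
dynamics = LinearLatentDynamics()
z0 = torch.zeros(32)            # latent code for an initial condition
coords = torch.rand(1000, 2)    # arbitrary spatial query points (no fixed mesh)
z5 = dynamics(z0, steps=5)      # advance 5 steps with the linear operator
u5 = decoder(coords, z5)        # predicted field values at those points

Because the latent dynamics reduce to a single learned matrix, long-horizon rollouts amount to repeated matrix-vector products, which is consistent with the abstract's observation that the simple linear representation makes training fast.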
Description
December 2023
Full Citation
Publisher
Rensselaer Polytechnic Institute, Troy, NY
Terms of Use