{
  "id": "2410.14134",
  "version": "v1",
  "published": "2024-10-18T03:02:19.000Z",
  "updated": "2024-10-18T03:02:19.000Z",
  "title": "Fine-Tuning DeepONets to Enhance Physics-informed Neural Networks for solving Partial Differential Equations",
  "authors": [ "Sidi Wu" ],
  "comment": "24 pages, 8 figures, 6 tables",
  "categories": [ "math.NA", "cs.NA" ],
  "abstract": "Physics-Informed Neural Networks (PINNs) have emerged as powerful tools for solving partial differential equations (PDEs). However, training PINNs from scratch is often computationally intensive and time-consuming. To address this problem, we propose a parameter-efficient approach that fine-tunes pre-trained DeepONet models within the PINN framework (FTO-PINN), enabling more efficient meshless PDE solving. Specifically, we freeze the weights of the pre-trained DeepONet model and fine-tune the output of the branch net by incorporating a small number of new trainable parameters, which can be quickly determined using least-squares techniques. Additionally, we introduce trunk net expansions and low-rank adaptation strategies to further enhance the performance of FTO-PINN. The effectiveness of our proposed method is demonstrated through a series of numerical experiments across various types of PDEs. FTO-PINN significantly reduces the training time of vanilla PINNs while maintaining comparable accuracy, and outperforms DeepONet, which is pre-trained on general function data, in both fidelity and generalization capabilities.",
  "revisions": [
    {
      "version": "v1",
      "updated": "2024-10-18T03:02:19.000Z"
    }
  ],
  "analyses": {
    "keywords": [
      "solving partial differential equations",
      "enhance physics-informed neural networks",
      "fine-tuning deeponets",
      "pre-trained deeponet model",
      "low-rank adaptation strategies"
    ],
    "note": {
      "typesetting": "TeX",
      "pages": 24,
      "language": "en",
      "license": "arXiv",
      "status": "editable"
    }
  }
}