Please use this identifier to cite or link to this item: http://hdl.handle.net/123456789/3374
Full metadata record
DC Field | Value | Language
dc.contributor.author | Sinha, Sudeshna | -
dc.date.accessioned | 2020-12-26T05:30:59Z | -
dc.date.available | 2020-12-26T05:30:59Z | -
dc.date.issued | 2020 | -
dc.identifier.citation | Chaos, Solitons and Fractals: X, 5, 100046 | en_US
dc.identifier.other | https://doi.org/10.1016/j.csfx.2020.100046 | -
dc.identifier.uri | https://www.sciencedirect.com/science/article/pii/S2590054420300270?via%3Dihub | -
dc.identifier.uri | http://hdl.handle.net/123456789/3374 | -
dc.description | Only IISERM authors are available in the record. | -
dc.description.abstract | We quantify how incorporating physics into neural network design can significantly improve the learning and forecasting of dynamical systems, even nonlinear systems of many dimensions. We train conventional and Hamiltonian neural networks on increasingly difficult dynamical systems and compute their forecasting errors as the number of training data and number of system dimensions vary. A map-building perspective elucidates the superiority of Hamiltonian neural networks. The results clarify the critical relation among data, dimension, and neural network learning performance. | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier | en_US
dc.subject | Machine learning | en_US
dc.subject | Neural networks | en_US
dc.subject | Hamiltonian dynamics | en_US
dc.subject | High dimensions | en_US
dc.title | The scaling of physics-informed machine learning with data and dimensions | en_US
dc.type | Article | en_US
Appears in Collections: Research Articles

Files in This Item:
File | Description | Size | Format
Need to add pdf.odt | - | 8.63 kB | OpenDocument Text


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
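The abstract above contrasts conventional and Hamiltonian neural networks for forecasting dynamical systems. As a minimal illustrative sketch only (not code from the cited article; the model class, layer sizes, and training loop below are assumptions for demonstration), a Hamiltonian neural network learns a scalar H(q, p) from data and recovers the dynamics through Hamilton's equations dq/dt = ∂H/∂p, dp/dt = -∂H/∂q:

```python
# Illustrative sketch only: a minimal Hamiltonian neural network in PyTorch.
# All names and hyperparameters here are assumptions, not taken from the article.
import torch
import torch.nn as nn

class HamiltonianNN(nn.Module):
    """Learns a scalar Hamiltonian H(q, p); dynamics follow Hamilton's equations."""
    def __init__(self, dim=1, hidden=200):
        super().__init__()
        # Input is the phase-space point (q, p); output is the scalar H.
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)

    def time_derivative(self, x):
        # x = (q, p); compute (dq/dt, dp/dt) = (dH/dp, -dH/dq) via autograd.
        x = x.requires_grad_(True)
        H = self.forward(x).sum()
        dH = torch.autograd.grad(H, x, create_graph=True)[0]
        dHdq, dHdp = dH.chunk(2, dim=-1)
        return torch.cat([dHdp, -dHdq], dim=-1)

def train(model, x, dxdt, steps=2000, lr=1e-3):
    # Fit the predicted phase-space velocities to the observed ones.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((model.time_derivative(x) - dxdt) ** 2).mean()
        loss.backward()
        opt.step()
    return model
```

Forecasts would then be produced by numerically integrating model.time_derivative from an initial condition (for example with a fixed-step Runge-Kutta integrator); a conventional baseline network would instead regress the velocities directly, without the learned Hamiltonian constraint.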