Description
According to recent literature, the L-BFGS-B optimizer is very effective for training PINNs, outperforming stochastic optimizers (Adam, etc.): L-BFGS-B can be used for a second-stage fine optimization, while Adam is used in a first-stage rough optimization to avoid falling into a local minimum. Having L-BFGS-B available as a Keras optimizer, like Adam, would therefore be very valuable. Currently there are impractical workarounds (e.g. this one) based on the TensorFlow Probability BFGS minimizer; I tried it myself, but found it rather inflexible. So, is there any plan to make the L-BFGS-B optimizer available? This request has already been raised in several TensorFlow issues, e.g. this one or this one. Given the growing interest in PINNs, it would be great to have this feature.
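To make the requested two-stage workflow concrete, here is a minimal sketch of the Adam-then-L-BFGS-B pattern on a toy least-squares problem (a stand-in for a real PINN loss). It uses plain gradient descent as a stand-in for Adam and SciPy's `L-BFGS-B` for the refinement stage; the problem, variable names, and step sizes are illustrative assumptions, not part of any Keras API.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem standing in for a PINN loss: fit y = a*x + b by least squares.
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0

def loss_and_grad(theta):
    """Return the loss and its gradient, as L-BFGS-B expects with jac=True."""
    a, b = theta
    r = a * x + b - y                      # residuals
    loss = 0.5 * np.mean(r ** 2)
    grad = np.array([np.mean(r * x), np.mean(r)])
    return loss, grad

# Stage 1: rough optimization with a first-order method
# (plain gradient descent here, standing in for Adam).
theta = np.zeros(2)
for _ in range(200):
    _, g = loss_and_grad(theta)
    theta -= 0.5 * g

# Stage 2: L-BFGS-B fine-tuning, started from the stage-1 iterate.
res = minimize(loss_and_grad, theta, jac=True, method="L-BFGS-B")
print(res.x)  # close to [2.0, 1.0]
```

The awkward part for Keras models is exactly what the workarounds have to do by hand: flatten all trainable weights into one vector for `initial_position`, and wrap the model's loss/gradient computation into a single `value_and_gradients` function. A native Keras optimizer would hide that plumbing.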