This module contains the Regressor object, which optimizes the parameters of the machine-learning model.
Regressor(optimizer='BFGS', optimizer_kwargs=None, lossprime=True)
Class to manage the regression of a generic model. That is, for a given parameter set, it calculates the cost function (the difference between predicted and actual energies across the training images), then decides how to adjust the parameters to reduce this cost function. Global optimization conditioners (e.g., simulated annealing) can be built into this class.
- optimizer (str) – The optimizer to use. Several built-in choices are available, including ‘L-BFGS-B’, ‘BFGS’, ‘TNC’, and ‘NCG’. Alternatively, any function that behaves like scipy.optimize.fmin_bfgs can be supplied.
- optimizer_kwargs (dict) – Optional keywords for the corresponding optimizer.
- lossprime (boolean) – Whether the optimizer should also be supplied with the gradient of the loss function, in addition to the loss function itself.
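As a hedged illustration of the calling convention a user-supplied optimizer is expected to follow (modeled on scipy.optimize.fmin_bfgs: a loss function, an initial parameter vector, and, when lossprime is true, a gradient function), here is a toy gradient-descent stand-in. The function and loss below are hypothetical examples, not part of the package:

```python
def toy_gradient_descent(f, x0, fprime=None, maxiter=200, stepsize=0.1):
    """Minimal stand-in for a scipy.optimize.fmin_bfgs-style optimizer.

    Accepts the loss function f, starting parameters x0, and (when
    lossprime=True would apply) the gradient fprime.
    """
    x = list(x0)
    for _ in range(maxiter):
        grad = fprime(x)
        # Plain gradient-descent update on each parameter.
        x = [xi - stepsize * gi for xi, gi in zip(x, grad)]
    return x

# Toy quadratic loss with its minimum at x = 3 (purely illustrative).
loss = lambda x: (x[0] - 3.0) ** 2
dloss = lambda x: [2.0 * (x[0] - 3.0)]

result = toy_gradient_descent(loss, [0.0], fprime=dloss)
```

Any callable with this shape could in principle be passed as the optimizer, with optimizer_kwargs supplying extra keywords such as maxiter or stepsize.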
Performs the regression. Calls model.get_loss, which should return the current value of the loss function until convergence has been reached, at which point it should raise an amp.utilities.ConvergenceException.
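The exception-driven convergence signal described above can be sketched as follows. The ConvergenceException, ToyModel, and regress function here are simplified stand-ins written for illustration, not the package's actual classes:

```python
class ConvergenceException(Exception):
    """Raised by the model when the loss drops below tolerance."""


class ToyModel:
    """Stand-in model whose get_loss signals convergence by raising."""

    def __init__(self, tolerance=1e-6):
        self.tolerance = tolerance

    def get_loss(self, params):
        loss = (params[0] - 3.0) ** 2  # toy quadratic loss, minimum at 3
        if loss < self.tolerance:
            raise ConvergenceException  # tells the regressor to stop
        return loss


def regress(model, params, stepsize=0.1, maxiter=1000):
    """Call model.get_loss repeatedly until ConvergenceException is raised."""
    try:
        for _ in range(maxiter):
            model.get_loss(params)
            # Toy gradient-descent step for the known quadratic loss.
            params[0] -= stepsize * 2.0 * (params[0] - 3.0)
        return False  # no convergence within maxiter
    except ConvergenceException:
        return True  # model signaled convergence


converged = regress(ToyModel(), [0.0])
```

The design keeps the optimizer loop oblivious to the convergence criterion: the model decides when it is satisfied and interrupts the loop by raising, rather than returning a status flag on every call.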