jijzepttools.blackbox_optimization.fm_config#

Configuration classes for Factorization Machine optimizers.

Classes#

AdamConfig

Adam optimizer configuration for FM training.

LBFGSConfig

L-BFGS optimizer configuration for FM training.

Module Contents#

class AdamConfig#

Adam optimizer configuration for FM training.

n_epochs#

Number of training epochs.

Type: int, default=1000

batch_size#

Batch size for mini-batch training.

Type: int, default=32

lr#

Learning rate.

Type: float, default=0.001

weight_decay#

Weight decay (L2 regularization).

Type: float, default=0.0

early_stopping#

Whether to use early stopping.

Type: bool, default=True

es_min_epochs#

Minimum number of epochs before early stopping can trigger.

Type: int, default=10

es_patience#

Patience for early stopping (number of epochs without improvement).

Type: int, default=10

es_threshold#

Relative loss-improvement threshold for early stopping.

Type: float, default=5e-4

es_verbose#

Whether to print early stopping message.

Type: bool, default=False

n_epochs: int = 1000#
batch_size: int = 32#
lr: float = 0.001#
weight_decay: float = 0.0#
early_stopping: bool = True#
es_min_epochs: int = 10#
es_patience: int = 10#
es_threshold: float = 0.0005#
es_verbose: bool = False#
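
The fields above map directly to keyword arguments of the config class. Below is a minimal usage sketch; the import path mirrors this module's name, and the early-stopping rule shown is an illustration inferred from the attribute docs above, not the library's actual implementation.

```python
# Construct a config with a few defaults overridden. The import path
# is assumed from the module name documented on this page.
from jijzepttools.blackbox_optimization.fm_config import AdamConfig

config = AdamConfig(
    n_epochs=2000,       # train longer than the default 1000 epochs
    batch_size=64,
    lr=1e-3,
    weight_decay=1e-5,   # small L2 penalty
    es_min_epochs=50,    # give training time before stopping is allowed
    es_patience=20,
)

# Illustrative early-stopping check (assumed semantics, not the
# library's code): stop once `es_patience` consecutive epochs fail to
# improve the loss by a relative factor of at least `es_threshold`,
# but never before `es_min_epochs` epochs have run.
def should_stop(epoch: int, best_loss: float, loss: float,
                stale_epochs: int, cfg: AdamConfig) -> tuple[bool, int]:
    improved = loss < best_loss * (1.0 - cfg.es_threshold)
    stale_epochs = 0 if improved else stale_epochs + 1
    stop = (cfg.early_stopping
            and epoch >= cfg.es_min_epochs
            and stale_epochs >= cfg.es_patience)
    return stop, stale_epochs
```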
class LBFGSConfig#

L-BFGS optimizer configuration for FM training.

L-BFGS is a quasi-Newton method that uses full-batch optimization. It typically converges faster than Adam for small to medium datasets.

max_iter#

Maximum number of iterations per optimization step.

Type: int, default=20

lr#

Learning rate (step size). For L-BFGS, 1.0 is often a good choice.

Type: float, default=1.0

history_size#

Number of previous iterations to store for approximating the inverse Hessian.

Type: int, default=10

line_search_fn#

Line search function. "strong_wolfe" is recommended for stability. Set to None to disable line search.

Type: str or None, default="strong_wolfe"

max_iter: int = 20#
lr: float = 1.0#
history_size: int = 10#
line_search_fn: str | None = 'strong_wolfe'#
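
These fields correspond one-to-one to keyword arguments of torch.optim.LBFGS, which uses the same names. The sketch below shows that mapping on a stand-in model; whether the library forwards the fields exactly this way is an assumption.

```python
# Minimal sketch: drive one full-batch L-BFGS step from the config.
# The linear model and random data are stand-ins for an FM model.
import torch
from jijzepttools.blackbox_optimization.fm_config import LBFGSConfig

config = LBFGSConfig()  # max_iter=20, lr=1.0, history_size=10, strong_wolfe

model = torch.nn.Linear(8, 1)
optimizer = torch.optim.LBFGS(
    model.parameters(),
    lr=config.lr,
    max_iter=config.max_iter,
    history_size=config.history_size,
    line_search_fn=config.line_search_fn,  # "strong_wolfe" or None
)

x, y = torch.randn(32, 8), torch.randn(32, 1)

def closure():
    # L-BFGS may re-evaluate the objective several times per step,
    # so it requires a closure that zeroes gradients and returns the loss.
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(model(x), y)
    loss.backward()
    return loss

optimizer.step(closure)  # one quasi-Newton step over the full batch
```

With line_search_fn="strong_wolfe", the line search chooses the actual step length, which is why lr=1.0 works well as a default.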