Gymbo
gymbo::GDOptimizer Struct Reference

Gradient Descent Optimizer for Symbolic Path Constraints.

#include <gd.h>

Public Member Functions

 GDOptimizer (int num_epochs=100, float lr=1.0, float eps=1.0, float param_low=-10.0, float param_high=10.0, bool sign_grad=true, bool init_param_uniform_int=true, int seed=42)
 Constructor for GDOptimizer.

bool eval (std::vector< Sym > &path_constraints, std::unordered_map< int, float > params)
 Evaluate whether the path constraints are satisfied for the given parameters.

bool solve (std::vector< Sym > &path_constraints, std::unordered_map< int, float > &params, bool is_init_params_const=true)
 Solve the path constraints using gradient-descent optimization.


Public Attributes

int num_epochs
 Maximum number of optimization epochs.

float lr
 Learning rate for gradient descent.

float eps
 The smallest positive value of the target type.

float param_low
 Lower bound for parameter initialization.

float param_high
 Upper bound for parameter initialization.

bool sign_grad
 If true, use sign gradient descent; otherwise, use standard gradient descent.

bool init_param_uniform_int
 If true, draw initial parameter values from a uniform integer distribution; otherwise, from a uniform real distribution.

bool contain_randomized_vars
 If true, use aeval and agrad; otherwise, use eval and grad.

int seed
 Random seed for initializing parameter values.

int num_used_itr
 Number of iterations used during optimization.


Detailed Description

Gradient Descent Optimizer for Symbolic Path Constraints.

The GDOptimizer class provides functionality to optimize symbolic path constraints by using gradient descent. It aims to find parameter values that satisfy the given path constraints, making them true or non-positive.

Note
This class assumes that symbolic expressions are represented by the Sym type, and path constraints are a vector of Sym objects.
The optimization process relies on the gradient information of the path constraints. The optimizer iteratively updates the parameters until the constraints are satisfied or a maximum number of epochs is reached.

Constructor & Destructor Documentation

◆ GDOptimizer()

gymbo::GDOptimizer::GDOptimizer ( int  num_epochs = 100,
float  lr = 1.0,
float  eps = 1.0,
float  param_low = -10.0,
float  param_high = 10.0,
bool  sign_grad = true,
bool  init_param_uniform_int = true,
int  seed = 42 
)
inline

Constructor for GDOptimizer.

Parameters
    num_epochs    Maximum number of optimization epochs (default: 100).
    lr    Learning rate for gradient descent (default: 1.0).
    eps    The smallest positive value of the target type (default: 1.0).
    param_low    Lower bound for parameter initialization (default: -10.0).
    param_high    Upper bound for parameter initialization (default: 10.0).
    sign_grad    If true, use sign gradient descent; otherwise, use standard gradient descent (default: true).
    init_param_uniform_int    If true, draw initial parameter values from a uniform integer distribution; otherwise, from a uniform real distribution (default: true).
    seed    Random seed for initializing parameter values (default: 42).

Member Function Documentation

◆ eval()

bool gymbo::GDOptimizer::eval ( std::vector< Sym > &  path_constraints,
std::unordered_map< int, float >  params 
)
inline

Evaluate if path constraints are satisfied for given parameters.

This function checks if the given path constraints are satisfied (non-positive) for the provided parameter values.

Parameters
    path_constraints    Vector of symbolic path constraints.
    params    Map of parameter values.
Returns
    true if all constraints are satisfied; otherwise, false.

◆ solve()

bool gymbo::GDOptimizer::solve ( std::vector< Sym > &  path_constraints,
std::unordered_map< int, float > &  params,
bool  is_init_params_const = true 
)
inline

Solve path constraints using gradient descent optimization.

This function attempts to find parameter values that satisfy the given path constraints by using gradient descent optimization. It iteratively updates the parameters until the constraints are satisfied or the maximum number of epochs is reached.

Parameters
    path_constraints    Vector of symbolic path constraints.
    params    Map of parameter values (modified in place during optimization).
    is_init_params_const    Flag indicating whether the initial parameter values are constant.
Returns
    true if the constraints are satisfied after optimization; otherwise, false.

Member Data Documentation

◆ contain_randomized_vars

bool gymbo::GDOptimizer::contain_randomized_vars

If true, use aeval and agrad (the evaluation and gradient variants for randomized variables); otherwise, use eval and grad.

◆ eps

float gymbo::GDOptimizer::eps

The smallest positive value of the target type.

◆ init_param_uniform_int

bool gymbo::GDOptimizer::init_param_uniform_int

If true, initial parameter values are drawn from a uniform integer distribution; otherwise, from a uniform real distribution (default: true).

◆ lr

float gymbo::GDOptimizer::lr

Learning rate for gradient descent.

◆ num_epochs

int gymbo::GDOptimizer::num_epochs

Maximum number of optimization epochs.

◆ num_used_itr

int gymbo::GDOptimizer::num_used_itr

Number of iterations used during optimization.

◆ param_high

float gymbo::GDOptimizer::param_high

Upper bound for parameter initialization.

◆ param_low

float gymbo::GDOptimizer::param_low

Lower bound for parameter initialization.

◆ seed

int gymbo::GDOptimizer::seed

Random seed for initializing parameter values.

◆ sign_grad

bool gymbo::GDOptimizer::sign_grad

If true, use sign gradient descent; otherwise, use standard gradient descent (default: true).


The documentation for this struct was generated from the following file: