# Optimization Methods in SINDBAD
This documentation provides a comprehensive overview of optimization methods in SINDBAD, including available methods, configuration settings, and how to implement new ones.
## Overview
SINDBAD uses a type-based dispatch system for optimization methods, allowing for flexible and extensible optimization approaches. The optimization process is configured through JSON files and can be customized for different experiments.
## Configuration
Optimization settings are defined in the `optimization.json` file:
```json
{
  "algorithm_optimization": "opti_algorithms/CMAEvolutionStrategy_CMAES.json",
  "model_parameters_to_optimize": {
    "autoRespiration,RMN": null,
    "gppAirT,opt_airT": null
  },
  "multi_constraint_method": "metric_sum",
  "observational_constraints": [
    "gpp",
    "nee",
    "reco"
  ],
  "observations": {
    "default_cost": {
      "cost_metric": "NSE_inv",
      "cost_weight": 1.0
    }
  }
}
```
Key components:
- `algorithm_optimization`: Path to the optimization algorithm configuration file, or a direct algorithm name
- `model_parameters_to_optimize`: Parameters to be optimized
- `multi_constraint_method`: Method for combining multiple constraints
- `observational_constraints`: Variables to be used as constraints
- `observations`: Cost metric and weight settings
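For orientation, the settings file can be inspected like any other JSON document; a minimal sketch using the JSON.jl parser (the file path follows the example above, not a fixed SINDBAD location):

```julia
using JSON  # JSON.jl

# Parse the experiment's optimization settings into a Dict
settings = JSON.parsefile("optimization.json")

settings["observational_constraints"]  # ["gpp", "nee", "reco"]
settings["multi_constraint_method"]    # "metric_sum"
```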
## Algorithm Optimization Configuration
The `algorithm_optimization` field can be specified in two ways:
### Path to a JSON File
"algorithm_optimization": "opti_algorithms/CMAEvolutionStrategy_CMAES.json"
This points to a JSON file containing the algorithm configuration, which should include:
```json
{
  "algorithm": "CMAEvolutionStrategy_CMAES",
  "parameters": {
    "max_iterations": 1000,
    "tolerance": 1e-6,
    "population_size": 50
  }
}
```
### Direct Algorithm Name
"algorithm_optimization": "CMAEvolutionStrategy_CMAES"
When specified as a plain string, the algorithm is run with its default parameters.
::: info
Using a JSON file for `algorithm_optimization` allows for:
- Custom parameter tuning
- Different configurations for different experiments
- Easy switching between algorithm settings
:::
## Available Optimization Methods
::: tip
To list all available optimization methods and their purposes, use:

```julia
using Sindbad
showMethodsOf(OptimizationMethod)
```

This will display a formatted list of all optimization methods and their descriptions.
:::
::: tip
To get the default options for any optimization method, use `sindbadDefaultOptions`:

```julia
# Get default options for CMA-ES
opts = sindbadDefaultOptions(CMAEvolutionStrategyCMAES())
# Returns: (maxfevals = 50,)

# Get default options for the Morris method
opts = sindbadDefaultOptions(GSAMorris())
# Returns: (total_num_trajectory = 200, num_trajectory = 15, len_design_mat = 10)

# Get default options for the Sobol method
opts = sindbadDefaultOptions(GSASobol())
# Returns: (samples = 5, method_options = (order = [0, 1],), sampler = "Sobol", sampler_options = ())
```

These default options can be used as a starting point for customizing optimization parameters in your configuration files.
:::
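Since the defaults come back as a NamedTuple, individual entries can be overridden with `merge`; a minimal sketch (the override value here is arbitrary):

```julia
# Start from the method's defaults and override a single option;
# later keys win in `merge`
defaults = sindbadDefaultOptions(CMAEvolutionStrategyCMAES())  # (maxfevals = 50,)
opts = merge(defaults, (maxfevals = 5000,))
```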
Current methods include:
### Bayesian Optimization
- `BayesOptKMaternARD5`: Bayesian optimization using a Matern 5/2 kernel with Automatic Relevance Determination, from BayesOpt.jl
### Evolution Strategies
- `CMAEvolutionStrategyCMAES`: Covariance Matrix Adaptation Evolution Strategy (CMA-ES) from CMAEvolutionStrategy.jl
- `EvolutionaryCMAES`: CMA-ES implementation from Evolutionary.jl
### Gradient-based Methods
- `OptimLBFGS`: Limited-memory BFGS method from Optim.jl
- `OptimBFGS`: BFGS method from Optim.jl
- `OptimizationBFGS`: BFGS method from Optimization.jl
- `OptimizationFminboxGradientDescent`: Fminbox gradient descent method from Optimization.jl
- `OptimizationFminboxGradientDescentFD`: Fminbox gradient descent with forward differentiation from Optimization.jl
### Black Box Optimization
- `OptimizationBBOadaptive`: Black box optimization (adaptive) method from Optimization.jl
- `OptimizationBBOxnes`: Black box optimization (xNES) method from Optimization.jl
### Other Methods
- `OptimizationGCMAESDef`: GCMAES method from Optimization.jl
- `OptimizationGCMAESFD`: GCMAES method with forward differentiation from Optimization.jl
- `OptimizationMultistartOptimization`: Multistart optimization method from Optimization.jl
- `OptimizationNelderMead`: Nelder-Mead method from Optimization.jl
- `OptimizationQuadDirect`: QuadDIRECT method from Optimization.jl
## Adding a New Optimization Method
### 1. Define the New Optimization Method Type
In `src/Types/OptimizationTypes.jl`, add a new struct that subtypes `OptimizationMethod`:
```julia
import SindbadUtils: purpose

# Define the new optimization type
struct YourNewOptimizationMethod <: OptimizationMethod end

# Define its purpose
purpose(::Type{YourNewOptimizationMethod}) = "Description of what YourNewOptimizationMethod does"
```
::: info
When naming new optimization types that use external packages, follow the convention `PackageNameMethodName`. For example:
- `CMAEvolutionStrategyCMAES` for the CMA-ES method from CMAEvolutionStrategy.jl
- `OptimizationBFGS` for the BFGS method from Optimization.jl
- `BayesOptKMaternARD5` for the Matern 5/2 kernel method from BayesOpt.jl

This convention helps identify both the package and the specific method being used.
:::
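Once the type and its `purpose` are defined, the description should show up in the listing produced by `showMethodsOf`; a quick check at the REPL:

```julia
purpose(YourNewOptimizationMethod)
# "Description of what YourNewOptimizationMethod does"
```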
### 2. Set Default Options
In `defaultOptions.jl`, add default options for your new optimization method:
```julia
# Add default options for your new method
sindbadDefaultOptions(::YourNewOptimizationMethod) = (
    max_iterations = 1000,
    tolerance = 1e-6,
    population_size = 50,
    # Add other default parameters specific to your method
)
```
::: tip
When setting default options:
- Choose reasonable default values that work well for most cases
- Include all essential parameters needed by the optimization method
- Use descriptive parameter names that match the underlying package's terminology
- Consider adding parameters for:
  - Convergence criteria (e.g., `max_iterations`, `tolerance`)
  - Population/ensemble settings (e.g., `population_size`)
  - Algorithm-specific parameters
  - Performance tuning options

Also make sure to:
- Test the default options with different problem sizes
- Consider adding validation for parameter values in your implementation
- Keep the default options simple but flexible enough for common use cases
:::
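As a sketch of the validation point above (the helper name and the specific checks are hypothetical, not part of SINDBAD):

```julia
# Hypothetical helper: fail fast on nonsensical option values
function checkAlgoOptions(opts)
    opts.max_iterations > 0 || throw(ArgumentError("max_iterations must be positive"))
    opts.tolerance > 0 || throw(ArgumentError("tolerance must be positive"))
    opts.population_size >= 2 || throw(ArgumentError("population_size must be at least 2"))
    return opts
end

checkAlgoOptions(sindbadDefaultOptions(YourNewOptimizationMethod()))
```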
### 3. Implement the Optimization Function
In `optimizer.jl`, implement your optimization function with the following signature:
```julia
function optimizer(cost_function, default_values, lower_bounds, upper_bounds, algo_options, ::YourNewOptimizationMethod)
    # Your implementation here
end
```
The function should:
- Set up the optimization problem
- Configure the algorithm parameters
- Run the optimization
- Return the results
Example implementation structure:

```julia
function optimizer(cost_function, default_values, lower_bounds, upper_bounds, algo_options, ::YourNewOptimizationMethod)
    # Set up the optimization problem
    problem = OptimizationProblem(
        cost_function,
        default_values,
        lower_bounds,
        upper_bounds
    )

    # Configure the algorithm
    algorithm = YourAlgorithm(; algo_options...)

    # Run the optimization
    result = optimize(problem, algorithm)

    return result
end
```
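With the method wired in, it can be exercised on a toy problem. This sketch assumes the returned value is (or exposes) the optimized parameter vector; the exact result type depends on your implementation:

```julia
# Toy quadratic cost with a known minimum at 0.5 for every parameter
toy_cost(x) = sum(abs2, x .- 0.5)

result = optimizer(
    toy_cost,
    [0.0, 0.0],               # default_values
    [-1.0, -1.0],             # lower_bounds
    [1.0, 1.0],               # upper_bounds
    (max_iterations = 200,),  # algo_options
    YourNewOptimizationMethod(),
)
```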
### 4. Update Algorithm Configuration (if needed)
If your method requires special configuration, update the algorithm configuration file:
```json
{
  "algorithm": "your_new_method",
  "parameters": {
    "max_iterations": 1000,
    "tolerance": 1e-6,
    "population_size": 50
  }
}
```
## Important Considerations
- **Parameter Handling**
  - Ensure proper handling of parameter bounds
  - Implement appropriate scaling if needed (see the sketch after this list)
  - Consider parameter constraints
- **Performance**
  - Optimize for large parameter sets
  - Consider parallelization opportunities
  - Implement efficient memory management
- **Convergence**
  - Set appropriate stopping criteria
  - Handle numerical stability
  - Implement error handling
- **Documentation**
  - Add comprehensive docstrings
  - Include usage examples
  - Document any special requirements
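For the scaling point, a common approach is to let the optimizer work in a normalized space and map candidate values back to physical bounds when evaluating the cost. A minimal, self-contained sketch (helper names are illustrative):

```julia
# Map parameter vectors between physical bounds and the unit cube
scaleToUnit(x, lo, hi) = (x .- lo) ./ (hi .- lo)
scaleFromUnit(u, lo, hi) = lo .+ u .* (hi .- lo)

lo, hi = [0.0, 10.0], [1.0, 500.0]
u = scaleToUnit([0.2, 108.0], lo, hi)  # values in [0, 1]
x = scaleFromUnit(u, lo, hi)           # recovers [0.2, 108.0]
```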
## Testing
After implementing your new optimization method:
1. Test with small parameter sets
2. Verify convergence behavior
3. Check performance with larger parameter sets
4. Ensure compatibility with different cost functions
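A minimal convergence test along these lines, reusing the toy quadratic from above (how the optimum is extracted from the result is an assumption about your return type):

```julia
using Test

@testset "YourNewOptimizationMethod converges on a quadratic" begin
    toy_cost(x) = sum(abs2, x .- 0.5)
    result = optimizer(toy_cost, zeros(2), fill(-1.0, 2), fill(1.0, 2),
        (max_iterations = 500,), YourNewOptimizationMethod())
    # Assumes `result` is the optimized parameter vector
    @test all(abs.(result .- 0.5) .< 1e-2)
end
```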
## Best Practices
- **Algorithm Selection**
  - Choose an algorithm appropriate for the problem type
  - Consider problem dimensionality
  - Account for computational resources
- **Parameter Configuration**
  - Set reasonable bounds
  - Configure appropriate stopping criteria
  - Adjust population sizes if needed
- **Performance Optimization**
  - Implement efficient data structures
  - Consider parallelization
  - Optimize memory usage
- **Error Handling**
  - Handle numerical instabilities
  - Implement appropriate fallbacks
  - Provide informative error messages