Title: Metaheuristic and Gradient-Based Optimization for Neural Network Training and Continuous Problems
Version: 0.1.0
Description: Provides tools for general-purpose continuous optimization and feed-forward artificial neural network training using metaheuristic and gradient-based optimization algorithms. The package supports benchmark function optimization, regression, binary classification, and multi-class classification with multilayer perceptrons. The package implements several optimization methods, including particle swarm optimization (Kennedy and Eberhart, 1995) <doi:10.1109/ICNN.1995.488968>, differential evolution (Storn and Price, 1997) <doi:10.1023/A:1008202821328>, the grey wolf optimizer (Mirjalili et al., 2014) <doi:10.1016/j.advengsoft.2013.12.007>, secretary bird optimization (Fu et al., 2024) <doi:10.1007/s10462-024-10729-y>, and Adam (Kingma and Ba, 2015) <doi:10.48550/arXiv.1412.6980>.
License: MIT + file LICENSE
Encoding: UTF-8
RoxygenNote: 7.3.3
URL: https://github.com/burakdilber/metANN
BugReports: https://github.com/burakdilber/metANN/issues
NeedsCompilation: no
Packaged: 2026-05-11 19:24:56 UTC; hp
Author: Burak Dilber [aut, cre, cph], A. Fırat Özdemir [aut, ths]
Maintainer: Burak Dilber <burakdilber91@gmail.com>
Repository: CRAN
Date/Publication: 2026-05-15 20:30:07 UTC
Artificial Bee Colony Engine
Description
Internal implementation of the Artificial Bee Colony algorithm for continuous optimization.
Usage
abc_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn: Objective function.
lower: Numeric vector of lower bounds.
upper: Numeric vector of upper bounds.
optimizer: An ABC optimizer object.
verbose: Logical. If TRUE, progress information is printed.
...: Additional arguments passed to fn.
Value
An object of class "met_optimize_result".
Derivative of Activation Functions
Description
Internal helper for computing activation derivatives during backpropagation.
Usage
activation_derivative(activation, z, a)
Arguments
activation: A metANN activation object.
z: Pre-activation values.
a: Activation values.
Value
A numeric matrix with the same dimensions as z.
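For the common activations these derivatives have closed forms in terms of either the pre-activation z or the activation value a, which is why the helper receives both. A stand-alone sketch of the standard rules (illustrative only; the helper and function names here are not the package's internal code):

```r
# sigmoid'(z) = a * (1 - a), where a = sigmoid(z)
d_sigmoid <- function(a) a * (1 - a)
# tanh'(z) = 1 - a^2, where a = tanh(z)
d_tanh <- function(a) 1 - a^2
# relu'(z) = 1 if z > 0, else 0 (defined here as 0 at z = 0)
d_relu <- function(z) as.numeric(z > 0)

d_relu(c(-1, 0, 2))
```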
Leaky Rectified Linear Unit Activation Function
Description
Creates a leaky rectified linear unit activation function object.
Usage
activation_leaky_relu(alpha = 0.01)
Arguments
alpha: A non-negative numeric value controlling the slope for negative inputs.
Value
An object of class "met_activation".
Examples
act <- activation_leaky_relu(alpha = 0.01)
act$fn(c(-1, 0, 1))
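The underlying mapping is f(z) = z for z > 0 and f(z) = alpha * z otherwise. A minimal base-R sketch of the same function (an illustration, not the package's internal implementation):

```r
# Leaky ReLU: identity for positive inputs, small slope alpha for negatives
leaky_relu <- function(z, alpha = 0.01) ifelse(z > 0, z, alpha * z)

leaky_relu(c(-1, 0, 1))  # -0.01  0.00  1.00
```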
Linear Activation Function
Description
Creates a linear activation function object.
Usage
activation_linear()
Value
An object of class "met_activation".
Examples
act <- activation_linear()
act$fn(c(-1, 0, 1))
Rectified Linear Unit Activation Function
Description
Creates a rectified linear unit activation function object.
Usage
activation_relu()
Value
An object of class "met_activation".
References
Nair, V., and Hinton, G. E. (2010). Rectified Linear Units Improve Restricted Boltzmann Machines. Proceedings of the 27th International Conference on Machine Learning, 807–814.
Examples
act <- activation_relu()
act$fn(c(-1, 0, 1))
Sigmoid Activation Function
Description
Creates a sigmoid activation function object.
Usage
activation_sigmoid()
Value
An object of class "met_activation".
Examples
act <- activation_sigmoid()
act$fn(c(-1, 0, 1))
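The sigmoid maps any real input into (0, 1) via sigma(z) = 1 / (1 + exp(-z)). A stand-alone sketch of that formula (illustrative, not the package's internal code):

```r
# Logistic sigmoid: squashes real values into (0, 1)
sigmoid <- function(z) 1 / (1 + exp(-z))

sigmoid(0)  # 0.5 at the origin
```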
Softmax Activation Function
Description
Creates a softmax activation function object.
Usage
activation_softmax()
Value
An object of class "met_activation".
References
Bridle, J. S. (1990). Probabilistic Interpretation of Feedforward Classification Network Outputs, with Relationships to Statistical Pattern Recognition. In Neurocomputing: Algorithms, Architectures and Applications, 227–236. Springer.
Examples
act <- activation_softmax()
act$fn(c(1, 2, 3))
Hyperbolic Tangent Activation Function
Description
Creates a hyperbolic tangent activation function object.
Usage
activation_tanh()
Value
An object of class "met_activation".
Examples
act <- activation_tanh()
act$fn(c(-1, 0, 1))
Convert Character Input to an Activation Object
Description
Converts a character string such as "relu" into the corresponding
activation function object.
Usage
as_activation(activation)
Arguments
activation: A character string or an object of class "met_activation".
Value
An object of class "met_activation".
Examples
as_activation("relu")
as_activation(activation_leaky_relu(alpha = 0.05))
Convert Character Input to a Loss Object
Description
Converts a character string such as "mse" into the corresponding loss
function object.
Usage
as_loss(loss)
Arguments
loss: A character string or an object of class "met_loss".
Value
An object of class "met_loss".
Examples
as_loss("mse")
as_loss(loss_huber(delta = 1.5))
Convert Character Input to a Metric Object
Description
Converts a character string such as "rmse" into the corresponding metric
function object.
Usage
as_metric(metric)
Arguments
metric: A character string or an object of class "met_metric".
Value
An object of class "met_metric".
Examples
as_metric("rmse")
as_metric(metric_accuracy())
Convert Multiple Inputs to Metric Objects
Description
Converts a character vector or a list of metric objects into a list of metric objects.
Usage
as_metrics(metrics)
Arguments
metrics: A character vector, a single metric object, or a list of metric objects.
Value
A list of objects of class "met_metric".
Examples
as_metrics(c("rmse", "mae", "r2"))
as_metrics(list(metric_accuracy(), metric_f1()))
Convert Character Input to an Optimizer Object
Description
Converts a character string such as "pso" into the corresponding optimizer
object.
Usage
as_optimizer(optimizer)
Arguments
optimizer: A character string or an object of class "met_optimizer".
Value
An object of class "met_optimizer".
Examples
as_optimizer("pso")
as_optimizer(optimizer_adam())
List Available Activation Functions
Description
Returns the names of activation functions currently available in the metANN package.
Usage
available_activations()
Value
A character vector of activation function names.
Examples
available_activations()
List Available Gradient-Based Optimizers
Description
Returns the names of gradient-based optimizer objects currently available in the metANN package.
Usage
available_gradient_optimizers()
Value
A character vector of gradient-based optimizer names.
Examples
available_gradient_optimizers()
List Available Loss Functions
Description
Returns the names of loss functions currently available in the metANN package.
Usage
available_losses()
Value
A character vector of loss function names.
Examples
available_losses()
List Available Metaheuristic Optimizers
Description
Returns the names of metaheuristic optimization algorithms currently available in the metANN package.
Usage
available_metaheuristics()
Value
A character vector of metaheuristic optimizer names.
Examples
available_metaheuristics()
List Available Performance Metrics
Description
Returns the names of performance metrics currently available in the metANN package.
Usage
available_metrics()
Value
A character vector of metric names.
Examples
available_metrics()
List Available Optimizers
Description
Returns the names of optimization algorithms currently available in the metANN package.
Usage
available_optimizers()
Value
A character vector of optimizer names.
Examples
available_optimizers()
Classification Loss Value
Description
Internal helper for computing binary or multi-class classification loss.
Usage
classification_loss_value(y_true, y_pred, classification_type, epsilon = 1e-15)
Arguments
y_true: Encoded true response.
y_pred: Predicted probabilities.
classification_type: Either "binary" or "multiclass".
epsilon: Small value used for numerical stability.
Value
A single numeric loss value.
Classification Metric Values
Description
Internal helper for computing basic classification metrics.
Usage
classification_metric_values(y_true, y_pred)
Arguments
y_true: True class labels.
y_pred: Predicted class labels.
Value
A named numeric vector.
Extract the Best Parameters from a metANN Optimization Result
Description
Extract the Best Parameters from a metANN Optimization Result
Usage
## S3 method for class 'met_optimize_result'
coef(object, ...)
Arguments
object: A metANN optimization result object.
...: Additional arguments, currently unused.
Value
A numeric vector containing the best solution found.
Extract Weights from a metANN Model
Description
Extract Weights from a metANN Model
Usage
## S3 method for class 'metann'
coef(object, ...)
Arguments
object: A fitted metANN model.
...: Additional arguments, currently unused.
Value
A numeric vector of fitted network weights.
Compute Metric Values
Description
Internal helper for evaluating a list of metric objects.
Usage
compute_metric_values(metrics, y_true, y_pred)
Arguments
metrics: A list of metric objects.
y_true: Observed values.
y_pred: Predicted values.
Value
A named numeric vector of metric values.
Count the Number of Trainable Parameters in an MLP Architecture
Description
Computes the total number of weights and bias terms required by a multilayer perceptron architecture.
Usage
count_parameters(architecture, input_dim = NULL)
Arguments
architecture: An object created by mlp_architecture().
input_dim: Optional positive integer specifying the number of input features. If NULL, the value stored in the architecture is used.
Value
A positive integer giving the total number of parameters.
Examples
arch <- mlp_architecture(
input_dim = 4,
layers = list(
dense_layer(5, activation = "relu"),
dense_layer(1, activation = "linear")
)
)
count_parameters(arch)
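A dense layer with a bias term contributes in_dim * units weights plus units biases, so the total can be checked by hand; for the architecture above this is (4 * 5 + 5) + (5 * 1 + 1) = 31. A sketch of that arithmetic (the helper name here is hypothetical, not part of the package):

```r
# Total parameters of an MLP with biases: sum over consecutive layer pairs
count_params_manual <- function(input_dim, units) {
  dims <- c(input_dim, units)
  sum(dims[-length(dims)] * dims[-1] + dims[-1])
}

count_params_manual(4, c(5, 1))  # (4*5 + 5) + (5*1 + 1) = 31
```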
Differential Evolution Engine
Description
Internal implementation of Differential Evolution using the rand/1/bin strategy.
Usage
de_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn: Objective function.
lower: Numeric vector of lower bounds.
upper: Numeric vector of upper bounds.
optimizer: A DE optimizer object.
verbose: Logical. If TRUE, progress information is printed.
...: Additional arguments passed to fn.
Value
An object of class "met_optimize_result".
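In the rand/1/bin strategy, each trial vector is built from three distinct randomly chosen population members plus binomial crossover with the target. A sketch of one trial-vector construction (illustrative of the named strategy only; selection and bound handling are omitted, and the engine's details may differ):

```r
# One DE rand/1/bin trial vector for target row i of population matrix `pop`
de_step <- function(pop, i, F = 0.8, CR = 0.9) {
  idx <- sample(setdiff(seq_len(nrow(pop)), i), 3)  # three distinct others
  mutant <- pop[idx[1], ] + F * (pop[idx[2], ] - pop[idx[3], ])
  d <- ncol(pop)
  cross <- runif(d) < CR
  cross[sample(d, 1)] <- TRUE  # guarantee at least one mutant component
  trial <- pop[i, ]
  trial[cross] <- mutant[cross]
  trial
}

set.seed(1)
pop <- matrix(runif(20, -5, 5), nrow = 5)
de_step(pop, 1)
```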
Decode Classification Predictions
Description
Internal helper for converting probabilities to class labels.
Usage
decode_classification_prediction(
probabilities,
class_levels,
classification_type,
threshold = 0.5
)
Arguments
probabilities: Numeric vector or matrix of predicted probabilities.
class_levels: Class labels.
classification_type: Either "binary" or "multiclass".
threshold: Classification threshold for binary classification.
Value
A factor of predicted class labels.
Decode an MLP Weight Vector
Description
Converts a numeric parameter vector into layer-wise weight matrices and bias vectors.
Usage
decode_weights(weights, architecture, input_dim = NULL)
Arguments
weights: A numeric vector of MLP parameters.
architecture: An object created by mlp_architecture().
input_dim: Optional positive integer specifying the number of input features. If NULL, the value stored in the architecture is used.
Value
A list containing layer-wise weight matrices and bias vectors.
Examples
arch <- mlp_architecture(
input_dim = 2,
layers = list(dense_layer(3), dense_layer(1, activation = "linear"))
)
w <- initialize_weights(arch, seed = 123)
decoded <- decode_weights(w, arch)
Create a Dense Layer
Description
Creates a fully connected dense layer object for use in metANN architectures.
Usage
dense_layer(
units,
activation = "relu",
use_bias = TRUE,
trainable = TRUE,
name = NULL
)
Arguments
units: A positive integer specifying the number of neurons in the layer.
activation: A character string or a "met_activation" object.
use_bias: Logical. Whether to include a bias term in the layer.
trainable: Logical. Whether the layer parameters should be trainable.
name: An optional character string specifying the layer name.
Value
An object of class "met_dense_layer".
Examples
dense_layer(10, activation = "relu")
dense_layer(1, activation = activation_linear())
Detect Task Type
Description
Internal helper for detecting whether the response corresponds to a regression or classification task.
Usage
detect_task(y, task = "auto")
Arguments
y: Response vector.
task: Character value. One of "auto", "regression", or "classification".
Value
A character value: "regression" or "classification".
Encode Classification Response
Description
Internal helper for encoding binary and multi-class responses.
Usage
encode_classification_response(y)
Arguments
y: Response vector.
Value
A list containing encoded response, class levels, and classification type.
Evaluate a metANN Model
Description
Evaluates a fitted metANN model on new data.
Usage
evaluate(object, newdata, y_true = NULL, metrics = NULL, threshold = 0.5, ...)
Arguments
object: A fitted object of class "metann".
newdata: New data used for evaluation. For formula-based models, this should be a data frame containing the response variable. For x-y models, this should be a numeric matrix or numeric data frame.
y_true: Optional true response values. Required for x-y models. For formula-based models, if NULL, the response is extracted from newdata.
metrics: Optional performance metrics. If NULL, default metrics for the task are used.
threshold: Classification threshold for binary classification.
...: Additional arguments passed on to the underlying prediction step.
Value
An object of class "metann_evaluation".
Examples
fit <- met_mlp(
formula = Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
data = iris,
hidden_layers = c(5),
optimizer = optimizer_pso(pop_size = 10, max_iter = 10),
seed = 123,
verbose = FALSE
)
evaluate(fit, newdata = iris)
Forward Pass for an MLP
Description
Computes predictions from input data, an MLP architecture, and a parameter vector.
Usage
forward_pass(x, weights, architecture)
Arguments
x: A numeric matrix or data frame of input features.
weights: A numeric vector of MLP parameters.
architecture: An object created by mlp_architecture().
Value
A numeric matrix containing network outputs.
Examples
x <- matrix(rnorm(10), nrow = 5, ncol = 2)
arch <- mlp_architecture(
input_dim = 2,
layers = list(
dense_layer(3, activation = "relu"),
dense_layer(1, activation = "linear")
)
)
w <- initialize_weights(arch, seed = 123)
forward_pass(x, w, arch)
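Written out by hand, such a forward pass is just repeated affine maps followed by activations. A base-R sketch under assumed weight shapes (W1 of dimension input_dim x units, with one bias per unit added across columns; this mirrors the computation conceptually, not the package's internal layout):

```r
set.seed(123)
x  <- matrix(rnorm(10), nrow = 5, ncol = 2)
W1 <- matrix(rnorm(6), nrow = 2, ncol = 3); b1 <- rnorm(3)  # 2 -> 3, ReLU
W2 <- matrix(rnorm(3), nrow = 3, ncol = 1); b2 <- rnorm(1)  # 3 -> 1, linear

relu <- function(z) pmax(z, 0)
h   <- relu(sweep(x %*% W1, 2, b1, "+"))  # hidden layer activations
out <- sweep(h %*% W2, 2, b2, "+")        # linear output, one column

dim(out)  # 5 x 1: one prediction per input row
```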
Genetic Algorithm Engine
Description
Internal implementation of a real-coded Genetic Algorithm.
Usage
ga_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn: Objective function.
lower: Numeric vector of lower bounds.
upper: Numeric vector of upper bounds.
optimizer: A GA optimizer object.
verbose: Logical. If TRUE, progress information is printed.
...: Additional arguments passed to fn.
Value
An object of class "met_optimize_result".
Gradient-Based Optimization Engine
Description
Internal implementation of gradient-based optimization for differentiable continuous objective functions.
Usage
gradient_optimize(
fn,
gr,
lower,
upper,
initial = NULL,
optimizer,
verbose = TRUE,
...
)
Arguments
fn: Objective function.
gr: Gradient function.
lower: Numeric vector of lower bounds.
upper: Numeric vector of upper bounds.
initial: Optional numeric vector of initial values.
optimizer: A gradient-based optimizer object.
verbose: Logical. If TRUE, progress information is printed.
...: Additional arguments passed to fn.
Value
An object of class "met_optimize_result".
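As a concrete example of a gradient-based update, the standard Adam rule from Kingma and Ba (2015) keeps exponential moving averages of the gradient and its square, with bias correction. A sketch of one such step (illustrative of the published rule; the state bookkeeping here is hypothetical and the engine's implementation may differ):

```r
# One Adam update: state carries step count t and moment estimates m, v
adam_step <- function(theta, g, state, lr = 0.001,
                      beta1 = 0.9, beta2 = 0.999, eps = 1e-8) {
  state$t <- state$t + 1
  state$m <- beta1 * state$m + (1 - beta1) * g    # first-moment average
  state$v <- beta2 * state$v + (1 - beta2) * g^2  # second-moment average
  m_hat <- state$m / (1 - beta1^state$t)          # bias-corrected moments
  v_hat <- state$v / (1 - beta2^state$t)
  state$theta <- theta - lr * m_hat / (sqrt(v_hat) + eps)
  state
}

# One step on the sphere function from theta = (4, 4), gradient 2*theta
st <- list(t = 0, m = 0, v = 0)
st <- adam_step(c(4, 4), 2 * c(4, 4), st, lr = 0.1)
st$theta  # moves toward the origin by about lr on the first step
```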
Grey Wolf Optimizer Engine
Description
Internal implementation of the Grey Wolf Optimizer for continuous optimization.
Usage
gwo_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn: Objective function.
lower: Numeric vector of lower bounds.
upper: Numeric vector of upper bounds.
optimizer: A GWO optimizer object.
verbose: Logical. If TRUE, progress information is printed.
...: Additional arguments passed to fn.
Value
An object of class "met_optimize_result".
Initialize MLP Weights
Description
Creates a numeric vector of randomly initialized weights and bias terms for an MLP architecture.
Usage
initialize_weights(
architecture,
input_dim = NULL,
method = c("uniform", "normal"),
lower = -0.5,
upper = 0.5,
mean = 0,
sd = 0.1,
seed = NULL
)
Arguments
architecture: An object created by mlp_architecture().
input_dim: Optional positive integer specifying the number of input features. If NULL, the value stored in the architecture is used.
method: Initialization method. Currently "uniform" or "normal".
lower: Lower bound for uniform initialization.
upper: Upper bound for uniform initialization.
mean: Mean for normal initialization.
sd: Standard deviation for normal initialization.
seed: Optional random seed.
Value
A numeric vector containing initialized parameters.
Examples
arch <- mlp_architecture(
input_dim = 3,
layers = list(dense_layer(2), dense_layer(1, activation = "linear"))
)
initialize_weights(arch, seed = 123)
Check Whether an Object is a metANN Activation
Description
Check Whether an Object is a metANN Activation
Usage
is_activation(x)
Arguments
x: An object.
Value
TRUE if x is a metANN activation object; otherwise FALSE.
Examples
is_activation(activation_relu())
Check Whether an Object is a metANN Architecture
Description
Check Whether an Object is a metANN Architecture
Usage
is_architecture(x)
Arguments
x: An object.
Value
TRUE if x is a metANN architecture object; otherwise FALSE.
Examples
arch <- mlp_architecture(list(dense_layer(1)))
is_architecture(arch)
Check Whether an Object is a Dense Layer
Description
Check Whether an Object is a Dense Layer
Usage
is_dense_layer(x)
Arguments
x: An object.
Value
TRUE if x is a dense layer object; otherwise FALSE.
Examples
is_dense_layer(dense_layer(5))
Check Whether an Object is a metANN Layer
Description
Check Whether an Object is a metANN Layer
Usage
is_layer(x)
Arguments
x: An object.
Value
TRUE if x is a metANN layer object; otherwise FALSE.
Examples
is_layer(dense_layer(5))
Check Whether an Object is a metANN Loss
Description
Check Whether an Object is a metANN Loss
Usage
is_loss(x)
Arguments
x: An object.
Value
TRUE if x is a metANN loss object; otherwise FALSE.
Examples
is_loss(loss_mse())
Check Whether an Object is a metANN Metric
Description
Check Whether an Object is a metANN Metric
Usage
is_metric(x)
Arguments
x: An object.
Value
TRUE if x is a metANN metric object; otherwise FALSE.
Examples
is_metric(metric_rmse())
Check Whether an Object is an MLP Architecture
Description
Check Whether an Object is an MLP Architecture
Usage
is_mlp_architecture(x)
Arguments
x: An object.
Value
TRUE if x is an MLP architecture object; otherwise FALSE.
Examples
arch <- mlp_architecture(list(dense_layer(1)))
is_mlp_architecture(arch)
Check Whether an Object is a metANN Optimizer
Description
Check Whether an Object is a metANN Optimizer
Usage
is_optimizer(x)
Arguments
x: An object.
Value
TRUE if x is a metANN optimizer object; otherwise FALSE.
Examples
is_optimizer(optimizer_pso())
Levy Flight Step
Description
Internal helper used by the Secretary Bird Optimization Algorithm.
Usage
levy_flight(dim, beta = 1.5)
Arguments
dim: Problem dimension.
beta: Levy distribution parameter.
Value
A numeric vector of Levy flight steps.
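Levy-stable step lengths are commonly drawn with Mantegna's algorithm, which combines two normal variates with a beta-dependent scale. A sketch under the assumption that this helper follows that standard approach (illustrative only):

```r
# Mantegna's algorithm for Levy flight steps with stability index beta
levy_step <- function(dim, beta = 1.5) {
  sigma <- (gamma(1 + beta) * sin(pi * beta / 2) /
            (gamma((1 + beta) / 2) * beta * 2^((beta - 1) / 2)))^(1 / beta)
  u <- rnorm(dim, sd = sigma)  # numerator with Mantegna's scale
  v <- rnorm(dim)              # standard normal denominator
  u / abs(v)^(1 / beta)        # heavy-tailed step lengths
}

set.seed(42)
levy_step(3)
```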
Binary Cross-Entropy Loss
Description
Creates a binary cross-entropy loss function object.
Usage
loss_binary_crossentropy(epsilon = 1e-15)
Arguments
epsilon: A small positive numeric value used for numerical stability.
Value
An object of class "met_loss".
References
Bridle, J. S. (1990). Probabilistic Interpretation of Feedforward Classification Network Outputs, with Relationships to Statistical Pattern Recognition. In Neurocomputing: Algorithms, Architectures and Applications, 227–236. Springer.
Examples
loss <- loss_binary_crossentropy()
loss$fn(c(0, 1, 1), c(0.1, 0.8, 0.9))
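Binary cross-entropy averages -[y log p + (1 - y) log(1 - p)] over observations, with predictions clipped to [epsilon, 1 - epsilon] so the logarithms stay finite. A stand-alone sketch of that formula (illustrative, not the package's internal code):

```r
# Binary cross-entropy with probability clipping for numerical stability
bce <- function(y, p, eps = 1e-15) {
  p <- pmin(pmax(p, eps), 1 - eps)
  -mean(y * log(p) + (1 - y) * log(1 - p))
}

bce(c(0, 1, 1), c(0.1, 0.8, 0.9))  # about 0.145
```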
Categorical Cross-Entropy Loss
Description
Creates a categorical cross-entropy loss function object.
Usage
loss_crossentropy(epsilon = 1e-15)
Arguments
epsilon: A small positive numeric value used for numerical stability.
Value
An object of class "met_loss".
References
Bridle, J. S. (1990). Probabilistic Interpretation of Feedforward Classification Network Outputs, with Relationships to Statistical Pattern Recognition. In Neurocomputing: Algorithms, Architectures and Applications, 227–236. Springer.
Examples
loss <- loss_crossentropy()
y_true <- matrix(c(1, 0, 0, 0, 1, 0), nrow = 2, byrow = TRUE)
y_pred <- matrix(c(0.8, 0.1, 0.1, 0.2, 0.7, 0.1), nrow = 2, byrow = TRUE)
loss$fn(y_true, y_pred)
Huber Loss
Description
Creates a Huber loss function object.
Usage
loss_huber(delta = 1)
Arguments
delta: A positive numeric value controlling the transition point between squared and absolute loss behavior.
Value
An object of class "met_loss".
References
Huber, P. J. (1964). Robust Estimation of a Location Parameter. The Annals of Mathematical Statistics, 35(1), 73–101. doi:10.1214/aoms/1177703732
Examples
loss <- loss_huber(delta = 1)
loss$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
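The Huber loss is quadratic for residuals with |r| <= delta and linear beyond, which is what makes it robust to outliers. A sketch of the standard definition (illustrative, not the package's internal code):

```r
# Huber loss: 0.5*r^2 inside delta, delta*(|r| - 0.5*delta) outside
huber <- function(y, yhat, delta = 1) {
  r <- abs(y - yhat)
  mean(ifelse(r <= delta, 0.5 * r^2, delta * (r - 0.5 * delta)))
}

huber(c(1, 2, 3), c(1.1, 1.9, 3.2))  # all residuals in the quadratic zone
```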
Log-Cosh Loss
Description
Creates a log-cosh loss function object.
Usage
loss_log_cosh()
Value
An object of class "met_loss".
Examples
loss <- loss_log_cosh()
loss$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
Mean Absolute Error Loss
Description
Creates a mean absolute error loss function object.
Usage
loss_mae()
Value
An object of class "met_loss".
Examples
loss <- loss_mae()
loss$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
Mean Squared Error Loss
Description
Creates a mean squared error loss function object.
Usage
loss_mse()
Value
An object of class "met_loss".
Examples
loss <- loss_mse()
loss$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
Train a Feed-Forward Multilayer Perceptron
Description
Convenience wrapper around metann() for training feed-forward multilayer
perceptrons.
Usage
met_mlp(
formula = NULL,
data = NULL,
x = NULL,
y = NULL,
architecture = NULL,
hidden_layers = NULL,
activation = "relu",
output_activation = NULL,
task = c("auto", "regression", "classification"),
optimizer = optimizer_pso(),
loss = NULL,
metrics = NULL,
seed = NULL,
verbose = TRUE
)
Arguments
formula: Optional model formula.
data: Optional data frame used with formula.
x: Optional numeric input matrix or data frame.
y: Optional response vector.
architecture: Optional MLP architecture object.
hidden_layers: Integer vector giving the number of units in each hidden layer.
activation: Activation function for hidden layers.
output_activation: Optional output activation function. If NULL, it is chosen automatically from the task.
task: One of "auto", "regression", or "classification".
optimizer: Optimizer object.
loss: Optional loss function. If NULL, a default loss is chosen based on the task.
metrics: Optional performance metrics. If NULL, default metrics for the task are used.
seed: Optional random seed.
verbose: Logical. If TRUE, progress information is printed.
Value
An object of class "metann".
References
Montana, D. J., and Davis, L. (1989). Training Feedforward Neural Networks Using Genetic Algorithms. Proceedings of the 11th International Joint Conference on Artificial Intelligence, 762–767.
Ilonen, J., Kamarainen, J.-K., and Lampinen, J. (2003). Differential Evolution Training Algorithm for Feed-Forward Neural Networks. Neural Processing Letters, 17, 93–105. doi:10.1023/A:1022995128597
Karaboga, D., and Ozturk, C. (2009). Neural Networks Training by Artificial Bee Colony Algorithm on Pattern Classification. Neural Network World, 19(3), 279–292.
Mirjalili, S. (2015). How Effective is the Grey Wolf Optimizer in Training Multi-Layer Perceptrons. Applied Intelligence, 43, 150–161. doi:10.1007/s10489-014-0645-7
Dilber, B., and Ozdemir, A. F. (2026). A novel approach to training feed-forward multi-layer perceptrons with recently proposed secretary bird optimization algorithm. Neural Computing and Applications, 38(5). doi:10.1007/s00521-026-11874-x
Examples
fit <- met_mlp(
formula = Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
data = iris,
hidden_layers = c(5),
optimizer = optimizer_pso(pop_size = 10, max_iter = 10),
seed = 123,
verbose = FALSE
)
fit
General-Purpose Optimization
Description
Performs continuous optimization using metaheuristic or gradient-based optimization algorithms.
Usage
met_optimize(
fn,
optimizer = optimizer_pso(),
lower,
upper,
gr = NULL,
initial = NULL,
seed = NULL,
verbose = TRUE,
...
)
Arguments
fn: Objective function to be minimized. It must accept a numeric vector as its first argument and return a single numeric value.
optimizer: Optimizer object created by functions such as optimizer_pso() or optimizer_adam().
lower: Numeric vector of lower bounds.
upper: Numeric vector of upper bounds.
gr: Optional gradient function. Required for gradient-based optimizers such as optimizer_adam().
initial: Optional numeric vector of initial parameter values. If NULL, initial values are drawn within the bounds.
seed: Optional random seed.
verbose: Logical. If TRUE, progress information is printed.
...: Additional arguments passed to fn.
Value
An object of class "met_optimize_result".
Examples
sphere <- function(x) sum(x^2)
result <- met_optimize(
fn = sphere,
optimizer = optimizer_pso(pop_size = 10, max_iter = 20),
lower = rep(-5, 2),
upper = rep(5, 2),
seed = 123,
verbose = FALSE
)
result
grad_sphere <- function(x) 2 * x
result_adam <- met_optimize(
fn = sphere,
gr = grad_sphere,
optimizer = optimizer_adam(learning_rate = 0.1, epochs = 20),
lower = rep(-5, 2),
upper = rep(5, 2),
initial = rep(4, 2),
seed = 123,
verbose = FALSE
)
result_adam
Train an Artificial Neural Network with metANN
Description
Trains a feed-forward multilayer perceptron using metaheuristic or gradient-based optimization algorithms. The function supports regression and classification tasks through either an x-y interface or a formula-data interface.
Usage
metann(
formula = NULL,
data = NULL,
x = NULL,
y = NULL,
architecture = NULL,
hidden_layers = NULL,
activation = "relu",
output_activation = NULL,
task = c("auto", "regression", "classification"),
optimizer = optimizer_pso(),
loss = NULL,
metrics = NULL,
seed = NULL,
verbose = TRUE
)
Arguments
formula: Optional formula specifying the model.
data: Optional data frame containing the variables in formula.
x: Optional numeric matrix or data frame of input features.
y: Optional response vector or one-column matrix.
architecture: Optional MLP architecture created by mlp_architecture().
hidden_layers: Optional integer vector specifying hidden layer sizes. Used when architecture is NULL.
activation: Activation function used for hidden layers when hidden_layers is supplied.
output_activation: Optional activation function used for the output layer when hidden_layers is supplied. If NULL, it is chosen automatically from the task.
task: One of "auto", "regression", or "classification".
optimizer: A character string or a metANN optimizer object.
loss: Optional character string or metANN loss object. If NULL, a default loss is chosen based on the task.
metrics: Optional character vector, metric object, or list of metric objects. If NULL, default metrics for the task are used.
seed: Optional random seed.
verbose: Logical. If TRUE, progress information is printed.
Value
An object of class "metann".
References
Montana, D. J., and Davis, L. (1989). Training Feedforward Neural Networks Using Genetic Algorithms. Proceedings of the 11th International Joint Conference on Artificial Intelligence, 762–767.
Ilonen, J., Kamarainen, J.-K., and Lampinen, J. (2003). Differential Evolution Training Algorithm for Feed-Forward Neural Networks. Neural Processing Letters, 17, 93–105. doi:10.1023/A:1022995128597
Karaboga, D., and Ozturk, C. (2009). Neural Networks Training by Artificial Bee Colony Algorithm on Pattern Classification. Neural Network World, 19(3), 279–292.
Mirjalili, S. (2015). How Effective is the Grey Wolf Optimizer in Training Multi-Layer Perceptrons. Applied Intelligence, 43, 150–161. doi:10.1007/s10489-014-0645-7
Dilber, B., and Ozdemir, A. F. (2026). A novel approach to training feed-forward multi-layer perceptrons with recently proposed secretary bird optimization algorithm. Neural Computing and Applications, 38(5). doi:10.1007/s00521-026-11874-x
Examples
fit <- metann(
formula = Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
data = iris,
hidden_layers = c(5),
optimizer = optimizer_pso(pop_size = 10, max_iter = 20),
loss = "mse",
metrics = c("rmse", "mae", "r2"),
seed = 123,
verbose = FALSE
)
fit
iris_bin <- iris
iris_bin$IsSetosa <- factor(
ifelse(iris_bin$Species == "setosa", "setosa", "other")
)
fit_class <- metann(
formula = IsSetosa ~ Sepal.Length + Sepal.Width + Petal.Length + Petal.Width,
data = iris_bin,
hidden_layers = c(5),
optimizer = optimizer_pso(pop_size = 10, max_iter = 20),
seed = 123,
verbose = FALSE
)
fit_class
Accuracy Metric
Description
Creates an accuracy metric object for classification tasks.
Usage
metric_accuracy()
Value
An object of class "met_metric".
Examples
metric <- metric_accuracy()
metric$fn(c(0, 1, 1), c(0, 1, 0))
F1 Score Metric
Description
Creates an F1 score metric object for classification tasks.
Usage
metric_f1(positive_class = 1)
Arguments
positive_class: The class label treated as the positive class. Defaults to 1.
Value
An object of class "met_metric".
Examples
metric <- metric_f1()
metric$fn(c(0, 1, 1, 0), c(0, 1, 0, 0))
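The F1 score is the harmonic mean of precision and recall, computed from the true-positive, false-positive, and false-negative counts. A stand-alone sketch (illustrative, not the package's internal code):

```r
# F1 = 2 * precision * recall / (precision + recall)
f1_score <- function(y_true, y_pred, positive = 1) {
  tp <- sum(y_pred == positive & y_true == positive)
  fp <- sum(y_pred == positive & y_true != positive)
  fn <- sum(y_pred != positive & y_true == positive)
  precision <- tp / (tp + fp)
  recall    <- tp / (tp + fn)
  2 * precision * recall / (precision + recall)
}

f1_score(c(0, 1, 1, 0), c(0, 1, 0, 0))  # precision 1, recall 0.5 -> 2/3
```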
Mean Absolute Error Metric
Description
Creates a mean absolute error metric object.
Usage
metric_mae()
Value
An object of class "met_metric".
Examples
metric <- metric_mae()
metric$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
Mean Squared Error Metric
Description
Creates a mean squared error metric object.
Usage
metric_mse()
Value
An object of class "met_metric".
Examples
metric <- metric_mse()
metric$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
Precision Metric
Description
Creates a precision metric object for classification tasks.
Usage
metric_precision(positive_class = 1)
Arguments
positive_class: The class label treated as the positive class. Defaults to 1.
Value
An object of class "met_metric".
Examples
metric <- metric_precision()
metric$fn(c(0, 1, 1, 0), c(0, 1, 0, 0))
Coefficient of Determination Metric
Description
Creates an R-squared metric object.
Usage
metric_r2()
Value
An object of class "met_metric".
Examples
metric <- metric_r2()
metric$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
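R-squared compares the residual sum of squares to the total sum of squares around the mean: R^2 = 1 - SS_res / SS_tot. A stand-alone sketch of that definition (illustrative, not the package's internal code):

```r
# Coefficient of determination: 1 means a perfect fit, 0 matches the mean
r2 <- function(y, yhat) {
  1 - sum((y - yhat)^2) / sum((y - mean(y))^2)
}

r2(c(1, 2, 3), c(1.1, 1.9, 3.2))  # SS_res = 0.06, SS_tot = 2 -> 0.97
```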
Recall Metric
Description
Creates a recall metric object for classification tasks.
Usage
metric_recall(positive_class = 1)
Arguments
positive_class: The class label treated as the positive class. Defaults to 1.
Value
An object of class "met_metric".
Examples
metric <- metric_recall()
metric$fn(c(0, 1, 1, 0), c(0, 1, 0, 0))
Root Mean Squared Error Metric
Description
Creates a root mean squared error metric object.
Usage
metric_rmse()
Value
An object of class "met_metric".
Examples
metric <- metric_rmse()
metric$fn(c(1, 2, 3), c(1.1, 1.9, 3.2))
Create an MLP Architecture
Description
Creates a multilayer perceptron architecture object from a list of dense layers.
Usage
mlp_architecture(layers, input_dim = NULL, name = "mlp")
Arguments
layers: A list of dense layer objects created by dense_layer().
input_dim: Optional positive integer specifying the number of input features. This can be left as NULL and supplied later, for example when weights are initialized or a model is trained.
name: Optional character string specifying the architecture name.
Value
An object of class "met_mlp_architecture".
Examples
architecture <- mlp_architecture(
layers = list(
dense_layer(10, activation = "relu"),
dense_layer(1, activation = "linear")
)
)
architecture
Compute Binary Cross-Entropy Gradient for an MLP
Description
Internal helper for computing the gradient of binary cross-entropy loss with respect to the MLP parameter vector.
Usage
mlp_binary_crossentropy_gradient(x, y, weights, architecture, epsilon = 1e-15)
Arguments
x: Numeric input matrix.
y: Numeric binary response vector coded as 0/1.
weights: Numeric parameter vector.
architecture: MLP architecture object.
epsilon: Small value used for numerical stability.
Value
A list containing loss, predictions, and gradient vector.
Compute Multi-Class Cross-Entropy Gradient for an MLP
Description
Internal helper for computing the gradient of softmax cross-entropy loss with respect to the MLP parameter vector.
Usage
mlp_crossentropy_gradient(x, y, weights, architecture, epsilon = 1e-15)
Arguments
x: Numeric input matrix.
y: Numeric one-hot encoded response matrix.
weights: Numeric parameter vector.
architecture: MLP architecture object.
epsilon: Small value used for numerical stability.
Value
A list containing loss, predictions, and gradient vector.
Forward Pass with Cache
Description
Internal helper that stores intermediate values needed for backpropagation.
Usage
mlp_forward_cache(x, weights, architecture)
Arguments
x: Numeric input matrix.
weights: Numeric parameter vector.
architecture: MLP architecture object.
Value
A list containing activations, pre-activations, and decoded weights.
Compute MSE Gradient for an MLP
Description
Internal helper for computing the gradient of MSE loss with respect to the MLP parameter vector.
Usage
mlp_mse_gradient(x, y, weights, architecture)
Arguments
x: Numeric input matrix.
y: Numeric response vector.
weights: Numeric parameter vector.
architecture: MLP architecture object.
Value
A list containing loss, predictions, and gradient vector.
Create an Activation Function Object
Description
Internal helper for constructing activation function objects.
Usage
new_activation(name, fn, parameters = list())
Arguments
name |
A character string specifying the activation name. |
fn |
A function that applies the activation to numeric input. |
parameters |
A list of activation-specific parameters. |
Value
An object of class "met_activation".
Create a Loss Function Object
Description
Internal helper for constructing loss function objects.
Usage
new_loss(name, fn, parameters = list())
Arguments
name |
A character string specifying the loss name. |
fn |
A function that computes the loss from observed and predicted values. |
parameters |
A list of loss-specific parameters. |
Value
An object of class "met_loss".
Create a Metric Function Object
Description
Internal helper for constructing metric function objects.
Usage
new_metric(name, fn, task = "both", parameters = list())
Arguments
name |
A character string specifying the metric name. |
fn |
A function that computes the metric from observed and predicted values. |
task |
A character string specifying the supported task. |
parameters |
A list of metric-specific parameters. |
Value
An object of class "met_metric".
Create an Optimizer Object
Description
Internal helper for constructing optimizer objects.
Usage
new_optimizer(name, type, parameters = list())
Arguments
name |
A character string specifying the optimizer name. |
type |
A character string specifying the optimizer type. |
parameters |
A list of optimizer-specific parameters. |
Value
An object of class "met_optimizer".
Artificial Bee Colony Optimizer
Description
Creates an Artificial Bee Colony optimizer object for continuous optimization problems.
Usage
optimizer_abc(colony_size = 30, max_iter = 100, limit = NULL)
Arguments
colony_size |
Total colony size. Half of the colony is used as employed bees and half as onlooker bees. |
max_iter |
Maximum number of iterations. |
limit |
Number of unsuccessful trials before a food source is abandoned. |
Value
An object of class "met_optimizer".
References
Karaboga, D., and Basturk, B. (2007). A Powerful and Efficient Algorithm for Numerical Function Optimization: Artificial Bee Colony (ABC) Algorithm. Journal of Global Optimization, 39, 459–471. doi:10.1007/s10898-007-9149-x
Karaboga, D., and Ozturk, C. (2009). Neural Networks Training by Artificial Bee Colony Algorithm on Pattern Classification. Neural Network World, 19(3), 279–292.
Examples
optimizer_abc()
Adam Optimizer
Description
Creates an Adam optimizer object.
Usage
optimizer_adam(
learning_rate = 0.001,
beta1 = 0.9,
beta2 = 0.999,
epsilon = 1e-08,
epochs = 100,
batch_size = NULL
)
Arguments
learning_rate |
Learning rate. |
beta1 |
Exponential decay rate for the first moment estimates. |
beta2 |
Exponential decay rate for the second moment estimates. |
epsilon |
Small positive constant for numerical stability. |
epochs |
Number of training epochs. |
batch_size |
Mini-batch size. If NULL, the entire training set is used in each epoch (full-batch training). |
Value
An object of class "met_optimizer".
References
Kingma, D. P., and Ba, J. (2015). Adam: A Method for Stochastic Optimization. International Conference on Learning Representations.
Examples
optimizer_adam()
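The default-only call above can be complemented by a configuration that exercises the remaining arguments; the hyperparameter values below are illustrative choices for demonstration, not tuned recommendations.

```r
library(metANN)

# Illustrative Adam configuration; values are for demonstration only.
opt <- optimizer_adam(
  learning_rate = 0.01,  # larger step size than the default 0.001
  beta1 = 0.9,           # first-moment decay rate
  beta2 = 0.999,         # second-moment decay rate
  epochs = 200,          # more passes over the training data
  batch_size = 32        # mini-batch updates instead of full-batch
)
opt
```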
Differential Evolution Optimizer
Description
Creates a Differential Evolution optimizer object.
Usage
optimizer_de(
pop_size = 30,
max_iter = 100,
F = 0.5,
CR = 0.9,
strategy = "rand/1/bin"
)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
F |
Differential weight. Common values are between 0.4 and 1. |
CR |
Crossover probability. Must be between 0 and 1. |
strategy |
Differential evolution strategy. Currently only "rand/1/bin" is supported. |
Value
An object of class "met_optimizer".
References
Storn, R., and Price, K. (1997). Differential Evolution – A Simple and Efficient Heuristic for Global Optimization over Continuous Spaces. Journal of Global Optimization, 11, 341–359. doi:10.1023/A:1008202821328
Ilonen, J., Kamarainen, J.-K., and Lampinen, J. (2003). Differential Evolution Training Algorithm for Feed-Forward Neural Networks. Neural Processing Letters, 17, 93–105. doi:10.1023/A:1022995128597
Examples
optimizer_de()
Genetic Algorithm Optimizer
Description
Creates a real-coded Genetic Algorithm optimizer object.
Usage
optimizer_ga(
pop_size = 30,
max_iter = 100,
crossover_rate = 0.8,
mutation_rate = 0.1,
mutation_sd = 0.1,
elitism = TRUE,
selection = "tournament",
tournament_size = 2
)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
crossover_rate |
Probability of crossover. |
mutation_rate |
Probability of mutating each parameter. |
mutation_sd |
Standard deviation of Gaussian mutation noise. |
elitism |
Logical. Whether to preserve the best solution in each generation. |
selection |
Selection method. Currently only "tournament" is supported. |
tournament_size |
Number of individuals used in tournament selection. |
Value
An object of class "met_optimizer".
References
Goldberg, D. E. (1989). Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, Reading, MA.
Montana, D. J., and Davis, L. (1989). Training Feedforward Neural Networks Using Genetic Algorithms. Proceedings of the 11th International Joint Conference on Artificial Intelligence, 762–767.
Examples
optimizer_ga()
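Because the GA constructor has the most arguments of the optimizers here, a call that sets them explicitly may be clearer than the default-only example; the values are illustrative, not recommendations.

```r
library(metANN)

# Illustrative real-coded GA configuration; values are for demonstration only.
opt <- optimizer_ga(
  pop_size = 50,
  max_iter = 150,
  crossover_rate = 0.9,
  mutation_rate = 0.05,   # per-parameter mutation probability
  mutation_sd = 0.2,      # wider Gaussian mutation noise
  elitism = TRUE,         # carry the best solution into each generation
  tournament_size = 3     # larger tournaments increase selection pressure
)
opt
```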
Grey Wolf Optimizer
Description
Creates a Grey Wolf Optimizer object for continuous optimization problems.
Usage
optimizer_gwo(pop_size = 30, max_iter = 100, a_start = 2, a_end = 0)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
a_start |
Initial value of the control parameter a. |
a_end |
Final value of the control parameter a. |
Value
An object of class "met_optimizer".
References
Mirjalili, S., Mirjalili, S. M., and Lewis, A. (2014). Grey Wolf Optimizer. Advances in Engineering Software, 69, 46–61. doi:10.1016/j.advengsoft.2013.12.007
Mirjalili, S. (2015). How Effective is the Grey Wolf Optimizer in Training Multi-Layer Perceptrons. Applied Intelligence, 43, 150–161. doi:10.1007/s10489-014-0645-7
Examples
optimizer_gwo()
Hybrid Optimizer
Description
Creates a hybrid optimizer object by combining a global optimizer and a local optimizer.
Usage
optimizer_hybrid(
global = optimizer_pso(),
local = optimizer_adam(),
strategy = "sequential"
)
Arguments
global |
A metaheuristic optimizer object. |
local |
A gradient-based optimizer object. |
strategy |
Hybrid training strategy. Currently only "sequential" is supported. |
Value
An object of class "met_optimizer".
Examples
optimizer_hybrid(
global = optimizer_pso(max_iter = 10),
local = optimizer_adam(epochs = 10)
)
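A hybrid optimizer can then be passed to model training in the same way as a single optimizer; the sketch below assumes met_mlp() accepts any optimizer object, following the pattern of the plot_network() example.

```r
library(metANN)

# Hedged sketch: global PSO search followed by local Adam refinement,
# used as the training optimizer for a small regression MLP.
fit <- met_mlp(
  formula = Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
  data = iris,
  hidden_layers = c(5),
  optimizer = optimizer_hybrid(
    global = optimizer_pso(pop_size = 10, max_iter = 10),
    local = optimizer_adam(epochs = 10)
  ),
  seed = 123,
  verbose = FALSE
)
fit
```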
Get Optimizer Information
Description
Returns basic information about an optimizer available in the metANN package.
Usage
optimizer_info(optimizer)
Arguments
optimizer |
Character name of an optimizer or an optimizer object created by functions such as optimizer_pso(). |
Value
An object of class "met_optimizer_info".
Examples
optimizer_info("pso")
optimizer_info("sboa")
optimizer_info(optimizer_adam())
Particle Swarm Optimization Optimizer
Description
Creates a Particle Swarm Optimization optimizer object.
Usage
optimizer_pso(
pop_size = 30,
max_iter = 100,
w = 0.7,
c1 = 1.5,
c2 = 1.5,
velocity_clamp = NULL
)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
w |
Inertia weight. |
c1 |
Cognitive acceleration coefficient. |
c2 |
Social acceleration coefficient. |
velocity_clamp |
Optional maximum absolute velocity. If NULL, particle velocities are not clamped. |
Value
An object of class "met_optimizer".
References
Kennedy, J., and Eberhart, R. (1995). Particle Swarm Optimization. Proceedings of ICNN'95 - International Conference on Neural Networks, 4, 1942–1948. doi:10.1109/ICNN.1995.488968
Examples
optimizer_pso()
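A call that sets the swarm parameters explicitly may be more instructive than the default-only example; the clamp value and coefficients below are illustrative assumptions, not tuned settings.

```r
library(metANN)

# Illustrative PSO configuration; values are for demonstration only.
opt <- optimizer_pso(
  pop_size = 50,
  max_iter = 200,
  w = 0.6,            # lower inertia weight favours exploitation
  c1 = 1.7,           # cognitive acceleration coefficient
  c2 = 1.7,           # social acceleration coefficient
  velocity_clamp = 1  # cap the absolute particle velocity at 1
)
opt
```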
Secretary Bird Optimization Algorithm Optimizer
Description
Creates a Secretary Bird Optimization Algorithm optimizer object for continuous optimization problems.
Usage
optimizer_sboa(pop_size = 30, max_iter = 100)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
Value
An object of class "met_optimizer".
References
Fu, Y., Liu, D., Chen, J., and He, L. (2024). Secretary Bird Optimization Algorithm: A New Metaheuristic for Solving Global Optimization Problems. Artificial Intelligence Review, 57, 123. doi:10.1007/s10462-024-10729-y
Dilber, B., and Ozdemir, A. F. (2026). A novel approach to training feed-forward multi-layer perceptrons with recently proposed secretary bird optimization algorithm. Neural Computing and Applications, 38(5). doi:10.1007/s00521-026-11874-x
Examples
optimizer_sboa()
Stochastic Gradient Descent Optimizer
Description
Creates a stochastic gradient descent optimizer object.
Usage
optimizer_sgd(learning_rate = 0.01, epochs = 100, batch_size = NULL)
Arguments
learning_rate |
Learning rate. |
epochs |
Number of training epochs. |
batch_size |
Mini-batch size. If NULL, the entire training set is used in each epoch (full-batch training). |
Value
An object of class "met_optimizer".
References
Robbins, H., and Monro, S. (1951). A Stochastic Approximation Method. The Annals of Mathematical Statistics, 22(3), 400–407. doi:10.1214/aoms/1177729586
Examples
optimizer_sgd()
Teaching-Learning-Based Optimization Optimizer
Description
Creates a Teaching-Learning-Based Optimization optimizer object for continuous optimization problems.
Usage
optimizer_tlbo(pop_size = 30, max_iter = 100)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
Value
An object of class "met_optimizer".
References
Rao, R. V., Savsani, V. J., and Vakharia, D. P. (2011). Teaching-Learning-Based Optimization: A Novel Method for Constrained Mechanical Design Optimization Problems. Computer-Aided Design, 43, 303–315. doi:10.1016/j.cad.2010.12.015
Examples
optimizer_tlbo()
Whale Optimization Algorithm Optimizer
Description
Creates a Whale Optimization Algorithm optimizer object for continuous optimization problems.
Usage
optimizer_woa(pop_size = 30, max_iter = 100, a_start = 2, a_end = 0, b = 1)
Arguments
pop_size |
Population size. |
max_iter |
Maximum number of iterations. |
a_start |
Initial value of the control parameter a. |
a_end |
Final value of the control parameter a. |
b |
Constant defining the spiral shape in the bubble-net mechanism. |
Value
An object of class "met_optimizer".
References
Mirjalili, S., and Lewis, A. (2016). The Whale Optimization Algorithm. Advances in Engineering Software, 95, 51–67. doi:10.1016/j.advengsoft.2016.01.008
Examples
optimizer_woa()
Plot Optimization Convergence
Description
Plots the convergence curve of a metANN optimization result.
Usage
## S3 method for class 'met_optimize_result'
plot(x, log = FALSE, ...)
Arguments
x |
An object of class "met_optimize_result". |
log |
Logical. If TRUE, the objective values are plotted on a logarithmic scale. |
... |
Additional graphical arguments passed to plot(). |
Value
The input object invisibly.
Plot a metANN Model
Description
Plot a metANN Model
Usage
## S3 method for class 'metann'
plot(x, ...)
Arguments
x |
A fitted metANN model. |
... |
Additional arguments passed to the underlying plotting function. |
Value
The input object invisibly.
Plot Neural Network Architecture
Description
Plots the architecture of a feed-forward multilayer perceptron, showing input, hidden, and output layers in a visually enhanced layout.
Usage
plot_network(
object,
max_neurons = 20,
show_connections = TRUE,
neuron_cex = 2.2,
label_cex = 0.9,
main = "Neural Network Architecture",
...
)
Arguments
object |
A fitted object of class "metann". |
max_neurons |
Maximum number of neurons to display per layer. If a layer has more neurons than this value, only a subset is displayed and the layer is annotated. |
show_connections |
Logical. If TRUE, connections between neurons are drawn. |
neuron_cex |
Size of neuron circles. |
label_cex |
Size of text labels. |
main |
Main title of the plot. |
... |
Additional graphical arguments. |
Value
The input object invisibly.
Examples
fit <- met_mlp(
formula = Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
data = iris,
hidden_layers = c(5),
optimizer = optimizer_pso(pop_size = 10, max_iter = 10),
seed = 123,
verbose = FALSE
)
plot_network(fit)
Predict with a metANN Model
Description
Generates predictions from a fitted metANN model.
Usage
## S3 method for class 'metann'
predict(
object,
newdata,
type = c("response", "prob", "class"),
threshold = 0.5,
...
)
Arguments
object |
A fitted object of class "metann". |
newdata |
New data used for prediction. For formula-based models, this should be a data frame. For x-y models, this should be a numeric matrix or numeric data frame. |
type |
Prediction type. For regression models, "response" returns numeric predictions. For classification models, "prob" returns predicted class probabilities and "class" returns predicted class labels. |
threshold |
Classification threshold for binary classification. |
... |
Additional arguments. |
Value
A numeric vector, matrix, or factor depending on the task and prediction type.
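A minimal fit-then-predict sketch may help; it reuses the regression setup shown in the plot_network() example, so all names are taken from this manual.

```r
library(metANN)

# Fit a small regression MLP, then generate numeric predictions on the
# training data frame with type = "response".
fit <- met_mlp(
  formula = Petal.Width ~ Sepal.Length + Sepal.Width + Petal.Length,
  data = iris,
  hidden_layers = c(5),
  optimizer = optimizer_pso(pop_size = 10, max_iter = 10),
  seed = 123,
  verbose = FALSE
)
preds <- predict(fit, newdata = iris, type = "response")
head(preds)
```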
Print a Dense Layer
Description
Print a Dense Layer
Usage
## S3 method for class 'met_dense_layer'
print(x, ...)
Arguments
x |
A dense layer object. |
... |
Additional arguments, currently unused. |
Value
The input object invisibly.
Print an MLP Architecture
Description
Print an MLP Architecture
Usage
## S3 method for class 'met_mlp_architecture'
print(x, ...)
Arguments
x |
An MLP architecture object. |
... |
Additional arguments, currently unused. |
Value
The input object invisibly.
Print a metANN Optimization Result
Description
Print a metANN Optimization Result
Usage
## S3 method for class 'met_optimize_result'
print(x, ...)
Arguments
x |
A metANN optimization result object. |
... |
Additional arguments, currently unused. |
Value
The input object invisibly.
Print a metANN Optimizer
Description
Print a metANN Optimizer
Usage
## S3 method for class 'met_optimizer'
print(x, ...)
Arguments
x |
A metANN optimizer object. |
... |
Additional arguments, currently unused. |
Value
The input object invisibly.
Print Optimizer Information
Description
Print Optimizer Information
Usage
## S3 method for class 'met_optimizer_info'
print(x, ...)
Arguments
x |
An object of class "met_optimizer_info". |
... |
Additional arguments. |
Value
The input object invisibly.
Print a metANN Model
Description
Print a metANN Model
Usage
## S3 method for class 'metann'
print(x, ...)
Arguments
x |
A metANN model object. |
... |
Additional arguments, currently unused. |
Value
The input object invisibly.
Print metANN Evaluation Results
Description
Print metANN Evaluation Results
Usage
## S3 method for class 'metann_evaluation'
print(x, ...)
Arguments
x |
An object of class "metann_evaluation". |
... |
Additional arguments. |
Value
The input object invisibly.
Particle Swarm Optimization Engine
Description
Internal implementation of Particle Swarm Optimization.
Usage
pso_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn |
Objective function. |
lower |
Numeric vector of lower bounds. |
upper |
Numeric vector of upper bounds. |
optimizer |
A PSO optimizer object. |
verbose |
Logical. If TRUE, progress is printed during optimization. |
... |
Additional arguments passed to fn. |
Value
An object of class "met_optimize_result".
Secretary Bird Optimization Algorithm Engine
Description
Internal implementation of the Secretary Bird Optimization Algorithm for continuous optimization.
Usage
sboa_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn |
Objective function. |
lower |
Numeric vector of lower bounds. |
upper |
Numeric vector of upper bounds. |
optimizer |
An SBOA optimizer object. |
verbose |
Logical. If TRUE, progress is printed during optimization. |
... |
Additional arguments passed to fn. |
Value
An object of class "met_optimize_result".
Summarize a metANN Optimization Result
Description
Summarize a metANN Optimization Result
Usage
## S3 method for class 'met_optimize_result'
summary(object, ...)
Arguments
object |
A metANN optimization result object. |
... |
Additional arguments, currently unused. |
Value
A list containing the main optimization results.
Summarize a metANN Model
Description
Summarize a metANN Model
Usage
## S3 method for class 'metann'
summary(object, ...)
Arguments
object |
A metANN model object. |
... |
Additional arguments, currently unused. |
Value
A list containing model summary information.
Teaching-Learning-Based Optimization Engine
Description
Internal implementation of Teaching-Learning-Based Optimization for continuous optimization.
Usage
tlbo_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn |
Objective function. |
lower |
Numeric vector of lower bounds. |
upper |
Numeric vector of upper bounds. |
optimizer |
A TLBO optimizer object. |
verbose |
Logical. If TRUE, progress is printed during optimization. |
... |
Additional arguments passed to fn. |
Value
An object of class "met_optimize_result".
Train an MLP with a Gradient-Based Optimizer
Description
Internal gradient-based training engine for MLP regression and classification.
Usage
train_mlp_gradient(
x,
y,
architecture,
optimizer,
loss,
seed = NULL,
verbose = TRUE
)
Arguments
x |
Numeric input matrix. |
y |
Response vector or matrix used for training. |
architecture |
MLP architecture object. |
optimizer |
Gradient-based optimizer object. |
loss |
Loss object. |
seed |
Optional random seed. |
verbose |
Logical. If TRUE, training progress is printed. |
Value
An object compatible with "met_optimize_result".
Whale Optimization Algorithm Engine
Description
Internal implementation of the Whale Optimization Algorithm for continuous optimization.
Usage
woa_optimize(fn, lower, upper, optimizer, verbose = TRUE, ...)
Arguments
fn |
Objective function. |
lower |
Numeric vector of lower bounds. |
upper |
Numeric vector of upper bounds. |
optimizer |
A WOA optimizer object. |
verbose |
Logical. If TRUE, progress is printed during optimization. |
... |
Additional arguments passed to fn. |
Value
An object of class "met_optimize_result".