R/hp_tuning_LSTM.R
hp_tuning_LSTM.Rd
Define the hyperparameters of the long short-term memory (LSTM) network
hp_tuning_LSTM(
data,
train_size = 0.8,
random_seed = 1234,
n_steps_LSTM,
f_neuron,
f_drop,
s_neuron,
s_drop,
learn_rate,
epoc
)
Arguments:

data: Dataset to split and tune.
train_size: Proportion of the data to use as the training subset. Defaults to 0.8.
random_seed: An integer that controls the random number generator, for reproducibility. Defaults to 1234.
n_steps_LSTM: Number of time steps used to split the series into samples.
f_neuron: Number of neurons in the first LSTM layer.
f_drop: Dropout rate in the first LSTM layer.
s_neuron: Number of neurons in the second LSTM layer.
s_drop: Dropout rate in the second LSTM layer.
learn_rate: Learning rate passed to the Adam optimizer at compilation.
epoc: Number of epochs used to train the model.
Returns: A data frame with each combination of hyperparameters and its calculated error, together with the best combination, i.e. the one that minimizes the train and test error.
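A minimal usage sketch, assuming the package exporting hp_tuning_LSTM is attached and that the tuning arguments accept vectors of candidate values to search over (an assumption based on the returned "combinations of the parameters"). `my_series`, the candidate values, and the `error` column name are all illustrative, not package defaults:

```r
# Hypothetical example: tune a two-layer LSTM on a user-supplied series.
results <- hp_tuning_LSTM(
  data         = my_series,       # placeholder for the user's dataset
  train_size   = 0.8,             # 80% of the data for training
  random_seed  = 1234,            # reproducible split and initialization
  n_steps_LSTM = 5,               # window length used to build samples
  f_neuron     = c(32, 64),       # candidate sizes, first LSTM layer
  f_drop       = c(0.1, 0.2),     # candidate dropout rates, first layer
  s_neuron     = c(16, 32),       # candidate sizes, second LSTM layer
  s_drop       = c(0.1, 0.2),     # candidate dropout rates, second layer
  learn_rate   = c(0.001, 0.01),  # Adam learning rates to try
  epoc         = 50               # training epochs per combination
)

# Inspect the best-performing combinations
# (the `error` column name is an assumption about the returned data frame)
head(results[order(results$error), ])
```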