Tune the hyperparameters of a long short-term memory (LSTM) network

hp_tuning_LSTM(
  data,
  train_size = 0.8,
  random_seed = 1234,
  n_steps_LSTM,
  f_neuron,
  f_drop,
  s_neuron,
  s_drop,
  learn_rate,
  epoc
)
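
A hedged usage sketch follows; the `my_series` object and the specific hyperparameter values are illustrative assumptions, not taken from this page:

```r
# Illustrative call; `my_series` is a hypothetical univariate time series.
results <- hp_tuning_LSTM(
  data         = my_series,
  train_size   = 0.8,   # 80% of the data for training
  random_seed  = 1234,
  n_steps_LSTM = 3,     # each sample spans 3 time steps
  f_neuron     = 64,    # neurons in the first LSTM layer
  f_drop       = 0.2,   # dropout rate after the first layer
  s_neuron     = 32,    # neurons in the second LSTM layer
  s_drop       = 0.2,   # dropout rate after the second layer
  learn_rate   = 0.001, # Adam learning rate at compilation
  epoc         = 50     # training epochs
)
head(results)  # data frame of hyperparameter combinations and their errors
```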

Arguments

data

Dataset to split and use for tuning

train_size

Proportion of the data to use as the training subset. The default value is 0.8

random_seed

An integer seed controlling the random number generator. The default value is 1234

n_steps_LSTM

Number of time steps used to split the series into samples

f_neuron

Number of neurons in the first LSTM layer

f_drop

Dropout rate in the first LSTM layer

s_neuron

Number of neurons in the second LSTM layer

s_drop

Dropout rate in the second LSTM layer

learn_rate

Learning rate for the Adam optimizer used when compiling the model

epoc

Number of epochs used to train the model
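
To make `n_steps_LSTM` concrete, the following base-R sketch (illustrative only, not part of the package) shows how a series can be split into overlapping samples of that length:

```r
series  <- c(10, 20, 30, 40, 50)
n_steps <- 3
# Each sample is a window of n_steps consecutive observations;
# a series of length N yields N - n_steps + 1 samples.
samples <- lapply(seq_len(length(series) - n_steps + 1),
                  function(i) series[i:(i + n_steps - 1)])
# samples[[1]] is c(10, 20, 30); samples[[3]] is c(30, 40, 50)
```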

Value

Returns a data frame with the hyperparameter combinations and their calculated errors, together with the best combination, i.e. the one that minimizes the train and test error

Author

Catherine Rincon, catherine.rincon@udea.edu.co