Exogenous features in Automodels #906
Hey @kkckk1110, thanks for using neuralforecast. The exogenous features are used in the same way as the other hyperparameters; you can set them to a fixed value, e.g.:
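A minimal sketch of that pattern (AutoNHITS is just an example model, and feat_1/feat_2 are placeholder column names, not from this issue):

from ray import tune
from neuralforecast.auto import AutoNHITS

# hist_exog_list is set to a fixed value here, so every trial trains
# with the same historical exogenous features.
config = {
    "input_size": tune.choice([24, 48]),
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "max_steps": tune.choice([500]),
    "hist_exog_list": ["feat_1", "feat_2"],  # fixed, not tuned
}
model = AutoNHITS(h=12, config=config, num_samples=10)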
I tried this using hist_exog_list, where feat_x are the names of the historical exogenous features, but I keep getting an error.
I could get this to work by also passing:
That's a bug with the HyperOptSearch, which was fixed in #851 but hasn't made it into a release yet. If you can install from GitHub, it should work.
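In case it helps, a sketch of that route; the install command targets the main branch, and the config below assumes the #851 fix so that HyperOptSearch can handle list-valued entries (feature names and horizon are placeholders):

# Install the in-development version that includes the #851 fix:
# pip install git+https://github.com/Nixtla/neuralforecast.git

from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch
from neuralforecast.auto import AutoNHITS

# With the fix, feature subsets can be tuned like any other hyperparameter:
# each trial samples one of the candidate lists.
config = {
    "input_size": tune.choice([24, 48]),
    "learning_rate": tune.loguniform(1e-4, 1e-2),
    "max_steps": tune.choice([500]),
    "hist_exog_list": tune.choice([["feat_1"], ["feat_1", "feat_2"]]),
}
model = AutoNHITS(h=12, config=config, search_alg=HyperOptSearch(), num_samples=10)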
I got a similar problem when using the TFT models and got the same error. I fixed it by modifying the code in _base_model.py as below:
Description
Hello! I am trying to use the Auto models to tune hyperparameters automatically. However, I found the documentation confusing because I do not know how to specify exogenous features in the Auto models. I tried to specify them in the config as follows, but I have no idea whether I did it right. I also wonder what happens if I add exogenous features to the config: will the model use all of the features, or, as with the other tuned hyperparameters, selectively incorporate some of them? I am looking forward to responses, and I think a more detailed tutorial on how to combine the Auto models with exogenous features would be really helpful! Thanks!
from ray import tune

futr_exog_list = ["feat_1", "feat_2"]  # placeholder names for the future exogenous columns

config_nhits = {
    "input_size": tune.choice([6, 62, 63]),               # Length of input window
    "start_padding_enabled": True,
    "n_blocks": 5 * [1],                                  # Number of blocks per stack
    "mlp_units": 5 * [[64, 64]],                          # MLP hidden layer sizes per block
    "n_pool_kernel_size": tune.choice([5 * [1], 5 * [2], 5 * [4],
                                       [8, 4, 2, 1, 1]]), # MaxPooling kernel size
    "n_freq_downsample": tune.choice([[8, 4, 2, 1, 1],
                                      [1, 1, 1, 1, 1]]),  # Interpolation expressivity ratios
    "learning_rate": tune.loguniform(1e-4, 1e-2),         # Initial learning rate
    "scaler_type": tune.choice([None]),                   # Scaler type
    "max_steps": tune.choice([1000]),                     # Max number of training iterations
    "batch_size": tune.choice([1, 4, 10]),                # Number of series in batch
    "windows_batch_size": tune.choice([128, 256, 512]),   # Number of windows in batch
    "random_seed": tune.randint(1, 20),                   # Random seed
    "futr_exog_list": futr_exog_list,                     # Fixed list of future exogenous features
}
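For context, a hypothetical end-to-end sketch of how such a config is consumed; the synthetic dataframe, the feat_1/feat_2 columns, and the monthly frequency are all assumptions for illustration, not from the original post:

import numpy as np
import pandas as pd
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS

# Tiny synthetic panel so the example runs end to end; feat_1/feat_2
# match the placeholder names used in futr_exog_list above.
ds = pd.date_range("2000-01-31", periods=100, freq="M")
df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": ds,
    "y": np.random.rand(100),
    "feat_1": np.random.rand(100),
    "feat_2": np.random.rand(100),
})

nf = NeuralForecast(models=[AutoNHITS(h=12, config=config_nhits, num_samples=2)], freq="M")
nf.fit(df=df)

# Future values of the futr_exog_list columns are required at predict time.
future_ds = pd.date_range(ds[-1], periods=13, freq="M")[1:]
futr_df = pd.DataFrame({
    "unique_id": "series_1",
    "ds": future_ds,
    "feat_1": np.random.rand(12),
    "feat_2": np.random.rand(12),
})
preds = nf.predict(futr_df=futr_df)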
Link
No response