
Exogenous features in Automodels #906

Closed
kkckk1110 opened this issue Feb 28, 2024 · 4 comments

Comments

@kkckk1110

Description

Hello! I am trying to use Automodels to automatically tune the parameters. However, I found the documentation confusing because I do not know how to specify the exogenous features in Automodels. I tried to specify them in the config as follows, but I have no idea whether I did it right. I also wonder what will happen if I add exogenous features to the config. Will the model use all the features, or, like tuning other parameters, selectively incorporate some of them? I am looking forward to responses, and I think a more detailed tutorial on how to combine Automodels and exogenous features would be really helpful! Thanks!

config_nhits = {
    "input_size": tune.choice([6, 62, 63]),              # Length of input window
    "start_padding_enabled": True,
    "n_blocks": 5 * [1],                                 # Number of blocks per stack
    "mlp_units": 5 * [[64, 64]],                         # Hidden units of each MLP layer
    "n_pool_kernel_size": tune.choice([5 * [1], 5 * [2], 5 * [4],
                                       [8, 4, 2, 1, 1]]),  # MaxPooling kernel size
    "n_freq_downsample": tune.choice([[8, 4, 2, 1, 1],
                                      [1, 1, 1, 1, 1]]),   # Interpolation expressivity ratios
    "learning_rate": tune.loguniform(1e-4, 1e-2),        # Initial learning rate
    "scaler_type": tune.choice([None]),                  # Scaler type
    "max_steps": tune.choice([1000]),                    # Max number of training iterations
    "batch_size": tune.choice([1, 4, 10]),               # Number of series in batch
    "windows_batch_size": tune.choice([128, 256, 512]),  # Number of windows in batch
    "random_seed": tune.randint(1, 20),                  # Random seed
    "futr_exog_list": futr_exog_list,                    # Future exogenous features
}


@jmoralez
Member

Hey @kkckk1110, thanks for using neuralforecast. The exogenous features are handled in the same way as the other hyperparameters: you can set them to a fixed value, e.g. futr_exog_list: ['a', 'b'], or tune them as well, e.g. futr_exog_list: tune.choice([['a'], ['b'], ['a', 'b']]).
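A minimal sketch of the two patterns side by side (assuming the usual ray.tune import; 'a' and 'b' are placeholder column names from your dataframe, not real API values):

```python
from ray import tune

# Option 1: fixed value — every trial uses both features.
config_fixed = {
    "input_size": tune.choice([24, 48]),
    "futr_exog_list": ["a", "b"],
}

# Option 2: tuned — the search picks one subset per trial,
# so feature selection becomes part of the hyperparameter search.
config_tuned = {
    "input_size": tune.choice([24, 48]),
    "futr_exog_list": tune.choice([["a"], ["b"], ["a", "b"]]),
}
```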

@vidarsumo

I tried this using hist_exog_list:

from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch
from neuralforecast import NeuralForecast
from neuralforecast.auto import AutoNHITS

nhits_config = {
    "max_steps": 100,                                                         # Number of SGD steps
    "input_size": 24,                                                         # Size of input window
    "learning_rate": tune.loguniform(1e-5, 1e-1),                             # Initial learning rate
    "n_pool_kernel_size": tune.choice([[2, 2, 2], [16, 8, 1]]),               # MaxPool's kernel size
    "n_freq_downsample": tune.choice([[168, 24, 1], [24, 12, 1], [1, 1, 1]]), # Interpolation expressivity ratios
    "val_check_steps": 50,                                                    # Compute validation every 50 steps
    "random_seed": tune.randint(1, 10),                                       # Random seed
    "hist_exog_list": ['feat_1', 'feat_2', 'feat_3', 'feat_4', 'feat_5', 'feat_6']
}

model = AutoNHITS(h=21,
                  config=nhits_config,
                  search_alg=HyperOptSearch(),
                  backend='ray',
                  num_samples=3)

nf = NeuralForecast(models=[model], freq='D')
nf.fit(df=df_train, val_size=21)

where feat_x are the names of the historical exogenous features. I keep getting this error:

set(temporal_cols.tolist()) & set(self.hist_exog_list + self.futr_exog_list)
TypeError: can only concatenate tuple (not "list") to tuple

I could get this to work by also passing "futr_exog_list": [] to nhits_config.

@jmoralez
Member

That's a bug with the HyperOptSearch, which was fixed in #851 but hasn't made it into a release yet. If you can install from GitHub, it should work.
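For reference, installing the development version directly from the repository would look like this (assuming pip and the Nixtla/neuralforecast repository; sketch only, not an official instruction from the thread):

```shell
pip install git+https://github.com/Nixtla/neuralforecast.git
```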

@noahvand

noahvand commented Mar 7, 2024

I got a similar problem when using TFT models, and get the error

"set(temporal_cols.tolist()) & set(self.hist_exog_list + self.futr_exog_list)
TypeError: can only concatenate tuple (not "list") to tuple"

and I fixed it by modifying the code in _base_model.py as below:

def _get_temporal_exogenous_cols(self, temporal_cols):
    self.hist_exog_list = list(self.hist_exog_list)  # this is added
    self.futr_exog_list = list(self.futr_exog_list)  # this is added
    return list(
        set(temporal_cols.tolist()) & set(self.hist_exog_list + self.futr_exog_list)
    )
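The failure mode can be reproduced without neuralforecast at all: the search backend hands the sampled exogenous list back as a tuple, so the concatenation inside the method mixes a tuple and a list. A minimal sketch ('feat_1'/'feat_2' are placeholder names, and the tuple-valued sample is an assumption about HyperOptSearch's behavior):

```python
# Assumption: the tuner returns the sampled hyperparameter list as a tuple.
hist_exog_list = ("feat_1", "feat_2")  # tuple instead of list
futr_exog_list = []                    # plain list default

# This is the concatenation that raises inside neuralforecast:
try:
    hist_exog_list + futr_exog_list
except TypeError as exc:
    print(exc)  # can only concatenate tuple (not "list") to tuple

# Casting both sides to list, as in the patch above, makes it safe
# regardless of which container type the tuner produced:
combined = list(hist_exog_list) + list(futr_exog_list)
print(combined)  # ['feat_1', 'feat_2']
```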
