Hello, I am attempting multi-task hyperparameter tuning with Bayesian optimization for the XGBoost algorithm. I believe my scenario qualifies as a multi-task problem: my dataset comprises four variables, and I want to treat each variable in turn as the target while using the other three as features. I therefore plan to train XGBoost four times, once per target variable, and my goal is to find a single set of hyperparameters that performs well across all four tasks simultaneously. I understand this differs from multi-objective hyperparameter tuning, where several performance metrics are traded off for one model. Is multi-task hyperparameter tuning feasible with the mlrMBO package? If so, could you point me to any related tutorials or resources?
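
For reference, here is a minimal sketch of one way to frame this in mlrMBO: scalarize the four tasks into a single objective (here, the mean cross-validated RMSE across the four target choices) and tune that with ordinary single-objective MBO. This is an assumption about the intended setup, not something stated above: it presumes a data frame `dat` whose four numeric columns are the variables, regression as the task type, and equal weighting of the four tasks.

```r
library(mlrMBO)   # also loads smoof and ParamHelpers
library(xgboost)

vars <- colnames(dat)  # assumed: the four variables; each becomes a target once

# Objective: average 5-fold CV test RMSE across the four tasks.
obj.fun <- makeSingleObjectiveFunction(
  name = "xgb.multitask",
  fn = function(x) {
    errs <- sapply(vars, function(tgt) {
      dtrain <- xgb.DMatrix(
        data  = as.matrix(dat[, setdiff(vars, tgt)]),
        label = dat[[tgt]]
      )
      cv <- xgb.cv(
        params = list(
          objective = "reg:squarederror",
          eta       = x$eta,
          max_depth = x$max_depth,
          subsample = x$subsample
        ),
        data = dtrain, nrounds = 100, nfold = 5, verbose = FALSE
      )
      min(cv$evaluation_log$test_rmse_mean)
    })
    mean(errs)  # equal weighting across tasks; adjust if some tasks matter more
  },
  par.set = makeParamSet(
    makeNumericParam("eta", lower = 0.01, upper = 0.3),
    makeIntegerParam("max_depth", lower = 2L, upper = 10L),
    makeNumericParam("subsample", lower = 0.5, upper = 1)
  ),
  has.simple.signature = FALSE,  # fn receives a named list, not a vector
  minimize = TRUE
)

des  <- generateDesign(n = 12, par.set = getParamSet(obj.fun))
ctrl <- setMBOControlTermination(makeMBOControl(), iters = 25)
res  <- mbo(obj.fun, design = des, control = ctrl)
print(res$x)  # hyperparameters that minimize the averaged CV error
```

If instead you wanted to keep the four per-task errors as separate objectives and look at trade-offs between them, mlrMBO's multi-objective mode (`makeMultiObjectiveFunction` plus `setMBOControlMultiObj`) would be the alternative, but for a single shared hyperparameter configuration the scalarized objective above seems like the more direct fit.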