A finetuning issue #1
Comments
Thanks for your interest in our work. I think XTab is not merged into the official AG releases. I uploaded my own branch as a .zip file in this repo. Could you try installing AG from that?
Hi, thanks for your quick response. You mean download autogluon-submission.zip and access the files in it, right? I wonder which file contains your modified XTab that accepts the "finetune_on" parameter. Is it ft_transformer.py under the "automm" folder? Can I just extract the core code of your modified model as a custom model and then train it under AG==0.8.2? Would that cause numerous errors since the code hasn't been merged, or must I use the AG in your zip for the code to work?
I am not sure whether the pretrained checkpoint will work with AG-0.8.2 or not, since I am not aware of the latest updates. A safe way is to just use the version in this repo. Or you may try this: initialize a Transformer in AG-0.8.2, save the weights, partly replace the saved weights with the pretrained checkpoints (if compatible), and continue training with AG-0.8.2. Sorry that this can be a bit inconvenient.
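A minimal sketch of that weight-replacement idea in PyTorch, using a plain `nn.Sequential` as a stand-in for the FT-Transformer (the real XTab checkpoint keys and shapes may differ, so treat this as an assumption, not the actual AG internals):

```python
import torch
import torch.nn as nn

def make_model():
    # Stand-in for the FT-Transformer that AG-0.8.2 would initialize.
    return nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))

# Pretend these are the pretrained XTab weights (in reality they would
# come from the checkpoint shipped with the repo).
pretrained = make_model().state_dict()

# Fresh model, as AG-0.8.2 would create it.
model = make_model()
current = model.state_dict()

# Keep only entries whose name and shape both match, then merge them in.
compatible = {k: v for k, v in pretrained.items()
              if k in current and v.shape == current[k].shape}
current.update(compatible)
model.load_state_dict(current)
# From here, continue training the model with AG-0.8.2 as usual.
```

Any key that is missing or shape-incompatible is simply left at its fresh initialization, which is what makes this safe to try even when the checkpoint only partially matches.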
I don't see any method for saving or replacing the weights of FTT in the AG documentation. So I think I will have to use your version, but I am still not entirely sure how to go from 0 to 1. Can you show me the steps in detail? Also, do we need to install any other required packages?
Hi @arandomgoodguy, we are currently in the process of adding XTab's pretrained checkpoint to AutoGluon officially! You can see the PR that adds the logic here: autogluon/autogluon#3859, which you can try out yourself (refer to the added unit test that uses the XTab weights).
Hi, very interesting work. I gave XTab a try immediately after I saw it.
But there is an issue about finetuning.
I tried to install Autogluon==0.5.3 on Colab, but it failed.
So I installed 0.8.2 instead to test XTab on Colab, but at the beginning of training it raised a KeyError for the "finetune_on" parameter. There seems to be no place to specify the initial checkpoint directory.
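For concreteness, the hyperparameter fragment I passed looked roughly like this (the exact config key and the checkpoint path are my guesses based on the repo, not a verified AG 0.8.2 API):

```python
# Hypothetical config fragment; "finetune_on" is the key from the XTab
# branch, and the path is a placeholder for the pretrained checkpoint.
hyperparameters = {
    "model.names": ["ft_transformer"],
    "model.ft_transformer.finetune_on": "path/to/pretrained_checkpoint.ckpt",
}
# Passing this to MultiModalPredictor.fit() on AG 0.8.2 is what raised
# the KeyError, since 0.8.2 does not recognize "finetune_on".
```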
Do you know how to solve this?
Thank you.