
A finetuning issue #1

Open

arandomgoodguy opened this issue Oct 21, 2023 · 5 comments

@arandomgoodguy

Hi, this is highly interesting work; I gave XTab a try immediately after I saw it.

But I ran into an issue with finetuning.

I tried to install AutoGluon==0.5.3 on Colab, but the installation failed.

So I installed 0.8.2 instead to test XTab on Colab, but at the beginning of the training process the "finetune_on" parameter raised a KeyError. There seems to be no place to specify the initial checkpoint directory.
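
For reference, this is roughly what I am running (a minimal sketch; the label column, the checkpoint path, and the exact place where "finetune_on" goes in the hyperparameters are my guesses, so treat them as assumptions):

```python
from autogluon.multimodal import MultiModalPredictor
import pandas as pd

# Hypothetical repro sketch: train.csv, the "target" label column, and the
# hyperparameter key / checkpoint path below are placeholders, not confirmed names.
train_df = pd.read_csv("train.csv")
predictor = MultiModalPredictor(label="target")
predictor.fit(
    train_df,
    hyperparameters={
        # On AG 0.8.2 this key raises a KeyError at the start of training.
        "model.ft_transformer.finetune_on": "xtab_pretrained.ckpt",
    },
)
```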

Do you know how to solve this?

Thank you.

@BingzhaoZhu
Owner

Thanks for your interest in our work. I think XTab has not been merged into the official AG releases. I uploaded my own branch as a .zip file in this repo. Could you try installing AG from that?
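
If it helps, something along these lines in a Colab cell should install it (a sketch only; it assumes the zip unpacks into a standard AutoGluon source tree that ships the usual full_install.sh script, and the folder name is a placeholder):

```python
# Run in a Colab cell; the '!' and '%' lines are notebook shell/magic commands.
!unzip -q autogluon-submission.zip -d autogluon-xtab
%cd autogluon-xtab
!./full_install.sh  # installs the AutoGluon submodules from source, if present
```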

@arandomgoodguy
Author

Hi, thanks for your quick response.

You mean download autogluon-submission.zip and access the files in it, right?

Which file contains your modified XTab code that accepts the "finetune_on" parameter? Is it ft_transformer.py under the "automm" folder?

Can I just extract the core code of your modified model as a custom model and then train it under AG==0.8.2?

Will this cause numerous errors since the code hasn't been merged, or must I use the AG version in your zip for the code to work at all?

@BingzhaoZhu
Owner

I am not sure whether the pretrained checkpoint will work with AG-0.8.2 or not, since I have not followed the latest updates. The safe way is to just use the version in this repo.

Or you may try this: initialize a Transformer in AG-0.8.2, save the weights, partially replace the saved weights with the pretrained checkpoints (if compatible), and continue training with AG-0.8.2. Sorry that this can be a bit inconvenient.
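
A rough sketch of the idea (untested; the file names are placeholders, and it assumes both files are plain PyTorch state dicts whose key names at least partially match across versions):

```python
import torch

# Weights exported from a freshly initialized FT-Transformer in AG-0.8.2,
# and the XTab pretrained checkpoint (both paths are placeholders).
ag_state = torch.load("ag_ft_transformer_init.pt", map_location="cpu")
xtab_state = torch.load("xtab_pretrained.ckpt", map_location="cpu")

# Copy over only the tensors whose names and shapes match, leaving
# incompatible layers (e.g. dataset-specific embeddings or heads) untouched.
compatible = {
    k: v for k, v in xtab_state.items()
    if k in ag_state and ag_state[k].shape == v.shape
}
ag_state.update(compatible)
print(f"Replaced {len(compatible)} of {len(ag_state)} tensors")

# Save the merged weights and resume training from them in AG-0.8.2.
torch.save(ag_state, "ag_ft_transformer_warmstart.pt")
```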

@arandomgoodguy
Author

I don't see any method for saving or replacing the weights of the FT-Transformer in the AG documentation. So I think I will have to use your version, but I am still not entirely sure how to go from 0 to 1. Can you show me the steps in detail? Also, do we need to install any other required packages?

@Innixma

Innixma commented Jan 21, 2024

Hi @arandomgoodguy, we are currently in the process of adding XTab's pretrained checkpoint to AutoGluon officially!

You can see the PR that adds the logic here: autogluon/autogluon#3859, which you can try out yourself (refer to the added unit test that uses the XTab weights).
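
Once merged, finetuning from the XTab weights should be configurable through the usual TabularPredictor hyperparameters. As a rough illustration only (the actual hyperparameter names are defined in the PR and its unit test; "pretrained" and "pretrained_path" below are assumptions):

```python
from autogluon.tabular import TabularDataset, TabularPredictor

# Illustrative sketch, not the final API; check the unit test in the PR
# for the real hyperparameter names.
train_data = TabularDataset("https://autogluon.s3.amazonaws.com/datasets/Inc/train.csv")
predictor = TabularPredictor(label="class").fit(
    train_data,
    hyperparameters={
        "FT_TRANSFORMER": {
            "pretrained": True,                      # assumed flag
            "pretrained_path": "xtab_weights.ckpt",  # assumed parameter
        },
    },
)
```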
