
Cannot import 'HFTransformerBackend' from 'bertopic.backend' #664

Closed
esettouf opened this issue Aug 7, 2022 · 6 comments · Fixed by #668

Comments


esettouf commented Aug 7, 2022

Hi,
I am trying to use a custom model that I did additional training on and uploaded to Hugging Face, but I am running into problems. Right now I'm using the code snippet from here to set the custom model for the BERTopic object, but I get the following error when importing the package:

ImportError: cannot import name 'HFTransformerBackend' from 'bertopic.backend' (/usr/local/lib/python3.7/dist-packages/bertopic/backend/__init__.py)

The BERTopic library installed without problems before. Is that backend not included anymore?

Thanks a lot!

MaartenGr (Owner) commented

My bad! The documentation is misleading here. To use this correctly, you should run the following instead:

from bertopic import BERTopic
from transformers.pipelines import pipeline

hf_model = pipeline("feature-extraction", model="distilbert-base-cased")

topic_model = BERTopic(embedding_model=hf_model)
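For context on why passing the pipeline object works while importing a backend class does not: libraries like BERTopic typically dispatch on the type of the `embedding_model` argument internally. The sketch below is purely illustrative (the class and function names are simplified stand-ins, not BERTopic's actual internals):

```python
# Hypothetical sketch of type-based backend dispatch. A plain string is
# treated as a SentenceTransformers model name; a callable (such as a
# transformers pipeline) is wrapped in a Hugging Face backend.
class SentenceTransformerBackend:
    def __init__(self, model_name):
        self.model_name = model_name


class HFTransformerBackend:
    def __init__(self, pipe):
        self.pipe = pipe


def select_backend(embedding_model):
    """Pick an embedding backend based on what the caller passed in."""
    if isinstance(embedding_model, str):
        return SentenceTransformerBackend(embedding_model)
    if callable(embedding_model):
        return HFTransformerBackend(embedding_model)
    raise TypeError(f"Unsupported embedding model: {embedding_model!r}")
```

Under this scheme the user never imports a backend class directly; passing the pipeline (or a model-name string) to `BERTopic(embedding_model=...)` is enough.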


esettouf commented Aug 8, 2022

Thanks for your quick response! Using that model works now, although the results aren't very satisfying yet. I guess I still have some work to do on my model.
And thank you for the great tool and ongoing support!

MaartenGr (Owner) commented

No problem, glad I can be of help 😄


rjac-ml commented Aug 16, 2022

I have a question about using the HF feature-extraction pipeline and execution time.

I ran the following code, but it took 4 hours. Should I make sure that the pipeline returns just the CLS embedding rather than all of the word embeddings?

vectorizer_model = CountVectorizer(stop_words="english", ngram_range=(1, 4), min_df=10, max_df=0.95)
umap_model = UMAP(n_neighbors=350, n_components=36, min_dist=0.0, metric="cosine")
embedder = pipeline("feature-extraction", model="microsoft/mpnet-base")
topic_model = BERTopic(
    vectorizer_model=vectorizer_model,
    embedding_model=embedder,
    umap_model=umap_model,
    verbose=True,
    nr_topics="auto",
)

With similar code, the embedding part took only 5 minutes:

vectorizer_model = CountVectorizer(stop_words="english", ngram_range=(1, 4), min_df=10, max_df=0.95)
umap_model = UMAP(n_neighbors=350, n_components=36, min_dist=0.0, metric="cosine")
topic_model = BERTopic(
    vectorizer_model=vectorizer_model,
    embedding_model="all-mpnet-base-v2",
    umap_model=umap_model,
    verbose=True,
    nr_topics="auto",
)

MaartenGr (Owner) commented

@rjac-ml The reason for the difference in speed is that you are using two different embedding models. microsoft/mpnet-base is a different model from sentence-transformers/all-mpnet-base-v2, so it is expected that their speeds differ.

If you were to use the same model, the speeds would be much more comparable. Do note that SentenceTransformers is optimized and likely to be faster than a raw Hugging Face pipeline.
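On the CLS-vs-word-embeddings question above: a "feature-extraction" pipeline returns one vector per token, so a single document vector has to be pooled from them (mean pooling is a common choice; taking only the CLS token is another). A minimal stdlib-only sketch with made-up numbers:

```python
# Illustrative shape of a "feature-extraction" output for one document:
# one vector per token (here 3 tokens, 4 dimensions each).
token_vectors = [
    [0.1, 0.2, 0.3, 0.4],
    [0.5, 0.6, 0.7, 0.8],
    [0.9, 1.0, 1.1, 1.2],
]


def mean_pool(vectors):
    """Average the per-token vectors into a single document embedding."""
    dim = len(vectors[0])
    n = len(vectors)
    return [sum(vec[i] for vec in vectors) / n for i in range(dim)]


doc_embedding = mean_pool(token_vectors)
```

Note that pooling happens after the transformer forward pass, so it mainly affects memory and post-processing; the bulk of a long runtime comes from the forward passes themselves.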


rjac-ml commented Aug 17, 2022

Hey @MaartenGr, thanks for the response. Understood.

@MaartenGr MaartenGr mentioned this issue Aug 30, 2022
MaartenGr added a commit that referenced this issue Sep 11, 2022
* Online/incremental topic modeling with .partial_fit
* Expose c-TF-IDF model for customization with bertopic.vectorizers.ClassTfidfTransformer
* Expose attributes for easier access to internal data
* Major changes to the Algorithm page of the documentation, which now contains three overviews of the algorithm
* Added an example of combining BERTopic with KeyBERT
* Added many tests with the intention of making development a bit more stable
* Fix #632, #648, #673, #682, #667, #664