Add example #30
Comments
Hi @CodeAKrome, I am struggling to understand the logic for segmenting the text around the target entity. I'd appreciate your example program; where can I find it?
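In short, the segmentation amounts to splitting the text into three strings: everything before the target mention, the mention itself, and everything after it, which are then passed to TargetSentimentClassifier.infer_from_text. A minimal sketch with a hand-picked target (the sentence and target below are invented for illustration); the full example in the next comment does the same thing, but lets an NER model find the targets:

```python
from NewsSentiment import TargetSentimentClassifier

tsc = TargetSentimentClassifier()

text = "I like Peter because he is a nice guy."
target = "Peter"
start = text.index(target)
end = start + len(target)

# left context, target mention, right context
sentiment = tsc.infer_from_text(text[:start], text[start:end], text[end:])
print(sentiment[0]["class_label"], round(sentiment[0]["class_prob"], 2))
```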
Sorry for the long response time. I need to find out how to get notifications for these types of events, or at least know where to look. Anyway, here it is if you still need it. Good on ya for sticking with things. I wish you luck.

You can grab the file with:

```sh
gh repo clone CodeAKrome/NewsMTSC
cat NewsSentiment/ner_example.py
```

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

from NewsSentiment import TargetSentimentClassifier

# Token-level NER pipeline based on the dslim/bert-large-NER checkpoint.
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-large-NER")
model = AutoModelForTokenClassification.from_pretrained("dslim/bert-large-NER")
nlp = pipeline("ner", model=model, tokenizer=tokenizer)

TEXT = (
    "On some great and glorious day the plain folks of the land will reach "
    "their heart's desire at last, and the White House will be adorned by a "
    "downright moron."
)

# Each detected span is a dict with 'word', 'entity', 'start', and 'end' keys.
ner_spans = nlp(TEXT)
ents = [span["word"] for span in ner_spans]
print(f"Entities: {ents}")

tsc = TargetSentimentClassifier()
for span in ner_spans:
    # Segment the text around the target entity: left context, target mention, right context.
    l = TEXT[:span["start"]]
    m = TEXT[span["start"]:span["end"]]
    r = TEXT[span["end"]:]
    sentiment = tsc.infer_from_text(l, m, r)
    print(f"{span['entity']}\t{sentiment[0]['class_label']}\t{sentiment[0]['class_prob']:.2f}\t{m}")
```
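A caveat not raised in the thread: without an aggregation strategy, the token-classification pipeline yields one span per word piece, so a multi-token entity such as "White House" gets scored piece by piece. A possible variant, continuing from the variables in the example above and assuming a transformers version that supports aggregation_strategy, groups the pieces into whole entities first:

```python
# Merge word pieces into whole entities; each result dict then carries
# 'entity_group', 'word', 'start', and 'end' instead of per-piece labels.
nlp_grouped = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

for span in nlp_grouped(TEXT):
    left = TEXT[:span["start"]]
    target = TEXT[span["start"]:span["end"]]
    right = TEXT[span["end"]:]
    sentiment = tsc.infer_from_text(left, target, right)
    print(f"{span['entity_group']}\t{sentiment[0]['class_label']}\t{target}")
```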
Added a link to your comment @CodeAKrome - thanks! This will be shown in the PyPI readme once the next version is uploaded.
Excellent work. I was literally working on this very thing when I found it.
I wrote a short example program to explicitly show one way to use it.