
Problem with Binary classification #5

Open
navid5792 opened this issue Jul 28, 2018 · 2 comments
@navid5792

If we set fine_grain to False, it performs binary classification. But with binary, I can see there are three labels: 0, 1, 2. So is it really a binary task?

@ttpro1995
Owner

ttpro1995 commented Jul 31, 2018

Because a non-root node (which covers only part of the sentence) is allowed to be neutral (so is the dataset: https://nlp.stanford.edu/sentiment/treebank.html).
In the binary task we only filter out neutral sentences, but keep full sentences with positive or negative sentiment (most sentences contain neutral tree spans).
So label 1 is necessary for non-root nodes. At the root node, label 1 is ignored, and whichever of 0 or 2 has the higher score decides the sentiment of the sentence.

Sample
Look at this sample: the root is positive, but some of the non-root nodes are neutral.
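To illustrate the decision described above, here is a minimal sketch of how binary sentiment could be read off the 3-class output at the root by ignoring the neutral class. The function and variable names (root_binary_prediction, root_logits) are hypothetical, not taken from this repository.

```python
import torch

# Hypothetical sketch: reading a binary sentiment off a 3-class output at the
# root node, assuming class indices 0 = negative, 1 = neutral, 2 = positive.
def root_binary_prediction(root_logits: torch.Tensor) -> int:
    """Ignore the neutral class (index 1) and pick whichever of
    negative (0) or positive (2) has the higher score."""
    neg_score = root_logits[0]
    pos_score = root_logits[2]
    return 2 if pos_score > neg_score else 0

# Usage: suppose the tree model produced these logits at the root.
logits = torch.tensor([0.3, 1.5, 0.9])  # neutral has the highest raw score
print(root_binary_prediction(logits))   # -> 2 (positive); neutral is skipped
```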

@navid5792
Author

OK, I get it now. You can have a softmax over 3 classes at the internal nodes of the tree, but at the root, rather than keeping the softmax and ignoring the neutral class, it is possible to use a sigmoid layer with BCELoss as the loss function.
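As one way to express this suggestion, here is a sketch that keeps a 3-way cross-entropy loss on internal nodes while training the root with a single sigmoid unit via BCEWithLogitsLoss. The function names, tensor shapes, and the way the two terms are summed are assumptions for illustration, not this repository's implementation.

```python
import torch
import torch.nn as nn

# Hypothetical sketch of the suggestion above: 3-way softmax loss on internal
# (non-root) nodes, plus a single-logit sigmoid objective at the root.
ce_loss = nn.CrossEntropyLoss()       # internal nodes: classes 0/1/2
bce_loss = nn.BCEWithLogitsLoss()     # root: one logit, label in {0, 1}

def combined_loss(internal_logits, internal_labels, root_logit, root_label):
    """internal_logits: (num_internal_nodes, 3); internal_labels: (num_internal_nodes,)
    root_logit: scalar tensor; root_label: scalar tensor, 0 = negative, 1 = positive."""
    loss_internal = ce_loss(internal_logits, internal_labels)
    loss_root = bce_loss(root_logit, root_label.float())
    return loss_internal + loss_root

# Usage with dummy tensors:
internal_logits = torch.randn(4, 3)
internal_labels = torch.tensor([1, 0, 2, 1])
root_logit = torch.randn(())          # scalar logit from the root classifier
root_label = torch.tensor(1)          # positive sentence
print(combined_loss(internal_logits, internal_labels, root_logit, root_label))
```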
