If we set fine_grain to False, it performs binary classification. But even with the binary setting, I can see there are three labels: 0, 1, 2. So is it really a binary task?
Because a non-root node (which covers only part of the sentence) is allowed to be neutral; that is how the dataset itself is annotated (https://nlp.stanford.edu/sentiment/treebank.html).
For the binary task we only filter out sentences whose root label is neutral, but we keep the full trees of the remaining positive and negative sentences (most sentences contain at least one neutral span).
So label 1 is still needed for the non-root nodes. At the root node, label 1 is ignored: we compare the scores of labels 0 and 2 and take the higher one as the sentiment of the sentence.
Look at this sample: the root is positive, but several non-root nodes are neutral.
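A minimal sketch of that logic, for readers who want it spelled out. The function and attribute names (`keep_for_binary_task`, `tree.root_label`) are hypothetical illustrations, not the repo's actual API; only the label convention (0 negative, 1 neutral, 2 positive) comes from the discussion above.

```python
import torch
import torch.nn.functional as F

NEG, NEUTRAL, POS = 0, 1, 2  # labels used for every node in the tree

def keep_for_binary_task(trees):
    """Drop sentences whose *root* label is neutral; inner nodes may
    still be neutral, which is why label 1 remains in the data."""
    return [t for t in trees if t.root_label != NEUTRAL]

def binary_root_prediction(root_logits: torch.Tensor) -> int:
    """Given 3-way logits at the root, ignore the neutral class and
    return 0 (negative) or 2 (positive), whichever scores higher."""
    probs = F.softmax(root_logits, dim=-1)
    return NEG if probs[NEG] > probs[POS] else POS
```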
OK, I get it now. You can have a softmax over 3 classes at the internal nodes of the tree, but at the root, rather than keeping the softmax and ignoring the neutral class, it is possible to use a sigmoid layer with BCELoss as the loss function.
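A hedged sketch of that alternative: a single-logit classifier on the root representation. `root_hidden` and `hidden_dim` are assumed placeholders for the TreeLSTM root state, and I use `BCEWithLogitsLoss` rather than a separate Sigmoid + BCELoss purely for numerical stability; the idea is the same.

```python
import torch
import torch.nn as nn

hidden_dim = 150                              # assumed size of the root hidden state
root_classifier = nn.Linear(hidden_dim, 1)    # one logit: positive vs. negative
criterion = nn.BCEWithLogitsLoss()            # sigmoid + binary cross-entropy in one op

root_hidden = torch.randn(1, hidden_dim)      # stand-in for the actual TreeLSTM root state
target = torch.tensor([[1.0]])                # 1.0 = positive sentence, 0.0 = negative

logit = root_classifier(root_hidden)
loss = criterion(logit, target)
prediction = (torch.sigmoid(logit) > 0.5).long()  # 1 = positive, 0 = negative
```

The internal nodes would keep their 3-class softmax and loss; only the root head changes.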