Clarification on README #46
Comments
I have these same questions, thanks.
One can get an idea from this snippet, I think, by looking at the meaning of the spaCy properties it uses:

    nbow = {"first": ("#1", [0, 1], numpy.array([1.5, 0.5], dtype=numpy.float32)),
            "second": ("#2", [0, 1], numpy.array([0.75, 0.15], dtype=numpy.float32))}
    calc = WMD(embeddings, nbow, vocabulary_min=2)

I think nbow is a dictionary of documents. Each key is the identifier of a document, and the value is a tuple containing: a human-readable name (first element); the tokens that appear in the document, translated into the word ids of your word2vec model (second element); and the normalized bag-of-words weights of the document (third element), which should sum to 1. In my own work, I query an English sentence to find the Chinese sentence with the shortest WMD to it.
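The tuple structure described above can be sketched in plain Python. Everything here is illustrative: `vocab` and `to_nbow` are hypothetical helpers, not part of wmd-relax, and the weights-sum-to-1 normalization is an assumption taken from the comment above, not confirmed by the README (the quoted example's weights do not actually sum to 1).

```python
import numpy

# Hypothetical toy vocabulary: token -> row index in the embeddings matrix.
vocab = {"cat": 0, "sat": 1, "mat": 2}

def to_nbow(name, tokens):
    """Build one nBOW value: (human-readable name, word ids, weights summing to 1)."""
    ids = sorted({vocab[t] for t in tokens})
    inverse = {i: t for t, i in vocab.items()}
    counts = numpy.array([tokens.count(inverse[i]) for i in ids], dtype=numpy.float32)
    return (name, ids, counts / counts.sum())

nbow = {"first": to_nbow("#1", ["cat", "sat", "cat"])}
# nbow["first"] -> ("#1", [0, 1], weights of roughly [0.667, 0.333])
```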
Could somebody please open a PR with the documentation fix? Thanks!
Hello,
I can't fully understand the documentation; would you mind clarifying some points for me?
Firstly:
embeddings = numpy.array([[0.1, 1], [1, 0.1]], dtype=numpy.float32)
this is an array containing the embedding of each single word. If I have 20 sentences, each of 10 words, and each word is represented by a 300-dimensional vector, embeddings will be (20 x 10 x 300), right?
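A hedged reading of the snippet quoted in the comment above: since the id lists such as [0, 1] appear to index into `embeddings`, it seems to be a 2-D matrix with one row per distinct vocabulary word rather than one entry per sentence. This per-word layout is an inference, not something the README states. A minimal sketch under that assumption:

```python
import numpy

# Assumed layout (inferred, not confirmed by the README): one row per
# distinct vocabulary word. With 30 distinct words and 300-dimensional
# vectors, the shape would be (30, 300), regardless of sentence count.
vocab_size, dim = 30, 300
embeddings = numpy.zeros((vocab_size, dim), dtype=numpy.float32)

# The word ids in the nBOW tuples, e.g. [0, 1], index rows of this matrix.
word_vector = embeddings[0]  # the 300-dimensional vector for word id 0
```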
Then in the documentation I only found this:
So the "#1" is just an index. I'm sorry, but I can't understand what the [0, 1] and the numpy.array([1.5, 0.5]) are supposed to represent. I think the second one is supposed to be the weight of each word, calculated using the term frequency; isn't it supposed to sum up to 1? And what items are identified by the [0, 1]?
I'm sorry if I'm missing something; from there I just can't understand what's going on. I'm available for a private chat if you have some time to spare. Thank you very much.
Mattia