Releases: winkjs/wink-nlp

Fixed some type definitions

Version 2.3.1 November 24, 2024

🐛 Fixes

  • Updated some BM25Vectorizer method types to match the implementation (thanks to @pavloDeshko).
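
A minimal BM25Vectorizer usage sketch for context; the require path and the learn()/vectorOf() calls are assumptions based on typical wink-nlp usage, not taken from this release:

```js
// Sketch only: the require path for BM25Vectorizer and the learn()/vectorOf()
// calls are assumptions based on typical wink-nlp usage.
const winkNLP = require( 'wink-nlp' );
const model = require( 'wink-eng-lite-web-model' );
const BM25Vectorizer = require( 'wink-nlp/utilities/bm25-vectorizer' );

const nlp = winkNLP( model );
const its = nlp.its;
const bm25 = BM25Vectorizer();

// Learn the corpus, one tokenized document at a time.
[ 'the quick brown fox', 'the lazy dog slept' ].forEach( ( text ) =>
  bm25.learn( nlp.readDoc( text ).tokens().out( its.normal ) )
);

// BM25-weighted vector of a new document's tokens.
console.log( bm25.vectorOf( nlp.readDoc( 'the fox and the dog' ).tokens().out( its.normal ) ) );
```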

Enabled handling of more special space characters

Version 2.3.0 May 19, 2024

✨ Features

  • Detokenization now restores the em, en, third-em, quarter-em, thin, hair, and medium mathematical space characters, along with the narrow no-break space, in addition to the regular non-breaking space. 👏 🙌 🛰️
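
A small round-trip sketch, assuming the as.text helper performs the detokenization and using a narrow no-break space as the illustrative input:

```js
// Sketch only: assumes the as.text helper performs the detokenization.
const winkNLP = require( 'wink-nlp' );
const model = require( 'wink-eng-lite-web-model' );

const nlp = winkNLP( model );
const its = nlp.its;
const as = nlp.as;

// Input containing a narrow no-break space (U+202F) between "10" and "km".
const text = 'It is 10\u202Fkm away.';
const doc = nlp.readDoc( text );

// Detokenization should restore the special space at its original position.
console.log( doc.tokens().out( its.value, as.text ) === text );
```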

Improved error handling in contextual vectors

Version 2.2.2 May 8, 2024

✨ Features

  • .contextualVectors() now throws an error if (a) word vectors are not loaded, or (b) lemma: true is set but "pos" is missing from the NLP pipe. 🤓
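
A loading sketch that satisfies both conditions; passing the embeddings as winkNLP's third argument is an assumption here, and option names beyond lemma are omitted:

```js
// Sketch only: embeddings passed as winkNLP's third argument is an assumption.
const winkNLP = require( 'wink-nlp' );
const model = require( 'wink-eng-lite-web-model' );
// (a) Word vectors must be loaded...
const vectors = require( 'wink-embeddings-sg-100d' );

// ...and (b) the pipe must include 'pos' whenever lemma: true is used.
const nlp = winkNLP( model, [ 'sbd', 'pos' ], vectors );

const doc = nlp.readDoc( 'Bats use echolocation to navigate at night.' );
const customVectors = doc.contextualVectors( { lemma: true } );
```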

🐛 Fixes

  • Refined TypeScript definitions further. ✅

Added missing TypeScript definitions

Version 2.2.1 May 06, 2024

🐛 Fixes

  • Added missing TypeScript definitions for word embeddings, along with a few other TypeScript fixes. ✅

Added non-breaking space handling capabilities

Version 2.2.0 April 3, 2024

✨ Features

  • Detokenization restores both regular and non-breaking spaces to their original positions. 🤓

Introducing cosine similarity for word vectors

Version 2.1.0 March 24, 2024

✨ Features

  • You can now use similarity.vector.cosine( vectorA, vectorB ) to compute similarity between two vectors on a scale of 0 to 1. 🤓
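
A minimal sketch; the require path for the similarity utilities is an assumption based on wink-nlp's other similarity helpers:

```js
// Sketch only: the require path for the similarity utilities is an assumption.
const similarity = require( 'wink-nlp/utilities/similarity.js' );

const vectorA = [ 0.12, 0.34, 0.56 ];
const vectorB = [ 0.10, 0.31, 0.59 ];

// Returns a score on the 0 to 1 scale described above.
console.log( similarity.vector.cosine( vectorA, vectorB ) );
```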

Word embeddings have arrived!

Version 2.0.0 March 24, 2024

✨ Features

  • Seamless word embedding integration enhances winkNLP's semantic capabilities. 🎉 👏 🙌
  • Pre-trained 100-dimensional word embeddings for over 350,000 English words released: wink-embeddings-sg-100d. 💯
  • The API remains unchanged, so existing projects need no code updates. The new APIs include (see the sketch after this list): 🤩
    • Obtain the vector of a token: Use the .vectorOf( token ) API.
    • Compute sentence/document embeddings: Use the as.vector helper, e.g. .out( its.lemma, as.vector ) on the tokens of a sentence or document. You can also use its.value or its.normal. Tokens can be pre-processed with the .filter() API to remove stop words, etc. Note that the as.vector helper uses an averaging technique.
    • Generate contextual vectors: Use the .contextualVectors() method on a document. This is useful for pure browser-side applications: generate custom vectors contextually relevant to your corpus and use them in place of the larger pre-trained wink embeddings.
  • Comprehensive documentation along with interesting examples is coming up shortly. Stay tuned for updates! 😎
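
A combined usage sketch of the new APIs; the winkNLP() third argument for embeddings and the receiver of .vectorOf() are assumptions here, not confirmed by this release note:

```js
// Sketch only: the winkNLP() third argument for embeddings and the receiver
// of .vectorOf() are assumptions based on the notes above.
const winkNLP = require( 'wink-nlp' );
const model = require( 'wink-eng-lite-web-model' );
const vectors = require( 'wink-embeddings-sg-100d' );

const nlp = winkNLP( model, [ 'sbd', 'pos' ], vectors );
const its = nlp.its;
const as = nlp.as;

const doc = nlp.readDoc( 'Word embeddings have arrived in winkNLP!' );

// Sentence/document embedding: average of the selected tokens' vectors.
const docVector = doc.tokens()
  .filter( ( t ) => !t.out( its.stopWordFlag ) ) // optional stop-word removal
  .out( its.lemma, as.vector );

// Token vector lookup (receiver and string argument assumed).
const tokenVector = nlp.vectorOf( doc.tokens().itemAt( 0 ).out( its.value ) );

// Contextual vectors for browser-side use.
const contextualVectors = doc.contextualVectors();

console.log( docVector );
```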

Added Deno example

Version 1.14.3 July 21, 2023

✨ Features

  • Added a live example showing how to run winkNLP on Deno. 👍
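
A hypothetical sketch of the approach using Deno's npm: specifiers; the released live example may be structured differently:

```js
// Hypothetical: assumes Deno's npm: specifiers resolve both packages.
import winkNLP from 'npm:wink-nlp';
import model from 'npm:wink-eng-lite-web-model';

const nlp = winkNLP( model );
const its = nlp.its;

const doc = nlp.readDoc( 'winkNLP runs on Deno as well.' );
console.log( doc.tokens().out( its.value ) );
```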

Fixed a bug

Version 1.14.2 July 1, 2023

🐛 Fixes

Squashed a bug

Version 1.14.1 June 11, 2023

🐛 Fixes