
Tokenizer

Tokenizer is a simple text tokenizer written in Go. It splits a string into word tokens while filtering out a configurable set of stopwords.

Installation

go get github.com/flavioltonon/tokenizer

Usage

// Import the package
import "github.com/flavioltonon/tokenizer"

// Create a tokenizer with a set of stopwords
t := tokenizer.New("foo", "bar", "baz")

// Tokenize a text: stopwords are dropped and the remaining words are returned as tokens
tokens := t.Tokenize("123 foo 456 bar 789 baz qux") // []string{"123", "456", "789", "qux"}
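
For reference, a complete program built around the same calls could look like the minimal sketch below (it assumes only the New and Tokenize calls shown above, and prints the resulting tokens):

package main

import (
	"fmt"

	"github.com/flavioltonon/tokenizer"
)

func main() {
	// Build a tokenizer that treats "foo", "bar" and "baz" as stopwords
	t := tokenizer.New("foo", "bar", "baz")

	// Split the input into tokens, dropping any stopwords
	tokens := t.Tokenize("123 foo 456 bar 789 baz qux")

	fmt.Println(tokens) // [123 456 789 qux]
}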
