Go BPE tokenizer (encoder + decoder) for GPT-2 and GPT-3.
GPT-2 and GPT-3 use byte pair encoding (BPE) to turn text into a sequence of integers to feed into the model. This is a Go implementation of OpenAI's original Python encoder/decoder.
This code was inspired by a JavaScript implementation and was partially generated by OpenAI itself!
```sh
go get github.com/samber/go-gpt-3-encoder
```

```go
import tokenizer "github.com/samber/go-gpt-3-encoder"
```
```go
encoder, err := tokenizer.NewEncoder()
if err != nil {
    log.Fatal(err)
}

str := "This is an example sentence to try encoding out on!"
encoded, err := encoder.Encode(str)
if err != nil {
    log.Fatal(err)
}

fmt.Println("We can look at each token and what it represents:")
for _, token := range encoded {
    fmt.Printf("%d -- %s\n", token, encoder.Decode([]int{token}))
}

decoded := encoder.Decode(encoded)
fmt.Printf("We can decode it back into: %s\n", decoded)
```
Some corner cases are not covered by this library. See the @TODO comments in the tests.