Tokenizer

Before your text is sent to the AI, it is split into pieces called tokens, and each token is mapped to a number. This process is called tokenization, and these tokens are the units the AI actually reads and interprets.

The average token is around 4 characters long, but many common words are a single token on their own.
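To make this concrete, here is a minimal sketch of tokenization in Python using the Hugging Face transformers library. Note that this does not load NerdStash Tokenizer v2 itself; the widely available GPT-2 tokenizer stands in purely to illustrate how text maps to tokens and token IDs.

```python
# Minimal tokenization sketch. Assumes the "transformers" package is
# installed; the GPT-2 tokenizer is an illustrative stand-in, not
# NerdStash Tokenizer v2.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

text = "Tokenization turns text into numbers."
token_ids = tokenizer.encode(text)                    # integer token IDs
tokens = tokenizer.convert_ids_to_tokens(token_ids)   # readable pieces

print(tokens)
# e.g. ['Token', 'ization', 'Ġturns', 'Ġtext', 'Ġinto', 'Ġnumbers', '.']
print(f"Tokens: {len(token_ids)}  Characters: {len(text)}")
```

Notice how "Tokenization" splits into two tokens while short common words like "text" and "into" each cost a single token, which is why the character-per-token average hovers around 4 for English text.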

[Interactive widget: NerdStash Tokenizer v2. Type text to see its live token and character counts.]