The difference between tokens and words

John Cook · March 7, 2025, 6:36 p.m.
Summary
This post explains the distinction between tokens and words in the context of large language models, focusing on byte pair encoding (BPE) as the method of tokenization and referencing the tokenizer used by GPT-3.5 and GPT-4.
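
As a concrete illustration (a minimal sketch, assuming the Python tiktoken package, whose cl100k_base encoding is the tokenizer used by GPT-3.5 and GPT-4), the following compares the number of words in a short sentence with the number of tokens it produces:

    # Sketch: compare word count with token count using tiktoken's
    # cl100k_base encoding (the encoding used by GPT-3.5 and GPT-4).
    import tiktoken

    enc = tiktoken.get_encoding("cl100k_base")

    text = "Tokenization splits text into subword units."
    tokens = enc.encode(text)

    print("words: ", len(text.split()))   # 6 words
    print("tokens:", len(tokens))         # typically more than 6
    # Show the text each token maps to; note that a word like
    # "Tokenization" is split into several subword pieces.
    print([enc.decode([t]) for t in tokens])

The point of the example is that tokens and words rarely line up one-to-one: common short words may map to a single token, while longer or rarer words are broken into multiple subword tokens.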