Thursday, November 6, 2025

Tiny Text Predictor Model (How AI Predicts Text)

Modern natural-language models use an architecture called a Transformer. A Transformer predicts the next token (a word or a piece of a word) based on the input it receives, using patterns it learned during training. During training, the model is exposed to millions of examples of text and learns, as probabilities, which words or tokens are most likely to follow others in different contexts.
For example, if the model sees the phrase “An electron has a …”, it has learned from many texts that “negative charge” is a very probable continuation. This is similar to how humans learn patterns of language from their environment and can often predict what word comes next. However, unlike the model, humans do not choose the next word purely from learned patterns—they rely on reasoning, intention, context, and their own understanding of what they want to express. Humans select the word that best conveys their thoughts, not simply the word with the highest statistical likelihood.
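The idea of "most probable continuation" can be sketched in a few lines. The counts below are made up for illustration; a real model derives its probabilities from vast amounts of training text, but the final step is the same: turn counts into probabilities and favor the likeliest token.

```python
from collections import Counter

# Hypothetical counts of words seen after the context "has a"
# in some training text (illustrative numbers, not real data).
continuations = Counter({"negative": 7, "positive": 2, "mass": 1})

total = sum(continuations.values())
# Convert raw counts to probabilities.
probs = {tok: n / total for tok, n in continuations.items()}
# The most probable next token wins.
best = max(probs, key=probs.get)
print(best, round(probs[best], 2))  # negative 0.7
```

A real language model does not always take the single most probable token; it usually samples from this distribution, which is why outputs vary between runs.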

This demo learns simple word patterns from the training text below and tries to predict how sentences continue. Type a short prompt and click Generate, or edit the training text and rebuild the model.


Training text (editable, temporary)

The model learns from this text using a simple 3-gram language model: it predicts each word from up to the two preceding words. You can paste your own text here and click Rebuild model. Changes live only in your browser.
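A 3-gram model of this kind can be sketched as follows. This is a minimal illustration, not the demo's actual code: it counts which word follows each one- and two-word context in the training text, then suggests continuations for a prompt, backing off from the last two words to the last word when the longer context was never seen.

```python
from collections import Counter, defaultdict

def build_model(text):
    """Map each 1- and 2-word context to a Counter of following words."""
    words = text.lower().split()
    model = defaultdict(Counter)
    for a, b in zip(words, words[1:]):
        model[(a,)][b] += 1            # 1-word contexts (bigram counts)
    for a, b, c in zip(words, words[1:], words[2:]):
        model[(a, b)][c] += 1          # 2-word contexts (trigram counts)
    return model

def suggest(model, prompt, k=3):
    """Top-k next-word suggestions, using up to the last two prompt words."""
    words = prompt.lower().split()
    for n in (2, 1):                   # try the longer context first
        if len(words) >= n:
            followers = model.get(tuple(words[-n:]))
            if followers:
                return [w for w, _ in followers.most_common(k)]
    return []

# Toy training text (illustrative only).
training = "an electron has a negative charge a proton has a positive charge"
model = build_model(training)
print(suggest(model, "electron has a"))
```

With such a small corpus the suggestions simply mirror the handful of continuations seen in training, which is why editing the training text in the demo immediately changes the predictions.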

