GPT-NEO is the name for a family of open-source, GPT-style language models created by EleutherAI.

Most of the text generated by GPT-NEO on this site was produced on a local machine using an RTX 2070 GPU with 8GB of memory, which limits text generation to relatively small models.

The models used are the 125M-parameter or 1.3B-parameter GPT-NEO models from Hugging Face.
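
As a rough illustration, here is a minimal sketch of how one of these models might be loaded and run locally with the Hugging Face `transformers` library. The prompt, sampling settings, and the choice to run in half precision are assumptions for illustration, not the exact configuration used for the text on this site.

```python
# Minimal sketch: load a GPT-Neo model from Hugging Face and generate text locally.
# Settings below (dtype, prompt, sampling parameters) are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neo-125M"  # or "EleutherAI/gpt-neo-1.3B" if VRAM allows

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision helps the 1.3B model fit in 8GB
).to("cuda")

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=100,
        do_sample=True,
        temperature=0.9,
        top_p=0.95,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The 125M model runs comfortably on an 8GB card; the 1.3B model generally needs half precision (as above) to leave headroom for generation.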