Text Synth

Text completion using the GPT-2 language model, a neural network with 345 million parameters. Type some text and let the neural network complete it. Each try returns a different, randomly sampled completion.
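
The page does not describe how the completions are generated server-side. As an illustration only, here is a minimal sketch of sampled completion with a 345M-parameter GPT-2, written with the Hugging Face `transformers` library (an assumption; the page itself may use a different implementation). Because sampling is enabled, each run produces a different completion, matching the behaviour described above.

```python
# Sketch only: sampled text completion with the 345M GPT-2 checkpoint,
# using the Hugging Face `transformers` library (assumed, not confirmed
# to be what this page runs). "gpt2-medium" is the 345M model.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2-medium")

prompt = "The quick brown fox"

# do_sample=True draws tokens at random from the model's distribution,
# so every call returns a different completion.
for result in generator(prompt, max_length=50, do_sample=True,
                        top_k=50, num_return_sequences=3):
    print(result["generated_text"])
    print("---")
```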

Completed Text: