This is my attempt at generating an endless stream of Jeopardy questions with an LLM (GPT-2) via the transformers library. The script randomly picks one topic at a time from a 100-topic array and feeds it to GPT-2 as a prompt.
Unfortunately, it works very (very) poorly! Swapping in a different, stronger model while keeping the same general setup could probably lead to cool results!
Maybe the code could be interesting to look at...
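For reference, here is a minimal sketch of the general setup described above: pick a random topic from a list and hand it to GPT-2 through a transformers text-generation pipeline. The topic names, prompt wording, and generation settings are placeholders, not the project's exact values.

```python
import random
from transformers import pipeline, set_seed

# Placeholder topics -- the real project uses a 100-topic array.
TOPICS = ["World History", "Astronomy", "Classic Literature", "Geography"]

generator = pipeline("text-generation", model="gpt2")
set_seed(42)  # optional, for reproducible sampling

def random_jeopardy_clue():
    """Pick a random topic and ask GPT-2 to continue a Jeopardy-style prompt."""
    topic = random.choice(TOPICS)
    prompt = f"Jeopardy! Category: {topic}. Clue:"  # assumed prompt format
    result = generator(
        prompt,
        max_new_tokens=40,       # keep the clue short
        num_return_sequences=1,
        do_sample=True,          # sampling, so each call produces a different clue
    )
    return result[0]["generated_text"]

if __name__ == "__main__":
    # "Infinite series" = just keep looping; three iterations shown here.
    for _ in range(3):
        print(random_jeopardy_clue(), "\n")
```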