>>> from gremllm import Gremllm
>>> player = Gremllm('dungeon_game_player')
>>> player.go_into_cave()
'Player has entered the cave.'
>>> player.look_around()
{'location': 'cave', 'entered_cave_at': '2025-07-02T21:59:02.136960'}
>>> player.pick_up_rock()
'You picked up a rock.'
>>> player.inventory()
['rock']
(further attempts at this have ... varying results ...)

The framework he and I built kept track of the game state over time and allowed saving and loading games as JSON. We could then send the full JSON to an LLM as part of the prompts to get it to react. The neatest part, imo, was when we realized we could have the LLM generate text for parts of the story, then analyze what it said to detect any items, locations, or characters not in the game state, and then have it create JSON representations of those hallucinated objects that could be inserted into the game state. That sealed the deal for using hallucinations as creative storytelling inside the context of a game.
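For what it's worth, that hallucination-capture step can be sketched in a few lines. Everything below is illustrative, not the actual framework: the function name, the state shape, and the fixed candidate list (which the real system would get by asking the LLM to extract entities from its own output) are all my assumptions.

```python
import json

def capture_hallucinations(story_text, game_state, candidates):
    """Add JSON records for candidate entities the LLM mentioned
    but that aren't already in the saved game state."""
    known = {item["name"] for item in game_state["items"]}
    for name in candidates:
        if name.lower() in story_text.lower() and name not in known:
            # Materialize the hallucinated object as game-state JSON.
            game_state["items"].append(
                {"name": name, "type": "item", "source": "llm_hallucination"}
            )
    return game_state

game_state = {"items": [{"name": "rock", "type": "item", "source": "authored"}]}
story = "You spot a rusty lantern beside the rock."
# In the real system an LLM would extract these candidates; here they're fixed.
candidates = ["rock", "rusty lantern"]

capture_hallucinations(story, game_state, candidates)
print(json.dumps(game_state, indent=2))
```

The lantern was never authored, but once it's in the JSON it persists across saves and loads like any other item.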
I assure you the D&D context is very fun! The class website might give you more ideas too: https://interactive-fiction-class.org/
I wasn't officially part of UPenn at the time, so my name isn't listed on the site, but we wrote papers about some of the things we did, such as this one, and you'll see me listed there: https://www.cis.upenn.edu/~ccb/publications/dagger.pdf
It feels like an AI cousin to the Python error steamroller (https://github.com/ajalt/fuckitpy).
Whenever I see this sort of thing I think that there might be a non-evil application for it. But then I think ... where's the fun in that?
from gremllm import Gremllm
# Be sure to tell your gremllm what sort of thing it is
counter = Gremllm('counter')
counter.value = 5
counter.increment()
print(counter.value) # 6?
print(counter.to_roman_numerals()) # VI?
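A toy sketch of how an object like this can work under the hood, assuming the usual `__getattr__` interception trick. This is not gremllm's actual implementation; the LLM is replaced here by a canned responder so the example runs offline.

```python
class FakeGremllm:
    """Toy stand-in: unknown method calls are intercepted and, in the
    real library, would be forwarded to an LLM to generate behavior."""

    def __init__(self, kind):
        self._kind = kind
        self._state = {}

    def __setattr__(self, name, value):
        if name.startswith("_"):
            super().__setattr__(name, value)
        else:
            self._state[name] = value  # plain attributes go into shared state

    def __getattr__(self, name):
        # Only called when normal lookup fails, i.e. for any attribute
        # or method the class never defined.
        if name in self._state:
            return self._state[name]

        def method(*args, **kwargs):
            # A real gremllm would prompt an LLM to write code for `name`;
            # we hard-code one behavior so the sketch is deterministic.
            if name == "increment":
                self._state["value"] = self._state.get("value", 0) + 1
                return self._state["value"]
            raise AttributeError(name)

        return method

counter = FakeGremllm("counter")
counter.value = 5
counter.increment()
print(counter.value)  # 6
```

The whole trick is that Python only consults `__getattr__` after normal lookup fails, so every undefined method name becomes a hook where arbitrary (LLM-generated) code can be slotted in.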
I love this!

Shameless plug: People who love-hate this might also love-hate vibeserver (https://github.com/lxgr/vibeserver).
I had some vague plans to make it self-hosting; this might make that even lower effort :)
"Wet mode" is such a fantastically awful name. Definitely makes me think twice about turning it on.
…until your comment. Here! Take my "lived through the '80s and '90s" card.
How do I give it a base URL for API calls so I can point it at my ollama server?
llm install llm-ollama
and then use whatever model you like, anything llm has installed. See https://llm.datasette.io/en/stable/plugins/installing-plugin... for plugin install info.

Here is a sample session. You can't see it, but it is very slow on my CPU-only non-Apple machine (each response took about 30 seconds) :)
>>> from gremllm import Gremllm
>>> counter = Gremllm("counter", model="gemma3n:e2b")
>>> counter.value = 5
>>> counter.increment()
>>> counter.value
None
>>> counter.value
None
>>> counter.value
None
>>> counter.what_is_your_total()
6
6
... also, I don't know why it kept saying my value is None :) The "6" is doubled because one must have been a print and the other the return value.