
Posted by vuciv 3 days ago

I replaced Animal Crossing's dialogue with a live LLM by hacking GameCube memory (joshfonseca.com)
https://github.com/vuciv/animal-crossing-llm-mod
858 points | 185 comments
nurettin 3 days ago|
Apart from the memory hacking, I also appreciate that he fully typed his Python code (as in foo: Optional[Dict[str, int]]).
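For readers unfamiliar with the style being praised here, a minimal sketch of fully type-annotated Python in the spirit of the comment (the function and names are illustrative, not taken from the mod):

```python
from typing import Dict, Optional

def clamp_scores(raw: Optional[Dict[str, int]]) -> Dict[str, int]:
    """Return a copy of raw with negative scores clamped to zero."""
    if raw is None:
        return {}
    return {name: max(score, 0) for name, score in raw.items()}
```

Annotations like these let a static checker such as mypy catch type mistakes before the code ever runs.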
aucisson_masque 1 day ago||
To me, LLMs are one of the greatest advancements in gaming. I don’t understand why there isn’t already a big game using them for ‘live’ NPCs.

It would make single-player games so much more alive, even railroaded ones like Red Dead Redemption, because the NPCs could adapt more easily to what you just did, how you behave in game, and so on.

Games are already demanding on the GPU, but running a very tiny local LLM, and only when the player interacts with an NPC, wouldn’t require that much power.

I’m sure there are issues that explain why it’s not already being used, but the first one to do it will be remembered.

netruk44 1 day ago|
I wanted the same thing, so I hacked/bolted an LLM onto Morrowind (well, onto OpenMW, the open-source recreation).

The biggest problem I faced at the time (during the ChatGPT-3 era) was that, without a good context, LLMs are the most vanilla roleplayers you’ve ever seen. By themselves, LLMs are just not interesting enough for a player to choose to talk to in-game.

If you want them to be “interesting” to talk to, you must provide (or generate and keep track of): a backstory, chat history, the scene, NPC inventory, the NPC’s current emotional state, the weather, literally everything needs to be given to the model before it generates messages for the player.

At which point you’ve got a big task: you need a way to automatically get the relevant data to the model for the specific conversation you’re having. There might be tools to pick appropriate text documents from a database given a conversation topic, but I didn’t/don’t know how to make that work for games.

I’m sure there’s a way to accomplish this with more modern tools and models. (Maybe instead of providing all that data up front, you would now give the model tools to call to retrieve that data on-demand?) But that’s what made me give up in 2022.
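One way to approach the context problem described above is to flatten whatever state the engine already tracks into a system prompt before each exchange. A minimal sketch, assuming a hypothetical NPC state structure (none of these fields or names come from OpenMW or the mod):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class NpcState:
    name: str
    backstory: str
    mood: str
    inventory: List[str] = field(default_factory=list)
    chat_history: List[str] = field(default_factory=list)

def build_system_prompt(npc: NpcState, scene: str, weather: str) -> str:
    """Flatten engine-tracked state into one system prompt for the model."""
    lines = [
        f"You are {npc.name}. {npc.backstory}",
        f"Current mood: {npc.mood}. Scene: {scene}. Weather: {weather}.",
        "You carry: " + (", ".join(npc.inventory) or "nothing"),
    ]
    if npc.chat_history:
        # Only the last few turns, to keep the prompt small.
        lines.append("Recent conversation: " + " / ".join(npc.chat_history[-5:]))
    lines.append("Stay in character and keep replies under two sentences.")
    return "\n".join(lines)
```

The retrieval idea in the comment (tool calls fetching data on demand) would replace this eager flattening, but the eager version is simpler to wire into a game loop.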

aucisson_masque 1 day ago||
Of course it would require the game engine to provide context to the LLM. It’s certainly quite a bit of work, but nothing technically impossible, and the result is endless possibilities and endless replayability. You could very well let the LLM decide whether an NPC should agree to give an object to the player, and even let the player try to convince the NPC.

I mean, it looks to me like the next big step in gaming after 3D, and yet it’s being ignored.
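The "let the LLM decide" idea above is usually made safe by constraining the model to a tiny vocabulary and validating the reply before the engine acts on it, so a rambling answer can never hand out an item by accident. A sketch, with a hypothetical prompt template and a defensive parser (defaulting to refusal on anything unexpected):

```python
# Hypothetical template; the engine would fill in the placeholders.
DECISION_PROMPT = (
    "The player asked {npc} for the {item}. Given the NPC's disposition "
    "({disposition}/100) and the player's argument: \"{argument}\"\n"
    "Answer with exactly one word: GIVE or REFUSE."
)

def parse_npc_decision(reply: str) -> bool:
    """Map a constrained model reply to a yes/no game action.

    Anything that isn't clearly GIVE is treated as REFUSE, so the
    engine never acts on malformed or off-script output.
    """
    words = reply.upper().split()
    return bool(words) and words[0].strip(".,!?\"'") == "GIVE"
```

The prompt string itself could then be sent to whatever local or remote model the game uses; only the parsed boolean ever touches game state.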

b3lvedere 3 days ago||
This is awesome! I'd love to see lots of screenshots with more funny dialogues. :)
vunderba 3 days ago||
Nice job! Seems like a good use-case for the random Mii avatars milling about in the Mii Plaza on the original Nintendo Wii.
Nition 3 days ago||
I've thought for a while that the ideal old game for this kind of conversion would be Starship Titanic.
b3lvedere 3 days ago|
That's a game name i haven't heard in ages. :)
foota 3 days ago||
I wonder if it supports Resetti :-)

But also, why couldn't you look at the code to find the addresses used for dialogue? If it's already been disassembled, I would think you could just look at the addresses used by the relevant functions, set a breakpoint, etc.?
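Once an address is found (by disassembly or by memory scanning), the write-back side still has to respect the fixed-size buffer the game reads dialogue from. A sketch of that step, assuming the third-party dolphin-memory-engine Python bindings and a hypothetical buffer address and size; the encoding here is simplified to ASCII, whereas the real game uses its own character table:

```python
def pack_dialogue(text: str, buf_size: int) -> bytes:
    """Clamp text to the buffer and null-terminate it (simplified: ASCII)."""
    data = text.encode("ascii", errors="replace")[: buf_size - 1]
    return data + b"\x00"

def write_dialogue(text: str, addr: int, buf_size: int) -> None:
    """Write packed dialogue into a running Dolphin instance's memory."""
    # Lazy import: only needed when actually attached to Dolphin.
    # Assumed API from the dolphin-memory-engine package (pip install
    # dolphin-memory-engine): hook(), is_hooked(), write_bytes().
    import dolphin_memory_engine as dme
    if not dme.is_hooked():
        dme.hook()
    dme.write_bytes(addr, pack_dialogue(text, buf_size))
```

Overrunning the buffer is the classic failure mode here, which is why the clamp happens before the write rather than trusting the model's output length.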

iam_saurabh 2 days ago||
This feels like the future of gaming: community-driven mods where AI brings infinite new dialogue and quests. Imagine if Nintendo leaned into this instead of fighting mods.
hrdwdmrbl 3 days ago||
It always felt both "a cheap shot" and "valid" to express dismay that characters in video games don't react when you do things like jump up and down on their table.

While it's impossible for game developers to write code to cover every situation, AI could make general reactions possible.

It's surprising that really simple things like this haven't been tried yet (AFAIK!). Like, even if it's just the dialogue, it can still go a long way.

lmm 3 days ago||
Many games have tried for more "realistic" simulated NPCs, but usually it turns out they don't make the game any more fun, quite the opposite.
ehnto 3 days ago|||
The cost benefit is really poor, but I also wonder if it's just never been done well.

Old text adventures honestly did this heaps better than modern games do, but the reality is that they had a more finite action space, and it wasn't surprising when something wasn't catered for.

theshackleford 3 days ago|||
Which games are these? I’d be interested in checking them out.

I’m only aware of experimentation in making more “difficult” NPC AI which was found less enjoyable for obvious reasons, so would be interested to see why similar but different attempts down another path also failed.

anon7000 3 days ago|||
Kingdom Come: Deliverance 2 is one of the closer attempts; NPCs will react to lots of things about you and your behavior, like if you smell bad or stare at them for too long.
dclowd9901 3 days ago||
I was so surprised in BotW and TotK to see NPCs duck, huddle, gasp, and otherwise react to odd shit you might be doing. They also do contextual things in dialogue, like talking about the weather and the time.

I would love to see a Zelda game implement LLM dialogue for random inconsequential dialogue or relating dialogue to the game context.

klaussilveira 3 days ago||
Somewhat related. Quake 3 bot chat with LLMs: https://www.youtube.com/watch?v=BeyvvQOPlhM

https://github.com/jmarshall23/Quake3LLM

jmarshall23 is a beast, with tons of interesting id tech-based projects.

bryanhogan 3 days ago|
This is amazing! Would have loved to see more gameplay!