Local LLMs for NPCs

Now that reasonably powerful locally running LLMs are a thing, can NPCs be powered by them?
A Gemma 3 model runs on basically any PC from the last five years, needs barely 8 GB of RAM total, doesn’t even need a GPU, and can generate JSON (given sufficient context) in roughly a second. That’s too slow to use every turn, but fine during NPC dialogue and the like. It can even accept ASCII “screen pastes” as input, work out what is happening nearby, and make appropriate decisions: e.g. that the house is on fire, or that there’s a Zombear behind the player, so even though this NPC is a slaver who wanted to demand your surrender, it had better run away. Nothing I’ve tried so far works 100% of the time, but the few times it does work, it feels like magic.
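To make the idea concrete, here is a minimal sketch of the kind of thing I mean, assuming the model is served locally through Ollama’s HTTP API on the default port. The model name, prompt, and JSON shape are purely illustrative and have nothing to do with anything the game currently supports:

```python
# Sketch: ask a locally served LLM to turn an ASCII "screen paste" into a
# JSON decision for an NPC. Assumes Ollama is running on localhost:11434
# with a Gemma 3 model pulled; the action schema here is invented.
import json
import requests

SYSTEM_PROMPT = (
    "You are the decision module for an NPC in a roguelike. "
    "Given an ASCII map excerpt centred on the NPC, reply with JSON only: "
    '{"action": "flee|attack|talk|wait", "reason": "<short string>"}'
)

def npc_decision(ascii_screen: str, model: str = "gemma3") -> dict:
    """Send the screen paste to the local model and parse its JSON reply."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": model,
            "prompt": f"{SYSTEM_PROMPT}\n\nMap:\n{ascii_screen}\n",
            "format": "json",   # ask Ollama to constrain output to valid JSON
            "stream": False,
        },
        timeout=60,
    )
    resp.raise_for_status()
    return json.loads(resp.json()["response"])

if __name__ == "__main__":
    screen = (
        "########\n"
        "#..Z...#\n"   # Z: zombear, @: player, S: this NPC (a slaver)
        "#..@..S#\n"
        "########"
    )
    print(npc_decision(screen))
```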

No. The game is not set up for this and never will be. AI-produced content is unwelcome here.


I’m not asking to include AI-generated content or a whole LLM in the game. I’m saying that if the game exposed some sort of API for NPCs, I could plug in whatever I have installed on my machine, whether it’s a state machine, an LLM, or whatever else; see the sketch below.
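A purely hypothetical sketch of that “plug in whatever you have” idea: a tiny local server the game could query per NPC decision, which here answers with a plain state machine but could just as easily forward the request to an LLM. No such hook exists in the game, and every payload field is invented for illustration:

```python
# Hypothetical external "NPC brain" server. The game would POST a JSON
# description of the NPC's situation and get back a JSON action. The
# decision logic here is a trivial state machine, just to show that the
# backend behind the API could be anything.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def decide(state: dict) -> dict:
    """Trivial stand-in decision logic; swap in an LLM call if you like."""
    if state.get("hp_percent", 100) < 30 or "fire" in state.get("nearby", []):
        return {"action": "flee"}
    if state.get("hostiles_visible", 0) > 0:
        return {"action": "attack"}
    return {"action": "wander"}

class NpcBrainHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        state = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(decide(state)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 8642), NpcBrainHandler).serve_forever()
```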

Again: No. The game is not set up for this and never will be.


On the contrary, I believe that introducing AI-led decision-making or dialogue into NPC systems is a feasible way to fix the later stages of the game, which gradually become boring, and that it is the future of all open-world games, CDDA included.

Agreed, this is both necessary and significant, but given the current technological bottlenecks, it first needs people with enough expertise to study it.