Posted by SachitRafa 10 hours ago

Show HN: AI memory with biological decay (52% recall) (github.com)
Most RAG setups fail because they treat memory like a static filing cabinet. When every transient bug fix or abandoned rule is stored forever, the context window eventually chokes on noise, spiking token costs and degrading the agent's reasoning.

This implementation experiments with a biological approach, using the Ebbinghaus forgetting curve to manage context as a living substrate. Memories are assigned a "strength" score: each recall reinforces the memory and flattens its decay curve (spaced repetition), while unused data eventually decays below a threshold and is pruned.
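
A minimal sketch of that loop (names and constants here are illustrative, not the repo's actual API):

    import math
    import time

    PRUNE_THRESHOLD = 0.05   # hypothetical cutoff

    class Memory:
        def __init__(self, content):
            self.content = content
            self.strength = 1.0               # grows with each recall
            self.last_recalled = time.time()

        def retention(self, now):
            # Ebbinghaus-style curve R = e^(-t/S): higher strength S
            # flattens the curve, i.e. slower forgetting.
            days = (now - self.last_recalled) / 86400
            return math.exp(-days / self.strength)

        def recall(self, now):
            # Spaced repetition: each recall bumps S and resets the clock.
            self.strength += 1.0
            self.last_recalled = now
            return self.content

    def prune(memories, now):
        # Forgetting: anything whose retention fell below the cutoff goes.
        return [m for m in memories if m.retention(now) >= PRUNE_THRESHOLD]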

To solve the "logical neighbor" problem, where semantic search misses relevant but non-similar nodes, a graph layer sits on top of the vector store. Benchmarked against the LoCoMo dataset, this reached 52% Recall@5, nearly double the accuracy of stateless vector stores, while cutting token waste by roughly 84%.
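
The retrieval path looks roughly like the sketch below; vector_store.search and graph.neighbors stand in for whatever interfaces the store actually exposes:

    def retrieve(query_vec, vector_store, graph, k=5):
        # 1) Plain semantic search over the vector store.
        hits = vector_store.search(query_vec, k=k)
        # 2) One hop along explicit graph edges pulls in "logical
        #    neighbors": nodes linked to a hit but not similar to the query.
        candidates = set(hits)
        for node in hits:
            candidates.update(graph.neighbors(node))
        # 3) Re-rank the union by current memory strength, keep top k.
        return sorted(candidates, key=lambda n: n.strength, reverse=True)[:k]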

It's built as a local-first MCP server on DuckDB. The hypothesis is that for agents handling long-running projects, "what to forget" is just as critical as "what to remember." I'd be interested to hear whether others are exploring non-linear decay or similar biological constraints for context management.
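
To give a sense of what local-first means in practice, a decay-and-prune pass can be a single statement against DuckDB (hypothetical schema; the repo's actual tables may differ):

    import duckdb

    con = duckdb.connect("memories.db")
    con.execute("""
        CREATE TABLE IF NOT EXISTS memories (
            id BIGINT, content TEXT,
            strength DOUBLE DEFAULT 1.0,
            last_recall TIMESTAMP DEFAULT now()
        )
    """)
    # Forgetting pass: delete rows whose modeled retention e^(-t/S)
    # has dropped below 5%, with t measured in days since last recall.
    con.execute("""
        DELETE FROM memories
        WHERE exp(-(date_diff('second', last_recall, now()) / 86400.0)
                  / strength) < 0.05
    """)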

GitHub: https://github.com/sachitrafa/cognitive-ai-memory

78 points | 35 comments
cyanydeez 9 hours ago|
On the other "biological memory" post in recent weeks, I pointed out that the decay rate shouldn't be keyed to a real clock but to a lifetime of use within the coding session. Otherwise your memory fades even when there's been no process change (e.g., the coder goes on vacation). I'm not going to check whether that's true here, but a wall-clock rate seems like a naive first assumption and a failed conceptualization.
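
Concretely, something like this, where the "clock" only advances with session activity (all names invented):

    import math

    class UsageClock:
        # Time advances only while the session is doing work, so nothing
        # decays while the coder is on vacation.
        def __init__(self):
            self.ticks = 0

        def tick(self):            # call on each tool use / file touch
            self.ticks += 1

    def retention(strength, last_recall_tick, clock):
        elapsed = clock.ticks - last_recall_tick   # activity, not seconds
        return math.exp(-elapsed / strength)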

The other comment is that spatial memory is probably a better trigger for recall: if you're not tracking where the coding session starts, the folders it visits, etc., then you're not really providing a good associative footpath for the assistant to retrieve what's important for any given project.
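
Something in the spirit of (invented names again):

    from collections import defaultdict
    from pathlib import Path

    path_index = defaultdict(list)    # folder -> memory ids

    def record(memory_id, touched_path):
        path_index[str(Path(touched_path))].append(memory_id)

    def cue(current_path):
        # Walk up the directory tree so memories attached to parent
        # folders fire too: an associative footpath, not a search.
        p = Path(current_path)
        hits = []
        for ancestor in [p, *p.parents]:
            hits.extend(path_index.get(str(ancestor), []))
        return hits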

altmanaltman 10 hours ago||
I'm sorry, but the whole "biological memory" thing seems like marketing fluff on top of basic cache mechanisms.

You said it cuts token usage by 84%, but isn't that typical for any chunked RAG system?

And why did you specifically choose to test against the LoCoMo dataset, when it has a lot of known issues and is very easy to cheat on?

mtrifonov 1 hour ago||
Decay-as-eviction is just LRU, fair. Type-conditional half-life is worth defending, though.

A user's job and personality should be effectively permanent. Their stated intent for this week should fade in days. Their emotional state from a single message should be gone by tomorrow. Decay everything at one rate and you're back to LRU with the problems you're calling out.

The "biological" framing isn't really doing much work. Ebbinghaus is one curve and fine, but it's not where the leverage is. Type-conditional half-life is. Without that, this is a cache.

xhevahir 9 hours ago|||
And a neural network is really just a composed, non-linear parameterized function that maps input vectors to output vectors. Sometimes metaphors or analogies do contribute something valuable.
throawayonthe 8 hours ago||
Isn't that an example of an analogy being more misleading than useful?
jnovek 9 hours ago|||
I think it’s reasonable; a forgetting curve is intended to model a biological process.

https://en.wikipedia.org/wiki/Forgetting_curve
