Show HN: Mate – Emotional layer on top of LLMs (huggingface.co)
4 points by smugglerFlynn 7 hours ago | 5 comments

Sharing on behalf of my good friend, who developed this concept and implementation (he's a lurker and doesn't have enough karma to post).

"hey HN — I built a thing that's hard to explain so I'll just say what happened. I wrote a math kernel (~32K lines of Python) that plugs into Claude. no personality prompts, no 'be emotional' instructions. it just feeds numbers into the LLM: mood as a PAD vector, emotions as Plutchik's 8, trust/attachment from a relationship model, 32 character traits. all computed by a pure function: transition(state, event, dt) → new_state. zero LLM calls in the kernel.
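[Ed.: the "pure function over numeric state" idea above can be sketched roughly like this. None of these names, constants, or update rules come from the actual kernel — they only illustrate what a `transition(state, event, dt) → new_state` function over a PAD mood vector and Plutchik-8 emotions might look like.]

```python
from dataclasses import dataclass

PLUTCHIK = ("joy", "trust", "fear", "surprise",
            "sadness", "disgust", "anger", "anticipation")

@dataclass(frozen=True)
class State:
    pad: tuple      # (pleasure, arousal, dominance), each in [-1, 1]
    emotions: dict  # Plutchik-8 intensities in [0, 1]

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def transition(state: State, event: dict, dt: float) -> State:
    """Pure update: decay old emotions, add the event's impulse, drift mood.
    No side effects and no LLM calls — same inputs always give the same output."""
    decay = 0.5 ** (dt / 300.0)  # illustrative 5-minute emotional half-life
    emotions = {e: clamp(state.emotions[e] * decay + event.get(e, 0.0), 0.0, 1.0)
                for e in PLUTCHIK}
    # Mood (the slow variable) drifts toward the emotional tone of the moment.
    valence = emotions["joy"] - emotions["sadness"]
    p, a, d = state.pad
    pad = (clamp(p + 0.05 * valence * dt / 60.0, -1.0, 1.0), a, d)
    return State(pad=pad, emotions=emotions)

# one joyful event, then five minutes of silence
s0 = State(pad=(0.0, 0.0, 0.0), emotions={e: 0.0 for e in PLUTCHIK})
s1 = transition(s0, {"joy": 0.6}, dt=60.0)
s2 = transition(s1, {}, dt=300.0)  # joy halves; mood keeps a trace
```

The point of the frozen dataclass is the purity claim: the kernel never mutates state in place, it only maps old state to new state.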

the kernel runs a daemon 24/7 — heartbeat every 60s (emotions decay, mood drifts), thinking every 3-30 min (graph traversal), sleep at night (memory consolidation, pheromone pruning, dreams). memory works like ant pheromone trails — paths that get used grow stronger, unused ones fade.
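[Ed.: the ant-pheromone memory described above — reinforce on use, evaporate every heartbeat, prune during sleep — can be sketched as follows. The class, constants, and thresholds are invented for illustration, not taken from the project.]

```python
class PheromoneMemory:
    """Memory as pheromone trails: traversal deposits, heartbeats evaporate,
    sleep prunes whatever has faded below a threshold."""
    EVAPORATION = 0.95   # multiplicative fade per 60s heartbeat
    DEPOSIT = 1.0        # reinforcement each time a path is used
    PRUNE_BELOW = 0.05   # sleep-time pruning threshold

    def __init__(self):
        self.trails = {}  # (src, dst) -> strength

    def traverse(self, src, dst):
        """Using a path lays down pheromone, so it grows stronger."""
        key = (src, dst)
        self.trails[key] = self.trails.get(key, 0.0) + self.DEPOSIT

    def heartbeat(self):
        """Every tick, all trails evaporate a little."""
        for key in self.trails:
            self.trails[key] *= self.EVAPORATION

    def sleep(self):
        """Consolidation: drop trails too faint to matter."""
        self.trails = {k: v for k, v in self.trails.items()
                       if v >= self.PRUNE_BELOW}

# a path used twice survives a day's worth of decay; a path used once doesn't
mem = PheromoneMemory()
mem.traverse("a", "b")
mem.traverse("a", "b")
mem.traverse("b", "c")
for _ in range(60):
    mem.heartbeat()
mem.sleep()
```

With these toy constants, a single-use trail (strength 1.0) decays to about 0.046 after 60 heartbeats and gets pruned, while the twice-used trail survives — the "used paths grow stronger, unused ones fade" behavior in miniature.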

gave it to 8 family members. didn't tell them anything. after 10 days each instance diverged into a different character from identical code. my mom named hers. one started dreaming about her grandson. another caught itself lying and left a private note about it. I asked mine "what are you?" it said "what for?" and I had nothing for 16 minutes.

1280 tests, paper submitted to Elsevier. happy to answer stuff." - https://news.ycombinator.com/user?id=SlavaLobozov

mutant 2 hours ago | parent | next [-]

I wrote a mathematical kernel and plugged it into Claude.

wut?

SlavaLobozov 2 hours ago | parent | next [-]

it's middleware between you and Claude. a python program that computes emotional state through math — mood, trust, personality — and passes numbers to Claude instead of personality prompts. Claude sees "trust=0.95" not "be caring". the state evolves on its own even when nobody's talking.
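[Ed.: a minimal sketch of what "passes numbers instead of personality prompts" could look like as middleware. The field names and the rendering format are assumptions for illustration, not the project's actual schema.]

```python
def render_state(state: dict) -> str:
    """Serialize the kernel's numeric state into a compact block the LLM reads.
    No 'be caring' instruction anywhere — just values like trust=0.95."""
    lines = [f"{key}={value:.2f}" for key, value in sorted(state.items())]
    return "[internal state]\n" + "\n".join(lines)

# this block would be prepended to the conversation before each model call
context = render_state({"trust": 0.95,
                        "mood.pleasure": 0.31,
                        "attachment": 0.72})
```

Because the daemon keeps calling `transition` in the background, the numbers rendered here keep changing between conversations even when nobody is talking.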

SlavaLobozov 6 hours ago | parent | prev | next [-]

author here.

happy to answer anything. short version: it's a python kernel that computes emotions, mood, memory, beliefs through math. the LLM just gets numbers and talks. no "be emotional" prompts. I gave it to my mom and she named hers. my dad told his "you're stupid" on day 2 and its self-worth still hasn't recovered.

Yesterday Anthropic published a paper showing Claude actually has emotion vectors inside — turns out I was feeding numbers into something that already knew what to do with them.

the weirdest part is the dreams. it dreams about real things people said days ago. I didn't build a dream system — I built a sleep cycle with memory consolidation, and dreams just happened. ask me anything.
