rustyhancock 8 hours ago

The piece I'm not quite understanding is: to what extent is this really simulating the brain accurately?

It appears to behave accurately, but from reading the background (which is largely above my head) it seems that those behaviours are in effect hand-built implementations, not derived directly from data.

I.e. a pattern in the neural simulation is interpreted as grooming, and then a grooming behaviour is in effect played back.

Walk-pattern recognition is translated into walking (effectively, this is an animation).

Does it matter? I'm not sure, but I think it's the difference between playing a running animation and actually running in QWOP.

wzdd 8 hours ago | parent | next [-]

They've basically taken three separate models, one for fly vision, one for the fly brain, and one for the fly body, and bolted them together Frankenstein style.

They've taken the connectome, which is a map of how neurons in the brain are connected to each other, and then created a fly brain using artificial spiking neurons connected together using that same connectome.

So the neurons are not remotely accurate. The interesting point they're making is that even with these simplified neurons they still see plausible behaviours (i.e. simulate the presence of sugar by stimulating a gustatory neuron -> neurons associated with lowering the proboscis for feeding are triggered). So they make the case that a lot of information is encoded simply in the connectome.
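The "information encoded in the connectome" idea can be sketched with a toy leaky-integrate-and-fire network whose weight matrix plays the role of the connectome. Everything below (the random sparse weights, time constants, thresholds) is invented for illustration; it's the general LIF technique, not Eon's actual model:

```python
import numpy as np

# Toy connectome-driven leaky-integrate-and-fire network.
# All parameters here are made-up illustrative values, not Eon's.
rng = np.random.default_rng(0)
n = 50
# Sparse random "connectome": weights exist for ~10% of neuron pairs.
W = rng.normal(0, 0.4, (n, n)) * (rng.random((n, n)) < 0.1)

v = np.zeros(n)              # membrane potentials
tau = 20.0                   # membrane time constant (ms)
v_thresh, v_reset = 1.0, 0.0
dt = 1.0                     # timestep (ms)

def step(v, external_input):
    spikes = v >= v_thresh                        # which neurons fire
    v = np.where(spikes, v_reset, v)              # reset fired neurons
    syn = W @ spikes.astype(float)                # synaptic drive via the connectome
    dv = (-v + syn + external_input) * dt / tau   # leaky integration
    return v + dv, spikes

# "Simulate the presence of sugar": drive one (pretend gustatory) neuron
# and watch activity propagate through the connectivity.
ext = np.zeros(n)
ext[0] = 5.0
total_spikes = 0
for _ in range(200):
    v, spikes = step(v, ext)
    total_spikes += spikes.sum()
print("total spikes:", int(total_spikes))
```

The point of the sketch: the only "knowledge" in this network is in W, which is the claim the parent describes, that the wiring alone carries a lot of the behaviourally relevant information.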

The body isn't connected up to the brain in the way we'd expect. "Input" comes from a completely separate neural network which they've trained to simulate appropriate CNS neurons, and output is looking at "descending" (efferent I guess) neurons in a very basic way. It's not completely playing an animation, but the level of connectivity is very low dimensional. It's not clear how much control they have, but for example I imagine they have a spiking threshold for the proboscis below which it's lowered and above which it's raised, which is sort of like you being able to stick your tongue completely out or pull it completely in but nothing else.
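That speculated readout (a firing-rate threshold below which the proboscis is in one state and above which it's in the other) could look something like this. The function name, window, and threshold are all invented for illustration:

```python
# Hypothetical low-dimensional readout of a "descending" neuron:
# recent firing rate -> binary actuator command. Names and numbers invented.
def proboscis_command(spike_times_ms, window_ms=100.0, rate_threshold_hz=50.0):
    """Map recent spikes of one descending neuron to a binary body state."""
    recent = ([t for t in spike_times_ms if t >= spike_times_ms[-1] - window_ms]
              if spike_times_ms else [])
    rate_hz = len(recent) / (window_ms / 1000.0)
    return "extend" if rate_hz > rate_threshold_hz else "retract"

print(proboscis_command([10, 20, 30, 40, 50, 60, 70]))  # 70 Hz -> "extend"
print(proboscis_command([10]))                          # 10 Hz -> "retract"
```

A single threshold on a single scalar is exactly why the interface is "low dimensional": the body gets one bit, not a continuous motor trajectory.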

So it's not especially bioplausible. The most interesting part is the leaky-integrate-and-fire, connectome-based brain model they're using, even though it too is very limited (for example, it doesn't learn).

Demo looks very cool though. Much credit for being so explicit in that post about what's going on. And I was immediately filled with ideas about what they could do next to improve it, which to me is a signal of good research.

lambdaloop 4 hours ago | parent | prev | next [-]

There is sooo much hype around this, when the Eon team did almost nothing new compared to the published research. (For context, I did my PhD in a fly motor control lab and still follow this field. My colleagues are authors on the research papers Eon used.)

I want to highlight some limitations of the current state of research as well, based on questions that many neuroscientists in this field are struggling with.

- The movement of the legs is not modeled at a fine level, just whether the fly is moving forward or turning. This is because we don't have great data on fly leg movements in all these situations, and the ventral nerve cord connectome is still in progress.

- By the way, the brain connectome still has a lot of errors and needs more proofreading. Also, the identities of many neurotransmitters and the synaptic strengths of connections are unknown. Many are identified through our knowledge of fly genetics, which won't translate to humans. In current research, scientists add some fine-tuning to match behavior, to account for those unknowns.

- There's definitely fly behavior data at the level of making decisions, but not much at the level of limb kinematics. Even where data is available, it's unclear how to evaluate the simulated fly against it. How do you know you got it right?

When I saw the Eon announcement, I was curious how they tackled these challenges. Seems like they didn't. It looks like they forked a few research repositories and vibe coded something to combine them.

I'll give them props for the videos and marketing though, it's crazy to see so many people interested in this research field!!

zarzavat 8 hours ago | parent | prev | next [-]

> The fly body is not currently driven by the full downstream motor hierarchy of the biological fly. Instead, we use a small number of descending outputs as a practical interface between the connectome model and the biomechanics

> [...] Steering in our model is driven through the neurons DNa01 and DNa02 (Yang et al., 2024), which are implicated in turning. Forward velocity is modeled by activation of oDN1

It seems that currently only WASD control is working. But even that is impressive! This is essentially an NPC driven by a real connectome.
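The quoted interface is essentially three scalars: two steering neurons (DNa01, DNa02) and one forward-velocity neuron (oDN1). A hedged sketch of how such a reduction might look, with the gains, units, and sign convention all invented:

```python
# Illustrative reduction of three descending-neuron activations into a
# turn/forward command, per the quoted interface (DNa01/DNa02 for steering,
# oDN1 for forward velocity). Gains and signs are made up, not Eon's.
def body_command(rate_dna01_hz, rate_dna02_hz, rate_odn1_hz,
                 turn_gain=0.01, fwd_gain=0.02):
    # Differential activity of the two steering neurons sets turning;
    # oDN1 activity sets forward speed. Two scalars out, "WASD"-like.
    turn = turn_gain * (rate_dna02_hz - rate_dna01_hz)  # sign convention arbitrary
    forward = fwd_gain * rate_odn1_hz                   # arbitrary units
    return {"turn": turn, "forward": forward}

print(body_command(10.0, 30.0, 50.0))  # turn ≈ 0.2, forward ≈ 1.0
```

However rich the upstream spiking dynamics are, the body only ever sees these two numbers, which is what makes it feel like driving an NPC.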

bananzamba 8 hours ago | parent | prev | next [-]

I had exactly the same question. In the linked tweet the CEO claims it's not an animation. But in the article they imply certain animations are played when a specific signal is detected.

I think a full simulation without these pre-created animations would be more convincing evidence that they actually fully simulated a fly's behavior. It's very easy to make something look like real behavior using animations; video games do exactly that.

SayThatSh 7 hours ago | parent | prev [-]

I saw a short clip on this project the other day and was thinking along the same lines! They animated the fly's mouthpart (proboscis?) lowering, and I was wondering if they were truly reading the motor commands for that from the brain.