ilaksh 11 hours ago

One thing to mention: as far as I know, biological neurons are much more complex than artificial ones. They spike and have many more internal states (I think).

I wonder if we might soon see neuromorphic AI hardware at competitive scale as a way out of the extreme energy requirements, (hopefully) by packing more capability into fewer artificial neurons and/or implementing the neurons more efficiently.

Every time I see another headline about plans for giant AI datacenters, I expect the next one to be about memristors or some new paradigm getting a huge investment and moving out of a lab into a competitive chip. I think every dollar that goes into a giant AI datacenter should be matched by a dollar spent on making new hardware paradigms for AI competitive.

erikerikson 16 minutes ago | parent | next

> biological neurons are much more complex than artificial ones. They spike and have a lot more internal states (I think)

This is correct. Notably, the famous Minsky XOR result holds only because of oversimplifications in the perceptron model. By adding a notion of location and using it to modulate learning, Hopfield networks learn XOR just fine.
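
To see the classic result itself (not the Hopfield variant): a single linear threshold unit cannot represent XOR, but one hidden layer of two units can. A minimal Python/NumPy sketch with hand-picked weights, just to illustrate:

    import numpy as np

    step = lambda z: (z >= 0).astype(int)   # Heaviside threshold unit

    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

    # Hidden layer: unit 0 computes OR, unit 1 computes AND.
    W1 = np.array([[1, 1], [1, 1]])
    b1 = np.array([-0.5, -1.5])

    # Output unit: OR AND NOT(AND) == XOR.
    w2 = np.array([1, -1])
    b2 = -0.5

    h = step(X @ W1.T + b1)
    print(step(h @ w2 + b2))    # -> [0 1 1 0], i.e. XOR

No single-layer choice of weights and bias can produce [0 1 1 0], which is exactly Minsky and Papert's point.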

kuukyo an hour ago | parent | prev | next

I also think neuromorphic computing is the future of energy-efficient AI, but sadly its development is getting sidelined by more popular AI research areas. It doesn't help that ANNs are much easier to train than spiking neural networks (SNNs). :/

bob1029 an hour ago | parent

I think neuromorphic might even be a distraction right now.

The hyperparameters for an SNN that performs a specific task are extremely elusive. If you don't even know what kind of neuron model or fan-out ratio might work, how the hell can you start burning these as constants into some hardware contraption?
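
To make that concrete, here is a minimal leaky integrate-and-fire (LIF) sketch in Python; every constant in it (time constant, threshold, reset, input current) is a free design choice that neuromorphic hardware would have to commit to in silicon. The values are illustrative, not from any real chip:

    # Illustrative LIF constants -- each one is a hyperparameter.
    dt, tau = 1e-3, 20e-3                               # time step, membrane time constant (s)
    v_rest, v_thresh, v_reset = -70e-3, -50e-3, -70e-3  # volts
    R = 1e8                                             # membrane resistance (ohms)

    v = v_rest
    spikes = []
    for t in range(200):                       # simulate 200 ms
        I = 3e-10                              # constant input current (amps)
        v += dt / tau * (-(v - v_rest) + R * I)   # leaky integration
        if v >= v_thresh:                      # threshold crossing: fire and reset
            spikes.append(t * dt)
            v = v_reset
    print(len(spikes), "spikes in 200 ms")

And that is a single point neuron with constant input; pick a different neuron model or fan-out and the dynamics change completely.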

sdenton4 2 hours ago | parent | prev | next

I doubt it... The last, oh, ten years have left behind a graveyard of dead neuromorphic AI startups. Brain neurons are what they are because they work under a particular set of capabilities and constraints, very different from what we can do with silicon.

It's also worth asking whether the data center is really all /that/ inefficient: the human time and effort required to produce (say) one text-to-image prompt completion is enormous compared to a GPU's. The brain uses less power per unit of time, but seems to require a hell of a lot more time.
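
Back-of-envelope, with made-up but plausible numbers (a person averaging ~20 W who takes 30 minutes per illustration, versus a ~700 W GPU spending 5 seconds per completion):

    brain_energy = 20 * 30 * 60        # 20 W for 30 min -> 36,000 J
    gpu_energy = 700 * 5               # 700 W for 5 s   ->  3,500 J
    print(brain_energy / gpu_energy)   # ~10x in the GPU's favor

That ignores training energy and everything upstream of the GPU, which cut the other way, but it shows the comparison isn't one-sided.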

Roxxik 39 minutes ago | parent | prev

I'm not sure the analogy stretches that far.

What even is an artificial neuron in an artificial neural network executed on "normal" (non-neuromorphic) hardware? It is a set of weights and an activation function.

And you evaluate all neurons of a layer at once by multiplying their weights, stacked into a matrix, by the incoming activations in a vector, then applying the activation function to get the outgoing activations.
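
For concreteness, a NumPy sketch (layer sizes made up):

    import numpy as np

    relu = lambda z: np.maximum(z, 0)

    W = np.random.randn(512, 256)   # weights of one layer of 512 "neurons"
    b = np.random.randn(512)        # biases
    x = np.random.randn(256)        # incoming activations

    y = relu(W @ x + b)             # the whole layer: one matmul + one nonlinearity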

Viewing this from a hardware perspective, there are no individual neurons, just matrix multiplications followed by activation functions.

I'm going outside my area of expertise here (I just started studying bioinformatics), but biological neurons can't simply hold an activation, because they communicate by depolarizing their membrane. So they have to spike by their very nature as cells.

This depolarization costs a lot of energy, so they are incentivized to do more with fewer activations.

Computer hardware doesn't have a membrane and can thus hold activations; it doesn't need spiking, and holding those activations costs very little on its own.

So I'm not sure what we stand to gain from more complicated artificial neurons.

On the other hand, artificial neural networks do need a lot of memory bandwidth to load all these weights. So an approach that better integrates storage and compute might help, whether that is memristor tech or something else.
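
To put a rough number on the bandwidth wall (all figures assumed for illustration): a 7B-parameter model in fp16 is ~14 GB of weights, and generating each token has to stream essentially all of them through the compute units:

    params = 7e9
    bytes_per_param = 2                       # fp16
    weights_bytes = params * bytes_per_param  # ~14 GB of weights
    bandwidth = 1e12                          # ~1 TB/s, roughly a high-end consumer GPU (assumed)
    print(bandwidth / weights_bytes)          # ~70 tokens/s ceiling at batch size 1

That ceiling comes from moving weights, not from arithmetic, which is why compute-in-memory ideas keep coming back.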