| ▲ | noosphr 3 days ago |
| All opinions are my own: The whole point of the CNNs is to act like an autoencoder for input and a decoder for output. The only reason this is done in the first place is that the number of electrodes in the dish is pitiful and has no chance of describing something as complex as Doom. They are there to create a latent space that can be fed through 60-odd electrodes, and to decode the neuronal latent space into button presses. The Pong version of the game was the proof of concept that neurons can learn without a latent-space intermediate in either direction. Both the world state and the neuronal control were raw signals: https://pubmed.ncbi.nlm.nih.gov/36228614/ What I wanted to do after dish-brain Pong, but never had the budget for, was using live animals as the computational substrate: use the visual cortex of one as the input, send the neural spikes to a second animal's frontal lobe for computation, and finally send those signals to a third animal's motor cortex to physically press buttons. It's a shame we never raised enough, because it wouldn't have cost more than $15m to build the hardware and do the biological proof of concept. |
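A rough sketch of the encode/decode pipeline the comment describes, in Python with NumPy. This is not the actual experiment's code: the real system uses trained CNNs at both ends, and the 8x8 grid, four-button action set, and function names here are illustrative assumptions only.

```python
import numpy as np

def encode_frame(frame, grid=(8, 8)):
    """Crude stand-in for the CNN encoder: average-pool a grayscale
    game frame down to an 8x8 stimulation pattern (64 values)."""
    gh, gw = grid
    h, w = frame.shape                       # e.g. (200, 320)
    frame = frame[: h - h % gh, : w - w % gw]
    pooled = frame.reshape(gh, frame.shape[0] // gh,
                           gw, frame.shape[1] // gw).mean(axis=(1, 3))
    return pooled.ravel()                    # one value per electrode

def decode_spikes(spike_counts, readout_weights):
    """Crude stand-in for the CNN decoder: a linear readout from
    per-electrode spike counts to button logits (e.g. shoot/left/right/forward)."""
    logits = readout_weights @ spike_counts
    return int(np.argmax(logits))            # index of the chosen button

# Fake 320x200 frame -> 64-value stimulation pattern for the electrodes.
frame = np.random.rand(200, 320)
stim = encode_frame(frame)
print(stim.shape)                            # (64,)
```

The point of the sketch is only the shape of the pipeline: a high-dimensional frame squeezed into as many values as there are electrodes, and a spike-count vector expanded back out into a game action.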
|
| ▲ | batch12 3 days ago | parent | next [-] |
| > using live animals as the computational substrate. Use the visual cortex of one as the input, send the neural spikes to a second animal's frontal lobe for computation and finally send those signals to a third animal's motor cortex to physically press buttons. That sounds terrifying. |
| |
| ▲ | altmanaltman 3 days ago | parent [-] | | It does, but most of what we do to animals is terrifying. I can see why getting funding for this idea might not have been that easy, though: "I want to mind-control three animals to play Doom" is certainly a pitch. | | |
| ▲ | 4gotunameagain 3 days ago | parent [-] | | That is the fallacy of relative privation. The fact that most of what we do to animals is terrifying should be the motivating factor to NOT do more of it, such as the atrocity described above. | | |
| ▲ | Schmerika 2 days ago | parent [-] | | Seems like the relative privation fallacy is more and more a key component of an accelerating race to the bottom, across wide swathes of society. Sure would be cool if more people could recognise it. |
|
|
|
|
| ▲ | roenxi 3 days ago | parent | prev | next [-] |
| > The only reason why this is done in the first place is because the number of electrodes in the dish is pitiful and has no chance of describing something as complex as Doom. This sounds a bit suspicious though. If we're confident that the neurons aren't complex enough to understand Doom, how can they be said to be complex enough to play it? Playing a game is a loose term, but it seems difficult to say that it is playing something that it can't comprehend or interact with. By analogy, if there were a CNN between me and a game of Doom, people would say "roenxi is cheating with an AI aim-bot", not "roenxi is playing Doom". The whole thing is still pretty cool though. Hopefully the neurons are having fun, I'm sure we all wish them what happiness they can muster. |
| |
| ▲ | noosphr 3 days ago | parent [-] | | There aren't enough input electrodes to encode a Doom frame into the multi-electrode array without compression. That's all the artificial neural networks are doing. If we could have gotten an MEA with 320x200 electrodes we wouldn't have used any encoding and would have just let the neurons figure it out. Instead it is an 8x8 grid. | | |
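For scale, the mismatch described above (grid sizes taken from the comment: a 320x200 framebuffer versus an 8x8 MEA) works out to a thousand-fold reduction before anything reaches the dish:

```python
# Raw Doom framebuffer vs. the 8x8 multi-electrode array.
frame_pixels = 320 * 200   # 64,000 pixels per frame
electrodes = 8 * 8         # 64 stimulation sites
ratio = frame_pixels // electrodes
print(ratio)               # 1000: each electrode stands in for ~1000 pixels
```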
| ▲ | roenxi 3 days ago | parent [-] | | We've got LLMs that seem to be smarter than anyone I'm talking to day-to-day, and one useful model of them is just "compression". Compression is turning out to be a pretty key operation in intelligence and understanding (in fact, it seems to be intelligence and understanding in key ways). If we compress Doom into "shoot" and "press the buttons in the way most favourable to the player", then good compression could let a fair coin play Doom well if someone flips it fast enough. I mean, maybe the ANN just means sampling the screen, in which case I'm not sure why we're talking about it as a "network". But the type of compression seems critical. Have I watched any of the videos or read the code? No I have not. |
|
|
|
| ▲ | idiotsecant 3 days ago | parent | prev | next [-] |
| Yes... quite a shame that we never made an amalgamated cyborg horror out of parts and pieces of several different animals. That's definitely not the plot of every sci-fi horror movie. |
|
| ▲ | virgildotcodes 3 days ago | parent | prev | next [-] |
| This sounds nightmarish. Maybe we build a human centipede if we can get the VC funding next? |
| |
| ▲ | popcorncowboy 3 days ago | parent | next [-] | | Or a Torment Nexus!! | | |
| ▲ | red-iron-pine 2 days ago | parent | next [-] | | man it's shocking how every day we keep spiraling toward that | |
| ▲ | peddling-brink 3 days ago | parent | prev [-] | | Great idea, intense pain does provide a stronger response in the neuronal substrate. The prisoners, or uhh, “research subjects”, won’t mind. It’s for science. /s |
| |
| ▲ | noosphr 3 days ago | parent | prev [-] | | I would have been quite happy to use my own brain as the computational substrate, and I had more than a few other people keen to be the input and output parts of the system. It's rather unfortunate that in the West it is impossible to get elective brain surgery. The countries that will do it have at best a spotty record. I talked to someone who had it done in Brazil and their electrodes became dislodged after a few months. There is nothing new or horrifying about self-experimentation. Newton for one did it in conditions that were far more dangerous: https://psmag.com/social-justice/newtons-needle-scientific-s... | | |
| ▲ | virgildotcodes 3 days ago | parent [-] | | I'm totally fine with consensual human experimentation that somehow threads the needle around exploitation of the poor - just not sure how we do the latter part short of requiring experimentees to pass a minimum net-worth threshold? | | |
| ▲ | hgoel 3 days ago | parent | next [-] | | I think the closest would be: if anyone involved ever complains to authorities at all, everyone involved gets in trouble. If no one ever complains, no trouble. Everyone involved is forever at the mercy of everyone else involved. | | |
| ▲ | Nevermark 3 days ago | parent | next [-] | | I see perverse incentives to ablate complaint origination to expression pathways, or complaint system dependencies. Or not so perverse, as this makes running these ventures much safer. Safety first! | |
| ▲ | noworriesnate 3 days ago | parent | prev | next [-] | | Complaining to the authorities needs to come with a cost, otherwise people who don’t believe in the research or are looking for a payday will join just to complain | | |
| ▲ | hgoel 2 days ago | parent [-] | | I consider that a feature of this idea: you have to believe in whatever you want to do human experimentation for, enough to select who you include very carefully, and ultimately still take that risk. |
| |
| ▲ | nkrisc 3 days ago | parent | prev | next [-] | | A competitor will pay a former customer to complain. | |
| ▲ | idiotsecant 3 days ago | parent | prev [-] | | Or...at the mercy of a scary man with a big wrench. Every single post you've put in this thread is a volatile mix of idealistic, naive, and sociopathic. So, obviously you'll be a tech CEO in 10 years. | | |
| |
| ▲ | redrecpeng 2 days ago | parent | prev [-] | | Make sure subjects actually pay, instead of getting paid for being a subject? |
|
|
|
|
| ▲ | stackghost 3 days ago | parent | prev | next [-] |
| >What I wanted to do after dish brain pong, but never had the budget for, was using live animals as the computational substrate. What does the ethical due diligence process look like, for something like this? |
| |
| ▲ | AmbroseBierce 3 days ago | parent [-] | | Haha, you made me laugh quite a bit, as if ethical due diligence were even a blip in the mental model of someone who talks like that about sentient life forms. | | |
| ▲ | noosphr 3 days ago | parent [-] | | I figure I'd get some mice fed to pythons. You know, the ones swallowed head first and alive and then drowned in stomach acid, the ones you can buy in any pet store? | | |
| ▲ | AmbroseBierce 3 days ago | parent | next [-] | | Yeah, "nature is brutal, therefore what does it matter if I raise the bar on the suffering we bring to this world", great logic right there mate, especially when we all know such experiments are without the slightest shred of doubt aimed at ending up using human neurons, because those are the most powerful. Also worth mentioning: in the District of Columbia and a few other places it is illegal to sell live animals including mice, so there is some effort to do better about our behavior towards other living beings, unlike you. | | | |
| ▲ | stackghost 2 days ago | parent | prev [-] | | The fact that they get eaten doesn't justify torturing mice by hooking their brains up to an LLM to make a slightly better climate catastrophe, or making them play doom. Real talk, torturing animals is a hallmark of sociopaths. You should really seek professional help, and I say this 100% seriously. |
|
|
|
|
| ▲ | roxolotl 3 days ago | parent | prev | next [-] |
| Reminds me of the head transplant experiments. The stuff of nightmares but also fascinating. |
|
| ▲ | peddling-brink 3 days ago | parent | prev | next [-] |
| > What I wanted to do after dish brain pong, but never had the budget for, was using live animals as the computational substrate. This is horrific. What’s your end goal? Prisoners as data centers? I hope you rethink this. |
|
| ▲ | catigula 2 days ago | parent | prev | next [-] |
>What I wanted to do after dish brain pong, but never had the budget for, was using live animals as the computational substrate. Use the visual cortex of one as the input, send the neural spikes to a second animal's frontal lobe for computation and finally send those signals to a third animal's motor cortex to physically press buttons. It's a shame we never raised enough. It's amazing that someone would feel comfortable sharing this. |
|
| ▲ | constantius 3 days ago | parent | prev | next [-] |
| > What I wanted to do after dish brain pong, but never had the budget for, was using live animals Torture, so casually mentioned. For what, I wonder. |
| |
| ▲ | noosphr 3 days ago | parent | next [-] | | When we can turn off distress and pain in farm animals, we will have done more to improve well-being in the world than anyone alive today. Factory farms stop being an efficient evil and become the only moral way to produce meat. And as a side effect we also get superintelligence on a substrate that is 10 orders of magnitude more energy efficient than silicon. | | |
| ▲ | catigula 2 days ago | parent | next [-] | | This is literal supervillain wire-heading type stuff. Poorly thought-out ideas of a madman with no regard for the consequences; just vague claims about the definite super-good idea of direct brain-state manipulation. | |
| ▲ | cinntaile 3 days ago | parent | prev [-] | | Do you mean we'd feed their brains an alternate reality, or just remove their pain and distress signals? Regardless of the answer, lab-grown meat still seems more ethical; it was never a sentient being in the first place. | | |
| ▲ | catigula 2 days ago | parent | next [-] | | That person has no idea of, nor care for, what is or isn't ethical, outside of other people maybe being upset about it and trying to stop them. | |
| ▲ | 2 days ago | parent | prev [-] | | [deleted] |
|
| |
| ▲ | cindyllm 3 days ago | parent | prev [-] | | [dead] |
|
|
| ▲ | Balgair 3 days ago | parent | prev | next [-] |
| Gosh, it's been years, but I think they did the dual-animal experiment with rats about a decade ago. I'm likely misremembering, but they tickled a rat in Japan, fed the impulses over the internet, and had another rat in maybe Brazil move its tail in response. From what I recall it did potentiate over time, implying learning at the reflex level. Sorry I can't find the link though! |
|
| ▲ | ofjcihen 3 days ago | parent | prev | next [-] |
| Hahaha, I love how you made something that wouldn't be harmful sound like a nightmare horror show. Edit: sweet Jesus, never mind, I misread it. |
|
| ▲ | 3 days ago | parent | prev [-] |
| [deleted] |