reliablereason 5 hours ago

Most chatbots are not trained to have or emulate emotions, so pain or fear of death is nonexistent for them. Therefore killing them and/or using them as slaves is not a moral issue. That's how I reason.

On another point: LLMs themselves are not conscious. If anything is conscious, it is something being modeled inside the network. If an LLM simulates a conscious entity, that doesn't mean the LLM itself is conscious; stating that is making a category error. By the same token, the fact that LLMs are just useful statistical generators would not mean that sentience could not emerge within them.

Brian_K_White 4 hours ago | parent [-]

Pain or fear is not why it's wrong to kill, holy cow. I could feed you a drug and you would not feel or fear anything.