etaioinshrdlu 4 days ago
These are actually just the problems that are easier for researchers to solve, mostly because a lot of data is readily available.
a2128 4 days ago
Everyone shares the music and art they make, but nobody ever shares videos and motion capture of themselves doing laundry and vacuuming their house. Maybe we need to start sharing that instead.
| ||||||||||||||||||||||||||
slfpn 4 days ago
That's indeed the second-worst issue with current model architectures. For a model to be trained to something nearing usability for an actual task, it needs an amount of data far beyond what can be obtained. Companies like Facebook and OpenAI downloaded pirated copies of every book humans have written to reach the current level of text generation, and even with that, those models aren't perfect or especially intelligent. This is going to severely limit the possibilities of building actual agentic AIs. We do not have an endless amount of data of humans performing menial chores. And normal people will probably be more hostile than Kool-Aid-drinking software developers when it comes to being spied on: who's going to agree to wear a camera while working to help train their own replacement? Yet that's kinda what devs are doing by gleefully adopting software filled with telemetry and interacting with Copilot.
| ||||||||||||||||||||||||||
whywhywhywhy 4 days ago
This isn’t the reason at all, and it reads like a weak attempt to make the researchers who stole work to train on seem blameless and helpless to circumstance. They’re doing it because there is a lot of value to extract in making it so anyone can do these things regardless of talent or skill.