Mizza 5 hours ago
This is pretty wild! It's only Llama3.1-8B, but this is their first release, so you can assume they're working on larger versions. So what's the use case for an extremely fast small model? Structuring vast amounts of unstructured data, maybe? Putting it in a little service droid so it doesn't need the cloud?