stavros | 2 hours ago
The Chinese models are distilled from GPT and Claude, so it's not like China would pull ahead if those companies went away for six months. They really are at the forefront of innovation right now, as much as I hate to think of the consequences of this (a single company owning a superintelligence is basically a nightmare scenario for me).
|
largbae | 2 hours ago
Don't worry: if someone truly achieves superintelligence, it won't be controlled by anyone for long.

chihuahua | 2 hours ago
There will be a blinding flash which signals the superintelligence singularity. When the smoke clears, you'll see a 50-foot-tall Altman/Borg hybrid. He is about to destroy humanity with his death ray. Suddenly, a 50-foot-tall Musk/Borg hybrid appears out of nowhere and stops Altman just in time. Then they work together to destroy all humans.

rl3 | an hour ago
Seems our best hedge in that case is Levi Ackerman.
stavros | 2 hours ago
That's my other nightmare scenario :P

georgemcbay | 2 hours ago
Just imagine how inexpensive paperclips will become; there is always a silver lining. We will finally have achieved abundance.

stavros | 2 hours ago
Not just abundance: we will have the maximum possible number of paperclips.
isodev | 2 hours ago
I think that's the realm of conspiracy theories. There are also not only Chinese alternatives: Mistral in Europe is doing pretty well in the several categories it has opted to focus on. This kind of reiterates the parent's question, I think: people are maybe too focused on the GPT/Claude models and forget about all the other ways of using the tech.

stavros | 2 hours ago
Is it? I thought it was pretty well established that open models were distilled from the proprietary, frontier ones. Maybe I'm wrong.

airstrike | 2 hours ago
No, that is not well established at all, and generalizing all open models under that inaccurate umbrella doesn't really help anyone.