trilogic · 6 hours ago
When two multi-billion-dollar giants advertise on the same day, it is not competition but rather a sign of struggle for survival. With all the power of the "best artificial intelligence" at your disposal, plenty of capital, and all the brilliant minds, THIS IS WHAT YOU COULD COME UP WITH? Interesting.
  sdf2erf · 5 hours ago
  Yeah, they are both fighting for survival. No surprise, really. They need to keep the hype going if they are both IPO'ing later this year.
    thethimble · 5 hours ago
    The AI market is an infinite-sum market. Consider the fact that 7-year-old TPUs are still sitting at near 100% utilization today.
    superze · 5 hours ago
    How many IPOs can a company really do?
      re-thc · 5 hours ago
      As many as they want. They can "spin off" and then "merge" again.
  rishabhaiover · 6 hours ago
  What happened to you?
  lossolo · 5 hours ago
  What's funny is that most of this "progress" is new datasets plus post-training shaping the model's behavior (instruction and preference tuning). There is no moat besides that.
    Davidzheng · 5 hours ago
    "Post-training shaping the model's behavior": it seems from your wording that you don't find it that dramatic. I instead find the fact that RL on novel environments provides steady improvements on top of the base model an incredibly bullish signal for future AI improvements. I also believe the capability increases transfer to other domains (or at least cover enough domains) that they represent a real rise in intelligence in the human sense (when measured in capabilities, not necessarily innate learning ability).
    WarmWash · 5 hours ago
    > There is no moat besides that.
    Compute. Google didn't announce $185 billion in capex to do cataloguing and flash cards.
    riku_iki · 2 hours ago
    > new datasets + post-training shaping the model's behavior (instruction + preference tuning). There is no moat besides that.
    Sure, but acquiring/generating/curating so much high-quality data is still a significant moat.
  6 hours ago
  [deleted]