throw4847285 12 hours ago

Didn't they cut a huge deal with Disney just 3 months ago?

https://openai.com/index/disney-sora-agreement/

toraway 10 hours ago | parent | next [-]

Yes, and Disney is apparently no longer investing in OpenAI, making it one more example of an OpenAI investment hype cycle that turned out to be hot air.

Disney Exits OpenAI Deal After AI Giant Shutters Sora

https://www.hollywoodreporter.com/business/digital/openai-sh...

  A source familiar with the matter tells The Hollywood Reporter that Disney is also exiting the deal it signed with OpenAI last year, in which it pledged to invest $1 billion in the company and agreed to license some of its characters for use in Sora.

  “As the nascent AI field advances rapidly, we respect OpenAI’s decision to exit the video generation business and to shift its priorities elsewhere,” a Disney spokesperson said. “We appreciate the constructive collaboration between our teams and what we learned from it, and we will continue to engage with AI platforms to find new ways to meet fans where they are while responsibly embracing new technologies that respect IP and the rights of creators.”
Also, "exit the video generation business" seems somewhat notable, suggesting they're not just planning to launch a different video-gen product to replace Sora?

moralestapia 11 hours ago | parent | prev [-]

Wow. OpenAI is the weirdest company on the planet.

I used to think they were pretty clever, but with this news and other recent ones (the Jony Ive project cancelled, Stargate scaled down significantly, their models inflating token use on purpose), they just seem schizo.

password54321 11 hours ago | parent | next [-]

They are just cancelling side projects because Anthropic is dominating in enterprise, and side projects (probably) don't turn a profit. https://x.com/ShanuMathew93/status/2031074311629353299

timpera 11 hours ago | parent [-]

This data is pretty questionable. OpenAI employees have said on Twitter that it does not account for ChatGPT Enterprise, where most of their growth is, which is quote-only and not paid by credit card.

radicality 11 hours ago | parent | prev | next [-]

Do you have more info about the inflated token use? I'm using codex cli a bunch now, but the reported token usage seems an order of magnitude higher than, say, Claude Code with Opus.

Idk if it's because I set codex to xhigh reasoning, but even then it still seems way higher than Claude. The input/output ratio feels large too, e.g. I have a codex session which says ~500M in / ~2M out.

moralestapia 11 hours ago | parent [-]

I wish I had hard evidence, but it is mostly an observation. I do use Codex a lot, and I've felt a drastic change from one or two months ago to today.

It used to give me precise answers; "surgical" is how I described it to my friends. Now it generates a lot of slop and plenty of "follow ups". It doesn't give me wrong answers, which is fine, but things that used to take 3-4 prompts now take 8-10. Obviously my prompting skills haven't changed much and, if anything, they've gotten better.

This is something other colleagues have observed as well. Even the same GPT5.4 model feels different and more chatty recently. Btw, I think their version numbers mean nothing: no one can be certain which model is actually running on the backend, and it's pretty evident that they're continuously "improving" it.

SpicyLemonZest 7 hours ago | parent [-]

I haven't had time to fully hash out this take, but a big question in the back of my mind has been: is it possible that AI model improvements come partly from finding overhang in things that look hard and impressive to humans but are actually trivial consequences of the training data? If true, then the observable performance of any widely distributed model could get worse over time as it "mines out" the work that's easy for it to do.

skywhopper 8 hours ago | parent | prev [-]

Turns out that lying about what your tech will do and how much people want it doesn't work forever as a way to raise unlimited money to throw into the fire, hoping you hit something that actually makes a profit.