aliljet 3 hours ago

I really really want to see how these images are starting to form into videos. The stills are clearly getting better and better, but what about when you need the stills to organically conform to a keyed script?

Mizza 3 hours ago | parent | next [-]

Check out Seedance 2: https://seed.bytedance.com/en/seedance2_0

Nano Banana was technically impressive the first time, but after Seedance it isn't really anymore. It's all just an internet pollution machine anyway.

rany_ 3 hours ago | parent [-]

The page looks promising but how can I try it out?

rabf 2 hours ago | parent [-]

They have an API.

progbits 3 hours ago | parent | prev | next [-]

I'm seeing more and more AI video memes, and they are getting really good. It's still just a bunch of short clips, since long shots aren't working well enough, but typical Hollywood movies use cuts of a few seconds anyway, so this is almost good enough to make a Marvel fanfic.

vessenes 3 hours ago | parent | prev [-]

The workflow right now would be to take these images, make a sequence of them for key "shots", and send them to an I2V model. LTX-2 is the model the r/stablediffusion folks are playing with right now, but there are a fair few.
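That keyframe-to-I2V workflow could be sketched roughly like this. Everything here is hypothetical: `image_to_video` stands in for whatever I2V backend you use (LTX-2 or otherwise), and the `Clip` fields are made-up names, not a real API.

```python
# Sketch of the "key shots -> I2V model" workflow described above.
# `image_to_video` is a placeholder, NOT a real LTX-2 interface.
from dataclasses import dataclass

@dataclass
class Clip:
    start_frame: str   # path to the keyframe image the clip starts from
    prompt: str        # motion/action description for the I2V model
    seconds: float     # clip length; short cuts stitch better than long shots

def image_to_video(clip: Clip) -> str:
    """Placeholder for a real I2V call; returns the output video path."""
    stem = clip.start_frame.rsplit(".", 1)[0]
    return f"{stem}_{clip.seconds:.0f}s.mp4"

def render_sequence(clips: list[Clip]) -> list[str]:
    # One I2V generation per key shot; the clips get concatenated in an editor.
    return [image_to_video(c) for c in clips]

shots = [
    Clip("shot01.png", "slow push-in on the character", 3.0),
    Clip("shot02.png", "pan across the skyline", 4.0),
]
print(render_sequence(shots))
```

The point is just the shape of the pipeline: one still per shot, one short generation per still, then an ordinary edit to string the clips together.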