dragonwriter a day ago

> But 7b is rather small no?

Sure, it's smallish.

> Are other open weight video models also this small?

Apple's models are weights-available, not open weights. And yes: WAN 2.1 has 1.3B models alongside its 14B models, and WAN 2.2 has a 5B model alongside its 14B models (the WAN 2.2 VAE used by Starflow-V is specifically the one used with the 5B model). And because the WAN models are largely actually open weights models (Apache 2.0 licensed), there are lots of downstream open-licensed derivatives.

> Can this run on a single consumer card?

Modern model runtimes like ComfyUI can run models that don't fit in VRAM on a single consumer card by swapping model layers between system RAM and VRAM as needed, so models even bigger than this one can run on a single consumer card.
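The swapping idea is simple enough to sketch. This is a toy illustration of layer offloading, not ComfyUI's actual API: layers live in "RAM" by default, and a fixed-size "VRAM" budget holds only the layers currently needed, evicting old ones as the forward pass walks the model. All names here (`VRAM_BUDGET`, `ensure_loaded`, etc.) are made up for the sketch.

```python
# Toy sketch of layer offloading (hypothetical, not ComfyUI's real code):
# run a model with more layers than fit in "VRAM" by swapping layers in
# one at a time and evicting the oldest resident layer.

VRAM_BUDGET = 2          # how many layers fit on the "GPU" at once
NUM_LAYERS = 7           # the model has more layers than fit in VRAM

resident = []            # layer ids currently "in VRAM"
swaps = 0                # count of host->device transfers

def ensure_loaded(layer):
    """Swap `layer` into VRAM, evicting the oldest layer if needed."""
    global swaps
    if layer in resident:
        return
    if len(resident) >= VRAM_BUDGET:
        resident.pop(0)  # evict oldest layer back to system RAM
    resident.append(layer)
    swaps += 1           # one transfer from RAM to VRAM

def forward(x):
    """Run all layers in order; only VRAM_BUDGET layers resident at once."""
    for layer in range(NUM_LAYERS):
        ensure_loaded(layer)
        x = x + 1        # stand-in for the layer's computation
    return x

out = forward(0)
print(out, swaps)        # all 7 layers ran; 7 transfers on a cold start
```

The cost is the extra PCIe traffic (here, one transfer per layer on a cold pass), which is why offloading is slower than keeping the whole model in VRAM but still makes large models usable on a single consumer card.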

jjfoooo4 19 hours ago

My guess is that they will lean towards smaller models and try to provide the best experience for running inference on-device.