A huge thanks to the incredible team behind HyperGAN-CLIP! Abdul Basit Anees, Ahmet Canberk Baykal, Muhammed Burak Kizil, Duygu Ceylan @aykuterdem.bsky.social (4/n)
21.11.2024 05:51
Curious to learn more? Catch us at #SIGGRAPHAsia 2024 in Tokyo or check out our paper for all the details.
Project page: cyberiada.github.io/HyperGAN-CLIP
Arxiv: arxiv.org/abs/2411.12832
(3/n)
Our approach extends a pre-trained StyleGAN by integrating CLIP space via hypernetworks. This allows us to dynamically adapt it to new domains using reference images or text descriptions.
It's flexible, efficient, and unlocks new applications for GANs. (2/n)
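To make the idea above concrete, here is a minimal, hypothetical sketch of the mechanism described: a hypernetwork takes a CLIP embedding of the target domain (from a text prompt or a reference image) and predicts additive weight offsets that modulate a frozen, pre-trained generator. All module names, layer shapes, and the random stand-in for the CLIP embedding are illustrative assumptions, not the authors' actual architecture.

```python
import torch
import torch.nn as nn

CLIP_DIM = 512  # CLIP joint image-text embedding size
# Toy stand-ins for the weight shapes of two generator layers (assumption)
LAYER_SHAPES = [(512, 512), (512, 256)]

class HyperNetwork(nn.Module):
    """Maps a CLIP embedding to one additive weight offset per generator layer."""
    def __init__(self, clip_dim, layer_shapes):
        super().__init__()
        self.layer_shapes = layer_shapes
        self.heads = nn.ModuleList(
            nn.Linear(clip_dim, out_f * in_f) for out_f, in_f in layer_shapes
        )

    def forward(self, clip_emb):
        # One offset tensor per layer, reshaped to that layer's weight shape
        return [
            head(clip_emb).view(-1, *shape)
            for head, shape in zip(self.heads, self.layer_shapes)
        ]

# Stand-in for a CLIP embedding of a text prompt or reference image
# (a real pipeline would use a CLIP encoder here).
clip_emb = torch.randn(1, CLIP_DIM)

hyper = HyperNetwork(CLIP_DIM, LAYER_SHAPES)
offsets = hyper(clip_emb)

# The frozen generator would then use W_l + delta_W_l at each layer,
# steering its output toward the target domain without retraining the GAN.
frozen_weight = torch.randn(*LAYER_SHAPES[0])
adapted_weight = frozen_weight + offsets[0].squeeze(0)
print(adapted_weight.shape)  # torch.Size([512, 512])
```

Because only the small hypernetwork is trained while the generator stays frozen, the same model can be re-steered to a new domain at inference time simply by swapping in a different CLIP embedding.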
GANs like StyleGAN generate highly realistic images. But adapting them to new domains or tasks like text-guided editing or reference-guided synthesis with limited data is challenging!
Our #SIGGRAPHAsia 2024 paper, HyperGAN-CLIP, tackles this: youtu.be/X0VOYFhPWxQ (1/n)