Cole Feuer’s Unpaired Style Transfer

Transforming 3D-rendered animation frames into temporally coherent anime style using a novel GAN pipeline.

About the Project

This work by Cole Feuer presents a two-stage, unpaired approach: a perceptual pretraining phase on anime datasets, followed by CycleGAN-based domain adaptation that uses 8-channel inputs (RGB, depth, edge maps, and a blurred prior frame) to enforce both visual and temporal fidelity without paired data. It is aimed at studios seeking automated stylization at scale.
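
As an illustration of the 8-channel conditioning described above, the sketch below stacks RGB, a depth map, an edge map, and a blurred copy of the previously stylized frame into a single generator input. The tensor shapes, blur settings, and function names here are assumptions for illustration, not code taken from the repository.

```python
import torch
import torchvision.transforms.functional as TF


def build_generator_input(rgb, depth, edges, prev_frame, blur_sigma=2.0):
    """Stack the conditioning signals into one 8-channel generator input.

    rgb:        (B, 3, H, W) current rendered frame
    depth:      (B, 1, H, W) depth map from the renderer
    edges:      (B, 1, H, W) single-channel edge map
    prev_frame: (B, 3, H, W) previously stylized frame, used as a temporal prior
    """
    # Blur the prior frame so the generator treats it as a soft hint about
    # palette and layout rather than a target to copy pixel-for-pixel.
    blurred_prev = TF.gaussian_blur(prev_frame, kernel_size=9, sigma=blur_sigma)

    # 3 (RGB) + 1 (depth) + 1 (edges) + 3 (blurred prior) = 8 channels.
    return torch.cat([rgb, depth, edges, blurred_prev], dim=1)


# Hypothetical usage with an 8-channel generator (names are illustrative):
# netG = Generator(in_channels=8, out_channels=3)
# fake_anime = netG(build_generator_input(rgb, depth, edges, prev_frame))
```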

Key Technical Features

- Two-stage, fully unpaired training: perceptual pretraining on anime data, followed by CycleGAN-based domain adaptation.
- 8-channel generator input combining RGB, depth, edge maps, and a blurred prior frame.
- Temporal coherence enforced through the blurred-prior conditioning, with no paired ground truth required.
- Built for automated, large-scale stylization of 3D-rendered animation frames.
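
To make the two training stages concrete, here is a minimal sketch of how a stage-1 perceptual objective and stage-2 adversarial plus cycle-consistency objectives could be composed. The loss weights, the vgg_features helper, and the generator/discriminator names are illustrative assumptions, not the repository's actual code.

```python
import torch
import torch.nn.functional as F


def perceptual_loss(vgg_features, generated, reference):
    """Stage 1: match deep VGG feature activations between generated output
    and anime reference imagery. `vgg_features` is assumed to return a list
    of feature maps from selected layers."""
    loss = 0.0
    for f_gen, f_ref in zip(vgg_features(generated), vgg_features(reference)):
        loss = loss + F.l1_loss(f_gen, f_ref)
    return loss


def generator_loss(disc_anime, real_render, fake_anime, rec_render,
                   lambda_cycle=10.0):
    """Stage 2: LSGAN-style adversarial term (fool the anime-domain
    discriminator) plus a cycle-consistency term (render -> anime -> render)."""
    pred_fake = disc_anime(fake_anime)
    adv = F.mse_loss(pred_fake, torch.ones_like(pred_fake))
    cycle = F.l1_loss(rec_render, real_render)
    return adv + lambda_cycle * cycle
```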

Code & Documentation

View the complete implementation, training scripts, and setup instructions on GitHub:

github.com/CDFire/UnpairedStyleTransfer

Future Directions

Planned enhancements include explicit optical-flow losses, multi-style conditional transfer, and user-driven style parameter controls. Feedback and collaboration are welcome.
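
For readers curious what an explicit optical-flow loss might look like, the following is one common formulation: warp the previous stylized frame along the estimated flow and penalize disagreement with the current stylized frame. This is a generic sketch, not the project's planned implementation, and the flow field is assumed to come from an external estimator.

```python
import torch
import torch.nn.functional as F


def warp_with_flow(frame, flow):
    """Warp `frame` (B, C, H, W) with a dense flow field `flow` (B, 2, H, W)
    holding per-pixel (dx, dy) displacements in pixels."""
    _, _, h, w = frame.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=frame.device, dtype=frame.dtype),
        torch.arange(w, device=frame.device, dtype=frame.dtype),
        indexing="ij",
    )
    grid_x = xs.unsqueeze(0) + flow[:, 0]
    grid_y = ys.unsqueeze(0) + flow[:, 1]
    # Normalize sampling coordinates to [-1, 1] as expected by grid_sample.
    grid = torch.stack(
        [2.0 * grid_x / (w - 1) - 1.0, 2.0 * grid_y / (h - 1) - 1.0], dim=-1
    )
    return F.grid_sample(frame, grid, align_corners=True)


def optical_flow_loss(prev_stylized, curr_stylized, flow_prev_to_curr):
    """Penalize disagreement between the current stylized frame and the
    previous stylized frame warped along the optical flow."""
    warped_prev = warp_with_flow(prev_stylized, flow_prev_to_curr)
    return F.l1_loss(curr_stylized, warped_prev)
```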

Contact

Questions or collaboration ideas? Email coledfeuer@gmail.com or connect on LinkedIn.