Summary

BigGAN scales Generative Adversarial Network training far beyond prior work, introduces the 'truncation trick' for trading sample fidelity against variety, and sets a new state of the art in high-fidelity, class-conditional image synthesis.

Large Scale GAN Training for High Fidelity Natural Image Synthesis

Andrew Brock · Jeff Donahue · Karen Simonyan

ABSTRACT

Despite recent progress in generative image modeling, successfully generating high-resolution, diverse samples from complex datasets such as ImageNet remains an elusive goal. To this end, we train Generative Adversarial Networks at the largest scale yet attempted, and study the instabilities specific to such scale. We find that applying orthogonal regularization to the generator renders it amenable to a simple 'truncation trick', allowing fine control over the trade-off between sample fidelity and variety. Our modifications lead to models which set the new state of the art in class-conditional image synthesis. When trained on ImageNet at 128×128 resolution, our models (BigGANs) achieve an Inception Score (IS) of 166.3 and Fréchet Inception Distance (FID) of 9.6.
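The two mechanisms the abstract names can be sketched compactly. The truncation trick samples the latent z from a standard normal and resamples any entry whose magnitude exceeds a threshold; shrinking the threshold raises fidelity at the cost of variety. The orthogonal regularization variant used in the paper penalizes only the off-diagonal entries of the weight Gram matrix WᵀW. The sketch below is illustrative, not the authors' code: the function names, the default threshold, and the β value are assumptions.

```python
import numpy as np

def truncated_normal(shape, threshold=0.5, rng=None):
    """Truncation trick: sample z ~ N(0, I), then resample any entry
    whose magnitude exceeds `threshold` (hypothetical helper; the
    default threshold is an assumption, not the paper's setting)."""
    rng = np.random.default_rng() if rng is None else rng
    z = rng.standard_normal(shape)
    mask = np.abs(z) > threshold
    while mask.any():  # resample out-of-range entries until all fit
        z[mask] = rng.standard_normal(mask.sum())
        mask = np.abs(z) > threshold
    return z

def ortho_penalty(W, beta=1e-4):
    """Modified orthogonal regularization: beta * ||W^T W * (1 - I)||_F^2.
    Only off-diagonal Gram entries are penalized, so filter norms are
    left free while pairwise correlations are pushed toward zero."""
    gram = W.T @ W
    off_diag = gram * (1.0 - np.eye(gram.shape[0]))
    return beta * np.sum(off_diag ** 2)
```

A smaller threshold concentrates z near the mode of the prior, which is why truncation trades diversity for per-sample quality; the penalty is zero exactly when the columns of W are mutually orthogonal.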

Review Snapshot

4.6 / 5 · 5 ratings

5 star: 60%
4 star: 40%
3 star: 0%
2 star: 0%
1 star: 0%

Recommendation

100% recommend this content.

