

Save time, resources, and money with Latent Diffusion-based image generation.

This article presents a novel approach that reduces the training time of a generative image model by operating in latent space and by using a pre-trained ImageNet latent classifier as a component of the loss function.
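As a rough illustration of the idea, the sketch below combines a standard reconstruction term in latent space with a classification term from a frozen, pre-trained latent classifier. This is a minimal sketch, not the article's actual implementation; the function and parameter names (`combined_loss`, `latent_classifier`, `classifier_weight`) are assumptions for illustration.

```python
import torch.nn.functional as F

def combined_loss(pred_latents, target_latents, class_labels,
                  latent_classifier, classifier_weight=0.1):
    # Hypothetical sketch: names and weighting are illustrative, not the
    # article's exact formulation.

    # Reconstruction/denoising term computed in latent space, which is
    # far cheaper than the equivalent pixel-space objective.
    mse = F.mse_loss(pred_latents, target_latents)

    # Classifier term: the frozen, pre-trained ImageNet latent classifier
    # should assign the correct class to the generated latents. Freeze the
    # classifier beforehand (e.g. requires_grad_(False)) so gradients flow
    # only into the generator.
    logits = latent_classifier(pred_latents)
    cls = F.cross_entropy(logits, class_labels)

    return mse + classifier_weight * cls
```

Because both terms are computed on low-dimensional latents rather than full-resolution images, each training step is cheap, which is what makes the short training times described below plausible.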

Remarkably, the image generation model was trained from a randomly initialised (not pre-trained) state in under 10 hours on a single consumer-grade NVIDIA desktop GPU.