Watching an AI create fake celebrity faces is nightmare fuel

Like an algorithmic version of John Carpenter’s The Thing, Nvidia has created a machine-learning system that morphs into hundreds of made-up celebrity faces.

In a paper on the project, Nvidia’s researchers describe a new way to spawn unique faces using a generative adversarial network (GAN). This type of machine learning sets a pair of neural networks in competition with each other: one is tasked with generating images, while the other plays the role of discriminator, evaluating the first network’s work like an AI art critic.
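The adversarial setup described above can be sketched with a toy example. This is a hypothetical illustration using 1-D data and NumPy, not Nvidia’s actual networks: a one-parameter “generator” tries to mimic real samples, while a logistic “discriminator” tries to tell real from fake, and each learns from the other’s progress.

```python
# Toy GAN sketch (illustrative only): real data is a 1-D Gaussian,
# the generator is an affine map of noise, the discriminator a
# logistic regressor. Both are trained by simple gradient steps.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def real_batch(n):
    # "Real" data: samples from a Gaussian centred at 3
    return rng.normal(3.0, 0.5, size=n)

g_w, g_b = 0.1, 0.0   # generator parameters: fake = g_w * z + g_b
d_w, d_b = 0.1, 0.0   # discriminator parameters: p(real) = sigmoid(d_w * x + d_b)

lr = 0.05
for step in range(1000):
    z = rng.normal(size=32)
    fake = g_w * z + g_b
    real = real_batch(32)

    # Discriminator step: push scores up for real samples, down for fakes
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(d_w * x + d_b)
        grad = p - label               # dLoss/dlogit for cross-entropy
        d_w -= lr * np.mean(grad * x)
        d_b -= lr * np.mean(grad)

    # Generator step: adjust parameters so fakes are scored as real
    fake = g_w * z + g_b
    p = sigmoid(d_w * fake + d_b)
    grad = (p - 1.0) * d_w             # chain rule through the discriminator
    g_w -= lr * np.mean(grad * z)
    g_b -= lr * np.mean(grad)

# After training, the generator's output drifts toward the real data's mean.
```

The same two-player dynamic drives Nvidia’s system, just with deep convolutional networks producing images instead of scalars.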

The researchers claim they managed to create “images of unprecedented quality” by training the neural networks on the CelebA-HQ dataset of celebrity photos.

“The key idea is to grow both the generator and discriminator progressively, starting from low-resolution images, and add new layers that deal with higher resolution details as the training progresses,” an intro to the report reads. “This greatly stabilises the training and allows us to produce images of unprecedented quality, e.g., CelebA images at 1024² resolution.”
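The progressive schedule the quote describes can be made concrete with a short sketch. In the paper, training begins at a tiny 4×4 resolution, and new layers are added that double the resolution until the final 1024×1024 output:

```python
# Sketch of the progressive-growing resolution schedule: start small,
# double the resolution as each new layer pair is added during training.
start, final = 4, 1024   # the paper begins at 4x4 and ends at 1024x1024

resolution = start
schedule = [resolution]
while resolution < final:
    resolution *= 2          # each added layer pair doubles the resolution
    schedule.append(resolution)

print(schedule)  # [4, 8, 16, 32, 64, 128, 256, 512, 1024]
```

Because the early, low-resolution stages are cheap and stable, the networks learn the coarse structure of a face before they ever have to worry about fine detail.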

Watching the faces morph into each other is a lesson in abject horror, but the individual ‘celebrities’ look remarkably convincing – and come as a neat metaphor for our culture’s cynical, cyclical generation of celebrities to boot.

Neural networks are becoming increasingly sophisticated, and their work is being applied in an ever wider range of scenarios – some entertaining, some outright sinister. Recently an AI researcher claimed to have created a system capable of telling a person’s sexuality from a single photo. Other researchers have trained networks to generate computer-fooling fingerprints. That’s to say nothing of neural networks that make pornographic rock formations.
