Actress Kristen Stewart may be richer than us, better looking than us, and far, far more famous than us, but you know what she doesn’t have to her name? An article on cutting-edge neural networks.
The Twilight actress recently made her directorial debut with the short film Come Swim, which uses a machine learning technique known as “style transfer” (where the aesthetics of one image or video are applied to another) to create an impressionistic visual style.
Along with special effects engineer Bhautik J Joshi and producer David Shapiro, Stewart has co-authored a paper on this work in the film, publishing it in the popular online repository for non-peer reviewed work, arXiv.
AI researchers and Stewart fans were surprised (and pleased) to discover her contribution to the field:
Once more: Kristen Stewart of Twilight fame directs movie; writes arXiv paper about using StyleNet during production https://t.co/NZ4I1yhQUN
— Mark Riedl (@mark_riedl) January 19, 2017
To be someone in Hollywood, you’ve got to put your ML papers on Arxiv and you better use TensorFlow… https://t.co/2Rcg1ccJ36
— François Chollet (@fchollet) January 19, 2017
Kristen Stewart has coauthored a paper about applying neural nets to images. I feel supremely unaccomplished today. https://t.co/1CFXAg3kv1
— Diane Hosfelt (@avadacatavra) January 20, 2017
The paper itself is titled “Bringing Impressionism to Life with Neural Style Transfer in Come Swim,” and offers a detailed case study on how to use this sort of machine learning in a film. The paper describes Come Swim as a “poetic, impressionistic portrait of a heartbroken man underwater,” with the film’s aesthetic grounded by a painting of Stewart’s showing a “man rousing from sleep.”
The team used existing neural networks to transfer the style of this painting onto a test frame, and then fine-tuned their setup by adding “blocks of color and texture” until they’d created the desired painting-like effect. Once the transfer process was correctly tuned, they applied it to different parts of the film, producing frames like the ones below. It’s a simple technique, deployed convincingly.
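At the heart of neural style transfer is the idea of comparing “style” via Gram matrices of convolutional features, a formulation popularized by Gatys et al.; this is a common recipe in the style-transfer systems the paper builds on, though not necessarily the exact pipeline the Come Swim team used. A minimal sketch of the style-loss computation, with toy NumPy arrays standing in for real network features:

```python
import numpy as np

def gram_matrix(features):
    """Channel-correlation summary of a (channels, height*width) feature map.

    In neural style transfer, the Gram matrix of convolutional features
    captures an image's texture ("style") independent of where objects
    appear in the frame.
    """
    channels, positions = features.shape
    return features @ features.T / positions

def style_loss(style_feats, target_feats):
    """Mean squared difference between the two images' Gram matrices.

    Style transfer works by adjusting the target image to drive this
    loss down while also staying close to the original content.
    """
    diff = gram_matrix(style_feats) - gram_matrix(target_feats)
    return float(np.mean(diff ** 2))

# Toy example: identical features have zero style loss,
# unrelated features have a positive one.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 16))  # 4 channels, 16 spatial positions
print(style_loss(feats, feats))       # 0.0
print(style_loss(feats, rng.standard_normal((4, 16))) > 0)  # True
```

In a full system these feature maps come from the intermediate layers of a pretrained convolutional network, and the loss is minimized frame by frame, which is why the paper's tuning of “blocks of color and texture” in the style image directly shapes the final look.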
There is of course a bit of light-hearted snobbery here (“Why on Earth is a Hollywood actress getting involved in machine learning?!”), but the fact is that these machine learning tools, once thought of as esoteric and specialized, have become increasingly mainstream. Open source AI frameworks like TensorFlow and Keras make it easy for anyone to experiment with this code, and the commercialization of specific techniques like style transfer (even Facebook offers style transfer image filters) pushes this research into popular culture.
Arguably, the AI revolution isn’t just powered by abundant data and GPUs — to truly thrive it also needs an open community and accessible tools. Stewart’s paper is a brilliant example of how that works, and what can be achieved.