Computed Curation – Curating photography with neural networks

Created by Philipp Schmitt (with Margot Fabre), ‘Computed Curation’ is a photobook curated by a computer. Taking the human editor out of the loop, it uses machine learning and computer vision tools to select and sequence a series of photos from an archive of pictures.

Using neural networks, the machine considers both image content and composition to uncover unexpected connections and interpretations that a human editor might have missed.

Machine learning-based image recognition tools are already adept at recognizing the kinds of subjects they were trained on (umbrella, dog on a beach, car), but quickly expose their flaws and biases when challenged with more complex input. In Computed Curation, these flaws surface in often bizarre and sometimes poetic captions, tags and connections. Moreover, by urging the viewer to constantly speculate on the logic behind its arrangement, the book teaches how to see the world through the eyes of an algorithm.

The book features 207 photos taken between 2013 and 2017. Metadata is collected through Google’s Cloud Vision API (tags, colors), Microsoft’s Cognitive Services API (captions) and Adobe Lightroom (date, location). Composition is analyzed using histograms of oriented gradients (HOG).
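To illustrate the composition step, here is a minimal sketch of extracting a HOG descriptor for a single photo. It is not the project's actual Node.js pipeline; the library choice (scikit-image), function name and parameter values are assumptions made for the example.

```python
# Illustrative sketch only, not the author's pipeline: one HOG vector per photo,
# the kind of composition feature described above. Parameters are assumptions.
import numpy as np
from skimage import io, color, transform
from skimage.feature import hog

def composition_features(path, size=(256, 256)):
    """Return a HOG vector describing a photo's composition."""
    image = io.imread(path)
    if image.ndim == 3:
        image = color.rgb2gray(image)       # HOG works on a single channel
    image = transform.resize(image, size)   # same size so all vectors align
    return hog(image,
               orientations=8,
               pixels_per_cell=(32, 32),
               cells_per_block=(2, 2),
               feature_vector=True)

# features = np.stack([composition_features(p) for p in photo_paths])
```

Concatenated with the API tags and color data, vectors like this would make up the several hundred variables considered per photo.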

Considering more than 850 variables for each photo, a t-SNE algorithm arranges the pictures in two-dimensional space according to similarities in content, color and composition. A genetic TSP (Travelling Salesman Problem) algorithm then approximates the shortest path through the arrangement, thereby defining the page order.
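As a rough illustration of this step, the sketch below embeds the per-photo feature vectors with scikit-learn's t-SNE and then orders them with a greedy nearest-neighbour walk, a much cruder stand-in for the genetic TSP solver the project actually uses. Variable names and parameters are assumptions.

```python
# Illustrative sketch only: 2-D t-SNE embedding followed by a greedy
# nearest-neighbour tour standing in for the project's genetic TSP solver.
import numpy as np
from sklearn.manifold import TSNE

def page_order(features, random_state=0):
    """Embed photos in 2-D and return an index order approximating a short tour."""
    # features: (n_photos, n_variables) array, e.g. tags + colors + HOG columns
    xy = TSNE(n_components=2, random_state=random_state).fit_transform(features)

    unvisited = set(range(len(xy)))
    order = [unvisited.pop()]               # start from an arbitrary photo
    while unvisited:
        last = xy[order[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(xy[i] - last))
        unvisited.remove(nxt)
        order.append(nxt)
    return order                            # page sequence for the book

# order = page_order(features)  # features from the metadata + HOG step above
```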

The book layout is generated using a custom Node.js application and rendered in InDesign using basil.js. The book is printed on an HP Indigo digital press and hand-bound as an accordion book with a total length of 29 meters (95 ft).

Project Page | Philipp Schmitt | Margot Fabre
