Need a photo that fits the mood? Ask EyeEm’s algorithm
San Francisco: Instead of hiring a professional photographer for an important marketing job last year, Massimo Portincaso handed the work to software.
Portincaso, the head of marketing at the Boston Consulting Group, was tasked with overhauling his company’s website. It was a straightforward assignment: give the site a fresh look highlighting the business’s capabilities. When the time came to select pictures for the new homepage, Portincaso didn’t want to entrust a photographer with the job. He turned over some of the decision-making to an algorithm he trained to crawl an Internet photo database to find what he wanted.
That system, created by Berlin-based start-up EyeEm, began suggesting shots matching Portincaso’s taste, much as Pandora’s algorithm adapts music recommendations to a person’s listening preferences. Rather than combing through stock photos from services such as Getty Images, Portincaso entered terms such as “young woman,” “smiling” and “escapism” into EyeEm’s search field, and a list of pictures that fit the look he was going for emerged from the database. At first, the algorithm suggested lots of random photos of people and colour schemes. Over time, it learned he wanted more abstract shots: no pictures of people from the front and more uniform colours. “It’s almost frightening,” he said.
The website job was relatively minor, but it reflects a bigger shift in computers’ ability to understand images and adopt human-like preferences. EyeEm began as an Instagram-like photo-sharing app about five years ago. With backing from investors including billionaire Peter Thiel, the company has evolved to be at the cutting edge of the technology industry’s race to effectively organize the trillions of images online.
The company has developed ways to quickly identify what is in a picture. Ramzi Rizk, the co-founder and chief technology officer at EyeEm, highlighted the capabilities by showing an internal version of the software. As he moved his iPhone camera around a room, the code scrolling across his phone’s screen identified objects in real-time: window, desk, computer, shelf, book. When he turned the phone on himself, a description showed, “handsome.” “That’s clearly a bug,” he said.
People can search for photos by typing in terms like “dinner last December in San Francisco” or “Hawaii sunsets.” “It’s not enough to say what’s in a photo—you have to filter out the most relevant,” Rizk said.
EyeEm, which charges Boston Consulting Group a license fee for using the software, is announcing this week that it’s making the tools available to other companies. Rizk said the company is targeting brands. Marketing teams can train the EyeEm algorithm to match the look they are going for by uploading pictures they’ve used in the past, or others they like. The software analyses all the pixels of a picture to identify recurring themes—be it colour, a person’s facial expression, objects or lighting.
EyeEm has roughly 80 million photos in its database that companies can choose from, coming mainly from some of the 18 million professional and amateur photographers who have agreed to sell their shots. If a company uses a picture, EyeEm splits the fee with the photographer 50-50.
While Google’s search engine has become skilled at identifying whether a picture contains an object like a cat or a bridge, EyeEm is attempting to identify which photos are actually good. One of its apps, called The Roll, assigns scores to photographs to help people organize their pictures by quality. (Apple is adopting similar ideas in its latest iOS mobile operating system, to be released this year.)
Rizk said EyeEm’s advantage over competitors is its popularity among photography enthusiasts. (It publishes a magazine and is hosting a photography festival.) People upload their pictures to the app, and other photographers can like or comment on them. The photographer-centric audience has provided a strong filtering system that serves as the baseline for its algorithm, helping it identify what makes a good photo. Rizk said the company has been testing ways to identify which pictures will go viral on social media.
At the Boston Consulting Group, Portincaso is ready to entrust more decisions to the algorithm. Instead of having to approve new marketing materials or edit client presentations, he envisions an employee being able to upload the materials to a software program like EyeEm’s, which will then decide whether the material is good enough.
“I can either say you can only do A, B or C, and police them and make sure they don’t, or I can define it centrally then have the software work with them and realize if they are being on brand or off,” he said. Bloomberg