When it comes to recognizing things in photographs, algorithms and robots are getting pretty smart. Sometimes they seem almost scary smart. MIT’s latest photo-ogling algorithm claims it can tell how memorable a picture is based on a few specific visual cues.
The project studies large-scale image memorability, and you can download the full research paper here. Essentially, it predicts how likely someone is to remember the contents of an image roughly 100 seconds after having seen it.
The model was trained and evaluated on a pool of roughly 60,000 images.
You can upload your own photo to the site (as always, if you’re concerned about copyright of your images—and you should be—read the terms before uploading) and try it for yourself.
Many people immediately think of the social media implications of an algorithm like this, but it could also have research and commercial applications, such as testing ad campaigns, if it proves as accurate as it claims.
There are some interesting tidbits to be found in the paper, too. For instance, if you want someone to remember a photo of you, it’s probably best to look disgusted or amused. “We find that images that evoke disgust are statistically more memorable than images showing most other emotions, except for amusement. Further, images portraying emotions like awe and contentment tend to be the least memorable.”
Photos are also more memorable if there’s one central object to focus on rather than a whole collection of things.
As tech like this improves, it's worth considering how the feedback could be used. When a photo "just doesn't feel right," an algorithm may eventually be able to tell you why.