Move the World: How Saatchi & Saatchi Wellness UK Transformed Data & Emotion into Art

Representing the world’s emotions through real-world data.

Have you ever wondered what the inside of your brain looks like? Not the squishy pink stuff — but the neurons and synapses responsible for your thoughts, feelings, and memories?

Our colleagues in the UK recently implemented a brilliant agency branding campaign that seeks to paint that picture. That’s right — they’ve found a way to use emotional data to produce digital paintings that can reflect both an individual’s mood and global sentiment.

How’s that for ambitious?

The project set out to explore our organization’s core belief that empathy is a powerful motivator for behavioral change. The ability to see the world through someone else’s eyes — to truly understand their perspective — is an essential part of creating meaningful engagement and driving a healthier world. It made sense, then, that the new Saatchi & Saatchi Wellness UK brand identity should be built to interpret and respond to human emotions. Let’s take a look at how they did it:

The Process

Working with creative technology company Random Quark, the Saatchi UK team used EEG headsets, which measure the brain's electrical activity, and a bespoke generative art process to capture and convey staff members' most emotional memories.

To create individual portraits of employees' emotions, each employee was asked to recall an emotional experience while wearing the headset: the birth of a child, the loss of a parent, their wedding day, and so on. The headset recorded 30 seconds of electrical brain activity, and the software then translated those impulses into data for the artwork.

Random Quark measured the asymmetry between the right and left brain hemispheres as well as the overall activation of the brain, then plotted the data on a 2-D model from affective psychology known as the Geneva Emotion Wheel, scoring each emotion by intensity.
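Random Quark hasn't published its signal-processing code, but for the technically curious, here is a minimal sketch of how hemispheric asymmetry and overall activation could be turned into two scores suitable for plotting on an emotion wheel. The function name, the band-power inputs, and the log-ratio asymmetry measure are all assumptions for illustration, not the team's actual pipeline:

```python
import numpy as np

def emotion_coordinates(left_power, right_power, eps=1e-9):
    """Turn hypothetical per-sample EEG band power from the two hemispheres
    into two scores: a valence-like axis derived from hemispheric asymmetry
    and an activation axis derived from overall power."""
    left = np.asarray(left_power, dtype=float)
    right = np.asarray(right_power, dtype=float)

    # Hemispheric asymmetry as a rough proxy for emotional valence:
    # relatively more left-hemisphere power reads as positive,
    # relatively more right-hemisphere power as negative.
    valence = np.log(left + eps) - np.log(right + eps)

    # Overall activation as a rough proxy for intensity/arousal.
    activation = (left + right) / 2.0

    def rescale(x):
        # Normalize each axis to [-1, 1] so every memory lands on the same wheel.
        return 2.0 * (x - x.min()) / (x.max() - x.min() + eps) - 1.0

    return rescale(valence), rescale(activation)

# Example with fake data: a 30-second recording sampled at 256 Hz per hemisphere.
rng = np.random.default_rng(0)
valence, activation = emotion_coordinates(rng.random(30 * 256), rng.random(30 * 256))
```

The log ratio of left- to right-hemisphere power is a common convention in EEG asymmetry research; a production system would also filter the raw signal and reject movement artifacts before computing anything like this.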

Using a set of seven core emotions (joy, sadness, anger, love, aversion, fear, and surprise), the team pinpointed the two predominant feelings evoked by each memory. They then assigned a specific color to each emotion: red for anger, blue for sadness, and so on, with the vibrancy of each color indicating the strength of the emotion. The result was roughly 100,000 data points per person, representing the ratio of the two colors.
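The exact palette is Random Quark's own, but the mapping described here (two dominant emotions, each tied to a color whose vibrancy scales with intensity) is straightforward to sketch. In the example below, only red for anger and blue for sadness come from the description above; the other hues, and the point count, are illustrative guesses:

```python
import colorsys

# Base hues on the HSV color wheel (0-1). Anger and sadness follow the
# article; the remaining values are guesses for illustration only.
EMOTION_HUES = {
    "joy": 0.14,       # yellow
    "sadness": 0.60,   # blue (per the article)
    "anger": 0.00,     # red (per the article)
    "love": 0.08,      # orange
    "aversion": 0.33,  # green
    "fear": 0.83,      # violet
    "surprise": 0.75,  # purple
}

def emotion_color(emotion, intensity):
    """Return an RGB tuple whose vibrancy (saturation) scales with intensity."""
    saturation = max(0.0, min(1.0, intensity))
    return colorsys.hsv_to_rgb(EMOTION_HUES[emotion], saturation, 1.0)

def blend_palette(primary, secondary, n_points=100_000):
    """Split n_points between the two predominant emotions in proportion to
    their intensities, giving the per-person color ratio."""
    (emo_a, score_a), (emo_b, score_b) = primary, secondary
    n_a = round(n_points * score_a / (score_a + score_b))
    return ([emotion_color(emo_a, score_a)] * n_a
            + [emotion_color(emo_b, score_b)] * (n_points - n_a))

# e.g. a memory that was mostly sad with an undercurrent of love:
palette = blend_palette(("sadness", 0.7), ("love", 0.4))
```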

The Look: Algorithmic Data Visualization

The team then considered how best to translate this deluge of brain data into pieces of emotional art. Random Quark's Theodoros Papatheodorou and Tom Chambers adapted a generative flocking algorithm inspired by the way birds flock and fish school.

“Some rules of the swarm rely on stochastic or random decisions and therefore are unique,” Papatheodorou said. “At the same time, keeping some rules the same we managed to create a uniform visual output. Basically, all of the images look like they were made by the same painter, on a different day.”

Flocking algorithms offered the best way to incorporate the massive amount of data produced by the brain during the EEG recording without reducing it to a handful of inputs. The resulting emotional landscapes, or EmoScapes™, formed the agency's new look.
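The renderer itself isn't public, but the family of algorithms Papatheodorou describes is the classic "boids" flocking simulation, which is well documented. The sketch below applies the three standard flocking rules (cohesion, alignment, and separation) plus the random nudge he mentions, with every agent leaving a trail of points that can then be colored from the emotion palette. All of the parameters are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng()                # stochastic decisions: each run is unique
N_AGENTS, N_STEPS = 200, 500                 # hypothetical agent count and duration

pos = rng.uniform(0, 1000, size=(N_AGENTS, 2))   # canvas coordinates
vel = rng.normal(0, 1, size=(N_AGENTS, 2))

def limit(v, max_speed):
    """Clamp each agent's speed so the swarm stays coherent."""
    speed = np.linalg.norm(v, axis=1, keepdims=True)
    return np.where(speed > max_speed, v * (max_speed / (speed + 1e-9)), v)

trail = []                                   # every visited position becomes a painted point

for _ in range(N_STEPS):
    cohesion = (pos.mean(axis=0) - pos) * 0.005      # steer toward the flock's center
    alignment = (vel.mean(axis=0) - vel) * 0.05      # match the neighbors' heading

    offsets = pos[:, None, :] - pos[None, :, :]      # pairwise offsets between agents
    dists = np.linalg.norm(offsets, axis=2) + 1e-9
    too_close = (dists < 20.0)[:, :, None]
    separation = (offsets / dists[:, :, None] * too_close).sum(axis=1) * 0.5

    noise = rng.normal(0, 0.3, size=vel.shape)       # the random element in the rules

    vel = limit(vel + cohesion + alignment + separation + noise, max_speed=4.0)
    pos = pos + vel
    trail.append(pos.copy())

canvas_points = np.concatenate(trail)        # N_AGENTS * N_STEPS points, ready to be colored
```

Because the nudges come from a random number generator, two runs over the same emotional data still produce different paintings, while the shared rules keep the whole series looking, as Papatheodorou puts it, like the work of the same painter on a different day.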

Emotion On a Global Scale

But Saatchi UK has bigger dreams for this technology than simply mapping one person's emotions. Their new website now features what's called an "Empathy Engine": a live artistic algorithm that uses IBM Watson to monitor global social media sentiment and identify the moments that are moving people all over the world. The Empathy Engine changes in real time to portray the world's emotions as a constantly shifting piece of art.
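The Empathy Engine's code isn't public either, and the specific Watson services involved aren't named, so the sketch below only captures the overall loop: pull a batch of recent posts, classify the emotion in each, tally the dominant feelings, and hand the mix to the renderer. Both fetch_social_posts and classify_emotion are deliberate stand-ins for whatever social listening and Watson-powered classification the real engine uses:

```python
import time
from collections import Counter

POLL_INTERVAL_SECONDS = 60        # hypothetical refresh rate for the artwork

def fetch_social_posts():
    """Stand-in for the social listening feed; returns a list of recent post texts."""
    raise NotImplementedError

def classify_emotion(text):
    """Stand-in for the emotion classifier (IBM Watson in the real engine);
    returns one of the seven core emotions."""
    raise NotImplementedError

def dominant_emotions(posts, top_n=2):
    """Tally every post's emotion and return the two most common, with their
    share of the total: the inputs the EmoScape renderer needs."""
    counts = Counter(classify_emotion(p) for p in posts)
    total = sum(counts.values())
    return [(emotion, n / total) for emotion, n in counts.most_common(top_n)]

def run_empathy_engine(render):
    """Poll the feed indefinitely and hand the current emotional mix to the
    artwork renderer, so the piece keeps shifting in real time."""
    while True:
        posts = fetch_social_posts()
        if posts:
            render(dominant_emotions(posts))
        time.sleep(POLL_INTERVAL_SECONDS)
```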

The global response to Brexit, for example, fueled a visualization composed primarily of deep purples and light blues, colors associated with surprise and sadness, respectively.

The election of Donald Trump, on the other hand, was depicted in bright reds and oranges, indicating simultaneous expressions of love and anger.

What’s Next?

This creative and, frankly, mind-blowing integration of data analysis, generative algorithms, and artistic expression marks an important step forward in how digital applications might be used to contextualize and accommodate human emotions. On a larger scale, the shift toward adaptive technology has the potential not only to improve the way humans work with and leverage digital applications, but also to reshape storytelling so that messaging can adapt to the wants and needs of the person receiving it.

Not to mention it makes for pretty cool art.

What do you think?
