This article was written by Silvana Churruca between 2011 and 2014. Silvana is a UX Designer & Researcher with a strong multidisciplinary background: starting with Graphic Design, Arts and Communications, and later specializing in Cognitive Systems and Interactive Media, Product and Project Design Management, and Communication Design Theory. Follow Silvana on her LinkedIn.

At Google I/O 2013, Alex Faaborg gave a presentation on Cognitive Science and Design. Beyond a great introduction to human factors, his talk was full of powerful examples of how to translate research insights into design solutions. At the same event, Rachel Garb and Helena Roeber gave another excellent presentation on Android Design Principles.

In recent years I have been following Google's research material, and it is fascinating to see how a paper you read earlier is now shaping Google's product experience. I would like to share three inspiring examples of how Google goes from research to design. I selected these three because they move from the particular to the general: starting with interface design, moving through functionality, and finishing at strategy. I hope these three lessons inspire you as much as they have inspired me!

1. “Pictures are faster than words”

Facial recognition happens in a dedicated part of the brain (the fusiform face area), and humans are known to be particularly good at it: so good that we see faces even where there are none. This makes facial recognition a fast and effective process. Most of the time we can recognize a familiar face even when we can't recall the person's name. (Fascinatingly, the same face-recognition brain areas also activate when you read handwritten text.)

The first example, used by Alex in his presentation, does not come from Google's own research but from the extensive neuroscience literature on visual perception and cognition. Google, however, makes a great point of turning the concept into an interface design solution, making the scanning of email recipients faster and more effective.

The solution is simple and effective: if facial recognition is faster than reading, why not use pictures to identify contacts? By adding the contact's picture next to each email in the inbox and next to each contact when you are selecting recipients, Google's designers get exactly what they were after: they put your facial recognition system to work. I bet that, like me, you jump directly from picture to picture to confirm that this is the person you want to message. Including contacts' pictures is not merely an aesthetic touch; above all, it is a cognitive solution that is undeniably faster and simpler than reading.

Google's lesson: pictures are faster than words


If you are interested in learning more about this and other human factors, check out Susan Weinschenk’s books, also recommended by Alex Faaborg in his presentation.

2. In a cross-device world, the most common link between devices is web search

The most common link between devices (screens) is search: people tend to move between devices by searching again on the second device for activities like 'searching for info', 'browsing the Internet', or 'shopping online'. The second most common link is navigating directly to the destination site.

The second example is about using Google's contextual research (ethnographies, surveys, and diary studies) to inform design solutions. In this case the solution is not an interface but a piece of functionality. If we observe that users tend to use web search to jump from one device to another when performing related activities (sequential and complementary), why not facilitate those connections by keeping users' recent searches available across all devices?

With Chrome's cross-device history, Google gives users easy access from their home tablet to a website they visited earlier on a work PC, without the need to save anything. Thanks to recent search history, Google Now lets users pick up one device's recent search on another without even browsing that device's history. Moreover, with its cross-device tracking service, Google is not just improving users' multi-screen experience but also meeting a market need to track conversions, which we now know almost never start and finish on the same device in a single session.
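The key design idea here is that search history belongs to the account, not to the device. As a minimal sketch (all names here are hypothetical, and the real Chrome sync service is of course far more complex), the concept can be modeled like this:

```python
# Illustrative sketch only: a toy model of search history keyed by
# account rather than device, so any signed-in device can resume it.
from collections import defaultdict


class SearchHistory:
    """Keeps each account's most recent searches, shared across devices."""

    def __init__(self, limit=10):
        self.limit = limit
        # account -> list of (device, query), oldest first
        self._history = defaultdict(list)

    def record(self, account, device, query):
        entries = self._history[account]
        entries.append((device, query))
        # Keep only the most recent `limit` searches per account.
        del entries[:-self.limit]

    def recent(self, account):
        # Any device signed into the account sees the same list.
        return [query for _device, query in self._history[account]]


history = SearchHistory()
# Searches made on the work PC...
history.record("user@example.com", "work-pc", "flight to Berlin")
history.record("user@example.com", "work-pc", "Berlin hotels")
# ...are available later on the tablet, with nothing saved manually:
print(history.recent("user@example.com"))
```

The point of the sketch is the data model: because the dictionary is keyed by account, the "jump between devices" behavior the research observed falls out for free.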








3. It takes three positive emotions to outweigh each negative one

We need at least three heartfelt positive emotions to open us up and lift us up for every heart-wrenching negative emotion (Dr. Fredrickson)

The last example is probably my favorite because of its originality and potential. It was also presented at Google I/O 2013 by Rachel Garb, lead of Android's interaction design team, and Helena Roeber, head of Android's UX research team.

In this case we have, as I said, a curious and creative use of research. Google's Android design team was building its Design Principles, but rather than simply relying on a list of good practices, they built their principles on the foundations of John Gottman's research, which found that successful marital relationships can be predicted by the ratio of positive to negative emotions.

If people spend as much time, or even more, interacting with technology than with other people, why not apply the same research on human relationships and emotions to the relationships we have with our closest tech devices? The team developed a system that lets them apply design principles at the 3 (positive) to 1 (negative) ratio that Marcial Losada and Dr. Barbara Fredrickson found to characterize more productive work teams (Losada) and people with more successful life outcomes (Fredrickson).

Google’s Jars of Marbles turn out to be (besides some controversial around the positivity ratio) the perfect way to ensure a positive user experience equilibrium in Android design.









Hungry for more? Don’t miss Alex Faaborg, Rachel Garb and Helena Roeber's presentations!…

Alex's presentation is full of other examples of how to translate scientific research insights into design solutions, ranging from physiological conditions such as visual perception to cognitive functions such as attention, memory, and emotion.

Rachel Garb and Helena Roeber also give plenty of examples of design solutions that came from research, such as the Android Design Principles themselves, which emerged from an extensive UX study called the 'Android baseline study' that combined several techniques (diaries, in-home visits, interviews, and observations).

Other recommended research resources from Google

Google has made much of this material public; to get started, check some of these links:
