What Advertising Can Learn from Kimmel, Lassie and Game of Thrones
If you read my last article here on Muse, "What Advertising Can Learn From Stranger Things and Shaun of the Dead" (shameless self-plug), then you know that humans are at once extremely neophilic—curious to discover new things—and neophobic—afraid of things that are too new. I noted that the key to making a hit is to provide audiences with a balance of both, often in the form of some well-disguised familiarity. Set them up with a fleeting feeling of uncertainty before hitting them with an aha moment. What I didn't discuss is the challenge of getting there. Specifically, our industry's tendency, more often than not, to rob consumers of that experience.
As we set out on this never-ending quest to come up with creative ideas that deliver on that aha moment, it often comes down to providing the consumer with just the right number of puzzle pieces. Give one too many, and you've lost the click of satisfaction that only comes with moving them from uncertainty to understanding. Give too few, and you've just lost them.
Tug of War
The trick is finding the delicate balance that ensures your communication is interesting, intriguing and thought-provoking—leaving something to the imagination—but not so enigmatic that it sails straight over everyone's head. Getting to that point, though, usually means entering into the longstanding tug of war between two factions that I'll refer to as the reductionists (people who think we are saying too much) and the holists (people who want to say everything). And from what I've gathered, the holists have more people pulling on their end of the rope.
With the best of intentions, the powers that be (internal and external) are often pushing to add just one more communication point, one more reason to believe, one more line of copy or bit of dialogue, one more URL or CTA. One more explanation. One more piece to the puzzle. All to make certain people will understand. Which is completely reasonable. I think reductionists and holists would agree that one of the worst things a brand can do is put something out there that absolutely no one understands.
But maybe just as bad is putting something out there that no one cares about, even if they completely understand it. Overindexing on making sure everyone understands everything means you risk losing the thing that makes creative advertising so powerful—the part where the viewer or user feels like they are in on it.
With this pressure to always add more, at some point advertising usually becomes an art of subtraction.
But Will They Get It?
Michelangelo once said that when creating a sculpture, the more stone you chisel away, the more the statue reveals itself. So this sentiment is nothing new. Yet in this business, the more you try to strip away in hopes of creating the aha moment, the more you will get one question when showing the work: "But will they get it?" They, being the consumer. And it being the idea, the moral, the takeaway, the reference, the punchline, the twist, etc. From what I can tell, people in every role, both agency- and client-side, seem to have a similar reaction to work that elicits this type of response, and it goes something like this: They see the idea, they get the idea, they like the idea, and then they think about it long and hard. Sometimes they think about it in the room. Sometimes it's over a day or two. And then they ask it: "But will other people get this?" That is usually when the holist mentality kicks into high gear and the process of overthinking and adding begins.
Without doing some sort of consumer testing (which can come with its own set of challenges), there is no answer to this question other than the subjective opinions in the room. And at that point we are up against a whole body of research that makes us unreliable judges in the matter. Several studies suggest we tend to inflate our own capacity for comprehension versus the general population. For example, there's the Dunning-Kruger Effect, which suggests that we greatly overestimate our own abilities (even if we have low ability in a given area) and underestimate the abilities of others. Or the Third Person Effect, which posits we believe that while everyone else is influenced by advertising, we are too smart and self-aware to fall for it.
Point is, we all think we are smarter than the average person, which is why we assume that just because we got it, everyone else won't necessarily get it too.
I've grappled with this question many times and found myself playing both sides of the argument to push the work to where I want it to go. If I think something needs to be added, I say, "People are dumb, we have to spell it out for them." When I want something more minimal, it's of course because "People are smarter than we give them credit for, we should treat them as such." But I wrote this to hopefully put an end to my blatant use and abuse of the general public's collective IQ and to stop all of us from halting and altering great ideas, campaigns and designs with a question that has no definitive answer. Until now, hopefully.
My answer to the question of "Will they get it?", in short, is yes. They will. And I'll tell you why in a moment. But before I do, I'd like to hedge my bets with a few caveats. This is not a defense of confusing or boring work. If the concept doesn't make sense to anyone, then it doesn't. And if it is just bad or uninteresting, then it is just that. Absolutely no one will get it if they are too bored to pay attention. That said, the next time this question comes up, a good way to think about it is, if most of the people in the room—especially those who fall within the target audience—seem to get it initially, there is a good chance that everyone else will too. Why? For starters, your audience is smarter than ever.
If you turn on the news, scroll through your aunt's political-meme-laden social feed or eavesdrop on a "customer service" altercation while waiting in line at the grocery store, you might find yourself feeling that the human species is slowly devolving. More specifically, you might be feeling that, as a whole, people are more stupid than ever.
If you're looking for a place to justify that feeling, no need to look further than a regular segment on Jimmy Kimmel Live in which his producers stop people on the street to ask a seemingly simple question. In one case, the question is, "Can you name a country?" If you have a moment, you can watch it on YouTube. You may laugh. And then you may cry.
In the video, one after another, people step up to an unlabeled map of the world and are unable to identify a single country. A few can force out the name of a continent, but even in those cases they still manage to point to the wrong spot on the map. Just as a reminder, the United States, the country they presumably live in, is a perfectly legitimate answer on the board. Which is going to make the next thing I write hard to swallow: Right now, people are collectively smarter than at any other point in history.
According to studies outlined by David Epstein in his book Range, every generation tends to think that younger generations are getting a worse education than they did. However, the exact opposite is taking place. Epstein notes that "most testable measures show that students today have a mastery of basic skills that is superior to the past ... And by and large everyone is smarter."
And thanks to the fact that robots have taken over factory lines (due in part to advances since the Industrial Revolution) and that any given piece of information is available at all times (because Google), more people than ever now find themselves in educational and work settings that are geared toward problem solving, conceptual thinking and processing all that information that we now have at our fingertips.
Is Your Partner Stupid?
We are living through what David Robson, the author of The Intelligence Trap: Why Smart People Make Dumb Mistakes, referred to as an intellectual golden age. Among the many proof points, there's the fact that average IQ scores are higher than ever. What's more, IQs have increased so much that the average person today would have been considered a genius compared to someone born a century earlier (a phenomenon known as the Flynn effect). There are of course dissenting opinions on what it all means. But it would seem that one thing rings true: By and large, our abilities to problem-solve, think conceptually and abstractly, and process information quickly are getting better. And when we are talking about "getting something," it usually falls into one of those categories.
Here's a more straightforward way to think about it. Is your partner stupid? I hope you know the answer. I ask because they're probably part of the group you're talking to with your advertising. David Ogilvy used to famously say, "The consumer isn't a moron; she is your wife." I'd like to update it for today because it was said in a different era: The consumer isn't a moron, [preferred pronoun] is your partner. And this consumer is not only smarter than ever, they are actually working really hard to understand the story you are trying to tell them.
Since the earliest humans walked the earth, stories have been a constant companion. From cave drawings, hieroglyphics and epics carved in clay tablets to spoken word and sacred texts, since as far back as we can trace, stories have been a vehicle to teach, learn, remember, preserve and entertain.
All of it has left us uniquely evolved to be most excellent watchers, readers and listeners of stories. Because of this, we are all on a constant journey to connect the dots. Ever read the first few chapters of a book and start to formulate your own hypothesis for how it ends? Ever go to a museum with a friend and, after staring at a work of art for 15 minutes, hear him reluctantly admit, "I don't get it"? Whether there is something to get or not is beside the point. What's important is that he knew—without even thinking about it—that he's inherently supposed to get something. Turns out that's fundamental to how humans operate.
In the book The Storytelling Animal: How Stories Make Us Human, Jonathan Gottschall suggests the human species has a compulsive need for meaningful experience. When we see something that we don't immediately understand, we can't help but try to make sense of it. More often than not, we use stories to help us. We are, in fact, trying so hard to extract stories from the information we receive that even when there is no story there at all, we'll just create one.
Think about the ancient Greeks, who after having no good explanation for the giant ball of fire floating across the sky, gave us Helios the sun god. He would ride across the sky each day from east to west behind his fire-darting steeds faithfully pulling—what else—a chariot (a chariot of fire, maybe). What they saw didn't make sense to them, so they made it make sense with a story. While the ancient Greeks were an extremely advanced culture, that was 2,500 years ago.
We couldn't possibly be doing stuff like that today, because we're all highly functioning geniuses compared to them, right?
In 1944, two psychologists, Fritz Heider and Marianne Simmel, tested this idea with a simple study (one that has been replicated several times since). They had participants watch an abstract video of triangles, squares and circles moving around on screen in a totally random fashion. Then they asked viewers what they saw. Only a very small percentage reported that they had just witnessed a bunch of shapes floating around on screen (which is what it was). The others saw the stuff of legend and lore—love and courtship, dancing, the foiling of an arch nemesis, war, a knight in shining armor. They saw characters with purpose, motivation and feelings. All from a bunch of random shapes.
Not only does this study illustrate our capacity and motivation to work extremely hard to extract a story, it also shows how naturally imaginative and creative we are given the right circumstances.
Gottschall notes that, when unchecked, these two things combined (need for meaningful experience + imagination) are the impetus for conspiracy theories. So if the research from 1944 feels a bit dated, and you find yourself in need of some more recent proof of this phenomenon, just turn on the news. We are wired to find a story. And if our never-ending quest to fill in the blanks wasn't enough, we are being trained to do it better with each passing day.
Remember when your parents used to tell you that TV and video games were turning your brain to mush? They lied. In his book Everything Bad Is Good for You, Steven Johnson argues that games, comic books, TV, and movies are—contrary to popular belief—making us smarter. Drawing on studies from neuroscience, economics and media theory, he suggests these forms of media actually present new cognitive challenges and make our minds measurably sharper. Take that, Ma!
Before we get too far on this subject, a quick PSA is in order. Of course screen time is a problem if not supplemented with outdoors, activities, books, exercise, etc. I am definitely not making an argument that you will be smarter if you only watch TV and play video games all the time (neither was Steven Johnson). Nor am I suggesting anyone should do that. And really, truly, everyone should read books. Maybe even ones about advertising that aren't really about advertising at all. That said, there is a misperception that the so-called boob tube is somehow making us all dumber.
Due to the rapid advancement in complexity of all the media we digest on a daily basis (television being one of the most prominent), we are learning new ways to take in a story. And the more those media advance, the harder we work to keep up.
Time Never Seems to Pass
Think about how far television has come in the last 75 years. My own experience with TV shows dating back that many years came through the time portal that is Nick at Nite. After dropping slime on people all day, Nickelodeon turns the channel over (right around when the kids are supposed to be in bed) to a slate of reruns from a bygone era. Today, its lineup is filled with shows from my own youth, like Fresh Prince of Bel-Air, Friends and Full House. But back in the late '80s and early '90s, when I was staying up past my bedtime, it was shows from the '50s and '60s that my parents and grandparents watched. Shows that still had the word "show" in their title, like The Andy Griffith Show, The Dick Van Dyke Show, The Lucy Show, Mister Ed, Green Acres, Bewitched and Lassie.
If you rewatch one of these classics today with a critical eye, they feel so simple, even a bit strange. Time never seems to pass. The characters wake up in the same place, at the same age, on the same sunny day, with the same group of people. Those people, by the way, are typically limited to a small cast you can count on one hand: a few main characters, a neighbor, the sheriff and a municipal worker of some sort, like a delivery driver or milkman.
The entire story typically takes place on screen in front of the viewer. When Timmy falls in the well, we see him fall in the well. When Lassie runs to tell Timmy's parents, we see Lassie run. Cut to the parents' reactions. And so on. Most noticeably, the conflict (in this example, Timmy falls in well) starts somewhere near the beginning of the episode and is resolved (Lassie saves the day) somewhere near the end, with clean explanations and clean resolutions, all in a nice 30-minute window. In turn, the viewer doesn't have to do much work to figure anything out.
And if that wasn't simple enough, you could tune in next week for a nearly identical storyline, except this time Timmy falls down a hill. To put it into context, during this era it was a revelation to connect a storyline over more than one episode. It was a landmark achievement when shows like the original Batman series asked viewers to tune in tomorrow—"Same Bat-time! Same Bat-channel!"—to see the conclusion of a two-episode arc.
A Therapeutic Relationship
Now think about a modern television series. How about one with six seasons and 86 hour-long episodes released over the span of a real-life decade? A story that follows the intricate, gut wrenching and all-too-normal details of a New Jersey mafioso during his ascension to the head of a crime family. A show, according to Emily Nussbaum in her book I Like to Watch, that "takes advantage, in every sense, of the odd bond a series has with its audience: an intimate dynamic that builds over time, like a therapeutic relationship."
Now imagine that after spending 10 years engrossed in this world, the very last moment of the series in which we find out the fate of the protagonist takes place off screen.
The now-infamous cut-to-black moment in the finale of The Sopranos left the ending entirely up to the viewer. The series, since heralded as one of the best TV shows of all time, is an example of how far things have come—the ending included. And the confidence its creators had in their audience, trusting them to put the pieces together on their own, is a testament to that.
While The Sopranos wasn't the first to expect a lot from its viewers, it has been marked as a bit of a watershed moment that redefined the platform. Since then, shows have only gotten more complex. Not just offbeat David-Lynchian-type mindbenders or Black Mirror's tech-infused interactive Bandersnatch, but mainstream cultural phenoms like Game of Thrones—the show that for me illustrated a deep divide between the generations that grew up on Lassie, and those who were born in the Matrix of today's television.
You've Trained for This Your Whole Life
Growing up, I noticed my father was an intense consumer of media. But he had strange reading and watching habits. He would—to my horror—read the last page of a novel first to decide whether it was worth reading the entire book. He would start TV shows in what storytellers refer to as "in medias res" (in the middle of the action) and watch episodes out of order just because they happened to be on at the time. Sometimes, he would even read a book while watching a TV show. And I think it used to work for him. He was able to fully grasp everything going on. That was 30 years ago.
I can pinpoint when I noticed that today's TV may have outpaced his viewing habits. It was during one of our weekly conversations at the height of Game of Thrones mania when I suddenly realized we were watching two different but related shows. His version was a surface-level understanding of the most straightforward plot lines. Noticeably missing were about 24 other subplots and storylines building in the background with subtle winks and nods, things that went unsaid and moments of foreshadowing that, if your eyes dropped to an alert on your phone, you'd miss. What I realized is that at some point our conversations had simply become me explaining the events leading up to a major plot twist that he either didn't see coming or didn't understand.
I'm not bringing this up to shame my 74-year-old dad. The fact that I have conversations with him about all kinds of great shows on TV proves he is, in the words of Dr. Evil, still more "hip and with it" than most people his age. Though he has admitted to me that sometimes there is just way too much going on to follow. Which makes total sense. Consider the capacity it requires to follow along to the aforementioned Lassie shows, the ones he grew up on. No doubt his viewing habits evolved past them.
But now think about the capacity and cognitive effort it takes to follow seven noble families and 52 main characters, plus over 100 other key characters (not extras—there are thousands of those), along all of their interwoven—often incestuous—storylines. They're sprawled across a universe of three major continents in a story that spans nearly a decade.
In one moment, time advances at an extremely rapid pace—an entire year may pass off screen, leaving the viewers to infer what happened in between with nothing but a trail of breadcrumbs. The next moment, an entire episode is a single battle in near real time. All of this is happening over seven seasons (OK, there are actually eight seasons, but the final season has been stricken from the record) and throughout a decade of your own actual human life.
It is so complex that major publications dedicated entire weekly columns to explaining every episode, thousands of YouTube videos were created to help recap what you just saw, and entire websites were erected to help explain WTF is going on. For my dad, it may be a bit too much. For the upcoming generations who are trained like thoroughbreds for this exact thing, it kind of just feels like watching TV. But it only feels that way because we've been practicing for this moment since the day we were born.
By now you've heard of the 10,000-hour rule for success, outlined in many books but most notably attached to Malcolm Gladwell's Outliers. It contends that one of the predictors of success in a given field, sport, skill or task is whether an individual has amassed at least 10,000 hours of practice.
If my calculations are correct, I think I'm pushing the 50,000 mark on the number of hours practicing on my TV. And if you look at the stats for Gen Z, millennials and even Gen X to an extent, we are pretty much Michael Jordan shooting free throws with his eyes closed when it comes to this stuff. So put us in, coach. Make us work.
Underestimate at Your Own Peril
This is not to say everything you make needs to be hard to figure out. Not by any means. There is a time and place to hit someone over the head with a metaphorical tack hammer.
But as you push to make great work that gives the user a chance to do a little work on their own, play in the playground you've constructed and get to the aha moment, you will inevitably find yourself trying to answer the question: "But will people get this?" When it happens, my suggestion is to not underestimate the consumer.
And more importantly, don't deprive them of the chance to find the story and experience the feeling that comes along with it. John McPhee, the creative nonfiction writer, used to say, "Make the reader want to choke you for not giving them more." If they are smarter than ever, born to find a story, and being trained to get better at it every day, they can handle it.
They will get it. And in the end, they will thank you for it.