Prince Harry and his actress wife, Meghan, Duchess of Sussex, recently broke a global Instagram record by becoming the fastest account to attract 1 million followers. It took the glamorous royals less than six hours to reach the mark under their new @sussexroyal name. Two days later, they had 3.4 million followers. They now have more than 5 million.
Why, then, 24 hours after launching @sussexroyal, would Harry be warning (not for the first time) of the dangers of sites like Instagram? He argued: "Social media is more addictive than drugs and alcohol, and it's more dangerous because it's normalized and there are no restrictions to it." He went on to call for the Fortnite video game to be banned, claiming "it's created to addict—an addiction to keep you in front of a computer for as long as possible. It's so irresponsible."
Harry is plainly aware of the essential dichotomy between "affinity" and "addiction" at the heart of the digital user experience.
How deliberately some degree of "addiction" is designed into some of the digital age's most successful brands, particularly in social media and gaming, is one of the era's most contentious subjects. Critics see addiction as a method of capturing ever more of users' attention, habits and money. By contrast, the notion of voluntary "affinity" lies behind most positive user experiences. Harry recognized this in his call to ban Fortnite, alluding to William Morris's "Have nothing in your house that you do not know to be useful, or believe to be beautiful."
Some might see blatant hypocrisy or feel he is just plain wrong, but Harry articulated what many feel about gaming and social media—increasing helplessness in the face of their addictive nature. Those involved in the genesis and evolution of these products need to take heed. The climate is turning against their creations, which can appear to be out of control in the manner of Frankenstein's monster.
Harry is the latest in a line of public figures from Justin Trudeau to Ed Sheeran, Miley Cyrus and even Silicon Valley executives to warn of the addiction and alienation that the ubiquitous spread of digital devices, social media and video games has created. Both Steve Jobs and Bill Gates limited their children's screen time, and even Tim Cook has warned we should keep children off social media. It's not just about time, but well-being, self-esteem and the lens through which we view the world.
When Harry interviewed President Obama for a mental health charity in 2017, Obama argued: "One of the dangers of the internet is that people can have entirely different realities. They can be cocooned in information that reinforces their current biases," adding that we need to find ways to "harness technology ... to promote ways of finding common ground."
With an under-pressure Facebook announcing plans to reveal (some of) its news algorithm secrets via a new "Why am I seeing this post?" button, it is clear tech giants can no longer afford to stay in the shadows about user experience (UX) design, how they collect our data and what they do with it. It places responsibility on UX designers to evolve more human-centric creativity, with digital wellness at its heart, based on a desire for greater affinity rather than a relentless quest for more "engagement" dollars.
If we continue to design digital experiences around a sense of unfulfilled need—the "dopamine effect"—then yes, Fortnite, Facebook and other brands will rightly come under ever more intense scrutiny. Those endless hours spent on Battle Royale, Instagram, Googling and swiping left risk creating generations so lacking in self-reliance, and so insecure, that they end up intellectually and emotionally crippled.
None of our tech giants became phenomena without having inherent affinity: Users loved the experience within the brand's parameters more than its rivals'. Witness the original iPhone and all it stood for. It "sparked joy," as did the "unnecessary" iPad, and the first time we used Uber's or Tinder's interface. The challenge is to not allow that affinity to become an addiction. To achieve this, we can't just rely on bewildered parents or governments trying to balance a duty of care to their citizens with freedoms of speech and trade. There is only so much mom confiscating that iPhone or the European Union enforcing GDPR can achieve.
Lasting, meaningful solutions must come from UX design itself—natural, built-in checks and balances that do not ruin the experience because they are designed into the concept, not bolted on. We need a new focus on human-centric design that is not merely a synonym for human testing. Perhaps this may bring an end to being trapped within an app for the sake of more engagement in favor of more flexible, human-centric outcomes. If we can design time and return limits into parking apps, then we can do the same with video games. We might even embrace the challenge that is the cognitive dissonance of dissenting voices and dare to design inherent balance into social media's echo chambers.
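To make the idea of a designed-in time limit concrete, here is a minimal, purely hypothetical sketch of what such a check might look like in code. The class name, time budget and cooldown are invented for illustration; real products would tune these values and surface the nudge in their own interface rather than print a message.

```python
from datetime import datetime, timedelta

class SessionGuard:
    """Hypothetical built-in check: nudges the user toward a break
    once continuous use exceeds a designed-in time budget."""

    def __init__(self, budget_minutes=45, cooldown_minutes=15):
        # Time budget and cooldown are illustrative assumptions
        self.budget = timedelta(minutes=budget_minutes)
        self.cooldown = timedelta(minutes=cooldown_minutes)
        self.session_start = datetime.now()

    def should_pause(self):
        # True once this session has run longer than the budget
        return datetime.now() - self.session_start >= self.budget

    def take_break(self):
        # Push the session clock forward so the next budget window
        # starts only after the cooldown has elapsed
        self.session_start = datetime.now() + self.cooldown

guard = SessionGuard(budget_minutes=45)
if guard.should_pause():
    print("Time for a break — come back in 15 minutes.")
```

The point of the sketch is that the limit lives inside the product's own logic from the start, rather than being bolted on by a parental-control app or an operating-system setting.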
If this notion of intrinsic "digital wellness" sounds fanciful, then it shows just how much experience design is currently found wanting. Human beings are not cyborgs. Designing for affinity, not addiction, means products giving us back greater control of the decision-making process to take advantage of the almost infinite possibilities offered us by the mini super-computers in our pockets. It means corporates coming out of those data and engagement shadows to be more honest in intent and transparent in communication.
Corporate social responsibility has fallen down the priority chain since the 2008 economic crisis, but the opportunity to pioneer digital wellness and literally "do no evil" is very real as we enter the machine learning age. How do we turn growing trepidation about our digital consumption into excitement about the possible? The answer must surely be to do away with the continuing shadowy reliance on inherent addiction and endless data capture in favor of an honest, transparent pursuit of greater affinity.