#11: Giving Trust The Reboot
How the tech industry can move from shaky to higher ground
Scott Bryan · Jan 29, 2020
Here is your slightly-longer-but-way-more-interesting-than-the-last-one weekly newsletter!
Reading time 6 minutes
"A movement to be “post-digital” will emerge in 2020. We will start to realise that being chained to your mobile phone is a low-status behaviour, similar to smoking."
BJ Fogg (Founder, Persuasive Tech Lab, Stanford University)
Last week's post demonstrated how Facebook manipulates user behaviour by scattering numbers all over its interface. Turning people's basic need to feel valued into a quantifiable measurement (“likes,” comments, shares, friends, notifications and so on) hooks people to their devices.
This business model of maximising user-engagement is eroding trust. And it's unsustainable, both for future company profitability and user well-being.
The screen tech industry needs to find a way to reboot trust.
A brief history of trust
Once upon a time, trust was a local phenomenon. Tribes were small and everyone knew each other. People put their faith in face-to-face contact. If someone lied or broke that trust, the rest of the tribe heard about it, and the offender could be shamed or ostracised.
Then, along came urbanisation and the Industrial Revolution. For the first time, strangers were everywhere. A new system of trust was required, so institutions were created. Banks were entrusted with your money, police were relied upon to uphold law and order and faith was put in schools to educate children.
When the Internet arrived on the scene, it gradually altered these long-held relationships with trust. It removed the hierarchy of institutional trust, empowering the individual to take back control, in what trust expert Rachel Botsman calls “distributed trust.”
These heady days of the Internet reflected the inherent goodness of people. You could buy a book from a random person on eBay, safe in the knowledge they'd send it to you, or trust a stranger to sleep in your home while you were away.
This decentralised system of accountability still works, but gargantuan, unforeseen problems have emerged in the meantime.
Problems? What problems?
Initially we thought web platforms were neutral. They simply enabled buyer and seller, host and guest, to transact.
As companies grew larger, however, we discovered the unintended consequences of scale.
The tech industry's fetish for incessant growth, expressed through maximising user engagement, created new dangers:
Facebook stands accused of facilitating election engineering
YouTube is blamed for promoting polarisation and outrage
Airbnb found itself criticised for exacerbating a global rental crisis
This business model appears to shorten attention spans and create information overload.
A flight to higher ground
“The humane tech movement is like a back-to-the-land movement for the human psyche”
Tristan Harris (Center for Humane Technology)
The Center for Humane Technology believes that trust, not attention, will become the ultimate “currency of the future.”
As the world becomes less trustworthy and more complex, short of following Thoreau into the woods, we have little choice but to trust the untrustworthy. To do otherwise would mean spending days reading terms and conditions, researching company reputations and generally driving ourselves to distraction.
We would quickly become overwhelmed and exhausted until, eventually, we reach a tipping point.
Not knowing who or what to trust, a new organisation might emerge, using metrics that mirror our values instead of manipulating them. This could trigger, among users, what philosopher and social scientist Joe Edelman calls “a flight to higher ground,” where people abandon the familiar platforms for new ones.
So how do companies reboot trust?
If compelled, Apple, Google or Facebook could trigger this flight to higher ground and lead a race to the top where they could be known as “The Trust Company.”
They would achieve this by focusing on the reasons people access their products and services, rather than their behaviour.
Instead of asking whether people are swiping, watching, sharing, scrolling or liking, what if the metrics focused on the real reasons people use the service?
Maybe a user spends hours on Facebook because they're lonely, or perhaps they simply want to laugh on YouTube as respite from their screaming kids.
Whatever the reasons, companies could listen to them.
Joe Edelman calls this approach “whole person analytics”: analytics concerned with reasons and fulfilment, not just superficial behaviour.
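To make the contrast concrete, here's a minimal sketch of the difference between an engagement metric and a reason-centred one. Everything here is hypothetical: the `Session` fields, the stated reasons and the 1-5 fulfilment ratings are my own illustration, not anything Edelman or any platform actually implements.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Session:
    user: str
    stated_reason: str   # hypothetical: why the user said they logged on
    minutes: int
    fulfilment: int      # hypothetical: post-session self-rating, 1-5

def engagement_metric(sessions):
    """Classic metric: total minutes on screen, regardless of why."""
    return sum(s.minutes for s in sessions)

def whole_person_metric(sessions):
    """Sketch of a reason-centred metric: average self-reported
    fulfilment per stated reason, rather than raw time."""
    by_reason = defaultdict(list)
    for s in sessions:
        by_reason[s.stated_reason].append(s.fulfilment)
    return {reason: sum(r) / len(r) for reason, r in by_reason.items()}

sessions = [
    Session("scott", "learn UX design", 12, 5),
    Session("scott", "learn UX design", 95, 2),  # long rabbit hole, unfulfilling
    Session("scott", "unwind", 20, 4),
]
print(engagement_metric(sessions))    # 127 minutes "engaged"
print(whole_person_metric(sessions))  # {'learn UX design': 3.5, 'unwind': 4.0}
```

The engagement metric scores the 95-minute rabbit hole as a success; the reason-centred metric flags it as the least fulfilling use of the service.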
Take YouTube as an example:
I'm currently studying for a Professional Diploma in UX Design. Whenever I log on to YouTube to learn, I feel YouTube fails to respect this reason, as it keeps trying to distract me with “entertaining” videos.
Only three out of the top eight recommended videos relate to my reason for being there.
You see, it knows, based on past experiments with billions of other users, that “entertaining” videos will keep me engaged the longest. And it also knows, based on my previous viewing habits, that I enjoy the Graham Norton Show.
Since more time spent on screen means greater revenue, YouTube isn't overly concerned with my reasons for being there.
It just wants to maximise engagement.
So when I get lost for hours down a rabbit hole of funny Graham Norton Show clips, the tally of numbers tells the company that Scott is “engaged”. YouTube thinks it's succeeding.
Yet on a personal level, when I log off, I feel guilty. I've wasted precious time distracted, when I wanted to learn.
YouTube isn’t cooperating with me, it’s trying to convert me.
What could YouTube do differently?
If YouTube cared about the reasons I log into its service, it would spot that I'd been procrastinating for over an hour. It would know I came to learn about UX Design.
YouTube could prompt me with a friendly reminder:
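Something along these lines, perhaps. This is purely a sketch of the idea, not a real YouTube feature: the function name, the stated-reason input and the 60-minute threshold are all my own assumptions.

```python
from typing import Optional

def nudge(stated_reason: str, current_topic: str,
          minutes_off_topic: int, threshold: int = 60) -> Optional[str]:
    """Hypothetical: return a gentle reminder once time spent away from
    the user's stated reason passes a threshold; otherwise stay silent."""
    if current_topic != stated_reason and minutes_off_topic >= threshold:
        return (f"You came here to {stated_reason}, but you've spent "
                f"{minutes_off_topic} minutes on {current_topic}. "
                "Want to get back to it?")
    return None

print(nudge("learn UX design", "Graham Norton Show clips", 75))
```

The key design choice is that the prompt is driven by the user's own stated reason, not by what maximises watch time.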
Or let's take things a step further: we could expect these platforms to help us get off our screens (low trust) and into real-life situations with friends (high trust).
But where's the personal responsibility in all of this?
Look, I get it! I need to stop blaming technology for my personal failings and start blaming myself.
This is the central thesis of behavioural scientist Nir Eyal (who has advised tech companies on how to build habit-forming products).
The trouble I have with this argument is this: every time we log onto Google, YouTube or Facebook, we fail to realise that a supercomputer is pointed at our brains.
Billions of dollars of supercomputing power are activated, behind which sit the best psychologists, engineers and marketers in the world.
These technologies are only going to get more powerful and more persuasive.
What chance have even the most strong-willed among us of coping with this onslaught of manipulative technology?
We can't be expected to go it alone; the companies must adapt too.
A new approach
Think about this for a second: heart disease, obesity, industrial pollution, social isolation, diabetes, credit card debt and Black Friday shopping all have one thing in common with internet addiction.
They are all symptoms of an epidemic of overconsumption.
The maximisation of user engagement, both online and offline, encourages this.
A radical shift in how the global economy functions is required.
As more and more industries move online, the tech industry could become the place where this general economic model of over-consumption is challenged.
“If we keep focusing on sales, views, and clicks, we'll wind up fat, depressed, socially isolated, diabetic, bloodshot staring at screens or jacked into VR, and surrounded by piles of junk we regret buying.”
Joe Edelman (Social scientist and philosopher)
Kickstarter co-founder Yancey Strickler calls for the abandonment of “financial maximisation”, where companies view money as the only value metric.
We need a broader definition of value, one that recognises the other valuable things in life, such as love, community, security, knowledge and, of course, trust.
In return, we would pay YouTube, or whichever services we trust most, a monthly subscription fee.
Their business model would not be reliant on advertising or maximising user engagement.
It’s a lot more complicated than this, I know.
The problems run deep.
This is just the start of the conversation.
In the meantime, as the unintended consequences of these platforms play out, the role the tech industry plays in rebuilding trust might prove one of the most critical issues of our times.
You made it!
That’s all for this week.