Secrets of Silicon Valley — A two-part documentary by Jamie Bartlett (part
2: The Persuasion Machine)
In Secrets of Silicon Valley, a two-part
documentary series for the BBC, tech writer Jamie Bartlett tries to uncover the
dark reality behind Silicon Valley’s glittering promise to build a better
world.
In The Persuasion Machine, the second part
of Secrets of Silicon Valley, Jamie Bartlett tells the story of how
Silicon Valley’s mission to connect all of us is plunging us into a world of
political turbulence that no-one can control.
Like Y Combinator’s President Sam Altman in last week’s
episode, Alexander Nix, CEO of Cambridge Analytica, seems to have a similar
unshakeable faith in the inevitability of the future: “It’s going to be a
revolution, and that is the way the world is moving. And, you know, I think,
whether you like it or not, it is an inevitable fact.” In an ‘Etonesque’ way he
repeats what Altman said last week: “I think if you continue this thrust of,
shouldn’t we stop progress, no-one’s going to take you seriously […].”
Both episodes are available on BBC iPlayer.
My ‘transcript’ of part 1, The Disruptors, can be found here.
Secrets of Silicon Valley (part 2) — The Persuasion Machine
“The tech gods believe the election of Donald Trump
threatens their vision of a globalised world. But in a cruel twist, is it
possible their mission to connect the world actually helped Trump to
power?” Jamie Bartlett
wonders in part two of Secrets of Silicon Valley, The Persuasion
Machine.
To answer that question, you need to understand how
Silicon Valley’s tech industry rose to power. And for that, you have to go back
20 years to a time when the online world was still in its infancy. A time when
people feared the new internet was like the Wild West, anarchic and potentially
harmful.
The Telecommunications Act of 1996 was designed to
civilise the internet, including protecting children from pornography. But
hidden within the act was a secret whose impact no-one foresaw, Bartlett tells
us. Section 230 as it is known, says, “No provider or user of an interactive
computer service shall be treated as the publisher or speaker of any
information provided by another information content provider.” It holds the key
to the internet’s freedom and has been an enabler for Silicon Valley’s accelerated
growth.
Jeremy Malcolm, an analyst at the Electronic Frontier Foundation, a civil
liberties group for the digital age, believes that without Section 230, “We
probably wouldn’t have the same kind of social media companies that we have
today. They wouldn’t be willing to take on the risk of having so much
unfettered discussion.”
Section 230 “allowed a new kind of business to spring up
— online platforms that became the internet giants of today,” Bartlett tells us.
“Facebook, Google, YouTube […] encouraged users to upload content, often things
about their lives or moments that mattered to them, onto their sites for free.
And in exchange, they got to hoard all of that data but without a real
responsibility for the effects of the content that people were posting. […] At
first, the tech firms couldn’t figure out how to turn that data into big
money.”
But that changed when a secret within that data was
unlocked. Allowing Facebook users to be targeted using data about what they do on
the rest of the internet opened up vast profits and has propelled Silicon
Valley to the pinnacle of the global economy. “The secret of targeting us with
adverts is keeping us online for as long as possible. […] Our time is the Holy
Grail of Silicon Valley,” Bartlett tells us. So what is it that is keeping us
hooked to Silicon Valley’s global network?
Bartlett is off to Seattle to meet Nathan Myhrvold, who
saw first-hand how the tech industry embraced new psychological insights into
how we all make decisions. A decade ago, Myhrvold “brought together Daniel Kahneman, the
pioneer of the new science of behavioural economics [and author of Thinking,
Fast and Slow], and Silicon Valley’s leaders for a series of meetings.”
“A lot of advertising is about trying to hook people in
these type-one [fast, intuitive ‘System 1’ thinking] things to get interested
one way or the other,” Myhrvold
explains. “You’re putting a set of triggers out there that make me want to
click on it.” He adds, “Tech companies both try to understand our behaviour by
having smart humans think about it and increasingly also having machines think
about it.”
Of course, trying to grab the consumer’s attention is
nothing new. It is essentially what advertising is all about. But insights into
how we make decisions have helped Silicon Valley to shape the online world. And
little wonder: their success depends on keeping us engaged.
As Silicon Valley became more influential, it also
started to attract powerful friends in politics, starting with Barack Obama,
who was regarded by people in Silicon Valley as a kindred spirit. Just like
them, Obama believed that “we can solve problems if we would work together and
take advantage of these new capabilities that are coming online,” says Aneesh
Chopra, Obama’s first Chief Technology Officer. And by the time he won his
second term, Obama was feted for his mastery of social media’s persuasive
power.
“Facebook’s mission to connect the world went
hand-in-hand with Obama’s policies promoting globalisation and free markets.
And Facebook was seen to be improving the political process itself,” according
to Bartlett. “But across the political spectrum, the race to find new ways to
gain a digital edge was on. The world was about to change for Facebook.”
“That data-driven decision-making played a huge role in
creating a second term for the 44th President and will be one of the more
closely studied elements of the 2012 cycle. It’s another sign that the role of
the campaign pros in Washington who make decisions on hunches and experience is
rapidly dwindling, being replaced by the work of quants and computer coders who
can crack massive data sets for insight. As one official put it, the time of
‘guys sitting in a back room smoking cigars, saying We always buy 60 Minutes’
is over. In politics, the era of big data has arrived.” — Michael Scherer
in How Obama’s data
crunchers helped him win
Bartlett subsequently takes us to the heart of Silicon Valley,
Stanford University, where he meets Michal Kosinski, a psychologist specialised
in psychometrics (the science of predicting psychological traits, such as
personality) who is investigating just how revealing Facebook’s hoard of
information about each of us could really be.
Kosinski explains how you can measure psychological
traits using the digital footprints we leave behind on the internet. “An
algorithm that can look at millions of people and […] hundreds of thousands […]
of your likes can extract and utilise even those little pieces of information
and combine it into a very accurate profile,” he tells Bartlett. “It can also use
your digital footprint and turn it into a very accurate prediction of your intimate
traits, like religiosity, political views, personality, intelligence, sexual
orientation and a bunch of other psychological traits.”
This algorithm can also predict people’s political
persuasions. People who score high on ‘openness to experience’ tend to be
liberal; those who score low, more conservative. If you then use another
algorithm to adjust the messages those people receive, this “obviously
gives you a lot of power,” according to Kosinski.
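Kosinski’s description can be made concrete with a toy sketch. The code below is a minimal illustration, not his actual model (his studies trained on real Facebook likes from millions of volunteers): it simulates a like matrix, plants a correlation between a few pages and a binary trait, and fits a logistic regression to recover it.

```python
# A minimal, illustrative sketch of trait prediction from likes
# (NOT Kosinski's actual model): fit a logistic regression that maps
# a user's page likes to a binary psychological trait. All data here
# is simulated; real studies used likes from millions of volunteers.
import numpy as np

rng = np.random.default_rng(0)
n_users, n_pages = 200, 50

# 0/1 matrix: entry [u, p] is 1 if user u liked page p.
likes = rng.integers(0, 2, size=(n_users, n_pages)).astype(float)

# Hypothetical ground truth: five pages correlate with the trait.
true_w = np.zeros(n_pages)
true_w[:5] = 2.0
p_true = 1 / (1 + np.exp(-(likes @ true_w - 5.0)))
trait = (rng.random(n_users) < p_true).astype(float)

# Fit by plain gradient descent on the logistic loss.
w, b = np.zeros(n_pages), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(likes @ w + b)))
    w -= 0.1 * likes.T @ (p - trait) / n_users
    b -= 0.1 * np.mean(p - trait)

pred = (1 / (1 + np.exp(-(likes @ w + b)))) > 0.5
accuracy = float(np.mean(pred == trait))
```

On this toy data the model classifies users well above chance; the striking real-world finding was that such models, given enough likes, rival human judges of personality.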
It’s a powerful way of understanding people, but Bartlett
“can’t help fearing that there is that potential, whoever has that power,
whoever can control that model will have sort of unprecedented possibilities of
manipulating what people think, how they behave, what they see, whether that’s
selling things to people or how people vote, and that’s pretty scary too.”
Next, Bartlett tries to uncover how the expertise
of Cambridge Analytica in
personality prediction played a part in Donald Trump’s presidential win, and
how his campaign exploited Silicon Valley’s social networks. In San Antonio,
Texas, he meets Theresa Hong, Trump’s former Digital Content Director, to get
an understanding of what they actually did — “who they were working with, who
was helping them, what techniques they used.”
“Cambridge Analytica were using data on around 220
million Americans to target potential donors and voters. Armed with Cambridge
Analytica’s revolutionary insights, the next step in the battle to win over
millions of Americans was to shape the online messages they would see. Adverts
were tailored to particular audiences, defined by data. Now the voters
Cambridge Analytica had targeted were bombarded with adverts” delivered
through Silicon Valley’s vast social networks, Bartlett tells us.
People from Facebook, YouTube and Google, who were
working alongside Donald Trump’s digital campaign team, were “basically our
kind of hands-on partners as far as being able to utilise the platform as
effectively as possible,” Hong tells Bartlett. “When you’re pumping in millions
and millions of dollars in these social platforms [the Trump campaign spent the
lion’s share of its advertising budget, around $85 million, on Facebook], you’re
going to get white-glove treatment, so they would send people […] to ensure
that all our needs were being met.” She adds, “Without Facebook, we wouldn’t have
won. I mean, Facebook really and truly put us over the edge. Facebook was the
medium that proved most successful for this campaign.”
Trump’s digital strategy was built on Facebook’s
effectiveness as an advertising medium. “It’s become a powerful political tool
that’s largely unregulated.” Facebook didn’t want to meet him but “made it
clear that political campaigns, like all advertisers on Facebook, must
ensure their ads comply with all applicable laws and regulations,” Bartlett
tells us. “The company also said that no personally identifiable information can
be shared with advertising, measurement or analytics partners unless people
give permission.”
Off to London, where Bartlett meets Alexander Nix,
Cambridge Analytica’s CEO, to find out how the company used psychographics to
target voters for the Trump campaign.
When he asks Nix if he can understand why some people
might find using big data and psychographics “a little bit creepy,” Nix
replies, “No, I can’t. Quite possibly the opposite. I think the move away from
blanket advertising towards ever more personalised communication, is a natural
progression. I think it is only going to increase.” People should “understand
the reciprocity that is going on here — you get points [in the case of a
supermarket loyalty card], and in return, they gather your data on your
consumer behaviour.”
But Bartlett wonders whether shopping and politics are
really the same thing.
“The technology is the same,” according to Nix. “In the
next ten years, the sheer volumes of data that are going to be available, that
are going to be driving all sorts of things, including marketing and
communications, is going to be a paradigm shift from where we are now. It’s going
to be a revolution, and that is the way the world is moving. And, you know, I
think, whether you like it or not, it is an inevitable fact.”
“Cambridge Analytica’s rise has rattled some of President
Trump’s critics and privacy advocates, who warn of a blizzard of
high-tech, Facebook-optimized
propaganda aimed at the American public, controlled by the
people behind the alt-right hub Breitbart News. Cambridge is principally owned
by the billionaire Robert Mercer, a Trump backer and investor in Breitbart.
Stephen K. Bannon, the former Breitbart chairman who is Mr. Trump’s senior
White House counselor, served until last summer as vice president of
Cambridge’s board. But a dozen Republican consultants and former Trump campaign
aides, along with current and former Cambridge employees, say the company’s
ability to exploit personality profiles — ‘our secret sauce,’ Mr. Nix once
called it — is exaggerated.” — Nicholas Confessore and Danny Hakim in Data Firm Says
‘Secret Sauce’ Aided Trump; Many Scoff
“The election of Donald Trump was greeted with barely
concealed fury in Silicon Valley. But Facebook and other tech companies had
made millions of dollars by helping to make it happen. Their power as
advertising platforms had been exploited by a politician with a very different
view of the world. But Facebook’s problems were only just beginning. Another
phenomenon of the election was plunging the tech titan into crisis,” says
Bartlett.
“Fake news had provoked a storm of criticism over
Facebook’s impact on democracy. [Mark Zuckerberg] claimed it was extremely
unlikely fake news had changed the election’s outcome. But he didn’t address why
it had spread like wildfire across the platform. Meet Jeff Hancock, a
psychologist who has investigated a hidden aspect of Facebook that helps
explain how the platform became weaponised in this way. It turns out the power
of Facebook to affect our emotions is key, something that had been uncovered in
an experiment the company itself had run in 2012. The news feeds of nearly
700,000 users were secretly manipulated so they would see fewer positive or
negative posts.”
Hancock, who helped interpret these results, found that
people who saw fewer negative emotion words in their feeds would write
with less negative and more positive emotion in their own posts, and vice
versa. “This is consistent with the emotional contagion theory,”
he adds. “Basically, we were showing that people were writing in a way that was
matching the emotion that they were seeing in the Facebook news feed.”
Furthermore, “The more intense the emotion in content, the more likely it is to
spread, to go viral. And it doesn’t matter whether it is sad or happy, like
negative or positive, the more important thing is how intense the emotion is.”
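The word-counting idea behind such measurements can be sketched in a few lines. The real study used the LIWC emotion-word dictionary; the tiny word lists below are invented stand-ins, so this only illustrates the counting approach, not the actual instrument.

```python
# Toy emotion scorer in the spirit of word-count sentiment analysis.
# The real study used the LIWC dictionary; these lists are invented.
POSITIVE = {"love", "great", "happy", "wonderful", "good"}
NEGATIVE = {"hate", "sad", "awful", "terrible", "bad"}

def emotion_score(post: str) -> float:
    """Share of positive minus negative words; > 0 means net positive tone."""
    words = post.lower().split()
    if not words:
        return 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return (pos - neg) / len(words)
```

Averaging such scores over a user’s posts, before and after changing what their feed shows them, is the shape of the comparison the experiment made.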
“The process of emotional contagion helps explain why
fake news has spread so far across social media,” Bartlett tells us. The problem
with social networks, however, is that all information is treated equally.
“[Y]ou have good, honest, accurate information sitting alongside and treated
equally to lies and propaganda. And the difficulty for citizens is that it can
be very hard to tell the difference between the two,” as Barack Obama also
pointed out during a press conference with Germany’s Chancellor Angela Merkel.
“In an age where there is so much active misinformation,
and it’s packaged very well, and it looks the same when you see it on a
Facebook page or you turn on your television, if everything seems to be the
same and no distinctions are made, then we won’t know what to protect. We won’t
know what to fight for.” — Barack Obama
But data scientist Simon Hegelich has
discovered an even darker side to the way Facebook is being manipulated.
Hegelich has found evidence the debate about refugees on Facebook is being
skewed by anonymous political forces. “One statistic among many used by
Facebook to rank stories in your news feed is the number of likes they get.” In
the example Hegelich gives, only a handful of people, 25 to be precise, each
liked more than 30,000 comments over six months. These hyperactive accounts
could be run by real people, or software.
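Hegelich’s statistic — 25 accounts each liking more than 30,000 comments in six months — suggests a simple detection approach: count likes per account over a window and flag extreme outliers. The sketch below simulates such a like log (all account names and volumes are invented for illustration) and flags the hyperactive accounts.

```python
# Illustrative sketch of the kind of outlier analysis Hegelich describes:
# count likes per account over a time window and flag extreme accounts.
# All account names and volumes below are invented for illustration.
from collections import Counter
import random

random.seed(1)

# Simulated like log: (account, comment_id) pairs over six months.
like_log = [(f"user{random.randrange(5000)}", i) for i in range(100_000)]
# A handful of hyperactive accounts each like 30,000 comments.
like_log += [(f"bot{b}", i) for b in range(25) for i in range(30_000)]

likes_per_account = Counter(account for account, _ in like_log)

# Flag accounts whose volume is far beyond plausible human activity.
hyperactive = {a: n for a, n in likes_per_account.items() if n > 10_000}
```

As the documentary notes, counts alone cannot tell whether such accounts are run by people or by software; they only show that the like signal is easy to game.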
“This is evidence that the number of likes on Facebook can
be easily gamed as part of an effort to try to influence the prominence of
anti-refugee content on the site,” says Bartlett.
When asked if this worries him, Hegelich answers, “It’s
definitely changing [the] structure of public opinion. Democracy is built on
public opinion, so such a change definitely has to change the way democracy
works.”
According to Facebook, “they are working to disrupt the
economic incentives behind false news, removing tens of thousands of fake
accounts, and building new products to identify and limit the spread of false
news.” But it is still trying to hold the line, based on Section 230, that it
isn’t a publisher.
“Facebook now connects more than two billion people
around the world, including more and more voters in the West,” Bartlett tells
us. “In less than a decade, it has become a platform that has dramatic
implications for how our democracy works.”
“Old structures of power are falling away. Social media
is giving ordinary people access to huge audiences. And politics is changing as
a result” across the entire spectrum, as shown by The Canary, an online political
news outlet that supported Labour leader Jeremy Corbyn during the 2017 general election.
“During the campaign, their stories got more than 25 million hits on a tiny
budget.” About 80 percent of its readership comes through Facebook, says
Kerry-Anne Mendoza, The Canary’s Editor in Chief.
Its presentation of pro-Corbyn news plays on emotions and
is tailored to social media. “We’re trying to have a conversation with a lot of
people, so it is on us to be compelling,” Mendoza says. “Human beings work on
facts, but they also work on gut instincts. They work on emotions, feelings and
fidelity and community. All of these issues.” When Bartlett points out that The
Canary’s headlines are very “clickbait-y,” she says, “Of course [the headlines]
are there to get clicks. We don’t want to have a conversation with ten people.
You can’t change the world talking to ten people.”
Bartlett’s closing words to the series …
“The tech gods are giving all of us the power to
influence the world. Social media’s unparalleled power to persuade, first
developed for advertisers, is now being exploited by political forces of all
kinds. Grassroots movements are regaining their power, challenging political
elites. Extremists are discovering new ways to stoke hatred and spread lies.
And wealthy political parties are developing the ability to manipulate our
thoughts and feelings using powerful psychological tools, which is leading to a
world of unexpected political opportunity and turbulence.
I think the people that connected the world really
believed that somehow, just by us being connected, our politics would be
better. But the world is changing in ways that they never imagined and are
probably no longer happy about. But in truth, they are no more in charge of
this technology than any of us are now.
The people who are responsible for building this
technology, for unleashing this disruption onto all of us, don’t ever feel like
they are responsible for the consequences of any of that. They retain this
absolute religious faith that technology and connectivity is always going to
make things turn out for the best. And it doesn’t matter what happens, it
doesn’t matter how much that’s proven not to be the case, they still believe.”
“In the next ten years, the sheer volumes of data that
are going to be available, that are going to be driving all sorts of things,
including marketing and communications, is going to be a paradigm shift from
where we are now. It’s going to be a revolution, and that is the way the world
is moving. And, you know, I think, whether you like it or not, it is an
inevitable fact.” — Alexander Nix, CEO of Cambridge Analytica
Original link: https://medium.com/@marksstorm/secrets-of-silicon-valley-a-two-part-documentary-by-jamie-bartlett-part-2-the-persuasion-fa8b832a2aa2