Facebook's revelations: Real change or window dressing?
For a company bent on making the world more open, Facebook has long been secretive about the details of how it runs its social network - particularly how things go wrong and what it does about them.
Yet on Tuesday, Facebook rushed forward to alert Congress and the
public that it had recently detected a small but "sophisticated" case of
possible Russian election manipulation. Has the social network finally
acknowledged the need to keep the world informed about the big problems
it's grappling with, rather than doing so only when dragged kicking and
screaming to the podium?
While the unprompted revelation does signal a new, albeit tightly
controlled openness for the company, there is still plenty that Facebook
isn't saying. Many experts remain unconvinced that this is a true culture change and not mere window dressing.
"This is all calculated very carefully," said Timothy Carone, a
business professor at the University of Notre Dame. He and other
analysts noted that Facebook announced its discovery of 32 accounts and
pages intended to stir up U.S. political discord just a week after the
company's stock dropped almost 20 percent - its worst plunge since going
public.
But Facebook's proactive disclosure, including a conference call
for reporters with chief operating officer Sheryl Sandberg, struck a
markedly different tone from the company's ham-handed approach to a
string of scandals and setbacks over the past two years. That has
included:
- CEO Mark Zuckerberg's infamous dismissal of the suggestion that fake news on Facebook could have influenced the 2016 election as "a pretty crazy idea";
- The company's foot-dragging as evidence mounted of a 2016 Russian
election-interference effort conducted on Facebook and other
social-media sites;
- Zuckerberg, again, declining for nearly a week to publicly
address the privacy furor over a Trump campaign consultant, Cambridge
Analytica, that scavenged data from tens of millions of Facebook users
for its own election-influence efforts.
A chastened Facebook has since taken steps toward transparency,
many of them easy to overlook. In April, it published for the first time
the detailed guidelines its moderators use to police unacceptable
material. It has provided additional, if partial, explanations of how it
collects user data and what it does with it. And it has forced
disclosure of the funding and audience targeting of political
advertisements, which it now also archives for public scrutiny.
All of that is in keeping with the image of Facebook that
Zuckerberg relentlessly promotes. In his telling, the giant,
data-and-ad-driven social network is a force for good in the world that
must now reluctantly do battle with "bad actors," such as Russian
agents, who threaten Facebook's noble mission of "connecting the world."
Solving such problems, in Facebook's view, is mostly a matter of
more investment, more hard work, more hires, and better technology -
particularly artificial intelligence.
And Facebook's newfound passion for openness only goes so far. Of
the 32 apparently fake accounts and pages it found, it only released
eight to researchers. In a conference call this week, executives
declined to characterize the accounts, even in terms of whether they
leaned right or left. Facebook left it to researchers at the nonprofit
Atlantic Council, a think tank that is helping the company on election
interference, to draw those conclusions.
Facebook said its timing was motivated by an upcoming protest event
in Washington that was promoted by a suspicious page connected to a
Russian troll farm, the Internet Research Agency. Several people
connected to the IRA have been indicted by the U.S. special counsel for
attempting to interfere in the 2016 election.
Despite Zuckerberg's repeated mantra - delivered to relentless
effect in some 10 hours of testimony before Congress in April - that the
company now really gets it, some who know the company best have their
doubts.
David Kirkpatrick, the author of a Facebook history, argues that
neither Zuckerberg nor Sandberg has ever appeared "deeply alarmed in
public." As a result, he suggests, Facebook seems more
concerned with managing its image than with solving the actual problem
at hand.
Such issues run deep for the company. Some of its biggest critics,
including former employees such as Sandy Parakilas and early Facebook
investor Roger McNamee, say the company needs to revamp its business
model from the ground up to see any meaningful change.
These critics would like to see Facebook rely less on tracking its
users in order to sell targeted advertising, and to cut back on
addictive features such as endless notifications that keep drawing
people back in. Parakilas, for example, has advocated for a
subscription-based model, letting users pay to use Facebook instead of
having their data harvested.
Merely hiring more moderators, or hanging hopes on the evolution of
artificial intelligence, isn't going to cut it, in their view. There
have also been widespread calls for Facebook to acknowledge that it is,
in a sense, a media company, responsible for what happens on its
platforms - a characterization the social network has long fought.
For all that, Facebook is well ahead
of Silicon Valley rivals such as Google and Twitter when it comes to
openness - even if only because it's attracted the lion's share of
criticism, said Paul Levinson, a media studies professor at Fordham
University.
But Facebook "can't win at this
game," said Siva Vaidhyanathan, a University of Virginia professor of
media studies whose 2018 book "Antisocial Media" critiques Facebook's
effect on democracy and society. Because it's so huge - 2.2 billion
global users and counting - and so difficult to police, he said, "it
will always be vulnerable to hijacking and will never completely clean
up its content."
Worse, he says, there is no real solution. "It is hopeless," he said. "The problem with Facebook is Facebook."