In 2016, Facebook's algorithms enabled the spread of fringe political views and fake news, which made it easier for Russian actors to manipulate the American electorate.
In January 2018, Mark announced that the algorithms would favor non-news content shared by friends and news from “trustworthy” sources, which his engineers interpreted — to the confusion of many — as a boost for anything in the category of “politics, crime, tragedy.”
Facebook has responded to many of the criticisms of how it manages speech by hiring thousands of contractors to enforce the rules that Mark and senior executives develop.
After a few weeks of training, these contractors decide which videos count as hate speech or free speech, which images are erotic and which are simply artistic, and which live streams are too violent to be broadcast.
(The Verge reported that some of these moderators, working through a vendor in Arizona, were paid $28,800 a year, got limited breaks and faced significant mental health risks.)
As if Facebook’s opaque algorithms weren’t enough, last year we learned that Facebook executives had permanently deleted their own messages from the platform, erasing them from the inboxes of recipients; the justification was corporate security concerns.
When I look at my years of Facebook messages with Mark now, it’s just a long stream of my own light-blue comments, clearly written in response to words he had once sent me.
(Facebook now offers this as a feature to all users.)
The most extreme example of Facebook manipulating speech happened in Myanmar in late 2017.
Mark said in a Vox interview that he personally made the decision to delete the private messages of Facebook users who were encouraging genocide there.
“I remember, one Saturday morning, I got a phone call,” he said, “and we detected that people were trying to spread sensational messages through — it was Facebook Messenger in this case — to each side of the conflict, basically telling the Muslims, ‘Hey, there’s about to be an uprising of the Buddhists, so make sure that you are armed and go to this place.’ And then the same thing on the other side.”
Mark made a call:
“We stop those messages from going through.”
Most people would agree with his decision, but it’s deeply troubling that he made it with no accountability to any independent authority or government.
Facebook could, in theory, delete en masse the messages of Americans, too, if its leadership decided it didn’t like them.
Mark used to insist that Facebook was just a “social utility,” a neutral platform for people to communicate what they wished.
Now he recognizes that Facebook is both a platform and a publisher and that it is inevitably making decisions about values.
The company’s own lawyers have argued in court that Facebook is a publisher and thus entitled to First Amendment protection.
No one at Facebook headquarters is choosing what single news story everyone in America wakes up to, of course.
But they do decide whether it will be an article from a reputable outlet or a clip from “The Daily Show,” a photo from a friend’s wedding or an incendiary call to kill others.
Mark knows that this is too much power and is pursuing a twofold strategy to mitigate it.
First, he is pivoting Facebook’s focus toward encouraging more private, encrypted messaging that Facebook’s employees can’t see, let alone control.
Second, he is hoping for friendly oversight from regulators and other industry executives.
Late last year, he proposed an independent commission to handle difficult content moderation decisions by social media platforms.
It would afford an independent check, Mark argued, on Facebook’s decisions, and users could appeal to it if they disagreed.
But its decisions would not have the force of law, since companies would voluntarily participate.
In an op-ed essay in The Washington Post in March, he wrote, “Lawmakers often tell me we have too much power over speech, and I agree.”
And he went even further than before, calling for more government regulation — not just on speech, but also on privacy and interoperability, the ability of consumers to seamlessly leave one network and transfer their profiles, friend connections, photos and other data to another.
I don’t think these proposals were made in bad faith.
But I do think they’re an attempt to head off the argument that regulators need to go further and break up the company.
Facebook isn’t afraid of a few more rules.
It’s afraid of an antitrust case and of the kind of accountability that real government oversight would bring.
We wouldn’t trust calcified rules or voluntary commissions to regulate drug companies, health care companies, car manufacturers or credit card providers.
Agencies oversee these industries to ensure that the private market works for the public good.
In these cases, we all understand that government isn’t an external force meddling in an organic market; it’s what makes a dynamic and fair market possible in the first place.
This should be just as true for social networking as it is for air travel or pharmaceuticals.
In the summer of 2006, Yahoo offered us $1 billion for Facebook.
I desperately wanted Mark to say yes.
Even my small slice of the company would have made me a millionaire several times over.
For a 22-year-old scholarship kid from small-town North Carolina, that kind of money was unimaginable.
I wasn’t alone — just about every other person at the company wanted the same.
It was taboo to talk about it openly, but I finally asked Mark when we had a moment alone,
“How are you feeling about Yahoo?” I got a shrug and a one-line answer:
“I just don’t know if I want to work for Terry Semel,” Yahoo’s chief executive.
Outside of a couple of gigs in college, Mark had never had a real boss and seemed entirely uninterested in the prospect.
I didn’t like the idea much myself, but I would have traded having a boss for several million dollars any day of the week.
Mark’s drive was infinitely stronger.
Domination meant domination, and the hustle was just too delicious.
Mark may never have a boss, but he needs to have some check on his power.
The American government needs to do two things:
break up Facebook’s monopoly and regulate the company to make it more accountable to the American people.
First, Facebook should be separated into multiple companies.
The F.T.C., in conjunction with the Justice Department, should enforce antitrust laws by undoing the Instagram and WhatsApp acquisitions and banning future acquisitions for several years.
The F.T.C. should have blocked these mergers, but it’s not too late to act.
There is precedent for correcting bad decisions — as recently as 2009, Whole Foods settled antitrust complaints by selling off the Wild Oats brand and stores that it had bought a few years earlier.
There is some evidence that we may be headed in this direction.
Senator Elizabeth Warren has called for reversing the Facebook mergers, and in February, the F.T.C. announced the creation of a task force to monitor competition among tech companies and review previous mergers.
How would a breakup work?
Facebook would have a brief period to spin off the Instagram and WhatsApp businesses, and the three would become distinct companies, most likely publicly traded.
Facebook shareholders would initially hold stock in the new companies, although Mark and other executives would probably be required to divest their management shares.
Until recently, WhatsApp and Instagram were administered as independent platforms inside the parent company, which should make the process easier.
But time is of the essence:
Facebook is working quickly to integrate the three, which would make it harder for the F.T.C. to split them up.
Some economists are skeptical that breaking up Facebook would spur that much competition, because Facebook, they say, is a “natural” monopoly.
Natural monopolies have emerged in areas like water systems and the electrical grid, where the price of entering the business is very high — because you have to lay pipes or electrical lines — but it gets cheaper and cheaper to add each additional customer.
In other words, the monopoly arises naturally from the circumstances of the business, rather than a company’s illegal maneuvering.
In addition, defenders of natural monopolies often make the case that they benefit consumers because they are able to provide services more cheaply than anyone else.
Facebook is indeed more valuable when there are more people on it:
There are more connections for a user to make and more content to be shared.
But the cost of entering the social network business is not that high.
And unlike with pipes and electricity, there is no good argument that the country benefits from having only one dominant social networking company.
Still others worry that the breakup of Facebook or other American tech companies could be a national security problem.
Because advancements in artificial intelligence require immense amounts of data and computing power, only large companies like Facebook, Google and Amazon can afford these investments, they say.
If American companies become smaller, the Chinese will outpace us.
While serious, these concerns do not justify inaction.
Even after a breakup, Facebook would be a hugely profitable business with billions to invest in new technologies — and a more competitive market would only encourage those investments.
If the Chinese did pull ahead, our government could invest in research and development and pursue tactical trade policy, just as it is doing today to hold China’s 5G technology at bay.
The cost of breaking up Facebook would be next to zero for the government, and lots of people stand to gain economically.
A ban on short-term acquisitions would ensure that competitors, and the investors who take a bet on them, would have the space to flourish.
Digital advertisers would suddenly have multiple companies vying for their dollars.
Even Facebook shareholders would probably benefit, as shareholders often do in the years after a company’s split.
The value of the companies that made up Standard Oil doubled within a year of its being dismantled and had increased by fivefold a few years later.
Ten years after the 1984 breakup of AT&T, the value of its successor companies had tripled.
But the biggest winners would be the American people.
Imagine a competitive market in which they could choose among one network that offered higher privacy standards, another that cost a fee to join but had little advertising and another that would allow users to customize and tweak their feeds as they saw fit.
No one knows exactly what Facebook’s competitors would offer to differentiate themselves.
That’s exactly the point.
The Justice Department faced similar questions of social costs and benefits with AT&T in the 1950s.
AT&T had a monopoly on phone services and telecommunications equipment.
The government filed suit under antitrust laws, and the case ended with a consent decree that required AT&T to release its patents and refrain from expanding into the nascent computer industry.
This resulted in an explosion of innovation, greatly increasing follow-on patents and leading to the development of the semiconductor and modern computing.
We would most likely not have iPhones or laptops without the competitive markets that antitrust action ushered in.