What If We Didn’t... have Facebook?
“What If We Didn’t...?” is a new series from Mic — read more here.
Calm and poker-faced, Facebook CEO Mark Zuckerberg launched into a long-winded talking point during his April 10 testimony before Congress. As he stumbled through a seemingly scripted answer to the question “Who’s your biggest competitor?”, Sen. Lindsey Graham (R-S.C.) interrupted to ask the tech billionaire directly whether Facebook wields the unchecked power of a monopoly.
“It certainly doesn’t feel like that to me,” Zuckerberg responded. The audience roared with laughter.
Zuckerberg is now troubleshooting Facebook’s most contentious scandal yet. His 14-year-old social media platform has been plagued by a data leak that could be the premise of a dystopian novel. It’s widely known that the platform allowed Russian propagandists to buy up political ads designed to sway voters to favor then-presidential candidate Donald Trump in the 2016 election. But now, thanks to investigations from the Guardian, the New York Times and Channel 4, we know Facebook’s role in American politics reaches even further.
The 33-year-old was in the congressional hot seat because a single individual was able to gather and abuse roughly 87 million Facebook users’ private information, which was ultimately used to serve messaging like “drain the swamp” to individuals who, “psychographic” algorithms believed, would find it appetizing.
In 2013, University of Cambridge academic Aleksandr Kogan created a personality quiz app that harvested Facebook information from nearly 300,000 users who were paid to take the quiz, along with data from many of their Facebook friends, eventually snowballing into tens of millions of profiles. The researcher’s access was not unusual, but he later took that information to British political consulting firm Cambridge Analytica, which was hired by the Trump campaign.
In 2015, journalists at the Guardian reported that Cambridge Analytica had a copy of the user data on its servers. Facebook quickly demanded that Kogan and Cambridge Analytica delete the data. Cambridge Analytica said it had, and that was seemingly the end of it, until March 17, when reports that the data had never been fully deleted shattered Facebook’s image of having everything under control.
Suddenly, Facebook publicly looked like a gold-plated, leeching, surveilling behemoth capable of turning politics as we know it upside down. At this point, one thing is absolutely undeniable: Facebook is more than a tech company. It’s an institution, and the public is beginning to hold it accountable with checks and balances of its own.
So, as it continues to be the gatekeeper for more than 2 billion active users’ data, one has to wonder: What if we, and not Facebook, were the ones in the digital driving seat? What if we bought out social media platforms from the Zuckerbergs of the world and owned them ourselves, or simply demanded to know how our data is being used? What if we even owned and could sell our data for personal profit, treating it as an income stream, instead of handing it to tech firms?
What if Facebook didn’t exist anymore? What if there were no more elusive, monopolistic mediators controlling how we connected with our friends?
“The danger here is in how little we actually know about what’s going on.”
As Zuckerberg will often tell you, he started Facebook in his Harvard University dorm room in 2004. That tale, told with his meme-evoking, earnest-and-yet-robotic expression, has become what the New Yorker called his “dorm room defense.” It’s perhaps a way of dispelling the image that Facebook users, now accounting for nearly 30% of the world population, are all part of an unwieldy social experiment even Zuckerberg has trouble reining in.
In other words, it’s a deflection of accountability.
But in truth, privacy is an age-old woe for Americans, yet one with a paradoxically vague definition, which makes it an incredibly hard standard to enforce on platforms like Facebook.
Privacy is enshrined in the United Nations’ Universal Declaration of Human Rights, making it a matter of international law. Even so, it’s still unclear what protections are required for our “likes” and our user behavior, even when they end up scandalously shaping major political campaigns.
“I don’t think we’ve even approached a consensus on what our new definition of privacy should even be,” Scott J. Shackelford, chair of Indiana University Bloomington’s cybersecurity program, said in an interview. “Privacy is one of those amorphous concepts that a lot of us say we care about, but when it comes down to it, we really don’t even do the basic stuff that would be required to show that we really do care about it.”
For the last 127 years, the U.S. has largely considered privacy the “right to be let alone.” Those words come from an 1890 article in the Harvard Law Review written by two attorneys, one of whom later became a Supreme Court Justice. They wrote the definition because at the time, people were concerned about the way new technology like photography and sensational reporting could intrude on their lives. Sound familiar?
Since then, privacy concerns have come in and out of vogue. In the 1920s, buzz about wiretapping and nosy telephone operators worried many citizens, as did recording and computing technology in the 1970s. Most recently, the National Security Agency scandal in 2013 showed how widely technology can be used to surveil us — and the following year, Facebook started tapping into user data collected from outside of its platform to better target advertising.
Yet here we are, still fairly complacent as a whole.
Privacy issues have become much more complex since that definition was written. Data has transformed from newspaper clippings and photographs to the far less tangible: We can no longer set data on fire, let alone be sure we’ve actually deleted it. Far too often, we don’t even know what our data looks like, what it’s being used for or where it’s being stored. And for those who have “nothing to hide,” the Cambridge Analytica revelations showed that even seemingly irrelevant personal information can be used for manipulative purposes.
“A lot of the danger here is in how little we actually know about what’s going on,” Nathan Schneider, author of Everything for Everyone: The Radical Tradition That Is Shaping the Next Economy and assistant professor of media studies at the University of Colorado, said in an interview. “The potential uses for data that is being collected at this moment may be very different five years from now.”
So, in an age of elusive data, a lack of clear-cut privacy standards makes platforms like Facebook regulation dead zones, where transparency is left up to the will of for-profit technology giants. (Zuckerberg himself has even admitted it might be time for government regulation.) Meanwhile, users are vulnerable to security breaches, hacks or the general abuse of their information, and protecting their own privacy becomes a difficult burden of personal choice.
“I teach a class on hacker culture, where we spend hours trying to help ourselves get more secure and more aware of what data is going where,” Schneider said. “It’s really not a fair expectation to require people to figure this out for themselves. It takes a lot of effort to have some control over your data, and so I think we really need to approach this issue as a matter of social concern.”
In 2017, the Republican-controlled Congress voted to repeal internet privacy rules that would’ve limited how internet service providers such as Verizon, Comcast and AT&T use customers’ browsing habits and other personal information. Revoking those protections means ISPs can sell users’ internet browsing histories to other companies without their permission.
That’s something Facebook doesn’t even do.
So, as it stands, our world looks something like this: Facebook is one of many gatekeepers of our data, but we users are the gatekeepers of our own ill-defined privacy. With no legally enforceable requirements to hold tech giants accountable, there’s no guarantee these matters will actually change for the better.
After all, the scandal has so far been relatively inconsequential. Cambridge Analytica has denied doing anything illegal, though at least one watchdog group has filed a complaint accusing the firm of breaking election laws. Facebook continues to legally sell advertising that targets users based on their own data, a revenue stream that netted the company $39.9 billion in 2017.
As it stands, Facebook’s main legal threat is a Federal Trade Commission investigation into whether the company violated a consent decree it agreed to in 2011. If the FTC finds a violation, Facebook could be fined as much as $40,000 per user, per day of violation.
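To put that penalty in perspective, here is a rough, purely illustrative back-of-envelope calculation. It simply multiplies the two figures cited above and should be read as a theoretical ceiling, not an estimate of what regulators would actually levy.

```python
# Hypothetical worst-case arithmetic using the figures cited in this article.
affected_users = 87_000_000          # roughly 87 million users whose data was harvested
max_fine_per_user_per_day = 40_000   # cited maximum FTC penalty per user, per day

ceiling = affected_users * max_fine_per_user_per_day
print(f"${ceiling:,} per day")  # $3,480,000,000,000 per day, about $3.5 trillion
```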
“Cambridge Analytica may have a legacy, in terms of turning the page and opening up a new chapter … It seems like now we are asking the big, tough questions,” Shackelford said. “But the only way to show we really care about this is by making it, frankly, more of an issue.”
“We’re giving it away, and we don’t have to.”
The situation doesn’t have to be like this. In fact, it isn’t like this for much of the world — particularly Europeans, who have more agency over their personal information thanks to recently enacted laws.
In 2016, the European Union signed off on the General Data Protection Regulation, which will be fully enforceable by May 25 of this year. Under GDPR, any business or entity that operates in the EU or provides services to EU citizens — including Facebook — must be upfront about when, why and how they are collecting and using their consumers’ personal information. Individuals can access their own data and also request that it be erased, altered or stored but not used.
“You have the right to decide what kind of data you share and under what circumstances that happens,” Shackelford said. “And if you don’t like your service and don’t want that data up anymore, they generally have to delete it. There’s nothing shocking here; this isn’t some kind of pie-in-the-sky vision that we’re talking about. This is the reality for hundreds of millions of people around the world.”
Facebook is supposed to adapt its platform to comply with these regulations. During his testimony before Congress, Zuckerberg verbally committed to making those same controls available to users worldwide. It’s worth noting that’s not the same as enforcing the standards themselves (which is arguably lawmakers’ job). Nevertheless, it’s clear Zuckerberg was at least willing to talk about extending GDPR protections to all Facebook users: A photograph of the typed notes he brought to his testimony included talking points like “support privacy legislation that is practical” and “don’t say we already do what GDPR requires.”
“The European Union has been far ahead of us in all of this,” Schneider said. “The Republican Party is aligned with the [ISPs] and the Democratic Party has largely been aligned with the big platform monopolies — the Facebooks and the Googles. So both parties are in bed with major monopolistic actors in this economy, and we’ve been really slow to see the conversation develop about how we can challenge their power.”
“Until we take this more global approach, I’m kind of worried about where we’re headed,” Shackelford added. “We’re giving it away, and we don’t have to.”
In an alternate universe — or perhaps in the near future — consumers could truly own their data or sell it for a profit. They could even own the social media platforms they use.
Facebook might provide its services for free, but it made an average of $20.21 per user in 2017, primarily from using their data to sell targeted advertising. Imagine a world where each user made that money instead. Though it’s certainly not a fortune, it’s definitely more than zero.
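As a rough sanity check on that figure (a simplification, since user counts grew throughout the year and Facebook reports average revenue per user quarterly), dividing the ad revenue cited above by roughly 2 billion users lands in the same ballpark:

```python
# Back-of-envelope check, not Facebook's own accounting methodology.
ad_revenue_2017 = 39.9e9   # dollars, the 2017 advertising revenue cited above
approx_users = 2.0e9       # roughly 2 billion active users

print(f"${ad_revenue_2017 / approx_users:.2f} per user")  # about $19.95, in line with ~$20
```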
“Private data is the gold of the information age,” Shackelford said. “If we collectively start to say, ‘We’re not OK with that,’ and start to put a price tag on it, companies will pay you for it. This could be an income stream especially for people who are really active online. You’re valuable. Treat yourself that way.”
If you can’t beat ’em, join ’em?
New York startup DataCoup allows users to connect their credit card information, social media accounts and even their activity trackers to an online marketplace. DataCoup then creates a data profile and makes it available to potential buyers. When the data is purchased, users get paid. The sum is far from impressive — in the beginning, DataCoup capped users at $10 a month — but it’s not too different from DataWallet, a similar startup with a comparable average payout.
There’s also TheGoodData, which sells users’ information and donates the value to charity.
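To make that model concrete, here is a minimal, purely hypothetical sketch of the user-payout flow described above: a user links data sources, their profile is sold to buyers, and they receive a share of each sale, capped at $10 a month. None of the names, rates or numbers below come from DataCoup, DataWallet or TheGoodData; they are illustrative assumptions only.

```python
# Hypothetical sketch of a data-marketplace payout flow; not any company's real system.
MONTHLY_CAP = 10.00  # illustrative cap, echoing the early $10-a-month limit cited above

def monthly_payout(sale_prices, user_share=0.5):
    """Pay the user a share of each sale of their data profile, up to a monthly cap.

    sale_prices: amounts buyers paid for the user's profile this month (assumed values).
    user_share: fraction of each sale passed on to the user (an assumption, not a real rate).
    """
    earned = sum(price * user_share for price in sale_prices)
    return min(earned, MONTHLY_CAP)

# Example: three hypothetical sales of a linked profile in one month.
print(monthly_payout([4.00, 6.50, 12.00]))  # 10.0, capped at the monthly limit
```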
Selling user data may sound like an “if you can’t beat ’em, join ’em” approach to a larger conversation about privacy. But it also sends an arguably symbolic message: Users value what they bring to social media platforms.
“There are startups out there trying to put users in the driving seat and allowing them to sell access to their own private data streams,” Shackelford said. “It’s feasible — and technologically, it’s absolutely doable.”
Others are calling for a cooperative model, where users could collectively own a social networking platform like Facebook.
Schneider, who started a cooperative digital space called social.coop on Mastodon’s network, calls it “smooshing together the tradition of cooperative business with online platforms.” The idea is that rather than surrendering their data and communications to the whims of a centralized power, users can govern themselves by collaborating on their own policies.
Schneider compared that shared ownership model to a credit union or the Associated Press’ beginnings: “A credit union — at the end of the day, when they have to make a hard decision, you know that their owners are their customers and they’re going to side with [them], whereas Bank of America or Wells Fargo has to align with its investors,” he said. “And because [the AP] is owned by a variety of media companies such as Fox News to the New York Times to my local paper, it hasn’t fallen victim to the polarization that has affected many other media outlets in the country.”
It’s highly unlikely this idea would ever translate to a behemoth like Facebook, which is valued at around $489 billion as of April 17. Zuckerberg has said he’s open to discussing both free and paid versions of the social media site, but a paid tier could disadvantage those who can’t afford it. A more modest step toward a cooperative model, Schneider said, could be Zuckerberg selling some of his Facebook stock to users, rather than dramatically surrendering power.
“Maybe users wouldn’t end up controlling the whole company, but maybe they would end up with a board seat or two,” Schneider said. “Maybe they would end up with representatives specifically accountable to them. So they can really at least be at the table in shaping the future of the company, without having to buy out all the big investors at market value.”
Complacency could be the killer
The Cambridge Analytica fiasco has been characterized as Facebook’s “fall from grace” by some, while others have boldly questioned whether we’re witnessing the “beginning of the end.” Those aren’t entirely unfair assessments. Since March 16 — the day before the scandal first hit newswires — Facebook has lost around $48 billion in stock market value. Yet some experts are still not convinced the platform will take a lasting hit.
“They’re too big to fail. When you have 2 billion or more users, you can lose people and you’re still going to be a very profitable business,” Nora Ganim Barnes, director of the Center for Marketing Research at the University of Massachusetts Dartmouth, said in an interview. “What are they going to lose? 10%? 15%? It’s still a huge, huge company.”
Ultimately, complacency could be the killer of any public crusade against Facebook. In a 2017 survey of Fortune 500 companies, Ganim Barnes found that privacy wasn’t their biggest concern when it came to social media. (Granted, this was before news of Cambridge Analytica broke.) Roughly 33% of the companies surveyed said privacy worried them, down from 42% in 2016.
“Privacy did make that list, but return on investment beat it out by a long shot,” Ganim Barnes said.
Even so, a lot of good could come from a data privacy revolution of some kind. The smallest act of resistance could pass some dollars on to users or raise revenue for charities. And however improbable, Americans seizing their data or demanding protective legislation would send a clear message to social media platforms that they are our messengers, not our informants. It would take our increasingly clear future of Orwellian surveillance and political meddling and knock it off-kilter.
“We have an absolute human right to privacy, but it’s never been updated for our digital age,” Shackelford said. “I think it’s about time we do that.”