Can Mark Zuckerberg manipulate Facebook to influence tight elections for pro-amnesty candidates? In an article in the New Republic, which is published by Facebook co-founder and former Obama campaign operative Chris Hughes, Harvard law professor Jonathan Zittrain argues that he can.
In 2010, Facebook worked with political scientists on an experiment to determine whether it could prod people to vote. The prompt contained a “link for looking up polling places, a button to click to announce that you had voted, and the profile photos of up to six Facebook friends who had indicated they’d already done the same.” And that graphic was “planted in the newsfeeds of tens of millions of users.”
Zittrain, who also teaches computer science, describes how “the researchers cross-referenced their subjects’ names with the day’s actual voting records from precincts across the country to measure how much their voting prompt increased turnout.”
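Mechanically, that validation step amounts to matching the experiment’s subject rosters against public voting records and then comparing turnout rates between groups. A minimal sketch of the idea, with invented names and toy data rather than anything from the actual study:

```python
# Toy reconstruction of the measurement: match experiment subjects to
# public voting records, then compare turnout between the group shown
# the prompt and the control group. All data here is invented.

voted = {"Ann Smith", "Cal Reyes", "Dee Wu"}  # names found in precinct records

subjects = [
    {"name": "Ann Smith", "group": "prompt"},
    {"name": "Bob Jones", "group": "prompt"},
    {"name": "Dee Wu",    "group": "prompt"},
    {"name": "Cal Reyes", "group": "control"},
    {"name": "Eve Park",  "group": "control"},
]

def turnout(group: str) -> float:
    """Share of a group whose names appear in the voting records."""
    members = [s for s in subjects if s["group"] == group]
    return sum(s["name"] in voted for s in members) / len(members)

lift = turnout("prompt") - turnout("control")
print(f"turnout lift: {lift:.2%}")  # the study's real-world figure was 0.39%
```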
The result? Facebook mobilized nearly 400,000 voters:
Overall, users notified of their friends’ voting were 0.39 percent more likely to vote than those in the control group, and any resulting decisions to cast a ballot also appeared to ripple to the behavior of close Facebook friends, even if those people hadn’t received the original message. That small increase in turnout rates amounted to a lot of new votes. The researchers concluded that their Facebook graphic directly mobilized 60,000 voters, and, thanks to the ripple effect, ultimately caused an additional 340,000 votes to be cast that day. As they point out, George W. Bush won Florida, and thus the presidency, by 537 votes–fewer than 0.01 percent of the votes cast in that state.
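Some quick arithmetic (illustrative only; the audience sizes below are hypothetical) shows why a fraction-of-a-percent lift matters at Facebook’s scale: a 0.39 percent turnout bump overwhelms a 537-vote margin long before the audience reaches “tens of millions.”

```python
# Back-of-the-envelope figures. The 0.39% lift and 537-vote margin come
# from the article; the audience sizes are assumptions for illustration.
lift = 0.0039        # 0.39% turnout increase among prompted users
margin = 537         # Bush's 2000 winning margin in Florida

# Targeted users needed for the lift alone to exceed that margin:
print(round(margin / lift))        # ~137,692

# Expected extra votes from prompting a hypothetical 10 million users:
print(round(10_000_000 * lift))    # 39,000
```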
Zittrain then describes how Zuckerberg could use what he calls “digital gerrymandering” to actually tip a tight election:
Suppose that Mark Zuckerberg personally favors whichever candidate you don’t like. He arranges for a voting prompt to appear within the newsfeeds of tens of millions of active Facebook users–but unlike in the 2010 experiment, the group that will not receive the message is not chosen at random. Rather, Zuckerberg makes use of the fact that Facebook “likes” can predict political views and party affiliation, even beyond the many users who proudly advertise those affiliations directly. With that knowledge, our hypothetical Zuck chooses not to spice the feeds of users unsympathetic to his views. Such machinations then flip the outcome of our hypothetical election.
Zittrain says that digital gerrymandering, which “occurs when a site instead distributes information in a manner that serves its own ideological agenda,” is actually “possible on any service that personalizes what users see or the order in which they see it, and it’s increasingly easy to effect.”
“They can be subtly tweaked without hazarding the same backlash,” he says of newsfeeds. “Indeed, in our get-out-the-vote hypothetical, the people with the most cause for complaint are those who won’t be fed the prompt and may never know it existed.”
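The mechanism Zittrain sketches is easy to express in code. The following is a hypothetical illustration: the likes-based classifier and feed builder are invented stand-ins, not a real Facebook API, but they show how withholding a prompt from users predicted to be unsympathetic leaves no visible trace for them to object to.

```python
# Hypothetical sketch of "digital gerrymandering." The classifier and
# feed builder below are invented for illustration, not Facebook code.

FAVORED = "pro-amnesty"

# Toy proxy for "likes can predict political views": pages assumed,
# purely for illustration, to signal the favored position.
FAVORED_PAGES = {"FWD.us", "Immigration Reform Now"}

def predict_leaning(user: dict) -> str:
    """Crude stand-in for a model inferring affiliation from likes."""
    return "pro-amnesty" if FAVORED_PAGES & set(user["likes"]) else "other"

def build_feed(user: dict, stories: list[str]) -> list[str]:
    feed = list(stories)
    if predict_leaning(user) == FAVORED:
        # Sympathetic users get the turnout nudge at the top of the feed...
        feed.insert(0, "VOTING PROMPT: polling places + 'I Voted' button")
    # ...while everyone else sees an ordinary feed and, as Zittrain notes,
    # "may never know it existed."
    return feed

alice = {"likes": ["FWD.us", "Cooking"]}   # predicted sympathetic: nudged
bob = {"likes": ["Cooking"]}               # predicted unsympathetic: not
print(build_feed(alice, ["story"]))  # ['VOTING PROMPT: ...', 'story']
print(build_feed(bob, ["story"]))    # ['story']
```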
High-tech giants are becoming more active in politics, as evidenced by Google’s campaign against SOPA (the Stop Online Piracy Act). And Zuckerberg’s FWD.us group has invested millions and has been one of the biggest players in trying to pass amnesty legislation that would also increase the number of high-tech visas. FWD.us, for instance, ran deceptive ads this election cycle touting pro-amnesty candidate Rep. Renee Ellmers (R-NC) as someone who opposed amnesty. She survived her primary–and FWD.us gloated about it. Zuckerberg’s group has also attacked Rep. Steve King (R-IA) on the airwaves for opposing amnesty. Facebook’s position on amnesty legislation is clear, and it is looking for a return on its investment in getting a bill passed.
Zittrain suggests using tax breaks or legal immunities to entice Web companies entrusted with personal data and preferences to act as “information fiduciaries”–like doctors and lawyers handling sensitive information–that would “forswear any formulas of personalization derived from their own ideological goals.”
He writes:
My search results and newsfeed might still end up different from yours based on our political leanings, but only because the algorithm is trying to give me what I want–the way that an investment adviser may recommend stocks to the reckless and bonds to the sedate–and never because the search engine or social network is trying to covertly pick election winners.
In the meantime, Zittrain notes that the “disclosure policies of social networks and search engines already state that the companies reserve the right to season their newsfeeds and search results however they like,” and “an effort to sway turnout could be construed as being covered by the existing agreements and require no special notice to users.”
In other words, there is nothing stopping Facebook from using “machinations” to try to “flip” elections for preferred candidates if it wanted to do so.