Thursday, 14 August 2014

How Facebook Can Manipulate the November Election

It was the focus of the film The Social Network, but critics say that Facebook has become a political network — one that is two-faced and facing left.

Facebook founder Mark Zuckerberg has long been pushing amnesty and targeting politicians who oppose it. Now, warns American Thinker’s Jonathon Moseley, Facebook is going so far as to block Republican activism from its website and may use social-network manipulation to turn out the vote for liberal candidates.

In a piece entitled “The Facebook Factor in the GOP's Battle to Take the Senate,” Moseley writes that he started an “Event” on his Facebook page to raise money for Republican Kevin Wade, who is running for U.S. Senate in Delaware against Democratic incumbent Chris Coons. Coons is vulnerable, and putting his seat in play improves the GOP’s senatorial fortunes in November. But, Moseley reports, making Wade’s bid realistic requires that the candidate raise at least one million dollars through September. This is why what Moseley reports next is eyebrow-raising:

Monday night, Facebook deleted the “Event” this author created: “KEVIN WADE: Don't Hope for Change: Vote for It: $25 from 3,000 for US Senate, Delaware, Republican, by August 20 FEC reporting deadline.” The common function — routinely used by liberals — allows a user to invite his own “friends” (in Facebook terminology) to join in one’s Event. Asking other friends to donate to a cause or a candidate is widely accepted and practiced ... but apparently not for Republicans. The “Event” was deleted, meaning that this author’s over 4,937 Facebook friends will not receive this author’s personal invitation for them to donate to Kevin Wade.

... Facebook is very commonly used — overwhelmingly by liberals — to promote a “page” or a “group” dedicated to a candidate or campaign, or to hold an “event” for an activity or fund-raising push to benefit candidates. Ron Paul supporters aggressively used such techniques for “money bombs” in which everyone is asked to donate a small amount on the same day. News of an impressive sum donated all on the same day has been an effective technique going back years, particular [sic] for the well-organized Paul movement.

Apparently, though, it’s not a technique available to candidates in disfavor with Facebook. Don’t expect an explanation from the Web behemoth, either. The company is notoriously hard to contact; as Moseley explains, even relatives of the deceased complain that they can’t get their loved ones’ pages deactivated.

There is a situation, however, where Zuckerberg and his underlings are absolutely on the ball: the matter of manipulating votes. In a June 3 Breitbart article whose title asks “Can Zuckerberg Use Facebook to Tip Elections for Pro-Amnesty Candidates?” writer Tony Lee answers in the affirmative. Reporting on, and quoting, an article by Harvard law and computer science professor Jonathan Zittrain, Lee writes, “In 2010, Facebook conducted an experiment with political scientists to determine if it could prod people to vote. Facebook contained a ‘link for looking up polling places, a button to click to announce that you had voted, and the profile photos of up to six Facebook friends who had indicated they’d already done the same.’ And that graphic was ‘planted in the newsfeeds of tens of millions of users.’” What was the result? As Zittrain reported in the New Republic,

Overall, users notified of their friends’ voting were 0.39 percent more likely to vote than those in the control group, and any resulting decisions to cast a ballot also appeared to ripple to the behavior of close Facebook friends, even if those people hadn’t received the original message. That small increase in turnout rates amounted to a lot of new votes. The researchers concluded that their Facebook graphic directly mobilized 60,000 voters, and, thanks to the ripple effect, ultimately caused an additional 340,000 votes to be cast that day. As they point out, George W. Bush won Florida, and thus the presidency, by 537 votes — fewer than 0.01 percent of the votes cast in that state.

What could be wrong, though, with increasing voter participation? First, some would point out that people who are Facebook-oriented — people who spend more time on social networks — might be inordinately liberal. Others, such as John Stossel and I, have asserted that encouraging generally apathetic, disengaged citizens who wouldn’t otherwise vote to cast ballots simply increases the number of low-information voters at election time. Yet there is another problem, as Zittrain explains:

Now consider a hypothetical, hotly contested future election. Suppose that Mark Zuckerberg personally favors whichever candidate you don’t like. He arranges for a voting prompt to appear within the newsfeeds of tens of millions of active Facebook users — but unlike in the 2010 experiment, the group that will not receive the message is not chosen at random. Rather, Zuckerberg makes use of the fact that Facebook “likes” can predict political views and party affiliation, even beyond the many users who proudly advertise those affiliations directly. With that knowledge, our hypothetical Zuck chooses not to spice the feeds of users unsympathetic to his views. Such machinations then flip the outcome of our hypothetical election. Should the law constrain this kind of behavior?

Professor Zittrain calls this “digital gerrymandering”: a website disseminating information in a way that promotes its own political agenda. Any site can do this, and it’s quite insidious because Web users will generally be unaware of the manipulation. After all, would you know if a social network simply didn’t send you a voting graphic that a certain targeted group received? (I rarely even view my Facebook page.) Zittrain’s article title says it all: “Facebook Could Decide an Election Without Anyone Ever Finding Out.”

But political Internet manipulation has long been occurring without anyone, or at least without most people, finding out. Years ago, Google censored sites such as The New Media Journal and The Jawa Report from its news search — a traffic-hobbling move that can send a site to a virtual Siberia — after accusing them of “hate speech” for criticizing Islam. Social commentators Dr. David Yeagley and Amil Imani had their MSN Hotmail accounts terminated for the same reason. And Google’s subsidiary YouTube once deleted an Islam exposé by commentator Michelle Malkin and has pulled or suppressed the traffic of other politically incorrect videos — all while purporting to be an open marketplace for ideas.

So will the next election be won via social-network sleight of hand? Tragically, you may never know.
