In the world of social media, Facebook and Twitter are the twin Goliaths. They dominate the social media market. They also systematically censor posts and use manipulative tactics to influence the ways users think, interact with each other, and even vote in elections. But now, thanks to a love of free speech and the power of alt-tech, a new David has arrived on the scene to compete with those twin Goliaths. The new social media platform — ONEWAY — is prepared to become the platform for those who love free speech.
ONEWAY — which was launched in early January — is described as “The free speech and human friendly alternative to Google, Facebook, Twitter, Youtube, Reddit, Imgur, and Patreon.” While that may seem like a tall order, the site boasts nearly all of the features of those sites and more. The “features” that are present in those platforms but absent from ONEWAY are those that diminish privacy and free speech and aid in surveillance.
ONEWAY has succeeded in pulling together everything there is for a freedom-loving user to like about those platforms while leaving out all the rest. The end result is one-stop social media enjoyment without the creepy surveillance and social manipulation practiced by the people operating those sites.
In an exclusive interview with The New American, Derek Peterson, founder and CEO of ONEWAY, shared some things that set ONEWAY apart and explained his reasons for creating the social media platform and taking on those platforms that routinely harvest data, censor posts, and manipulate users.
Perhaps the biggest thing ONEWAY does differently from other social media sites is to forgo the algorithms that decide for users what they do and do not see on the site. Peterson explained, “We show all the content on the website — one hundred percent.” So everything posted by anyone a user is following will show in their feed. This prevents the standard social manipulation that is par for the course with other social media platforms.
“You have a lot of power and controls to filter your feed,” he said, “because it’s a lot of content when you see every single post from every single person you’re following.” You can filter your feed by topic, user, or words. So if you never want to see videos of cute kittens wrestling on the dining room table, you just select to block the “Cute and awww” category. If there is a user you want to continue following, but want to take a break from, you can mute him for now and unmute him later. If certain words bother you (this is, after all, a free speech platform and that means no censorship), you can block any post containing those words.
The point, though, is that you decide what is in your feed; ONEWAY does not decide that for you.
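For readers who think in code, the user-side filtering described above can be sketched in a few lines. This is purely an illustration, not ONEWAY's actual implementation (which has not been published); every name here — `Post`, `FeedFilter`, the field names — is hypothetical:

```python
# Hypothetical sketch of user-controlled feed filtering.
# ONEWAY has not published its implementation; all names are invented.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    category: str
    text: str

@dataclass
class FeedFilter:
    blocked_categories: set = field(default_factory=set)
    muted_users: set = field(default_factory=set)
    blocked_words: set = field(default_factory=set)

    def allows(self, post: Post) -> bool:
        # Every post from every followed account is shown unless the
        # user's own filters exclude it; the platform itself never
        # ranks or hides content.
        if post.category in self.blocked_categories:
            return False
        if post.author in self.muted_users:
            return False
        words = post.text.lower().split()
        return not any(w in words for w in self.blocked_words)

posts = [
    Post("alice", "Cute and awww", "kittens wrestling on the table"),
    Post("bob", "Politics", "a long election thread"),
]
# The user, not the platform, decides: block one category, keep the rest.
f = FeedFilter(blocked_categories={"Cute and awww"})
feed = [p for p in posts if f.allows(p)]
```

The design point the sketch makes is the one Peterson describes: the default is to show everything, and every exclusion rule belongs to the user.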
Peterson offered an example of how Facebook, Twitter, and others are able to alter the way people — especially young people who are more easily influenced by what they perceive as the prevailing opinions of their peers — think about controversial topics. Besides shadow-banning, where users’ posts go into the ether never to be seen by anyone, Facebook, Twitter, and others are able to use their filtering algorithms to determine who sees a post. As Peterson explained, that is a tool for social manipulation. “Aside from the obvious issue of censorship, it’s the manipulation of people that I think is the most offensive and subtle and most dangerous — where they’re trying to manipulate people into believing the way they want you to believe.”
How can an algorithm alter a person’s basic belief system? Peterson said it’s simple. “A teenage girl goes on [social media] and says, ‘I think transgenderism is wrong.’ Facebook is going to either squelch that post or share it only with people who disagree with her, so she’s either going to get a bunch of negative feedback or just radio silence, nothing, dead air.” The result of either being ignored (because the post was dropped down the silence hole) or taken out behind the digital woodshed by her peers (because the only people Facebook feeds the post to are those who the company knows will disagree) is that “she’s going to feel all alone in her ideas, on an island, like she’s the only one out there that thinks like that.” The flip-side of that, though, is that “if someone else posts something positive about transgenderism,” the response is “that’s great, I agree.” Given enough of that treatment, “she might see that and start subtly being influenced.”
Peterson is not out on a limb with that scenario. Facebook has been caught doing exactly this type of thing on several occasions. As this writer reported in July 2015, Facebook’s “Celebrate Pride” rainbow profile picture filter was a social/psychological experiment with more than 26 million unwitting subjects, though Facebook denied it. From that article:
Setting aside the merely circumstantial evidence, perhaps the greatest reason to disbelieve Facebook's denial in this case is a paper written on the subject that was co-authored by a data scientist at Facebook itself. The paper, published by Facebook in March of this year, discusses the fact that in March of 2013, three million Facebook users changed their profile picture to an "equals" sign and that Facebook looked at the metrics of those users to determine the reasoning behind that change. The report examines the fact that the main dynamics impacting whether users adopted the equals sign as their profile picture were how many of their friends had already done so, the demographics of the users, and how often those users typically change their profile pictures. The report concluded that "the probability of adoption [of the equals sign as a profile picture] depends on both the number of friends and the susceptibility of the individual."
It seems that Facebook decided to take this experiment to the next level to see whether "the susceptibility of the individual" could be altered. Considering that the number of users who adopted a "gay pride" profile picture went from three million in March of 2013 to more than 26 million once Facebook standardized the method for doing so and put the Facebook stamp of approval on it in June 2015, it seems the answer is yes.
This would certainly not be the first time Facebook experimented on its users.
Reporting on this most recent example, The Atlantic pointed out that in June of 2014, "Facebook manipulated what users saw when they logged into the site as a way to study how it would affect their moods." This was accomplished by creating what were essentially "test groups" and filtering their timelines to show some groups only positive posts from their friends while showing other groups only negative posts from their friends. Facebook then monitored the "mood" that each user selected from the drop-down menu when making a post of their own.
And in September of 2012, the National Institutes of Health reported the findings of a Facebook experiment on manipulating the way people voted in the 2010 congressional elections. The results of that study claim that by filtering the political posts and ads that were shown to Facebook users, and monitoring the websites those users visited after seeing those posts and ads, the Internet giant was able to steer users to a particular point of view and influence their voting patterns.
This type of manipulation and the accompanying censorship are what led Peterson to launch ONEWAY. He said that he came to his tipping point by degrees. First, he experienced political censorship on the Goliath platforms. That was annoying. Then he experienced religious censorship. That was more personal. But when he experienced social media giants censoring mothers with autistic children who were looking for alternative treatments — who were examining whether their children had been injured by immunizations and wanted to find out what could be done to help them — he finally had enough.
ONEWAY would never censor that, Peterson said. For that matter, ONEWAY doesn’t censor anything. That is where the community comes in. Users can simply apply the filters they choose and block whatever (or whoever) they don’t want to see. But that does not mean that anything goes on ONEWAY. There is a clearly written list of guidelines users agree to follow. Because ONEWAY is both “free speech” and “human friendly,” the platform does not allow nudity or pornography. Peterson explained that nothing that is “human unfriendly” — including exploiting or objectifying people (which is exactly what pornography does) — is allowed on the site. This also includes legitimate threats, doxing, or other things that actually harm people. Speech, though, even distasteful or racist speech, will not be censored by ONEWAY.
The difference here is that other social media platforms promise a platform for free speech, but allow their own biases to determine the limits of that speech. Whatever that is, it is not free speech. In the words of Inigo Montoya, “You keep using that word. I do not think it means what you think it means.”
So, while ONEWAY prohibits nudity, threats, and other things harmful to humans, there is a consistency that is lacking at other sites. If Facebook, Twitter, or other sites want to promote a particular agenda and silence all voices contrary to that agenda, that is their prerogative; they should just be honest and disclose that to their users. Of course, then they would lose the opportunity to manipulate those users.
For those who would like to force Facebook, Twitter, or any other company — digital or otherwise — to allow truly free speech, Peterson has a warning: there is no good way to do that without it blowing up in your face. Eventually, if enough people demand it, the government will step in and make the rules for what is allowed. Peterson said, “I try never to take my morals from the government.” He is right, because that never ends well.
Instead of trying to force them to allow what you want in their business, Peterson’s advice is to “let them commit financial suicide” by chasing users away. Simply migrate to another platform that more closely reflects your values. ONEWAY makes it a lot easier to do just that.