Wednesday, 16 November 2016

Facebook’s Flimsy Denial of Fake News & Its Impact

Facebook paints a very dichotomous, contradictory picture of itself. On one side, they claim to be the world’s largest social network, impacting the lives of over a billion people each month. On the other side, CEO Mark Zuckerberg — apparently not using his own social network or perhaps living under a rock this past year? — claims that Facebook has virtually no influence on national elections.

The disconnect is important, because it shows that Facebook doesn’t appear to take a position of leadership or responsibility for unleashing and reinforcing technology that has become a part of billions of people’s lives every day. Is fake news an actual problem on Facebook, and if so, what can be done about it?

On Tuesday, after Google said it would no longer accept publishers of fake news — fictionalized news designed to be shared and clicked on by others — into its ad network, Facebook followed suit. It’s telling that Facebook followed Google’s lead, given Zuckerberg’s comments over the weekend calling it “crazy” to suggest that Facebook had any influence on the most contentious presidential election in recent memory.

However, banning fake news sites from using your ad network is a far cry from actually working on the real problem — people sharing and liking fake news stories just as readily as real, legitimate news stories.

Will Oremus, writing over at Slate, lays out the problem:

The problem is Facebook’s refusal to own up to its increasingly dominant role in the news media. It’s one that is unlikely to go away, even if the fake news does. In a public interview last Thursday, Zuckerberg claimed that fake news on Facebook “surely had no impact” on the election and that to suggest otherwise was “a pretty crazy idea.” […]

At the same time, he cautioned that Facebook had to “proceed very carefully,” because “identifying the ‘truth’ is complicated.” […]

But there is a growing sense, both inside and outside the company, that it may be proceeding rather too carefully, given its increasingly dominant role in the distribution of news online. And Zuckerberg’s denials seem to be fanning the flames.

Indeed. When your own CEO seems so disconnected from the reality on the ground, it makes you wonder what he spends time doing all day. Fake news is passed around on Facebook just as readily as legitimate news (and pictures of kittens), shared and liked millions of times over during this past election. While Facebook has some rudimentary reporting tools, they provide zero feedback to the user suggesting that anything is ever done. With no feedback, users have little incentive to use these tools — so most people simply don’t bother with them.

But identifying fake news isn’t really about identifying “the truth” — it’s about identifying easily verifiable claims that are untrue (what most of us call “lies” or “fiction”). Facebook completely fell down on exactly these easily verifiable claims during this past election — like the lie that the Pope endorsed Trump.

As Facebook has grown, however, it seems to have gotten more complacent and lazy about its responsibilities to its users:

[Edward] Snowden cautioned that social media networks are careful to respect users as they grow, but get more reckless as they establish dominance. “To have one company that has enough power to reshape the way we think — I don’t think I need to describe how dangerous that is,” he concluded. […] “When you get a Google in place, a Facebook in place, a Twitter in place, they never seem to leave,” he said. “When one service provider makes a bad decision we all suffer for it.”

Facebook says, “Hey, we’re a technology company first and foremost. We don’t want to get into this messy business of having to involve humans in our processes.” Yet, as they’re finding out, when you run a technology business whose sole purpose is to connect human beings to one another in some meaningful way, you may need to actually have humans involved in the process (outside of coding the obviously flawed algorithms). Denying that more needs to be done is burying your head in the sand and hoping the problem just goes away.

Problem Readily Solved — By Four College Students

Now, this would all be a moot point if the problem were insurmountable.

But it is easily solved, as four college students from Princeton University just demonstrated. Their solution is a Chrome-only browser extension (if they were Facebook engineers, they could easily bake this directly into the social network):

“For links, we take into account the website’s reputation, also query it against malware and phishing websites database and also take the content, search it on Google/Bing, retrieve searches with high confidence and summarize that link and show to the user. For pictures like Twitter snapshots, we convert the image to text, use the usernames mentioned in the tweet, to get all tweets of the user and check if current tweet was ever posted by the user.”

The browser plug-in then adds a little tag in the corner that says whether the story is verified. […]

But the students show that algorithms can be built to determine, with reasonable certainty, which news is true and which isn’t, and that something can be done to put that information in front of readers as they consider clicking.
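To make the students’ description concrete, here is a minimal sketch of the link-checking half of that pipeline (it skips the image/tweet half). Everything in it is an assumption for illustration — the domain lists, the canned search results, and the verify_link function are invented stand-ins, not the students’ actual code, which ran as a Chrome extension and called real search and phishing-database services:

```python
from urllib.parse import urlparse

# Illustrative stand-ins only: a real extension would query
# maintained reputation and malware/phishing databases, and a
# real search API, instead of these hard-coded sets.
KNOWN_BAD_DOMAINS = {"fake-news.example", "phish.example"}
REPUTABLE_DOMAINS = {"reuters.com", "apnews.com", "bbc.com"}


def search_result_domains(query: str) -> list[str]:
    """Stand-in for a Google/Bing search: return the domains of the
    top results for a query. Replace with a real search API call."""
    canned = {
        "pope endorses trump": ["fake-news.example", "blogspot.com"],
        "senate passes budget bill": ["reuters.com", "apnews.com"],
    }
    return canned.get(query.lower(), [])


def verify_link(url: str, headline: str) -> str:
    """Mirror the described pipeline: check the site's reputation,
    then look for corroboration from reputable outlets."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]

    # Step 1: reputation / blocklist check.
    if domain in KNOWN_BAD_DOMAINS:
        return "not verified"

    # Step 2: search the headline and count how many reputable
    # outlets carry the same story.
    hits = [d for d in search_result_domains(headline)
            if d in REPUTABLE_DOMAINS]

    # Step 3: the tag the extension would show in the corner.
    return "verified" if len(hits) >= 2 else "unverified"


if __name__ == "__main__":
    print(verify_link("http://fake-news.example/pope",
                      "Pope endorses Trump"))        # -> not verified
    print(verify_link("http://news.example/budget",
                      "Senate passes budget bill"))  # -> verified
```

Note what this sketch never attempts: judging “the truth” directly. It only checks a link against a blocklist and counts corroboration from independent outlets — exactly the kind of verifiable-facts check argued for above.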

Indeed. You’d think a company worth billions of dollars could’ve figured this out on its own before it became the problem it did in the 2016 election. In fact, a small task group of Facebook engineers is now taking matters into their own hands, trying to convince their CEO that this is a real problem — but one that they can solve.

Fake news is a problem that can be solved. But first the folks in charge (like Zuckerberg) have to acknowledge that it’s a real problem — that social networks like Facebook play a significant role in many countries’ politics.

from World of Psychology http://psychcentral.com/blog/archives/2016/11/16/facebooks-flimsy-denial-of-fake-news-its-impact/
