What Facebook's new mission can and can't fix


Live-streamed murders. Terrorists recruiting new members. Hate groups organizing. Liberals and conservatives sealing themselves off in echo chambers.

With nearly 2 billion people around the world checking in monthly, it's no surprise that Facebook is grappling with some very sticky issues. The social network is facing increasing pressure to address them head-on.

On Thursday, CEO Mark Zuckerberg announced a new vision for the company. He's shifting its focus from connecting individuals to building communities, namely by getting people to join more Facebook groups.

The change is summed up in the company's new mission statement: "Give people the power to build community and bring the world closer together."

But how many of the company's problems can the new direction really fix?

Filter bubbles

Facebook (FB, Tech30) has been accused of contributing to filter bubbles, in which people see only news and opinions that reinforce their existing beliefs and biases. The bubbles aren't created by Facebook's algorithms alone; they're also shaped by the friends we choose, the people we decide to mute, and the stories we click.

Pushing people to get more involved in groups could exacerbate the problem: users could end up spending even more time in groups organized around a shared political view or belief.

Zuckerberg has denied that filter bubbles are widespread. He also believes group memberships will expose people to more opinions, not fewer, by helping "people meet new people and get new perspectives and broaden their horizons."

Related: Facebook's global fight against fake news

Terrorism recruiting

On Facebook, groups can be set to Secret, meaning users don't see them in search results. There are good reasons for secrecy — namely safety and privacy — but dangerous organizations can also use the groups as bases for recruiting new members.

Facebook recently outlined its plans to combat terrorism on the social network. It's using artificial intelligence to scan images, posts and profiles to identify and remove bad actors. The company also employs 150 people focused on counter-terrorism.

"Terrorist recruiting. That is something that we want zero of. We try to make it as difficult as possible," Zuckberberg said. "Even if no one reports it, we have systems that go out and try to flag that content for our community [monitors] ... we'll do more and more of that over time, as AI gets better."

Related: How Facebook decides what violent content is allowed

Fake news

The now-overused phrase "fake news" originally referred to made-up news stories that circulated on Facebook. The company has taken multiple steps to crack down on questionable stories: it's working with fact-checking organizations, hiding spammy links, and using AI to identify fake accounts that spread propaganda.

The move to a more groups-based experience could mean people see fewer articles in their news feed, where many publishers post directly. They might see less news overall, including fake news, or a more curated selection of stories from their groups. Facebook has not said whether or how its tools for fighting fake news will carry over to stories posted in groups.

Hate speech

The focus on groups as a positive tool with the power to change the world overlooks how people use them for negative causes. Hate groups, such as white power organizations, use Facebook groups openly and will likely continue to do so. Zuckerberg has said he values free speech on the platform and that Facebook interferes only if something goes "way over the line," like bullying or the threat of real-world violence. Facebook often relies on ordinary users to flag objectionable content, but that's less likely to happen in closed groups.

Related: Facebook adding 3,000 reviewers to combat violent videos

Murder, violence and self-harm

In April, a Cleveland man used Facebook to share a video of himself shooting a 74-year-old man. The video was viewable for two hours before it was taken down.

People have used Facebook Live, the company's live video streaming tool, and regular uploads to share videos of murders, beatings, police violence and suicide. It's a tricky issue for the company, especially when the site is used to document potential civil rights violations.

Facebook plans to use artificial intelligence to identify violent videos early. It is also deploying 7,500 human content moderators to review videos as they're flagged. As with hate speech, videos shared in private groups might evade Facebook's moderators longer than they would in a news feed.


I look forward to the day that Mr Z wakes up and realises that his untold wealth actually comes from the people's content, and therefore he should share it with those same people, much like Steemit. Facebook without user content is nothing, and yet those users don't actually fungibly [hope that's a word] benefit from the use of their content. I left Facebook years and years ago, for no other reason than I just didn't like it. But I will join the new Facebook when it runs on the blockchain and rewards its users... oh hang on, I did! I joined Steemit :)

Haha, very interesting. I think one thing about getting really big is you should always remember that your team is not only the people you draw flowcharts and write code with, but also those who use the platform you created, especially when it wasn't called for. We saw it, we loved it, but you have to grow the relationship. I guess Mr Z left us out, so we found a better babe, and it's Steemit.
