Facebook Messenger introduces forwarding limits to combat fake news
Similar to what it did earlier with WhatsApp, Facebook is now putting new limits on message forwarding in Messenger.
On Thursday, the company introduced a message forwarding limit in Messenger, meaning that messages can now only be forwarded to five people or groups at a time. Facebook explained the change in its announcement:
As a part of our ongoing efforts to provide people with a safer, more private messaging experience, today we’re introducing a forwarding limit on Messenger, so messages can only be forwarded to five people or groups at a time. Limiting forwarding is an effective way to slow the spread of viral misinformation and harmful content that has the potential to cause real world harm.
We believe controlling the spread of misinformation is critical as the global COVID-19 pandemic continues and we head toward major elections in the US, New Zealand and other countries. We’ve taken steps to provide people with greater transparency and accurate information. Our Coronavirus (COVID-19) Community Hub gives people access to up-to-date information and resources designed to help them stay safe. And our Voting Information Center and voter registration notifications ensure people know how to register to vote and make their voices heard. We are introducing a forwarding limit on Messenger to help curb the efforts of those looking to cause chaos, sow uncertainty or inadvertently undermine accurate information. We want Messenger to be a safe and trustworthy platform to connect with friends and family. Earlier this year we introduced features like safety notifications, two-factor authentication, and easier ways to block and report unwanted messages. This new feature provides yet another layer of protection by limiting the spread of viral misinformation or harmful content, and we believe it will help keep people safer online.
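To make the mechanics of the limit concrete, here is a minimal sketch of what a client-side cap like this could look like. The forward_message function, the FORWARD_LIMIT constant, and the recipient names are all hypothetical; Facebook has not published how the limit is implemented.

```python
# Hypothetical sketch of a client-side forwarding cap modeled on the
# "five people or groups at a time" limit described above. The function
# and names are illustrative; this is not Facebook's actual API.

from typing import List

FORWARD_LIMIT = 5  # max recipients (people or groups) per forward action


class ForwardLimitError(Exception):
    """Raised when a single forward targets more than FORWARD_LIMIT chats."""


def forward_message(message: str, recipients: List[str]) -> List[str]:
    """Forward `message` to at most FORWARD_LIMIT people or groups at once."""
    if len(recipients) > FORWARD_LIMIT:
        raise ForwardLimitError(
            f"Can only forward to {FORWARD_LIMIT} chats at a time, "
            f"got {len(recipients)}"
        )
    # A real client would hand the message to the messaging backend here;
    # returning the list just shows the check passed.
    return recipients


# Forwarding to five chats at once is allowed; a sixth recipient would
# raise ForwardLimitError.
forward_message("breaking news", ["alice", "bob", "carol", "dan", "group:family"])
```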
Facebook is merging Messenger with Instagram's direct messages
In January 2019, Mark Zuckerberg said Facebook was looking to merge all of the company's messaging platforms (Messenger, WhatsApp, and Instagram) into one. Now, the company has taken a big step toward that goal: on Wednesday, Instagram started rolling out the option to connect Instagram direct messages with Messenger. Yes, this means you can send messages to Messenger users from Instagram, and the other way around.
The new feature, which will show up for some users when they open the app, isn't obligatory; you can still keep your Messenger and Instagram DMs separate if you so desire.
However, if you do update, the fairly simple messaging experience on Instagram will become very similar to Messenger. Instagram users will get features such as message forwarding and the ability to customize chat threads with custom colors and nicknames. They're also getting a bunch of new features, including selfie stickers, Watch Together (which lets you watch trending videos with friends), and vanish mode, which lets you set messages to automatically disappear after a certain time. Facebook says some features will hit Instagram first and arrive on Messenger soon after.
Facebook's new policies are meant to stop the spread of conspiracy theories and hate in Groups
For the first time, Facebook is releasing stats concerning how the social media platform moderates what goes on in Facebook Groups.
Alongside the new numbers, the company announced new policies for how it will deal with the conspiracy theories and hate speech that often flourish inside these groups.
First, the stats. Over the past year, according to Facebook, about 12 million pieces of content that violated the platform’s hate speech policies were removed from Facebook Groups. An additional 1.5 million pieces of content that fell under its organized hate policies were also removed from Facebook Groups.
According to Facebook, the vast majority of this content — 87 percent of hate speech and 91 percent of organized hate content — was removed proactively. Basically, this means that Facebook’s content moderation AI took these violating posts down before any user even reported them.
If members of Groups repeatedly break Facebook’s rules, the company doesn’t just remove the content; it removes the Group entirely. Facebook says it deleted more than 1 million Groups this year that violated these policies.
The problem with Groups
Groups allow users to congregate with other members around a specific topic. Each has its own feed just made up of posts published in the Group. Groups can be public, meaning anyone can read the posts and join the Group to contribute. Or they can be private, which obscures the Group feed from a user until their membership is approved by a Group administrator. Private groups can even be completely hidden from Facebook users outside the Group, which would make it impossible for non-members to join unless they were specifically invited by other members.
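For illustration, here is a small sketch of how those three visibility levels could be modeled. The Group class, the Visibility enum, and the feed/join rules below are assumptions based only on the description above, not Facebook's implementation.

```python
# Illustrative model (not Facebook code) of the three Group visibility
# levels described above: public, private, and hidden.

from dataclasses import dataclass, field
from enum import Enum


class Visibility(Enum):
    PUBLIC = "public"    # anyone can read the feed and join
    PRIVATE = "private"  # feed hidden until an admin approves membership
    HIDDEN = "hidden"    # invisible to non-members; join by invite only


@dataclass
class Group:
    name: str
    visibility: Visibility
    members: set = field(default_factory=set)
    invited: set = field(default_factory=set)

    def can_see_feed(self, user: str) -> bool:
        # Only public feeds are visible to people outside the Group.
        return self.visibility is Visibility.PUBLIC or user in self.members

    def can_request_to_join(self, user: str) -> bool:
        if self.visibility is Visibility.PUBLIC:
            return True
        if self.visibility is Visibility.PRIVATE:
            return True  # anyone can ask; an admin still has to approve
        return user in self.invited  # hidden: invite only


# A hidden Group is invisible and closed to anyone who wasn't invited.
club = Group("book club", Visibility.HIDDEN, members={"ana"}, invited={"ben"})
assert not club.can_see_feed("zoe") and not club.can_request_to_join("zoe")
assert club.can_request_to_join("ben")
```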
There are many cases of conspiracy theories, misinformation, and hate festering in some of these Groups. Take, for example, the false stories earlier this summer about violent social justice protesters invading small towns across the United States. Just last week, the FBI and local police departments had to tackle misinformation running rampant in Facebook Groups: false claims that anti-fascist protesters were starting wildfires on the West Coast.
So, what’s Facebook going to do to deal with this problem? From the sounds of its announcement Thursday, it feels like it’s all but hitting the reset button on Groups.
Facebook’s new policies put a lot of responsibility on Groups’ administrators and moderators. But in order to enforce these new policies in a Group, the Group actually has to have an admin or moderator in the first place. People who create a Group can also choose to leave it, which means there are a number of Facebook Groups that don’t have any admins or moderators at all.
Over the next few weeks, Facebook will suggest admin roles to members of those Groups. If no one steps up, it will begin the process of archiving the Group, basically creating a time capsule of it at that moment for members but closing it off to new members and posts.
In Groups that do have admins and mods, those roles will begin to play a more central part in Facebook’s moderation policies. If a Group member violates Facebook’s Community Standards, their posts will no longer be published automatically for the next 30 days; moderators will need to approve each of that user’s posts during that period. If admins and moderators continuously approve violating content, the Group will be banned. Once a Group is banned, its admins and moderators will be barred from creating any new Facebook Groups for 30 days.
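As a rough illustration of that 30-day approval rule, here is a hedged sketch. The record_violation and needs_moderator_approval helpers, and the idea of tracking a single most-recent violation per user, are simplifying assumptions, not how Facebook's systems actually work.

```python
# Hypothetical sketch of the 30-day post-approval rule described above:
# after a member violates the rules, their posts need moderator approval
# before publishing. Names and structure are illustrative, not Facebook's.

from datetime import datetime, timedelta
from typing import Dict

APPROVAL_WINDOW = timedelta(days=30)

# user -> timestamp of that user's most recent violation in this Group
recent_violations: Dict[str, datetime] = {}


def record_violation(user: str, when: datetime) -> None:
    recent_violations[user] = when


def needs_moderator_approval(user: str, now: datetime) -> bool:
    """A post needs manual approval if its author violated the rules within
    the last 30 days; otherwise it publishes automatically."""
    last = recent_violations.get(user)
    return last is not None and now - last < APPROVAL_WINDOW


now = datetime(2020, 9, 17)
record_violation("sam", now - timedelta(days=10))
assert needs_moderator_approval("sam", now)                           # inside the window
assert not needs_moderator_approval("sam", now + timedelta(days=25))  # window has expired
```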
In order to combat coronavirus misinformation, Facebook will no longer recommend health-related Groups to members. Users can still join them or find them in Facebook search, but the platform will not promote them.
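A toy sketch of that recommendation change might look like the following. The health_related flag and the sample Groups are invented for illustration, since the announcement doesn't say how Facebook classifies health-related Groups.

```python
# Minimal illustration of excluding health-related Groups from
# recommendations while keeping them searchable and joinable. The
# `health_related` flag stands in for whatever classification Facebook
# uses internally, which the announcement doesn't describe.

groups = [
    {"name": "COVID-19 Support", "health_related": True},
    {"name": "Local Gardeners", "health_related": False},
]

# Recommendations skip health-related Groups entirely...
recommendations = [g["name"] for g in groups if not g["health_related"]]

# ...but search still surfaces everything, and users can join manually.
search_results = [g["name"] for g in groups]

assert recommendations == ["Local Gardeners"]
assert "COVID-19 Support" in search_results
```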
“It’s crucial that people get their health information from authoritative sources,” Facebook says in its statement. The social media platform has attempted to take action against the COVID-19 misinformation that runs rampant on the site. However, considering how much of that misinformation originates in Groups, this may be its most significant move on the issue yet.
Earlier this year, Facebook removed a number of Groups related to QAnon, a far-right conspiracy theory targeting President Donald Trump’s political opponents. It also removed some Groups related to the right-wing militia movement Boogaloo Bois.
Of course, none of these new moves will completely solve the problem. But they show that Facebook is now focusing on the source of so many of the dangerous conspiracy theories and so much of the hate that spread on its platform: Facebook Groups.