TikTok is rolling out a new feature to combat the spread of misleading content across its platform: the app will now warn users before they share videos containing unverified information.
The update targets a gray area in fact-checking: claims that fact-checkers are unable to confirm or debunk.
With the update, users who try to share a video that the app's fact-checking partners have flagged as unsubstantiated will see a pop-up warning that the video has been flagged for misleading, unverified content.
They can still go ahead and share it, but the video will not appear on other users' For You pages, and TikTok notifies the person who originally posted the video that it has been flagged.
The platform hopes the prompt will encourage people to pause and reconsider before choosing either to cancel or to share anyway.
The company says that in early testing the warnings reduced shares of flagged videos by 24 percent, while likes on those videos fell by an average of 7 percent.
The approach resembles Twitter's prompt encouraging users to read articles before sharing them, a test that company described as successful.
In some ways, TikTok has taken a more aggressive stance on misleading content than other social media platforms.
The company works with several third-party fact-checking organizations and removes the videos they flag.
But some posts inevitably slip through, and the company has sometimes had to play catch-up, as it did in the aftermath of the US election and the violence in Washington, DC.
For its part, TikTok says the new warnings should help it handle content that surfaces during breaking events before fact-checkers reach a verdict.
The new feature is available now in the United States and Canada, and will gradually roll out to more regions over the coming weeks.