A new option has appeared inside Instagram's report function that lets users flag posts for "false information".
The option appears alongside the usual report reasons, such as nudity, hate speech, violence and harassment; the platform also prohibits the sale and promotion of firearms and drugs.
Facebook employs similar fact-checking resources on its main platform, and YouTube gives users an option to fact-check information featured in videos.
“Explore and hashtags allow people on Instagram to find content that they haven’t already chosen to follow, and by filtering misinformation from these places, we can significantly limit its reach," said Stephanie Otway, a spokesperson for Instagram parent company Facebook.
“Starting today, people can let us know if they see posts on Instagram that they believe may be false. We’re investing heavily in limiting the spread of misinformation across our applications, and we plan to share more updates in the coming months.”
For now, the update is only available in the US.
You better think
The update is the latest in a string of steps being taken to improve safety and wellbeing on the platform.
Instagram is also trialling hidden like counts in selected countries, as part of an initiative to shift the focus back to authentic content rather than like tallies.
“We want people to worry a little bit less about how many likes they’re getting on Instagram and spend a bit more time connecting with the people that they care about," Instagram head Adam Mosseri said, regarding the change.
A separate update also tackles abuse at the source, before a comment even goes live. If a comment could be considered offensive, the user receives a warning before it is posted, giving them time to change or bin it.