Instagram removes content that, in its judgment, violates its community guidelines. Now the service's developers are taking stricter measures against potentially harmful posts containing bullying, hate speech, or incitement to violence. Such posts will be shown lower in the feed and at the very end of Stories. The changes will apply only to individual posts; they will not affect the accounts of their authors.
Instagram already uses its guidelines to determine which types of user content are eligible for Reels and Explore. Previously, the developers focused on demoting posts in the feed and Stories if they contained misinformation; now the same measures will also apply to potentially harmful posts.
“If our algorithms detect that a post contains bullying, hate speech, or incitement to violence, we will show it lower in the feeds and Stories of that account's followers. To determine whether our rules have been violated, we will look at the caption, consider whether it has been reported before, and check whether there have been violations in the past,” the Instagram blog says.
Based on a user's actions on Instagram, the network's algorithms will predict how likely that particular user is to report the content shown to them. If the probability is high, such posts will automatically appear at the bottom of their feed. Potentially harmful content will be removed if the company receives complaints from users.
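The demotion logic described above can be sketched as a simple two-tier feed sort. This is purely illustrative: Instagram has not published its model, scores, or cutoffs, so the `report_probability` field and the `REPORT_THRESHOLD` value here are hypothetical stand-ins for whatever the real classifier produces.

```python
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    relevance: float           # base ranking score (higher = shown earlier)
    report_probability: float  # predicted chance this user would report the post

# Hypothetical cutoff above which a post is demoted to the bottom of the feed.
REPORT_THRESHOLD = 0.8

def rank_feed(posts):
    """Order a feed so posts likely to be reported sink to the bottom.

    Posts under the threshold keep their normal relevance order;
    posts over it are grouped after them, still ordered by relevance.
    """
    normal = [p for p in posts if p.report_probability < REPORT_THRESHOLD]
    demoted = [p for p in posts if p.report_probability >= REPORT_THRESHOLD]
    by_relevance = lambda p: -p.relevance
    return sorted(normal, key=by_relevance) + sorted(demoted, key=by_relevance)

feed = [
    Post("a", relevance=0.9, report_probability=0.95),  # flagged as likely reported
    Post("b", relevance=0.7, report_probability=0.10),
    Post("c", relevance=0.5, report_probability=0.20),
]
print([p.post_id for p in rank_feed(feed)])  # → ['b', 'c', 'a']
```

Post "a" has the highest relevance but the classifier predicts this user would likely report it, so it is moved to the end rather than removed, matching the article's point that demotion, not deletion, is the default action.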