Instagram is rolling out a new feature to prevent users from viewing possibly abusive messages through content filters.
Instagram has introduced a feature that could make the app safer to use by filtering abusive messages and blocking offensive words, phrases and emojis.
Facebook Inc, which owns Instagram, said that not only will there be a filter option for abusive direct messages (DMs), but it will also become harder for people blocked by users to contact them through new accounts.
This is part of the app’s wider effort to tackle hate speech and online abuse across its platform, especially since it is popular among teenagers and young adults, who may be more susceptible to online bullying.
The content filter can be activated through Instagram’s privacy settings and customised with any words, phrases or emojis the user wants blocked.
Users will also have the option to stop blocked accounts from contacting them entirely, preventing those who have been blocked from reaching out through new or different profiles.
According to Instagram, the feature is set to roll out in some countries in the next few weeks.