INSTAGRAM INTRODUCES NEW TOOL TO BAN SELF-HARM AND SUICIDE POSTS

Instagram has rolled out a new tool in the UK and across Europe that allows the app to recognise suicide and self-harm content.


The detection tool uses machine learning to automatically scan the platform for suicide-related images and words. If the technology determines that a post breaks the app’s rules, it makes the content less visible or, in some cases, deletes it altogether after 24 hours.
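
Instagram has not published the details of its system, but the flow described above (score a post, then demote or remove it) can be sketched in a few lines of Python. Everything here is illustrative: score_post stands in for a trained classifier, and the keyword list and thresholds are invented for the example.

    from dataclasses import dataclass
    from enum import Enum

    class Action(Enum):
        LEAVE = "leave"    # post stays fully visible
        DEMOTE = "demote"  # post is made less visible
        REMOVE = "remove"  # post is queued for deletion, e.g. after 24 hours

    @dataclass
    class Post:
        post_id: str
        caption: str

    # Hypothetical stand-in for a trained classifier; a real system would
    # score both the image and the caption. Returns a risk score in [0, 1].
    SELF_HARM_TERMS = {"suicide", "self-harm"}

    def score_post(post: Post) -> float:
        words = post.caption.lower().split()
        hits = sum(1 for word in words if word in SELF_HARM_TERMS)
        return min(1.0, hits / 3)

    # Invented thresholds, for illustration only.
    DEMOTE_THRESHOLD = 0.3
    REMOVE_THRESHOLD = 0.8

    def moderate(post: Post) -> Action:
        score = score_post(post)
        if score >= REMOVE_THRESHOLD:
            return Action.REMOVE
        if score >= DEMOTE_THRESHOLD:
            return Action.DEMOTE
        return Action.LEAVE

    print(moderate(Post("1", "sunset at the beach")))  # Action.LEAVE

A production system would rely on trained models rather than keyword matching; the sketch only shows the flag-then-act pipeline the article describes.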



The tool from the Facebook-owned social media platform already existed for users outside of Europe, but an issue with GDPR meant that, until now, European users did not have access to it.


"We want to do everything we can to keep people safe on Instagram. We’ve worked with experts to better understand the deeply complex issues of mental health, suicide, and self-harm, and how best to support those who are vulnerable," explained Adam Mosseri, Head of Instagram. "No one at Instagram takes these issues lightly, including me. We’ve made progress over the past few years, and today we’re rolling out more technology in Europe to help with our efforts."




More and more, social media platforms have come under scrutiny for allowing content that glamorises self-harm and suicide, which has made automated solutions and machine learning increasingly prevalent and sought after.


For more beauty, lifestyle, and fashion tech, follow The Modems on Instagram.