Molly Russell’s Suicide and Self-Harm Images on Instagram
As noted earlier, for several years Instagram’s Community Guidelines explicitly banned a range of content, stating that ‘any account found encouraging or urging users to … cut, harm themselves, or commit suicide will result in a disabled account without warning’ (quoted in Cobb 2017). Research on the impact of Instagram and other social media on mental health, self-harm and youth suicide emphasizes social media’s role as an amplifier, but whether that amplification is positive or negative depends as much on the user and context as on the platform itself (Seabrook, Kern & Rickard 2016). Notably, even amongst calls to ban any mention of suicide or self-harm on large social media platforms, there is evidence that these platforms are very effective for suicide prevention messaging; and while depression, anxiety and social media use are at times linked, there is no clear evidence of causation from one to the other (Robinson, Bailey & Byrne 2017).
In 2017, British teenager Molly Russell tragically took her own life. In early 2019, her father very publicly and articulately argued that Instagram ‘helped kill my daughter’ after it was found she had been following a number of accounts that displayed and romanticized self-harm and suicide (Crawford 2019). In the wake of Molly Russell’s death, UK Health Secretary Matt Hancock very publicly called for social media companies to do more to remove self-harm and suicide content, linking rising levels of teen self-harm and suicide with the rise of social media (Savage 2019). Instagram was singled out in particular as not doing enough to police the content children could access.

After initially responding with a piece in The Independent promising to do more (Mosseri 2019a), newly appointed Instagram head Adam Mosseri had to escalate the platform’s response within days, promising that Instagram will ‘not allow any graphic images of self-harm, such as cutting on Instagram’, and ‘will continue to remove it when reported’, even if this new commitment still relies on users reporting such content before it can be found and removed (Mosseri 2019b). At the same time, parent company Facebook made public statements reiterating that its approach to self-harm and suicide images was informed by experts across the globe, and that striking a balance between removing such images and allowing inspirational stories of recovery was difficult, as many of the latter images have been shown to help and support people suffering from mental health issues (Davis 2019). To ensure Instagram’s efforts were taken seriously, Mosseri met with the UK Health Secretary, attempting to defuse the clearly political desire to regulate Instagram and other social media platforms more heavily (Lomas 2019). Yet whether Instagram and Facebook can meaningfully police or deal with self-harm content remains to be seen.