How do social media platforms address users potentially in crisis?
November 4, 2016 8:39 AM   Subscribe

A public figure, active on social media, is displaying signs that might be construed as mania. Do social media platforms have policies on how to address this?

I won't point this person out, but their behavior has changed sufficiently in recent weeks to prompt coverage in the press. In the last few days the changes have escalated quickly. People in the community in which this person works are watching this unfold in a vacuum--since we are not immediate friends or family, there's no recourse for the public viewer. Nevertheless, people are (perhaps in error) recognizing behaviors familiar to them as a manic episode or similar. In the last 24-48 hours, these behaviors have escalated--and they're all being documented in dozens of videos, pictures, tweets, and so on.

This question isn't about seeking a welfare check for this individual (that may have been done already, as referenced in this person's feeds). This question is about social media platforms themselves: do Twitter, Instagram, Periscope, etc. have policies in place that allow them to take an event like this out of public view? If so, how does one suggest an account be reviewed for such a thing?

I ask because I'm remembering Amanda Bynes' very public experience and wondering (even hoping) that social media platforms have considered how they can improve their services to lessen the impact of crises.
posted by late afternoon dreaming hotel to Technology (7 answers total) 1 user marked this as a favorite
I may be super wrong and super offbase, but I don't see how this could exist. I don't think you can seek to censor someone because you feel (however rightly or wrongly) they are mentally unwell. I suffer from depression, but I don't think anyone has the right to unilaterally revoke my access to social media while I am in the midst of a depressive episode, just in case I might find it embarrassing later on. I REALLY don't see Twitter/Facebook/whatever having a policy to moderate content in this way. They aren't medical professionals; they are in no position to declare "I think you're nuts! No more Twitter for you!" Unless the posts are in direct violation of their policies, I think you're out of luck.

Honestly, a HUGE percent of social media posts could be viewed as the ramblings of mentally unwell individuals. They'd be deleting everyone's accounts...

If you think the individual is at risk of harming themselves, address that for sure.
posted by PuppetMcSockerson at 8:53 AM on November 4, 2016 [11 favorites]

Every major social network has a "Trust and Safety" or "Protect and Care" team. You can reach out to them, as they all take self-harm seriously. But they generally won't do anything other than take down posts or, in a worst-case scenario, alert law enforcement if there are exigent circumstances indicating a life-threatening situation. Given what you've written, it's unlikely there is much they can do.
posted by carlodio at 9:04 AM on November 4, 2016 [1 favorite]

First of all, I believe I know of whom you're speaking, and I would agree it looks remarkably like a manic episode with delusion & paranoia. I say this because my brother lives with bipolar and before being properly medicated one of the first signs he was slipping into a manic episode was his social media posts that sounded remarkably like the ones this person is posting -- stuff out of left field, conspiracy theories, things that didn't make sense or have any basis in reality. After his last episode we as a family agreed to keep a vigil on his social media as a tool to help spot the signs he was slipping into mania.

I agree there's likely not much a social media company would be willing to do unless the person were threatening to harm either themselves or someone else. I think this is where family and friends play a crucial role, and I'm hoping this person's loved ones will step in to help. Although, I will also add, in our experience it was very difficult to get my brother to realize he needed help and meanwhile he was still posting strange things. So it's possible people are already underway trying to assist this person.
posted by bologna on wry at 9:36 AM on November 4, 2016

Absent a request from the account owner, I strongly suspect the only things a company would be willing to take action on would be overt self-harm/suicide/threats of violence, and that as much for liability reasons as out of concern for the user. It's too big a judgment call and way too sticky to act on something as nonspecific as "signs of mania."

(Haven't worked at a social media company per se, but have worked at several companies that employed a bunch of lawyers and ran large communities.)
posted by restless_nomad at 9:51 AM on November 4, 2016

I can guarantee this is waaaay too gray-area for the actual platform owners to go anywhere near, which should be obvious by logging in to any of them and reading the last 90 seconds of activity. I can think of a number of politicians I'd like to file a "dangerous mental illness event" crisis form on myself, and I can also imagine basically every woman on the internet becoming unable to use it thanks to misogynists with plenty of free time on their hands. So, no, that would be a terrible idea.

The best you could get would be, if this person maybe posted that they were in danger from someone else or themselves, local police could probably get some cooperation from the platform regarding location information. Probably. It might require a warrant, though.

People have a right to lose their shit in public. You're going to have to find an offline silencing tactic, preferably one that is legal and has some protections for that person rolled in, as this is not a problem to be solved with software.
posted by Lyn Never at 11:28 AM on November 4, 2016

Thanks for the responses so far, everyone. The concern isn't zany posts, it's that these posts are coming dangerously near, for example, encouraging one's followers to drink poison. Hence the question over where the line is for escalating from general concern to actionable concern. We've sought out the "protect and care"-type approaches--so far to little effect--but thank you for mentioning them. I'm happy to hear any other ideas.
posted by late afternoon dreaming hotel at 12:13 PM on November 4, 2016

encouraging one's followers to drink poison

Perhaps the messaging needs to be directed at the followers.
posted by Miko at 5:34 PM on November 4, 2016

This thread is closed to new comments.