How to respond to anxiety over general AI
May 21, 2022 9:57 PM

My partner recently shared that he’s scared of general AI arriving in 15-20 years and the consequences killing literally every last human on the face of the planet. He has been a reader of rationalist blogs for ages - while that makes this thought not-improbable, I didn’t know that he took this particular concern of the community so seriously as to feel an internal anxiety over it. My heart hurts - he said he was afraid to tell me because he worried I might think it was weird. He is an incredible partner and I am distressed as to how best to support him, not least because I do not think highly of the rationalist community (which he knows, and which has been a source of confusion and tension for our relationship).

On this particular anxiety: I think it is fringe, yes, but weird, no - no more so than the fears of people who are focused on bee death or rare diseases or other threats that are real, exist in the world, and will likely affect many lives as we move into the future, even if they don't end humankind.

I personally don't find this apocalypse scenario plausible, but I am not an AI/ML engineer (he is) and do not know the future, and I don't want to shame my partner for being scared of this particular possibility.

At the same time, people have been reporting horrible behavior in the rationalist community for years: there is clear hero-worship of figures like Scott Alexander and Eliezer Yudkowsky, multiple people have reported severe mental distress based on the "infohazards" of considering such apocalyptic scenarios, multiple allegations have been made about terrible interpersonal behavior and possible abuse in rationalist sub-circles, and at least one person has died by suicide as a consequence of distress over such behavior.

I am not a fan of many of the boldface names in rationalist circles, and he knows it (there is a great deal of thinly-veiled prejudice, intellectual dishonesty, and tolerance for bigotry in Scott Alexander’s writing and online spaces, for example). We’ve discussed our fundamental values, and I’m confident that we share the things that are important to each of us.

We occasionally come up against a live wire that seems to me to be about rationalism being at or near the core of his identity. Some of our most difficult conversations have been set off by my fear about sexism/racism in rationalist writing - his body responds by physically shutting down with counter-fear. It's horrible for both of us - I want empathy and understanding, but I absolutely do not want physical threat vigilance for him either.

Conversations with this kind of dynamic - I express concern, fear, frustration, or anger, he shuts down with counter-fear - are few and far between, but they were so confusing and left us both feeling so at a loss that I asked if we could see a couples counselor. He agreed, after about a year of my asking gently, and has found it really useful so far (his words). He’s looking for his own individual therapist (I see someone weekly), for which I am very grateful.

So. My partner doesn't seem like a cult member, whatever that means, but he is deep into his rationalist identity (signed up for cryopreservation) and relies almost exclusively on rationalist sources for information (news via twitter feeds or Astral Codex Ten posts), so this latest anxiety feels more concerning than it might otherwise. I've worry-read internet advice about QAnon and cults, but this is nothing as strong as that. I just want to make sure I'm not not-taking-this-seriously-enough, and that I'm not hiding something I should really be open with him about - in both cases because I don't want to hurt him.

Today I asked if he wanted help looking for a rationalist-adjacent therapist, so that he can talk about anything without worrying about being thought weird - and then later worried, But what if a rationalist-adjacent therapist just reinforced these fears, or encouraged him to see me as a power-seeking SJW? I feel like I’m heading in the direction of paranoia, which is precisely what I want to avoid for both of us.

I know there are both fans and detractors of Scott Alexander on MetaFilter - I don’t want to hear rationalism trashed (or defended), just help to figure out how best to support my partner (and myself) in this situation. Advice given with compassion from any perspective will be helpful.
posted by anonymous to Human Relations (22 answers total) 6 users marked this as a favorite
 
As a person who struggles with anxiety:
You cannot reassure someone out of their anxiety. Reassurance breeds more anxiety.

You can't argue them into a feeling of ease.

Acknowledging that I'm an Internet stranger and don't know anything about you or your partner, I can say with certainty that the problem is the anxiety itself.

Anxiety exists in me, and it causes me to latch onto whatever random thing, in a desperate attempt to feel better.

The physical and emotional symptoms of my anxiety (caused by hormonal imbalance and past trauma in my case) make me feel constantly on edge and afraid, and my brain tries to find reasons.

It's like there is an alarm going off inside me all the time, and I try to find reasons for it. I look around and latch onto an external cause to explain the fear. I used to be convinced that my partner was going to die of Covid. I tried to control that situation in all kinds of ways. My attempts to feel safer and control the uncontrollable did a lot of damage, both to me and my relationship with my partner.

It was only when I realised that "the call was coming from inside the house" that I was able to start dealing with the anxiety itself.

Refuting your partner's fears will be a game of whack a mole, because it's avoiding what's really going on.

Obviously I don't know the root cause of their anxiety. Whether they need a change in lifestyle, therapy, medication, or something else. In my case, all of those helped, but the most important step was facing the fact that I needed help.

Good luck. All the best. This is hard.
posted by Zumbador at 10:41 PM on May 21, 2022 [42 favorites]


This is going to be really hard. I think you need to look for a support group of your own, which might better equip you for handling your own feelings first.

For myself, one thread that helped me contextualize this is that, at least in Yudkowsky's case, he's clearly been in deep psychological pain for many years, with consuming guilt beneath the surface over being unable to reanimate his dead brother Yehuda - like a sort of D-student Dr. Frankenstein - indicating to me a trauma he's never come to terms with: http://sl4.org/archive/0411/10322.html

Confronting and constructively managing the fear of death is one way in which I think you might begin to approach this to help both yourself and him, although personally, I think this is VERY similar to a cult situation, based on my interactions with others in the community, and that it may take years to resolve one way or the other.
posted by StrikeTheViol at 11:14 PM on May 21, 2022 [1 favorite]


For convincing arguments that AGI is not what it's made out to be by rationalists, Superintelligence: The Idea That Eats Smart People may be useful. When I've shown it to people who are in as deep as your partner, it hasn't seemed to make a big impact, but it could possibly help.

I will also say that rationalism is absolutely a cult, your partner sounds quite deep in it, and you should recognize that. Q has ideas that are more obviously wacko, but that doesn't mean rationalism is a less serious cult.
posted by wesleyac at 11:33 PM on May 21, 2022 [20 favorites]


Does your partner do a job that they perceive may be automated? I've seen some people tie themselves in knots when they think along similar lines.
posted by unearthed at 12:45 AM on May 22, 2022


I don't know rationalism or AI-apocalypse, but I'm a prepper and this mentality is really familiar to me. It happens to members of a community I help manage.

They'll pick a highly specific scenario (this one is AI, but I usually see "polar shift," "supervolcano," "Russian/North Korean EMP attack") and hyperfixate.

They begin a doom loop of despair, then sometimes they spend time and money to prepare for these very specific things while basic life tasks for likely outcomes go undone. (For example, they'll make Faraday cages, but won't write out a will.)

I think it's a confluence of factors.

Yes, anxiety. You can't argue with anxiety or bat it away or reason with it. You can be kind and supportive, while encouraging treatment and setting boundaries to safeguard your own mental health. It sounds like you're doing a great job of that.

I think it also stems from hopelessness and an inability to visualize the future.

Instead of imagining oneself aging and living out our allotted decades on Earth, there's One Big Event that sweeps everything clean and absolves us of the need to plan. It's a combination of depression and avoidance, perhaps. I'm not qualified to diagnose.

Ultimately, I think this isn't about "rationalism" or AI, it's not a philosophical difference, it's about anxiety and an inability to visualize a future for himself. He found an outlet for those feelings.
posted by champers at 4:08 AM on May 22, 2022 [16 favorites]


I'm rationalist-adjacent and sometimes find myself looping down these AGI/misaligned AI anxiety sinkholes too. Also, climate change. Also, authoritarianism. Also, any number of other things could absolutely annihilate our world's delicate balance more quickly than we imagine.

What helps me is having a solid spiritual footing in Buddhism. Accepting that death is just the inevitable, and potentially even wonderful, next step in it all. I hope humanity doesn't die out in a blaze of misaligned AI catastrophe (or nuclear war. Or climate runaway scenarios. or a pandemic far more deadly than covid. etc.) but if that's what happens, that's what happens. We all have to go sometime.

I would say to your partner, read this book: https://fiveinvitations.com/. Watch this episode of Midnight Gospel: https://www.dailymotion.com/video/x81gjl5. Develop a relationship with spirituality that allows you to be more present right now for your life now today, because it will end, AGI or not. Maybe AGI will never happen, but you'll develop pancreatic cancer in 2 years. Death is always waiting, so get to know it now. I don't know if it's helpful for him, but it was for me. Also, happy to talk to him one on one, if he wants.
posted by namesarehard at 4:19 AM on May 22, 2022 [7 favorites]


Zumbador has hit the nail on the head, imo. Rationalism's issues are a problem, but this is not the core problem for your partner. The core problem is that he is existing in a state of near-constant anxiety and has normalised it by giving it a reason ("it's okay for me to be afraid all the time because it's a rational response to this specific Big Bad Thing that's definitely going to happen"). As champers noted, this happens in all sorts of communities, with all sorts of Big Bad Things that mentally unwell people choose to fixate on in order to make sense of their complicated and painful thoughts.

As well as a therapist, has he visited a doctor? How is his lifestyle outside of being connected to these online communities? Is he getting exercise? How's his sleep, his eating habits, his work stress? All of these things could be contributing to feeling constantly on edge and afraid as his neurochemicals go out of whack. Encourage him to see his GP as well as a therapist. Medication may help in the short term, particularly with his response to these discussions (I assume you're describing anxiety or panic attacks?).

I think you may need to also approach this as an addiction to some degree. He needs to be able to step away from these online spaces and stop relying on them for a constant drip feed of fear and concern. Does he see this as enough of an issue to delete his Twitter account and remove the app from all of his devices? Unless he's willing to disengage from these spaces, he may be stuck in this cycle of having his fears reaffirmed over and over.
posted by fight or flight at 4:40 AM on May 22, 2022 [7 favorites]


Idk what you mean by 'bee death' but all the world's experts agree that climate change and ecosystem collapses will have devastating impacts, to some greater or lesser degree. And we have ample evidence of the havoc of infectious zoonotic disease - any epidemiologist will tell you this can and will happen again, and the next pathogen could easily be worse.

This AI thing is not like the others. Saying so isn't to help him - he won't really listen or understand, because this is about his anxiety, as per Zumbador, maybe with a dash of cultish thinking too. But it might help you to acknowledge that this fear is categorically different from the fears that are backed by the vast majority of the respective leading experts and institutions worldwide.
posted by SaltySalticid at 4:58 AM on May 22, 2022 [3 favorites]


Share this news piece with him. It won't change his mind but it may suggest the world has some time.
posted by tmdonahue at 5:23 AM on May 22, 2022


I was surprised to read that he's a ML engineer - my suggestion was going to be that he talk to one, because I'm a programmer who works with some ML people, and it's my/their understanding that we're hilariously far away from anything that could be considered actual AI. The fact that he works in this space suggests to me that this really needs to be approached as an anxiety problem, rather than as an information problem. He already has all the facts he needs to know that this isn't something he should worry about, and that's clearly not helping, so it seems unlikely he'll be persuaded out of this belief. My suggestion is that he look into Acceptance and Commitment Therapy. Since he probably can't be persuaded that this isn't a real possibility, his first step should be figuring out how to live with that possibility.
posted by Ragged Richard at 5:56 AM on May 22, 2022 [24 favorites]


I agree with Ragged Richard. I don't work in the space myself but am good friends with many AI and ML PhDs and have worked on projects with them. This is not a rational fear, despite the "rationalist" name. He needs help for anxiety.
posted by branca at 6:16 AM on May 22, 2022 [5 favorites]


I think you might want to look into Dr Steven Hassan’s work. I don’t know anything about rationalism but I do know a bit about doomsday cults, and this sounds like one. If so, the anxiety is not necessarily biochemical - it’s being induced by the cult as a means of control.
posted by warriorqueen at 6:33 AM on May 22, 2022 [2 favorites]


Would it be possible for him to take a break from the rationalist message boards and websites? I know he gets all his news there and I know from my own experience with depressing online spaces that it is a tough habit to break, but I also know that even a couple of days away from the twitters can really, really boost my mood.

IMO as an anxiety-haver, it's not just that I "have" anxiety like I would have a cold, where I'm going to have the cold whether I lie in bed or do the dishes. I have the potential for anxiety, and repeated exposure to really depressing stuff potentiates the anxiety. If I stop thinking about the really depressing stuff, my brain rapidly starts getting out of the depressing-stuff paths.

It's easy to think "oh but if I don't keep up with the really depressing stuff I [will be ill-informed or ill-prepared or lose my identity]" and therefore hard to move away from the depressing stuff but even cutting back really helps me. Like, I got a new and very busy job where I cannot read the internet as much, and lo, my anxiety really reduced. Twitter can still make me extremely anxious and depressed, but I can't be on twitter as much, so I'm not anxious and depressed as much.

I feel like anxiety itself isn't a habit, but you can build habits that increase or decrease your anxiety markedly. Reading and thinking about AI stuff becomes a habit, just like gardening or being a really active Star Wars fan becomes a habit - only this one is terrible for you.

The older I get, the more I realize that all that 19th century bullshit about "take a long sea voyage" or "Mrs. Rich Lady had a bad shock and the doctor ordered her to the seaside for her health" is actually slightly right - treating the symptom and not the cause, okay, but slightly right. New ideas, new places, new interests can help create new brain habits.
posted by Frowner at 7:54 AM on May 22, 2022 [10 favorites]


Also, as an anxiety-haver, I have over the years learned to say to myself, "I cannot fix this large problem, nothing I do will fix it, so I am going to [garden, cuddle with the cat, have a snack, look at something else on the internet]". There are all kinds of pretty dark future things I believe are very likely (not really world-ending AI), and I can't do a goddamn thing about them, so lying in bed crying and shaking does not help. This, again, does not really get at why I'm an anxious person - childhood trauma, etc. - but very often I can start concentrating on something else, and if you do that frequently enough it really does reduce the anxiety.
posted by Frowner at 7:57 AM on May 22, 2022 [3 favorites]


I'm adjacent to the movement - one of my 'exes and still friends' has two people important to them in the Scott Alexander inner circles (like day to day interactions with him). And they (the ex) do describe it as a cult.

And I am not an ML/AI expert but I delve into it. I've worked with several "powered by Watson" technologies and they've been crap. We've been promised self driving cars in "a few years" for quite a while but they can still barely avoid pedestrians. The idea that general purpose AIs are going to be viable, let alone dangerous, in 15-20 years is laughable. If anything is going to destroy humanity in that timespan, it's going to be us.

I agree with Zumbador that you can't rationalize someone out of an irrational position.

My best suggestion is to try to get them into therapy for anxiety but to also set a bug out point for your own safety/sanity. Just like a ship going under, someone like this can drag you down as well.
posted by Candleman at 8:08 AM on May 22, 2022 [2 favorites]


I agree you cannot argue someone out of anxiety (I should know since that is part of my make-up). But I also like to think I am a thoughtful, rational, intelligent person. I also agree with Einstein who said "Only two things are infinite, the universe and human stupidity, and I'm not sure about the former."

People have accosted me for not being more vocal, worried and concerned about climate change (citing that I have grandchildren). I am most definitely concerned about climate, WMD, very old and heartless bastards controlling our government, SCOTUS, the Electoral College, more pandemics, the abuse of the 2nd amendment, racism, banning books, micromanaging doctors and school teachers, etc. You think "he didn't list AI" but I consider AI, if misused, to just fall within the WMD umbrella, or maybe there's an argument that it's the ultimate fascist state. It reminds me of the movie "Equilibrium" where the character played by Christian Bale was eventually saved by his own children. They are our only hope.
posted by forthright at 8:14 AM on May 22, 2022 [1 favorite]


Anxiety is much worse when it feels like there's nothing you can do about the scary thing. But there's always something you can do: you can work on yourself to develop equanimity, courage, wisdom and balance. As namesarehard points out, it's not very rational of your partner to waste a lot of time pointlessly stressing over a 20-year scenario when he could get sick and die five years from now.

As a cognitive approach, Stoic philosophy, a premodern small-r-rationalist tradition, seems to resonate really well with some members of the rationalist community. I find Marcus Aurelius's Meditations to be very helpful for anxiety, in particular. Someone on LessWrong also posted a quite good sequence on virtues that he might find interesting, particularly the posts on courage and hope.
posted by Bardolph at 8:38 AM on May 22, 2022 [3 favorites]


I don’t think you’re taking seriously enough the part of yourself that’s hurt that he follows misogynist thinkers! If I were you I’d identify my line in the sand and be prepared to exit if he crosses it.

Today I asked if he wanted help looking for a rationalist-adjacent therapist, so that he can talk about anything without worrying about being thought weird - and then later worried, But what if a rationalist-adjacent therapist just reinforced these fears, or encouraged him to see me as a power-seeking SJW?

Yes, it sounds like he’s looking for a therapist that will validate the areas of his thinking that hurt you.
posted by kapers at 10:31 AM on May 22, 2022 [11 favorites]


I haven't seen this addressed above, but I'd be pretty hurt if my partner planned to be cryogenically revived in the far future, but I wasn't a part of it :( I'm not saying this to be funny, it's a detail buried in your question that seemed to me to suggest a world of painful and unhappy unevenness in this relationship. I'd draw a line in the sand over who comes first, his hurtful Intellectual Dark Web heroes or you.
posted by johngoren at 12:34 PM on May 22, 2022 [3 favorites]


Context/Disclaimer: I'm currently employed as an "artificial intelligence" researcher, studying learning theory and ethics for AI systems. I have a PhD in computer science and lots of anxiety.

To Ragged Richard, I think I can explain why an AI/ML engineer would be suffering from anxiety with this particular theme. The following are simultaneously plausible:

1. Modern AI is nowhere near being general/capable enough to threaten humans with extinction.
2. Right now, technologies called "AI" are doing real harm. They can dramatically accelerate and reinforce existing inequalities.
3. There is a tiny but real chance that modern AI could become capable of threatening humans with extinction soon-ish.

I know engineers and some researchers who displace their anxiety about (2) into (3) by engaging with the Rationalist community. It seems like a coping mechanism for the guilt of being well-compensated for building and deploying technologies that are net-negative for society as a whole.

Sometimes, you can reach people by talking about the relative probability of (2) and (3). Often, they are ambivalent or unhappy with the impact of their own work in AI/ML, or feel that they should be doing more to correct the immediate harms associated with rolling out these technologies. I found the book Against Purity helpful for developing strategies to promote social goods from a place (AI/ML research and development) that suffers from many toxic systemic biases.

This is not to say that nobody should ever worry about (3). I do sometimes, in the course of basic research on learning theory. But said worry is entirely abstracted and theoretical, because the probability that runaway AI will threaten all of humanity seems tiny (in my professional opinion). (3) is more a thought experiment than anything else.

On the other hand, my worries about the immediate harms done by systems called "artificially intelligent" give me full-blown anxiety that I am in therapy for. I've learned coping mechanisms, but the only thing that really helps is working on projects that are trying to mitigate harms done by "artificial intelligence" in the here and now.

So... maybe discomfort with his own work or the field as a whole is what's driving the AGI-themed anxiety; this is not uncommon in my experience. Frankly, I think we in the field should be deeply disturbed by our overall impact, and have a responsibility to try and course-correct.

If I were you, I would ask (in therapy, maybe) what he thinks the impact of his industry is on real people, right now. See where that discussion goes?
posted by kitten_hat at 1:22 PM on May 22, 2022 [9 favorites]


What's he doing for or against climate change? Thinking about a thing isn't acting on it, so going outside and doing something concrete might make for a stark reality check.

I think everyone's assessment of risk (self included) is terrible because it's a balance of what we don't know but also need to predict. If I could compel your partner to think the unthinkable and had unending patience, I'd get them to face down the horrible outcomes and put detail on every one of the scenarios so that critical junctures can be identified and mitigated or written off as unlikely.

(One scenario earned an unspeakable name. The scenario with the name R___'s B_______ is a variant on Pascal's Wager where you're induced to bet on a thing you can't control for a payoff you don't receive. Ask your partner "what is the Liskov Substitution Principle?" for a distinction between the edge or interface a program presents to the world and the implementation in program code behind it. There will always be a range of programs that respond the same to identical inputs despite having any number of implementations doing the work. That's to say that you're reading the public performance of K3ninho, and a malevolent AI could make an exhaustive imitation based on how all this K3ninho noise exists in the world -- but it would not be this K3n.)
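To make that interface-vs-implementation point concrete, here's a minimal Python sketch (the class and function names are illustrative, not from any real library): two implementations that respond identically to every input, so a caller at the interface can never tell which one is doing the work.

```python
# Two implementations behind one interface. An observer that only
# sees inputs and outputs cannot distinguish them.

class IterativeSum:
    """Sums 0..n with a loop."""
    def total(self, n: int) -> int:
        acc = 0
        for i in range(n + 1):
            acc += i
        return acc

class FormulaSum:
    """Sums 0..n with Gauss's closed form - a different implementation."""
    def total(self, n: int) -> int:
        return n * (n + 1) // 2

def observe(impl, n: int) -> int:
    # The caller touches only the interface (the .total method),
    # never the implementation behind it.
    return impl.total(n)

# Identical observable behavior on every input we try:
assert all(observe(IterativeSum(), n) == observe(FormulaSum(), n)
           for n in range(100))
```

The point of the analogy: an imitation that matches every observable output is still a different implementation underneath.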

Enumerating these scenarios will probably require professional help, given that they may very well decide you can't help - that the problem is somehow intractable because of their sense of superiority: they're a rationalist, and you wouldn't comprehend. So talk about compassion and empathy. And hold on to the consequence of logic that you share the same computational architecture in your skulls, such that whatever thoughts they think, you can think too in due course.

Breathing-focused meditation and in-the-moment mindfulness are still useful tools to try for anxiety. Your wellness and healthcare providers will have a starting point, especially if your partner is earning data science/machine learning money.

(As someone who left a marriage in mid-30s that was unhelpful for both of us, I offer the thought that there are some unwinnable battles, however much love and diligent effort you put in. While you still have breath in your body and blood pumping round your veins you still have choices and options, such as declaring your present circumstances as a learning exercise that readies you for a different relationship.)
posted by k3ninho at 2:05 PM on May 22, 2022 [2 favorites]


If he has an anxiety disorder then even if he stops worrying about AI, his anxiety disorder will just find a new thing to fixate on.

When I was having severe anxiety I thought going outside was the problem so I stopped doing that. Then I thought the phone ringing and startling me was the problem so I put it on silent. Then I couldn't open my mail. Etc.

Personally I found 1mg/night of klonopin to be life-changing. I felt better within 3 days and was cured within a week. Then once I was functional again I fixed my life so that it was less stressful, then tapered off the klonopin.

My brain had built new pathways and the agoraphobia etc. did not return until years later when I was under great stress again. Fortunately I was able to recognize the signs this time and went back on the 1mg/night klonopin again. Fixed the source of the stress, tapered off again.

Brains can get stuck in anxiety spirals and temporarily going on medication is a great way to break out of that spiral and form new brain pathways. And anti-anxiety meds generally work pretty fast (they don't need months to build up in your system like anti-depressants) so you know within a week or so if they are working for you. In my experience, they eliminate the irrational anxieties so that you are able to more clearly see the real problems in your life.

I would suggest to him that he try going on anti-anxiety meds for 2 weeks and see if he still feels the same. It would be a good test for whether his intense fear of AI was arrived at rationally or if he just has an anxiety disorder and that's what it chose to fixate on.

Given that most people in his industry aren't terrified, what's more likely: That he alone has seen some danger that all the experts have missed? Or that his brain is misfiring? If he's as rational as he thinks he is, he'll have to admit that odds are it's the latter and it's certainly worth a 2 week experiment to check.
posted by Jacqueline at 8:17 PM on May 22, 2022 [4 favorites]


This thread is closed to new comments.