The structure of ideological abuse
Standing on your own two spiritual feet
[I have freelanced for a number of effective altruist organizations, such as the Centre for Effective Altruism and 80,000 Hours. I don’t speak for any of my clients, past or present, and they didn’t look at this post.]
[Previously: Why join high-demand groups?; Identifying healthy high-demand groups]
This is the third in a series of posts about the book Abuses in the Religious Life and the Path to Healing by Dysmas de Lassus. You don’t have to read the first two posts before you read this one. In this post, I will talk more about de Lassus’s definition of spiritual and ideological abuse.
When talking about de Lassus’s ideas, I will use the term “spiritual abuse”, as he focuses on religious communities. When talking more broadly, I will use the term “ideological abuse”, which doesn’t assume that the abusive community is religious in nature.
What Is Ideological Abuse?
According to de Lassus, the thing that defines spiritual abuse is an inappropriate intrusion onto the internal forum.
The external forum is all matters having to do with society and the public good. For example, the external forum includes whether you go to church, what job you work, how much you donate to charity, and whether you’re legally married. The internal forum has to do with your inner thoughts, especially your conscience. For example, the internal forum includes anger, generous impulses, a conviction that a course of action is morally wrong, and the impulse to refresh X two hundred times a day.
An ideologically abusive community might interfere with the external forum in various bad ways: for example, they might command you to cut off your family, to fast when it would make you sick, or to donate an unsustainably high percentage of your income to buy the pastor a third Lamborghini. But the root of the problem—the reason you don’t just tell them to fuck off—is that the community is interfering inappropriately in the internal forum. They’re exercising control of your beliefs, preferences, and conscience in a way that they don’t have a right to exercise.
Inappropriate intrusion into the internal forum can cause serious mental health problems. De Lassus quotes a spiritual abuse survivor describing their experiences:
I came little by little to feel like a dead tree, like a tree that had had its branches cut off. After the branches have been cut off, there is only the trunk left, and when I spoke of things like this, they said to me: “That is really wonderful; this means that you are really being pruned for Heaven.” So I said to myself: “It must be normal, then, but it seems strange that I feel less and less alive in terms of this world.” But they said to me: “It is completely normal; it is the Cross. Look at the wood of the Cross; it is like a dead tree, and yet it is this that gives life.”
Typical mental health effects of spiritual abuse, according to de Lassus, include the usual nonspecific depressive symptoms: insomnia, psychosomatic illnesses, neglect of personal hygiene, self-harm. Spiritually abused people may work too much so that they can cling to the scraps of approval they get from the community. On the other hand, spiritually abused people may behave in a harmful, coercive, or pointlessly defiant way as a desperate attempt to maintain some control over their lives.
Catholic spiritual abuse victims often refer to spiritual abuse as “a violation of the chastity of the inner self” or as “a rape of the heart.” The former is a metaphor that secular people are less likely to use, but I think it is evocative of the problem. Just as sexual abuse is a violent intrusion on a private, intimate part of the body—something people want to share only by choice, and normally only with people they love and trust—spiritual abuse is a violent intrusion on a private, intimate part of the mind, which people likewise want to share only by choice, and normally only with people they love and trust.
Many communities are supposed to intrude onto the internal forum. A Catholic monastery is supposed to strengthen your love of God and neighbor; if you spent decades in a monastery and were just as much of an asshole as you were before, you’d want a refund. Likewise, introductory effective altruist material is supposed to make you have higher ambitions, think with numbers, and care a similar amount about all strangers. The rationality community is supposed to make your beliefs more accurate and your plans more likely to achieve your goals. But when a community is supposed to intrude on the internal forum, we should be cautious to make sure that the intrusion stays within proper bounds.
I’m going to talk about the concept of the “linchpin”, then I’m going to talk through the three themes that de Lassus considers most common in spiritual abuse.
The Linchpin
In ideologically abusive communities, everyone is both a victim and a perpetrator. Everyone has complaints that they’re hiding, because they know their complaints are a sign that they’re weak and bad and sinful, and everyone is being cruel and judgmental to others for having ordinary human weaknesses or needs.
Why? In a small-group abuse situation, such as a family, the perpetrator can keep everyone in line with threats, guilt, and gaslighting. But if she wants to abuse dozens or hundreds or thousands of people, even the most determined perpetrator will need to delegate. She needs abuse middle management.
Since it’s impossible for a single abuser to enforce ideologically abusive dynamics all by herself, she needs her victims to do it for her. And she also needs to keep the enforcers in line themselves, so that they don’t start getting any bright ideas about improving the situation somewhat. In short, ideological abuse needs the cooperation of the victims.1
That’s what makes ideologically abusive communities so fiendishly hard to untangle. Realizing how bad the group is means realizing that you’ve done evil. People naturally flinch away from that.
While everyone is a perpetrator, usually a small “inner circle” is particularly responsible for negative dynamics in the community. De Lassus calls this the “linchpin”, the center of the community.
The linchpin isn’t the same as the colloquial idea of a cult leader. While the linchpin sometimes has a position of power in the community, they often don’t. Sometimes they’re an ordinary member or even (in religious life) a layperson. By the same token, decentralized communities that don’t have a leader may well still have a linchpin or three.
It’s also a mistake to assume that the linchpin is, like our stereotype of a cult leader, giving dramatic speeches, demanding obedience, verbally abusing people, or threatening violence. In my experience, some of the most powerful linchpins never raise their voices. A linchpin may present herself as a weak and helpless victim, someone the community should take care of. Or a linchpin may make herself ever-present and useful in small humble ways; never underestimate the power of being the person who handles the logistics and takes the minutes at the meetings. Or—especially in the modern day—a linchpin may simply be a popular writer or thinker, and everyone might believe that they were simply convinced by the quality of the linchpin’s arguments.
What makes a linchpin is that they’re the community’s “single point of reference.” The entire community winds up warped around the linchpin’s whims. The primary goal of the community becomes protecting the linchpin’s feelings and ensuring that the linchpin is never contradicted. Often, community members become obsessed with the linchpin’s health, their activities, and the state of their soul.
If the linchpin falsely accuses someone of some wrongdoing, the accused can wind up confessing their sin, even without obvious coercion and even while knowing inside that they’re innocent. To maintain their own innocence would be unthinkable. If the linchpin said something is true, it must be.
To identify linchpins, a beta reader suggests asking, “who always gets their way when there’s a win/lose situation and why?” To be sure, sometimes one person might legitimately keep winning conflicts—maybe she has good judgment, maybe she hosts the events, maybe she’s disabled in a way you need to accommodate, maybe she’s three months old. But if there’s no good reason for someone to always win, the community is likely warped around making them always win.
The linchpin is typically placed above every rule. If everyone else is an ascetic, the linchpin lives a life of luxury. If everyone else has taken a vow of poverty, the linchpin has a personal bank account. If everyone else is separated from the world, the linchpin can talk as much as she likes with outsiders. If everyone else’s Internet use is limited, the linchpin can spend twelve hours a day on social media. If everyone else has to attend services and prayers several times a day, the linchpin can skip whenever she wants.
Often, you’re not supposed to notice or acknowledge the ways that the linchpin is breaking the rules, and you’re definitely not supposed to write it down somewhere so you can check how often she does so.
The linchpin may make a point of all the rules she follows that don’t bother her very much. De Lassus writes:
A certain hypocrisy can even be established. In one new community (which, incidentally, has not survived), the religious were forced to live in the harshest conditions, sleeping in dormitories in a poorly-heated building, and having no form of private life, while the founder was leading a pleasant existence, living alone in a separate building, where he had the benefit of all necessary comforts — and maybe a few more besides. But out of humility, he used to celebrate Mass barefoot.
Again, while some linchpins make the argument (at various levels of explicitness) that they’re in charge so they deserve to be treated better, not all linchpins will. A linchpin may well demand special treatment because she has been so oppressed and victimized in the past.
Do all ideologically abusive groups have a linchpin? No. An ideologically abusive structure can perpetuate itself without anyone particularly wanting it to keep going. If everyone is a victim and everyone is a perpetrator, you don’t need an abuser for a group to be abusive. A truth that is difficult to talk about is that many people want to be in ideologically abusive groups. As I wrote in Why Are There So Many Rationalist Cults?:
But people who are drawn to the rationalist community by the Sequences often want to be in a cult. To be sure, no one wants to be exploited or traumatized. But they want some trustworthy authority to change the way they think until they become perfect, and then to assign them to their role in the grand plan to save humanity. They’re disappointed to discover a community made of mere mortals, with no brain tricks you can’t get from Statistics 101 and a good CBT workbook, whose approach to world problems involves a lot fewer grand plans and a lot more muddling through.
Many people want to submit their internal forum to a charismatic authority instead of taking responsibility for their own emotions and moral life. A dynamic both de Lassus and I have noticed is the charismatic leader who doesn’t seek out being a linchpin, but who is treated as a linchpin by her followers.
That said, a group without a linchpin will tend to acquire one. People who want to be linchpins will take advantage of a linchpinless ideologically abusive group. And—while virtuous leaders can dissuade their followers from treating them as a linchpin, albeit with mixed success—many will be seduced by the praise and power. If everyone you talk to thinks you’re right about everything and is trying to cater to your every caprice, you don’t have to be a very bad person to give in.
Claiming Authority Over Conscience
The first theme of spiritual abuse we’ll discuss is claiming authority over conscience. “Claiming authority over conscience” means replacing someone else’s moral judgment with your own. When someone has claimed authority over your conscience, instead of making your own moral choices and following your own sense of what is right, you unquestioningly obey the authority—or are shamed and punished when you don’t.
Even outside high-demand groups, people might want you to obey them sometimes. For example, your manager wants you to prioritize a particular project or design the website the way she wants; the Discord moderator wants a particular topic of conversation moved to a thread, or dropped entirely; and the local-group organizer wants the meetup to happen at a particular time, in a particular location, and with particular activities. But they shouldn’t demand authority over:
Your beliefs (“if we design the website this way, the users will be confused”).
Your preferences (“arguing about whether trans women are women is both fun and enlightening”).
Your judgment (“all things considered, the llama pen at the petting zoo is a terrible place to hold a meetup”).
You should always be allowed to have your own preferences, beliefs, and judgment. You should never be required to replace them with community consensus or the views of the linchpin.
Now, beliefs and judgment can get a bit complicated. Even in low-demand communities, people aren’t socially allowed to believe whatever they want: in almost every community in the Anglosphere, if you start talking about racial differences in IQ or how six-year-olds can consent to sex, you’ll lose all your friends. For de Lassus, the distinction is easy: your religious community can command you to believe the teachings of the Catholic Church, but shouldn’t command you to believe anything else. For people who don’t believe the Holy Spirit guides the Catholic Church towards truth, the line between normal social norms and claiming authority over conscience can be very blurry. Here are some guidelines I find helpful:
It is a green flag if your community punishes basically the same beliefs the wider culture punishes (e.g. Nazism, pro-pedophilia activism).
It is a green flag if your community punishes very few beliefs, and allows widespread disagreement on most topics.
It is a green flag if you basically know what beliefs your community is going to punish: they don’t change over time, or expand or contract with the whims of the linchpin.
It is a green flag if popular people and unpopular people are held to the same standards.
It is a red flag if you aren’t allowed to disagree about practical issues (such as webpage design, how best to get people to pick up after themselves, or cabbage-growing techniques).
It is a red flag if you aren’t allowed to disagree about how best to accomplish the purpose of the community (e.g. which methods are most effective for persuading people, whether vaccinations are more cost-effective than malaria nets, whether you should work for a frontier AI company, or what meditation techniques best lead to enlightenment).
I think it is inherently toxic for your community to pressure you to want a different thing than you want. You can decide to try to change your preferences, but that’s a personal matter that your community should have no say over. You might choose not to pursue your desires (e.g. I choose not to eat eggs), but no one should stigmatize you for wanting something (e.g. I still like egg-containing baked goods).
A second example of claiming authority over conscience is requiring people to share their interior life—their feelings, thoughts, preferences, temptations, hopes, fears, etc.—when they’re not comfortable sharing it.
Let me be clear here. It’s important to have someone to talk to about your interior life. Many people do have to have a general sense of what’s going on with you: for example, your boss or housemates might need to know, in general terms, that you’re going through a hard time, so that they know how to support you and when to cut you some slack. And if you want to be a good person, you need guides and companions on the path: people who share their experiences so you don’t feel so alone, whose disappointment you can imagine when you’re struggling to hold yourself to account, who help you problem-solve when you’re stuck, and who call you on your shit.
But, fundamentally, opening your heart to someone does no good, and often rather a lot of harm, if it’s someone you didn’t choose or don’t feel safe around.
You must never require someone to share their inner life with you without their free, full, informed consent. That is, you shouldn’t get people to share their inner life with you by ordering them, manipulating them, pressuring them, or using your power over them to punish or reward them for it. Indirect pressure is still pressure: you shouldn’t go “you don’t have to share your interior life with me, but if you don’t you’ll be mediocre forever.” It is even questionable to tell someone that they ought to share their interior life with you, outside of some limited situations like marriages or therapy where someone with free, full, informed consent chose to commit to a certain level of emotional intimacy.
Lots of things, like finances, can and should be required to be transparent. But it is never acceptable to demand that a person’s thoughts or conscience be transparent. You should feel able to keep some things private, even from your spouse or therapist.
Inappropriately requiring people to share their inner lives leads to people sharing other people’s confidential information. De Lassus writes:
In the context of an openness that has been forced, indiscretion generally becomes the rule, springing from a feeling of having rights over the other person’s inner life: I have the right to know everything and the right to do what I want with what I know. In a pyramid structure, the violation of confidentiality is, moreover, part of the structure itself, given the requirement to relate everything to the head of the community. Intermediaries have to pass everything on, so nothing can be kept confidential. Experience shows that if someone lives long enough in this kind of atmosphere, the very meaning of confidentiality is lost, and many people are unable to keep themselves from saying what, deep down, they know they should be keeping to themselves. Even the seal of the confessional may be broken.
Another example of claiming authority over conscience is coercing people into entering a high-demand community or not letting them leave. No high-demand community is right for everyone, or even for most people. A healthy high-demand community actively rejects people who are a poor fit. Catholic monastic communities formally reject people’s application for membership. More informal communities might need a social norm of going “this seems really bad for you. Do you want to go somewhere else?”
Each individual must freely choose to be part of the community. It is very bad for a religious community to say to a candidate “God is calling you to be in our community,” because that means that the monastic life isn’t a free choice. Similarly, it is wrong to try to convince someone to stay in a monastic order if they feel called to leave. It is wrong to assume that someone’s desire to leave is a sin or a temptation from the Devil. You definitely shouldn’t attempt to cast out the demons that are making someone want to leave (!).
Similarly, in secular communities, people should feel social permission not to join a community and to leave if it isn’t working out for them. Leaving shouldn’t mean losing all your friends in the community. Maybe you aren’t invited to the effective altruist meetup anymore if you’ve left effective altruism, but you should still get invites to birthday and New Year’s Eve parties. No one should feel pressure to cut someone off solely because they left the group.
In general, the exit ramp should be comfortable. You can quit your job at the effective altruist organization and stay in the effective altruist group house; you can leave the effective altruist group house and still come over for dinner sometimes.
Poisoning the well against other communities is a red flag. Often, I’ve noticed, an ideologically abusive community will acknowledge that other legitimate communities exist—in theory. But in practice Alternative A is mediocre and uncommitted and won’t accomplish anything, and Alternative B is racist and misogynist, and Alternative C is run by a secret pedophile cabal, and Alternative D is just cringe. Healthy high-demand communities, on the other hand, know about and refer people to alternatives—whether that means the Carthusians telling people about the Jesuits or the Astral Codex Ten meetup telling people about TPOT.
No high-demand community is the only source for anything. You can find other groups with the same traits and goals: Catholics have many different religious orders; groups ranging from the Gates Foundation to GAVI fight global poverty without being effective altruists; rationalists are only one kind of weird nerd. You can also pursue important goals on your own: you don’t need to coordinate with the AI safety movement to get a job in the civil service or a think tank and advocate against AI risk; you can meditate on your own or practice solitary paganism or pray to God without an intermediary. Pursuing goals on your own is lonely and relies a lot on your good judgment, but it’s a hell of an improvement on being abused. (Hint: ideologically abusive groups are rarely particularly good at achieving their goals in the long run, so they’re not hard to outperform.)
It is a horrible, horrible mistake to think you have to hang out with some specific group of people to achieve any goal. Any high-demand community that says the choice is them or nothing is lying.
Secrecy about the community’s teachings is also a red flag. De Lassus writes:
One sister who asked the question [about why the Rule was hidden] received this explanation: “Because the rule is so sublime that we can’t understand it before we make our profession, i.e., before we have been properly initiated. Otherwise, we would be overwhelmed by its sublime nature, and as a result utterly discouraged when confronted with our own nothingness.”
If you don’t know what a community really teaches or what its norms really are, you can’t informedly consent to join it.
People don’t like to admit that a lot of people want others to claim authority over their conscience. Self-determination is scary. If you have to decide for yourself what’s good and right, you might be wrong. It’s much easier to submit yourself to someone else, knowing that if it all goes terribly then you’re not at fault. Personal responsibility is perennially unpopular, which is one of the major reasons people seek out ideologically abusive communities.
Demanding Total Self-Abandonment
“Total self-abandonment” means you are no longer an independent person. You are supposed to act as the hands and mouth of another person, whom you are completely loyal to and who knows everything about you.
In some ideologically abusive Catholic communities, religious take a vow to trust their superiors. The Legionaries of Christ went so far as to vow never to say or write anything that would “discredit the person of the superior or his authority” and to report anyone who did so.
This is very bad. No one should ever try to force you to trust them unconditionally. Trust is earned, not required.
De Lassus has to be fine with vows of obedience because he’s Catholic and they do those (and vows of obedience do serve as a drama reduction measure). In a secular context, you obviously need to obey people in some circumstances. Your boss can expect you to make the work product to her specifications, even if the specifications are very stupid. The meetup leader or owner of a coworking space has the final say over what behavior is acceptable at the meetup or coworking space.
But in a secular context obedience is narrowly scoped: the meetup leader can’t forbid you from dating someone or hanging out outside of the meetup; your boss can’t rule on your hobbies or friendships. And, if someone’s judgment is very bad, you can always quit your job or stop attending the meetup.
In the secular world, people rarely take explicit vows to trust group leaders. But social pressure can serve to force people to unconditionally trust high-status people or the linchpin. Someone might be a “high-level rationalist” or a “core effective altruist”, so it seems like her judgment and ethics must be better than yours. If you disagree with her about a course of action, you are at fault.
I don’t mean that you should never defer to people who know more than you. If an Anthropic employee corrects me about some detail of how large language models work, I’m going to assume they’re right and I’m wrong. But my deference is narrowly scoped: I don’t assume that Anthropic employees are right about healthy vegan diets or the best methods for preventing burnout. I remain open to the possibility that the Anthropic employee is mistaken. I don’t defer about questions of values (“how bad would it be if humanity went extinct?”) or complex and uncertain areas where experts disagree (“when will we develop AI?”, “how should we best align AI?”). And as I learn more about AI, I’m less likely to defer and more likely to rely on my own judgment.2
Disagreement, criticism, and people following their own consciences are our best protection against groupthink. And ultimately, as a person with agency and inherent dignity, you need to rely on your own judgment and your own sense of morals, not outsource them to some authority. Total self-abandonment is one of the worst ways you can wrong yourself.
Sometimes people try to make it impossible for you to notice that you disagree with them or think they’re asking you to do something morally wrong. Not to be a stereotypical Quaker, but I can’t overstate the importance of quiet, unscheduled time—by yourself or with a trusted friend—in which you can think through whether what’s being asked of you is stupid or evil. Often, you’d notice that quiet voice inside you that says “this isn’t quite right” if you had a moment by yourself to think—so the ideologically abusive group will make sure you never have a moment by yourself to think. Constant busyness is an underrated red flag of ideological abuse.
However, total self-abandonment doesn’t have to come from pressure to trust any specific person; it can come from seeking uniformity rather than unity.
People in a high-demand community need to behave in the same way, which is different from how other people behave. All religious take vows of poverty, chastity, and obedience. Each individual order has its own demands: Trappists are very communal, while Carthusians spend a lot of time alone. And each individual abbey has its own culture, its own “feel.” Similarly, effective altruists talk, think, and behave in ways which are different from people who aren’t effective altruists. Each workplace, coworking space, meetup, and group house acquires its own unique culture and “feel”.
This is called “unity” and it’s good. Becoming more similar to other people is a natural part of being in any community, even a low-demand community.
At the same time, normal people don’t take up the religious life or join any other high-demand community. People in high-demand communities are weird; they aren’t all the same kind of weird, either. A good high-demand community embraces people’s weirdnesses. Uniformity is a Procrustean bed where you cut off bits of people to fit.
The task is to learn to be self-effacing without destroying yourself:
You might be asked not to impose your preferences on others; you shouldn’t be asked not to have preferences.
You might be asked to be okay with someone else being chosen for a job that you think you’d be better at; you shouldn’t be asked to falsely believe you’re bad at something you’re good at.
You might be asked to tolerate other people thinking differently; you shouldn’t be asked to think the same way as them.
You might be asked to participate in community recreation you don’t really enjoy but which other people do; you shouldn’t be asked to like things you don’t like.
You might be asked to explain why you don’t agree with the decisions of leadership; you shouldn’t be told, “I feel like you don’t agree with my decisions. You must say to me that you agree.”
Often, in spiritually abusive communities, having divergent preferences is itself seen as sinful or as the product of a deliberate action from Satan. Among rationalists and effective altruists, it may be seen as irrational, unaltruistic, or unagentic. In other groups, it may be seen as misogynist, racist, woke, untrad, unenlightened, unspiritual, or manifesting negativity.
Unity isn’t about getting rid of your preferences and replacing them with better ones. Unity is about learning to listen, to respect others’ preferences, to see the good in what other people like, and to figure out what is best for everyone. If your preferences change in this process—if seeing the good in Gregorian chant makes you appreciate it, for example—that’s good. But you can respect other people’s love of Gregorian chant while continuing to dislike it yourself.
Uniformity, on the other hand, is about taking the unique person you are and squeezing you until you fit someone’s predetermined idea of what a Moral Person or a Good Group Member looks like. Ultimately, the goal is to reshape you into a good tool for the linchpin’s or the group’s purposes—a tool that has no inconvenient traits that might get in the way of working out the will of those in authority.
Spiritual Doctrine
“Spiritual doctrine” is the overspiritualization of ordinary situations.
Spiritually abusive groups are obsessed with finding the will of God in everything. De Lassus explains that God doesn’t have an opinion on whether you wear a sweater. He gave you a brain so that you could figure out for yourself whether to wear a sweater. The linchpin, however, very well might have an opinion on whether you wear a sweater. So doing the will of God becomes doing the will of the linchpin.
Spiritual doctrine can also be used to justify various sorts of wrongdoing. De Lassus gives a few examples of the use of spiritual doctrine to justify sexual abuse:
Jacques Poujol gives several examples of this: “See it as a blessing from God”; “God has told me that it is a good thing; our love is special”; “It is a privilege for you that I have chosen you”; “By loving me, you are loving Jesus.” Acts that are ambiguous or even frankly sinful are presented as an expression of God’s love, which needs to be incarnate if it is to reach people. You need this if you are to discover God’s love.
Presenting the abuse as something mystical also provides the easiest way to justify keeping it secret: What happens between you and me is a special grace, which others would not understand. This is a simple way of presenting immoral violence as the summit of union with God. Moments of doubt may spring up, but at such times, the victim may be reassured by the notion that this is an exception to normal rules: “It’s allowed for us. The grace that we are living is so high that it sets us above ordinary laws. The degree of love that we have reached allows us to do anything.”
At first, it might not seem obvious how the concept of “spiritual doctrine” applies to nonreligious ideologically abusive groups, which have no God to seek the will of. But I am reminded of the words of one of my interviewees in Why Are There So Many Rationalist Cults?:
“It’s a story in which they matter and in which it is justified for them to do weird stuff and stand up for themselves,” said an interviewee familiar with the Zizians. “Every action has great meaning, and that hooks into people in two ways. One of which is that it’s empowering, and the other of which is that it’s a great trigger for becoming obsessed with whether you’re a bad person.”
He continued, “It makes it easy for small things to seem very big. And I think it also makes it easy for big things to seem sort of the same size as small things. When you get pulled over and then you get in a gunfight with the cops or whatever, this is the same level of treating the situation like it is anomalous or a big deal as having an argument about who washed the dishes.”
Spiritual doctrine makes small things big and big things small. Wearing a sweater is a matter of obedience to the Perfect Will of God; being raped is nothing compared to the love and grace of God.
Any community with unusual beliefs is going to disagree with the mainstream about what things are important. “Not eating eggs” used to be a major moral self-improvement project for me, and it has now been replaced by “send networking messages” and “tell people they can hire you for freelancing work.” But the community has to agree that some things are unimportant. There must be some issues about which the teachings say nothing, and which are simple matters of personal taste.
If everything is important, nothing is.
Similarly, a high-demand community must take care to believe true things about the world. As I’ve written in the same piece:
But if you’re using your decision theory to make decisions and not just to get a PhD in mathematics, it really matters that you chose correctly. If you’re using your all-encompassing theory of human psychology to decide how to treat people, it really matters that you actually understand it. And all of these are more dangerous if, instead of following reasoning you understand, you’re deferring to the judgment of someone else who seems like they’ve thought about it a lot.
Agency and taking ideas seriously aren’t bad. Rationalists came to correct views about the COVID-19 pandemic while many others were saying that masks didn’t work and that only hypochondriacs worried about COVID-19; rationalists were also some of the first people to warn about the threat of artificial intelligence. If you want better outcomes than normal people, then you need to do something different from what normal people do. But diverging from the norm is often risky, and the risks have often not been taken seriously.
The more seriously you take acting on your beliefs, the more dangerous it is to have beliefs about the world that aren’t true. To be sure, some beliefs aren’t that dangerous even if you get them wrong: even if you think Catholicism is true, believing a Trinitarian heresy is unlikely to result in ideological abuse. But if your beliefs about psychology, society, or physical reality are mistaken, it can cause serious harm—especially if these beliefs are so core to the ideology that they feel impossible to question.
People often underestimate the extent to which communities become ideologically abusive because members believe specific things that justify the abuse. Of course, sexual abuse happens because victims are dependent on their abusers, because of isolation, because of a culture of thoughtless obedience, and so on. But it also happens because abusers teach that having sex with them is a special grace from God that sets the victim above ordinary laws, and victims believe them. Unless we understand the role of belief in ideological abuse, we’re going to be mystified by how it works.
In principle, you could have one group of perpetrators and one group of victims, but ideological abusers rarely seem to adopt this approach, perhaps because they don’t like sharing power.
