Little Red Ridinghood, an erstwhile dutiful daughter and responsible student, was on her way to finish dental school when she was suddenly accosted by the Big Bad Cult Proselytizer and Deployable Agent, who made hypnotic eye contact with her and zapped her immediately into an endless spiral of trauma, dependency, deepening social alienation, and cognitive impairment!
The "Little Red Ridinghood Theory of Cult Conversion," which I formulated tongue-in-cheek a half-decade ago, is, of course, a wretched strawperson. With the possible exception of Mr. Ted Patrick, no rational anti-cultist or "brainwashing expert" would want to endorse it. Yet, the Ridinghood model merely caricatures and exaggerates some mechanistic and responsibility-annihilating elements in formulations about cultist "mind control" which I find disturbing and which I will deal with in this essay.
It is absolutely crucial to intelligent discourse to recognize that there are several different models of only partly interchangeable notions such as brainwashing, mind control, thought reform, menticide, information disease, coercive persuasion, etc. These models vary in the scope of their applicability and also in the degree to which "persuadees" are depicted as having lost free will or the capacity for choice, or becoming like robots, zombies, or totally "programmed" slaves. Beyond these variations there are differences related to whether a given model is social-psychological, behaviorist, neurophysiological, etc. In general, the models with the broadest applicability tend to entail the least sinister and rigidly deterministic typification of the persuadee. This is particularly true with a model such as Edgar Schein's coercive persuasion, which Schein saw as applying to respectable religious orders, armed services indoctrination, rehabilitation programs, and college fraternities and sororities.1 This is also the case with regard to Robert Lifton's model of thought reform,2 which, as John Lofland and L. N. Skonovd have recently argued, is probably applicable to any communal ideological movement.3
Both Lifton and Schein were really trying to get away from the prior lurid and sensational "brainwashing" discourse and the "robotist" image of persuadees. As Lifton noted, the brainwashing notion has acquired a misleading and erroneous connotation of "an omnipotent and esoteric psychotechnology which is employed to gain total control over purely passive slaves."4
Neither Lifton nor Schein saw persuadees as necessarily enslaved, totally passive and bereft of free will. There are, however, more stringent models such as the totalitarian forced confession model reflected in Albert Somit's 1968 article on brainwashing in the International Encyclopedia of the Social Sciences.5 While Lifton's and Schein's models would clearly fit some new religious movements (n.r.m.s) or "cults," Somit's model would be difficult to apply to any formally voluntary group.6 For example, Lifton's thought reform but not Somit's brainwashing characterized the ill-fated People's Temple Community at Jonestown.7
Some social scientists, therefore, tend to distinguish between thought reform and coercive persuasion on the one hand and "true brainwashing" on the other. The latter is held to entail methods which are sufficiently aversive such that voluntary subjects are not likely to stick around to experience them.8
"Cult apologists" and critics of brainwashing explanations of conversion to n.r.m.s have sometimes been too hesitant to acknowledge that, in a sense, groups such as the Krishnas and the Moonies do use thought reform. Yet, polemical and rhetorical formulations by critics of cults seem sometimes to conflate different models, that is, the models of Lifton or Schein (and the terms "thought reform" and "coercive persuasion") are referred to because their broad applicability is rhetorically convenient, but the consequences of indoctrination are depicted in terms which really pertain to more stringent models or to lurid cold war discourse about brainwashing and mind-rape. Such (deliberate or inadvertent) rhetorical devices may be combined with sensational suggestions that individuals can have their personalities totally, rapidly, and involuntarily altered without any overt brutality and the application or threat of mundane physical force (as opposed to God-will-punish-you spiritual threats) and without any subsequent visible signs of trauma or disorientation, such that only an initiate into the rarefied and arcane realm of brainwashing expertise can discern that a seemingly calm, normal person is really pathologically mind-controlled. I believe that figures like 10 or 12 have surfaced in court testimony as referring to the limited number of authentic brainwashing experts alleged to be competent to discern subtle menticidal programming which will necessarily evade the attention of the non-specialist clinician thinking in terms of standard notions of incompetence and impairment.
Before turning my focus to bio-psychiatric "brain control" variants of mind control arguments, I want to affirm my view that the pressures and constraints which make it difficult for some persons to leave religious movements are generally rather mundane and social, even "structural," rather than depth-psychological or neuropsychiatric in a narrow sense. As the recent essay by Dennis Collins9 indicates, the career opportunities for a cultist outside of his encapsulating movement and its world may be grim. One may have come to depend on the group and its members for companionship, sex, health care, clothing and food, social status, meaningful work, ecstatic "highs," a sense of purpose and identity, etc. Some persons are timid or inarticulate and cannot face the scenes or arguments which could ensue if they made an other than covert withdrawal or refused to stay on after the conclusion of an initial weekend seminar or worship, etc.10 Some group members are really ambivalent for a while but are subsequently assisted by deprogrammers or therapists to redefine their erstwhile paralyzing ambivalence as "I wanted to leave but I couldn't" (due to hypnotic mind control or whatever).
In this context I am impressed by Bromley and Shupe's de-mystification of the "floating" phenomenon which has been referred to in so much anti-cult literature. In Strange Gods,11 they note that explanations and evocations of "floating" entail an implicit analogy between the intrapsychic exercise of will and self-control and the flexing of muscles, that is, the passive cultist's capacity to will has been suppressed and atrophied, and it takes time and rehabilitative training before he or she can competently exercise this capacity. Hence, the ex-cultist is prone to be coerced into other cults (or to return to his original cult) until he has systematically built up the strength of his "muscles" of will and personal autonomy. Sometimes this metaphoric theory provides a rationale for custodial restraint of the "incapacitated" (sort of a 90 pound will-weakling) ex-cultist until his atrophied self-control capabilities have been reestablished.
Bromley and Shupe's social psychological/sociological explanation seems more plausible to me: Being in a "cult" often involves developing a grandiose self-concept. For example, the Moonie may feel that he is engaged in "world-saving."12 After emerging from such a movement, one has to exchange this exalted role for the mundane roles of student, good son or daughter, job-seeker, etc. Psychologically, the exchange may not be rewarding; thus, one is prone to recruitment or re-recruitment to another (or the same) messianic movement. One of the strengths of this analysis is that it also helps to explain why so many ex-cultists become anti-cult activists. In so doing, they once again take up a "heroic" role-identity; they are once again saving the world, only they are saving it from instead of for Reverend Moon or whomever. They are also responding to the problems discussed so eloquently by Dennis Collins by developing a "career" which converts their personal cultist experience into a positive virtue or strength rather than a stigmatizing defect. They also develop contacts with therapeutic professionals who can "certify" their recovered autonomy and help them gain entry to professional and "expert" status. Thus, in the United States, hysterias against deviants have often been accompanied by mystiques of how only the ex-addict or the ex-communist "really" knows what it's all about and must therefore guard society from the evil. In the early 1950s, the renewed civic virtue and loyalty of an ex-Stalinist would often be certified by a well-known anti-communist journalist, such as George Sokolsky, although such certification would not be available to an ex-communist who did not become a vehement anti-red or who became a Trotskyite or a socialist.
"…the pressures and constraints which make it difficult for some persons to leave religious movements are generally rather mundane and social… rather than depth-psychological or neuro-psychiatric in a narrow sense."
I have noticed one interesting thing about ex-Moonies who had to be abducted and deprogrammed and who then became anti-cult activists, counselors, lecturers, and "cult-experts." Such individuals (for example, Gary Scharff, Steve Hassan, Chris Edwards) often occupied middle-level leadership or organizing positions in the movement. Presumably, they developed valuable social and communicational skills and organizational capacities in the movement. I recall attending a lecture by Chris Edwards and hearing the lecturer describe how he organized "front groups" on college campuses, and I remember thinking how fascinating such work must have been. But it seems to me that it is rarely acknowledged that such activities, although perhaps ethically dubious, could have been quite rewarding and valuable in terms of developed skills and capacities.
While Mr. Edwards' lecture and the comments of those other ex-devotees who have become recognized "experts" and activists have highlighted intense peer-bonding and the mystique of fellowship as come-ons in the seductive process of cult recruitment, the reasons given for persisting in a commitment over time are often of a negative nature--fear of loss of love or retribution, subnormal functioning and grogginess from sleep or food deprivation, etc.--rather than positive rewarding experiences other than emotive gratification from peer and guru-father approval. This is in line with legal theorist Richard Delgado's model of cultist brainwashing whereby a devotee, initially recruited through deception, soon loses the capacity to rationally evaluate his continued group involvement as he or she is ground down by sleep deprivation, poor diet, and psychological terrorism.13
Neither the existence of complex organizational and communicational activities performed by cultists nor the persistence of positively rewarding experiences is entirely consistent with the claim of "lost capacity" and the associated stereotyped demonology, at least on the surface.14 There are theories, however, which posit a selective surrendering of the will or critical capacity through cultist conditioning in the context of seemingly normal capacity for complex cognitive tasks.
There are a variety of theories which purport to explain how extreme groups allegedly achieve a kind of selective control over individuals through the operation of stress on the brain and central nervous system (CNS). This control is held to be so selective in its effects that the indoctrinee may not appear to be disturbed, may seem normal and (usually) calm, yet undergoes sudden personality change and an attenuation of the will and critical capacity, which are sort of put on hold. Apparent general (amoral) intelligence may not even be affected.
The theories and theorists in this area seem to share some interesting characteristics. First, the various theorists may strongly disagree with each other. My friend Kevin Garvey believes that some cult-like or latently religious therapeutic movements such as est entail the induction of possibly dangerous trance-states in participants, who undergo personality change which may not be entirely voluntary. I would have thought that he would be a qualified enthusiast of Snapping,15 the best-known of the formulations along these lines. But, although his experience seems to him to corroborate their anecdotal accounts, he believes there are serious errors in Conway and Siegelman's analysis such that its wide influence has unfortunately misled some people into an oversimplified viewpoint.16 If I recall correctly, he feels that the nature of a hologram is misinterpreted by Conway and Siegelman, and this misconstruction is extrapolated in such a way that a theory of personality is delineated in Snapping which is erroneous as well as being in some respects analogous to the erroneous and pernicious theory held by Werner Erhard! Mr. Garvey also has doubts about neuro-linguistic programming and "Ericksonian hypnosis," which some ex-cultist deprogrammers and counselors practice and contrast to pernicious cultist mind control.
A more basic characteristic of these theories is their speculative, unverified, and sometimes vague quality. Cults in America, by anthropologist Willa Appel,17 treats the later stages of cult conversion almost exclusively in terms of "breaking the will" and "the physiology of brainwashing," and her analysis appears to be largely a popularization of the views of Dr. John Clark of the Center for the Study of Destructive Cultism. She acknowledges that Clark's theory is "not strictly scientific," though she feels that it is nevertheless "compelling."
In Snapping, Conway and Siegelman admit that they are making a free-form "literary" application of Catastrophe Theory, which is really a precise, quantitative model. They talk about cultist conditioning having the effect of blocking information-carrying passageways in the CNS, but there is little specification with respect to mediating biochemical processes such as fluctuations in specific neurotransmitters, that is, the kind of specificity which now characterizes some of the important work being done on the biochemical and neurophysiological dimensions of schizophrenia or manic-depressive disorders. Dr. Appel and Dr. Clark are more precise, but there is little or no actual "brain data" involving cult converts to directly verify or investigate their speculative extrapolations, which are often developed in terms of analogies whereby some stereotypical property or pattern of cult converts is said to be like schizophrenia or bear a resemblance to temporal lobe epilepsy, etc.
Speculative, unverified, and unverifiable theories in this area may have an undeserved prestige or seeming plausibility conferred on them by the current prestige of neurophysiological and biochemical approaches to psychiatry, the (sometimes a bit apocalyptic) enthusiasm over the imminent reconstruction of psychiatry as a "brain science," and the transformation of psychiatric healing from unscientific talky-talk to true medical curing. There is, however, a continuum along which research subareas might be ranked in terms of directly supportive neurophysiological data and systematic research. At the top I would place some of the recent research on schizophrenia and manic-depressive psychotic disorders, as well as the work on degenerative syndromes such as Alzheimer's. In the middle might be placed some of the work and theorizing on the possible biochemical and genetic dimensions of alcoholism or "Jellinek's disease." But I'd have to put at the bottom the speculative theories about the alleged brain and CNS dysfunctions of n.r.m. devotees and the posited neurophysiological explanations for their (allegedly inherently pathological) conversions.
I might point out that if one investigates the articles on cults and their converts in leading psychiatric journals, such as the American Journal of Psychiatry or the Journal of Nervous and Mental Disease, one finds that the kind of analyses which I have been discussing are not well represented. Two types of articles appear to predominate: 1) those which report mental health or an absence of pathology among converts, or a positive "relief effect" or diminution of negative symptoms among converts; or 2) articles by clinicians who regard cults as pathological social phenomena but who place more emphasis on the recruitment by cults of already disoriented or agitated persons than on the creation of pathology by cultist conditioning imposed on previously healthy or happy persons. The latter kind of article often warns against too literal an acceptance of brainwashing explanations, which are viewed as often functioning as rationalizations for ex-converts and their parents who sometimes seek to deny responsibility or serious pre-conversion problems.18
The theory of Snapping is really untestable… Although the authors discuss "the Snapping Moment" and generally depict the onset of information disease as a sudden discontinuous event, they also speak of "snapping in slow motion" and assert that a process of evolutionary conversion would not disprove their theory.19
Support for the Conway-Siegelman theory of "information disease" has been inferred from the authors' subsequent questionnaire study of ex-cultists, which reported that the amount of post-deconversion psychological problems among ex-devotees was correlated with the latter's reported amount of time spent in rituals.20 Even if one ignores the serious sampling problem arising from the distribution of questionnaires primarily within the anti-cult network, and ignores the debates over the alleged statistical deficiencies of the study, the findings merely indicate that some persons can experience negative psycho-emotional aftereffects from the intensity of cultist lifestyles and rituals. They do not validate the speculative claims about blocked information-carrying passageways in the CNS or the "explanations" of conversions in terms of brain dysfunctions, etc. As indicated earlier, a different kind of data would be required.
I would also argue that the work of Clark, Conway-Siegelman, Appel, et al. is vitiated by biases and arbitrary assumptions about trance states, "normal" brain functioning, and information processing. Trance states seem to be viewed as inherently pathological and destructive, a view which is not that of eminent anthropologists such as Erika Bourguignon and Felicitas Goodman or eminent sociologists of religion such as Guy Swanson or James Richardson. Cults in the Clark-Appel view use trance states and repetitive chanting to prevent the brain from slipping back into normal functioning and competent information processing. But what constitutes normal, competent, or conventional information processing and mental functioning? Throughout history, major religions have conspicuously featured trance and ecstatic behavior. Indeed, Conway and Siegelman have a rather large and diverse hit list: they associate snapping and information disease with mystical groups, authoritarian communal cults, human potential groups such as est, and evangelical-Pentecostal-fundamentalist ("born again") groups. Their theory might thus be viewed as a sort of derivation "from the premise that strong and comprehensive involvements with generalized symbolic realities is pathological and regressive."21 There is no suggestion here that anyone might come to believe notions wildly at variance with the great western capitalist consensus on the basis of any rational reflection on the ideas proffered, experience of their mental or emotional consequences, or gut attraction to new friends or lifestyle.22
Those who worship different gods than we do are (informationally) diseased.
Finally, brain dysfunction and disease pathology theories of conversion qua brainwashing are inconsistent with the growing evidence that the spontaneous defection rate from cults, even in relatively authoritarian groups such as the Unification Church, is extremely large such that between 50 and 90 percent of all converts will defect within a few years. The existence of a vast defection rate has been established by numerous studies, utilizing diverse methodologies, including research by psychiatrists such as Marc Galanter and Saul Levine, and sociologists such as Eileen Barker, James Beckford, Burt Rochford, Fred Bird, Stuart Wright, L. N. Skonovd, Robert Balch, Janet Jacobs, Jim Lewis, Richard Ofshe, and others. In "researching" Snapping, however, Conway and Siegelman "found very few people who got out of the Unification Church or any other cult on their own."23 It is partly on this basis that sociologist Roy Wallis suggests that the theorists of information disease in cultists betray an unmerited "complacency concerning their own ability to process information."24 In their subsequent survey of ex-cultists, the authors report that 30 percent of their sample had left voluntarily; this relatively low figure, though substantially higher than the "very few" reported in Snapping, clearly reflects the built-in sampling bias arising from the distribution of questionnaires primarily in the anti-cult or concerned citizens network by "deprogrammers, counselors, and concerned organizations."
When the persistent involvement of individuals with a cult is explained in terms of brainwashing theory, voluntary exiting tends to be explained in terms of inadvertent "weak points" and imperfections in the conditioning process such as "incomplete suppression of undesirable thoughts" through repetitive chanting and other mind control devices.25 This explanation seems to imply that were it not for fortuitous glitches in the brainwashing process, every convert would be fully socialized and totally docile and passive-dependent. Such explanations might be plausible if only a limited number of participants escaped the mind-altering technology; but such explanations and the models they derive from appear rather ludicrous when they are articulated in terms of the "revolving door" reality of continual substantial defection! A more plausible view, which takes account of the submissions of Clark, Appel, Margaret Singer, et al., is that although a deliberate system of indoctrinational thought reform may exist in some groups (and may include the mobilization of group pressures, mortification rituals, and other items), this "technology" is not responsible for the actual patterns of recruitment, commitment, and defection which entail a phenomenon of social drift whereby individuals regularly become involved and later become dissatisfied and drift away.26 Thought reform exists and operates, but by and large participants are not brainwashed in the horrific or brain-impaired sense of the term.
Yet it must be acknowledged that there is at least a small grain of truth in neurophysiological explanations of conversion. Everything we do and think is mediated by the CNS. It is not impossible that involvement in a very dogmatic or authoritarian religion (which might include orthodox Judaism, conservative Protestantism, or traditionalistic Catholicism) does produce alterations in the nervous system and brain patterns. I would insist, however, on the following. 1) We now know very little about this. 2) Assumptions about overwhelming coercive impact and effective theft of choice tend to be arbitrary and thus far unwarranted; they are belied by the reality of the continual drift through the cultic revolving door. 3) Present evaluative inferences about pathology tend to be overdrawn and biased. There is also the question of whether a very traditional religious verbal behavior such as repetitive chanting or teaching about sin, guilt, and retribution should be transvalued in terms of legal and social policy as a sinister and insupportable brainwashing device, even though it can certainly be argued that teaching devotees that backsliders burn in eternal hellfire amounts to "psychological terrorism."
Much of the evidence in support of the idea of cultist brainwashing through hypnotic trance induction is still anecdotal. I have heard lurid accounts of cult members talking to their parents or therapists and suddenly changing their whole vocal pattern and speaking-ranting-growling in alien voices and subsequently having no memory of the episode. It all sounds rather Linda Blair-ish (as in The Exorcist), with undertones of bodysnatchers, wolfpersons, and bloodsuckers. I wonder sometimes if Dr. Clark or Dr. Conway or Mr. Garvey aren't really students of Professor Van Helsing!
While the above reference to Van Helsing (of vampire-killing fame) is intended to be jocular, there is a sense in which brainwashing theory is indeed a kind of vampire theory. A vampire is not responsible for his behavior; he acts compulsively in a mechanistic manner because he has been infected by the bite of a prior vampire--a true medical model. Unfortunately, a vampire cannot be deprogrammed; he must therefore be run through the heart with a stake! But his moral responsibility for his conduct is dissolved, as is the responsibility of the person who infected him, who was compelled by his own bite-induced infection. So nobody is really responsible except the first biter! The brutal ex-Nazi concentration camp official facing prosecution can claim to have been brainwashed by Hitler and the SS. Look how this dynamic is played out with respect to cults. Several years ago Paul Morantz, a courageous lawyer and anti-Synanon crusader, who was bitten by a rattlesnake placed in his mailbox (for which two Synanonists were convicted), was quoted by a magazine as suggesting that the actual culprits should be excused from punishment if they renounced their loyalty to Synanon and its leader, Chuck Dederich. From the present writer's perspective, no one who puts a viper in a mailbox should escape punishment, nor should criminal legal processes be used to alter beliefs. What kind of deterrence against harmful acts performed by cult devotees is provided by allowing the perpetrators to evade retribution through ideological recantation? If brainwashing theories become dominant in this area, more persons might join extreme groups attracted by the moratorium on legal constraint which such participation will afford in the sense that responsibility for anti-social acts performed by a devotee will be transferred for the duration to the guru!
"…although a deliberate system of indoctrinational thought reform may exist in some groups … this 'technology' is not responsible for the actual patterns of recruitment, commitment, and defection, which entail a phenomenon of social drift…"
I recall once sharing a podium with Dr. Margaret Singer, who warned against "blaming the victims" (the "cult-victim") in cult controversies. If we assume that someone must be blamed for someone's having certain exotic beliefs or affiliations, we necessarily end up stuck with the dreary choice of blaming the cultist (maybe he has an "authoritarian personality") or blaming the "destructive cult," or possibly blaming the "sick society." But, assuming that a snake has not been deposited in a mailbox by the devotee, why is there necessarily an issue of blame? Brainwashing theories negate responsibility. In this sense they distort and mystify, and sometimes the alleviation of blame, guilt, and responsibility may be their function. But blame and responsibility aren't necessarily the same things, although they may be related. In taking responsibility for one's ideological commitments, even past mistaken ones, one isn't necessarily required to undertake the repetitive chanting of mea culpas or to accept the unfair stigma against which Dennis Collins has protested. There is nothing intrinsically wrong in being an ex-Moonie, and one needn't establish one's helpless victimization to ward off persecution and social stigma. Some of the leading American political intellectuals writing for the New York Review of Books are ex-Stalinists. An editor of the New Republic is an ex-member of the violent Jewish Defense League. Such writers tend to be highly critical of the (often highly manipulative) groups from which they defected--"The Gods that failed"--yet they do not usually see themselves as mere victims bereft of responsibility. Such a rigid and simplistic outlook would probably inhibit analytical depth and reflexivity.
There are times, of course, when taking responsibility does entail accepting a share of the blame. The late Shiva Naipaul, no admirer of cults, expressed some critical irony regarding the exculpatory impact of brainwashing theories in his book on Jonestown.
But caution, humility and contrition were not conspicuous among the defectors. They emerged from their years of subservience and loyalty untouched, they tell us, by any moral taint--the hapless victims of "mind control" and "coercive persuasion," heroes and heroines who ought to be applauded for their courage rather than pardoned for their sins. "The one thing we have learned," wrote one of their leading lights, "is not to blame ourselves for the things Jim made us do."27
An activist who was the director of the Human Freedom Center had defected from the People's Temple in 1975.
It was then that she had made the remarkable discovery--under the tutelage of the mind control experts--that neither she nor any other of the reformed disciples bore any responsibility for anything untoward they might have done during their Temple years. The burden was entirely Jim's.28
But the burden--of explanation as well as moral responsibility--is never exclusively Jim's (or Adolf's or Muammar's or Swami's). Things just aren't that simple.
Let me conclude by noting that in Snapping there is a tension between two not entirely compatible theories of cultism. One theory is humanistic: individuals sometimes irresponsibly exchange their critical intelligence and moral autonomy for the sensate "high" of intense experiences and the exhilaration of following a putatively perfect leader or system. This is an escape-from-freedom theory. The other theory is the dreary mechanistic and deterministic model about people being zapped or snapped into a subhuman state which is bereft of moral sense and which constitutes a true involuntary disease state from which the helpless semi-robots must be rescued, if necessary over their protest, and deprogrammed. The latter analysis seems to this writer like a Svengaliesque mystification and reductive simplification of complex social psychological processes. It is dehumanizing in its effects (for example, rationalizing therapeutic coercion as well as annihilating individual responsibility for commitments and committed acts) and in its vulgar, mechanistic simplification of complex human events.
Thomas Robbins, sociologist of religion, is the co-editor of In Gods We Trust (Transaction, 1987), Cults, Culture and the Law (Scholars Press, 1985), and Contemporary Church-State Relations (Transaction, 1986). He has published numerous articles on new religious movements.
1. Edgar Schein, I. Schneier, and C. H. Barker, Coercive Persuasion (New York: Norton, 1961).
2. Robert Lifton, Chinese Thought Reform and the Psychology of Totalism (New York: Norton, 1961). See also Robert Lifton, "Cult Processes, Religious Totalism, and Civil Liberties," pp. 59-71 in T. Robbins, W. Shepherd, and J. McBride (eds.), Cults, Culture and the Law (Chico, CA: Scholars Press, 1985).
3. See John Lofland and L. N. Skonovd, "Conversion Motifs," Journal for the Scientific Study of Religion, (1981) 20(4):373-385.
4. Lifton, Chinese Thought Reform, p. 3.
5. Albert Somit, "Brainwashing," International Encyclopedia of the Social Sciences, Vol. 2 (New York: Macmillan, 1968).
6. See Lofland and Skonovd, "Conversion Motifs," and also John Lofland and L. N. Skonovd, "Patterns of Conversion," pp. 1-24 in Eileen Barker (ed.) Of Gods and Men (Macon, GA: Mercer University Press).
7. Judith Weightman, Making Sense of the Jonestown Suicides (Toronto: Edwin Mellen, 1983).
8. Alan Scheflin and Edward Opton, The Mind Manipulators (New York: Paddington, 1978).
9. Dennis Collins, "Ex-Cultists Need Not Apply," Update, (March 1986) 10(1):23-31.
10. Stuart Wright, "The Dynamics of Cult Disengagements: An Analysis of Exiting Modes," paper presented to the Society for the Scientific Study of Religion, Savannah, Georgia, 1985.
11. David Bromley and Anson Shupe, Strange Gods (Boston: Beacon, 1981).
12. -----, The "Moonies" in America (Beverly Hills, CA: Sage, 1979).
13. Richard Delgado, "Religious Totalism: Gentle and Ungentle Persuasion Under the First Amendment," Southern California Law Review, vol. 51 (1977), pp. 1-99.
14. It is worth noting here that some of the larger and wealthier groups, such as the Unification Church, manage a large complex of commercial, educational, communications, political, and therapeutic operations. Whatever one may think about some disturbing aspects of these enterprises and diversified cultist "empires," they often provide opportunities for interesting, skilled, and creative work and training, that is, there are more jobs available than selling flowers on the street. The large turnover of personnel which characterizes these groups can have the consequence of sustaining a continual availability of middle-level or otherwise responsible positions. As Dennis Collins has pointed out, ex-cultists are often quite prepared to be capable employees, if only they weren't stigmatized.
15. Florence Conway and Jim Siegelman, Snapping: America's Epidemic of Sudden Personality Change (Philadelphia: Lippincott, 1978).
16. Personal communications.
17. Willa Appel, Cults in America (New York: Holt, Rinehart and Winston, 1983).
18. For an extensive review of numerous papers on cults and mental health, see Brock Kilbourne and James Richardson, "Psychotherapy and New Religions in a Pluralistic Society," American Psychologist, (1984) 39(3):237-251.
19. Dick Anthony, Thomas Robbins, and Paul Schwartz, "Contemporary Religious Movements and the Secularization Premise," pp. 1-8 in John Coleman and Gregory Baum (eds.) New Religious Movements (New York: Seabury, 1983); originally Concilium, vol. 161 (January 1983), p. 6.
20. Florence Conway and Jim Siegelman, "Information Disease," Science Digest, (1982) 90(1):88-92.
21. Anthony, Robbins and Schwartz, p. 6.
22. Roy Wallis, The Elementary Forms of the New Religious Life (London: Routledge and Kegan Paul, 1984), p. 135.
23. Conway and Siegelman, Snapping, p. 36. For an extensive review of recent studies of disaffiliation from religious movements, see James Richardson, et al., "Labeling and Leaving: Voluntary and Coerced Disaffiliation from Religious Social Movements," in Kurt Lang (ed.), Research on Social Movements, Conflict and Change, Vol. 9, 1985, in press.
24. Wallis, p. 135.
25. Appel, p. 145.
26. Theodore Long and Jeffrey Hadden, "Religious Conversion and the Concept of Socialization: Integrating the Brainwashing and Drift Models," Journal for the Scientific Study of Religion, (1983) 22(1):1-14.
27. Shiva Naipaul, Journey to Nowhere (New York: Simon and Schuster, 1980), p. 157.
28. Ibid., p. 180.