Friday, December 12, 2014
Mistaken Attachment Beliefs, Persuasion, and "Trojan Horses"
I’ve been trying to figure out a name for a persuasive technique. Here’s an example: at www.imt.ie/clinical/2014/12/attachment-disorder-children.html, the authors Arshad and FitzGerald provide a good deal of well-substantiated information about children’s emotional attachment to their parents. They describe different qualities of attachment and refer to stages of attachment as they occur in the course of early development.
But now the trouble begins. Forgetting or ignoring the fact that insecure attachment, while perhaps not ideal, is within the normal range rather than pathological, these authors make the following statement: “Reactive attachment disorder is a severe form of insecure attachment, with symptoms of emotional dysregulation, anger, guilt, impulsive [sic], disinhibition, aggression, hostility, reactive [sic], proactive [sic], impulsiveness, stealing, hypervigilance, aggressive, withdrawn, destructive, temper outbursts, demanding, clinging, sleep problems, enuresis, overfamiliarity with strangers, oppositional, fidgety, poor hygiene, learned helplessness, abnormal eating habits, lack of eye contact, can’t keep friends, blames others for mistakes, mistrustful, manipulative, lack of remorse, irritable, fussy, swears, diffuse boundaries, and jealousy.” Rather than being supported by systematic research evidence, or drawn from any of the DSM discussions of Reactive Attachment Disorder, this symptom list is characteristic of websites like www.reactiveattachmentdisordertreatment.com.
The same technique is evident at www.attachmentnetwork.com and at www.radzebra.org. The latter has separate pages, one with material from DSM-IV and the other with a list of “symptoms” that are certainly concerning (like “fascination with blood and gore”-- by the way, is gore different from blood?) but that are not associated with disorders of attachment. These notional symptoms seem to have been drawn from the non-evidence-based 1996 article by Keith Reber, which I discussed recently on this blog.
In the pages and documents I just mentioned, we have several examples of a persuasive technique. Trying to persuade people is not in itself a problem-- that’s something I am doing here, and even a simple presentation of well-founded statements is an effort at persuasion. However, persuasive devices are problematic when the goal is to convince readers that claims are correct when they are not. It does not really matter whether the persuader is a true believer, has financial goals that depend on persuasion, or sees persuasion as a path to glory. Getting people to think that something is true when it is not is never really the right thing to do. In “wars of propaganda” like World War II, there may be short-term goals of persuasion that can lead to improved long-term outcomes, but even that use of persuasive devices may be based only on a guess as to what a good outcome may be. (Jane Austen’s novel Persuasion gives a good example of the difficulties here.)
But what is the particular persuasive device seen in the examples above? It mingles true and false statements, apparently setting up an appeal to the authority of the true statements in order to “spread” that authority to cover the false statements as well. In trying to find a name for this device, I have found nothing in lists of fallacies or errors of critical thinking that describes persuasion accomplished in this way. There is such a thing as the fallacy of composition, which is the mistake of assuming that something true of a part of a whole must also be true of the whole, but that doesn’t seem to be exactly what’s going on here. I have found references to the method under the rubric of “disinformation”, or the intentional spreading of confusion about facts and logic, but no specific label to distinguish this technique from other disinformative methods.
Can any reader provide me with a name for a method that mixes true and false claims with the purpose of gaining belief for the false ones? In the absence of any other name known to me, I’m going to call this a “Trojan horse” technique. By presenting the reader with some well-established information, the “Trojan horse” user lends an appearance of truth to ill-founded statements that have the potential to harm those who accept them. That’s certainly what appears to be happening in the examples given earlier, where accepting mistaken beliefs about attachment can lead to mistaken-- in fact, dangerous-- choices about treatment of children.