"Smart people (like smart lawyers) can come up with very good explanations for mistaken points of view."

- Richard P. Feynman, Physicist

"There is a danger in clarity, the danger of over looking the subtleties of truth."

- Alfred North Whitehead

October 11, 2010

First Do No Harm

We all know that the Hippocratic Oath taken by medical practitioners includes the promise to “first do no harm”. We also know that many drugs once used to treat various maladies effectively were later taken off the market because of detrimental side effects. We expect drug companies to research the side effects of any drug before it is widely used on humans. Do practitioners in the people “rehabilitation” business concern themselves with the potential side effects and risks of the treatment they prescribe? The answer is: not usually.

In an interesting article, “Cures that Harm: Unanticipated Outcomes of Crime Prevention Programs”, researcher Joan McCord argues that evaluating programs that attempt to reduce crime for effectiveness alone is insufficient, because some treatments cause other harm, sometimes even when they are effective in reducing recidivism.[i] McCord states: “Unless social programs are evaluated for potential harm as well as benefit, safety as well as efficacy, the choice of which social programs to use will remain a dangerous guess.” She continues, “Reluctance to recognize that good intentions can result in harm can be found in biased investigating and reporting. Many investigators fail to ask whether an intervention has had adverse effects, and many research summaries lack systematic reporting of such effects.”[ii]

McCord describes several well-designed, carefully implemented studies that resulted in harmful side effects. The first was the Cambridge-Somerville Youth Study, which “was a carefully designed, adequately funded, and well-executed intervention program.”[iii] That study was based on the theory that criminal conduct was related to the family in which the person grew up. In the treatment homes, a social worker visited the family, sometimes once each week, providing friendly guidance for the children and family, including referring the children to needed specialists. The control homes were identical to the treatment homes but did not receive the treatment. All program participants reported that they thought the program had a very positive effect on them.[iv]

McCord followed these children for about 35 to 40 years. Her results showed that the children who had been in the program were, as adults, more likely to have been convicted of a crime. She ultimately isolated one factor, repeated attendance at summer camp, that increased the odds of a child being convicted of a crime as an adult by a factor of ten.[v] McCord applauds this study as having been designed properly from the start, with an appropriate control group that allowed researchers to discern the effects of the treatment.[vi]

McCord discusses another well-thought-out and well-designed program that resulted in harm, called “Volunteers in Probation”. In this program, juvenile delinquents were assigned a volunteer who provided tutoring services for the youth. The program participants committed more crimes than the control group. The evaluator of that program wrote:

“To those who may feel that other such programs, perhaps their own, are so much superior or so different from this program and that our findings and recommendations are irrelevant to them, we urge caution. The staff responsible for this program has reasons good enough for them to feel that their program was effective when this study began, and without this study might still have no reason to feel otherwise. If there is anything that such a study as this one demonstrates, it is the danger of relying exclusively on faith in good works in the absence of systematic data.”[vii]

McCord discusses other programs designed to reduce criminal recidivism rates that ultimately turned out to increase them. One of these was the “Scared Straight” program, in which juvenile delinquents were exposed to tough prison inmates in an attempt to scare the delinquents into becoming law-abiding. The program resulted in an increase in criminal activity among those who participated. One explanation offered was that the juveniles committed more crimes to prove that they were not scared.[viii]

Another side-effect case, unrelated to program efficacy, involved citizen volunteers visiting prisoners in an attempt to change the prisoners’ antisocial thinking through interaction with, and learning from, volunteers who had pro-social thinking. An evaluation of the program showed that it was effective in reducing prisoners’ antisocial thinking. However, the program had a negative side effect: the researchers found that the citizen volunteers’ own antisocial thinking had increased. While the prisoners were learning from the volunteers, the volunteers were also learning from the prisoners.[ix]

McCord concludes that studies providing evidence of harmful effects are often not published, as there is a strong bias against reporting adverse effects of social programs. How often do we hear someone discuss a program that they found probably did not work? Rarely, if ever, do we hear of negative results.

McCord writes:

“Many people seem to be willing to believe favorable results of inadequate evaluation designs. Some accept testimonials from clients who express their appreciation of a program. Against the claim that these provide valid evidence of effect, it should be noted that each of the programs (that she describes) would have been counted as successful by this criterion. Yet the clients would have been better off had they not participated in the program.”

Those of us in the people rehabilitation business are advised to keep the admonitions “first do no harm” and “the path to hell is paved with good intentions” in mind when we embark on a new idea to reform people’s behaviors. While it is important not to throw our hands up in despair and falsely claim that “nothing works”, we should always be realistic, guided by sound theory and ultimately by empirical research, to ensure that what we do does not result in harm to the program participants, their families, and society.



[i] McCord, Joan (2003). “Cures that Harm: Unanticipated Outcomes of Crime Prevention Programs.” Annals of the American Academy of Political and Social Science, 587, pp. 16-29.

[ii] Ibid., p. 17.

[iii] Ibid., p. 17.

[iv] Ibid., pp. 18-23.

[v] Ibid., pp. 21-22.

[vi] Ibid., p. 23.

[vii] Ibid., p. 24, quoting Berger, R.J., et al. (1975). Experiment in a Juvenile Court: A Study of a Program of Volunteers Working with Juvenile Probationers. Ann Arbor: Institute for Social Research, University of Michigan.

[viii] Ibid., p. 26.

[ix] Andrews, D.A. and James Bonta (2010). The Psychology of Criminal Conduct, 5th ed. New Jersey: Matthew Bender, p. 128.

The views expressed in this blog are solely the views of the author(s) and do not represent the views of any other public official or organization.
