Times Report Models Worst Practices for Policy Research Reporting

By Scott Burris

I read the Times daily, and so naturally I would like to be able to regard it as a credible “newspaper of record.” Today the paper outdid itself to disappoint, in a story by Sam Dolnick headlined “Pennsylvania Study Finds Halfway Houses Don’t Reduce Recidivism.” In the cause of making lemonade from lemons, I am drawing up a list of “worst practices” from this little journalistic fender-bender:

Worst Practices in Reporting Policy Evaluation Research (Provisional – come on, readers, add your own.)

1. Don’t provide the title, author, or publication information for the study being described.

The story says only that the study was conducted and overseen by the state Department of Corrections. A link later in the story leads to what turns out to be the Department’s annual report on recidivism. Not quite a “study” of halfway houses.

2. Don’t clearly describe the study.

The story does describe the study adjectivally – as “groundbreaking.” It is, first of all, a bit of a stretch to call it a “study” at all. This is not the result of a systematic effort to explore the specific question of whether halfway houses work better than direct release to the street; it certainly was not a peer-reviewed or published study. Rather, the Times story is drawing on one section of an annual report produced by the state on recidivism among all prisoners released through all release mechanisms. The term “study,” and the consistent suggestion that the study is important (“groundbreaking” results “so conclusive” that they have “startled” leaders and experts), might lull the reader into believing that the “study” was deliberately and well designed to answer the question it supposedly posed – for example, a randomized, controlled and blinded trial comparing release directly to the street with release through halfway houses. Nope. This report is just a summary of outcome statistics, with a couple of paragraphs reporting in general terms on some statistical analysis meant to control for differences between the prisoners sent to halfway houses and those released to the street.

3. Just ignore the obvious problems for causal inference.

The plain and fundamental problem with pumping this study as powerful support for the claim that halfway houses don’t work is that we have no reason to be confident that the prisoners put into halfway houses are, as a group, the same as prisoners released directly to the street. It is elementary that statistical controls for observed differences cannot make up for a non-random, retrospective design: they cannot control for unobserved or unknown differences. Saying that this study “is casting serious doubt on the halfway-house model” is perhaps an attempt at caution, but way too weak a one. This study cannot cast serious doubt on anything, though it certainly points, as the report itself says, to worrisome outcomes in the halfway house system.
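To see why controls for observed differences are not enough, here is a minimal simulation sketch in Python. It is not drawn from the report or the Times story; the variables (an unobserved risk trait, an observed prior-offenses covariate, a placement rule) are all hypothetical assumptions chosen only to illustrate the mechanism. By construction, placement has zero true effect on recidivism, yet a “controlled” regression still finds one.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical unobserved trait (motivation, addiction severity, etc.)
# that the Department's data would not capture.
risk = rng.normal(size=n)

# An observed covariate (say, prior offenses) only partially reflects it.
prior = 0.6 * risk + rng.normal(size=n)

# Non-random assignment: higher-risk prisoners are more likely to be
# routed to a halfway house (an assumed selection rule, for illustration).
halfway = (risk + rng.normal(size=n) > 0).astype(float)

# True model: recidivism depends on risk alone; the true effect of
# halfway-house placement is exactly zero.
recid = (0.8 * risk + rng.normal(size=n) > 0.5).astype(float)

# "Controlled" analysis: regress recidivism on placement plus the
# observed covariate, standing in for the report's statistical controls.
X = np.column_stack([np.ones(n), halfway, prior])
beta, *_ = np.linalg.lstsq(X, recid, rcond=None)
print(f"estimated placement 'effect' on recidivism: {beta[1]:+.3f}")
```

Run this and the placement coefficient comes out well above zero even though the true effect is zero by construction. That is selection bias, and adjustment for observed covariates alone cannot rule it out.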

4. Ignore anything in the report that does not support unjustified causal inference.

There is nothing wrong with the PA report, mind you. It seems to be a careful and balanced presentation of the Department’s data, though I admit that if I were more into the politics of corrections here in PA I might find spin I didn’t like. Two findings, in any case, should have given the Times a nudge toward moderation. First, halfway house recidivism varies by provider, so some of the halfway house operators produce better results than release to the street. Second, in the statistical analysis, remaining in a halfway house for 3–6 months produced a small but statistically significant improvement compared to release to the street. Nothing “groundbreaking” here, but enough to suggest that a sufficient dose of a well-run halfway house might be better than going to the street, at a lower cost.

5. Embrace confirmation bias.

As the Times article itself notes, the paper has been reporting pretty aggressively on halfway house problems in New Jersey, many of whose facilities are run by contractors whose PA operations show problematic recidivism rates. Did the reporter here see only the parts that supported the view the paper was already taking?

I have no brief for halfway houses, let alone for the waste of taxpayer money through cronyism and outsourcing. I confess I am unbiased by any deep knowledge of this set of problems; I only know what I read in the paper. All the more reason, as I drink my morning coffee, that I want my newspaper of record to conform to a few basic good practices in science reporting, even if the science happens to concern policy.

Note: Here’s a citation for the report: Nicolette Bell et al., Recidivism Report 2013 (PA Dept. of Corrections).
