John J. Donohue is quite the character. His anti-gun, so-called “research” has long been funded by a number of public universities and grants. He’s one of these “social scientists” who, amazingly, always comes up with the “scientific” findings his funding sources like to see. He seems particularly keen to take money from people who want to see more gun control, and he has long been a shrill critic of John Lott and Lott’s bullet-proof scientific findings (pun intended).
It took us a while to quit laughing at the abstract of John J. Donohue’s latest anti-gun academic screed, Stanford Law and Economics Olin Working Paper No. 461. It claims, in scientificese, that right-to-carry increases violent crime rates with statistical certainty. The gem that caused us to guffaw uncontrollably was this rather bold claim: “Our analysis of admittedly imperfect gun aggravated assaults provides suggestive evidence that RTC laws may be associated with large increases in this crime, perhaps increasing such gun assaults by almost 33 percent.” (emphasis added)
Yes, he wrote that right-to-carry laws increase aggravated assaults by 33%.
Did he really think he could pass this steaming horse manure off as scientific research?
First of all, your blog post author majored in sociology in college. That’s only slightly less useful than staying at a Holiday Inn Express last night, I assure you.
Having said that, I know enough to tell the difference between a peer-reviewed study that has been subjected to, and survived, a great deal of scrutiny over the past twenty-plus years, and a half-assed working paper that is neither peer-reviewed nor even a “study” ready for peer review.
Don’t take my word for it though.
Here’s what someone who has put some time into it wrote at Soylent News:
This study is completely without value. There is literally nothing to see here. What you’re about to read would be failed out of high school statistics, and the authors should be downright ashamed. I’m going to paste the abstract here and tear it apart paragraph by paragraph below, as should any objective, scientific observer demanding good, evidence-based findings, but you can skip that now. Seriously, save your time. I’ve wasted mine instead.
For over a decade, there has been a spirited academic debate over the impact on crime of laws that grant citizens the presumptive right to carry concealed handguns in public – so-called right-to-carry (RTC) laws. In 2004, the National Research Council (NRC) offered a critical evaluation of the “More Guns, Less Crime” hypothesis using county-level crime data for the period 1977-2000. 15 of the 16 academic members of the NRC panel essentially concluded that the existing research was inadequate to conclude that RTC laws increased or decreased crime. One member of the panel thought the NRC’s panel data regressions showed that RTC laws decreased murder, but the other 15 responded by saying that “the scientific evidence does not support” that position.
The authors tip their bias in the first sentence, using non-professional language like “so-called” instead of the proper “hereafter referred to [by their common title]” or similar. They then attempt to recast the very stable majority position (15/16), based on the actual statistical analysis at the time, as mere opinion rather than established fact by using words like “saying,” and they tip their hands again in the last sentence by stating the outlier’s position first before getting to “the other 15.” Not a good start, guys. Maybe that flies for mainstream news, but if you want to be taken seriously, I’ve already binned your credibility with CNN and Fox News. However, let’s continue.
We evaluate the NRC evidence, and improve and expand on the report’s county data analysis by analyzing an additional six years of county data as well as state panel data for the period 1979-2010. We also present evidence using both a more plausible version of the Lott and Mustard specification, as well as our own preferred specification (which, unlike the Lott and Mustard model presented in the NRC report, does control for rates of incarceration and police). While we have considerable sympathy with the NRC’s majority view about the difficulty of drawing conclusions from simple panel data models and re-affirm its finding that the conclusion of the dissenting panel member that RTC laws reduce murder has no statistical support, we disagree with the NRC report’s judgment on one methodological point: the NRC report states that cluster adjustments to correct for serial correlation are not needed in these panel data regressions, but our randomization tests show that without such adjustments the Type 1 error soars to 22-73 percent.
So they re-analyze with additional data. Why not lead with that, instead of the biased prior paragraph? They then claim their model is better – wait, no, they don’t actually claim superiority at all, only that theirs is “preferred.” Preferred by whom? Themselves, one assumes. They then “sympathize” with the majority view, re-affirm that the single dissenter has no statistical support, and yet break with the majority on a methodological point, again attempting to soften or opinion-ify what should rest on hard evidence alone. Where is theirs? That bit appears to be 100% opinion. The last sentence appears to be FUD about Type 1 error – potentially relevant, but they don’t bother to explain why or how. “Blah blah, we screwed with the data and made up reasons for why we needed to” appears to be what’s going on here. Even if relevant, this is only useful to those who have read and are intimately familiar with the prior study – and an abstract is not the place for that. Abstracts should stand alone.
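Since they don’t explain it, here’s what the clustering argument actually amounts to. A minimal sketch – mine, not the paper’s code, with entirely made-up data: “crime rates” that are pure noise but serially correlated within each state, plus a placebo “law” whose true effect is exactly zero. Without cluster-robust standard errors, the placebo “law” tests significant far more often than the nominal 5%:

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
S, T, RHO, SIMS = 50, 30, 0.8, 500  # states, years, serial correlation, trials

def simulate_panel():
    # Pure-noise "crime rates" with AR(1) errors within each state, and a
    # placebo "RTC law" that switches on mid-sample in half the states.
    # The law's true effect is exactly zero by construction.
    y = np.empty((S, T))
    for s in range(S):
        e = rng.normal(size=T)
        for t in range(1, T):
            e[t] += RHO * e[t - 1]  # serial correlation within a state
        y[s] = e
    law = np.zeros((S, T))
    law[: S // 2, T // 2:] = 1
    return y.ravel(), law.ravel(), np.repeat(np.arange(S), T)

naive_hits = cluster_hits = 0
for _ in range(SIMS):
    y, law, state = simulate_panel()
    X = sm.add_constant(law)
    naive_hits += sm.OLS(y, X).fit().pvalues[1] < 0.05
    clustered = sm.OLS(y, X).fit(cov_type="cluster", cov_kwds={"groups": state})
    cluster_hits += clustered.pvalues[1] < 0.05

print(f"false-positive rate, naive SEs:     {naive_hits / SIMS:.0%}")  # way over 5%
print(f"false-positive rate, clustered SEs: {cluster_hits / SIMS:.0%}")  # about 5%

So that part of their complaint is legitimate statistics – which makes it all the stranger that they assert it in the abstract without showing the reader why or how.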
Our paper highlights some important questions to consider when using panel data methods to resolve questions of law and policy effectiveness. We buttress the NRC’s cautious conclusion regarding the effects of RTC laws by showing how sensitive the estimated impact of RTC laws is to different data periods, the use of state versus county data, particular specifications (especially the Lott-Mustard inclusion of 36 highly collinear demographic variables), and the decision to control for state trends.
These sound like potentially important factors, but the cart is coming before the horse here – they should be listed after the evidence, as discussion or conclusions following from those findings.
Across the basic seven Index I crime categories, the strongest evidence of a statistically significant effect would be for aggravated assault, with 11 of 28 estimates suggesting that RTC laws increase this crime at the .10 confidence level. An omitted variable bias test on our preferred Table 8a results suggests that our estimated 8 percent increase in aggravated assaults from RTC laws may understate the true harmful impact of RTC laws on aggravated assault, which may explain why this finding is only significant at the .10 level in many of our models. Our analysis of the year-by-year impact of RTC laws also suggests that RTC laws increase aggravated assaults. Our analysis of admittedly imperfect gun aggravated assaults provides suggestive evidence that RTC laws may be associated with large increases in this crime, perhaps increasing such gun assaults by almost 33 percent.
Bad start for the real evidence. They looked across all the Index I categories and the best they’ve got is “the strongest evidence of a statistically significant effect,” which is at the 0.10 level… about half the time (11/28). Remember, folks: if you test at the 0.10 level, one in ten of your investigations will return a false positive! They flat-out admit they were searching seven categories here, so I’m definitely not impressed, and neither should you be. Then they talk about their “preferred” analysis/results and assert that the danger may be understated in many of their models. Folks, this is already reading like something that should be buried at the bottom of a “limitations” section. If this was so relevant, such an important effect, one that spanned many of their models and is worth noting in the second sentence of your abstract’s results, you’d better damn well follow through and explore it instead of just spewing pure speculation. Then they find “suggestive” results with no actual test values noted – given they found an alpha of 0.10 worth reporting, I’m not holding my breath here. In the final sentence they try to blunt the inevitable brutal rebuttal of the paper by noting “admittedly imperfect” data, and again use the statistical weasel words “suggestive evidence” and “perhaps” before asserting things they obviously have no evidence to support.
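If you want to see how damning that seven-category fishing trip is, the arithmetic is trivial. A back-of-the-envelope calculation – mine, assuming independent tests, which crime categories certainly aren’t, but it gives the flavor:

# Chance of at least one false positive when screening k outcomes at
# significance level alpha, when no true effect exists anywhere.
alpha = 0.10
for k in (1, 7, 28):
    print(f"{k:>2} tests at alpha={alpha}: "
          f"P(at least one false positive) = {1 - (1 - alpha) ** k:.0%}")
# prints 10%, 52%, and 95% respectively

In other words, screening their seven Index I categories at the 0.10 level, you’d expect to “find” something more often than not even if RTC laws did nothing at all.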
It’s actually hilarious they say gun assaults could increase by almost 33% without a single mention of a significance level or test result for this claim. Is this actually a joke? Nope, there’s another paragraph to go yet.
In addition to aggravated assault, the most plausible state models conducted over the entire 1979-2010 period provide evidence that RTC laws increase rape and robbery (but usually only at the .10 level). In contrast, for the period from 1999-2010 (which seeks to remove the confounding influence of the crack cocaine epidemic), the preferred state model (for those who accept the Wolfers proposition that one should not control for state trends) yields statistically significant evidence for only one crime – suggesting that RTC laws increase the rate of murder at the .05 significance level. It will be worth exploring whether other methodological approaches and/or additional years of data will confirm the results of this panel-data analysis and clarify some of the highly sensitive results and anomalies (such as the occasional estimates that RTC laws lead to higher rates of property crime) that have plagued this inquiry for over a decade.
More worthless statistics, needing an alpha of 0.10 to merit a mention. Any writer with a shred of integrity would flag this up front as merely suggestive and non-significant, instead of burying the alpha level at the end in hopes the reader is too incompetent to realize they’re spewing valueless assertions. Hey, finally here at the end, if we limit the date range and use their “preferred” model, a single crime reaches significance at an alpha of 0.05. Not stated: how many crimes were tested, or how many other models and date ranges were attempted before they found one that supported their agenda. In all likelihood this is another random finding thanks to p-hacking.
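The p-hacking worry is easy to quantify too. A toy simulation – mine, resting on the textbook fact that a well-behaved test’s p-value is uniform on (0, 1) when there is no real effect, and modeling an analyst who tries a dozen roughly independent model/date-range combinations and reports the best one (real specifications share data and are correlated, so the true inflation is smaller, but the direction is the point):

import numpy as np

rng = np.random.default_rng(1)
SIMS, SPECS = 10_000, 12  # trials, specifications tried per trial

# Under the null, each test's p-value is Uniform(0, 1); draw SPECS of
# them per trial and keep the smallest, as a spec-shopping analyst would.
pvals = rng.random((SIMS, SPECS))
best = pvals.min(axis=1)

print(f"honest single test clears 0.05: {(pvals[:, 0] < 0.05).mean():.0%}")  # ~5%
print(f"best of {SPECS} specs clears 0.05:  {(best < 0.05).mean():.0%}")  # ~46%

A “significant at 0.05” result that only emerges after shopping models and date ranges is exactly what noise looks like.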
This isn’t a real study at all, just an excuse to feed politicians “scientific” ammunition for an agenda shared by the authors.
Pap for the low-information voters. Sadly, we need to take this stuff seriously and refute it – which is very nicely done here. Thanks, John.
My working definition of Sociology: The study of things that are already painfully obvious.
And heretofore circumvent the actual ineffective correlated data, making it prejudicial to the unproven yet clearly exposed factual matter. I’m wanting to supplement the word juxtaposed in here somewhere owing to its important appearance, but I surrender.
Remember that old TV show “In Living Color” where the idiot talked with big words to sound intelligent, but instead made a fool of himself?
That’s who this Donohue reminds me of.
Sam
Yes!!! He wore that little hat and was usually doing his rant in a jail cell. That was funny.
So this hack has determined that when law-abiding people have the means to defend themselves, they turn into criminals and crime increases. This is a true pile of defecation; always nice to see an anti-freedom type make a fool of himself. Watch the grabbers make a shrine out of this toilet paper.
I too remember that character from In Living Color. LOL! You’re so right.
Dono-hoe has prostituted so-called research for grant $$ for most of his adult life. He also has attacked Lott for everything down to what kind of socks he wears, or so it seems.
He’s a little man, insecure and jealous of men who aren’t intimidated by inanimate objects.
Just more pseudo-intellectualism from folks who are living in a perpetual state of existential crisis. Everything must justify their beliefs and needs. If not, they shall make it so by way of their ego and narcissism. No lie is too great, no truth too dear to be bent toward their own position.
Sprinkle on some entitlement, a silver-spoon education, study nights with organic Palo Alto groceries, a bag of weed from Humboldt, and you’ve got yourself a real nice-sounding white paper.
Print it in a roll, and I’ll gladly wipe my ass with it.