In:
Political Analysis, Cambridge University Press (CUP), Vol. 14, No. 2 (2006), pp. 131-159
Abstract:
We address the problem that occurs when inferences about counterfactuals—predictions, “what-if” questions, and causal effects—are attempted far from the available data. The danger of these extreme counterfactuals is that substantive conclusions drawn from statistical models that fit the data well turn out to be based largely on speculation hidden in convenient modeling assumptions that few would be willing to defend. Yet existing statistical strategies provide few reliable means of identifying extreme counterfactuals. We offer a proof that inferences farther from the data allow more model dependence and then develop easy-to-apply methods to evaluate how model dependent our answers would be to specified counterfactuals. These methods require neither sensitivity testing over specified classes of models nor evaluating any specific modeling assumptions. If an analysis fails the simple tests we offer, then we know that substantive results are sensitive to at least some modeling choices that are not based on empirical evidence. Free software that accompanies this article implements all the methods developed.
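Example (illustrative only):
A natural way to read "far from the available data" is to ask whether a specified counterfactual falls inside the convex hull of the observed covariates (interpolation) or outside it (extrapolation). The Python sketch below is not the authors' accompanying software; it is a minimal, hypothetical illustration of such a convex-hull membership check, posed as a feasibility linear program (the point is in the hull iff nonnegative weights summing to one reproduce it). The function name in_convex_hull and the toy data are assumptions for the example.

# Minimal sketch (not the article's accompanying software): check whether a
# counterfactual point is a convex combination of the observed covariate rows.
# Membership holds iff weights w >= 0 exist with sum(w) = 1 and X' w = point.
import numpy as np
from scipy.optimize import linprog


def in_convex_hull(X, point):
    """Return True if `point` lies in the convex hull of the rows of X."""
    n = X.shape[0]
    # Feasibility LP: minimize 0 subject to the convex-combination constraints.
    A_eq = np.vstack([X.T, np.ones((1, n))])
    b_eq = np.append(point, 1.0)
    res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * n, method="highs")
    return res.success


# Toy illustration with made-up data: an interpolation vs. an extrapolation.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(50, 3))               # observed covariates
print(in_convex_hull(X, X.mean(axis=0)))          # inside the data -> True
print(in_convex_hull(X, np.array([5., 5., 5.])))  # far from the data -> False

A counterfactual that fails a check of this kind would, in the article's terms, force the answer to rest on modeling assumptions rather than empirical evidence.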
Type of Medium:
Online Resource
ISSN:
1047-1987, 1476-4989
Language:
English
Publisher:
Cambridge University Press (CUP)
Publication Date:
2006