HKS Authors

See citation below for complete author information.

Abstract

Both criminal and regulatory law have traditionally been skeptical of what Jeremy Bentham referred to as evidentiary offenses – the prohibition (or regulation) of an activity not because it is itself wrong, but because it probabilistically (though not universally) indicates that a real wrong has occurred. From Bentham to the present, courts and theorists have worried about this form of regulation, believing that, certainly in the criminal law context but even with respect to regulation, it is wrong to impose sanctions on a “where there’s smoke there’s fire” theory of governmental intervention. Yet although this kind of punishment by proxy continues to be held in disrepute, both in the courts and in the literature, we argue that this distaste is unwarranted. Regulating – even through the criminal law – intrinsically innocent activities that probabilistically but not inexorably indicate not-so-innocent activities is no different from the vast number of other probabilistic elements that pervade the regulatory process. Once we recognize how frequently we accept probabilistic but not certain burdens of proof, probabilistic but not certain substantive rules, and probabilistic but not certain pieces of evidence, we can see that defining offenses and regulatory targets in terms of non-wrongful behavior that is evidence of wrongful behavior is neither surprising nor inadvisable.

Citation

Schauer, Frederick, and Richard Zeckhauser. "Regulation by Generalization." KSG Faculty Research Working Paper Series RWP05-048, August 2005.