Explaining the content of impact assessment in the United Kingdom: Learning across time, sectors, and departments
Regulation & Governance
Whilst several studies have documented how evidence-based policy instruments affect public policy, we know less about what causes changes over time in the analyses mandated by these instruments, especially in Britain. We therefore take the analytical content of a pivotal regulatory reform instrument, impact assessment, as our dependent variable, draw on learning as a conceptual framework, and explain the dynamics of learning processes across departments, policy sectors, and time. Empirically, our study draws on a sample of 517 impact assessments produced in Britain (2005-2011). Experience and capacity in different departments matter in learning processes. Guidelines matter too, but only moderately. Departments specialize in their core policy sectors when performing regulatory analysis, but some have greater analytical capacity overall. Peripheral departments invest more in impact assessment than core executive departments. The presence of a regulatory oversight body enhances the learning process. Elections have different effects, depending on the context in which they are contested. These findings contribute to the literature on regulation, policy learning, and policy instruments.
This paper is based on research carried out with the support of the European Research Council (ERC) project on Analysis of Learning in Regulatory Governance (ALREG, Advanced Grant no. 230267), directed by Claudio Radaelli.
This is the peer reviewed version of the following article: Fritsch, O, Kamkhaji, JC and Radaelli, CM (2016) Explaining the content of impact assessment in the United Kingdom: Learning across time, sectors, and departments. Regulation & Governance; which has been published in final form at https://dx.doi.org/10.1111/rego.12129. This article may be used for non-commercial purposes in accordance with the Wiley Terms and Conditions for Self-Archiving.