Explaining the content of impact assessment in the United Kingdom: Learning across time, sectors, and departments
Fritsch, O; Kamkhaji, JC; Radaelli, CM
Date: 19 September 2016
Journal: Regulation & Governance
Publisher: Wiley
Publisher DOI
Abstract
Whilst several studies have documented how evidence-based policy instruments affect public policy, we know less about what causes changes over time in the analyses mandated by those instruments, especially in Britain. We therefore take the analytical content of a pivotal regulatory reform instrument (impact assessment) as our dependent variable, draw on learning as a conceptual framework, and explain the dynamics of learning processes across departments, policy sectors, and time. Empirically, our study draws on a sample of 517 impact assessments produced in Britain (2005-2011). Experience and capacity in different departments matter in learning processes. Guidelines matter too, but only moderately. Departments specialize in their core policy sectors when performing regulatory analysis, but some have greater analytical capacity overall. Peripheral departments invest more in impact assessment than core executive departments. The presence of a regulatory oversight body enhances the learning process. Elections have different effects, depending on the context in which they are contested. These findings contribute to the literature on regulation, policy learning, and policy instruments.
Social and Political Sciences, Philosophy, and Anthropology
Faculty of Humanities, Arts and Social Sciences