
dc.contributor.author: Bywater, T
dc.contributor.author: Gridley, N
dc.contributor.author: Berry, VL
dc.contributor.author: Blower, S
dc.contributor.author: Tobin, K
dc.date.accessioned: 2018-11-21T15:50:54Z
dc.date.issued: 2018-01-29
dc.description.abstract: Background: Group-based parent programmes demonstrate positive benefits for adult and child mental health, and for child behaviour outcomes. Greater fidelity to the programme delivery model equates to better outcomes for families attending; however, fidelity is typically self-monitored using programme-specific checklists. Self-completed measures are open to bias, and it is difficult to know whether positive outcomes found in research studies will be maintained when programmes are delivered in regular services. Currently, no ongoing objective monitoring of quality is conducted during usual service delivery. This is odd given that the quality of other services is assessed objectively, for example by the Office for Standards in Education, Children's Services and Skills (OFSTED). Independent observations of programme delivery are needed to assess fidelity and quality of delivery, to ensure positive outcomes and thereby justify the expense of programme delivery. Methods: This paper outlines the initial development and reliability of the Parent Programme Implementation Checklist (PPIC), a simple, brief and generic observational tool for independent assessment of implementation fidelity of group-based parent programmes. The PPIC does not require intensive observer training before use. This paper presents initial data obtained during delivery of the Incredible Years BASIC programme across nine localities in England and Wales, United Kingdom (UK). Results: Reasonable levels of inter-rater reliability were achieved across each of the three subscales (Adherence, Quality and Participant Responsiveness) and the overall total score, with percentage agreements above 70% and intra-class correlations (ICC) ranging from 0.404 to 0.730. Intra-rater reliability (n = 6) was acceptable at the subscale level. Conclusions: We conclude that the PPIC has promise and, with further development, could be used to assess fidelity of parent group delivery during research trials and standard service delivery. Further development would need to include data from other parent programmes and testing by non-research staff. Objective assessment of quality of delivery would inform services of where improvements could be made. [en_GB]
dc.description.sponsorship: This research was supported by the National Institute for Health Research (NIHR) Collaboration for Leadership in Applied Health Research and Care – Yorkshire and Humber, and South West Peninsula. [en_GB]
dc.identifier.citation: Published online 29 January 2018 [en_GB]
dc.identifier.doi: 10.1080/13575279.2017.1414031
dc.identifier.uri: http://hdl.handle.net/10871/34850
dc.language.iso: en [en_GB]
dc.publisher: Taylor & Francis (Routledge) [en_GB]
dc.rights.embargoreason: Under embargo until 29 January 2019 in compliance with publisher policy [en_GB]
dc.rights: © 2018 The Child Care in Practice Group [en_GB]
dc.subject: programmes [en_GB]
dc.subject: intervention [en_GB]
dc.subject: implementation [en_GB]
dc.subject: fidelity [en_GB]
dc.subject: reliability [en_GB]
dc.subject: observation [en_GB]
dc.subject: measurement [en_GB]
dc.title: The parent programme implementation checklist (PPIC): the development and testing of an objective measure of skills and fidelity for the delivery of parent programmes [en_GB]
dc.type: Article [en_GB]
dc.identifier.issn: 1357-5279
dc.description: This is the author accepted manuscript. The final version is available from Taylor & Francis via the DOI in this record. [en_GB]
dc.identifier.journal: Child Care in Practice [en_GB]
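The abstract reports inter-rater reliability as percentage agreement (>70%) and intra-class correlations (ICC 0.404 to 0.730). The sketch below is not taken from the paper or the PPIC scoring manual; it uses hypothetical observer scores and the standard Shrout–Fleiss ICC(2,1) definition (two-way random effects, absolute agreement, single rater) purely to illustrate how such reliability statistics are commonly computed.

import numpy as np

def percentage_agreement(r1, r2):
    # Proportion of items on which two raters give identical scores (as a %)
    r1, r2 = np.asarray(r1), np.asarray(r2)
    return np.mean(r1 == r2) * 100

def icc_2_1(ratings):
    # ICC(2,1): two-way random effects, absolute agreement, single rater.
    # `ratings` is an (n_subjects x k_raters) array of scores.
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-rater means

    # Two-way ANOVA sums of squares and mean squares
    ss_rows = k * np.sum((row_means - grand) ** 2)
    ss_cols = n * np.sum((col_means - grand) ** 2)
    ss_total = np.sum((x - grand) ** 2)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical example: eight group sessions scored 0-4 by two independent observers.
rater_a = [3, 2, 4, 3, 1, 2, 3, 4]
rater_b = [3, 2, 3, 3, 1, 2, 4, 4]
print(f"Agreement: {percentage_agreement(rater_a, rater_b):.0f}%")
print(f"ICC(2,1):  {icc_2_1(np.column_stack([rater_a, rater_b])):.3f}")

The item-level scores, rating scale and number of observers here are assumptions for illustration only; the PPIC's actual subscale structure and scoring rules are described in the article itself.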

