Implementation: the missing link, also for risk assessment instruments
By Tamara De Beuf
What do iKnife, a tumor-detecting knife, and Zoe, a digital sleep coach, have in common? Both are healthcare innovations that failed in practice despite promising findings in initial research and testing. They are no exceptions; an estimated half of healthcare innovations do not hold up in practice.
The step from research to routine practice, called implementation, is a challenging one. Many factors can hinder the introduction of change and innovation in practice. How many resources are available? How ready are employees for change? How supportive are the managers? To reduce the gap between research and practice, scientists began studying methods to promote the uptake of evidence-based practices and maximize their impact on patients and clients. This research field is called implementation science.
Implementation is not only relevant in health care, but also in forensic mental health, and for risk assessment practices in particular. With risk assessment instruments, practitioners can assess the risk that someone will engage in adverse behavior, such as violence, in the future. Risk assessment is relevant for populations in which risk management is crucial to safeguard the individuals themselves and/or society.
It is evident from the risk assessment literature that introducing risk assessment instruments in complex work environments, such as forensic settings, is quite challenging. Yet relatively little research has been conducted on how to support such implementation initiatives, and even fewer studies have applied knowledge and frameworks from implementation science to better understand risk assessment implementation.
In our research on the implementation of the Short-Term Assessment of Risk and Treatability: Adolescent Version (START:AV), a risk assessment instrument for adolescents, we applied two well-known models from the field of implementation science to study the implementation process in a Dutch secure youth care service.
First, we explored which factors influenced the implementation process. Staff perceptions of implementation barriers and facilitators, gathered via focus groups, were categorized according to the constructs of the Consolidated Framework for Implementation Research (CFIR). Characteristics of the instrument itself (e.g., complexity) and features of the implementing organization (e.g., culture, available resources) were frequently listed as barriers, whereas staff characteristics, and more specifically supportive attitudes towards the START:AV, were mentioned as facilitators of the implementation. Other constructs, such as characteristics of the implementation process (e.g., timing and planning) or of the external environment (e.g., national policies and legislation), were mentioned less frequently as influences on the implementation. Overall, staff highlighted the importance of effective communication, organization-wide buy-in, supportive leadership, sufficient resources, training, planning, monitoring, and integration. These factors have also been described as essential in previous risk assessment implementation studies.
Second, we measured how well the implementation was going by evaluating several outcomes suggested by the Implementation Outcomes Framework. Acceptability, adherence, adoption, appropriateness, feasibility, and penetration were assessed twice during the implementation. Findings indicated that most staff members perceived the START:AV’s key components, such as the final risk judgments and the assessment of both strengths and vulnerabilities, as useful for treatment (appropriateness). Yet satisfaction with the instrument decreased over time (acceptability), likely due to an increased workload following the adoption of the START:AV, despite a reduction in the time needed to complete a form (feasibility). Notwithstanding this dissatisfaction, the completion rate was acceptable (74%), although it varied considerably among evaluators (29-100%; adoption). When we examined how complete the forms were, we noticed differences between parts of the form: sections that required more elaboration had more missing data (adherence). Nevertheless, we found that adherence could be improved, or at least maintained, by providing a refresher workshop. Overall, staff indicated that, over time, the START:AV became well integrated into their work processes (penetration). In sum, evaluating the implementation using these outcomes from implementation science provided a wealth of information about the status of the implementation and helped identify areas for improvement.
Overall, frameworks from implementation science are useful for risk assessment implementation, not only to guide the implementation but also to evaluate it. They will help us better understand what works where, and why, with respect to the implementation of risk assessment instruments.
Studies such as the above will hopefully encourage more systematic research into risk assessment implementation. Increasing our understanding of the challenges typically faced when implementing risk assessment instruments and how they affect implementation outcomes will help us determine the most appropriate implementation strategies. Following this pathway may prevent risk assessment instruments from being added to the list of implementation failures, next to iKnife and Zoe.
Want to know more?
On April 4th, 2022, Tamara De Beuf defended her dissertation entitled ‘Risk assessment with the START:AV in Dutch secure youth care: From implementation to field evaluation’. The dissertation also describes findings on the field reliability and predictive validity of the START:AV when used within this particular setting by the intended professionals.
The recording of the defense is available on YouTube and the digital dissertation can be accessed freely via the following link: https://doi.org/10.26481/dis.20220404td.