I am currently contemplating alternatives to traditional models of implementation (i.e. putting plans, or 'evidence' into action) using visual methods, and this is a very early note for my thinking.
This post has two parts:
- Part 1: My context and problems with some implementation science.
- Part 2: My early speculative thinking about how visual methods may improve things.
PART 1 - context and (possible) problems with implementation science.
Context
As I’ve noted elsewhere in this blog, and on my Twitter feed, I am an educator-researcher in English higher education, with a background in arts-based community work and in policy, practice and management roles in local government children’s services. If the label means anything to you, I might call myself a visual ethnographer; that is, I’m interested in working with visual/material methods to ask questions about how things get done. That’s one of the reasons why I am passionate about using visual methods to promote meaningful engagement with people, help them bring their contribution to activity, and to materialise and mobilise knowledge in health, education and care systems.
In one of my roles, I enjoy being an implementation lead for work with children and young people within an applied health research collaborative in the North East of England. In this role, I have the pleasure of working with a variety of colleagues who are researching with different sorts of communities. This fascinates me, particularly because I have a ‘hybrid’ professional identity (artist-ethnographer and educator with interests in inequality, inclusion and children, and topics such as professional practice, policy, and public administration), meaning I like to perspective shift, or consider different ways we can view phenomena.

Q. What's the problem with implementation science? (if anything)

I work in some diverse communities of practice, including in local government and in applied health research. All are concerned with implementation, or putting plans into action. Particularly in the latter, health research, the discipline of implementation science is influential. Bauer et al. define implementation science as:
"the scientific study of methods to promote the systematic uptake of research findings and other EBPs into routine practice, and, hence, to improve the quality and effectiveness of health services" (Bauer et al., 2015:1)
At the risk of over-simplification, I’ll say that one of the drivers of implementation science has been the “evidence-based practice” (EBP) movement, which has
“…popularised the notion that research findings and empirically supported (‘evidence based’) practices...should be more widely spread and applied in various settings to achieve improve health and welfare of populations” (Nilsen & Birken, 2020: 2).
I am more familiar with policy implementation (as a quasi-public-administration person), but that gives me some familiarity with the field, as there are overlapping issues. However, in making a case for different approaches within implementation science, I don't want to misrepresent implementation science as totally "inflexible" or "logical-rational" (nor to assume these characteristics are always problematic). A variety of theoretical perspectives and approaches exist within the implementation science world, which I am still getting to know, including:
- process models: focus on translating research into practice through action models or similar.
- determinant frameworks: describe barriers and enablers to implementation; understand influences.
- classic theories: existing relevant theories from sociology, psychology etc., applied to the field.
- implementation theories: a variety of models developed by implementation researchers (e.g. May et al.'s [2015] normalisation process theory).
- evaluation frameworks: identify aspects of implementation that can be evaluated to determine success.
However, for now, I am speculating about the general limitations of implementation science. My assumption (let's be honest and get this into the open) is that implementation science, particularly those varieties strongly influenced by EBP approaches, shares a paradigm (a way of seeing, interacting with, and making sense of the world) that I find problematic. For now, I will say that this involves:
- not acknowledging the differences that exist between concepts/models and action/practice (e.g. differences in complexity and dynamism),
- ...leading to limited consideration of factors that are relevant to implementation (e.g. sensory, material, affective or other factors, depending on the theory)
- ...or more fundamentally, automatically privileging certain ontological (i.e. what is reality), epistemological (i.e. what can we know and how?) or axiological (i.e. value) positions as their basis. This is perhaps more fiddly as an issue, but it might involve:
- assuming there is a 'one way relationship' between causes and effects (an ontological issue).
- considering knowledge, and therefore evidence, as a singular, fixed blueprint for action (an epistemological issue).
- assuming that it is acceptable to impose 'evidence based' solutions on populations or communities (an axiological issue).
To be fair, the field acknowledges some of these challenges. Nilsen and Birken note that

"Selecting an appropriate model, theory or framework [for implementation science] often represents a considerable challenge..." (Nilsen & Birken, 2020: 23)

and that all approaches are being developed incrementally and through testing, so

"it is also important to explore how the current theoretical approaches can be further developed to better address implementation challenges" (Ibid: 24)
PART 2 - my early speculative thinking about how visual methods may improve things.
The images below have been adapted from our co-painting session, and next to each I will add some of the experiences that co-painting gave us, which are now forming what we want to say about implementation science (I'll leave it to you to think about how these four experiences may do this, and we will keep working on it!):
Barad, K. (2007). Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press.
Bauer, M.S., Damschroder, L., Hagedorn, H. et al. (2015) An introduction to implementation science for the non-specialist. BMC Psychology, Vol. 3 (32). Available online: https://doi.org/10.1186/s40359-015-0089-9
Manning, E. (2012) Relationscapes: Movement, Art, Philosophy, (Technologies of Lived Abstraction Series, Eds. Brian Massumi and Erin Manning), London: MIT Press.
Manning, E., and Massumi, B. (2014) Thought in the Act: Passages in the Ecology of Experience, Minneapolis, MN: Minnesota Press.
May, C., Rapley, T., Mair, F.S., Treweek, S., Murray, E., Ballini, L., Macfarlane, A. Girling, M. and Finch, T.L. (2015) Normalization Process Theory On-line Users’ Manual, Toolkit and NoMAD instrument. Available from http://www.normalizationprocess.org
Nilsen, P., and Birken, S.A. (Eds.) (2020) Handbook on Implementation Science, Cheltenham, UK: Edward Elgar.