Thursday, 8 July 2021

My problem with, and (visual) response to (some) implementation science.

I am currently contemplating alternatives to traditional models of implementation (i.e. putting plans, or 'evidence', into action) using visual methods, and this is a very early note on my thinking.

This post has two parts:

  • Part 1: My context and problems with some implementation science. 
  • Part 2: My early speculative thinking about how visual methods may improve things.

PART 1 - context and (possible) problems with implementation science.

Context

As I’ve noted elsewhere in this blog, and on my Twitter feed, I am an educator-researcher in English higher education, with a background in arts-based community work and in policy, practice and management roles in local government children’s services. If the label means anything to you, I might call myself a visual ethnographer; that is, I’m interested in working with visual/material methods to ask questions about how things get done. That’s one of the reasons why I am passionate about using visual methods to promote meaningful engagement with people, help them bring their contribution to activity, and materialise and mobilise knowledge in health, education and care systems.

In one of my roles, I enjoy being an implementation lead for work with children and young people within an applied health research collaborative in the North East of England. In this role, I have the pleasure of working with a variety of colleagues who are researching with different sorts of communities. This fascinates me, particularly because I have a ‘hybrid’ professional identity (artist-ethnographer and educator with interests in inequality, inclusion and children, and in topics such as professional practice, policy and public administration), meaning I like to shift perspective and consider different ways of viewing phenomena.

Q. What's the problem with implementation science? (if anything)

I work in some diverse communities of practice, including in local government and in applied health research. All are concerned with implementation, or putting plans into action. Particularly in the latter, health research, the discipline of implementation science is influential. Bauer et al. define implementation science as:

"the scientific study of methods to promote the systematic uptake of research findings and other EBPs into routine practice, and, hence, to improve the quality and effectiveness of health services" (Bauer et al., 2015:1)

At the risk of over-simplification, I’ll say that one of the drivers of implementation science has been the “evidence-based practice” (EBP) movement, which has

“…popularised the notion that research findings and empirically supported (‘evidence based’) practices...should be more widely spread and applied in various settings to achieve improved health and welfare of populations” (Nilsen & Birken, 2020: 2).

I am more familiar with policy implementation (as a quasi-public-administration person), but that gives me some familiarity with the field, as there are overlapping issues. However, in making a case for different approaches within implementation science, I don't want to misrepresent it as totally "inflexible" or "logical-rational" (nor to assume these characteristics are always problematic). A variety of theoretical perspectives and approaches exist within the implementation science world, which I am still getting to know, including:

  • process models: focus on translating research into practice through action models or similar. 
  • determinant frameworks: describe barriers and enablers to implementation; understand influences.
  • classic theories: existing relevant theories from sociology, psychology etc., applied to the field.
  • implementation theories: a variety of models developed by implementation researchers (e.g. May et al.'s [2015] normalisation process theory).
  • evaluation frameworks: identify aspects of implementation that can be evaluated to determine success.
(adapted from Nilsen & Birken, 2020:11)

However, for now, I am speculating about the general limitations of implementation science. My assumption (let's be honest and get this into the open) is that those varieties of implementation science strongly influenced by "evidence based practice" (EBP) approaches share a paradigm (a way of seeing, interacting with, and making sense of the world) that I find problematic. For now, I will say that this involves:

  • not acknowledging the differences that exist between concepts/models and action/practice (e.g. differences in complexity and dynamism),
  • ...leading to limited consideration of factors that are relevant to implementation (e.g. sensory, material, affective or other factors, depending on the theory)
  • ...or more fundamentally, automatically privileging certain ontological (i.e. what is reality?), epistemological (i.e. what can we know and how?) or axiological (i.e. what is of value?) positions as their basis. This is perhaps a more fiddly issue, but it might involve:
    • assuming there is a 'one way relationship' between causes and effects (an ontological issue). 
    • considering knowledge, and therefore evidence, as a singular, fixed blueprint for action (an epistemological issue). 
    • assuming that it is OK to impose 'evidence based' solutions on populations or communities (an axiological issue).

This is an educated guess, but at minimum, those critically reflecting on implementation science will recognise that different approaches may have advantages and disadvantages, so

"Selecting an appropriate model, theory or framework [for implementation science] often represents a considerable challenge..." (Nilsen & Birken, 2020:23)

and that all approaches are being developed incrementally and through testing, so

"it is also important to explore how the current theoretical approaches can be further developed to better address implementation challenges" (Ibid: 24)

That is my starting point, and at the moment, the onus is on me to see if the evidence stacks up with my concerns.

PART 2 - visual thinking and experimenting to expand the paradigm

Because this blog is about my development of visual methods, and related theoretical and practical topics, I'll now get more practical. In addition to careful reading to test out my guesses (see above), I'm developing by doing. The example I have relates to an informal creative project I am leading with friends and colleagues (a research colleague, two artists and a film-maker). My aim is to produce a short film speaking to the topic of expanding the paradigm of implementation science, hopefully to be published on the blog of the academic journal BMJ Medical Humanities.

I specifically reflect on my experience of developing this film project, as it has forced me to think through some very provisional ideas, some of which I began to express in my previous post. This post is about the process behind those statements, and how I have come to realise how attached I am to aspects of the dominant implementation paradigm! Quite rightly, my challenge starts with me owning up to my position and tensions before I judge others.

I started off talking with my colleague Catherine about different ways of thinking about child health. We got talking about the tradition of artists' manifestos, and I cheekily suggested that we should write an artists' take on implementation science: a set of propositions, if you like. To cut a long (and ongoing) story short, we then invited some co-collaborators to play with us. At the start of the project, I sketched a timeline for the film, setting out the various (sound, visual, spoken) elements.

[Image: sketched timeline for the film, setting out its sound, visual and spoken elements]

The reality is that the project is taking longer than anticipated, partly because it has been a busy academic time of year (marking, funding deadlines etc.). The downside of this was that, from my point of view, the project became less dynamic and lost a sense of improvisation, and I found myself unintentionally thinking in quite a traditional way about moving from idea to action. To kick-start things, I contacted Dex Hannon, one of the collaborative team, and we agreed to try some collaborative painting, to wake us up and give us some time to reflect on what we were doing. Dex was great at helping me question the process.

The images below have been adapted from our co-painting session, and next to each I will add some of the experiences that co-painting gave us, which are now forming what we want to say about implementation science (I'll leave it to you to think about how these four experiences may do this, and we will keep working on it!):

Experience 1: The act of painting helped me appreciate how much any 'end result' is determined by pre-articulated, embodied, material and sensory factors. We had to handle brushes and paint, and to move around the canvas, which was placed on the floor of Dex's studio. Thinking, and 'results', were intimately bound up with acting.

[Image: detail from the co-painting session]

Experience 2: Painting together was very much about connection, relating and response. How we related, and the atmosphere that was generated, co-constituted the painting. Dex reflected on how brave one had to be, and how an activity like co-painting required us to ditch our egos, become attuned and move together.

[Image: detail from the co-painting session]

Experience 3: The practice of co-painting was an emergent one; that is, it could not be pre-formed and planned beyond what Erin Manning calls its "initial conditions" (Manning, 2012; Manning & Massumi, 2014). We noticed things, were moved by things, and were only able to act as we saw things starting to happen.

[Image: detail from the co-painting session]

Experience 4: Innovation came as factors interacted in the moment to co-constitute effects that were more than the sum of their parts. I thought of these as encounters, events, or flashes as we acted with the paint, making 'agential cuts' (Barad, 2007). At one point, we saw a section of the painting that had just produced a sense of depth, and our 'knowing', or insight, came as our attention and the paint-canvas connected in that moment.

[Image: detail from the co-painting session]

In conclusion, I don't yet have one. However, I have realised how much my process of work had been rooted in the very paradigm shared by evidence-based implementation science that I had been questioning. As we take the project forward, or it takes us forward, we will pay attention to issues of embodiment, relationship, emergence and co-constitution.


References

Barad, K. (2007). Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Durham: Duke University Press.

Bauer, M.S., Damschroder, L., Hagedorn, H. et al. (2015) An introduction to implementation science for the non-specialist. BMC Psychology, 3(32). Available online: https://doi.org/10.1186/s40359-015-0089-9

Manning, E. (2012) Relationscapes: Movement, Art, Philosophy (Technologies of Lived Abstraction Series, Eds. Brian Massumi and Erin Manning), London: MIT Press.

Manning, E. and Massumi, B. (2014) Thought in the Act: Passages in the Ecology of Experience, Minneapolis, MN: University of Minnesota Press.

May, C., Rapley, T., Mair, F.S., Treweek, S., Murray, E., Ballini, L., Macfarlane, A., Girling, M. and Finch, T.L. (2015) Normalization Process Theory On-line Users’ Manual, Toolkit and NoMAD instrument. Available from http://www.normalizationprocess.org

Nilsen, P. and Birken, S.A. (Eds.) (2020) Handbook on Implementation Science, Cheltenham, UK: Edward Elgar.
