Background
Operational research (OR) is the discipline of using models and analysis to aid decision-making in complex systems and has been used in healthcare since the 1950s [19]. The features of the problem being addressed inform the choice of approach adopted by OR practitioners, which can include qualitative methods such as problem structuring and conceptual modelling as well as quantitative data analysis, mathematical modelling and simulation techniques [46].
An operational researcher (lead author of this article, SC) supported knowledge production within a 2-year multidisciplinary grant-funded research project focusing on services following discharge from infant cardiac surgery [22]. That research project had two objectives: firstly, to understand the challenges encountered in accessing and providing services and to identify patient risk factors for adverse outcomes through qualitative and quantitative strands of research, and secondly, for an advisory group to develop recommendations for improving services. The operational researcher joined the project in order to bridge these two objectives by providing an explicit process for translating the research findings into practical recommendations of relevance to policymakers. A combination of problem structuring (soft systems methodology [47]) and quantitative OR methods (classification and regression tree (CART) analysis [48] and data visualisation) was deployed. The technical details are described elsewhere [45, 49], but for the purposes of this article, we highlight four key aspects:

- Developing a Rich Picture (a device used in soft systems methodology) to explore the key features of services following infant cardiac surgery, perceived problems and possible improvements (see Additional file 1) [45].
- Facilitating a workshop with the project’s advisory group, which was tasked with agreeing a set of recommendations for service improvement [45]. The group included a patient representative from the charitable sector and representatives from tertiary, secondary, primary and community care.
- Creating a visual representation of the data analysis to inform the advisory group’s consideration of the prioritisation of service improvements (the CART diagram, see Additional file 2) [45].
- Using a systematic process to integrate findings from the different strands of research with recommendations from the advisory group and a parent workshop (using a hyper-framework [49]).
Outputs from the applied health research project included published academic articles reporting findings from the quantitative [50] and qualitative [51–54] strands of research, as well as the final evidence-informed recommendations for service improvement [49]. These recommendations fed into National Health Service (NHS) England’s national review and public consultation on the care standards and specifications for commissioning specialist services for congenital heart disease [23]. Given the rapid uptake of the research outputs by national policymakers, we considered this to be a successful example of producing readily implementable knowledge within an applied health research context. We therefore sought to identify aspects of the OR approach to knowledge production that contributed to this success.
Methods
Study design
Data collection
Researcher reflexivity
Data analysis
Results
Interviews: the characteristics of OR as a mediator of knowledge production
One of the things that frustrates me the most about pieces of research is that they’re not leading you to improve services and I felt that that was one of the differences with this piece of research and that was clearly one of your [operational researcher] remits, which was heartening. [Patient representative, project advisory group]
Engaging and incorporating different perspectives
my sentiment on the evidence was, yes, you’ve drawn on it but some of the recommendations and conclusions of the study have come from talking to clinicians working on the ground about the practicalities of delivering care. [Secondary care paediatrician, project advisory group]
health professionals are very, I think, often suspicious about qualitative information, unless it is presented to them in the right way. Like, particularly, kind of, cardiology, cardiac surgery. It is a very unfamiliar thing to them. They don’t know very much about it, and I think they respond quite well to some of the, kind of, more structured-, I don’t know, some of the ways that concepts are framed. By the things I’ve seen in the past from their OR team I think it helps them to, kind of, take stuff in. [Consultant intensivist, project study team].
it was very helpful to have those people [in the workshop], because as a specialist centre worker, you don’t necessarily appreciate some of the stresses and difficulties that the secondary and particularly primary care people have. [Consultant paediatric cardiologist, project advisory group]
People saw immediately what it [Rich Picture] was doing and knew which corner of the world that they were in. If I live here I could see that this was only a small section of the territory. Even if you’re just working in improving things here, it absolutely demands that you at least recognise that these other places exist. [Cardiologist, project study team]
Rendering data meaningful to people and for improvement
If you’re not mathematically minded you can still understand it [CART diagram]. I think this really helps people not to feel frightened or to feel that they’re not getting it and, actually, then they can really think about what it means […] I think they teased it apart rather than just accepted it at face value […] as a communication tool as well, to get people thinking, talking, discussing in relation to, “Well, I actually want to think about the intervention, who are we going to be applying it to?” [Health psychologist, project study team]
I had a huge advantage in that you’d [operational researcher] come and seen me and we’d gone through a lot of this on a one-to-one basis, which was fabulous, I did find that very useful. Obviously I was being introduced to you but I was also being introduced to the way that you were going to present things. [Patient representative, project advisory group]
Maintaining perceived objectivity and rigour
it [OR] has made it [the project’s recommendations] more comprehensive and it’s made it a bit more objective as well, because the problem is that whoever writes it, will bring to bear their own biases or their own perspective. So, if you have a way of, sort of, stepping back and being more objective and systematic, then you can, kind of, remove some of the human frailties that make something less good quality at the end. [Consultant intensivist, project study team]
It’s [OR] been refreshing and a different way of looking at things and free from the restrictions of […] a professional hierarchy or necessarily other working relationships where other factors come into play about conclusions you might want to come to or things you might want to say. I think you [operational researcher] felt like part of the team but an independent part of the team who has been unbiased. [Secondary care paediatrician, project advisory group]
I think you’ve [operational researcher] been a very good, sort of, questioning presence in some of those meetings which has got us to sort of reflect on our process a bit more than I think we otherwise would have done. [Clinical research psychologist, project study team]
I don’t think other people had that same perspective as you [operational researcher] had and I don’t think there was anyone actually who was going to be necessarily willing or able to get that in depth overview. […] In depth of all of the different bits, and to be able to pull it all together in that way and I think you kind of kept everybody in there. [Health psychologist, project study team]
I think you [operational researcher] did a very good job of advocacy for your work in raising our awareness of it and making sure that we knew it was going on and what exactly was going on, and making sure everybody was clear what the findings were. [Member of NHS England review team]
Participant observations: performing the role of a mediator of knowledge production
I was not involved in the grant proposal for the research project and came to know about it later on through the principal investigator, with whom I had collaborated before. There appeared to be opportunities for OR to add value, so I joined the research team a few months into the project using separate funding from a personal fellowship. Members of the team were already assigned to particular strands of research (e.g. a clinical research psychologist was conducting staff and family interviews in the qualitative strand). As a free resource without a specified role in the grant proposal, I had the flexibility to apply OR in whatever ways might serve the project’s aims, primarily in relation to developing recommendations for service improvements on the basis of the evidence the study planned to collect:
In my view, there was no explicit method for doing this part of the project (drawing together strands and developing recommendations) and so we’re flexible to (and need to!) design that now. [Participant observation field notes, February 2014]

I worked collaboratively with the rest of the project team, across all research strands, with my approach informed by their ideas and sensitive to the need to keep people on board. For example, in one project meeting, I discussed my early thoughts on how to bring together findings from the different strands systematically and illustrated to the team how this might be done using a grid-like framework. This received a mixed response, from very positive to very negative, prompting a detailed discussion and, eventually, agreement about how to augment my idea:
It certainly stimulated debate! And debate that has led us to a better place in terms of understanding and documenting our process in a manner in which we are all happy with […] it seems important that the idea that this is useful for the project has been reached by the research team as a whole. This way it is a constructive thing that everyone wants to be done and sees the value in - and how their work fits into it. [Participant observation field notes, March 2014]

Method selection was not solely a technical decision; it also involved considering the context and requirements of the project alongside my expertise. For example, a large and unanticipated part of my contribution was developing an analysis dataset from two national audits because I had relevant skills and prior experience that others in the team did not have (“the skillset needed to, kind of, bring the data set to order I think is something that only you were able to do in the project” [consultant intensivist, project study team]). This seemed of benefit to the project but was time consuming and frustrated me when it delayed my progress in other areas:
I had a growing sense of frustration mixed with panic during this meeting as I realised how much there still is to do - and that a lot of this falls down to my responsibility. They [the project team as a whole] have hugely underestimated how much work this dataset preparation and linking involves. This is going to take quite a bit more of my time and I want to be cracking on with the other OR side of things. [Participant observation field notes, October 2013]

I was already proficient in data analysis of this kind but less experienced in another technique I wanted to use, soft systems methodology (SSM), so my confidence to push, and ability to conduct, different aspects of the OR approach varied considerably:
Much of the OR part of the project is unknown territory for me, in that I am not that familiar with the techniques of SSM - so I feel some reticence to push forward with that, which means I am susceptible to focusing on this data analysis which is much more in my comfort zone. [Participant observation field notes, November 2013]

It took time to gain expertise in the areas I was less familiar with. Indeed, I found translating the evidence into recommendations for improvement very time consuming because of the scale and breadth of tasks it required and the lack of significant dedicated resource for this purpose:
Even from the point of post-workshop, pulling together the final recommendations for endorsement took a lot of work! And a lot of my time […] whilst the team thinks it’s really important to have something coming out of the study in terms of practical implementation, they don’t have any time for it... [Participant observation field notes, November 2014]
Non-participant observations: biscuits and politics
Non-participant observation of key project meetings and workshops provided insight into the practices through which knowledge was produced and the ways in which OR influenced this.
Physical context: Large conference room. Two parallel screens, lots of PCs [computers] on either side of the conference room. Chairs organised roughly in rows, like an auditorium. Some participants in rows, others sitting on one side. Informal, relatively quiet, measured responses and interactions. Two hours into the meeting, a tin of biscuits appears and is circulated. [Non-participant observation notes, November 2013]

Early meetings tended to focus on the definition of key terms and categories used in the study (e.g. grouping diagnoses for the purpose of the study). Participants would ask questions, raise queries and make comments, based on their “in the moment” reading of the information being presented during meetings, by drawing on their own clinical experience or other research data, or in response to the contributions of others made during the meeting. In particular, imagined audiences for the study’s findings were periodically invoked (including parents and the media) in order to reflect critically on what the study might show and to improve its perceived quality and robustness (e.g. “might get criticised for that” and “last chance to prove how clever we are”). The to and fro of verbal contributions made during the meeting either led the group towards consensus (e.g. the chair was able to state “main thing we have to do - done it, which is a miracle”) or recognition that further work to satisfy the needs of the project was still needed (e.g. one participant stated “not sure yet”, to which another responded, “we’re both not sure”). The graphic or visual aspect of the information presented by the operational researcher appeared to be well received.
For instance, during a lengthy, rather circular discussion, one participant referred back to the graphic that was presented earlier by the operational researcher in order to move the conversation on: “[she] gave beautiful graphic of defining index, shall we look at it again?” The operational researcher contributed to the discussions by asking clarifying questions (e.g. “what do we mean by baseline anyway?”) or suggesting points of consensus (e.g. “[are we] saying all would benefit from similar interventions?”). There appeared to be a political aspect to the ways in which the operational researcher made contributions to the study. For instance, the contributions of the operational researcher were often aligned with the chair’s views or supportive of the chair (e.g. the operational researcher stated: “as [chair] said, she’s spoken with number of people on categorising”). Working closely with the chair appeared to help legitimise the operational researcher’s role in the study and, in particular, gain favour for their approach to gathering and presenting data to inform the study’s findings. The end-of-study workshop, which was led and facilitated by the operational researcher, highlighted the importance of soft skills (or what I might term “non-academic” skills) for leading the participants through translating the evidence collected during the study into a set of recommendations. The observations suggested that such mobilisation of knowledge demanded leadership and facilitation skills, e.g. facilitating workshops, encouraging decision-making around the data, showing participants where they might go with the findings to develop recommendations and marshalling people in different ways, e.g. by assigning action points.
The operational researcher led the meeting from the start (as chair), taking on a leadership role around ensuring the data collected during the study are translated into useful findings and tangible recommendations that can inform policy and practice. A key aspect of this was ensuring that outputs were data-driven (and not just based on participants’ own experiences). The operational researcher was able to move the participants on from discussing the data, to what it might mean for practice, and to how people in the health system might act on it. Enabling this type of discussion included pushing participants to reflect on the analysis of the data collected, e.g. in relation to the CART diagram, asking participants “are these patient groups recognisable to you?” It also seemed to require active facilitation in order to challenge participants’ perspectives at times. For example, where participants referred to their own experience in a particular heart centre - and used this to challenge the analysis of the data presented - the operational researcher acknowledged and welcomed this, but also asked for broader views on the evidence presented that went beyond personal experience. Leadership was also needed in order to encourage participants to take forward pieces of work outside the meeting, e.g. by being firm about assigning action points. [Reflections on end-of-study workshop, October 2014]