Background
Patient and Public Involvement (PPI) in research is increasing. PPI can be defined as the involvement of patients, carers and the public as active partners in the design, delivery and dissemination of research to ensure that it is relevant and useful [1], or as “research being carried out ‘with’ or ‘by’ members of the public rather than ‘to’, ‘about’ or ‘for’ them” [2]. The impact of PPI on research has been investigated. A 2012 systematic review of PPI in health and social care research identified benefits including enhanced quality and appropriateness of research, development of user-focused research objectives, user-relevant research questions, user-friendly information, questionnaires and interview schedules, appropriate recruitment strategies for studies, consumer-focused interpretation of data, and enhanced implementation and dissemination of study results [3]. More recent research has shown positive impacts on researchers, including their knowledge, priorities, lay communication skills and attitudes to involvement [4]. The PPI representative also benefits, e.g. through an improved life focus and relationship with their illness [1]. PPI specifically in mental health research has also shown benefits, e.g. with more PPI found in studies achieving recruitment targets [5].
The level of involvement can differ [6, 7]. Low-level involvement, called ‘informed’, ‘consulted’ or ‘participation’, consists of researchers asking for views which are then used to inform research decision making. Medium-level involvement, called ‘involved’, ‘collaboration’ or ‘co-production’, has a focus on equity within the relationship between the researcher and the PPI participant, and comprises an ongoing partnership with shared decision making. High-level involvement, called ‘influential’, ‘control’ or ‘service user-led’, consists of people with experience of the health issue being researched having the dominant voice, delivering and managing research themselves. This framework was chosen as a pragmatic, understandable approach which is measurable [8], relates to mental health areas such as shared decision making [9] and experience of care [10], and is widely used in health research in the United Kingdom. Other frameworks for characterising involvement exist, for example at the individual level (micro level), the health-care service level (meso level), policy level (macro level) and, as in the current study, in research and education (meta level) [11]. Inclusive disability research has highlighted the importance of methods of involvement, including emancipatory research, collaboration research, and, as in this study, steering and advisory groups [12]. Involvement in research can vary across different dimensions; a recent framework published in the United Kingdom identifies six standards: inclusive opportunities, working together, support & learning, communications, impact and governance [13]. A systematic review identified types of PPI in research at the preparatory, execution and translational phases of the research cycle [14].
Applying this three-level involvement framework to clinical services, a movement towards higher levels of service user involvement in clinical practice is evident. For example, shared rather than clinician-led decision-making is now routinely recommended [15], because decision-making involvement influences recovery [16]. Just as in PPI, involvement in clinical decision-making is influenced by relational aspects, including a collaborative and trusting relationship, and access to information resources [17]. User involvement in strengthening services (e.g. in low-resource settings [18]) or supporting increased participation (e.g. in in-patient settings [19]) is developing. The growth of peer support workers [20] and the development of service user-run services [21] indicate a trend towards high-level involvement in clinical services. However, mental health staff (especially in-patient and less experienced staff) hold more positive views about patient involvement in treatment than about involvement in the planning and management of services or in professional education [22]. A systematic review concluded that organisations are still negotiating the balance between consumer leadership and traditional structures and systems [23].
Turning to PPI in research, the level of involvement may not be consistent across all parts of the research cycle. An important driver of behaviour in the scientific research community is funding, and many funders now require PPI input to proposals. For example, the leading funder of applied health research in the UK is the National Institute for Health Research (NIHR), and NIHR has a stated expectation that there will be active involvement of members of the public in the research that it funds [24]. This is operationalised through mandatory PPI sections in proposal forms for applicants to outline the extent of PPI, and through PPI representatives on funding panels. As a result, PPI in the UK has become the norm in developing health research proposals.
PPI involvement after securing funding is inconsistent. Examples of token involvement during post-funding research stages (study setup, ethical approval, data collection, data analysis, dissemination) are reported, such as more importance being given to researcher and clinician views than to PPI perspectives, and PPI processes being experienced as a tick-box exercise [25, 26]. Whilst the public voice is increasingly present in research decision-making, there is less evidence of a change in the underpinning power dynamic between the scientific research community and the public [27]. Mirroring the stage of development of involvement in clinical services (outlined earlier), the transition towards citizen science – equal partnership between scientists and citizens – is only partially complete [28].
As part of the movement towards higher levels of PPI in academic research, analysis of data is an important point of input [29]. However, service users are rarely involved in creating meaning from data [30]. This means that a valuable perspective in interpreting findings is being lost. Collaborative data analysis (CDA) of qualitative data is a recognised methodology involving a joint focus and dialogue among two or more researchers regarding a shared body of data to produce an agreed interpretation [31]. When one of the involved parties brings a PPI perspective, this can highlight taken-for-granted researcher assumptions, provide socio-cultural and political insights, and enhance the thoroughness of interpretation [32]. For example, in a study of detained psychiatric patients, mental health service user researchers coded more according to experiences and feelings, whereas university researchers coded more according to procedures and processes [33].
The Recovery Colleges Characterisation and Testing (RECOLLECT) Study is an NIHR-funded evaluation of Recovery Colleges (RCs), an innovation which uses co-production and adult education approaches, rather than treatment, to support mental health recovery [34, 35]. RCs involve the development of a culture of ‘emancipatory education’ [36] with an emphasis on ‘inclusivity and egalitarianism’ [37]. Emerging fidelity criteria are located in counterpoint to the current mental health system [38]. As part of RECOLLECT, qualitative data on mechanisms of action and outcomes were collected in three RCs. As RCs are a disruptive innovation [39] based on intentionally different values, goals and assumptions from clinical systems, there is a high risk of bias in interpreting qualitative data about them from a clinical research perspective. Therefore a collaborative data analysis approach involving people with lived experience was required to improve the quality of analysis. We refer to the (typically university-based) academic research team conducting studies (such as RECOLLECT) as the academic researchers (whilst recognising that academic researchers may themselves have disclosed or non-disclosed experience of mental ill-health, and noting the existence of service user-led research groups in universities) and people bringing a public or patient perspective as PPI co-researchers (also noting that they may have research experience).
The aims of this study were (1) to develop a methodology for involving PPI co-researchers in analysis of qualitative data; (2) to pilot and refine this methodology; and (3) to create a best practice framework for future PPI in data analysis.
Methods
The RECOLLECT Lived Experience Advisory Panel (LEAP) comprises nine people with lived experience of mental health issues, either personally or as carers. Members brought a range of research experience (from never having done research before to having collaborated on several projects), and were heterogeneous in where they lived, educational achievement and occupational background. LEAP members are referred to as the PPI co-researchers.
The RECOLLECT research team (who collected the data collaboratively analysed in Stage 2) span a range of professions (counselling, occupational therapy, psychology) and PPI experiences (PPI lead, PPI participant), and specifically include people with and without lived experience of mental health issues (directly or as a family member). The RECOLLECT research team are referred to as the academic researchers.
Ethical approval was obtained (Nottingham REC 1, 16/EM/0484) and all participants provided written informed consent.
Stage 1: Develop the CDA methodology
A critical literature review [40] was conducted by HJ, involving evaluation and synthesis of included papers. The aim of a critical review is to identify the most significant articles in a field, and it was chosen over more systematic approaches as a proportionate review approach suitable for generating a model. Databases searched were AMED, PubMed, CINAHL, MEDLINE and PsycARTICLES, with serendipitous searching [41] to identify grey literature. A scoping search indicated that terms like ‘coproduction’ were insufficiently specific, so search terms focussed specifically on the collaborative analysis element of coproduction were used. Inclusion criteria were (a) primary focus on qualitative collaborative data analysis; (b) published in a journal or book; (c) focused on mental health research; (d) aligned with the principles of PPI and a democratic approach to public engagement [7]; and (e) published since 2007 to maximise relevance to current PPI frameworks. The initial concept search strategy comprised (“collaborative data analysis” OR “interpretation workshop” OR “participatory action research”) AND (“service user” OR “public patient involvement” OR “PPI” OR “co-researcher” OR “co-investigator” OR “expert by experience” OR “client” OR “consumer” OR “survivor”) AND (“mental health” OR “mental illness” OR “mental distress” OR “psychological distress”) AND “qualitative”. The initial search identified few papers specific to mental health, so inclusion criterion (c) was amended to include any health research, and the search strategy was amended accordingly. The first author extracted data relevant to design and procedure, and characteristics of successful CDA. Findings were synthesised by the research team and organised into methodology options and success characteristics.
Stage 2: Pilot and refine the CDA methodology
The Stage 1 synthesis was considered by academic researchers and PPI co-researchers to select a methodology for CDA in RECOLLECT, in relation to investigation about mechanisms, outcomes and change models for Recovery Colleges. Evaluation criteria were that the methodology: ensured PPI co-researcher interpretations of qualitative data collected in RECOLLECT were generated; ensured these interpretations were used to inform findings; aligned with a democratic approach to PPI [7]; was practical to apply within the constraints of a one-year study; and incorporated the design techniques associated with successful CDA. Decisions were amalgamated by HJ into a final synthesised methodology and procedural session plans, which were then refined by the other authors. The CDA methodology was then piloted. PPI co-researchers met with the academic researchers in four 4-h meetings, each with a minimum of 8 co-researchers present, who were paid for their preparation and attendance.
The aim of this part of RECOLLECT was to develop an understanding of how RCs work (mechanisms of action) and the impact on students (outcomes). Framework analysis [42] was the underpinning methodology. Summarising the content of CDA meetings (described in Table 3): Meeting 1 ensured the ecological validity of the document analysis carried out by the academic researchers; Meeting 2 refined the preliminary, academic researcher-developed coding framework to co-produce the final coding framework; Meeting 3 created a mental health service user-led change model (organising and linking coded mechanisms of action and outcomes); and Meeting 4 enabled final verification of the change model. Each meeting was facilitated by two RECOLLECT academic researchers: one to help navigate the narrative of the discussions, and the other to support group dynamics and ensure all PPI co-researchers were able to contribute. Tasks were predominantly broken down into small group or pair work initially, with all PPI co-researchers then coming together to share what they had produced. Exercises were presented both verbally and on paper, with opportunities for questions and challenge. Adequate understanding of task requirements was ascertained through check-back and observation. Insights from occupational therapy research [43, 44] were used to inform grading and adapting activities to align with an individual’s capacities, in order to maximise their performance. This improves the likelihood that the PPI co-researcher experience will be positive, rather than frustrating, marginalising or demoralising [45]. Exercises were paced to allow time for breaks and refreshment. Data interpretations by PPI co-researchers were captured by two academic researchers observing and writing down PPI co-researcher verbal contributions, collection of written and photographed outputs, and completion of field notes by the academic researchers.
Several strategies to enhance quality were used, such as peer examination, triangulation of researcher and PPI co-researcher perspectives, triangulation of data sources (PPI, documents, qualitative interviews), and use of reflexivity in the academic researcher field notes [46]. Additionally, in the final meeting informal feedback was obtained and the Quality Involvement Questionnaire was completed by PPI co-researchers [47]. This 31-item self-report measure (each item rated 0 (low involvement) to 4) assesses perceived involvement, and has six sub-scales in two parts: Personal Factors (Your ability (range 0–28), Your potential (0–20), Your sense of being (0–20)) and Research Contexts (Research relationships (0–24), Ways of doing research (0–16) and Research structures (0–16)). Total score ranges from 0 (low involvement) to 124. As normative data are not available, it was also completed by the academic researchers to provide a comparison group.
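The scoring structure described above can be sketched as follows. This is a minimal illustration only: the item counts per sub-scale are derived from the stated score ranges (e.g. a 0–28 range implies 7 items rated 0–4), but the assignment of items to sub-scales is an assumption for illustration, not the published scoring key.

```python
# Illustrative scoring sketch for the 31-item measure described above.
# Item counts follow from the stated ranges; the item-to-subscale
# grouping is assumed for illustration and is not the published key.
SUBSCALES = {
    "Your ability": 7,             # range 0-28 = 7 items x 4
    "Your potential": 5,           # range 0-20
    "Your sense of being": 5,      # range 0-20
    "Research relationships": 6,   # range 0-24
    "Ways of doing research": 4,   # range 0-16
    "Research structures": 4,      # range 0-16
}

def score_involvement(ratings):
    """Turn a list of 31 item ratings (each 0-4) into sub-scale and total scores."""
    if len(ratings) != 31:
        raise ValueError("expected 31 item ratings")
    if any(not 0 <= r <= 4 for r in ratings):
        raise ValueError("each item must be rated 0-4")
    scores, start = {}, 0
    for name, n_items in SUBSCALES.items():
        scores[name] = sum(ratings[start:start + n_items])
        start += n_items
    scores["Total"] = sum(ratings)  # 0 (low involvement) to 124
    return scores
```

As a consistency check, the sub-scale maxima (28 + 20 + 20 + 24 + 16 + 16) sum to the stated total maximum of 124, and the implied item counts (7 + 5 + 5 + 6 + 4 + 4) sum to 31.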
Stage 3: Best practice framework for CDA
The most useful components of the methodology were synthesised by the academic researchers to produce a best practice framework, based on data collected during Stage 2.