Organization
The personnel involved in PROVEN’s implementation reflected a dynamic blend of pragmatic and explanatory features. The “on-the-ground” implementation activities were led continuously by HCS personnel. HCS corporate-level administrators and clinical education specialists directed the facility ACP champions in implementing the program, and it was these champions, not the RT, who offered videos to patients and families. The RT’s initial involvement was largely “behind-the-scenes,” such as designing the implementation and training protocols and creating feedback reports. However, as discussed below, once challenges with protocol compliance emerged early in the implementation period, one of the RT’s principal investigators (AEV) took an increasingly active role in coaching ACP champions, albeit always in partnership with the HCS clinical education specialists.
Turnover of ACP champions was the main challenge within the Organization domain. It prompted intensive tracking of the champions at each facility by the HCS clinical education specialist, as well as training of new champions as needed. The clinical education specialist position was also vacated and re-filled once in each HCS during the implementation period. The potential disruption to implementation from these transitions was somewhat attenuated by the fact that the corporate-level project leaders were very familiar with the program and could temporarily fill in and ultimately train the new clinical education specialists.
The trial resources were more aligned with an explanatory trial, as the RT created the videos; purchased, pre-loaded, and distributed the tablet devices; and made the videos accessible via a website URL. In the real world, the HCS would have had to acquire these resources independently to implement an ACP Video Program. The HCS supplied the electronic medical record system that hosted the VSR, as well as the information technology personnel who enabled that effort. The few problems encountered with project resources were handled by the RT. For example, the RT created translated and customized versions of the General Goals of Care video for two facilities that served mostly Navajo-speaking patients, and replaced lost tablets in another facility.
Training activities, also under the Organization domain, reflected a hybrid of explanatory and pragmatic features. The RT supplied all the training materials and primarily designed the training sessions. However, training activities leveraged each HCS’s existing infrastructure for rolling out new programs. The larger, national organization, HCS1, opted for training via Intranet-hosted webinars, its standard approach for uniform training. On the other hand, the smaller, regional organization, HCS2, opted for centralized, in-person trainings for which it provided the venue; this approach was customary, since HCS2 regularly brings staff together for training.
Flexibility-Delivery
Within the Flexibility-Delivery domain, protocol delivery was mostly pragmatic. While the RT designed the protocol, it had limited control over its delivery, which depended largely on the ACP champions’ discretion. The main prescribed element of the protocol was that providers should offer a video to all new or re-admissions and, every 6 months, to long-stay residents.
The trial’s monitoring mechanisms for both protocol delivery (i.e., did the ACP champions offer a video as per protocol?) and adherence (i.e., did the patients or family members actually view a video?) were decidedly less pragmatic but proved essential in revealing major implementation problems in a timely manner. The VSR was created by the RT as a “new” data source specifically for the trial and, thus, reflects a relatively more “explanatory” feature. Each HCS integrated the VSR into its electronic medical record system and used it to generate internal feedback reports, which supports the use of this monitoring approach under real-world conditions. However, the RT’s more robust analytic capabilities were ultimately needed to detect implementation problems. Most notably, by linking the VSR with HCS-based MDS resident assessment data, the RT could monitor VSR completion for both new admission and long-stay cohorts (i.e., delivery monitoring). Moreover, the RT could calculate the rates at which videos were actually shown (i.e., adherence monitoring), not just offered. The ACP champion qualitative interviews were another source of monitoring that would not have occurred outside of a trial and incidentally revealed challenges to protocol delivery (i.e., ACP champion turnover, lack of attention to the long-stay residents). Taken together, monitoring of protocol delivery and adherence at the extent and level of formality seen in PROVEN may not occur under real-world conditions.
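The delivery- and adherence-rate calculations described above can be sketched as a simple aggregation over linked resident records. The field names (`cohort`, `offered`, `shown`) are illustrative assumptions for this sketch, not the actual VSR or MDS schema.

```python
# Minimal sketch of the monitoring calculations: for each cohort,
# the delivery rate is the share of residents offered a video, and
# the adherence rate is the share of offered videos actually shown.
# Field names are illustrative, not the trial's actual data schema.

def monitoring_rates(records):
    """records: list of dicts with 'cohort', 'offered', 'shown' keys."""
    rates = {}
    for cohort in {r["cohort"] for r in records}:
        group = [r for r in records if r["cohort"] == cohort]
        offered = [r for r in group if r["offered"]]
        shown = [r for r in offered if r["shown"]]
        rates[cohort] = {
            "delivery": len(offered) / len(group),
            "adherence": len(shown) / len(offered) if offered else 0.0,
        }
    return rates
```

Separating the two rates in this way makes visible exactly the problem PROVEN encountered: delivery (offering) can look acceptable while adherence (showing) lags behind.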
The aforementioned monitoring mechanisms revealed three major implementation problems: (1) the overall rate of VSR completion (i.e., video offering) was much lower than anticipated, (2) the offering rate was particularly low among long-stay residents compared to new or re-admissions, and (3) in a high proportion of cases, videos offered were not actually shown. During monthly group “check-in” calls, ACP champions cited lack of time and competing responsibilities as the main reasons for not offering the video at all. They also described situations in which they felt it inappropriate to offer a video because the patient’s care plan already indicated a preference for comfort care (e.g., enrollment in hospice care), a circumstance that was not captured on the VSR. In fact, the VSR itself proved to be a hindrance to protocol delivery, as ACP champions found it did not serve any clinical purpose and, thus, often did not complete it.
The relatively better protocol delivery among new or re-admissions most likely reflected that it was more straightforward to integrate the video offering into the workflow surrounding the distinct event of an admission than to follow the more nebulous instruction to extend a video offer every 6 months to the long-stay group. While regularly scheduled care planning meetings were suggested as potential trigger events for offering videos in the long-stay cohort, often neither family members nor patients attended these meetings; when they did, ACP champions stated there was often not enough time to show a video.
The major problem with adherence (i.e., patients and families not actually viewing the videos) was an unintended consequence of our initial decision to define protocol compliance as “offering” rather than “showing” a video, a definition that was reinforced in the initial training of ACP champions and in the feedback reports provided to them. Thus, this phenomenon may have reflected an attempt by the champions to meet compliance benchmarks in the easiest manner possible.
Attempts to improve protocol delivery primarily involved the clinical education specialists reinforcing with ACP champions that delivery of the ACP Video Program was an expected responsibility. Reinforcement was achieved through joint review of the feedback reports, monthly group check-in calls, and on-site visits to poor-performing facilities. These initial efforts met with limited success. Subsequent steps that ultimately proved critical in improving protocol delivery required a shift towards a more explanatory approach. First, the frequency of providing feedback reports to ACP champions was increased from quarterly to monthly, and the report content was modified to include the rate of showing, not just offering, videos. Second, ACP champions were told that, going forward, they would be held accountable for showing (versus offering) videos. They were also instructed to shift their efforts more towards long-stay residents relative to new or re-admissions, since the trial’s primary outcome would be measured in this cohort. To increase the focus on showing videos to this target population, the research data team generated for every facility a list of the long-stay residents who had not yet seen a video. The PROVEN implementation team leaders then conducted monthly telephone meetings with each facility ACP champion, and together they identified which of these long-stay residents were most in need of advance care planning and strategized how to facilitate those residents or their family members viewing a video.
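The facility-level target lists described above amount to a simple filter over the linked resident data. A minimal sketch follows; the field names (`facility`, `cohort`, `shown`, `resident_id`) are hypothetical, not the trial’s actual data schema.

```python
# Hypothetical sketch of generating per-facility lists of long-stay
# residents who had not yet viewed a video, as the research data
# team did for the ACP champions. Field names are illustrative.

def unshown_long_stay_by_facility(residents):
    """residents: list of dicts with 'facility', 'cohort',
    'shown', and 'resident_id' keys."""
    lists = {}
    for r in residents:
        if r["cohort"] == "long_stay" and not r["shown"]:
            lists.setdefault(r["facility"], []).append(r["resident_id"])
    return lists
```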
As an additional step to improve intervention adherence, after consultation with internal and external statistical experts, the subject enrollment period was extended from the planned 18 months to 24 months with the hope that, with the modifications to the implementation protocol, a greater number of long-stay residents would be exposed to the videos. This extension incurred a small increase in research funds to support the ongoing efforts of the HCS partners, resources that would not necessarily be available under real-world conditions.
Finally, within the Flexibility-Delivery domain, PROVEN was entirely pragmatic with regard to co-interventions, as it allowed ongoing ACP activities (e.g., use of MOLST forms) or programs to reduce hospitalization rates to occur alongside the trial. Although information about these co-interventions was ascertained in the intervention facilities during the ACP champion qualitative interviews, the presence of co-interventions could not be known in the control nursing homes. Thus, the impact of differential use of co-interventions between arms on the trial’s outcomes would not be directly measurable, a limitation of a pragmatic approach but also reflective of what happens in the real world.