April 2014 VOL 5, NO 2
Best Practices in Patient Navigation and Cancer Survivorship: Moving Toward Quality Patient-Centered Care
Anne Willis, MA
Background: The number of patient navigation and clinical survivorship programs is rapidly increasing. As more institutions develop these programs, healthcare professionals need guidance on best practices and how other institutions run their patient navigation and survivorship programs.
Methods: We conducted a national web-based survey of healthcare professionals at institutions with patient navigation and/or clinical survivorship programs. Respondents were asked to identify patient navigation caseload, measures tracked, tracking tools, clinical tools, funding, challenges, and recommendations for financial sustainability.
Results: The survey was sent to 1500 healthcare professionals; 100 completed the survey. Respondents with patient navigation and/or survivorship programs reported practice variation across several variables (eg, measures tracked) and identified common practices across institutions (eg, funding sources). Funding was identified as a challenge across both programs. Recommendations for sustainability included reimbursement, grants, and demonstrating value.
Conclusions: Patient-centered initiatives like patient navigation and clinical survivorship programs are relatively new. Program leaders and administrators need to understand caseload per full-time equivalent as well as potential ways to measure success in order to plan and implement these programs. Understanding existing practices for patient navigation and survivorship programs can assist healthcare professionals in creating and improving the delivery of these programs at their institutions.
New cancer program accreditation standards have the potential to rapidly change cancer care delivery. The American College of Surgeons Commission on Cancer (CoC), which accredits more than 1500 institutions that provide care for the majority of newly diagnosed cancer patients,1 announced new Continuum of Care Services standards that will go into effect in January 2015.2 These standards focus on patient navigation, psychosocial distress screening, and survivorship care plans (SCPs). See Table 1 for a list of the new standards.
Creating patient navigation and survivorship programs to deliver SCPs can be challenging. Many clinical professionals are tasked with developing programs, but they may not possess the program-planning skills that are essential for success. Both patient navigation and survivorship are relatively new fields that lack standards and guidance for program implementation. In addition, institutions have different resources at their disposal and have different patient populations. There is no “one-size-fits-all” approach. Tremendous variation in care exists and healthcare professionals often must take a “see-what-sticks” approach to creating these programs.
The new CoC standards have led to tremendous growth in the number of patient navigation and survivorship programs. To provide clarity and determine best practices for implementing patient navigation and cancer survivorship programs, researchers at the George Washington University (GW) Cancer Institute’s Center for the Advancement of Cancer Survivorship, Navigation, and Policy (caSNP) conducted the Best Practices in Patient Navigation and Cancer Survivorship Survey. The brief survey sought to collect and identify practices pertaining to measures tracked, clinical tools, funding, challenges, and other key topics.
The survey was developed as a brief tool to collect information related to frequently asked questions from healthcare professionals through the GW Cancer Institute’s Executive Training on Navigation and Survivorship, the caSNP and Association of Community Cancer Centers’ listservs, LinkedIn groups, and in-person conversations between GW Cancer Institute staff and other professionals. Survey topics and questions were based on a consensus of 4 staff members who are actively engaged in healthcare provider education on these topics.
The survey included 2 initial questions to assess respondent type and program type at the respondent’s institution. Respondents whose institution had a navigation program were asked to complete 7 navigation-specific questions, and those with a survivorship program were asked to answer 8 survivorship-specific questions. Respondents from institutions with both programs were asked to complete a total of 15 questions about their institutions’ navigation and survivorship programs. The final question was an open-ended request to identify other topics of interest. The survey and recruitment tools were submitted to the GW Office of Human Research and were deemed exempt from Institutional Review Board human subjects research review.
The survey was open from January 30, 2013 to February 27, 2013 (4 weeks). A link was sent out through the caSNP listserv, which included more than 1500 professionals. Listserv members were asked to send the survey to others who might be interested. No incentive was offered to complete the survey, but participants were told that results would be summarized and presented on a free webinar. In total, 146 respondents started the survey and 100 completed it. An analysis was performed using frequencies of responses for categorical data. Not all respondents answered every question; therefore, the total number of respondents and corresponding percentages vary.
Respondents were asked to identify their job type; more than 1 job type could be selected if applicable. Of 99 respondents, 33% self-identified as patient navigators, 30% as nurses, and 5% as nurse practitioners. Other respondents identified themselves as program managers (22%), other (16%), social workers (12%), administrators (10%), and primary care providers (1%).
Respondents who selected “other” identified themselves as oncology nurse navigators (2%), health educators (2%), psychologists (2%), nurse researchers (1%), or oncology navigation coordinators (1%). One respondent skipped this question.
Respondents were also asked whether their institution had a navigation program only, a survivorship program only, or both programs. Of 97 respondents, 41% reported that their institution had both programs, 40% had a navigation program only, and 19% had a survivorship program only. Three respondents skipped this question.
Patient Navigation–Specific Responses
Seventy-two respondents described their annual average patient load for full-time navigators across the cancer continuum, in increments of 50 patients ranging from fewer than 100 patients to more than 400 patients (Table 2). Screening, diagnosis, treatment, and posttreatment phases were included. Respondents could also select “not sure” or “not applicable” (“n/a”) for each phase of the cancer continuum.
Respondents were asked to indicate which of the 13 measures provided were tracked in their patient navigation program and how each measure was tracked: tracking log, medical records, validated scale/tool/questionnaire, my own scale/tool/questionnaire, or other. Respondents could indicate additional measures by selecting the “other” answer choice. The measures and tracking mechanisms of the 72 survey participants who answered the question are indicated in Table 3.
Respondents (n = 72) also identified which tracking tools they used. Seven answer choices were provided: Excel spreadsheet (51%), paper logs (47%), Electronic Medical Record (EMR; 30%), Access database (21%), other (20%), navigation software (13%), or none (3%). “Other” responses were: in-house software (1%), web-based database (1%), ARIA (1%), Epiphany database and appointment reminder (1%), PenRad and Outlook Calendar (1%), Extended Prostate Cancer Index Composite (EPIC) (1%), Siebel (1%), Efforts to Outcomes software (1%), Midas+ Care Management Module (1%), Word document forms (1%), research tools (1%), Microsoft SharePoint (1%), custom Access database (1%), and MSM (1%). MSM is assumed to be a customized tracking method through Management Systems Modelling Software.
Seventy-three respondents rated 8 common challenges using a 5-point Likert scale with 1 being most challenging and 5 being least challenging. For each challenge area respondents could select “n/a” (Table 4).
Respondents (n = 74) were asked to identify sources of funding for their patient navigation program from a list of 5 options: internal funds allocated for program (ie, new budget line item) (58%), grant support (57%), existing resources (eg, staff, space) (18%), other (12%), and direct reimbursement (1%). “Other” responses were: donations (individuals, community, and foundations) (4%), grants (1%), not funded by grants (1%), operational budget (1%), hospital auxiliary (1%), unknown (1%), and n/a (1%). Navigation programs often draw funding from multiple sources, so respondents could choose more than 1 option; percentages therefore total more than 100%.
Respondents were asked whether they formally track return on investment, cost-benefit, cost versus revenue, or cost-effectiveness. Of 69 respondents, 59% indicated that they were not tracking any of these measures. Some respondents reported tracking number of patients (38%), patient satisfaction (23%), timeliness of care (22%), barriers/resolutions (20%), direct program costs (eg, personnel, materials, training, procedures) (19%), number of procedures, tests, consultations, etc (17%), referrals from other patients and navigators (16%), clinical trial accrual (15%), adherence to treatment (10%), outmigration avoided (9%), downstream revenue (9%), no-shows avoided (7%), quality of life (7%), survival (7%), other (6%), and payments that otherwise might not have been made (0%). “Other” responses were: survivorship (1%), unknown (1%), services provided (1%), and cost avoidance (decreased emergency department visits and decreased inpatient stays) (1%).
Thirty survey participants responded to an open-ended question identifying suggestions for financial sustainability of patient navigation. Responses were: reimbursement (27%), demonstrate value (patients staying local for treatment, prevention of 30-day readmissions, finding payment assistance, cost savings, retaining patients, downstream revenue, increase research accrual, cost avoidance, migration data/prevention of outmigration, guidelines adherence, value to primary care provider [PCP] and PCP satisfaction, clinical trial accrual) (24%), grants/donations/philanthropy (14%), champions/physician buy-in (8%), state/federal funding (5%), research (5%), show program success (3%), partnership with state, hospitals, federally qualified health centers (3%), add cost into operating budget (3%), bill for nursing education for oncology nurse navigators (3%), market services (3%), external funding for navigation of particular patient populations (3%), develop pathways for patient services (3%), develop criteria for who should be navigated (3%), create a common definition and job description (3%), and certification for patient navigators (3%).
Survivorship-Specific Responses
Fifty-one survey participants identified the survivorship care plan template(s) used in their survivorship program. They could select from 9 answer choices: homegrown tool (47%), LIVESTRONG Care Plan (24%), none—we have a survivorship program but do not provide an SCP (24%), Journey Forward Care Plan Builder (22%), American Society of Clinical Oncology templates (12%), commercial survivorship care plan (9%), report generated from EMR (8%), other (6%), or discharge letter from oncologist (2%). “Other” responses provided were unknown (6%).
Respondents were asked to identify which constructs are tracked in their programs. Responses from the 54 respondents who answered the question are summarized in Table 5. “Other” responses provided were unknown (4%), pain (2%), guideline delivery (2%), PCP satisfaction and understanding (2%), referrals made (2%), and resources utilized (2%).
Respondents (n = 52) were asked which assessment tools are used in their survivorship program and could select from 14 answer choices: National Comprehensive Cancer Network Distress Thermometer (50%), patient satisfaction survey (33%), tool developed by the institution (27%), other (15%), none (12%), fatigue scale (10%), Patient Health Questionnaire (PHQ-9) (10%), Functional Assessment of Cancer Therapy-General (FACT-G) (9%), Functional Assessment of Chronic Illness Therapy (FACIT) (4%), McGill Pain Questionnaire (4%), QualityMetric’s SF-36 or SF-12 (4%), body image scale (2%), Communication and Attitudinal Self-Efficacy-Cancer (CASE-C) (2%), and Impact of Cancer Scale Tool (0%). “Other” responses provided were: intake tool (10%), unknown (4%), EPIC (2%), CancerSupportSource (2%), Work Productivity and Activity Impairment Questionnaire: General Health (WPAI:GH) (2%), Valuation of Lost Productivity (VOLP) (2%), European Organisation for Research and Treatment of Cancer (EORTC) Quality of Life Questionnaire Core 30 (QLQ-C30) (2%), EORTC QLQ-C29 (2%), RAND SF-36 (2%), Piper Fatigue Scale (2%), Edmonton Symptom Assessment Scale (2%), Canadian Problem Checklist (2%), FACIT Fatigue Scale (2%), and nutritional surveys (2%).
Fifty-two respondents identified sources of funding for their survivorship program from a list of 5 options: grant support (60%), internal funds allocated for program (ie, new budget line item) (58%), existing resources (eg, staff, space) (42%), direct reimbursement (19%), and other (17%). “Other” responses were: donations (individuals, community and private) (10%), fundraising (4%), philanthropy (2%), any source available (2%), and unknown (2%).
Respondents (n = 47) were asked to rate 8 common challenges using a 5-point Likert scale with 1 being most challenging and 5 being least challenging. For each challenge area, respondents could select “n/a.” For respondents for whom the challenge was applicable, the weighted average for each challenge is shown in Table 6.
Respondents (n = 43) were asked to identify reimbursement challenges from a list of 19 answer choices: creation of survivorship care plan (33%), none (28%), nutrition services (26%), delivery of survivorship care plan to patient (26%), care coordination with PCP and other providers (26%), weight management services (23%), psychosocial care (23%), health promotion (21%), psychosocial assessment (19%), patient navigation (19%), physical activity services (16%), not all clinician services are covered (16%), late effects education (14%), other (14%), rehabilitation (12%), symptom management and palliative care (9%), screening/surveillance (9%), specialty referrals (5%), and medical assessment (0%). “Other” responses provided were: unsure (7%), do not bill (7%), care for surgical patients is not reimbursed within 3 months of surgery (2%), and cannot bill at the appropriate level (2%).
Twenty-five survey participants responded to an open-ended question identifying how the survivorship program at their institution bills for services. Responses provided were: level 3-5 visit (24%), “n/a” (24%), do not know (20%), do not bill (16%), nurse practitioner (NP) visit (4%), and NP/PCP visit (4%).
Respondents were also asked an open-ended question to identify suggestions for financial sustainability of survivorship programs. Among the 24 respondents, suggestions included reimbursement (38%), grants (21%), “do not know” (17%), survivorship care plan efficiencies (8%), understanding the benefits of survivorship programs (8%), rehabilitation (4%), streamlined services (4%), and organizational budget (4%).
Respondents (n = 32) were asked a final open-ended question to identify additional topics of interest. Table 7 summarizes these responses.
Our study reports findings from healthcare professionals whose institutions have already established navigation and survivorship programs. Average patient navigation caseload can vary across several factors, including phase of the cancer continuum and patient population. Many navigators reported either high (>400 patients) or low (<150 patients) volume for screening, and more than half reported navigating fewer than 150 patients during the diagnosis, treatment, and posttreatment phases.
The field of patient navigation measures is still evolving and several efforts have been aimed at establishing common measures.3,4 Many of the measures cited most often by respondents were consistent with the goals of patient navigation: removing barriers to care and coordinating care.5 More than half of the respondents tracked care coordination, barriers to care/actions to remove barriers, communication between patient and provider, time to treatment, psychosocial distress, adherence to scheduled visits/missed appointments, time to diagnosis, adherence to treatment, patient satisfaction, and time to screening. Despite a core goal of removing barriers to care, 25% of respondents indicated that they did not track this measure. Navigators reported heavy reliance on low-tech tracking options, with about half using an Excel spreadsheet and/or paper as a tracking log. Use of these low-tech options may be related to funding, time, and role-clarity challenges.
The greatest challenge areas identified were related to funding and roles. Programs were reportedly funded largely by a combination of internal funds and/or grants. While grants have been instrumental in helping organizations launch navigation programs, this funding source is not sustainable. Interestingly, nearly one-quarter of the respondents identified demonstrating value as a strategy to financially sustain navigation programs, yet nearly 60% of respondents were not tracking value at all. Lack of role clarity was also cited as a challenge. This finding illustrates a larger need in the field of patient navigation to better define the roles of patient navigators and how they are distinct from and coordinate with other healthcare professionals.
Fewer respondents reported having a survivorship program. Almost half of the respondents reported using a homegrown survivorship care plan tool. In its landmark From Cancer Patient to Cancer Survivor: Lost in Transition report,6 the Institute of Medicine recommended information fields for SCPs, yet studies have shown that many SCPs do not meet these recommendations.7,8 With such high use of homegrown templates, more research is needed on how those templates were developed, what content they contain, and whether they are effective.
As with navigation, efforts are under way in the field of survivorship to establish common measures, including the National Cancer Institute’s Grid-Enabled Measures-Care Planning Initiative through its Grid-Enabled Measures Database. The initiative seeks to build consensus on survivorship care-planning measures by allowing users to submit and rate constructs and measures. In our study, psychosocial distress and patient satisfaction were reportedly tracked by more than half of the respondents, and almost half reported tracking quality of life and physical activity. Despite some common constructs, variation was reported in the tools used to measure the constructs, which creates difficulties in conducting and comparing research on patient and program outcomes.
Funding was also reported as a major challenge, with most respondents again reporting a mix of internal funds and/or grants as primary funding sources. Fewer than 20% of the respondents reported using reimbursement as a funding source, and challenges getting some services reimbursed were cited. It is important to note, however, that more than one-quarter of the respondents indicated no problems with reimbursement, and some of the reportedly problematic services do have reimbursement options, such as delivery of the survivorship care plan and nutrition consultation. This finding suggests a continued need to share best practices and successes.
This study has several limitations. Because it was meant to quickly collect best practices from a variety of respondents, the results are not representative of all healthcare professionals at institutions with navigation and survivorship programs, and the sample size is relatively small. As the goal was breadth rather than depth, limited information about the respondents is available. Moreover, any type of respondent from an institution with a navigation and/or survivorship program was eligible to complete the survey, so some of the respondents were unable to answer some of the questions. Despite these limitations, this is one of a few national studies seeking to identify best practices in navigation and/or survivorship and will be helpful for healthcare providers seeking to create or improve their programs.
Patient-centered initiatives like patient navigation and clinical survivorship programs are relatively new; however, the number of programs has grown significantly in the past 5 to 10 years, spurred on by patient demands and new accreditation requirements. Our results provide critical insight into implementation practices related to navigation and survivorship programs from institutions across the country.
The findings can assist healthcare professionals who are creating or improving programs for which little guidance is available. Based on survey findings, patient navigators who assist patients during the screening phase of the cancer continuum may be able to navigate a significantly greater number of patients annually than navigators assisting patients after diagnosis, and navigators supporting patients in treatment may need a smaller caseload than navigators supporting patients at other points along the continuum. In addition, the survey yields data to help programs improve by aligning what is evaluated with the core function of the navigator (removal of barriers to healthcare).
The results also indicate the need to identify financially sustainable models for patient navigation and clinical survivorship programs and consensus on core measures. While respondents identified demonstrating value as a key strategy for sustainability, most programs were not tracking value. Focusing evaluation on the value of patient-centered programs is critical since most programs are funded internally or by grants. Healthcare leaders who champion patient-centered programs will benefit from ongoing research, specific methods, and consensus measures to evaluate program impact.
Author Disclosure Statement: All authors have nothing to disclose.
Corresponding Author: Anne Willis, MA, George Washington University Cancer Institute, 2030 M Street, NW, Suite 4003, Washington, DC 20036; E-mail: email@example.com.
1. About accreditation. American College of Surgeons. Commission on Cancer. www.facs.org/cancer/coc/whatis.html. Accessed August 16, 2013.
2. Cancer program standards 2012: ensuring patient-centered care. American College of Surgeons. Commission on Cancer. www.facs.org/cancer/coc/programstandards2012.html. Accessed August 16, 2013.
3. National Patient Navigation Leadership Summit (NPNLS): measuring the impact and potential of patient navigation. Cancer. 2011;117(suppl 15):3535-3623.
4. Crane-Okada R. Evaluation and outcome measures in patient navigation. Semin Oncol Nurs. 2013;29(2):128-140.
5. Freeman HP, Rodriguez RL. History and principles of patient navigation. Cancer. 2011;117(suppl 15):3539-3542.
6. Hewitt M, Greenfield S, Stovall E, eds. From Cancer Patient to Cancer Survivor: Lost in Transition. Washington, DC: The National Academies Press; 2006.
7. Stricker CT, Jacobs LA, Risendal B, et al. Survivorship care planning after the Institute of Medicine recommendations: how are we faring? J Cancer Surviv. 2011;5(4):358-370.
8. Salz T, Oeffinger KC, McCabe MS, Layne TM, Bach PB. Survivorship care plans in research and practice. CA Cancer J Clin. 2012;62(2):101-117.