Morgan Nestingen, MSN, APRN, AGCNS-BC, NEA-BC, OCN, ONN-CG
Navigators at the Miami Cancer Institute (MCI) have built a successful navigation program database to track patient care delivery and report metrics, according to Morgan Nestingen, MSN, APRN, AGCNS-BC, NEA-BC, OCN, ONN-CG, director of Nursing, Patient Intake and Navigation Services at MCI.
The team’s objective heading into the project was to use widely available tools (in this case Microsoft Access) to design and implement a navigation program database that would improve the routine delivery of patient care (including day-to-day patient tracking and follow-up), as well as reporting of standardized navigation metrics.
At the AONN+ 13th Annual Navigation & Survivorship Conference in New Orleans, Ms Nestingen shared the challenges and wins involved with getting the project off the ground, as well as important lessons learned.
The Scope of the MCI Navigation Program
The navigation program at MCI is treatment-focused and nonreferral-based, seeing more than 8500 unique patients with 35,000 touchpoints per year. The program employs 31 nurse navigators and 12 patient navigators working in 13 disease sites.
“We’re really focused on active cancer treatment: from the moment a patient comes in for their first consult to the time they exit treatment and head to our APRN [advanced practice registered nurse]-led survivorship clinic,” she said. “We are not a diagnostic program—we have no metrics that focus on time to diagnostic resolution—and we’re not an outreach program.”
Starting with a Purpose
Before beginning to build their database, the team at MCI identified some of the challenges they faced in tracking patient care delivery and reporting metrics.
First, they work with a traditional electronic health record (EHR). These are conceptually based on paper charts and are often designed for direct 1:1 patient care, dealing with only 1 patient at a time. According to Ms Nestingen, using this kind of system can make it extremely difficult to determine which individuals need more attention (especially when dealing with 8500 patients).
“You’re relying on your team to tell you which patients are in trouble, but they’re doing that based mostly on memory,” she noted.
Tracking program outcomes and metrics requires robust reporting across populations, but most traditional EHRs do not allow navigators to follow and document caseloads or populations over time. This limits the navigators’ ability to organize care across populations, as well as the leaders’ ability to demonstrate outcomes and return on investment.
“We rely on our IT experts; we need them and appreciate them, but we have no control over the timing of things that happen in their world,” she said. “So we wanted to do something in this project that we could directly control.”
Navigation is related to population health concepts, so the team needed the ability to see data from hundreds of patients at once, to follow that caseload over time, and to report specific outcomes and measures broken down into relevant subpopulations (ie, disease sites and clinics). The final pilot project included 10 pilot study metrics within 5 universal categories (eg, navigation caseloads, patient outcomes, and interventions implemented).
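The kind of population-level reporting described above can be sketched with a toy database. The schema, field names, and data below are illustrative assumptions, not MCI's actual design; SQLite stands in here for Microsoft Access.

```python
import sqlite3

# Hypothetical, simplified schema -- field names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patients (
        mrn            TEXT PRIMARY KEY,
        disease_site   TEXT NOT NULL,     -- subpopulation (eg, 'Breast')
        navigator      TEXT NOT NULL,
        status         TEXT NOT NULL,     -- 'active' or 'exited'
        next_follow_up TEXT               -- ISO date, NULL if none due
    )
""")
conn.executemany(
    "INSERT INTO patients VALUES (?, ?, ?, ?, ?)",
    [
        ("001", "Breast",        "RN-A", "active", "2022-10-01"),
        ("002", "Breast",        "RN-A", "active", "2023-01-15"),
        ("003", "Head and Neck", "RN-B", "active", "2022-09-20"),
        ("004", "Endocrine",     "RN-C", "exited", None),
    ],
)

# Active caseload and overdue follow-ups, broken down by disease site --
# the across-population view a 1-patient-at-a-time chart cannot give.
today = "2022-11-01"
rows = conn.execute("""
    SELECT disease_site,
           COUNT(*) AS active_caseload,
           SUM(CASE WHEN next_follow_up < ? THEN 1 ELSE 0 END) AS overdue
    FROM patients
    WHERE status = 'active'
    GROUP BY disease_site
    ORDER BY disease_site
""", (today,)).fetchall()

for site, caseload, overdue in rows:
    print(f"{site}: {caseload} active, {overdue} overdue")
```

The same grouped query, pointed at a different subpopulation column (eg, clinic or navigator), yields each of the caseload-style reports described here.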
“The challenges we had walking into this were the same challenges that the folks on the AONN+ Navigation Metrics Toolkit studies encountered,” she noted.
Metrics challenges included:
- Lack of uniformity in data collection (including interpretation differences between navigators and disease sites)
- Navigator education and buy-in
- Finding correct information within a chart
- Avoiding double-charting of data
- Lack of discrete fields (many navigators would write phrases or descriptions in Excel; perhaps helpful to them, but lacking generalizability)
- IT delays (eg, report building, IT customization)
- Documentation learning curves and limited experience with Microsoft Access (the database was built in Access rather than the more familiar Excel)
- Reporting issues—utilizing manual reporting and interpreting reporting errors
- Variation in navigator workflows and individual patient needs
According to Ms Nestingen, the team was set on using available tools and a rapid Plan-Do-Study-Act (PDSA) project cycle to create the database.
“We wanted to use widely available tools: something that was free, immediate, and on everybody’s computer already,” she said. “And we wanted to do a quick PDSA to see if we could actually track the things that we wanted to track in a testing environment, before spending time to get it baked into the EHR.”
In the planning portion, experienced navigation leaders reviewed existing documentation, as well as reports and metrics already captured, and worked out how much could be accomplished within the existing EHR.
They determined that critical needs beyond the EHR’s functionality included patient tracking, follow-up reminders, and additional reporting.
The “Do” portion of the cycle involved frontline navigators, leaders, and IT experts reviewing existing tracking tools (Excel lists, EHR reminders, paper, etc), specific target metrics (ensuring executive buy-in for each metric), and Microsoft Access functions (including discrete fields, search queries, and real-time lists that allowed for access to specific patient records). Over the course of 2 months, the team designed, built, and tested these new discrete fields, critical real-time lists to support daily patient care, and queries to generate key reports.
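A rough illustration of what discrete fields and a real-time list buy over free-text notes: because each tracked item below is a typed field rather than a phrase in a spreadsheet, a follow-up work list can be generated automatically. The record structure and field names (`barrier_code`, `next_follow_up`) are assumptions for the sketch, not MCI's actual fields.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class NavigationRecord:
    mrn: str
    navigator: str
    barrier_code: str        # discrete field (eg, 'TRANSPORT'), not free text
    next_follow_up: date     # discrete date, so lists can be computed from it

records = [
    NavigationRecord("001", "RN-A", "TRANSPORT", date(2022, 10, 28)),
    NavigationRecord("002", "RN-A", "FINANCIAL", date(2022, 12, 5)),
    NavigationRecord("003", "RN-B", "NONE",      date(2022, 10, 30)),
]

def follow_ups_due(records, navigator, as_of):
    """Real-time work list: this navigator's follow-ups due on or before as_of."""
    return sorted(
        (r for r in records
         if r.navigator == navigator and r.next_follow_up <= as_of),
        key=lambda r: r.next_follow_up,
    )

due = follow_ups_due(records, "RN-A", date(2022, 11, 1))
for r in due:
    print(r.mrn, r.next_follow_up.isoformat())
```

The same filter run with a different navigator name produces each team member's daily list, which is what made caseload handoffs visible to leaders.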
Ms Nestingen noted the importance of naming—be it queries, discrete fields, or lists—in a way that makes intuitive sense to staff.
“Naming conventions matter, and they’re a lot harder to change down the line once you’ve built reports and queries that refer to the name of that object in a database or charting system,” she advised. “When you work with your IT teams, think ahead of time about what you want to call things, and what’s going to make sense to your staff.”
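Her point about naming can be made concrete with a small sketch. The prefixes and object names below are hypothetical conventions, not MCI's: the idea is simply that a database object's name should encode its type and contents, so staff can find it and downstream reports can refer to it without ambiguity.

```python
# Names that tell staff nothing, and are painful to rename once
# reports and queries refer to them.
CRYPTIC = ["Query1", "List_final_v2", "tbl_new"]

# Hypothetical convention: type prefix + descriptive CamelCase body.
DESCRIPTIVE = [
    "qry_FollowUpsDue_ThisWeek",   # query: follow-ups due this week
    "lst_ActiveCaseload_Breast",   # real-time list: active breast caseload
    "tbl_NavigationContacts",      # table: all navigation touchpoints
]

def is_descriptive(name: str) -> bool:
    """Minimal check: a known type prefix plus a meaningful CamelCase body."""
    prefix, _, body = name.partition("_")
    return prefix in {"qry", "lst", "tbl"} and len(body) > 3 and body[0].isupper()

assert all(is_descriptive(n) for n in DESCRIPTIVE)
assert not any(is_descriptive(n) for n in CRYPTIC)
```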
In the “Study” portion, navigation leaders hosted brief weekly huddles to review:
- Database entries—is the new documentation successful?
- Reports—are the new metrics actually reportable?
- Staff input—what challenges are staff encountering?
She noted that these weekly huddles were created largely in response to a very active “Teams” chat environment, in which questions and solutions were constantly being exchanged between navigators.
“They started asking each other questions about the database, and we were seeing the same questions come up time and time again, so we decided we needed to meet with them on a regular basis,” she said. “At the beginning, we did a lot of talking during those 15-minute huddles, and at the end, they did all the talking.”
The entire department received live and video training, and superusers received 1:1 training on using the new database. She pointed out that superuser buy-in and their 1:1 review with frontline navigators were critical to their success.
Once the database was up and running in the “Act” phase, navigation leaders reported to executives on the new metrics, identifying ongoing challenges and “pain points,” and considered specific alternatives for the future (ie, third-party software vs EHR updates).
“We knew this either needed to live in the EHR, or at least feel to the team like it lived in the EHR,” she said. (For example, a third-party software solution that would launch from within the EHR.)
According to Ms Nestingen, double-charting was the team’s Achilles’ heel, and perceived double-charting turned out to be the biggest staff dissatisfier throughout the project.
“Microsoft Access will save patient info automatically, or navigators can click a ‘save’ button and close a chart, but every once in a blue moon a chart wouldn’t save,” she said. “All it took was that happening once for a nurse to no longer trust the system. So we had team members who still used Excel or paper; some were triple-charters.”
Although frontline navigators stated they understood the importance of the database for standardization and patient care tracking, Microsoft Access was new to many of them, and they struggled to balance documentation with patient care, she reported.
Moving to the Next Phase
At the end of the pilot phase, the database was widely considered to be a program-wide solution (head and neck navigation was operating in the same way as endocrine, general navigation, etc) that allowed leaders to track patient care, provide navigation support on a daily basis, and gauge who was falling behind.
“During this project, we had 3 navigators go out on FMLA [Family and Medical Leave Act] for various reasons, and we were able to go in and see their whole caseload,” she said. “We saw where they were planning to follow up for this specific patient population, and we were able to either reassign it or do it ourselves, as leaders.”
The pilot project also allowed the navigation team to test and implement their new metrics while also determining how to actually measure and interpret them from a statistical stance.
Long-term, the MCI navigation department plans to move to an EHR-compatible software that will integrate with scheduling data, track patients, organize caseloads, minimize charting time, and display navigator and leadership reports and dashboards while also capturing critical demographics and clinical factors to support population health (all with an intuitive user interface).
“But,” she added, “all of this is only as good as it is accessible, easy-to-use, and intuitive to our team.” Ultimately, the goal is full integration of navigation metrics into the EHR.