New York State Department of Health
Hospital Medical Home Demonstration Final Report DRAFT
Summary
Overall Goals and Description of the Hospital Medical Home pilot:
- All sites (100%) achieved the goal of NCQA PCMH recognition at Level 2 or 3 under the 2011 standards, and 89% of sites achieved NCQA PCMH recognition at the highest level (Level 3 under the 2011 standards).
- Significant improvements (p<0.05) were observed in 8 of 17 clinical performance metrics studied.
- Residency training programs were restructured in 82% of sites and 68% of sites increased resident time in ambulatory settings by the end of the demonstration.
- All four care coordination and integration projects showed significant improvement (p<0.05) over the course of the demonstration with improvements in almost all milestones related to the four projects.
- All sites made qualitative improvements in inpatient quality and safety, especially sepsis protocols and sepsis teams, though statistically significant quantitative improvement (p<0.05) using hospital-reported data could be found for only two measures: Central Line Bundle Compliance and Venous Thromboembolism Discharge Instructions. However, these data are not yet officially reported by CMS for the years covered by this project, and the inpatient data can be re-evaluated when those reports become available.
- Half (50%) of the 12 sites participating in the Surgical Care Improvement Project (SCIP) inpatient project and 54% of the 41 sites participating in the Central Line-Associated Bloodstream Infection (CLABSI) inpatient project reached a higher level of performance by the end of the demonstration. Notably, 90% of the 10 sites participating in the NICU CLABSI inpatient project achieved a higher level of performance by the end of the demonstration.
Policy Recommendations:
For Ambulatory Clinics Serving Medicaid Members:
- Outpatient clinics serving Medicaid members should be structured as advanced primary care models consistent with patient centered medical home principles.
- Outpatient clinics should provide care management for high risk patients and patients with chronic disease.
- Outpatient clinics should coordinate care across inpatient and outpatient settings and with specialty care.
- Outpatient clinics should be required and/or incentivized to track and report population health and quality metrics that are tied to payment.
- Outpatient clinics should be required to routinely exchange information with their Regional Health Information Organization or Health Information Exchange.
- Behavioral health should be integrated into all primary care settings through Collaborative Care or other evidence-based programs.
For ACGME - Residency Programs
- Residency programs training primary care residents should be encouraged or required to provide training in outpatient clinics that have been transformed into an advanced primary care model.
- Residency programs and hospitals should be required to provide training in interdisciplinary teams, including care managers, to prepare residents for team-based care.
- Primary care and specialty residency training programs should be required to jointly develop referral guidelines, communication systems, and co-management agreements to better coordinate care.
- Residency programs training primary care residents should incorporate explicit patient empanelment as a core element of primary care training.
- Residency programs should train and involve residents in the coordination of care between inpatient and outpatient settings.
For CMS - Hospitals and Residency Programs
- Hospitals should be required to report adherence to sepsis protocols.
- CMS should fund additional incentives to encourage medical students to choose primary care.
- CMS should encourage other states to use the waiver process to reform primary care outpatient training sites.
- Hospitals should be required or incentivized to develop policies to include residents in quality improvement committees, reviews, and projects.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to coordinate care between primary and specialty care.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to formalize interdisciplinary and interdepartmental teams across these settings to better integrate care given by residents.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to include care managers in these teams.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to develop methods to follow their patients across transitions of care to prevent unnecessary readmissions and patients lost to follow-up.
Section | Table of Contents | Page |
---|---|---|
I. | Introduction and Background | 5 |
II. | Research Questions Under Investigation | 11 |
III. | Evaluation Design/Evaluation Type explanation | 12 |
IV. | Data Needs and Data Sources | 15 |
V. | Data Analysis/Results | 20 |
VI. | Major Findings | 48 |
VII. | Challenges | 53 |
VIII. | Lessons Learned | 55 |
IX. | Limitations | 57 |
X. | Policy Recommendations | 58 |
Appendix | | |
Participating Hospitals | IA1 | |
HMH Finance Spreadsheet | IA2 | |
Work Plan Example | IA3 / IVB2 | |
Care Integration Report Card | IA4 | |
Clinical Performance Report | IA5 | |
Hospital Site Visit Summary | IA6 | |
Hospital Medical Home Conference 2014 | IA7 | |
Hospital Medical Home Conference 2015 | IA8 | |
CLABSI Definition Changes | IIIA1 | |
Tertile Report Appendix | IIIA2 | |
Feedback Letter Example | IIIB3 | |
Table of Required Measures | IVA1 | |
Application Example – Bellevue Hospital | IVB1 | |
PCMH Baseline Assessment Example | IVB3 | |
Quarterly Report Narrative Tool Questions | IVB4 | |
Annual Narrative Reporting Tool Questions | IVB5 | |
Hospital Medical Home Final Report Instructions | IVB6 | |
Example Site Visit Presentation | IVB7 | |
Example Care Transition Coaching Call Materials | IVB8 | |
NCQA Tracking Spreadsheet | VA1 |
I. Introduction/Background
The Hospital Medical Home (HMH) initiative was a quality demonstration program under the 2010 Partnership Plan CMS 1115 Waiver, in which up to $250 million in funding was awarded through New York State to 65 hospitals to transform their primary care training sites into National Committee for Quality Assurance (NCQA) Recognized Patient-Centered Medical Homes (PCMHs) at Level 2 or 3 under the 2011 standards, implement patient safety and quality improvement projects in both the ambulatory training setting and the inpatient setting, and extend or enhance the resident and patient continuity experience (See Appendix IA1). The purpose of this demonstration was to improve the coordination, continuity, and quality of care for individuals receiving primary care in hospital outpatient departments operated by teaching hospitals, as well as other primary care settings used by teaching hospitals to train resident physicians. This demonstration was meant to be instrumental in introducing the next generation of practitioners to the important concepts of patient-centered medical homes.
The goals of the demonstration were:
- Provide better care of chronic disease.
- Increase preventive screenings and immunizations.
- Increase access to care for acute conditions.
- Improve health for individual Medicaid members seen in training clinics.
- Improve performance on population health.
- Decrease potentially preventable readmissions for certain defined high-risk populations.
- Have primary care training sites achieve PCMH Level 2 or 3 recognition under NCQA 2011 standards.
- Train the future primary care workforce in new models of primary care and encourage adoption of advanced primary care models such as PCMH.
Participating entities that serve as training sites for primary care residents were required to: transform their sites into high-level patient-centered medical homes and obtain National Committee for Quality Assurance (NCQA) Patient-Centered Medical Home Level 2 or Level 3 Recognition under the 2011 standards; develop and report on 5 clinical performance metrics; restructure operations to enhance patients' continuity-of-care experience and extend the ambulatory training experience for residents; implement one of four care integration initiatives; and implement two of six quality and safety improvement projects. (See further details below.)
The awards to hospitals ranged from just over $120,000 to $21 million per hospital and participating affiliated outpatient sites, and averaged $3.9 million. Allocation of demonstration funds, per the terms and conditions, was based 80% on total Medicaid visits and 20% on the number of residents, with an additional 25% weight for community-based sites.
Awards were distributed in five payments over a two-year period and were contingent on the successful completion of defined milestones. The HMH Finance spreadsheet details payment information, penalties, distribution methodology, and actual awards post-recalculation (See Appendix IA2).
Awardees initially encompassed 65 teaching hospitals throughout the state, and the demonstration ended with 60 participants: 28 in New York City (NYC) and 32 throughout the rest of the state, with 119 residency training programs: 48 Internal Medicine; 34 Pediatrics; 33 Family Medicine; and 4 Internal Medicine/Pediatrics. Initially there were 162 outpatient primary care residency training clinics, but 6 of these withdrew. Together, these clinics train approximately 5,000 primary care residents and serve approximately 1,000,000 Medicaid members. During the demonstration, five hospitals withdrew from the demonstration, as did six residency programs. Long Beach Medical Center and Staten Island Hospital left the project due to the impact of Hurricane Sandy on hospital operations. One hospital, New York Downtown Hospital, closed. St. John's Episcopal Hospital South Shore left the project due to restructuring, and Interfaith Medical Center left the project due to bankruptcy challenges.
Hospitals/Residency Programs | |
---|---|
Total HMH Hospitals (Sites) (Residency Programs) | 60 (156) (118) |
Hospital Counts by Region of State | |
NYC | 28 (47%) |
Rest of State | 32 (53%) |
Outpatient Counts by Region of State | |
NYC | 95 (61%) |
Rest of State | 61 (39%) |
Total Primary Care Residents Participating | 5524 |
Internal Medicine | 3,606 (65%) |
Pediatrics | 1,341 (24%) |
Internal Medicine-Pediatrics | 75 (1%) |
Family Medicine | 503 (9%) |
Project Milestones
The milestones for the HMH demonstration included:
- PCMH Level 2 or 3 2011 NCQA Recognition: Prior to this project, approximately 70 of the participating primary care ambulatory sites were not recognized as patient-centered medical homes under any designation. Ninety sites were recognized by NCQA under the 2008 standards and upgraded their recognition to the 2011 standards during this demonstration program. By July 2014, there were 156 additional primary care practices in New York recognized as NCQA PCMH Level 2 or 3 under the 2011 standards due to the HMH Demonstration.
- Increased Continuity: Medicaid members in residency training clinics often receive care that is discontinuous and uncoordinated due to residency training schedules and competing priorities. Extension or enhancement of continuity for residents and patients was a cornerstone of this demonstration. Hospitals submitted narrative descriptions as well as a variety of quantitative measures of continuity improvement which were followed over the course of the program.
- Clinical Performance Measures. Each hospital reported on at least five Quality Assurance Reporting Requirements (QARR)/Healthcare Effectiveness Data and Information Set (HEDIS) or Meaningful Use (MU) clinical performance measures per outpatient site. Measures reported on most frequently across the hospitals for adults included: diabetes care, control of high blood pressure, screening for colon, breast and cervical cancer, BMI assessment and counseling for nutrition. Measures for children included: childhood immunization status, counseling for nutrition and physical activity, lead screening, and well-child visits.
- Increased Coordination of Care: Each hospital worked on at least one of the four care coordination projects listed below for each outpatient site. Each of these projects built on standards required and competencies developed in the transformation of a primary care practice to a PCMH.
- Care Transitions and Medication Reconciliation: Hospitals were required to develop an infrastructure at the patient's primary care practice that ensured information was shared between settings whenever there was a transition of care. A medication reconciliation registry, with the ability to link to Medicaid data, was included as a requirement. There were 80 sites that participated in this project.
- Integration of Physical and Behavioral Health Care: Hospitals that chose this project were provided a package of Collaborative Care training resources in coordination with the NYS Office of Mental Health (OMH). Hospitals were also required to develop a linkage from their outpatient sites to the NYS OMH Psychiatric Services and Clinical Knowledge Enhancement System (PSYCKES) database, to improve their training of residents in depression and pain management, and to develop the infrastructure for consistent reporting to the NYS Controlled Medication Prescriber database. There were 34 sites that participated in this project.
- Improved Access and Coordination between Primary and Specialty Care: Hospitals that chose this project developed plans to improve access to specialists, improve coordination of referrals, and improve patient, primary care provider, and specialist communication and satisfaction. There were 54 sites that participated in this project.
- Enhanced Interpretation Services and Culturally Competent Care: This area focused on improving primary care services for limited English proficiency patients and enhanced provision of culturally competent care. There were 28 sites that participated in this project.
- Improved Inpatient Quality and Safety: Hospitals also reported on at least two of six inpatient quality and safety improvement projects. The areas included Severe Sepsis Detection and Management, Central Line Associated Bloodstream Infections (CLABSI), Venous Thromboembolism Prevention and Treatment (VTE), Surgical Care Improvement Project (SCIP), Neonatal Intensive Care Unit (NICU) Safety and Quality, and Avoidable Pre-Term Births.
Work Plan
Each hospital was required to submit a detailed work plan, using a template developed by the NYS DOH, that described its baseline status, plans to meet the demonstration milestones, and overall budget (See Appendix IA3). The work plan was reviewed by a team of reviewers at the NYS DOH, including clinicians and data analysts, and compared against existing data sources such as central line-associated bloodstream infection data and surgical site infection data from CDC/NHSN and residency program data from the NYS Council on Graduate Medical Education. Reviewers requested revisions on all submitted work plans to ensure all program deliverables were addressed. In addition, reviewers worked with individual hospitals to provide guidance on the selection of standardized measures, to develop a data collection strategy, and to set achievable and meaningful goals. Project work plans can be accessed at https://hospitalmedicalhome.ipro.org/workplans/index.
Program Support/Educational Interventions
Specific practice support was given to each site primarily through the hospital´s HMH project coordinator. The hospital´s project coordinator and other members of hospital staff, residency programs, and outpatient clinics received educational support and clarifications via telephone and email from HMH staff and clinical reviewers. A Bureau mail log for centralized emailing was set up for the HMH demonstration so that all necessary NYS DOH staff had access and could assist in answering questions or concerns of participants.
Additionally, each facility was assigned to one of three HMH reviewers for the duration of the project. These reviewers were clinicians knowledgeable in the Hospital Medical Home program, PCMH concepts, measure reporting, and quality improvement. They were a primary resource for counsel and direction in all areas of the project. The reviewers provided formal quarterly feedback on the submitted narrative and data for all inpatient and outpatient projects, requested remedial root cause analyses in areas of concern, and were available throughout the quarter to answer questions and approve data changes. The NYS DOH created quarterly 'report cards' that showed each site's performance on metrics within the care coordination and integration projects in relation to other sites' performance (See Appendix IA4). Report cards used identification numbers rather than site or hospital names, and hospitals were only aware of their own sites' IDs. Composite scores (average rates on required measures within each project) were also displayed so that each hospital and ambulatory site could compare itself to others. These reports were posted to the HMH portal each quarter along with a clinical performance report. The clinical performance report displayed the average rate reported on the most commonly chosen clinical performance metrics, by quarter, as well as the Quality Assurance Reporting Requirements (QARR) statewide Medicaid rate for that measure (See Appendix IA5). This allowed sites to compare their own rates with the average reported rate in HMH and with an external benchmark.
Throughout the demonstration, the NYS DOH provided instruction, guidance, and numerous learning opportunities. The NYS DOH held webinars and conference calls with participants to provide the necessary information, assistance, and resources. Webinars were held to explain instructions for the application, work plan, and reporting portal.
Conference calls, open to all participants, were held at the beginning of each of the seven reporting periods and one week prior to the reporting portal closing. Coaching calls were facilitated by HMH staff and presenters in areas of priority following a training needs survey administered to participants. There were 22 coaching calls hosted throughout the project that provided support on quarterly documentation, hospital report cards, PCMH recognition, resident empanelment, health information technology, introduction to clinical quality measures, plan-do-study-act (PDSA) cycles, transitions of care from hospital to clinic, increasing eye exams for patients with diabetes, and medication reconciliation.
Participating programs that had successfully navigated these issues presented their successes on the calls, and a facilitated discussion followed.
Throughout the course of the demonstration, members of the NYS DOH visited 36 of the 60 hospitals and their affiliated outpatient sites, during which the programs presented their projects, residents discussed their experiences, and feedback was shared. Follow-up letters were provided to the hospitals, and in a number of cases, additional information was requested (See Appendix IA6). Additionally, over 250 attendees, including residents, residency program directors, clinic staff, administrators, and community and professional organizations, attended each of the HMH conferences in 2014 and 2015 (See Appendix IA7, IA8). All plenary sessions were videotaped and made accessible through the HMH website at https://hospitalmedicalhome.ipro.org/pages/annual_conference.
Technical Support
Island Peer Review Organization (IPRO), under contract with the NYS DOH, developed a website for the demonstration (https://hospitalmedicalhome.ipro.org) that was also used as the submission portal for the project. The application and work plan were electronically submitted by each participating hospital to the NYS DOH via this portal. The application requested identifying and other information regarding the hospital facility and their participating residency programs and outpatient sites. The application asked for preliminary information regarding choices of types of participation in the demonstration.
The portal for the work plan provided instructions, resources, questions, and a measure grid for each section of the demonstration to guide the participants. The measure grid contained required measures and places for facilities to make additional measure entries. Hospitals were to enter the information for each site that was included in the demonstration. The measure grid was used for submitting measure definitions, numerators, denominators, data source, baseline data and goals. An "add a measure" function allowed hospitals to propose an additional measure to be used in their own sites which required submission of definition and specifications and was then reviewed and approved through the clinical team.
In order to track each facility's progress, a separate reporting portal and platform was developed through which every participating hospital submitted quarterly data. This portal was used to submit and track data, provide relevant resources, and update program information such as contact information. Each facility had a portal section devoted to that facility that included its own profile page. The reporting portal was divided into two sections: a measure grid called Milestone Data and a Hospital's Narrative Questions section. The Milestone Data measure grid listed the selected and required measures for each section of the project and for each site. Hospitals were able to enter both metric and narrative data, access previously submitted and locked quarterly data, and view graphed trends of their own data. The website also provided participants access to their application and work plan, data performance reports, quarterly data feedback letters, coaching call and conference information, tools and resources, announcements, updates, and help desk access. Portions of the website were open to the public, while the hospital-specific data was protected by a sign-in process. During each of seven quarterly report periods, hospitals were provided with a help line for technical questions as well as two collaborative conference calls to review any demonstration-related changes and answer questions.
Hospitals were also provided with technical assistance from NYS DOH staff and experienced IPRO Quality Improvement consultants; continuously updated resources including individual hospital materials, webinars, videos, and toolkits; and opportunities for collaboration with other agencies such as the Office of Mental Health (OMH), as well as the hospital and professional associations and the Primary Care Development Corporation. Project resources were organized into project areas and posted publicly on the HMH portal at https://hospitalmedicalhome.ipro.org/pages/resources.
II. Research Questions Under Investigation
The following section contains a list of research questions that address CMS's overarching questions in the areas of: demonstrable improvements in the quality of care received by demonstration participants (including measures of access, utilization, and quality of care); the extent to which HMH has produced replicable residency program design; and how HMH helped facilities improve systemic changes and quality performance. Discussion of qualitative improvements and analyses that speak to these three overarching questions will be presented in the results section, along with the quantitative analyses that correspond to the subset of research questions listed below.
- Has the State´s HMH Demonstration resulted in demonstrable improvements in the quality of care received by demonstration participants?
- Have there been significant improvements in clinical performance metrics since the beginning of the demonstration?
- Have there been significant improvements in performance on care coordination and integration projects since the beginning of the demonstration?
- Does post discharge medication reconciliation (PDMR) impact the rates of all cause 30-day readmission, or potentially preventable readmission?
- Have there been increases in rates of resident continuity since the beginning of the demonstration?
- Has increased resident continuity in HMH been associated with better clinical performance?
- Has performance on inpatient measures improved since the start of the demonstration?
- Has the change (if any) in potentially preventable readmissions since the beginning of the project differed in comparison to hospitals not participating in HMH?
- Are follow-up visits and follow-up calls for high-risk Medicaid patients that are completed within 48 hours of hospital discharge correlated with lower high-risk Medicaid patient readmissions rates within 30 days of the initial discharge?
- Did Collaborative Care Initiative (CCI) rates improve throughout the demonstration for sites participating in CCI?
- Did clinics in the behavioral health project achieve expected goals for screening depression?
- Did the clinics in the behavioral health project achieve expected goals for enrollment into collaborative care?
- Among those patients that remained in collaborative care for at least 16 weeks, was there a significant improvement in the rate of patients whose Patient Health Questionnaire (PHQ-9) score dropped to below 10?
- To what extent has HMH produced replicable residency program design features that enhance training in medical home concepts?
- Has the number of sites that report having restructured resident training schedules increased significantly since the beginning of the demonstration?
- Has the number of sites that report having increased resident time in ambulatory settings increased significantly since the beginning of the demonstration?
- Have residents been assigned a panel of patients for whom they are responsible over an extended time period?
- Compared to the beginning of the demonstration, are residents more likely to believe that the residency program clinic schedule allows residents to develop continuous relationships with their patients?
- How has the HMH demonstration helped facilities improve both their systemic and quality performance under each initiative implemented by the selected facilities?
- Has the number of sites that report having office processes for outpatient visits including accessing the Regional Health Information Organization (RHIO) increased since the beginning of the demonstration?
- Has the number of sites that report having hospital processes for admissions including accessing the RHIO increased since the beginning of the demonstration?
- Has the number of sites reporting ´yes´ to each care integration project question on implementation of systemic changes (21 questions) increased significantly since the beginning of the demonstration?
- Have all sites become recognized as high level (level 2 or 3 under 2011 standards) PCMHs?
- Has the number of sites reporting 'yes' to each inpatient question related to infrastructure building increased significantly since the beginning of the demonstration?
- Have the majority of sites moved up a performance band (tertile) in each inpatient project?
III. Evaluation Design/Evaluation Type
Clinical performance metrics (1a), the rates for care coordination and integration projects, which contribute to site-level composite scores (1b), and inpatient measures (1f) were collected from Q3 2013 through Q4 2014 using data submitted via the HMH web portal. Analyses of inpatient measures were limited to those with a normal distribution, given the small sample sizes used in the analysis (for sample sizes under 30, a normal distribution is needed to determine significant differences using parametric tests), and to those reported on by more than 20 sites. Site-level composite scores were calculated for each of the four care integration and coordination projects by averaging the rates of required measures reported by individual sites (see full list of measures used in each composite measure in section IV). For required measures where lower rates were desirable, rates were subtracted from 1.0 and the inverted rate was used to calculate the composite score. T-test analyses were used to evaluate improvements over time in clinical performance metrics, care coordination and integration projects using composite scores, and inpatient measures.
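To make the composite-score construction concrete, the sketch below shows one way it could be computed. The measure names and site rates are hypothetical, and because the report does not specify which t-test variant was used, a paired t-test on the same sites over time is shown for illustration only; it is not the demonstration's actual analysis code.

```python
# Illustrative sketch only; measure names and site values are hypothetical.
from scipy import stats

def composite_score(rates, lower_is_better=frozenset()):
    """Average a site's required-measure rates (0-1 scale), inverting any
    measure where a lower rate is desirable, as described in Section III."""
    adjusted = [1.0 - r if name in lower_is_better else r
                for name, r in rates.items()]
    return sum(adjusted) / len(adjusted)

LOWER_IS_BETTER = {"interpreter_wait_time"}  # hypothetical "lower is better" measure

# Hypothetical required-measure rates for three sites at baseline and at Q4 2014.
baseline_sites = [
    {"cross_cultural_training": 0.40, "interpreter_wait_time": 0.30},
    {"cross_cultural_training": 0.55, "interpreter_wait_time": 0.25},
    {"cross_cultural_training": 0.35, "interpreter_wait_time": 0.40},
]
final_sites = [
    {"cross_cultural_training": 0.70, "interpreter_wait_time": 0.15},
    {"cross_cultural_training": 0.80, "interpreter_wait_time": 0.10},
    {"cross_cultural_training": 0.65, "interpreter_wait_time": 0.20},
]

baseline_scores = [composite_score(s, LOWER_IS_BETTER) for s in baseline_sites]
final_scores = [composite_score(s, LOWER_IS_BETTER) for s in final_sites]

# Paired t-test comparing the same sites' composite scores over time.
t_stat, p_value = stats.ttest_rel(baseline_scores, final_scores)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```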
The relationship between post discharge medication reconciliation and readmission (1c) was assessed by comparing readmission rates for Medicaid patients who had a post discharge medication reconciliation (PDMR) to those who had not. Participating outpatient sites submitted to the NYS DOH quarterly lists of patients who had medication reconciliations done by the outpatient site following a hospital discharge, each quarter from Q3 2013 to Q2 2014. Using a retrospective cohort study design, these lists were verified in Medicaid claims and encounter data, and all-cause 30-day readmissions were identified in the Statewide Planning and Research Cooperative System (SPARCS), an all-payer dataset of hospital discharges. SPARCS and Medicaid data were also used to create a control group, comprised of Medicaid enrollees who were patients of the outpatient site and had a hospital discharge in the same time period as the intervention group but did not appear on the patient lists submitted by the outpatient sites (these patients did not have PDMR performed by the ambulatory site). Finally, potentially preventable readmissions (PPRs), as identified by 3M software, were identified in 2013 data (2014 PPRs were not available at the time of this analysis). Datasets were limited to active patients (defined as having had a visit to the ambulatory site within six months prior to hospitalization). Logistic regression analyses were performed using combined quarterly data to determine the impact of PDMR (the exposure variable) on all-cause 30-day readmissions and PPRs (the outcome variables). The independent variables for modeling the probability of an all-cause 30-day readmission included initial hospitalization admission type, patient clinical risk group (CRG), mental health status, diagnosis at initial admission, and initial admission length of stay. The independent variables for modeling the probability of a PPR included initial hospitalization admission type, patient clinical risk group (CRG), mental health status, and diagnosis at initial admission.
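The sketch below illustrates this type of logistic regression under assumed data. The file name, column names, and use of the statsmodels library are hypothetical assumptions for illustration, not the demonstration's actual analysis code.

```python
# Illustrative sketch; the data file and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# One row per index discharge for an active patient of a participating site.
# readmit_30day: 1 if an all-cause readmission occurred within 30 days, else 0
# pdmr:          1 if the site performed post-discharge medication reconciliation
discharges = pd.read_csv("discharges.csv")  # hypothetical extract

model = smf.logit(
    "readmit_30day ~ pdmr + C(admission_type) + C(crg) + mental_health"
    " + C(primary_diagnosis) + length_of_stay",
    data=discharges,
).fit()

# Exponentiate coefficients to express results as odds ratios with 95% CIs.
summary = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
summary.columns = ["odds_ratio", "ci_lower", "ci_upper"]
print(summary.loc["pdmr"])
```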
Increases in resident continuity (1d) are described using hospital-reported data from Q4 2014 for two measures of resident continuity: 1) The number of resident visits with patients on their own panel and 2) the number of patient visits with their assigned resident primary care physician. These measures were not collected at baseline, as most sites developed or strengthened empanelment through participation in HMH. Q4 2014 results are compared to an assumed baseline of 0. Similarly, increases in the number of sites recognized as PCMHs (3d) are described using hospital-reported data from Q4 2014 on high-level (level 2 or 3 under the NCQA´s 2011 standards) PCMH achievement.
A correlation analysis was used to assess the relationship between resident continuity measures and five clinical performance metrics (1e) and to assess the relationship between follow-up visits and follow up calls of high-risk Medicaid patients within 48 hours of hospital discharge and high-risk Medicaid patient readmissions within 30 days of the initial discharge (1h). Pearson product-moment correlation coefficients were calculated for all correlations. The clinical performance metrics included in the resident continuity analysis were restricted to those reported by a substantial number of HMH sites. Clinical performance measures, rates of follow-up, and high risk Medicaid readmissions were collected from HMH sites from baseline through the end of the demonstration. The two resident continuity measures (the proportion of resident visits with patients on their own panel and the proportion of patient visits with their assigned resident primary care physician) were collected in Q3 2014 and Q4 2014.
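A minimal sketch of the Pearson correlation calculation is shown below; the site-level rates are hypothetical and only illustrate how correlation coefficients and p-values of the kind reported in table 1e could be produced.

```python
# Illustrative sketch; the site-level rates below are hypothetical.
from scipy.stats import pearsonr

# Paired site-level rates: resident visits with patients on their own panel,
# and the same sites' rates on a clinical performance metric.
continuity_rate = [0.48, 0.55, 0.61, 0.40, 0.72, 0.58]
lipids_controlled = [0.52, 0.60, 0.66, 0.45, 0.70, 0.63]

r, p = pearsonr(continuity_rate, lipids_controlled)
print(f"Pearson r = {r:.4f}, p = {p:.4f}")
```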
Collaborative Care Initiative (CCI) data was collected via the HMH portal on a quarterly basis. Data submitted in Q4 2013 through Q4 2014 was analyzed and aggregated to compare sites and examine overall trends. Data reported in the CCI focused predominantly on process measures designed to track and benchmark model implementation and fidelity. The analysis included: 1) the depression screening rate - the proportion of patients that were screened for depression using either the PHQ-2 or PHQ-9 standardized tool; 2) the screening yield - among those screened, the number that scored positive for depression on the screening tool; 3) the depression rate - of those screening positive, the percent diagnosed with depression; and 4) the enrollment rate - among those screening positive, the percent that were enrolled in the CCI (1i, 1j, 1k). One measure looked specifically at outcomes among those patients receiving collaborative care for at least 16 weeks (1l).
T-test analyses were used to determine the association between potentially preventable readmission rates and HMH participation (1g). Additionally, the changes in risk-adjusted PPR rates over time were assessed between HMH and non-HMH hospitals using linear regression: one model with HMH participation as the only predictive factor and one model controlling for rural/urban continuum code (1g). Potentially Preventable Readmissions (PPR) rates for 2011 through 2013 at hospitals in NYS (adjusted for patient age group, mental health status, severity of illness, and All Patient Refined Diagnosis Related Group) were available at the Health Data NY website. Hospital characteristics were extracted from the NYS Health Facilities Information System. Metropolitan and nonmetropolitan hospitals were identified using the National Center for Health Statistics classification scheme based on the United States Department of Agriculture rural/urban continuum codes. Outliers and influential observations were removed prior to analysis.
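The sketch below illustrates the two linear regression models described above, assuming a hypothetical hospital-level file with 2011 and 2013 PPR rates, an HMH participation indicator, and a rural/urban continuum code; the column names are assumptions for illustration only.

```python
# Illustrative sketch; the data file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

hospitals = pd.read_csv("ppr_rates.csv")  # one row per NYS hospital (hypothetical)
hospitals["ppr_change"] = hospitals["ppr_2013"] - hospitals["ppr_2011"]

# Model 1: HMH participation as the only predictor of the change in PPR rate.
model_hmh_only = smf.ols("ppr_change ~ hmh_participant", data=hospitals).fit()

# Model 2: also controlling for the USDA rural/urban continuum code.
model_adjusted = smf.ols(
    "ppr_change ~ hmh_participant + C(rural_urban_code)", data=hospitals
).fit()

print(model_hmh_only.summary())
print(model_adjusted.summary())
```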
In order to determine if HMH produced replicable residency program design (2a-2d) and if the demonstration helped facilities improve systemic quality performance under each initiative (3a-3c and 3e), chi-square tests were used. When expected values were less than five, a Fisher´s exact test was used. Sites were required to answer yes/no questions throughout the course of the demonstration within each of these research topics. The proportion of responses answered ´yes´ and proportion of responses answered ´no´ for each question at baseline were compared to the proportions of yes/no answers at the end of the demonstration. The analysis comparing the results of the 2013 and 2015 Resident PCMH Surveys (2d) was restricted to only residents who responded that their residency program participated in HMH. Furthermore, this chi-square analysis was stratified by residency program type (Family Medicine, Internal Medicine, Internal Medicine-Pediatrics, and Pediatrics).
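The sketch below illustrates the comparison of baseline and end-of-demonstration yes/no responses, with a fallback to Fisher's exact test when expected cell counts are below five; the response counts shown are hypothetical.

```python
# Illustrative sketch; the yes/no counts below are hypothetical.
from scipy.stats import chi2_contingency, fisher_exact

# 2x2 table: rows = baseline vs. end of demonstration,
# columns = number of sites answering 'yes' vs. 'no' to one question.
table = [[22, 38],   # baseline: 22 yes, 38 no
         [51, 9]]    # end of demonstration: 51 yes, 9 no

chi2, p, dof, expected = chi2_contingency(table)
if (expected < 5).any():
    # Use Fisher's exact test when any expected cell count is below 5.
    _, p = fisher_exact(table)
print(f"p = {p:.4f}")
```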
Tertile band movement (3f) is described using HMH Healthcare Associated Infections Data from 2011 and 2014. Standardized infection ratio (SIR) performance was used to place HMH and non-HMH hospitals into tertiles using 2011 data. A 'needed rate', indicating the rate the hospital would need in order to move into a higher tertile (or the highest quartile for hospitals that already ranked in the top tertile), was established and given to each hospital. 2014 SIR performance was analyzed to determine the number and proportion of hospitals meeting their 'needed rate.' While tertiles were established for a number of inpatient measures, post-HMH data are available for three (CLABSI SIR, NICU SIR, and Surgical Site Infection (SSI) SIR), and therefore only these three metrics can be assessed at this time. 2014 data was adjusted to account for changes in measure definitions. CLABSI SIR and NICU SIR rates were multiplied by 0.84 to account for a definition change that resulted in lower performance in 2014 (See Appendix IIIA1). SSI SIR excluded hysterectomies because this procedure was not included in the 2011 definition. Further details about pre-HMH metric adjustment are available in the appendix (See Appendix IIIA2).
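The sketch below illustrates the tertile banding, the 0.84 definition-change adjustment, and the 'needed rate' comparison using hypothetical SIR values; the actual cutoff rules used by the demonstration, including the quartile rule for top-tertile hospitals, are simplified here.

```python
# Illustrative sketch; hospital identifiers and SIR values are hypothetical.
import pandas as pd

hospitals = pd.DataFrame({
    "hospital": ["A", "B", "C", "D", "E", "F"],
    "clabsi_sir_2011": [0.55, 0.80, 1.10, 0.95, 1.40, 0.70],
    "clabsi_sir_2014": [0.60, 0.65, 0.90, 1.20, 1.00, 0.50],
})

# Band hospitals into tertiles on 2011 SIR performance (lower SIR is better).
hospitals["tertile_2011"] = pd.qcut(
    hospitals["clabsi_sir_2011"], 3, labels=["best", "middle", "worst"]
)

# Adjust 2014 SIRs for the CLABSI definition change (multiply by 0.84).
hospitals["clabsi_sir_2014_adj"] = hospitals["clabsi_sir_2014"] * 0.84

# Treat the 2011 tertile boundaries as each hospital's 'needed rate': the SIR
# it must reach in 2014 to move into a better band (hospitals already in the
# best band keep the best-band boundary here, for simplicity).
low_cut, high_cut = hospitals["clabsi_sir_2011"].quantile([1 / 3, 2 / 3]).tolist()
needed = {"worst": high_cut, "middle": low_cut, "best": low_cut}
hospitals["needed_rate"] = hospitals["tertile_2011"].astype(str).map(needed)

hospitals["met_needed_rate"] = (
    hospitals["clabsi_sir_2014_adj"] <= hospitals["needed_rate"]
)
print(hospitals[["hospital", "tertile_2011", "needed_rate", "met_needed_rate"]])
```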
Qualitative data was also evaluated through review of the application, work plan information submitted, and quarterly and annual narrative questions addressed in the portal. Each quarterly report contained a set of required narrative questions. Responses to those questions were evaluated based on the individual hospital's work plan and previous quarterly reports, including any required supplemental submissions of root cause analyses. The clinical review team reviewed information and met for consensus and review of emerging themes. Recommendations, comments, and sometimes root cause analysis requests were then made in a quarterly feedback letter sent to each project participant. For an example of a quarterly feedback letter see Appendix IIIB3. The NYS DOH also compiled and analyzed the narrative final reports from each hospital, which are posted on the HMH website at https://hospitalmedicalhome.ipro.org/workplans/index.
IV. Data Needs and Data Sources
HMH Portal
The analyses presented in this report are based on data submitted quarterly through the HMH web portal. Raw data were sent to the NYS DOH from IPRO, the developer of the web portal, quarterly as comma-separated value files. Data was analyzed by the NYS DOH to assess changes over time by site and at a demonstration level. Quantitative continuous data, nominal data (yes/no), and qualitative data was collected through the tool and analyzed by DOH analysts and reviewers. Metrics are shown at either the hospital or site level.
The following measures were required for all participating sites (project-specific care coordination and integration measures, and inpatient measures, were required only for hospitals/sites participating in that project). Some questions required yes/no responses while others required the submission of rates:
- PCMH Standards/Recognition
- Achieved NCQA PCMH recognition at the Level 2, 2011 standard for all participating sites?
- Achieved NCQA PCMH recognition at the Level 3, 2011 standard for all participating sites?
- Do you have a MU-Certified EHR?
- Are you connected to the RHIO at the outpatient resident continuity site?
- Are you connected to the RHIO at the hospital?
- Do you regularly upload data to the RHIO from the outpatient site?
- Do you regularly upload data to the RHIO from the hospital?
- Do your office processes for outpatient visits include accessing the RHIO for information?
- Do your processes for hospital admissions include accessing the RHIO for information?
- Increased the number of continuity training sites or expanded the current hospital-based sites beyond the hospital environment?
- Increased resident time in ambulatory settings?
- Resident Continuity Training Programs
- Restructured the resident training schedule to redistribute the time spent in an ambulatory setting?
- Assigned patient panels and/or resident/attending teams?
- Other methods?
- Have residents been assigned a panel of patients for whom they are responsible over an extended time period?
- Are patients assigned to a team?
- Patient Visits with Assigned Primary Care Provider
- Resident Visits with Own Patient Panel
- Improved Access and Coordination between Primary and Specialty Care
- Standardized referral process developed?
- Gaps in access and coordination Identified?
- System developed to ensure Complete, Accurate, and Timely Information from PCP to patient and specialist, and from specialist to PCP and patient?
- Patient Specialty Visit Care*
- Referrals & Inadequate Documentation*
- Referrals Made and Not Completed*
- Rejected Referrals*
- Specialty Care Wait Times*
- Integration of Physical-Behavioral Health Care
- Has a system been developed for the site to access and act on PSYCKES reports?
- Have all residents been trained in depression screening, appropriate treatment modalities, and referral?
- Have all residents been trained in Pain Management screening, appropriate treatment modalities, and referral?
- Demand and capacity for behavioral health services assessed?
- Is there an organizational plan for reviewing provider and program-level outcomes?
- Quality improvement plan utilizes provider and program-level outcomes data?
- Organization developed algorithm for patients not demonstrating improvement and process for treatment adjustment and psychiatric consultation?
- Have a process for facilitating and tracking referrals for specialty care?
- Created algorithm used by your organization for screening and diagnosing patients with behavioral health issues?
- Depression and Pain Management*
- Depression Screening*
- Enrolled Patients with Psychiatric Consult*
- Patients Enrolled in a Physical-Behavioral Health Program*
- PHQ-9 Decreases Below 10 in 16 Weeks or Greater*
- Wait Times for Behavioral Health Services*
- Controlled Substances
- Care Manager FTE
- Patients Diagnosed with Depression
- Care Transition/Medication Reconciliation
- Medication Reconciliation Registry Developed?
- Standardized Communication Protocols?
- Has a Care Transition Protocol been developed for the most common causes of avoidable readmission?
- A System for identifying high risk patients?
- A system for allocation of resources to the most high risk patients?
- Is there now an Integrated EHR Information Systems between Inpatient & Outpatient sites?
- Admission Medication Reconciliation Rate*
- Follow Up Call*
- Follow Up Visit*
- Medicaid Readmission*
- Reconciled Medication List Received by Discharged Patients*
- Timely Transmission of Transition Record*
- Transition Record with Specified Elements Received by Discharged Patients*
- Care Transition Measure (CTM-15 Survey)
- Enhanced Interpretation Services and Culturally Competent Care
- Gaps in access and coordination identified?
- Increased access provided to appropriate language services?
- Training programs developed to improve staff cultural competence and awareness?
- Developed capacity to generate prescription labels in patients´ primary language with easy to understand instructions?
- Cross Cultural Training*
- Demographic Data Recorded*
- Discharge Instructions in Language of Patient*
- Interpreter Wait Time*
- Prescriptions in Language of Patient*
* Indicates inclusion in composite score calculations.
A more detailed measure list that includes inpatient measures and definitions for numerators and denominators is available in the appendix (See Appendix IVA1).
Many sites were unfamiliar with standardized data reporting on an attributed population at the beginning of the demonstration, and early education activities required NYS DOH to focus on data collection and reporting methods. Because early data (project quarters 1 and 2) included a large amount of missing or inaccurate data, Quarter 3, 2013 (first year of demonstration) is considered ´baseline´ for most metrics. In general, analyses that use responses to ´yes/no´ questions use Quarter 2, 2013 data as baseline. Quarter 4, 2014 was used as the final measurement period when comparing rates over time, with the exception of yes/no inpatient metrics, which were only collected until Quarter 4, 2013. Measurement quarters used for data analysis are specified in each results table.
In collecting and reporting quantitative data, sites were instructed to include all patients with a visit to the outpatient clinic within the past two years in the measure denominators. Other methods of attribution may have been applied at the site level but were not specified by the demonstration. Clinical performance metrics were chosen by the site but were required to be consistent with standardized measures such as Quality Assurance Reporting Requirements (QARR)/Healthcare Effectiveness Data and Information Set (HEDIS)® or Meaningful Use (MU). NYS DOH conducted a review of all clinical performance metrics submitted by all participants. Sites that initially proposed measures that did not meet QARR/HEDIS or MU definitions were instructed to revise measure definitions to meet these standards. Sites were asked to report these data using a rolling year as the time frame under evaluation (a rolling year includes the reporting quarter and the preceding three quarters). Quantitative measures in other domains used measures developed by the demonstration, which were common across all sites. In reporting data, sites were instructed either to utilize electronic health record (EHR) data to report rates across their entire patient population or to report on a random sample of 30 patients when needed. For specific inpatient measures, such as CLABSI rates, alternative sampling was utilized, including presenting rates from a specific day of the week rather than a full quarter's data. For some metrics, the number of sites reporting at baseline was smaller or larger than the number of sites reporting in Quarter 4, 2014. This is due to site-level data collection issues, movement of sites from one project to another, or sites no longer participating in the demonstration.
Other Data Sources
Some analyses used data sources outside the portal to perform a more robust analysis. A medication reconciliation and readmission analysis utilized patient lists reported by each site on a quarterly basis, along with matched Medicaid claims and encounter data and SPARCS discharge data. Inpatient project analyses involved banding hospitals into tertiles based on performance in select measures. Additional data used for this evaluation included NYS Vital Statistics data (2009), DOH Healthcare Associated Infections (HAI) data (2011 and 2014), SPARCS data (2010 and 2011), and CMS Hospital Quality Initiative data (2010-2011). Because of the limited availability of post-HMH data (due to delayed reporting) from these sources, only the HAI data was utilized in the inpatient tertile band progress analysis presented in this report. An analysis of PPRs at HMH and non-HMH hospitals utilized a dataset of PPR rates for 2011-2013 at NYS hospitals from Health Data NY, hospital characteristics data from the NYS Health Facilities Information System, and a NYS DOH Medicaid Graduate Medical Education funding roster. Chi-square analyses on the Resident PCMH Survey used the raw data from 2013 and 2015 surveys created in conjunction with the Greater New York Hospital Association, which were administered through SurveyMonkey™.
Application: Each hospital submitted an application that described their hospital, residency programs, continuity clinics, current PCMH status, Medicaid volume, and proposed projects (See Appendix IVB1).
Work plan: Each hospital submitted a detailed work plan that included: narrative descriptions of their plans to obtain PCMH recognition; a budget; five selected clinical performance metrics, including a demographic description of each outpatient site substantiating the need for each metric chosen; plans for extending or enhancing their resident training programs; a care integration project, with answers to a set of required narrative questions covering each of the deliverables set out in the Standard Terms and Conditions for that project; and two inpatient Quality and Safety Improvement Projects, including their project plan and methods to include residents (See Appendix IVB2).
Formal PCMH Baseline Assessment: Each hospital submitted a formal assessment for each clinic of their baseline with regard to achieving the elements of NCQA PCMH Recognition including: Enhancing Access and Continuity; Identifying and Managing Patient Populations; Planning and Managing Care; Providing Self-Care Support and Community Resources; Tracking and Coordinating Care; Measuring and Improving Practice (See Appendix IVB3).
Quarterly Report Narrative Questions: Each quarter hospitals were required to submit narrative answers to a set of questions on each of five areas: PCMH and Health Information Technology; Resident Continuity Training Programs; Care Integration Projects; and each Inpatient Quality and Safety Project (See Appendix IVB4).
Annual Report Narrative Questions: A final report after the end of the first year contained an additional set of questions designed to gauge overall progress on the deliverables for each section of the project in conjunction with the quantitative data being submitted (See Appendix IVB5).
Final Report Project Summary: At the end of the project, each hospital was required to submit, in addition to their data on the measures, a Final Report that covered the following areas: final results with regard to changes in access, utilization, and quality of care; changes to residency programs; improvement in inpatient projects; challenges and limitations; lessons learned; and future plans including sustainability (See Appendix IVB6).
Site Visits: Department representatives completed site visits to more than half of the participating hospitals. At each site visit, programs gave presentations on their projects in each of the project areas which allowed the Department to evaluate progress, successes, challenges and opportunities for future support (See Appendix IVB7).
Other: Hospitals with best practices in a given area were invited to present their quality improvement projects on a coaching call for the other hospitals. For example, one hospital presented a comprehensive care transitions program that overcame many barriers and challenges. In addition, programs with innovative solutions to problems submitted documentation, slide presentations, videos, and other materials that were posted on the Hospital Medical Home website, including a team huddle video, an algorithm for assigning patient panels to residents, and a specialist-PCP communication satisfaction survey (See Appendix IVB8).
V. Data Analysis
The following tables present results related to the research questions stated in section II, and are followed by the results of Hospital Medical Home´s qualitative analyses.
1a. The table below compares clinical performance in Q3 2013 to Q4 2014. Significant improvement was seen in several measures of clinical performance, including Breast, Cervical, and Colorectal Cancer Screenings, Dilated Eye Exam for Diabetics, Nephropathy Testing for Diabetics, Tobacco Use Assessment, and Weight and Physical Activity Assessment for Children/Adolescents. A significantly lower rate was seen in Q4 2014, compared to baseline, for Follow Up After Hospitalization for Mental Illness within 30 Days.
1a. Clinical Performance from Q3 2013 to Q4 2014 | |||||
---|---|---|---|---|---|
Measure Name | Number of sites reporting at baseline and Q4 2014 | Baseline Rate | Q4 2014 Rate | p-value | Significant Increase (↑) Significant Decrease (↓) |
Adult BMI Assessment | 45 | 44% | 51% | 0.0546 | |
Antidepressant Medication Management | 12 | 72% | 77% | 0.4025 | |
Breast Cancer Screening | 28 | 47% | 60% | 0.0109 | ↑ |
Care Coordination | 17 | 79% | 78% | 0.1442 | |
Controlling High Blood Pressure | 64 | 64% | 68% | 0.0909 | |
Cervical Cancer Screening | 29 | 51% | 64% | 0.0012 | ↑ |
Child Immunization Status | 38 | 57% | 71% | 0.002 | ↑ |
Colorectal Cancer Screening | 51 | 48% | 59% | <0.0001 | ↑ |
Dilated Eye Exam for Diabetics | 44 | 31% | 42% | 0.0002 | ↑ |
Follow Up After Hospitalization for Mental Illness within 30 Days | 15 | 85% | 66% | 0.0001 | ↓ |
Hemoglobin Testing for Diabetics | 34 | 83% | 86% | 0.2012 | |
Lipid Profile for Diabetics | 10 | 63% | 68% | 0.5077 | |
Lead Screening in Children | 13 | 61% | 67% | 0.4269 | |
Nephropathy Testing for Diabetics | 12 | 68% | 82% | 0.0292 | ↑ |
Tobacco Use Assessment | 68 | 70% | 86% | <0.0001 | ↑ |
Weight and Physical Activity Assessment for Children/Adolescents | 50 | 58% | 86% | <0.0001 | ↑ |
Well Child Visits | 39 | 75% | 78% | 0.4562 | |
↑ or ↓ indicate a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change. |
1b.i Culturally Competent Care Project: The graph below shows changes over time for the Enhanced Interpretation Services composite and its six components. The composite score improved 27 percentage points from baseline to the end of the demonstration, a statistically significant change (p-value of 0.004). The following measures were the largest contributors driving this change: the average rate of staff from the outpatient site who completed a cultural competency training in the past 12 months, and the average rates of prescription labels and discharge summaries written in the preferred language of the patient for non-English speaking patients.
1b.ii The Care Transitions and Medication Reconciliation Project: As shown in the graph below, the composite score improved 16 percentage points from baseline to the end of the demonstration and was statistically significant (p-value of <0.0001). The following measures were large contributors to this change: the average rate of all patients from the outpatient site who received a specified transition record and review at the time of discharge and the average rate of all high risk Medicaid patients from the outpatient site discharged that had a follow up phone call within 48 hours of discharge.
1b.iii Improved Access and Coordination between Primary and Specialty Care Project: The composite score improved seven percentage points from baseline to the end of the demonstration and was statistically significant (p-value of 0.0006). Although most measures in this composite showed a gradual improvement over time, the following measure was the largest contributor driving this change: the average rate of all patients from the outpatient site being referred and seen within the timeframe requested by the primary care provider.
1b.iv Integration of Physical and Behavioral Healthcare Project: The composite score improved 33 percentage points from baseline to the end of the demonstration and was statistically significant (p-value of <0.0001). Although all measures showed a large improvement since baseline, the following measures were the largest contributors driving this change: the average rate of clinicians at the outpatient site who completed depression and pain management training, the average rate of depression screening of adult patients at the outpatient site, and the average rate of patients enrolled in the collaborative care initiative whose PHQ-9 decreased below 10 in 16 weeks.
1ci and 1cii. Post Discharge Medication Reconciliation: Crude and adjusted rates show that having a post discharge medication reconciliation (PDMR) at the outpatient site is associated with lower odds of 30-day readmission and potentially preventable readmission when compared to a control group that did not have PDMR.
1c.i Post Discharge Medication Reconciliation Impact on Readmission (Crude) | |||||
---|---|---|---|---|---|
 | Group | Time frame of study | Denominator | Readmissions | Readmission Rate |
30-day all cause readmissions | Intervention | Q3 2013 - Q2 2014 | 306 | 34 | 11.1% |
30-day all cause readmissions | Control | Q3 2013 - Q2 2014 | 12,592 | 2,687 | 21.3% |
Potentially preventable readmissions | Intervention | Q3 2013 - Q4 2013 | 288 | 20 | 6.9% |
Potentially preventable readmissions | Control | Q3 2013 - Q4 2013 | 7,606 | 948 | 12.5% |
1c.ii Post Discharge Medication Reconciliation Impact on Readmission (Adjusted/Regression Analysis) | |||
---|---|---|---|
 | Odds Ratio | 95% Confidence Interval | p-value |
All Cause 30 Day Readmission | 1.934 | 1.334, 2.803 | 0.0005 |
Potentially Preventable Readmission | 1.944 | 1.212, 3.118 | 0.0058 |
Odds ratios displayed compare the risk of readmission in the control group to the risk of readmission in the intervention group. |
1d. Resident Continuity: The table below shows that over half of resident visits are with patients on their own panel, and over half of patient visits are with their assigned resident primary care provider. Patient empanelment was a priority within the demonstration, and it is assumed that the rates of resident visits with patients on their own panel, and of patient visits with their assigned resident PCP, were very low at baseline.
1d. Final Reported Rates for Measures of Resident Continuity | |
---|---|
Measure | Rate as of Q4 2014 |
Resident Visits with Patients on their Panel | 55% |
Patient Visits with Assigned Resident PCP | 54% |
1e. Resident Continuity and Clinical Performance: The table below includes correlations between resident continuity and select measures of performance. The correlation coefficients between both measures of resident continuity and rates of controlled lipid levels in diabetics are at or near statistical significance (p<0.05). However, the sample sizes for these analyses were small. Levels approaching significance were seen for the correlation between rates of resident continuity and rates of blood pressure control in hypertensive patients. There was no evident correlation between resident continuity and the three cancer screening measures.
1e. Correlations Between Resident Continuity and Clinical Performance Measures

Each cell shows: Correlation Coefficient (r), Probability (p), Sample size (n).

| Measure | Q3 2014 Resident Visits with Own Panel | Q3 2014 Patient Visits with Assigned PCP | Q4 2014 Resident Visits with Own Panel | Q4 2014 Patient Visits with Assigned PCP |
|---|---|---|---|---|
| Lipids Controlled | 0.7104 (0.0483) 8 | 0.5529 (0.0777) 11 | 0.6484 (0.0589) 9 | 0.6268 (0.039) 11 |
| Controlling Blood Pressure | 0.2139 (0.149) 47 | 0.1653 (0.2669) 47 | 0.1224 (0.4232) 45 | 0.1675 (0.266) 46 |
| Breast Cancer Screening | -0.0229 (0.9136) 25 | -0.0232 (0.9067) 28 | 0.1952 (0.3498) 25 | 0.0381 (0.8502) 27 |
| Colorectal Cancer Screening | -0.1843 (0.2256) 45 | -0.0929 (0.5437) 45 | -0.1614 (0.2953) 44 | -0.0583 (0.7072) 44 |
| Cervical Cancer Screening | -0.149 (0.5081) 22 | -0.1498 (0.4747) 25 | 0.1377 (0.5516) 21 | 0.0941 (0.6695) 23 |

The correlation coefficient (r) is a measure of the direction and strength of a linear relationship between two variables. The range of r is (-1, 1), where -1 is a perfect negative relationship and 1 is a perfect positive relationship. A 0 indicates no relationship.
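As a rough sketch of how the site-level correlations in table 1e could be produced, the example below applies a Pearson correlation (the report describes r as a measure of linear relationship but does not name the estimator, so Pearson is an assumption). The two arrays are hypothetical illustration data, not HMH results.

```python
from scipy import stats

# Hypothetical site-level rates (illustration only, not HMH data)
resident_continuity = [0.48, 0.55, 0.61, 0.52, 0.70, 0.66, 0.58, 0.63]  # resident visits with own panel
lipids_controlled   = [0.41, 0.50, 0.57, 0.44, 0.68, 0.60, 0.49, 0.62]  # diabetic patients with controlled lipids

# Pearson correlation coefficient (r) and two-sided p-value
r, p = stats.pearsonr(resident_continuity, lipids_controlled)
print(f"r = {r:.4f}, p = {p:.4f}, n = {len(resident_continuity)}")
```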
1f. Improvement in Inpatient Care. The table below compares Q3 2013 and Q4 2014 rates for inpatient quality and safety measures. The rates for both Central Line Bundle Compliance and Venous Thromboembolism Discharge Instructions (VTE-5) improved significantly from Q3 2013 to Q4 2014.
1f. Improvement in Inpatient Measures from Q3 2013 to Q4 2014

| Measure Name | Baseline (Q3 2013) Rate (n) | Q4 2014 Rate (n) | p-value | |
|---|---|---|---|---|
| Central Line Bundle Compliance | 57% (45) | 91% (44) | <0.0001 | ↑ |
| National Healthcare Safety Network Central Line-Associated Bloodstream Infection Outcome Measure | <1% (43) | <1% (42) | 0.3217 | |
| Sepsis Mortality | 23% (32) | 25% (31) | 0.8186 | |
| Severe Sepsis and Septic Shock: Management Bundle | 65% (27) | 69% (26) | 0.5471 | |
| Venous Thromboembolism Prophylaxis: Surgical Complications Core Processes VTE-1 | 93% (21) | 94% (21) | 0.4566 | |
| Venous Thromboembolism Discharge Instructions - VTE-5 | 67% (21) | 95% (21) | 0.0046 | ↑ |

↑ or ↓ indicate a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
1g.i Potentially Preventable Readmissions: As shown in the table below, there was a statistically significant decrease in PPR rates between 2011 and 2013 in both HMH and non-HMH hospitals.
1g.i Comparison of PPR Rates from 2011 to 2013 in both HMH and non-HMH NYS Hospitals

| Sample Definition | 2011 PPR rate (PPR chains/100 at-risk admissions) | 2013 PPR rate (PPR chains/100 at-risk admissions) | Sample Size | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|
| HMH Hospitals | 6.706 | 6.176 | 236 | 0.0028 | ↓ |
| Non-HMH Hospitals | 5.632 | 5.192 | 412 | 0.0264 | ↓ |

↑ or ↓ indicate a statistically significant change from 2011 to 2013 given a p-value of <0.05, as well as the direction of the change.
1g.ii Potentially Preventable Readmissions: As shown in the table below, while HMH is not a statistically significant predictor of the changes in PPR rates from 2011 to 2013, the United States Department of Agriculture rural/urban continuum code is a significant predictive factor (p<0.05). Rural NYS hospitals were less likely than urban hospitals to have a decrease in PPR rates from 2011 to 2013.
1g.ii Linear Regression Analysis (2011-2013)

| | Model without rural/urban continuum code | Model with rural/urban continuum code |
|---|---|---|
| Constant | -0.62697 (p<0.0001) | -1.00438 (p<0.0001) |
| HMH Participation | 0.19714 (p=0.3935) | 0.25821 (p=0.1060) |
| Rural/urban Continuum Code | | 0.20469 (p=0.0004) |
| R-squared | -0.0017 | 0.0755 |
| Number of observations | 159 | 141 |
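A minimal sketch of the kind of model summarized in table 1g.ii is shown below, assuming hospital-level records with the 2011-to-2013 change in PPR rate as the outcome. The data values and variable names (ppr_change, hmh, rucc) are invented for illustration, and the report does not specify the software used for its regression.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical hospital-level records (illustration only, not HMH data):
# ppr_change = 2013 PPR rate minus 2011 PPR rate; hmh = 1 for HMH participants;
# rucc = USDA rural/urban continuum code (higher values are more rural)
df = pd.DataFrame({
    "ppr_change": [-0.8, -0.3, 0.1, -1.2, 0.4, -0.6, -0.2, 0.3, -0.9, 0.2],
    "hmh":        [1, 1, 0, 1, 0, 0, 1, 0, 1, 0],
    "rucc":       [1, 2, 6, 1, 7, 3, 2, 8, 1, 5],
})

# Model with the rural/urban continuum code, as in the second column of table 1g.ii
model = smf.ols("ppr_change ~ hmh + rucc", data=df).fit()
print(model.params)      # intercept and coefficients
print(model.pvalues)     # p-values for each term
```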
1g.iii Potentially Preventable Readmissions: The table below shows that there is no statistically significant difference in the changes in PPR rates over time in HMH hospitals as compared to non-HMH hospitals even after controlling for metropolitan/nonmetropolitan classification.
1g.iii Comparison of HMH and non-HMH Hospitals in Changes in PPR Rates from 2011 to 2013

| Sample Definition | Sample Size | Result | p-value |
|---|---|---|---|
| All NYS Hospitals | 156 | No statistically significant difference between HMH hospitals and non-HMH hospitals. | 0.5862 |
| Metropolitan Hospitals Only | 116 | No statistically significant difference between HMH hospitals and non-HMH hospitals. | 0.2132 |
1h. Follow-Up Visits and Readmission Rates: The correlation analysis between follow up visits within 48 hours of discharge and reported rates of Medicaid readmissions showed a significant association in both quarters. The correlation analysis between follow up calls within 48 hours of discharge and reported rates of Medicaid readmissions showed a significant association in the Q3 2014 reported rates, but not in the Q4 2014 reported rates.
1h. Correlations Between Reported Rates of Follow Up Visits Within 48 hours of Discharge and Reported Rates of All-Cause 30 Day Medicaid Readmissions and Reported Rates of Follow Up Calls Within 48 hours of Discharge and Reported Rates of All-Cause 30 Day Medicaid Readmissions

Each cell shows: Correlation Coefficient (r), Probability (p), Sample size (n).

| | Q3 2014 Follow Up Visits | Q3 2014 Follow Up Calls | Q4 2014 Follow Up Visits | Q4 2014 Follow Up Calls |
|---|---|---|---|---|
| Readmissions | -0.3187 (0.0072) 70 | -0.25300 (0.0346) 70 | -0.3214 (0.0075) 68 | 0.03192 (0.7946) 69 |

The correlation coefficient (r) is a measure of the direction and strength of a linear relationship between two variables. The range of r is (-1, 1) where -1 is a perfect negative relationship and 1 is a perfect positive relationship. A 0 indicates no relationship.
1i. Behavioral Health Process Measures: With the exception of screening yield, which was relatively stable over the demonstration, average rates at all sites (n=32) of all the main process measures - depression screening, diagnosis, and enrollment into Collaborative Care - showed improvement over the final five quarters of the demonstration.
1j. Behavioral Health Depression Screening: Increases in depression screening are one of the most tangible results of the CCI. Screening rates averaged across all sites increased from around 60% at the beginning of the demonstration to over 85%, achieving the project goal. Twenty-three of 32 sites met or exceeded the goal of screening 85% of all patients seen annually for depression using the PHQ-2/9. Sites that committed to near universal screening were able to do so.
1k. Behavioral Health Collaborative Care Enrollment: The expectation was that sites would enroll 50% of patients diagnosed with depression into Collaborative Care. Over the last five quarters of the demonstration, the number of sites meeting this goal ranged from 7 to 11 out of the 32 participating sites. Enrollment rates increased over the course of the project from 35% to 43%, suggesting an increase in capacity.
1l. Behavioral Health PHQ-9 Decreases: Among those in treatment for at least 16 weeks, an increase was observed in the proportion of patients with an improvement in the PHQ-9 measure from Q4 2013 (16%) to Q4 2014 (45%).
1i. Trends in CCI Rates Q4 2013-Q4 2014
2a, 2b and 2c. Residency Program Design: As shown in the table below, the proportion of sites reporting having restructured resident training schedules and the proportion reporting having increased resident time in ambulatory settings were significantly larger in Q4 2014 than at baseline. There was no significant difference in the proportion of sites reporting that residents have a panel of patients for whom they are responsible.
2a, 2b, 2c. Chi Square Analyses on Residency Program Redesign

| Measure | Baseline (Q2 2013) Sites Answering ´Yes´ (Percent) | Baseline (Q2 2013) Sites Answering ´No´ (Percent) | Q4 2014 Sites Answering ´Yes´ (Percent) | Q4 2014 Sites Answering ´No´ (Percent) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| Number of Sites Reporting Restructured Resident Training Schedules | 97 (62%) | 60 (38%) | 128 (82%) | 28 (18%) | <0.0001 | ↑ |
| Number of Sites Reporting Increased Resident Time in Ambulatory Settings | 70 (45%) | 86 (55%) | 106 (68%) | 50 (32%) | <0.0001 | ↑ |
| Number of Sites Reporting Residents Having a Panel of Patients for Whom they are Responsible | 134 (86%) | 22 (14%) | 142 (91%) | 14 (9%) | 0.1563 | |

↑ or ↓ indicate a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
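To show the type of test behind p-values like those in the table above, the sketch below runs a chi-square test on the counts for restructured resident training schedules. The report does not state its exact test settings, so scipy's default continuity correction may differ slightly from the method used in the evaluation.

```python
from scipy.stats import chi2_contingency

# Counts from table 2a-2c, "Restructured Resident Training Schedules":
# rows are time points, columns are Yes / No responses
counts = [[97, 60],    # Baseline (Q2 2013)
          [128, 28]]   # Q4 2014

chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.5f}")  # small p, consistent with the reported increase
```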
2d.i Pediatric Resident Survey: Based on resident surveys administered in 2013 and 2015, the proportion of HMH Pediatrics residents who responded affirmatively to having a residency program which allowed them to participate in PCMH activities at the clinical site increased from 2013 to 2015. The proportion of HMH Pediatrics residents who responded affirmatively to their program having PCMH concepts incorporated into their educational activities increased from 2013 to 2015.
2d.i Resident PCMH Survey Chi Square Analyses for HMH Pediatric Residents

| Measure | 2013 Residents Answering Positively (%) | 2013 Residents Answering Neutrally or Negatively (%) | 2015 Residents Answering Positively (%) | 2015 Residents Answering Neutrally or Negatively (%) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| My residency program has involved me in activities within the clinic site associated with being a PCMH. | 102 (77%) | 20 (23%) | 116 (91%) | 11 (9%) | 0.0021 | ↑ |
| PCMH concepts have been incorporated into educational activities within my residency program. | 97 (73%) | 35 (27%) | 115 (91%) | 12 (9%) | 0.0004 | ↑ |
| The method in which I am scheduled for clinic in my residency program (i.e. weekly sessions vs. block sessions) allows me to develop a continuous relationship with my patients. | 113 (86%) | 19 (14%) | 109 (86%) | 18 (14%) | 0.9999 | |
| I would like to work in a PCMH after graduation. | 64 (48%) | 68 (52%) | 59 (46%) | 68 (54%) | 0.8037 | |

↑ or ↓ indicate a statistically significant change from 2013 to 2015 given a p-value of <0.05, as well as the direction of the change.
2d.ii Family Medicine Resident Survey: The proportion of HMH Family Medicine residents who responded affirmatively to having a schedule that facilitates the development of provider-patient relationships decreased from 2013 to 2015, as did the proportion who responded affirmatively to wanting to work in a PCMH after graduation. In 2013, the most positive responses to these four questions came from the Family Medicine residency program; by 2015, the Family Medicine responses had become more similar to those from the other residency programs. There were no statistically significant differences between the 2013 and 2015 surveys for the Internal Medicine and Internal Medicine-Pediatrics residency programs.
2d.ii Resident PCMH Survey Chi Square Analyses for HMH Family Medicine Residents

| Measure | 2013 Residents Answering Positively (%) | 2013 Residents Answering Neutrally or Negatively (%) | 2015 Residents Answering Positively (%) | 2015 Residents Answering Neutrally or Negatively (%) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| My residency program has involved me in activities within the clinic site associated with being a PCMH. | 112 (87%) | 16 (13%) | 160 (86%) | 26 (14%) | 0.7053 | |
| PCMH concepts have been incorporated into educational activities within my residency program. | 109 (85%) | 19 (15%) | 168 (90%) | 18 (10%) | 0.1629 | |
| The method in which I am scheduled for clinic in my residency program (i.e. weekly sessions vs. block sessions) allows me to develop a continuous relationship with my patients. | 119 (93%) | 9 (7%) | 158 (85%) | 28 (15%) | 0.0303 | ↓ |
| I would like to work in a PCMH after graduation. | 89 (70%) | 39 (30%) | 102 (55%) | 84 (45%) | 0.0088 | ↓ |

↑ or ↓ indicate a statistically significant change from 2013 to 2015 given a p-value of <0.05, as well as the direction of the change.
3c.i Systemic Changes: Care Transition and Medication Reconciliation
Hospitals Participating in Project:
- Beth Israel Medical Center - Petrie Campus
- Bronx-Lebanon Hospital Center - Concourse Division
- Brooklyn Hospital Center - Downtown Campus
- Glen Cove Hospital
- Lutheran Medical Center
- Mercy Hospital
- Montefiore Medical Center - Henry and Lucy Moses Division
- Mount Sinai Hospital
- Mount Vernon Hospital
- New York Methodist Hospital
- New York Presbyterian Hospital - Columbia Presbyterian Center
- Niagara Falls Memorial Medical Center
- North Shore University Hospital
- Phelps Memorial Hospital Assn
- Rochester General Hospital
- Samaritan Medical Center
- Sisters of Charity Hospital
- Sound Shore Medical Center of Westchester
- South Nassau Communities Hospital
- St Joseph´s Hospital Health Center
- St Luke´s Roosevelt Hospital - Roosevelt Hospital Division
- Strong Memorial Hospital
- University Hospital
- Westchester Medical Center
- Winthrop-University Hospital
As shown in the table below, there was a significant change in the proportion of sites answering ´Yes´ to questions on implementing systemic changes between the beginning and the end of the demonstration for five of the six questions in the Care Transition and Medication Reconciliation project.
3c.i Number of sites reporting ´Yes´ to each Care Transition and Medication Reconciliation project question on implementation of systemic changes, at the beginning and the end of the demonstration

| Measure | Baseline (Q2 2013) Sites Answering ´Yes´ (Percent) | Baseline (Q2 2013) Sites Answering ´No´ (Percent) | Q4 2014 Sites Answering ´Yes´ (Percent) | Q4 2014 Sites Answering ´No´ (Percent) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| Medication Reconciliation Registry Developed? | 24 (28%) | 63 (72%) | 76 (100%) | 0 (0%) | <0.0001 | ↑ |
| Standardized Communication Protocols? | 66 (76%) | 21 (24%) | 76 (100%) | 0 (0%) | <0.0001 | ↑ |
| Has a Care Transition Protocol been developed for the most common causes of avoidable readmission? | 53 (61%) | 34 (39%) | 76 (100%) | 0 (0%) | <0.0001 | ↑ |
| A system for identifying high risk patients? | 68 (78%) | 19 (22%) | 76 (100%) | 0 (0%) | <0.0001 | ↑ |
| A system for allocation of resources to the most high risk patients? | 69 (79%) | 18 (21%) | 76 (100%) | 0 (0%) | <0.0001 | ↑ |
| Is there now an Integrated EHR Information System Between Inpatient and Outpatient sites? | 72 (83%) | 15 (17%) | 63 (83%) | 13 (17%) | 0.9817 | |

↑ or ↓ indicates a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
3c.ii Systemic Changes: Integration of Physical and Behavioral Healthcare:
Hospitals participating in project:
- New York City Health and Hospitals Corporation (HHC) (11 sites: Bellevue, Elmhurst, Coney Island, Harlem, Jacobi, Kings County, Lincoln, Metropolitan, North Central Bronx, Queens Hospital, Woodhull)
- Brookhaven Memorial Hospital Medical Center
- Maimonides
- SUNY Downstate
- New York Presbyterian Hospital-Columbia Presbyterian Center
- Mount Sinai Medical Center
- Montefiore Medical Center
- Highland Hospital
- Erie County Medical Center
As displayed in the table below, there was a significant change in the proportion of sites answering ´Yes´ to questions on implementing systemic changes between the beginning and the end of the demonstration for five of the eight questions in the Integration of Physical and Behavioral Healthcare project.
3c.ii Number of sites reporting ´Yes´ to each Integration of Physical and Behavioral Healthcare project question on implementation of systemic changes, at the beginning and the end of the demonstration

| Measure | Baseline (Q2 2013) Sites Answering ´Yes´ (Percent) | Baseline (Q2 2013) Sites Answering ´No´ (Percent) | Q4 2014 Sites Answering ´Yes´ (Percent) | Q4 2014 Sites Answering ´No´ (Percent) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| Has a system been developed for the site to access and act on PSYCKES reports? | 3 (6%) | 45 (94%) | 33 (100%) | 0 (0%) | <0.0001 | ↑ |
| Have all residents been trained in depression screening, appropriate treatment modalities, and referral? | 25 (52%) | 23 (48%) | 34 (100%) | 0 (0%) | <0.0001 | ↑ |
| Have all residents been trained in pain management screening, appropriate treatment modalities, and referral? | 17 (37%) | 29 (63%) | 33 (97%) | 1 (3%) | <0.0001 | ↑ |
| Demand and capacity for behavioral health services assessed? | 32 (65%) | 17 (35%) | 34 (100%) | 0 (0%) | 0.0001 | ↑ |
| Quality improvement plan utilizes provider and program-level outcomes data? | 44 (94%) | 3 (6%) | 34 (100%) | 0 (0%) | 0.2602 | |
| Organization developed algorithm for patients not demonstrating improvement and process for treatment adjustment and psychiatric consultation? | 41 (87%) | 6 (13%) | 34 (100%) | 0 (0%) | 0.0372 | ↑ |
| Have a process for facilitating and tracking referrals for specialty care? | 45 (96%) | 2 (4%) | 34 (100%) | 0 (0%) | 0.5068 | |
| Created algorithm used by your organization for screening and diagnosing patients with behavioral health issues? | 44 (94%) | 3 (6%) | 34 (100%) | 0 (0%) | 0.2602 | |

↑ or ↓ indicates a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
3c.iii Systemic Changes: Enhanced Interpretation Services for Culturally Competent Care:
Hospitals participating in project:
- Buffalo General Hospital
- St Joseph´s Medical Center (Yonkers)
- Peconic Bay Medical Center
- Wyckoff Heights Medical Center
- University Hospital SUNY Health Science Center
- Good Samaritan Hospital Medical Center
- University Hospital of Brooklyn
- Millard Fillmore Suburban Hospital
- New York Presbyterian Hospital - Columbia Presbyterian Center
- Women and Children´s Hospital of Buffalo
As displayed in the table below, there was a significant change in the proportion of sites answering ´Yes´ to questions on implementing systemic changes between the beginning and the end of the demonstration for half of the questions in the Enhanced Interpretation Services for Culturally Competent Care project.
3c.iii Number of sites reporting ´Yes´ to each Enhanced Interpretation Services for Culturally Competent Care project question on implementation of systemic changes, at the beginning and the end of the demonstration

| Measure | Baseline (Q2 2013) Sites Answering ´Yes´ (Percent) | Baseline (Q2 2013) Sites Answering ´No´ (Percent) | Q4 2014 Sites Answering ´Yes´ (Percent) | Q4 2014 Sites Answering ´No´ (Percent) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| Gaps in access and coordination identified? | 34 (92%) | 3 (8%) | 26 (100%) | 0 (0%) | 0.2611 | |
| Increased access provided to appropriate language services? | 33 (92%) | 3 (8%) | 26 (100%) | 0 (0%) | 0.2575 | |
| Training programs developed to improve staff cultural competence and awareness? | 18 (51%) | 17 (49%) | 26 (100%) | 0 (0%) | <0.0001 | ↑ |
| Developed capacity to generate prescription labels in patients´ primary language with easy to understand instructions? | 5 (14%) | 31 (86%) | 16 (62%) | 10 (38%) | <0.0001 | ↑ |

↑ or ↓ indicates a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
3c.iv Systemic Changes: Improved Access and Coordination Between Primary and Specialty Care
Hospitals participating in project:
- Albany Medical Center Hospital
- Bellevue Hospital Center
- Buffalo General Hospital
- City Hospital Center at Elmhurst
- Coney Island Hospital
- Ellis Hospital
- Erie County Medical Center
- Flushing Hospital Medical Center
- Harlem Hospital Center
- Interfaith Medical Center (Withdrew from project Oct 2015)
- Jacobi Medical Center
- Jamaica Hospital Medical Center
- Kings County Hospital Center
- Kingsbrook Jewish Medical Center
- Kingston Hospital
- Lincoln Medical & Mental Health Center
- Metropolitan Hospital Center
- Mount Sinai Hospital
- Nassau University Medical Center
- New York Hospital Medical Center of Queens
- North Central Bronx Hospital
- Queens Hospital Center
- Richmond University Medical Center
- St Barnabas Hospital
- Strong Memorial Hospital
- Unity Hospital
- University Hospital
- Women and Children´s Hospital of Buffalo
- Woodhull Medical & Mental Health Center
As displayed in the table below, there was a significant change in the proportion of sites answering ´Yes´ to questions on implementing systemic changes between the beginning and the end of the demonstration for all of the questions in the Improved Access and Coordination Between Primary and Specialty Care project.
3c.iv Number of sites reporting ´Yes´ to each Improved Access and Coordination Between Primary and Specialty Care project question on implementation of systemic changes, at the beginning and the end of the demonstration

| Measure | Baseline (Q2 2013) Sites Answering ´Yes´ (Percent) | Baseline (Q2 2013) Sites Answering ´No´ (Percent) | Q4 2014 Sites Answering ´Yes´ (Percent) | Q4 2014 Sites Answering ´No´ (Percent) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| Gaps in access and coordination identified? | 51 (85%) | 9 (15%) | 52 (100%) | 0 (0%) | 0.0034 | ↑ |
| System developed to ensure complete, accurate, and timely information from PCP to patient and specialist, and from specialist to PCP and patient? | 46 (77%) | 14 (23%) | 52 (100%) | 0 (0%) | 0.0002 | ↑ |
| Standardized referral process developed? | 49 (83%) | 10 (17%) | 52 (100%) | 0 (0%) | 0.0015 | ↑ |

↑ or ↓ indicates a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
3d. Patient Centered Medical Home Achievement: The table below shows that all 156 outpatient sites participating in HMH achieved PCMH recognition, 89% of them at the highest level (Level 3 under 2011 standards). For a list of all outpatient site PCMH achievements and recognition begin and end dates, please see Appendix VA1.
3d. Patient Centered Medical Home Achievement as of Q4 2014 (100% of sites)

| Standard Level | Number (Percent) Achieved by Level |
|---|---|
| 2011 Level 2 | 17 (11%) |
| 2011 Level 3 | 139 (89%) |
| Total | 156 (100%) |
3e. Infrastructure Building in the Inpatient Setting:
Severe Sepsis Detection and Management
Participating Hospitals
- Beth Israel Medical Center
- Brookhaven Memorial Hospital
- Ellis Hospital
- Glen Cove Hospital
- Highland Hospital
- Interfaith Medical Center (Withdrew from project Oct 2015)
- Lutheran Medical Center
- Mercy Hospital of Buffalo
- Montefiore Medical Center
- Mount Sinai Medical Center
- Nassau University Medical Center
- New York and Presbyterian Hospital - Columbia Presbyterian Center
- New York Methodist Hospital
- Peconic Bay Medical Center
- Rochester General Hospital
- Samaritan Medical Center
- Sisters of Charity Hospital
- South Nassau Communities Hospital
- St Barnabas Hospital
- St Joseph´s Hospital Health Center
- St. Luke´s-Roosevelt Hospital Center
- State University of New York Downstate Medical Center
- Stony Brook University Hospital
- Unity Hospital
- University Hospital SUNY Upstate
- Winthrop University Hospital
- Wyckoff Heights Medical Center
Central Line Associated Bloodstream Infection (CLABSI):
Participating Hospitals
- Albany Medical Center
- Bellevue Hospital Center
- Beth Israel Medical Center
- Bronx-Lebanon Hospital Center
- City Hospital Center at Elmhurst
- Coney Island Hospital
- Erie County Medical Center
- Flushing Hospital Medical Center
- Harlem Hospital Center
- Highland Hospital
- Interfaith Medical Center (Withdrew from project Oct 2015)
- Jacobi Medical Center
- Jamaica Hospital Medical Center
- Kaleida Health-Buffalo General Medical Center
- Kaleida Health-Millard Fillmore Suburban Hospital
- Kings County Hospital
- Kingsbrook Jewish Medical Center
- Lincoln Medical and Mental Health Center
- Lutheran Medical Center
- Maimonides Medical Center
- Mercy Hospital of Buffalo
- Metropolitan Hospital Center
- Mount Sinai Medical Center
- New York Hospital Medical Center of Queens
- Niagara Falls Memorial Hospital
- North Central Bronx Hospital
- North Shore University Hospital
- Queens Hospital Center
- Richmond University Medical Center
- Rochester General Hospital
- Sisters of Charity Hospital
- Sound Shore Medical Center
- St Joseph´s Hospital Health Center
- St. Joseph´s Medical Center
- St. Luke´s-Roosevelt Hospital Center
- Strong Memorial Hospital
- The Brooklyn Hospital Medical Center
- University Hospital SUNY Upstate
- Westchester County Medical Center
- Winthrop University Hospital
- Woodhull Medical and Mental Health Center
- Wyckoff Heights Medical Center
Surgical Care Improvement Project (SCIP)
Participating Hospitals
- Good Samaritan Hospital
- Kaleida Health-Buffalo General Medical Center
- Kaleida Health-Millard Fillmore Suburban Hospital
- Montefiore Medical Center
- Mount Vernon Hospital
- New York Hospital Medical Center of Queens
- Phelps Memorial Hospital
- Richmond University Medical Center
- Sound Shore Medical Center
- St Barnabas Hospital
- Stony Brook University Hospital
- Westchester County Medical Center
Venous Thromboembolism (VTE)
Participating Hospitals:
- Bronx-Lebanon Hospital
- Brookhaven Memorial Hospital
- Erie County Medical Center
- Flushing Hospital Medical Center
- Glen Cove Hospital
- Good Samaritan Hospital
- Jamaica Hospital Medical Center
- Kingsbrook Jewish Medical Center
- Kingston Hospital
- Maimonides Medical Center
- Mount Vernon Hospital
- New York and Presbyterian Hospital - Columbia Presbyterian Center
- Niagara Falls Memorial Medical Center
- North Shore University Hospital
- Peconic Bay Medical Center
- Phelps Memorial Hospital
- Samaritan Medical Center
- South Nassau Communities Hospital
- St. Joseph´s Medical Center
- The Brooklyn Hospital Center
- Unity Hospital
Neonatal Intensive Care Unit (NICU) Safety and Quality
Participating Hospitals
- Bellevue Hospital Center
- City Hospital Center at Elmhurst
- Harlem Hospital Center
- Jacobi Medical Center
- Kaleida Health Women and Children´s Hospital of Buffalo
- Kings County Hospital
- Metropolitan Hospital
- Nassau University Medical Center
- New York Methodist Hospital
- Queens Hospital Center
- State University of New York Downstate Medical Center
- Strong Memorial Hospital
Avoidable Preterm Births: Reducing Elective Delivery Prior to 39 Weeks Gestation
Participating Hospitals
- Albany Medical Center Hospital
- Coney Island Hospital
- Ellis Hospital
- Kaleida Health - Women & Children´s Hospital
- Kingston Hospital
- Lincoln Medical & Mental Health Center
- North Central Bronx Hospital
- Woodhull Medical & Mental Health Center
As seen in the table below, the majority of cell sizes were too small to test for a significant difference in proportions between baseline and Q4 2014; however, a significant increase was found in the proportion of sites reporting having met milestones for infrastructure building related to sepsis.
3e. Chi Square Analyses on Infrastructure Building by Inpatient Project

| Measure | Baseline (Q2 2013) Sites Answering ´Yes´ (Percent) | Baseline (Q2 2013) Sites Answering ´No´ (Percent) | Q4 2014 Sites Answering ´Yes´ (Percent) | Q4 2014 Sites Answering ´No´ (Percent) | p-value | Significant Increase (↑) / Decrease (↓) |
|---|---|---|---|---|---|---|
| Sepsis: Number of Sites Reaching Milestones Related to Infrastructure Building | 28 (51%) | 27 (49%) | 21 (78%) | 6 (22%) | 0.0197 | ↑ |
| CLABSI: Number of Sites Reaching Milestones Related to Infrastructure Building | 43 (81%) | 10 (19%) | 34 (81%) | 8 (19%) | 0.9823 | |
| SCIP: Number of Sites Reaching Milestones Related to Infrastructure Building | 12 (71%) | 5 (29%) | 8 (80%) | 2 (20%) | 0.6784 | |
| VTE: Number of Sites Reaching Milestones Related to Infrastructure Building | 38 (83%) | 8 (17%) | 17 (85%) | 3 (15%) | 0.9999 | |
| NICU Safety: Number of Sites Reaching Milestones Related to Infrastructure Building | 9 (82%) | 2 (18%) | 13 (100%) | 0 (0%) | 0.1993 | |
| Avoidable Preterm Births: Number of Sites Reaching Milestones Related to Infrastructure Building | 6 (50%) | 6 (50%) | 6 (75%) | 2 (25%) | 0.3729 | |

↑ or ↓ indicates a statistically significant change from baseline to Q4 2014 given a p-value of <0.05, as well as the direction of the change.
3f. Inpatient Performance Band Progress: Hospitals were tasked with moving up a performance band by the end of the demonstration, as compared to pre-HMH performance on select metrics. Fifty percent of sites met this goal for SSI SIR, 54% met this goal for CLABSI SIR, and 90% met this goal for NICU CLABSI SIR.
3f. Achievement of QSIP Performance Band Progress

| QSIP Project Area | Measure | Number of Sites Electing to Work on this Project | Number (Percent) of Sites Achieving the Rate Needed to Move Up a Performance Band |
|---|---|---|---|
| Surgical Complications Core Processes (SCIP) | Surgical Site Infection (SSI) Standardized Infection Ratio (SIR) | 12 | 6 (50%) |
| Central Line Associated Bloodstream Infection (CLABSI) Infection Prevention | CLABSI Standardized Infection Ratio (SIR) | 41 | 22 (54%) |
| Neonatal Intensive Care Unit (NICU) Quality and Safety | NICU CLABSI Standardized Infection Ratio (SIR) | 10 | 9 (90%) |
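The SIR measures in table 3f follow the standard NHSN definition: observed infections divided by the number of infections predicted from national baseline data for the same mix of device-days and locations. The toy sketch below only illustrates that ratio; the counts are invented, and the performance-band cutpoints used in the demonstration are not reproduced here.

```python
def standardized_infection_ratio(observed_infections, predicted_infections):
    """NHSN-style SIR: observed infections divided by predicted infections.

    Values below 1.0 mean fewer infections than predicted from baseline data.
    """
    return observed_infections / predicted_infections

# Hypothetical unit-level counts (illustration only)
print(standardized_infection_ratio(observed_infections=3, predicted_infections=5.2))  # ~0.58
```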
VI. Major Findings
Access to Health Care Services
All sites achieved PCMH recognition by the end of the demonstration, most (89%) achieving recognition at the highest level (level 3 under 2011 standards).
Each care coordination and integration project showed significant improvement over the course of the demonstration, with gains on nearly all of the questions about meeting milestones for the four care coordination and integration initiatives. One hundred percent (100%) of sites participating in the enhanced interpretation services for culturally competent care project reported having identified gaps in access and coordination and having increased access to appropriate language services by the end of the demonstration. Additionally, all sites participating in the improved access and coordination between primary and specialty care project reported having identified gaps in access and coordination, having developed systems to ensure timely and accurate exchange of information between PCPs and specialists, and having developed a standardized referral process by the end of the demonstration.
One view of access to care is through the use of specialty services, such as breast cancer screening and diabetes management, both of which increased during the demonstration. Resident schedule changes increased resident time in the ambulatory clinic, which in turn strengthened resident-patient empanelment. The total number of active patients increased, as did outpatient clinic visits, while patient wait times for visits and no-show rates decreased. Many clinics implemented same-day appointments and added hours to their schedules, and several sites reported that third next available appointment (TNAA) time frames for new patients decreased.
Pre-visit planning was implemented to obtain needed services such as labs, tests, and consultations prior to the visit. Preventive screenings increased, and patients not recently seen were contacted for preventive services. With the implementation of PHQ-2 and PHQ-9 behavioral health screenings, early intervention, treatment, or referral was available to patients onsite and expeditiously.
Some practices also implemented a functional patient portal, which allows patients to access their labs and other information in their medical chart and offers bi-directional communication between patients and the practice for tasks such as scheduling appointments and requesting referrals.
At some of the sites, hospitalists notify one of the attending physicians when a patient at the practice is discharged from the hospital. This patient is then contacted by the outpatient clinic so that a hospital follow-up appointment can be scheduled.
Patients with specific language needs are supported through language interpretation services.
Utilization of Health Care Services
The average rate of high-risk Medicaid patients with a follow up call within 48 hours of discharge grew from 47% to 73% by the end of the demonstration, and the follow up visit rate grew from 19% to 32%, indicating an area where ongoing improvement is needed. A correlation analysis showed that having a follow up call or visit after a hospital discharge was weakly, but significantly, associated with reduced all-cause 30 day readmission rates in HMH hospitals in quarter 3, 2014 (this result was replicated in quarter 4, 2014 for the follow up visit metric only).
During the demonstration time frame, PPR rates decreased both for hospitals participating in the demonstration and for those that were not, though HMH participation was not associated with a greater decrease in PPRs than that seen in non-HMH hospitals.
An analysis of the relationship between medication reconciliation following a hospital discharge and readmission rates showed that the odds of having an all cause 30 day readmission were greater when a post discharge medication reconciliation was not conducted (odds ratio: 1.934, p=0.0005). Similar results were found for potentially preventable readmissions (odds ratio: 1.944, p=0.0058). These analyses used limited time frames and will be expanded when more recent Medicaid claims/encounter and SPARCS data are available. Additionally, the studies did not include some individual patient factors that could influence results, such as the number of medications prescribed.
Hospitals reported increased identification of high risk patients, enhanced resources dedicated to their care management, and reductions in potentially preventable admissions, readmissions, and emergency department visits. The integrated care model may also have given practitioners the tools and skills to assess patients needing immediate behavioral health care and thereby avoid some ED visits. To reduce unnecessary ED use and hospital admissions, an adult primary care practice collaborated with an Emergency Department to ensure that discharged patients were contacted the next day for follow-up.
Many clinics partnered with the inpatient unit staff to improve the scheduling of discharged patients for more immediate appointments, without the use of overbooked appointments.
To engage patients in their care and ensure necessary appointments are kept, care managers conducted follow-up phone calls with no-show patients to find out why they did not keep their appointments. Referrals to care coordinators were made for patients presenting with a lack of resources or with concerns that could prompt a readmission. New on-call and after-hours phone coverage decreased ED utilization.
The pre-visit planning model and the use of HEALTHeLINK, the Western New York regional HIE, were implemented. This ensured that all documentation was available at the time of the visit, decreasing duplicate testing and unnecessary health care expense.
Quality of Care
Significant improvements were seen in approximately half (8 of 17) of the clinical performance metrics studied. One metric, follow up after hospitalization for mental illness within 30 days, showed a significant decrease from baseline to quarter 4, 2014. However, the final rate was close to usual state averages for this measure and may therefore reflect more accurate reporting over time.
Overall, sites made improvements in all measures of inpatient quality and utilization, though the degree of improvement in this area was not as large as in other project domains.
Significant improvements were seen in rates of central line bundle compliance and venous thromboembolism discharge instructions between baseline and quarter 4, 2014 reporting. Improvements were also made in CLABSI outcomes, sepsis management and surgical complication core processes, though results were not statistically significant.
At least half of the hospitals participating in the SCIP, CLABSI and NICU CLABSI inpatient projects improved their SIR performance by the end of the demonstration enough to achieve placement in a higher tertile performance band (90% of NICU CLABSI participating hospitals met this achievement). Performance band progress was not calculated for other measures due to the unavailability of post-HMH data at this time.
Medicaid has a large number of members with co-existing physical and mental health/substance abuse co-morbidities. Optimal care requires integration of services and providers so that care is coordinated and appropriate for the well-being of the entire person, not just for a single condition. There are many barriers between behavioral and physical health care including different providers, varying locations, multiple agencies, confidentiality rules and regulations, historic lack of communication between providers, and more. The integration of Physical-Behavioral Health Care required training programs to find ways to integrate care for their patients with behavioral health conditions within the medical home.
Sites that chose to work on the Integration of Behavioral Health into Primary Care project and focused on depression care for adults were required to work with the Office of Mental Health to implement the Collaborative Care model. The Collaborative Care model is an effective, empirically supported, measurement-based approach to depression care within primary care outpatient practices that requires depression care managers as an integral component of care.
In addition, sites working on the Integration of Behavioral Health into Primary Care project were required to meet the following deliverables:
- A strategy for integration which includes a means of improving referrals to behavioral health providers, enhanced communication with mental health/substance abuse providers, processes for obtaining appropriate consents for sharing personal health information, and procedures for coordinated case management.
- Development of a linkage to the Office of Mental Health Psychiatric Services and Clinical Knowledge Enhancement System (PSYCKES) project, which uses Medicaid databases to provide data and recommendations on potential problems of polypharmacy and metabolic syndrome complications for Medicaid members. The linkage required creating systems to receive, and act on, reports generated by PSYCKES, and it had to be completed by the end of Year 1.
- Development of training for primary care clinicians in behavioral health care with particular focus on integrating depression screening and pain management with appropriate treatment modalities and referral.
- Assessment of demand and capacity to provide co-located services or other approaches to decrease wait times and improve access to behavioral health services.
A strategy for integration was addressed through the requirement for and integration of the case manager for care coordination. Participating sites employed an average of 1.2 full time case managers by the end of the demonstration. Multiple site assessments were required and conducted: 1) access to Psychiatric Services and Clinical Knowledge Enhancement System (PSYCKES) reports; 2) universal resident training in depression screening with appropriate treatment modalities and referral parameters; 3) behavioral health resource demands; and 4) tracking of referral outcomes. To address the issue of polypharmacy, an assessment of site level access to PSYCKES was explicitly included in the reporting tool. Sites were also required to measure access to the NYS DOH Controlled Substance Registry (I-STOP), with a goal of achieving 100% compliance. The tool explicitly asked what system had been developed for residents to access and act on I-STOP before writing prescriptions for controlled medications, and a metric monitored the percentage of I-STOP referrals when prescribing controlled substances. All sites were connected to PSYCKES at the end of year one of this project, and the rate of using I-STOP prior to prescribing controlled substances had risen to an average of 68.49% across the 33 participating sites.
To address the training component, there was an explicit requirement that residents be trained in "depression screening, appropriate treatment modalities, and referral." Metrics in the tool monitored resident training in depression and pain management screening and treatment, showing that 97% of residents in outpatient sites had received training by the end of the demonstration. Further, there were metrics to monitor use of the PHQ-9 and others to monitor depression treatment in eligible patients. The average rate of patients enrolled in treatment for at least 16 weeks whose PHQ-9 score decreased reached 41% by the end of the demonstration.
There was also an explicit requirement that the facility assess the demand and capacity for behavioral health services. Metrics in the tool for wait times and access to appointments with a mental health provider showed average rates rising from 51% at baseline to 83% by the end of the demonstration. Metrics in the tool also monitored enrollment in treatment programs, which reached 43% of those who screened positive for depression. Narrative entries allowed sites to describe how they were working to improve wait times and access to appointments.
Changes to Residency Programs
A significant proportion of sites restructured their residency training schedules and increased resident time in ambulatory settings. For example, nearly all pediatric residents surveyed reported having been involved in PCMH activities and having been trained in PCMH concepts by the end of the demonstration. Not all findings from resident surveys were equivalent, however: pediatric residents´ experiences showed greater improvement from 2013 to 2015 than those of family medicine residents.
Patient empanelment was undertaken as part of site participation in HMH. At the beginning of the demonstration, many sites reported incomplete patient empanelment and lacked processes to achieve that goal. By the end of the demonstration, more than 50% of resident visits were with patients on the resident´s own panel, and sites had the means to continue to identify and improve on this metric. A correlation was found between this measure of resident continuity and increased lipid control in the final two quarters of the demonstration, though small sample sizes limit interpretation and the effect was not seen uniformly across other quality measures.
As documented in the final reports, outpatient clinics made notable changes to residency programs. The number of ambulatory sites was increased, resident schedules were restructured, more time was spent in health centers, and resident empanelment was formalized and improved. Many programs reported developing their own PCMH curricula. Team based care was emphasized through pre-visit planning and team ´huddles´ prior to ambulatory sessions. Residents became significantly involved in the planning, data collection, and QI activities for inpatient and outpatient improvement projects. Residents were provided with patient level reports of their performance on key indicators and had access to the extended care team to assist with follow-up care and outreach to patients. Procedures for transitioning patients from one resident or resident/attending team to another upon graduation were also implemented.
VII. Challenges
New York State Department of Health
At the beginning of the project, Hurricane Sandy caused damage and disrupted operations for many of the participating hospitals, which delayed work plan development and pushed back timelines for achievement of PCMH recognition. Hospitals also had to respond to multiple concurrent initiatives from federal, state, and private payers and regulators, which required strategic planning and layering of activities to align goals and resources as much as possible. In addition, the number of sites and residency programs meant hospitals had to develop communication systems between them that did not necessarily exist prior to the program.
Although the majority of hospitals and clinics were exchanging information successfully with their Regional Health Information Organization (RHIO) by the end of the project, there were challenges connecting with one particular region of the state associated with the Taconic Health Information Network and Community (THINC) RHIO and with one large hospital system. NYS DOH Office of Health Information Technology and Hospital Medical Home program staff provided ongoing assistance and consultation with challenges.
Hospitals and residency programs were not always prepared to communicate and collaborate to the degree needed for all components of the demonstration.
Clinics and individual providers were initially not familiar with use of standardized quality measures and population (panel) management. Hospitals needed extensive guidance and clarification regarding tracking performance on measures.
There is a limited workforce capacity with respect to well-trained behavioral health care managers. Some sites had unexpected turnover at the Depression Care Manager position and had difficulty finding qualified replacements.
Much of the Collaborative Care model (e.g. a patient care registry, Depression Care Manager and psychiatrist time) is not typically supported by public or private payers.
Participating Hospitals
Overarching Challenges:
- Competing demands on attending and resident trainee time
- Physical space constraints for additional team personnel and meetings
- Split leadership between academic partner and healthcare facility in designing resident schedules.
- Integrating a new member of the team - a care manager - and learning to regularly refer patients to the care manager.
- Mergers and acquisitions of facilities.
Residency Related Challenges:
- Faculty buy-in
- Infrastructure changes required for:
- Assigning patient panels to residents
- Developing and implementing training specific to this project
- Scheduling for increased access and increased continuity
Data Collection Challenges:
- Data processes were manual in many cases (lacking full functionality of EMR) and required significant staff time for training, collection, review, analysis and reporting
- Data collection time lags, quarterly data not available readily at many sites
Patient Related Factors
- Lack of Internet access for patient population (in one area, 1 in 5 households do not have internet access)
- High post inpatient discharge no-show rates
- Low health literacy
Non Resident Staffing Challenges
- Staff turnover and recruitment
- Sustainability of some programs due to staffing constraints
- Changes in roles leading to union issues
- The demand for the depression collaborative services could exceed the capacity of the care manager
IT challenges
- Need for custom templates
- Lack of interoperability between inpatient and outpatient EMRs and with specialists
- Patient Portal challenges
- Other EMR software limitations
PCMH Recognition challenges
- Collection of clinical metrics
- Documentation issues including:
- Referrals
- Patient education
- Community resource referral
- Lab tracking
VIII. Lessons Learned
New York State Department of Health
Overall
Hospitals, residencies, and outpatient clinics need training and support in the use of standardized measures to track and report care
Coaching calls and collaboration between programs was beneficial both for sharing expertise and for maintaining enthusiasm and overcoming "change fatigue" among participants
Residency Programs
Importance of empanelment, of attribution methodology for assigning patients, and of processes for transferring patients to new residents as third years graduate
Critical to include residents in designing new processes and in seeking their feedback for successful buy-in.
Residents should be trained in population health and dashboards.
Residents should be involved in Quality Improvement projects and have input on design, data collection, analysis, and Quality Improvement activities.
Care Transitions
Importance of creating small and rapid Plan-Do-Study-Act (PDSA) cycles, followed by scaling up to risk stratify all patients
It was critical to have a communication strategy for immediate notifications to the PCP of hospitalizations, ER visits, and discharges
Specialty Access
Importance of satisfaction surveys for specialists
Specialists should work jointly with primary care to develop referral guidelines and co- management agreements
Specialist residents should be involved in transformation activities
Clinical Performance Metrics
Clinical members of the team require instruction in the rationale and methodology of clinical performance metrics, including the definitions of numerators, denominators, risk adjustment, performance benchmarks versus clinical guidelines, attribution methodology, and data collection, in order to promote buy-in, choose meaningful and achievable goals, and participate in PDSA cycles.
Integration of Physical-Behavioral Health Care
As national standards for quality behavioral health care develop, based on the grant experience, it appears that near universal annual screening for depression can be done relatively easily once clinics commit to this goal. However, screening is necessary, but not sufficient for implementation of truly integrated Collaborative Care. "Integration is not a natural state." Because Collaborative Care is a fundamental departure from usual care, it requires practitioners to orient to the model and learn new roles - an often underappreciated aspect of implementing Collaborative Care. Primary care providers, Depression Care Managers (DCM), and caseload consulting psychiatrists all need proper orientation and training to the integrated model and workflows must support integrated care.
Inpatient Projects:
General:
Alignment between CMS and other regulators and payers is important with regard to measures, goals, and strategies.
Quality improvement teams must cut across silos to include the full hospital community in designing strategies for change and evaluating results.
Small tests of change and rapid cycle plan-do-study-act cycles can lead to sustainable success even while larger projects are not yet off the ground.
It is important to have a commitment from the institution´s executive leadership to succeed.
Large scale change, including transformation to a patient-centered approach to care, may take much longer than initially thought.
Improvements, to be sustainable, must be developed with consideration of the resources and infrastructure needed to support the changes.
Multiple practice sites with different management structures, information systems, processes, and cultures added to complexity of implementation.
Clear performance expectations, accountability for desired outcomes, and frequent auditing are needed to promote transformation.
Leadership
Changes in leadership require re-orientation and re-training.
Information Technology /EHR
Interoperable IT systems are needed for reporting and information sharing across inpatient and outpatient settings.
Restructuring of the EMR is a powerful mechanism for performance improvement.
IX. Limitations
Project Design: The Hospital Medical Home Demonstration was not designed to track overall avoidable readmissions or other cost or utilization metrics across participating hospitals. Given multiple other concurrent initiatives, it is not possible to isolate the effect of HMH on these variables.
Patient Related Limitations:
Academic hospitals typically treat patients with significant socioeconomic challenges, including many uninsured patients, leading to interruptions in care and/or needs that were outside the scope of this project.
Resource Related Limitations:
Data tracking; workforce needs such as care managers, outreach workers, and patient navigators; and Regional Health Information Organization and EHR restructuring all required extensive resources.
Data Limitations:
Nearly all data used in the quantitative analyses presented in this report were self-reported by hospitals. Although hospitals were instructed to submit clinical performance data in accordance with QARR/HEDIS or MU specifications, these measures were designed for data collection from health plans, and some alterations were needed to make reporting appropriate at the site level. All metric definitions and data submissions were reviewed quarterly, but further auditing, including chart review, was not conducted. Rates were compared against state-wide QARR averages, and hospitals were notified when their rates were significantly different and asked to provide explanations and/or action plans. The medication reconciliation and readmission analyses used a control group of patients who did not appear on any post-discharge medication reconciliation (PDMR) lists, indicating a PDMR was not conducted at an HMH-participating outpatient site. However, it is possible that patients in the control group received a PDMR from a non-participating site; therefore, misclassification bias is possible in the control group. If so, the odds ratios presented in this report may be understated.
Resident surveys had a relatively low response rate (approximately 20%), and responses differed by program type (pediatrics, family medicine). Findings from these surveys may not be generalizable.
Correlation analyses were conducted where sufficient data were available, but the number of observations differed across analyses. The number of observations included in each analysis is presented within the Data Analysis section. It should be noted that analyses with fewer observations may not reliably establish linearity even when statistical significance was found.
The unavailability of rates for the external measures used to place hospitals into tertiles for the inpatient analyses precluded a complete final analysis in this area. Additionally, measure definitions changed over the course of the demonstration (for example, SSI SIR now excludes hysterectomies from its calculation). To address changing definitions, new methods of risk adjustment were used in the final evaluation.
X. Policy Recommendations
The Hospital Medical Home Pilot has demonstrated the feasibility and value of transforming residency training clinics into patient-centered medical homes; of increasing resident continuity and exposure to the outpatient setting, which also increases access and continuity for patients; and of focusing on care integration in the areas of transitions of care, integration of behavioral health into primary care, specialty access, and cultural competence. Additionally, HMH has shown that residents can and should be involved in quality and safety in the inpatient setting, where they spend much of their training and may be working after graduation. Based on these findings, the Department proposes consideration of the following policy recommendations:
For Ambulatory Clinics Serving Medicaid Members:
- Outpatient clinics serving Medicaid members should be structured as advanced primary care models consistent with patient centered medical home principles.
- Outpatient clinics should provide care management for high risk patients and patients with chronic disease.
- Outpatient clinics should coordinate care across inpatient and outpatient settings and with specialty care.
- Outpatient clinics should be required and/or incentivized to track and report population health and quality metrics that are tied to payment.
- Outpatient clinics should be required to routinely exchange information with their Regional Health Information Organization or Health Information Exchange.
- Behavioral health should be integrated into all primary care settings through Collaborative Care or other evidence-based programs.
For ACGME - Residency Programs
- Residency programs training primary care residents should be encouraged or required to provide training in outpatient clinics that have been transformed into an advanced primary care model.
- Residency programs and hospitals should be required to provide training in interdisciplinary teams, including care managers, to prepare residents for team- based care.
- Primary care and specialty residency training programs should be required to jointly develop referral guidelines, communication systems, and co-management agreements to better coordinate care.
- Residency programs training primary care residents should incorporate explicit patient empanelment as a core element of primary care training.
- Residency programs should train and involve residents in the coordination of care between inpatient and outpatient settings.
For CMS - Hospitals and Residency Programs
- Hospitals should be required to report adherence to sepsis protocols.
- CMS should fund additional incentives to encourage medical students to choose primary care.
- CMS should encourage other states to use the waiver process to reform primary care outpatient training sites.
- Hospitals should be required or incentivized to develop policies to include residents in quality improvement committees, reviews, and projects.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to coordinate care between primary and specialty care.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to formalize interdisciplinary and interdepartmental teams across these settings to better integrate care given by residents.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to include care managers in these teams.
- Hospitals, residency programs, and their outpatient clinics should be required or incentivized to develop methods to follow their patients across transitions of care to prevent unnecessary readmissions and patients lost to follow-up.