To our knowledge, this is the first study to quantify how extreme anticoagulation intensity contributes to important outcomes. We found that critically high anticoagulation intensity contributed significant numbers of hemorrhages to the population, explaining 25.6% and 1.9%, respectively, of all serious events in the anticoagulated and entire elderly populations. This means that eradicating INRs of > 3 would avoid one of every four serious anticoagulation-associated hemorrhages. We also found that critically low anticoagulation intensity was responsible for 11.1% and 1.1%, respectively, of all serious thromboembolic events in the anticoagulated and entire elderly populations. This means that eradicating subtherapeutic INRs would avoid one of every ten anticoagulation-associated thromboemboli. Increased patient education and the use of technologies that have been shown to significantly improve anticoagulation control should therefore yield important benefits to the health of the population. Although achieving therapeutic INRs 100% of the time is impossible, our results quantify the maximal benefit of improving anticoagulation control in a population.
We believe that our estimates for the population burden of extreme anticoagulation intensity are valid and meaningful. First, we used a population-based study to estimate each component of PAR, resulting in PAR estimates that are precise and not biased by sampling. Second, PAR estimates can be biased by risk persistence and confounding. Risk persistence refers to an increased event probability that remains even after a risk factor has been removed; this does not occur with extreme anticoagulation intensity. Moreover, the association between extreme INRs and events is likely causal and not completely explained by confounders.
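The PAR calculations discussed above can be illustrated with Levin's classic attributable-risk formula. The sketch below is a minimal illustration of that formula only; the prevalence and relative-risk inputs are hypothetical and are not figures from this study.

```python
def population_attributable_risk(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: the fraction of all events in the population
    attributable to the exposure.

    prevalence     -- proportion of person-time exposed to the risk factor
    relative_risk  -- event rate in exposed time relative to unexposed time
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)


# Hypothetical example: an exposure present in 10% of person-time
# with a relative risk of 4.0.
print(round(population_attributable_risk(0.10, 4.0), 3))  # prints 0.231
```

A larger denominator population with the same exposed subgroup dilutes the exposure prevalence, which is why the same extreme-INR exposure yields a much smaller PAR in the entire elderly population than in the anticoagulated subgroup.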
We were surprised by the increased risk of thromboembolic events at high INR levels (Table 1). Increased thromboembolic risk at high INR levels has been seen in several other studies. The cause of this observation is unclear. For our study, in which outcomes were counted using codes in hospitalization databases, we wonder whether patients with very high INRs were admitted to the hospital as a precautionary measure and were coded for the thromboembolic event that required anticoagulation therapy. Comorbidities might also explain this observation. For example, renal failure is associated with both an increased risk of high INR levels and an increased risk of thrombotic events. Finally, two patient groups with a higher risk of thromboembolic events, namely, those using an older generation of prosthetic valves and patients who had previously experienced thromboembolic events, commonly have a higher target INR. Such patients could increase the thromboembolic rate at high INR levels.
Another notable finding was the similarity in hemorrhage rates recorded outside of monitoring periods between patients who had received anticoagulation therapy and the control patients (Table 1). This observation suggests that unmonitored OAC time could largely be attributable to patients who had temporarily discontinued OAC use.
Our study has several limitations that must be considered when interpreting its results. Our study included only elderly people. Since administrative databases were used to count events, some misclassification likely occurred from miscoding. Some of the outcomes used in the study, such as stroke, had relatively poor coding accuracy (Appendix B). We did not capture events that proved lethal prior to hospitalization. We missed some OAC exposure time if patients had received prescriptions > 100 days apart and had prolonged time with an INR of < 1.5. We missed some OAC monitoring time if patients had no INR monitoring, were monitored at laboratories outside of the study area, or were self-monitoring. We did not measure the prevalence of comorbidities that could influence the hemorrhagic or thromboembolic risk, including thrombophilias and neoplasia. Our PAR estimates represent the benefit that could be derived from perfect anticoagulation control, which is unlikely to be achieved with OACs.
Our study shows that extreme anticoagulation intensity significantly impacted the health of the population. Improving anticoagulation control should significantly decrease the incidence of serious hemorrhagic and thromboembolic events. We believe that these results justify further implementation of interventions that have been shown to improve anticoagulation control as well as the continued search for other interventions that do the same.