Am Fam Physician. 2007;76(10):1459-1462
Author disclosure: Nothing to disclose.
Testing for human immunodeficiency virus (HIV) became available in 1985 with the development of an enzyme-linked immunosorbent assay. The first test was actually developed to protect the blood supply, not to identify persons who were already infected. The more specific western blot test followed in 1987, and two-step testing became the national standard. However, testing for a fatal disease for which there were no effective treatments was not well accepted. Moreover, learning that one was HIV-positive often led to ostracism by family and friends and to loss of employment, housing, and health insurance. Acquired immunodeficiency syndrome (AIDS) was said to be part of the civil rights movement of the 1980s and 1990s.1 Because of the stigma associated with AIDS, legislation was enacted to protect persons with the disease. Written informed consent for testing was required, as was pre- and post-test counseling.
HIV testing guidelines were first issued by the Centers for Disease Control and Prevention (CDC) in 1987.2 These recommendations focused on targeting high-risk groups for screening: injection drug users, men who have sex with men, and persons with multiple sex partners. However, this approach was not effective. When the CDC guidelines were updated in 2002, physicians were encouraged to focus on persons with high-risk behaviors, as well as on clinical indicators such as recurrent pneumonia, oral candidiasis, herpes zoster, or unintentional weight loss.3 Routine HIV testing in settings with a disease prevalence of at least 1 percent was also recommended. This approach proved inadequate for several reasons. Patients are often unwilling to disclose high-risk behaviors, and physicians may not be comfortable asking about them. Many physicians are unaware of presentations that suggest HIV infection as the cause of, or a cofactor in, disease. A significant number of patients who are tested at community sites do not return for their results. Consequently, illness remains the most common reason for HIV testing in the United States.4
High-quality studies have shown that HIV screening is cost-effective in light of the overall HIV prevalence of about 0.3 percent in the United States.5,6 A recent study found that routine one-time HIV screening has a cost-effectiveness ratio of $30,800 per quality-adjusted life-year (QALY) gained when the disease prevalence is 1 percent, and about $50,000 per QALY gained when the prevalence is 0.12 percent.7 These ratios are similar to those for breast and colon cancer screening. In addition, effective treatments are available for HIV infection. As of September 2007, more than 20 drugs have been approved by the U.S. Food and Drug Administration to treat HIV, and at least three others are likely to be approved by the end of 2007. In general, the earlier patients are treated, the better their clinical status and life expectancy.
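For context, the figures cited above are incremental cost-effectiveness ratios of the standard form shown below (a general sketch only; the cited studies' actual models also account for test characteristics, linkage to care, and transmission effects):

$$\text{Cost-effectiveness ratio} = \frac{\text{Cost}_{\text{screening}} - \text{Cost}_{\text{no screening}}}{\text{QALYs}_{\text{screening}} - \text{QALYs}_{\text{no screening}}}$$

A lower ratio means that each additional quality-adjusted life-year is gained at lower cost, which is why these values compare favorably with those of established cancer screening programs.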
It is estimated that 25 percent of persons infected with HIV in the United States are unaware of their serostatus; these persons are thought to account for more than 50 percent of all new HIV infections.8 Identifying these individuals and implementing risk-reduction counseling could reduce the number of new HIV infections by one half. New guidelines from the CDC recommend HIV screening for all persons 13 to 64 years of age in all health care settings.9 The guidelines are a call for family physicians on the front lines of disease prevention to begin implementing routine HIV screening in their practices. The CDC recommends streamlining the screening process by eliminating mandatory pre- and post-test counseling and written informed consent. For family physicians who do not think they have the time, skills, or knowledge to offer routine HIV screening, I recommend making use of the testing resources available in your community. By normalizing HIV testing, we can diminish the stigma and discrimination associated with testing that have persisted for 25 years. If we fail to test patients routinely, we will continue to see 40,000 new infections per year, something our health care system cannot afford.