Antibody tests for SARS-CoV-2 are hard to interpret. Many health experts agree that the tests, which search a blood sample for signs of past infection, are key to reopening the economy, calculating the true death rate of Covid-19, and estimating how close we may be to “herd immunity.”
But the results can be misleading, even when the test performs as advertised (which is often not the case). The trouble is, when the prevalence of an infection in a population is low, the total number of people who receive false positives can match or even exceed the number receiving true positives.
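The arithmetic behind that claim can be sketched in a few lines. This is a minimal illustration with hypothetical round numbers, not figures for any real test: a test with 93% sensitivity and 95% specificity, applied to a population where 4% were truly infected.

```python
# Why low prevalence undermines a positive antibody result.
# Hypothetical numbers: 93% sensitivity, 95% specificity, 4% prevalence.

def predictive_values(prevalence, sensitivity, specificity, population=100_000):
    """Return (true positives, false positives, positive predictive value)."""
    infected = population * prevalence
    uninfected = population - infected
    true_pos = infected * sensitivity            # infected people who test positive
    false_pos = uninfected * (1 - specificity)   # uninfected people who test positive
    ppv = true_pos / (true_pos + false_pos)      # chance a positive result is real
    return true_pos, false_pos, ppv

tp, fp, ppv = predictive_values(prevalence=0.04, sensitivity=0.93, specificity=0.95)
print(f"true positives: {tp:.0f}, false positives: {fp:.0f}, PPV: {ppv:.0%}")
```

With these assumed numbers, the 96,000 uninfected people generate 4,800 false positives, outnumbering the 3,720 true positives, so a positive result is more likely wrong than right.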
The true prevalence of infections has a huge impact on these predictive values. See for yourself: Try running the simulation with different prevalence rates, but without changing the test's sensitivity (the share of infected people it correctly flags) or specificity (the share of uninfected people it correctly clears).
To start, here are some of the prevalence estimates to emerge from early US antibody surveys, or serology surveys: 2.8% to 5.6% in Los Angeles; 2.49% to 4.16% in Santa Clara; 6% in Miami; 20% in New York City. Or try the WHO’s global estimate, 2% to 3%.
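The prevalence sweep the simulation invites can also be sketched directly. The sketch below assumes a hypothetical test at 90% sensitivity and 95% specificity and runs it across a few of the prevalence estimates quoted above:

```python
# How the same test's positive predictive value shifts with prevalence.
# Assumed test: 90% sensitivity, 95% specificity (hypothetical).

def ppv(prevalence, sensitivity=0.90, specificity=0.95):
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# Prevalence estimates from the surveys cited above (WHO low end,
# Los Angeles low end, Miami, New York City).
for p in (0.02, 0.028, 0.06, 0.20):
    print(f"prevalence {p:.1%}: a positive result is right {ppv(p):.0%} of the time")
```

Under these assumptions, a positive result means very different things in different places: roughly a one-in-four chance of being real at 2% prevalence, but better than four-in-five at New York City's estimated 20%.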
You can also try tweaking the sensitivity and specificity; we’ve provided some examples from a couple of prominent tests currently in use. Among the dozens of tests in development or use, sensitivities range from 87% to 93% and specificities range from 95% to 100%, according to the Johns Hopkins Bloomberg School of Public Health.
Update: As of May 4, the FDA will only issue emergency use authorizations to tests that have at least 90% sensitivity and 95% specificity.
The bigger the infected population, the higher the positive predictive value of an antibody test will be. Right now, overall prevalence of Covid-19 infections is pretty low, which makes the tests less useful. When looking at large populations, epidemiologists can use statistics to help account for this discrepancy, and can also use survey results to identify infection hotspots and ask comparative questions (e.g., how much bigger is the outbreak in New York than in California?).
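The article doesn't name the statistical correction, but one standard tool for this job is the Rogan-Gladen estimator, which backs out an estimate of true prevalence from a survey's raw positive rate once the test's sensitivity and specificity are known. A sketch, with hypothetical survey numbers:

```python
# Rogan-Gladen estimator: correct a survey's raw positive rate for
# the test's known error rates to estimate true prevalence.

def rogan_gladen(raw_positive_rate, sensitivity, specificity):
    """Estimate true prevalence from an imperfect test's positive rate."""
    est = (raw_positive_rate + specificity - 1) / (sensitivity + specificity - 1)
    return max(0.0, min(1.0, est))  # clamp to a valid proportion

# Hypothetical survey: 6% of samples test positive on a 90%/95% test.
print(f"estimated true prevalence: {rogan_gladen(0.06, 0.90, 0.95):.1%}")
```

With these assumed numbers, a raw 6% positive rate corrects down to about 1.2% true prevalence, because a 95%-specific test produces false positives in 5% of everyone tested. This is the kind of adjustment that works at the population level but offers no help to an individual reading a single result.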
But for an individual looking at their test results, wanting to know if that weird cold last month was Covid-19, these tests are still not very helpful. Here’s how Michael Osterholm, director of the Center for Infectious Disease Research and Policy at the University of Minnesota, put it: “If you’re a nurse, a physician, a first responder, and I told you there was a one in two chance that your [test] is really positive, would you trust that?”
Successive antibody surveys will gradually paint a more reliable picture of our predicament. But it’s likely too soon to rely on an antibody test result as the basis for any personal health decision.