The number of children who get measles vaccinations in Rwanda has been steadily climbing since 2001. But in June of last year, the United Nations reported a sudden drop in coverage from 97 percent to 80 percent.
When Rwandan Minister of Health Agnes Binagwaho saw the statistics show up online, she was certain it couldn’t be true. She posted an indignant Tweet calling out both UNICEF and the World Health Organization: “Why is #Rwanda’s DTP3 vaccine coverage shown as 80% on your sites?”
The vaccination reports, it turns out, said little about the health of Rwanda’s children. Rather, they illustrated the challenges of measuring child health in the developing world.
Because the vast majority of poor countries don't formally track births and deaths, making statistical guesses about child health is complex. As the Millennium Development Goals have run their course over the last 13 years, questions about methodology have come up frequently. Critics have accused the United Nations of cooking the books to paint a rosier picture of progress against maladies that commonly kill children, such as malnutrition and malaria. Sometimes, even the most basic estimates, like the child mortality rate, vary drastically between agencies.
“Most people in Africa and Asia are born and die without leaving a trace in any legal record or official statistics,” wrote a team of World Health Organization researchers in a report calling for the development of civil registration systems in poor countries. Not only does this “scandal of invisibility” render most of the world’s poor unseen and uncountable, but it also means the world has “little authoritative evidence” of how efforts to improve child health are working.
A record of life and death
The falsely reported drop in Rwanda’s vaccination rates was a result of outdated birth rate estimates, according to a study published in The Lancet this summer. In order to track the burden of disease or the adoption of health interventions like vaccines, statisticians must first have a baseline population estimate. Without hard records, statisticians rely on data from censuses or household surveys to make educated guesses about the state of child health.
While statistics are recalculated and reported to the public each year, surveys, which take months to complete and cost millions of dollars, are conducted in each country only about once every five years. The gaps in data are filled in using statistical modeling.
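The modeling the agencies use is far richer than this, but the basic idea of filling the years between surveys can be sketched with simple linear interpolation between two survey points. The figures below are invented for illustration; they are not any country's actual data or any agency's actual method.

```python
def interpolate(year, survey_points):
    """Linearly interpolate an indicator between two survey years.

    survey_points: sorted list of (year, value) pairs from actual surveys.
    Toy illustration only -- real agencies fit far richer statistical models.
    """
    for (y0, v0), (y1, v1) in zip(survey_points, survey_points[1:]):
        if y0 <= year <= y1:
            frac = (year - y0) / (y1 - y0)
            return v0 + frac * (v1 - v0)
    raise ValueError("year outside the surveyed range")

# Hypothetical under-5 mortality figures from two surveys five years apart
surveys = [(2005, 86.0), (2010, 54.0)]
print(interpolate(2007, surveys))  # a modeled, not measured, figure
```

Every value reported for a year with no survey is, in this sense, an output of a model rather than a measurement.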
The accuracy of the health estimates varies from country to country based on the amount of data that has been collected. It is difficult for researchers to work in high-conflict areas. In the Democratic Republic of the Congo, for example, where war has raged for the better part of two decades, only five surveys have been conducted since 1970. Its child mortality estimate is therefore less certain than, say, Bangladesh's, where researchers have published seven surveys since 2000.
“The estimates are pretty close to reality for the bulk of the countries, but there are a few countries where there is lots of uncertainty,” said Trevor Croft, technical director for the USAID-backed data collection project, Monitoring and Evaluation to Assess and Use Results Demographic and Health Surveys (MEASURE DHS). “We have some idea what’s going on, but I wouldn’t say we have any real strong belief.”
In Rwanda, things changed faster than the data could be collected. Between 2005 and 2010, the use of contraception in Rwanda rocketed from 10 percent to almost 50 percent, said Cameron Nutt, a researcher at the Dartmouth Center for Healthcare Delivery Science and a research fellow to Binagwaho. As a result, the average number of children per woman dropped from 6.1 to 4.6. The most recent census, which was from 2002, did not reflect the changed birthrate.
“When we don’t factor in the increased utilization of contraception, the estimated number of children born per year is far too high,” he said.
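The arithmetic behind the error is straightforward: coverage is the number of children vaccinated divided by the estimated number of births, so an inflated denominator drags the apparent rate down. The counts below are invented to illustrate the mechanism; they are not Rwanda's actual figures.

```python
# Illustrative numbers only -- not Rwanda's actual counts.
vaccinated = 97_000      # children actually vaccinated
true_births = 100_000    # actual annual births
stale_births = 121_000   # estimate based on the outdated, higher fertility rate

true_coverage = vaccinated / true_births       # 97 percent
apparent_coverage = vaccinated / stale_births  # about 80 percent

print(f"true coverage:     {true_coverage:.0%}")
print(f"apparent coverage: {apparent_coverage:.0%}")
```

Nothing about the vaccination program changed; only the denominator did.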
That one mistake skewed the perception of not just Rwanda’s vaccination rates, but also its child mortality rates.
In 2012 reports published just two months apart, three United Nations agencies painted three very different pictures. According to the United Nations Population Fund, children under the age of five were dying at a rate of 112 per 1,000 live births. The World Health Organization reported the rate at 91 per 1,000 and the UN’s Inter-agency Group for Child Mortality Estimation put it at 54 per 1,000.
Between the highest and lowest estimations, that’s a 107 percent difference.
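That figure comes from comparing the highest estimate against the lowest:

```python
highest = 112  # UN Population Fund, deaths per 1,000 live births
lowest = 54    # UN Inter-agency Group for Child Mortality Estimation

difference = (highest - lowest) / lowest * 100
print(f"{difference:.0f} percent")  # about 107 percent
```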
“It’s a little tricky to assess child mortality in a rigorous way because all the different agencies produce different estimates,” Nutt said. “It’s difficult to figure out which is closest to the truth on the ground.”
10,000 personal interviews
A number of organizations, including several different arms of the United Nations, calculate child health indicators, but most use the same raw numbers. Two agencies — UNICEF and MEASURE DHS — conduct the household surveys that produce about 80-90 percent of the data on poor countries.
In August, Croft’s MEASURE DHS team wrapped up a survey in Liberia that required 16 teams of interviewers to visit 10,000 randomly selected homes in more than 320 randomly selected locations. The interviewers sat down with each participant to ask about 700 questions.
The surveys are exhaustive. “Essentially, we ask about their whole life,” Croft said. Children are weighed and measured. Parents report the number of children who have been born and died within the last five years. How long had babies been breastfed? Had any children recently suffered from fever, cough or diarrhea? What vaccinations had they been given?
The surveys are thorough, but they aren't always accurate. For example, as University of Ottawa professor Amir Attaran points out in a paper published in PLOS Medicine, because a doctor was not present to record the measurement, the answer to a simple question like, “How much did your baby weigh at birth?” is only a rough approximation.
“Other survey questions are so technical that no layperson can answer them accurately,” wrote Attaran, an expert on population health and a vocal critic of what he calls the “immeasurability” of the Millennium Development Goals. “MICS, for example, asks parents if their child’s anti-malaria bed net was ‘ever treated with a product to kill mosquitos’: an accurate answer depends on the type, dose, and date of insecticide treatment, and whether the local mosquito species carry insecticide resistance genes.”
Of all the child health measures, Attaran said the mortality rate is the most reliable because, “If a child dies, you don’t forget that.” But Croft said MEASURE DHS’s surveyors still run into challenges. In some cultures, for instance, if a baby doesn’t survive long enough to undergo a naming ceremony, people do not consider the child to have lived at all, he said. Naming ceremonies might happen anywhere from three to thirty days after birth.
“People don’t particularly like to talk about children who have died,” he said. “It’s not unusual for them to be reticent to bring them up, which might lead to some underestimation.”
Things get particularly fuzzy when it comes to figuring out cause of death.
In many parts of the world, children die without ever having seen a doctor, so the go-to method for determining cause of death is a verbal autopsy, which may take place years later, said Katrina Ortblad, a researcher at the Institute for Health Metrics and Evaluation (IHME) who focuses on measuring the global burden of disease.
To conduct a verbal autopsy, the surveyor asks parents detailed questions about the circumstances surrounding the child’s death. A fever might indicate malaria. A cough might mean pneumonia. Then the statistician applies an algorithm to find the probable cause of death.
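In spirit, such an algorithm scores each candidate cause against the reported symptoms and picks the best match. The sketch below is a bare-bones illustration with invented symptom weights; real verbal-autopsy methods, such as IHME's Tariff method, are calibrated against thousands of medically certified deaths.

```python
# Invented symptom weights for illustration -- not a real diagnostic tool.
WEIGHTS = {
    "malaria":   {"fever": 3, "convulsions": 2},
    "pneumonia": {"cough": 3, "fast_breathing": 3, "fever": 1},
    "diarrheal": {"diarrhea": 4, "sunken_eyes": 2},
}

def probable_cause(reported_symptoms):
    """Score each candidate cause by the symptoms the parent reported."""
    scores = {
        cause: sum(w for s, w in symptom_weights.items() if s in reported_symptoms)
        for cause, symptom_weights in WEIGHTS.items()
    }
    return max(scores, key=scores.get)

print(probable_cause({"fever", "convulsions"}))     # malaria
print(probable_cause({"cough", "fast_breathing"}))  # pneumonia
```

Because overlapping symptoms like fever point to several causes at once, the output is a probable cause, not a diagnosis.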
“What you see in the raw data isn’t always a reflection of what’s actually happening,” Ortblad said. “There’s a lot of guesswork.”
Even when there is a doctor present at the time of death to give a diagnosis, “there’s a lot of bias in reporting,” she said. For example, HIV deaths frequently get misclassified because of cultural stigma.
“A lot of times doctors will report cause of death as something people couldn’t possibly have died from,” Ortblad said. “Like a headache.”
A shifting picture
Rwanda’s Ministry of Health views the vaccine and death rate debacle as simple errors. But it’s not the first time statistics have misrepresented progress toward the Millennium Development Goals. Some of these shifting numbers, critics argue, may not be quite so innocent.
Each time a report is published, statisticians rework the estimates for the entire time period being evaluated — not just the most recent year. A report released in 2013, therefore, may not be comparable to a report released in 2012.
In a particularly dramatic example, the UN malnutrition estimates from 2011 and 2012 completely flipped the story of world hunger in one year. The 2011 statistics showed the number of malnourished people had climbed from 833 million in 1990 to approximately 850 million in 2010. When the UN announced new hunger estimates in October of 2012, though, graphs showed hunger declining from 1 billion in 1990 to 850 million in 2010.
Croft attributes such shifts to improved data and statistical models. But Attaran calls it “cooking the books.”
“It’s clear that the UN has failed to measure the MDGs from their inception,” Attaran said. “It’s also becoming increasingly apparent as the MDGs draw to an end that there’s an exercise of going back, recalculating data in order to fudge an illusion of progress.”
Regardless of intentions, there’s no simple answer to the question, “Will the world meet the Millennium Development Goal for reducing child mortality by 2015?”
It depends whom you ask.
According to IHME in Seattle, Washington, only 13 of the 75 countries with the worst child mortality rates will achieve a two-thirds reduction by the deadline. The Countdown to 2015 initiative, an accountability project backed by UNICEF and WHO, on the other hand, reported that 23 of those same countries are on track to meet the goal.
“Part of the difficulty in assessing progress is agreeing on criteria by which to judge country results,” researchers with the independent Expert Review Group wrote in an assessment of international child health initiatives. “Different methods yield different mortality estimates, and different criteria for deciding whether a country is on track (or not) produce different judgments.”
The diverging numbers, the report concluded, “are confusing for national and global accountability.” If the international community can’t agree on precise child health figures, stakeholders should at least, the authors pleaded, come to a conclusion on the broad progress toward internationally agreed upon goals.
Much of the friction over discrepancies in data can be traced back to a widespread perception that measuring progress is a top-down exercise, in which international agencies select goals and thrust them upon countries. Some in the global health community, including Rwanda’s Ministry of Health, argue that international data-modeling agencies don’t have close enough contact with local stakeholders.
“You really have to involve the countries more in the planning and monitoring process,” Nutt said.
To really solve the problem, Nutt said, the international community needs to invest in helping governments build systems to track vital statistics. But while improving the statistics could well speed along efforts to save children’s lives, it’s not always an easy cause to sell.
“There’s not a lot of money in global health for statistical monitoring systems,” Nutt said. “It is hard to find funders for such a long-term, infrastructure-driven government service. It’s not as exciting as donating money for, say, vaccines, where you see the benefit of your efforts with each child vaccinated.”