Tracking poverty is critical for the United Nations, which launched a global poverty campaign last year. But gathering the data on the ground can be dangerous, slow, and expensive. Now, a study using satellite images and machine learning reveals an alternative: mapping poverty from space.
The first attempts to do that relied on images of the planet at night. The glow of electric lights paints a glittering map of a region's infrastructure, showing roughly where the rich and poor live. But at night, moderate economic underdevelopment doesn’t look much different from absolute poverty, defined by the World Bank as life on less than $1.90 per day.
So a team of social and computer scientists led by Marshall Burke, an economist at Stanford University in Palo Alto, California, has been sifting through daytime images. The researchers train a computer on a subset of those data to create a statistical model that accurately predicts the hidden variable, local poverty, in the rest of the data.
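The train-on-a-subset, predict-on-the-rest idea can be sketched in a few lines. This is an illustrative toy only, not the team's pipeline: their model extracts rich features from raw daytime imagery, whereas here synthetic "image features" and a simple least-squares fit stand in, with the surveyed villages supplying the training labels.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical feature vectors summarizing daytime satellite images
# (e.g., roofing material, road density): 200 villages, 5 features each.
X = rng.normal(size=(200, 5))
true_w = np.array([2.0, -1.0, 0.5, 0.0, 1.5])

# The hidden variable: per-capita consumption, observed only where
# ground surveys exist (simulated here as a noisy linear signal).
y = X @ true_w + rng.normal(scale=0.1, size=200)

# Fit a statistical model on the surveyed subset (first 150 villages)...
X_train, y_train = X[:150], y[:150]
w, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# ...then predict the hidden variable for the unsurveyed remainder.
y_pred = X[150:] @ w
rmse = float(np.sqrt(np.mean((y_pred - y[150:]) ** 2)))
print(f"held-out RMSE: {rmse:.3f}")
```

The held-out error is what the survey data are used to check: if predictions on villages the model never saw track the surveys well, the model can be trusted in places no survey has reached.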
Burke's team focused on five African countries: Nigeria, Tanzania, Uganda, Malawi, and Rwanda. These countries have both large proportions of their populations living in absolute poverty and good survey data to ground-truth any predictions made by the computer.
As the team reports, daytime satellite images are dramatically better than nighttime images for mapping African poverty. Compared with the nighttime images, the daytime images were 81% more accurate at predicting poverty in places below the absolute poverty line and 99% more accurate in areas where incomes are less than half that.
Ground-based surveys will still be needed to build and validate this tool, says Marc Levy, a political scientist at The Earth Institute at Columbia University in Palisades, New York, who was not involved in the research. But the study shows that satellites plus surveys are “vastly more powerful than either one alone,” he says, especially in regions where ground-based surveys are difficult or impossible.