On Sept. 12, the U.S. Census Bureau released national poverty data for 2017.
The headline was that 39.7 million people were poor in 2017. This works out to 12.3 percent of the population or one in eight Americans. The good news is that the U.S. poverty rate has fallen since 2010, when it hit 15.1 percent, and is now where it was before the Great Recession.
The bad news is that poverty still exceeds the 11.3 percent rate of 2000, and far too many people are poor in a country that is so rich. Another bit of bad news is that things look even worse if we use what many scholars, myself included, believe is a better poverty measure.
Who is poor?
In 2017, women had higher poverty rates than men and minorities had higher poverty rates than non-Hispanic whites, mainly because women earn less than men and minorities receive lower wages on average than whites. For similar reasons, adults with lower education levels are more likely to be poor.
What’s more, having an additional adult able to earn money gives married-couple families much lower poverty rates than households headed by a single woman.
Poverty also varies by age. For those 65 and over, the poverty rate fell from the 1960s until the 1990s, mainly due to more generous Social Security benefits. Since then, it has remained at around 10 percent. The poverty rate for prime-age adults fell until around 1980. After 1980, it fluctuated around 10 percent, rising during recessions and falling during economic expansions.
Child poverty, however, has been relatively high in the U.S. since the late 1970s; it now stands at 17.5 percent. For children in a female-headed household, the poverty rate is near 50 percent.
Problems with measuring poverty
These data all come from a survey of American households and rely on a methodology developed in the early 1960s by Mollie Orshansky of the Social Security Administration.
Taking Agriculture Department data on minimum food requirements, Orshansky calculated the annual cost of a subsistence food budget for families of different sizes and types. Household budget studies from the 1950s showed that families spent one-third of their income on food. So, Orshansky multiplied the cost of a minimum food budget for each family type by three to arrive at their poverty threshold. Thresholds rise annually based on inflation over the past year.
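Orshansky's arithmetic can be sketched in a few lines. The dollar figures below are illustrative assumptions, not official Census Bureau values; only the food-budget-times-three rule and the annual inflation adjustment come from her method.

```python
def orshansky_threshold(annual_food_cost, food_share=1/3):
    """Poverty threshold = subsistence food budget divided by food's
    share of household spending (one-third, per 1950s budget studies)."""
    return annual_food_cost / food_share

def update_for_inflation(threshold, cpi_change):
    """Thresholds rise each year with inflation, e.g. cpi_change=0.02 for 2%."""
    return threshold * (1 + cpi_change)

# Illustrative family whose minimal annual food budget costs $4,000:
base = orshansky_threshold(4000)              # 4000 * 3 = 12000
next_year = update_for_inflation(base, 0.02)  # 12000 * 1.02 = 12240
```

The key design choice is that everything hinges on the food budget: any stinginess in that budget is tripled in the final threshold.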
Being poor means having insufficient income during the year to purchase bare necessities. The poverty rate is the percentage of the population in this situation.
The Orshansky poverty measure has been subject to substantial criticism. Clearly, poverty thresholds are not very high. A single individual making US$1,060 a month would not be considered poor. Yet, in most areas in the U.S., it’s hard to rent a place for less than $500 a month.
Even if that’s possible, this leaves only about $20 a day for transportation, clothing, phone, food and other expenses. Orshansky’s minimal food budget assumed that people shop wisely, never eat out and never give their children treats. She actually preferred that a more generous food budget be multiplied by three, but she was overruled by senior government officials.
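The budget arithmetic above is simple enough to check directly. The income and rent figures are the ones quoted in the text; the 30-day month is an assumption, which is why the daily amount comes out slightly under $20.

```python
monthly_income = 1060  # just above the poverty threshold for a single adult
rent = 500             # a low-end rent in most U.S. areas

remaining = monthly_income - rent  # 560 left for everything else
per_day = remaining / 30           # roughly 18.67 a day, assuming a 30-day month
```

However the month's length is counted, everything besides housing must fit into under $20 a day.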
Another problem is that the U.S. poverty measure ignores income and payroll taxes. In the early 1960s, the poor paid minimal taxes. Starting in the late 1970s, low-income families faced a more formidable tax burden, leaving them less money to purchase basic necessities. Conversely, in the late 1990s, tax credits began to lower the tax burden on the poor.
Finally, standards concerning what is required to be a respectable member of society vary over time and place. For example, cellphones did not exist until recently. Childcare was not necessary for many in the 1950s or 1960s; but when all adults in a family work, it’s essential.
More bad news
To deal with this last problem, many scholars prefer a relative measure of poverty. The Luxembourg Income Study, a research organization that analyzes income distribution, considers households to be poor if their income, adjusted for household size, falls below 50 percent of the median income of their country for the particular year.
Unlike the U.S. Census Bureau, the Luxembourg Income Study subtracts taxes from income when measuring poverty. It also adds government benefits, and makes data as comparable as possible across nations. The result is a poverty rate that is typically two to four percentage points above the official U.S. measure.
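A relative measure like the Luxembourg Income Study's can be sketched as follows. The square-root equivalence scale is an assumption on my part, since the text only says income is "adjusted for household size," and the household data are invented for illustration.

```python
import statistics

def equivalized(income, household_size):
    # Square-root equivalence scale: an assumed adjustment for household size.
    return income / household_size ** 0.5

def relative_poverty_rate(households, cutoff=0.5):
    """Share of households whose size-adjusted income falls below
    50 percent of the size-adjusted median income."""
    eq = [equivalized(income, size) for income, size in households]
    poverty_line = cutoff * statistics.median(eq)
    return sum(e < poverty_line for e in eq) / len(eq)

# Toy data: (annual income, household size)
sample = [(8000, 1), (30000, 2), (45000, 4), (60000, 2), (80000, 3)]
rate = relative_poverty_rate(sample)  # one of five households is poor: 0.2
```

Because the line moves with the median, this measure captures the idea that what counts as a bare necessity depends on the standards of one's own society.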
From an international perspective, the U.S. clearly does poorly. According to the Luxembourg Income Study, the U.S. poverty rate was 17.2 percent in the mid-2010s – much higher than in other developed countries, such as Canada and the U.K.
Things are even worse when it comes to child poverty. In the U.S., child poverty rates have surpassed 20 percent for several decades, making it an outlier among developed nations.
My research has identified two important policies responsible for this last result: child allowances and paid parental leave. Child allowances are fixed monthly payments to parents made for each child. Paid leave provides income to parents around the birth or adoption of a new child. Both policies are available in developed nations throughout the world – except the U.S. The more generous these national benefits are, the lower the child poverty rate.
Considerable research shows that growing up poor adversely affects children’s health, as well as their intellectual and social development. It lowers earnings in adulthood, and reduces future tax revenues for the government while increasing government social spending.
The annual cost of child poverty comes to around $1 trillion. Meanwhile, every dollar spent reducing child poverty is estimated to yield $7 in the future. This exceeds the return on most private investments.
Steven Pressman, Professor of Economics, Colorado State University
This article is republished from The Conversation under a Creative Commons license. Read the original article.