By Eric Ravenscraft June 5, 2019
If you ask a dozen lawmakers what constitutes a “living wage,” you’ll get a dozen answers. Where does the term come from? And is it even accurate?
The term “living wage” gets thrown around enough by politicians and advocacy groups that the definition can get muddy. The legal minimum wage in the United States is $7.25 per hour, though some states and cities like New York City and Seattle are experimenting with minimums as high as $15 an hour. But are these wages enough to live on? And what is a living wage, anyway?
The minimum wage roughly meshes with federal poverty guidelines. According to the guidelines, a two-person household with a total annual income below $16,910 is considered to be living in poverty. To clear the poverty line, one of those two people would have to make $8.13 an hour or more while working full time (40 hours a week, 52 weeks a year). At least 17 states have minimum wages higher than that. The $15-per-hour minimum wage in New York City, for example, translates to an annual income of $31,200, which is almost twice the federal poverty level for a household of two.
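The arithmetic behind those figures can be sketched in a few lines of Python. The sketch assumes a full-time schedule of 40 hours a week for 52 weeks (2,080 hours a year); the function names are illustrative, not from any official source:

```python
# Convert between hourly wages and annual income, assuming a
# full-time schedule: 40 hours/week * 52 weeks = 2,080 hours/year.
HOURS_PER_YEAR = 40 * 52  # 2,080

def annual_income(hourly_wage):
    """Annual income for a full-time worker at the given hourly wage."""
    return hourly_wage * HOURS_PER_YEAR

def hourly_wage_needed(annual_threshold):
    """Hourly wage required to reach an annual income threshold."""
    return annual_threshold / HOURS_PER_YEAR

# 2019 federal poverty guideline for a two-person household: $16,910.
print(round(hourly_wage_needed(16910), 2))  # 8.13
# New York City's $15-per-hour minimum wage, as an annual income:
print(annual_income(15))  # 31200
```

Note that any part-time hours, unpaid time off or seasonal work pushes the required hourly wage higher than this full-time calculation suggests.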
However, anyone living in New York City can tell you how laughably low $31,200 a year is for a single-income household. Likewise, roughly $17,000 may be a poverty-level income in much of the country, but that doesn’t mean $18,000 is enough to get by. This flaw in the federal poverty guidelines was described by the woman who developed them, Mollie Orshansky.
In 1965, shortly before the United States government adopted the guidelines, Ms. Orshansky wrote: “There is not, and indeed in a rapidly changing pluralistic society there cannot be, one standard universally accepted and uniformly applicable by which it can be decided who is poor. … If it is not possible to state unequivocally ‘how much is enough,’ it should be possible to assert with confidence how much, on an average, is too little.”