As of this writing, the latest U.S. unemployment figures show that less than 4 percent of Americans are unemployed. That measure has long been used as an indicator of a strong economy, and it has been a frequent bragging point for the Trump administration, but it may not convey the full picture. Despite near-full employment, many American workers are barely scraping by. To understand why, we have to look at the quality of those jobs.
According to the U.S. Private Sector Job Quality Index, prepared by Cornell University, the majority of new jobs added in the U.S. are low-wage and/or low-hour positions. And while that has been the trend for decades, the proportion of low-quality to high-quality jobs has increased dramatically since 1990, to the point that over 60 percent of all new production and non-supervisory (P&NS) jobs created have been low-wage, low-hour roles.
In other words, there may be plenty of jobs, but the jobs being created aren't very good, and a significant and growing majority of them do not provide a living wage. The current federal minimum wage is $7.25 per hour, or $15,080 per year for a standard 40-hour work week. And while the cost of living in the U.S. varies dramatically among the states, and between major cities and rural communities, most Americans would agree that wage is too low to live on comfortably just about anywhere in the country for a single person, let alone someone with dependents.