For many years, the narrative that the best jobs require college degrees has flourished. While a university degree helps many people find high-paying work, only 25% of American adults say it’s very ...