Author: Heather McGowan – Academic Entrepreneur + Innovation Strategist, June 20, 2014
This is part one of a three-part series on the changing shape of employment, the emergence of the collaborative economy, and the potential impacts on higher education.
The era of using education to get a job, build a pension, and then retire is over. Not only is average over and the world flat; this is also the end of employment as we once knew it. The future is one of lifelong learning, serial short-term employment engagements, and the creation of a portfolio of passive and active income streams through the monetization of excess capacity and marketable talents.
Where did this change begin? From 1981 to 2001, the U.S. economy experienced the beginning of middle-class income stagnation, with most job growth concentrated in lower-paying jobs (evidenced by a flattening of median household income relative to GDP). Starting in 2001, even that job growth all but disappeared as companies consigned middle-skill labor, anything that is routine either mentally or physically, to history. Work was automated, digitized, robotized, and/or broken into job fragments that can be supplied from anywhere in the world. Since the Great Recession of 2008, both GDP growth and labor productivity have risen while private employment has declined, rapidly[i]. This is the new normal: employment as we've known it is over.
This was one of my key takeaways from attending Thomas Friedman's Next New World Summit, a gathering of thought leaders from across the value chain: from education (Sebastian Thrun, co-founder of Udacity, and Tony Wagner, author of The Global Achievement Gap), to technology (Laszlo Bock, SVP of People Operations at Google, and Andrew McAfee, co-author of The Second Machine Age), to entrepreneurship (Ben Kaufman, founder and CEO of Quirky), including those who track and monitor these tectonic shifts (James Manyika, Senior Partner at McKinsey, and Jeff Weiner, CEO of LinkedIn).