The jobs of today are quite unlike the jobs of some decades ago. In the years following World War II, employment was generally expected to be steady and tenures long. Universities prepared students for work at a large company or institution in defined fields such as engineering, accounting, sales, marketing, law, writing or medicine. Even during the early internet years, while careers no longer necessarily meant an entire life with a single employer, people did not take a job already contemplating the next change. Nor was the fresh graduate necessarily prepared for that first job right away. Usually, companies trained new employees, rotated them through different departments, and prepared them for the long haul.
The labour market today is far more dynamic. There is the emergence of the “gig economy”—in which workers are essentially independent freelancers who work short-lived gigs found on platforms such as UpWork, possibly out of co-working spaces such as WeWork, and using software services such as Amazon Web Services, Google or Microsoft (instead of a back-office or an IT department).
A recent graduate may quite likely work for a startup, rather than an established company. The very nature of startups is that employees enter with a different social contract—that of impermanence. Even the line between white-collar and blue-collar work is blurring; a musician or a coder might work at Uber or on TaskRabbit between gigs. Transaction costs, which Nobel Laureate Ronald Coase proposed as the raison d’être for the coalescence of companies, are vanishing, and economies seem generally headed for a more disaggregated workforce.
This has an impact on how we educate, and why we teach. First, personal leadership and independence are becoming more important as life skills. Students need to leave college ready to become the chief executive officers of their own lives. For that matter, they also need to become their own chief marketing officers, chief technology officers, heads of sales and heads of innovation.
Furthermore, as the pace of technological progress accelerates, and the half-life of skills decreases (think of how quickly fields such as software development and marketing have evolved over the last decade), students need to take charge of their own learning—by becoming the chief learning officers of their own lives. In short, students need to graduate from college with life skills.
Second, while degrees will remain a crucial signalling mechanism, real, demonstrated capability will become more important in the new world. If you work for yourself, you are more likely to be recruited for the next freelancing gig because you have a track record of delivering value.
Many industries have already gone in this direction. Hotels on TripAdvisor, and merchants on eBay or Amazon, are judged more on their reviews than on their brands. The power of recommendation is practised informally today, and will only become more important in the emerging economy. And this shifts the emphasis from education to true lifelong learning—with less of a focus on exams, and more on applying learning to deliver results.
A third trend is towards integration. A great deal of innovation is occurring at the interfaces of fields. For example, while mathematics and geography are taught in silos today, the opportunity lies in using mathematics and geography together. Geolocation, geofenced advertising, maps, GPS, and other similar technologies are all about precisely that intersection: mathematics and geography. Pick any two subjects, and a fertile intersection either exists or might well be the next big thing. This means that learning must be integrated, and breaking down the silos is essential.
None of this would be possible if we did not have a finely tuned understanding of each other. Herein lies the great potential for the humanities, arts and social sciences.
For one thing, the dismemberment of companies does not mean the dismemberment of teams—rather, the ability to form teams and become productive quickly is an essential aspect of this new agile world. Furthermore, as technology matures, the opportunity for human creativity will increase. Technologies such as drones, self-publishing sites, 3D printing and social media have enabled millions of new creators on new forms of media. Much of human endeavour will be geared towards releasing each other’s potential. This will require a deeper, more heartfelt understanding of the humanities—whether it is history or philosophy, fine arts or social science.
Finally, the jobs and lives of the future pose new and difficult questions of ethics and purpose. For example, we have seen the recent controversy around employees of a food-ordering app refusing to deliver certain types of non-vegetarian food, and the complex ethical questions surrounding it.
Moreover, in a world that is changing foundationally, individuals need to develop an awareness of their sense of purpose, and stay closely connected with it through turbulent times and environments. To quote the American sociologist and civil rights activist W.E.B. Du Bois: “The true college will ever have but one goal—not to earn meat, but to know the end and aim of that life which meat nourishes.”
Kapil Viswanathan and Sanjay Sarma are, respectively, vice chairman of Krea University, and vice president for open learning at MIT