
Out of Sync

What went wrong? While a college or university degree was never a guarantee of a great job, a high income, and a house with a two-car garage in the suburbs, it served as the closest thing North America had to offer as a guarantee of such an outcome—second only to being born into a wealthy family. That so many college and university graduates struggle to find good jobs is due, in part, to the over-enrolment and massive over-selling of the career benefits of a college degree. But the transformation of North American and global markets has played an equally important role in creating the gap between the dreams and aspirations of twenty-first-century youth and the realities of the world economy.

We can put this very simply. The massive effort to get as many people as possible to college or university is not matched by a comparable effort to create jobs for the increased population of young people and for those dislocated from the workforce as the result of various competitive forces. As James Clifton argues:

The coming world war is an all-out global war for good jobs. As of 2008, the war for good jobs has trumped all other leadership activities because it’s been the cause and the effect of everything else that countries have experienced. This will become even more real in the future as global competition intensifies. If countries fail at creating jobs, their societies will fall apart. Countries, and more specifically cities, will experience suffering, instability, chaos and eventually revolution. This is the new world that leaders will confront.[1]

Outsourcing Jobs to Cheaper Labour Markets

With surprising speed, economic globalization and the rise of East Asia, India, and China have undercut some of the key foundations of the Western economy, displacing hundreds of thousands of workers and dashing the expectations of the young in many countries. What happened to the airline industry fifteen years ago serves as a good example. In the late 1990s, airlines found themselves facing a dilemma. They could continue to hire reservation agents in their own countries, paying a decent wage to get the college and university graduates that they preferred to hire. With benefits thrown in, they were paying more than $35,000 a year as a starting salary for what was, to North American youth, an entry-level and temporary job. But then improvements in telecommunications and computer systems provided access to huge pools of well-educated overseas talent. When India became readily accessible, the airlines (and other businesses) could hire from among tens of thousands of well-educated, highly motivated university graduates. The key differences? In India, these jobs carried an annual salary of $5,000—and any candidate fortunate enough to get one of the positions typically held on to it. The choice seemed obvious: one reluctant North American college graduate at $35,000 a year, or seven university graduates from India, the latter being more grateful for the jobs and often demonstrating a stronger work ethic and more commitment to the position. It makes sense that many telemarketing and reservation systems link phone calls to Bangalore or Mumbai, where staff from India are trained, with greater or lesser success, to speak with North American regional accents.

But that was only the beginning. Factory workers were laid off by the thousands throughout the early twenty-first century, with hundreds of North American factories shuttered and the jobs relocated to Mexico, China, Thailand, the Philippines, and India. The damage got worse during the 2008 global recession—brought to you by your friendly Wall Street subprime mortgage companies and their banking allies—a financial catastrophe that ended only when the United States government bailed out some of the most mismanaged and borderline corrupt institutions in the world. Things started to pick up in 2014–2015 at the macro level, with “good news” circulating about the recovery of the American economy, although the crash in oil, natural gas, and commodity prices damaged the resource sector, while Euro-zone instability, tied to Greece’s near-death experience, and China’s recent boom-and-bust stock market made it clear that a return to stability was a long way off.

Still, the much-ballyhooed recovery carried some ominous news. The jobs did not come back as expected. Much was made about a few promising developments. Apple, one of the world’s richest corporations and one of the best American firms at hiding its profits overseas (Apple Luxembourg is fabulously wealthy, thanks to US tax laws), noisily congratulated itself on opening a new factory in California, primarily to offset growing criticism of labour practices at Foxconn and its other Chinese manufacturing partners. But the reality was that jobs did not follow the upsurge in manufacturing in America. Why? In large part because the country’s impressive productivity gains and continued high level of investment in production-improving technologies increased manufacturing output—but not manufacturing jobs. Economic recovery without a surge in employment seemed a partial victory, at best, and a worrisome sign about the economic future.

The New Economy Versus the Old Economy

The new economy produced some high-tech jobs, but the much-hyped growth turned out to be a whimper rather than a roar. Google and Facebook do not employ as many people as General Motors and John Deere. Google, worth $365 billion as of August 2015, apparently heading to $1 trillion,[2] and one of the world’s richest firms, had 53,600 employees in 2014, a drop of 200 from two years earlier. That same year, General Motors had 216,000 workers worldwide, a drop of 3,000 from 2013. But let’s get real here. Walmart—an impressive retailer with a fabulous high-tech backbone but not the dream employer of many young Americans—has 2.2 million employees worldwide and is the largest private-sector employer in the United States. It is a big drop back to the second-largest American employer, Yum! Brands, which has over 500,000 workers, mostly overseas. (Yum! operates KFC, Pizza Hut, and Taco Bell.) UPS, a parcel delivery service and one of the best-automated firms in the USA, has almost 400,000 workers. There are high-tech or knowledge-based companies in the mix, with IBM (434,000), Hewlett-Packard (332,000), and General Electric (305,000) making the American top ten. The point is that Google is a drop in the employment bucket. Walmart has many times more employees.

For so-called “knowledge workers,” the target of governments and universities around the world, an uncertain future lies ahead:

… there is really a double dose of bad news. For not only are their jobs potentially easier to automate than other job types because no investment in mechanical equipment is required; but also, the financial incentive for getting rid of that job is significantly higher. As a result, we can expect that, in the future, automation will fall heavily on knowledge workers and in particular on highly paid workers. In cases where technology is not yet sufficient to automate the job, offshoring is likely to be pursued as an interim solution.[3]

Much the same has occurred in the finance sector, remarkably one of the most desired career paths for young North Americans. Because of the appalling scandals and malfeasance that have wracked it in recent years, you’d think the young would avoid it like the plague, but such is not the case. The banking system has actually continued to grow, particularly in the United States, with the expansion of service outlets in a highly competitive market, but most of the growth has been in service or teller-type positions. In Canada, where an oligopoly of five large banks controls the majority of the financial market, job reductions have been more pronounced. It needs to be said that Canada has one of the most stable and dependable banking systems anywhere. While the USA was recovering from the collapse of Bear Stearns, Lehman Brothers, Fannie Mae, and many other financial institutions in the wake of the 2008 financial crisis, not a single Canadian bank closed down or was even seriously affected. Computerization has brought about sweeping changes, with work shifting from in-person support to online banking and heavily computerized systems. Banks hire many high-technology employees. The financial service and insurance industry remains a major employer—with close to six million employees (nearly three times Walmart’s workforce) in 2014. The securities sector, with almost 890,000 workers in 2014, is expected to grow by an additional 12 percent by 2018. Employment data shows that this remains a service sector, with over 500,000 tellers, 367,000 insurance sales agents, and almost 300,000 securities and financial services sales agents.[4] This is, of course, old-economy work, knowledge-based to be sure, but not exactly the exciting new jobs-of-the-future stuff that promoters of the new economy have been talking about for several decades.

The vaunted post-dot-com economy has produced fine jobs for high-tech wizards, entrepreneurs, and marketers, but far fewer jobs for ordinary graduates and other semi-skilled workers. According to the US Department of Labor, the animation industry—a key sector in the new digital economy—had only sixty-nine thousand jobs across the USA in 2012. What’s more, this exciting sector, one that converted once-unemployable fine arts graduates into tech stars, is notoriously fickle and cyclical, with companies expanding and contracting their workforces with distressing regularity. Many firms have discovered that they can keep just the high-end design work in North America (although even that is facing competition from Japan and South Korea), while the more routine animation work—the digital equivalent of manufacturing assembly work—is automated or outsourced to China, India, and other countries.

Where is the work? Glassdoor, an American jobs and employment site, compiled a list of jobs in the United States for which there is the highest demand. Here are the top ten:[5]


How does this match up with the experience of recent graduates? An estimate of the production of lawyers in the USA indicated that the profession was expected to add almost two hundred thousand positions between 2012 and 2022. In that same time period, American law schools were expected to graduate more than three hundred thousand new lawyers. What are the other hundred thousand going to do? And remember that there are many unemployed and underemployed law school graduates already in the labour force, competing for one of the two hundred thousand new jobs.[6] To touch on a topic to be discussed at length later, major employers have increasingly discovered that four-year college graduates, especially in the high-technology sectors, are not necessarily a good match with job openings. They have turned to short-course providers, particularly in software-related areas, who provide specially trained workers (many of whom already have a degree or two) with career-ready preparation.

These examples make several simple points. There are lots of jobs, and very good jobs, available in the North American economy. The jobs, however, do not align very well with the fields of study of American and Canadian college and university students. And even when there is a direct connection—law school and lawyers’ jobs, computer science and software architect—there is often a mismatch between the size of the graduating class and the needs of the workplace. One starts to feel sorry for the university students and graduates of today: determining the right fields of study, the most promising fields of employment, and the best fit for a career has become extremely difficult, given the shifting realities of the North American workforce.

And for those who don’t want to go to college for four years, there are still some surprisingly well-paid jobs that require training, but not a full degree. According to Nicholas Wyman’s book Job U, these (presumably not the starting salaries) include:

Radiation therapist: $77,500
Elevator technician: $76,600
Nuclear medicine technologist: $70,180
Airline and commercial pilot: $98,410
Dental hygienist: $70,200
Medical sales: $85,000
Air traffic controller: $122,500

With a series of economic crises—including the 2008–2009 chaos created by the American subprime meltdown—piling on top of global workplace changes, job opportunities dried up quickly. College graduates got jobs—employers who had a chance to choose between a university graduate and a high school graduate usually opted for the former—but few of the new hires expected that they would be working at Starbucks, a fire hall, Home Depot, or Walmart. What a difference in a few decades! In the 1970s, rental-car companies hired high school dropouts to staff their counters. By the 2010s, Enterprise Rent-A-Car in the USA declared with pride in its NCAA March Madness television commercials that it was the largest employer of college graduates in the country. Enterprise is a successful and well-regarded company. Many of the graduates are hired into management-stream positions and some do progress through the ranks of the company. It’s probably safe to say, however, that few parents and college-bound students ever sat around the dining room table, college application forms and guidebooks at the ready, student loan forms almost filled out, thinking that the end result of their studies would be work as a clerk at a car-rental firm.

The scale of the transformation of the North American industrial and professional workforce is not widely known. But tremors have been rumbling under the labour force for decades, with surface cracks emerging during recessions as the cruel pressures of international competition and technological change undermine such traditionally strong sectors as automobile and technological manufacturing. Detroit has emerged as the poster child for the continent-wide industrial meltdown, a city in complete free fall, with blocks of abandoned homes, widespread African American poverty, and near anarchy, surrounded by safe, comfortable, and largely Caucasian suburbs of sustained prosperity. In Detroit, the greatest increase in employment in the period 2008 to 2018 is estimated to be in home health aides, not a well-paid occupation.[7]

The crisis was not limited to Detroit. In Cleveland, Ohio, and other industrial cities in the Great Lakes and American Northeast, large companies shed hundreds of thousands of well-paid, unionized jobs. Recent economic challenges have seen the problems spread throughout southern Ontario and into other northern-tier industrial states. Urban blight, fuelled by a widespread flight of jobs and people from industrial areas, became the hallmark of what had previously been the industrial heartland for the Western industrial world. It is as if, only forty years from now, Silicon Valley were to become a wasteland of abandoned high-tech campuses, with the McMansions of former high-tech executives derelict, occupied by squatters, or converted into crack dens in some form of apocalyptic Detroit on the Pacific. What seems ridiculous and unimaginable for Silicon Valley, or Austin, Texas, or San Diego, or Boston was just as far-fetched for the Detroit of the 1970s. We are entering unfamiliar territory, with little evidence that people are paying attention to the warning signs.

The story inside the numbers is particularly jarring. It turns out that North America, the land of opportunity, is becoming decidedly less so. The story is much broader than the Occupy movement’s indictment of the One Percent and the endless—and partially deserved—critique of the richest people in the Americas. Few people realistically expect to rise up into the financial stratosphere, and, except for those who are obsessed by popular culture—fans of the Kardashians and their ilk—few intelligent people care much about the “uber-rich.” Sensible people know that most of the One Percent, like the ever-fascinating Donald Trump, started rich, with boosts from their parents. Most people simply seek a reasonable income, with enough money for the basics and a few extras, a comfortable home, and health security. North Americans overwhelmingly aspire simply to the middle class, or at least to the package of financial outcomes and material well-being historically associated with that status.

Loss of Industrial Jobs

One of the great success stories of post–World War II North America was that the rapidly expanding American consumer economy created conditions that propelled millions of otherwise average people from poverty into middle-class lifestyles. What stood out in this era was that, for the first time in human history, the Western industrial economy produced a large number of well-paid, wage-labour positions for people of average ability and skill. In previous generations, say before 1940, people of average or below-average ability struggled to find good work, and only a fortunate few earned enough money on a regular basis to enjoy a stable and comfortable life. The post–World War II period produced urban growth, fuelled the rapid expansion of suburbs, and raised expectations across society. The related development of the managerial class built an even broader foundation for national prosperity. This class included a large group of government workers in the rapidly expanding civil service, the staff of the expansive financial industry, the workforces of the massive new entertainment and media industries, and the employees of a booming consumer retail trade with its large advertising sector. To top it off, the Cold War and the American adoption of the domino theory forced the USA to engage communism on many fronts, and that required a large standing military—and an even larger military-industrial complex to buttress America’s international commitments.

The simultaneous rise of management, retail commerce, public administration, and high technology created intense demand for college- and university-trained graduates, while also offering decent incomes for armies of industrial workers who had minimal training and no particularly specialized skills. Opportunities for the well-educated were matched by prospects for hard-working and dependable workers of average ability. This was the America of the great postwar boom, a nation ascendant internationally, with an economy that dominated the world and social opportunities that were unmatched anywhere. Young people leaving America’s high schools had a variety of good options, including a post-secondary education followed by entry into the burgeoning white-collar workforce, direct employment in any one of the thousands of manual and factory operations across the continent, or military service as a backup option. Not everything was rosy, of course, since it never is. African Americans had to surmount many barriers to opportunity, and immigrants often struggled to adjust. Women did not experience major employment and income gains until the 1970s. Even America at its height was no paradise. But the country worked better than almost all others on the planet.

The flood tide of American prosperity and social harmony began to ebb in the 1990s. Industrial closures swept across the United States, as companies fell victim to outdated technologies and intense Japanese, South Korean, and Chinese competition. The closures shifted to Canada and Mexico after the North American Free Trade Agreement came into effect in 1994. Women did find more jobs, but stagnant or declining incomes forced millions of families to have both parents working just to maintain their standard of living. Ethnic minorities, particularly African Americans, paid the greatest share of the cost, but so did poorly paid coal miners in the Appalachians and factory workers in uncompetitive industries. Impressions of prosperity lingered, buttressed by America’s unrelenting confidence in the free market, a preposterously overhyped dot-com boom, and an explosion of home values and fraudulent mortgages. But underneath the façade of one of the world’s least-regulated financial markets, the shiny possibilities of Silicon Valley adventurism, and hyperinflation in house prices, major cracks were emerging in the North American superstructure.

The most alarming change, with profound implications for youth prospects in the twenty-first century, was the rapid decline of opportunities for people of average ability and limited skill. With robots and mechanization replacing many factory workers, with global competition eliminating tens of thousands of jobs and crippling the effectiveness of most industrial trade unions, and with much of the country’s economic growth occurring in high technology and finance, the general-purpose industrial labourer lost access to work, income, and opportunity. There were occasional bright spots—Alaska pipeline developments in the 1970s, North Dakota and Wyoming shale gas in the 2010s, Alaska and Texas oil plays, and artificially hyped housing construction—but the overall experience was distressing. Entire towns and regions, from Ohio and Illinois to rural Pennsylvania and upstate New York, suffered through prolonged collapse, urban decay, and a massive increase in poverty among the laid-off workers. In the twenty-first century, the pace of technological displacement accelerated. The current cost of replacing a worker with a machine in the industrial sector is around $100,000. If an industrial machine can be purchased for that sum, a regular position can be eliminated. For many companies, the resulting increase in efficiency and productivity is the only way to remain competitive.
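The arithmetic behind that substitution is worth making explicit. Below is a minimal payback sketch in Python; only the roughly $100,000 machine price comes from the paragraph above, while the wage and upkeep figures are hypothetical assumptions chosen for illustration.

```python
# Illustrative payback calculation for replacing one worker with a machine.
# Only the roughly $100,000 machine price comes from the text above; the
# wage and upkeep figures are hypothetical assumptions for the example.

MACHINE_COST = 100_000               # one-time purchase price (from the text)
ANNUAL_WAGE_AND_BENEFITS = 45_000    # assumed fully loaded cost of the worker
ANNUAL_MACHINE_UPKEEP = 10_000       # assumed maintenance, power, supervision

annual_savings = ANNUAL_WAGE_AND_BENEFITS - ANNUAL_MACHINE_UPKEEP
payback_years = MACHINE_COST / annual_savings

print(f"Annual savings: ${annual_savings:,}")
print(f"Payback period: {payback_years:.1f} years")
# With these assumed figures the machine pays for itself in under three years,
# which is why a $100,000 machine can eliminate a regular position.
```

Under almost any plausible set of numbers the payback period is measured in a few years, which is why the pressure to automate is so relentless.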

The Loss of Middle-Management Jobs

Perhaps the greatest challenge for university graduates is one that has attracted little attention, because it evolved slowly and with little fanfare. Starting with the rapid expansion of government, industry, and the service sector after World War II, North America created one of the most impressive middle-management cohorts in the world. The USA and Canada were not alone. Japan had—and still has—one of the most successful middle-management cultures anywhere. So does England, built around the financial and insurance industries. Germany’s much-vaunted industrial establishment, like the banking sector in Switzerland, is likewise centred on a strong, educated, and conscientious middle-management layer. But consider the observation of Daniel Pink:

During the twentieth century, most work was algorithmic [described as “rules-based”]—and not just jobs where you turned the same screw the same way all day long. Even when we traded blue collars for white, the tasks we carried out were often routine. That is, we could reduce much of what we did—in accounting, law, computer programming, and other fields—to a script, a spec sheet, a formula, or a series of steps that produced a right answer. But today, in much of North America, Eastern Europe, Japan, South Korea, and Australia, routine white-collar work is disappearing; it’s racing offshore to wherever it can be done the cheapest. In India, Bulgaria, the Philippines, and other countries, lower-paid workers essentially run the algorithm, figure out the correct answer and deliver it instantaneously from their computer to someone six thousand miles away.[8]

The transition had a major impact on the prospects for North American youth. In 2015, Robert Putnam published a brilliant but depressing book, Our Kids: The American Dream in Crisis, that documented the rapid division of American society into “have” and “have not” populations. His monumental work makes it clear that college and employment affirmative action cannot overcome deeply entrenched poverty, marginalization, racism, and trauma at the family and community level. Putnam puts words and statistics to what is obvious across the United States and parts of Canada. African Americans are doing much more poorly and have little access to the American Dream. Hispanic Americans, legal or otherwise, have suffered egregiously in educational and employment outcomes. New immigrants, for years the source of much American energy and entrepreneurship, lag well behind. It’s much the same in Canada, where immigrants struggle to have international credentials recognized and where Aboriginal peoples (as in the USA) often live with devastating poverty and community despair. Have-not regions—the Appalachians and Detroit in the USA, significant parts of the Maritimes and rural Quebec in Canada—rely on government transfers and make-work programs. The numbers are shocking: at the end of 2014, over forty-six million Americans relied on food stamps, and for three years in a row this figure included two hundred thousand people with Master’s degrees and thirty-three thousand with PhDs.[9]

Higher Education as the Answer

The problems are deep and systemic, attached to advances in technology, the displacement of labour through globalization, and the shift away from heavy industry across North America. Consider the implications described by Derek Thompson, writing in The Atlantic, who documented the growing number of nonworking men and young people without jobs.[10]

The share of prime-age Americans (25 to 54 years old) who are working has been trending down since 2000. Among men, the decline began even earlier: the share of prime-age men who are neither working nor looking for work has doubled since the late 1970s, and has increased as much throughout the recovery as it did during the Great Recession itself. All in all, about one in six prime-age men today are either unemployed or out of the workforce altogether. This is what the economist Tyler Cowen calls “the key statistic” for understanding the spreading rot in the American workforce. Conventional wisdom has long held that under normal economic conditions, men in this age group—at the peak of their abilities and less likely than women to be primary caregivers for children—should almost all be working. Yet fewer and fewer are.

Starting in the 1980s, with industrial work eroding quickly, governments, parents, and high school counsellors turned to promoting college or university admission as the best path forward. The evidence was clear, displayed in the career and life experiences of those who opted for post-secondary education in the three postwar decades. The wage and income gap between high school graduates and those with college degrees was becoming steadily larger. What was missed in the celebration of the economic “success” of college graduates was the fact that, “The gap has increased mainly because of the collapse of wages for those who have less education, and not because of any dramatic increase in the earnings of college grads, especially new grads. The reason that fact about the gap matters is because it could well be that college grads do far better than high school grads and still do not earn enough to pay back the cost of their college degrees.”[11] The bonus for spending three or four years in post-secondary education, forgoing income during that time, and paying tuition and living expenses was nonetheless impressive and well worth the effort. Get a Bachelor’s degree and earn at least $1 million more over a career than a community-college graduate, and much more than a mere high-school-diploma holder. Get a Master’s or professional degree and add another impressive salary jump. Struggle through to the PhD and earn even more. The formula was so simple. Governments believed that the formula also applied to them: subsidize undergraduate and graduate education, consider these to be investments in people and society, and reap the benefits of much higher taxes in the future. Everyone wins. No one loses.

By happy chance, the Learning = Earning numbers worked out perfectly for the college and university sector. It turned out that, on average, a four-year degree produced $1 million in additional lifetime income for a university graduate over a nongraduate. The world’s best marketing agency could not have found a better set of numbers. Even with the costs of going to school—and before the 1990s, tuition costs were reasonable and within reach for most middle-class families—and the delayed earnings, these young people enjoyed accelerated incomes, as well as cleaner and more attractive office environments.
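The arithmetic behind the Learning = Earning pitch is easy to reproduce. The short Python sketch below shows how a modest annual wage gap adds up to the famous million-dollar figure over a working life; the annual gap and career length are illustrative assumptions, not the author's data.

```python
# Rough reconstruction of the "million-dollar degree" arithmetic.
# The annual wage gap and career length are illustrative assumptions;
# only the headline $1 million figure comes from the text.

annual_gap = 25_000    # assumed average earnings gap, degree holder vs. non-graduate
career_years = 40      # assumed working life after graduation

lifetime_premium = annual_gap * career_years
print(f"Lifetime premium: ${lifetime_premium:,}")  # $1,000,000

# The trap the chapter describes: this is an average across all graduates
# and all fields, and it ignores tuition, forgone earnings, and the huge
# spread between individual outcomes.
```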

College and university recruiters jumped to capitalize on the salary numbers. Sitting at dining room tables with prospective students or talking to an eager and earnest family at a recruiting fair, they could wipe out anxiety about high tuition costs and college living expenses with a simple reference to the million-dollar promise. Viewed this way, a degree was not simply an education. For parents, it was an investment, a financial commitment to their children’s future. And as alternative pathways to comfort and prosperity dried up—often for the parents themselves, as they lost their jobs or struggled with a changing workforce—the college promise hung out there like a bright and compelling beacon. Come hither, it said, to jobs and opportunity.

But from the outset, these statistics had been misleading. Averages are averages, indicating what will happen across a broad population. The averages dealing with college graduates include the Harvard-educated doctor, the Wall Street financier from Princeton, the McGill graduate with a degree in neuropsychology, and the oil and gas engineer from Texas A&M, as well as the English literature specialist from Minot State, the psychology graduate from Cal State Northridge, and the film studies major from New York University. Even the most superficial analysis would suggest that the first four would have dramatically different career outcomes than the last three. But the average figures were undeniably true, as averages, and parents and high school graduates, eager for reassurance, accepted the idea, or, to put it baldly, the lie that a degree—any degree from any institution—would provide a ticket to the middle class.

Parents and prospective students were even more strongly seduced by the additional promise of high-quality institutions. If the promises of universities and colleges in general might be suspect, then at least the best institutions provided better prospects. Harvard, University of Southern California, and University of Texas–Austin were clearly going to offer greater hope than a third- or fourth-tier college. Ditto Stanford, Duke, Princeton, and Yale. The data seem to bear this out. The simple dominance of Ivy League graduates in the American finance sector provides seemingly irrefutable evidence of what many North Americans take to be an unassailable truth. If Learning = Earning, then Learning at an Elite Institution = Even More Earning. The result, of course, is intense competition for entrance to the elite schools, with the best institutions attracting many times more applicants than there are first-year spaces. In one of the many perversions of the United States’ post-secondary education system, institutions are rated on the ratio of applicants to admissions. Clearly, a college that turns down more students is more selective and therefore more elite and more attractive. And so institutions seeking to position themselves as among the very best in an intense and overcrowded North American marketplace make a strong effort to recruit more applicants, even when they already have ten to fifteen times more would-be students than spaces.

The Jobs Crisis

But as the truth about graduate outcomes slowly became clear, the return on investment or ROI—an acronym that took over from a high-quality education as a primary goal for post-secondary education—started to decline. It also started to split between career-ready degrees and more general fields of study that lacked a precise career focus. Business schools, the weak cousin of colleges and universities in the 1960s, had become institutional superstars by the 1990s. Top business faculty members attracted rock-star salaries and attention. Students who had flooded into Arts and Science programs like lemmings in the 1970s and 1980s shifted gears in the 1990s and 2000s and headed for business schools. Even the prestigious American liberal-arts institutions—Colorado College, Middlebury College, and Swarthmore College—did the previously unthinkable and added business and economics degrees to what had once been the best Arts and Science undergraduate degree programs in the world.

The jobs crisis, which began in the low-skill industrial sectors, has started to infect the professional ranks. Law students, lured into degree programs by the prospect of high-profile and high-income jobs, have faced brutal job prospects across the United States and Canada in recent years. In the mid-2010s, thousands of new law graduates struggled to find articling positions, rang up mountains of debt, and then failed to find the lucrative jobs they expected. The crisis was made worse by the out-of-sync career and salary expectations of aspiring lawyers who may have watched too many episodes of Law and Order or seen Michael Douglas’s turn in Wall Street (“greed is good”) as an enticement rather than an indictment of a failed financial system.

But you can’t fool all of the people all of the time, as a self-educated American lawyer once said, and students eventually caught on and responded to the collapsing job market. In the United States in 2014, law school admissions fell by some 40 percent from the previous high. Law schools felt they had two choices: close their doors (several did) or lower their academic standards to fill their classrooms and pay the bills. The latter practice was particularly noticeable at the larger for-profit schools that had sprung up to meet the seemingly insatiable demand for legal careers. Canada, incidentally, followed a contrary track, with continued strong demand for spots in law schools—but equally severe difficulties for graduates looking for articling positions and full-time employment. In fact, two new law schools opened in Canada, both in smaller regional centres that had struggled to find lawyers willing to work in those areas.

In 2012, some American law schools, in desperation, reportedly gave in to the temptation to game the system.[12] After-graduation surveys are often pretty superficial and do not inquire deeply about the positions that former students hold. Surveyors simply want to know if the graduates have a job and if the position is related to their law degree. So, the solution must have seemed obvious. The law schools hired their own graduates and gave them quasi-legal jobs that were of long-enough duration to cover the survey period. The results improved, the law schools looked better, and no one complained; at least, not until the truth came out. When it was revealed that certain law schools responsible for training the nation’s lawyers had betrayed the very principles they were teaching, many observers were disgusted, although the scandal did not get nearly as much attention as did the New England Patriots’ “deflate-gate” scandal in the American Football Conference finals in January 2015.

Other law schools have become bottom-feeders. The classic example is Florida Coastal School of Law. In 2013 the median score of its entering class lay in the bottom quarter of everyone in the country who wrote the LSAT, something that the LSAT administrators say makes it unlikely that these students will ever pass the bar exam. Nevertheless, the school charges nearly $45,000 a year in tuition. Ninety-three percent of the 2014 graduates had debt averaging $163,000—this for a degree they may well be unable to use, if they graduate.

A substantial number of graduates in many disciplines, particularly in the Arts and Sciences, have had dismal job experiences. As a result, many students have chosen to continue their studies at graduate and professional schools in a search for careers and a decent salary. The continued pursuit of graduate qualifications has, for many aspiring students, proven disappointing as well. This is particularly true at the Master’s level, where ROIs in certain Humanities fields have dropped below zero—a fancy way of saying that the cost of completing the degree plus the income forgone while studying is greater than any increase in earnings associated with the degree. So no million-dollar bonus here. This career calculus is fine if the motivation for completing the degree is learning and self-fulfillment. If the students (and the parents and governments that fund the system) select the degree in anticipation of a better job and higher income, then the outcome is a disaster.
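That "below zero" result is simply a comparison of two sums. The sketch below runs the return-on-investment test for a hypothetical two-year Master's degree; every figure in it is an illustrative assumption rather than data from the book.

```python
# Simple ROI test for a hypothetical two-year Master's degree.
# Every figure here is an illustrative assumption, not data from the book.

tuition_and_fees = 2 * 15_000    # assumed cost of two years of study
forgone_income = 2 * 40_000      # assumed salary given up while studying
total_cost = tuition_and_fees + forgone_income

annual_earnings_bump = 2_000     # assumed raise attributable to the degree
working_years_remaining = 35
total_gain = annual_earnings_bump * working_years_remaining

roi = (total_gain - total_cost) / total_cost
print(f"Cost: ${total_cost:,}  Gain: ${total_gain:,}  ROI: {roi:.0%}")
# With these assumptions the gain ($70,000) never catches up with the cost
# ($110,000), so the ROI is negative: the "below zero" outcome described above.
```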

Four Backup Careers for New Teachers

These are Canadian examples for teaching graduates who can’t find jobs or want to delay their entry into the field, but the ideas are universal. For Royal Canadian Mounted Police, for instance, substitute FBI.

1. Teaching abroad

There are many countries where English teachers are highly sought (e.g., South Korea, the Middle East, and Japan). If you’re an adventure seeker, teaching abroad on a one- or two-year contract is a great option. The classroom experience could prove useful when you return.

2. Private tutoring

You can work with tutoring companies such as Alliance or Kumon, or manage your own students. The rates are good—up to $30 per hour.

3. Private schools

People shy away from teaching at them because of the stereotype that affluent students are entitled and unpleasant to teach. But the stereotype is often unfounded, and the classes are relatively small and teaching resources abundant.

4. The justice system

Who says you have to stick to the classroom? The Royal Canadian Mounted Police has also been recruiting education graduates lately for civilian jobs as instructors, youth workers, or victim-services staff. Skills acquired in teacher’s college—flexibility, planning, and multitasking—are useful in the justice system too.[14]

The oversupply of people with undergraduate degrees has morphed, in short order, into an oversupply of people with graduate degrees. Where there were once too many people with Bachelor’s degrees, there are now also too many with Master’s. A good portion of this is artificial, though, since the largest field of graduate studies in North America—by a very significant margin—is education. In the high-demand areas where the continental economy truly needs highly skilled and well-trained people, such as mathematics, computer science, and engineering, the vast majority of North American PhD students and graduates are international students. Not so in education. In this extremely inflated field, enrolment is driven by the immediate salary bump for teachers that follows graduation.

The evolving patterns of the modern workforce have destroyed dreams by the hundreds of thousands. Many Chinese families who borrowed heavily to get their children into a North American university have seen them struggle to find work when they returned home. Thousands of American young adults have lived in undergraduate-like poverty as they searched for work, and large numbers have moved back in with their parents. Failure to launch, much more than the title of a mediocre movie, became the hallmark of the Millennial generation. While many coped with long-term underemployment and unemployment, others ascribed failure to their own shortcomings. Few looked across a landscape of destroyed expectations and identified structural or fundamental flaws with the economic and educational order.

From about 2000 to the present, the experience of young adults has mirrored and contributed to the changing employment landscape across North America and around the world. Some succeeded and did extremely well. Ivy League graduates continued to land six-figure jobs on Wall Street. High-tech firms in Massachusetts, north Texas, and California scoured the graduation lists at MIT, Harvard, University of Texas–Austin, Stanford, and the University of Waterloo to find the highly skilled workers they needed for their new economy firms. In short, there was wealth and opportunity for the few; struggle and frustration for the many. More than the mantra of the Occupy movement, this described the real-life experience of college and university graduates in general.
