Want the Ideal Job? Craft it Yourself

Photos from Rachel- Potter Class at Earthborn 2012. Courtesy of Unskinny Boppy.

There’s something strangely satisfying about a hobby where you do what you want for a few hours.  Wouldn’t it be great if your whole career was that satisfying?  Well, it’s possible, but you need to decide that your own job content is the thing you should craft.

An interesting conversation was ignited in the New York Times “Workologist” column by Rob Walker.  In March the career advice column fielded a question from someone considering leaving their job several years before retirement because of excessive work travel they found unpleasant.  Readers objected that not enough consideration was given to whether the employee could simply ask for less work travel.  In his follow-up column of April 1, 2018, Walker broached the more sophisticated subject of job crafting.

Job crafting is the practice of employees identifying what parts of their job they do best and find fulfilling, then putting more of their time and effort into those activities.  Work that is difficult or unpleasant may be down-scaled, dropped, or given more support.  It is different from manager-initiated job design, as job crafting is initiated by the employee.

Significant work has been done in this area by Amy Wrzesniewski from Yale University.  Wrzesniewski and her peers have developed a formal methodology for job crafting, including an assessment of what change is desired, building-block visual tools, a before-and-after dichotomy, and a good dose of positive psychology.  You can buy a copy of the Job Crafting Exercise workbook for about US$35.

Why Job Crafting is a Good Idea

The methodology is described in a 2010 article in Harvard Business Review, in which the authors note:

“…employees (at all levels, in all kinds of occupations) who try job crafting often end up more engaged and satisfied with their work lives, achieve higher levels of performance in their organizations, and report greater personal resilience.”

Job crafting improves proactivity, innovativeness, adaptability, and emotional well-being.  Employers see this practice as a way of giving employees an opportunity to self-motivate and improve the likelihood they will stay.

The self-directed feature of this practice is key. Managers may have insufficient time or knowledge to figure out how to organize the work of their subordinates for the better. If employees are given the opportunity to self-manage in this way, it can allow the manager to achieve results without having to do all of the leg-work.

In a 2013 paper by Justin Berg, Jane Dutton, and Amy Wrzesniewski, Job Crafting and Meaningful Work, the authors describe job crafting as key to making work more meaningful.  The authors identify three main categories in which employees attempt to re-craft their jobs for better fit and meaningful work.  Those categories are the employee’s key motives, the leveraging of the employee’s strengths and talents, and the ability to pursue passions and topic-areas of deep interest to the employee.

What Will Your Manager Say?

It should go without saying – so let’s be blunt – that bottom-up efforts to change job content are contingent on good conversations between the employee and their manager.  There will be unpleasant work that everyone wants to avoid, and employees might self-select away from that work.  I once heard an astronaut describe how everyone on a space station is expected to take their turn emptying the toilet regardless of their rank. Because of how unpleasant the work was, it was a badge of honour that everyone took their turn, without complaining, for the benefit of the team.  This might be a bad example, however, because these employees get to be astronauts.  You don’t hear them complaining about excessive work travel.

The Downside of Job Crafting

Because I have worked in both the labour movement and the compensation field, I know there is likely to be resistance to this practice. Let’s explore why.  To some extent organizational hierarchy is designed to keep people in their place so they will deliver the goods.  At least that’s the predominant opinion amongst management bullies and trade unionists alike, with the main disagreement being who should be in charge and how to divide the spoils.  Under that conceptual model it’s common for employees to assert they should be given less work, get promoted, and be paid more.  It’s just not feasible to say “yes” every time.

However, there is more than one model.  There is a time and a place for meaningful work, thoughtful job design, and power-sharing between employees and managers.  It is implicit that job crafting is only viable where it is possible for the employee to control the job content, and this autonomy itself may be one of the items the workplace needs to work on. In lieu of control systems and the maximization of effort, workplaces may instead pursue a mindset of growth, adaptation, and collaboration.  Indeed, those items underpin most efforts to improve workplace culture.

There are downsides to job crafting.  Yes, some requests for a change of job content run counter to the organization’s goals. Employees can also take on more “fun” work than they expected; they get better work but just too much of it.  And for those seeking their true calling, it is possible that they will be exposed to features of their interest-area that they had previously been unaware of.  As in, be careful what you wish for, you might actually get it.

Many people want to see more of the world, wishing they could travel more.  Others like to get out to cocktail parties to strike up conversations with new people.  And if you don’t do it very often, being in lengthy meetings making important decisions can be a thrill.  But it’s also nice to get home, have deeper conversations with close friends and family, and put independent thought into simple things within your own control.

It’s a good idea to know what you want before you try to go out and get it.  You need to know yourself to be yourself, and sometimes you only figure that out by experimenting.  But if you’re lucky you will probably discover that the biggest treasure you can ever find is yourself.

Not Normal is Now Normal and More Productive

Day 42, Hannes. Photo courtesy of A. David Holloway.

It’s the research you’ve all been waiting for: nobody is normal.  You might think I’m trying to reassure you that you’re normal-enough to be accepted, but no, that misses the point.  Everyone is unique and weird in their own way, and this is what allows everyone to function at their best as individuals.

The study, by Avram J. Holmes and Lauren M. Patrick, is titled “The Myth of Optimality in Clinical Neuroscience” and appeared in Trends in Cognitive Sciences on February 20, 2018.

The authors were looking at the complex environmental circumstances under which mental illnesses develop.  There is an emerging effort to develop broad datasets that isolate what causes someone’s brain-function to diverge from the ideal mental state.  About that: there’s not a single ideal mental state.

“We challenge this concept… arguing that there is no universally optimal profile of brain functioning. The evolutionary forces that shape our species select for a staggering diversity of human behaviors.”

At Inc.com, Jessica Stillman notes that “…for all but the most obvious maladaptations, there is almost always a mix of good and bad results from any given variation.”

“Take anxiety, for instance. …science shows that anxiety is probably keeping you safer, pushing you to be better prepared in important areas of your life, and improving your memory, even if it often doesn’t feel good… Or look at risk taking. If you’re a little further on the fearless end of the spectrum, your chances of suffering some life-threatening mishap are likely higher, but so are your chances of starting a world-changing company. Our strengths and weaknesses are intimately tied together.”

This research confirms what has long been understood from folklore, the humanities, and the school of life: everyone is different and we need to honour and cherish these differences.

Now that there is data to back it up, can we assert this wisdom more boldly?  I think we can and should.  There are profound implications for emerging workplace issues such as equity and inclusion, work-life balance, wellbeing, and performance management.

Equity and Inclusion

The research brings depth to the thinking around equity and inclusion.  Looking at demographic traits is one window into the ways in which totally arbitrary types of people get ahead while others are left behind.  If we want everyone to be at their best, we must strive to open our definition of what “best” looks like, be it sex or race or personality profile.  If there is a “type” who is tapped or favoured because they fit the mold, we need to step back and consider if we are being drawn into a bias, be it conscious or unconscious.  We need to look beyond types, consider the individual, and brace ourselves for plenty of surprises about who’s going to rock it, and how.

Work-Life Balance

There are also implications for work-life balance.  As employees go through major life events there may be special moments when they are a perfect match to your workplace.  But their home lives are important, and personal lives beckon for time, attention, and commitments.

Striking the balance is key in supporting employees to show up in their best form and deliver their best strengths. That balance hinges on allowing everyone to be themselves both at work and at home. Sometimes an employee’s workplace personality brings differences in what they can deliver.  And sometimes an employee chooses a home life that allows them to be their best.  Don’t make them choose between the two; they’re busy being themselves.

Wellbeing

With wellbeing efforts, every high-functioning workplace needs to evolve beyond claims-cost-reduction and mandatory anti-bullying courses.  If a workplace has developed a strategic and holistic sense of why they are advancing wellbeing, they are likely to happen upon the World Health Organization’s definition of mental health.  That definition emphasizes that to feel “well” people need to realize their potential, work productively, and make a contribution to their community, among other things.  How could that be possible if the corporate standards of performance disregard the unique ways in which each person is exceptional?

Performance Management and Competencies

This research raises questions about performance measurement against prescribed competencies.  Yes, employees need to deliver outputs at the right levels of quality, cost, and timeliness.  Yet the more specific we get about the kind of excellence expected, the narrower the opportunities for people to excel.

Competencies were originally put forward as a cutting-edge practice, blending the skills and attitudes that employers wished people would bring to their daily work.  Competencies allowed employers to get beyond people-as-machines applying skill and effort to the tasks specified in the job description.  But there is a flaw.  Top-down descriptions of desired competencies undermine the ability of individuals to define their unique strengths from the inside-out.


If people are to flourish, they need to be coached to identify their unique talents, develop their own learning objectives, and deliver work in a way that allows them to grow into their exceptionalities.  We need to recognize what is great about each person, anticipating that there may be a downside.  As people put themselves forward we need to accept them warts and all.  In order to develop people for their best growth we need a workplace culture of trust, sympathy, and encouragement.

By contrast, exercises where we score people against a half-dozen competencies sent down from corporate seem hopelessly archaic.  Allowing a privileged few to define themselves as excellent and encourage others to play along seems narcissistic and biased.  And telling others to achieve life-balance and wellbeing according to the standards of those with power reveals an antipathy for wisdom.

So spread the word: everyone needs to get their freak on.  If people can know themselves and be themselves, they’re far more likely to deliver the goods.

Curiosity is Key. Ask Me How.

CIMG5944.  Photo courtesy of Tim Sheerman-Chase.

At work, do you sometimes feel guilty about indulging your curiosity?  Well, it turns out curiosity is a bigger benefit to your workplace than you might have expected.

Zander Lurie, CEO of SurveyMonkey, asserts that curiosity is the attribute we most desperately need in today’s corporate environment.  He provides data co-created by SurveyMonkey showing that curiosity is significantly under-valued.  Senior leaders “…are speaking more and more about the importance of curiosity, recognizing it as the ultimate driver of success.”

This opinion is consistent with the finding that the best leaders are good learners.  The rules keep changing because of technology, political disruption, and demographic shifts.  Your excellence in past years may be irrelevant to the future, whereas your ability to learn-forward from your current state is critical.  You can keep pace with moving goal posts.

In Lurie’s data, executives mostly think there are no barriers to asking questions in their organizations.  But there’s a problem: employees think otherwise.  I think executives are gripped by wishful thinking.  They wish they had a culture in which information flowed freely upward while decisions still followed their own voice.  They talk a good line about a two-way exchange of information and decision making.  But the sincerity is perceived to be lacking.

Citing research from Stanford’s Carol Dweck, Lurie asserts that

“The Culture of Genius is largely to blame. In this type of company culture some minds are seen as inherently more brilliant than others, and others are intimidated to question things and speak up as a result. It can create a toxic environment that’s stifling curiosity and has many employees doubting whether they ‘have what it takes.’”

In his article Lurie references an interesting academic paper from 2014 by Matthias Gruber, Bernard Gelman, and Charan Ranganath.  To spare you the polysyllabic details: curiosity improves learning.  This finding is sensitive to the learner’s innate curiosity about a topic (i.e. intrinsic motivation), which implies that we cannot always prescribe what others ought to learn.  It’s a nuance in workplace learning, as organizations often have a list of prescribed skills and attributes (i.e. competencies) that they perceive will determine organizational success.  But if they impose this learning obligation, they might get inferior results.

The learners who are best for an organization may be those who are already fascinated by the topic-area where an organization needs growth.  Identifying and cultivating a pre-existing fascination may be more of a recruitment-and-selection question than a performance-appraisal thing.  It poses some touchy questions about leadership style: do leaders have to hang back in those cases where the employee is already growing into a challenge of their choosing?  What shall we do with the performance scorecard, core competencies, and the mandatory learning modules?  Where’s the part where the leader “causes” important things to happen?

If a leader wants to “drive” high performance in learning, I think they would need to be good at spurring intrinsic motivation.  This has to be the hardest of soft skills.  I have a son who is fascinated by police, and there is a game he plays (in Roblox) where he’s required to write a report for every arrest he makes.  If I could just get him to write his reports with proper grammar, spelling, and punctuation, he would be producing a robust volume of writing every day under his own motivation.  But he didn’t seem to care when I last suggested this, so I had to back off.  I’ll try again next week.

The paper by Gruber and co. also finds that when learners are engaged in their curiosity they remember random trivial information in the surrounding environment.  You may have experienced this yourself: that moment you learned that one amazing thing… you can recall the room you were in, who you were with, and the weather that day.

This is notable because in business analytics it’s understood that information is data in a meaningful context.  All happenings are sensitive to the history, geography, economy, and culture in which they occur.  We don’t really get to decide what’s important and what’s trivial.  The large-and-small of every situation co-determine one another, such that tactics are just as important as strategy.  Given the research, it’s fortunate that brains remember the core experience as well as the context, as this gives us a natural opportunity to combine science and story.

Lurie makes compelling suggestions on how to turn curiosity into a strategic resource.  Make questions central to your daily work.  Encourage transparency.  Ensure the environment is safe for this exploratory behaviour.  Ensure diversity at all levels, to signal that all perspectives are cherished.  Direct this curiosity towards contact with customers.  “Celebrate prudent risks that fail – otherwise you will create a culture where employees are risk averse, thereby limiting your upside.” (Emphasis added)

Most intriguing is that Lurie asserts that since Artificial Intelligence will allow robots to out-do us on efficiency and quality, “Being curious is our best defense.”  As we name compelling human instincts that cannot be imitated by robots, future careers become increasingly evident.  Decide for yourself what you think is interesting and share your discoveries with executives and clients.

The robots won’t have a clue.

Spaghetti Principle Best Way to Change Minds

IMG_0580.  Photo courtesy of Brent.

Does everything change when you touch it?  Yes for spaghetti: spaghetti changes when you touch it.  But what about people?  Do people change when you try to move them?  Sometimes.  Only sometimes.

One of my sub-skills is my ability to give one-on-one tutorials to colleagues to bring them to a higher level of proficiency in Microsoft Excel.  Results vary, not because of talent, but more because of the person’s interest-level and their opportunity to apply the learning. I have done these tutorials enough times to know that there is a major concept that everyone needs to “get.”  So I offer the spaghetti metaphor.

When you move cooked spaghetti from the colander to the dining table, there are two ways that it gets there.  First, you move spaghetti out of the colander and onto the plate, changing the layout of the noodles in the process.  Then, after putting on the sauce, you move the entire plate to the dining table.  Transporting the plate does not change the layout of the noodles.  You can move the noodles or move the entire plate.  The distinction is that in some cases you change the configuration of the contents and in other cases you change their location but with the configuration left intact.

For those struggling with Excel, the issue is that if a cell (or rectangular range of cells) contains formulas, you must cut-and-paste the cells, drag-and-move them, or copy the formula text from the formula bar to move a formula without altering it.  By contrast, if you copy-and-paste a cell or use the autofill feature, the formula will automatically change so that all of its cell references move accordingly.  You don’t have to worry about this if you’re not manipulating Excel right now.  As I mentioned, your ability to grasp this depends on your opportunity to apply the learning.
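To make the distinction concrete, here is a minimal sketch in plain Python (no spreadsheet library involved; the shift_formula helper and its single-letter column handling are illustrative assumptions of mine, not how Excel is implemented).  Copy/autofill rearranges the noodles by shifting relative references; cut/move carries the plate with the formula intact.

```python
import re

def shift_formula(formula, row_offset, col_offset):
    """Mimic how relative references (e.g. B2) shift when a formula is copied."""
    def shift(match):
        col, row = match.group(1), int(match.group(2))
        new_col = chr(ord(col) + col_offset)  # illustrative: single-letter columns only
        return f"{new_col}{row + row_offset}"
    return re.sub(r"\b([A-Z])([0-9]+)\b", shift, formula)

original = "=B2+C2"
print(shift_formula(original, row_offset=1, col_offset=0))  # =B3+C3 -> copy/autofill rearranges the noodles
print(original)                                             # =B2+C2 -> cut/move keeps the plate intact
```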

Enough math, let’s extend the concept to people’s opinions.  Are there cases where we attempt to move the logic in the minds of others?  Yes indeed.  Sometimes when you attempt to compel others to think of things differently, you get to change the configuration of their spaghetti-scramble of ideas.  But other times, you simply move the plate.  You get a person with the exact same opinions as before, they’re just in a different place, possibly more entrenched.

On Ozan Varol’s website, the rocket-scientist-turned-contrarian-author has some advice on how to change people’s minds.  Varol explains that people’s beliefs have an outsized impact on their grasp of the facts.  This role of beliefs drives a cognitive fallacy known as confirmation bias, the tendency for us to select facts that strengthen our beliefs and gloss over those facts that are disruptive and uncomfortable.  The challenge is that we cannot use facts to drive changes of opinion, because it’s almost impossible to get into people’s grasp of “the facts” without attacking their intelligence.  So their defenses go up and they tell you where to go.  You know how this goes.

Varol recommends re-framing either-or debates around an alternate frame of reference.  His best example is when Colombians in the 1950s were grappling with the collapse of the Rojas dictatorship.  An entrenched mindset would blame the military for complicity in the Rojas regime, but that’s not what happened.  Instead, citizens offered an alternative narrative that “…it was the ‘presidential family’ and a few corrupt civilians close to Rojas – not military officers – who were responsible for the regime’s success.”  This narrative significantly reduced the risk of Colombia slipping into a military dictatorship.

As an academic, Varol presents papers at conferences with a subtle verbal shift.  He presents opinions somewhat detached from himself (“This paper argues…”) so that his ideas are lobbed into the public sphere to be thrashed about until others come to a more meaningful conclusion.  When he made this shift his ideas “took a life of their own” allowing him to view his own arguments with some objectivity.

You can do this too.  Varol encourages you to befriend those who disagree with you, expose yourself to environments where your opinions can be challenged, and presume that you will experience some discomfort.

Personally, I think the big deal is to get over yourself.  Or to be precise, that I need to get over myself. (See what I did there?)  If everyone other than me has opinions that are a random configuration of noodles, what are the odds that my own ideas are configured perfectly?

When it’s my turn to make spaghetti, I get the noodles into the plate, even them up, pour the sauce, and just get it all onto the table.  I have one kid that hates parmesan, and another that hates pepper.  Neither of them uses a spoon.  They handle the noodles as they see fit.  I let everyone enjoy what’s in front of them, while we talk about our day and our lives.  Hands off the noodles, because now’s the time to enjoy people.

Information is the New Sugar

pie. Photo courtesy of chad glenn.

On Pi Day, are you able to resist temptation?

The bright colours?

The sweet flavours?

Maybe once a year it’s good for you.  But what if you were force-fed sweets every day?  That’s what’s happening today with information.

In an article in Wired, author Zeynep Tufekci makes a comparison to food when describing the addictive power of information.

“…within the next few years, the number of children struggling with obesity will surpass the number struggling with hunger. Why? When the human condition was marked by hunger and famine, it made perfect sense to crave condensed calories and salt. Now we live in a food glut environment, and we have few genetic, cultural, or psychological defenses against this novel threat to our health.”

The author compares our food behaviours to our current addictions to highly processed data:

“Humans are a social species, equipped with few defenses against the natural world beyond our ability to acquire knowledge and stay in groups that work together. We are particularly susceptible to glimmers of novelty, messages of affirmation and belonging, and messages of outrage toward perceived enemies. These kinds of messages are to human community what salt, sugar, and fat are to the human appetite.”

There was a time when humans desperately needed food and new information.  Once those needs were satisfied, the ability of industry to exploit our lingering sense of need and push unhealthy variants and volumes became the next big threat.

With food, it is helpful to seek out existing traditions in which things have been figured out already.  Healthy people eat in a manner that resembles the cuisine of their grandparents, rejecting processed foods and fad diets alike.  To quote Michael Pollan, the food writer, “Eat food, not too much, mostly plants.”  So, if we were to seek healthy and viable traditions in the free flow of information, where would we turn?

Pi Day is a great place to start.  In the late nineties, I stayed at the home of a family friend named Larry Shaw, a science educator at the San Francisco Exploratorium.  During this trip Larry handed me a slice of pie on March 14.  I didn’t figure out until years later that he was the creator of Pi Day.  Larry looked like a hippie, and he had a great sense of fun.  But at heart he was close to a serious movement to empower people to disagree with those in power, and to express that disagreement through free speech.

We watched a brief documentary about the Free Speech Movement.  In 1964 a man named Jack Weinberg was arrested for distributing political materials on the Berkeley campus.  Students encircled the police car Weinberg was in.  There was a 32-hour stand-off during which activist Mario Savio gave a compelling speech, saying:

“There’s a time when the operation of the machine becomes so odious — makes you so sick at heart — that you can’t take part. …you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop. And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all.”

In the era of social media and big data we are experiencing this same problem, but in reverse.  In decades past, government and industry asserted legal power and made threats against the publication of some news.  Coercion narrowed perspectives and whipped the public mood into compliance.  When protests break out today, we know about it through social media in minutes, without the support of broadcast media.  This should be the golden era of free speech.  But it’s not.

Nowadays when you see news it is unclear if you are receiving something accurate.  And if you are the one posting the video, Tufekci asks “…is anyone even watching it?  Or has it been lost in a sea of posts from hundreds of millions of content producers?”  It’s not the case that accurate news is reaching the broadest audience, and it’s not the case that you as a citizen can make your voice heard.

Social media offers a community experience that is equivalent to shopping for groceries at a convenience store.

Tufekci notes that the world’s attention is overwhelmingly funnelled through Facebook, Google, YouTube, and Twitter.  These entities

“…stand in for the public sphere itself. But at their core, their business is mundane: They’re ad brokers. …they sell the capacity to precisely target our eyeballs. They use massive surveillance of our behavior, online and off, to generate increasingly accurate, automated predictions of what advertisements we are most susceptible to…”

The author makes the case that freedom of speech is not an end in its own right.  Rather, it is a vehicle through which we achieve other social goals, such as public education, respectful debate, holding institutions accountable, and building healthy communities.  Consider Savio’s “bodies upon the gears” speech and you know he wasn’t in this so you could look at food porn or cat videos.

We shall seek the best possible recipe for our knowledge.  We need to read books, watch well-produced documentaries, and talk to trustworthy friends who are knowledgeable on the right topic.  We must be skeptical of those in power but even more skeptical about friends who coddle us with complacent views.  Seek information that is healthy and fulfilling, and guard it like a borrowed recipe from your grandmother’s box of index cards.

And yet, enjoy small amounts of rumor and gossip, like the indulgence in a favorite slice of pie.  You still get to have fun, once in a while.  You’re still human.

Not Too Shocking – Those High Numbers from AI Job Disruption

Shocked. Photo courtesy of Mark Turnauckas.

Can you think of a time you took advantage of a new technology, and in the process got way more work done?  We’re all going to need more stories like this in order to stay ahead of the game.

I’ll never forget my first exposure to a pirated version of Microsoft Excel.  I was in graduate school in 1994 and a young woman in my class, Bev, handed me a stack of eight floppy disks held together with a blue elastic band.  She told me Excel was way better than what I was using.  Six months later I had finished an entire graduate thesis based on clever charts and tables I had created using new software.  Six months after that, I was at a firm in one of the towers in Toronto’s downtown core with experienced consultants lining up at my cubicle, waiting for some solid analysis.  My mind had co-evolved around the technology, and I was valued.

For many months I was the only analyst on a team that had four consultants.  When new technologies are brought in, sometimes one person can do the work of several peers.  And this appears to be a concern today with incoming technologies, such as artificial intelligence, internet of things, and analytics.

There has been some excitement lately about McKinsey’s report that 800 million jobs will be eliminated worldwide by technology.  Reading the content of the report – not just the media coverage – I can assure you that it’s far less dramatic.

First, the 800 million figure was the upper end of a forecasted range, and the authors recommend considering the mid-point of the range, which is 400 million jobs.  Those 400 million jobs correspond to about 15% of current work activities in the global labour market.  These job losses are not expected to be immediate, as this is a forecast out to 2030 – twelve years from now.  That works out to roughly 30-35 million jobs lost per year, which seems far more modest on a planet with 7.6 billion inhabitants.
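As a quick back-of-the-envelope check of that per-year figure, here is a small sketch in Python using only the rounded numbers quoted above (simple arithmetic, not a forecast of its own):

```python
# Sanity check of the per-year figure using the report's rounded numbers.
midpoint_jobs_displaced = 400_000_000   # mid-point of McKinsey's range out to 2030
years_to_2030 = 12                      # forecast horizon from 2018
per_year = midpoint_jobs_displaced / years_to_2030
print(f"roughly {per_year / 1_000_000:.0f} million jobs per year")  # ~33 million
```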

But it gets better.  Of the 400 million jobs lost, only 75 million jobs will be eliminated altogether.  The remaining job losses will be in cases where parts of our jobs will be eliminated.  About 30% of “constituent” work will be automated for 60% of occupations.  That is, there will be bots taking care of the more mundane parts of our jobs.  It remains to be seen whether this shift will result in 30% less employment, or if our outputs will just be more efficient.  There may be a line-up at your own desk, with senior people increasingly reliant on your own unique, human-machine hybrid.

This technological revolution will have more dramatic impacts on industrialized economies such as Canada, the U.S., and Europe.  New technologies have a cost of implementation, and cost savings are needed to justify the investment.  A lot of cost savings can be found in eliminating expensive jobs.  But in the developing world, wages are lower and the gains of the new technology won’t always outweigh the cost.  The trade-off between hiring people and bringing in new technology often tips towards employing people in those places where wages are low.  It’s in the industrialized world where we will see the most change.

In my opinion (not necessarily McKinsey’s), this will have an impact on political optics.  Jobs will appear to be eliminated in industrialized economies and then magically reappear in the developing world.  But the back-story is that technology allows work to be done with fewer employees and more machines in industrialized countries.  And those western workplaces will have competition from countries where it is not optimal to bring in new technologies.  The jobs created in developing countries will look like the same jobs that used to exist in the West.  But that’s not what’s going on.  Developing economies are just briefly immune to the more-expensive technology, for as long as those countries have low wages.

McKinsey also reviewed the history of technological change and found that there tends to be a net gain from new technologies.  The technology benefits someone — the buyer, investor, or some new profession or trade.  That someone spends money in a manner that creates different jobs, often by taking advantage of yet another new technology.  Those 400 million lost jobs are likely to be the downside of a net-gain from technology.

This raises the difficult issue of things getting better on average.  As I described in an earlier post, if one million jobs are eliminated and a million-plus-one jobs are created, this is a net gain of one job.  In the minds of economists, this is considered progress.  However, looking at the blow-back from voters in industrialized countries, it appears that we must now pay very close attention to the millions who were on the downside of this net-gain.  And perhaps you know some of these people.

McKinsey was all over this issue:

“Midcareer job training will be essential, as will enhancing labour market dynamism and enabling worker redeployment.  These changes will challenge current educational and workforce training models…  Another priority is rethinking and strengthening transition and income support for workers caught in the cross-currents of automation.” (p. 8)

Within the human resources crowd, we are experienced at either enduring push-back from unions, or anticipating labour’s response with meaningful policies and initiatives.  But regardless of whether you are sympathetic to the underclass, or just trying to implement a new technology as quickly as possible, you can see that society’s success at adapting to this change will hinge on the personal experience of those who have lost.

Looking around us, it seems like we are all trying to get our footing, trying to figure out that one special thing that sets us apart.  You might not be told ahead of time what that thing should be.  In fact, you might need to figure it out entirely by yourself.  But those who are always working on their angle will have a better shot than those who are relying on prior wins.

Sure, there might be an employer who is loyal enough to set you up for success, or a program or union that will help with the job transition.  But as we take turns eliminating each other’s jobs, you might want to hold onto a dash of selfishness.  If you can bot-boss your way into a superior level of productivity, you might have a shot at being that one valued employee on the upside of a turbulent net-gain.

Either as a society, or as an individual, you need to write yourself into a story where you reached for the power cord and taught the corporate machine to work for you.