“I’m busy” is the call of the meek (and they shall not inherit the earth)

It’s All in the Eyes. Photo courtesy of Chris Gilmore.

Have you ever been stressed and overwhelmed by your workload, and then felt the satisfaction of getting a grip on your to-do list?  I manage this several times a month, and I find it empowering and calming.

My favorite part is when I write a fresh list without dragging over the crossed-out items from the prior list.  Then I write next to each task the priority in which I would like to approach it.  After that, I write a new, fresh list, ordered by the priorities I had chosen.

It turns out I was onto something.  Having a clear sense of purpose and direction is the thing that makes us productive.  And that’s totally different from being busy.

Instead of resorting to the “I’m busy” proclamation, simply organize your obligations and commitments.  You’ll realize it’s a good thing.

But once things are under control, you lose your bragging rights about being busy.  That’s a bad thing.

“I’m Busy” is a Humblebrag

In a 2016 article on Inc.com, Jessica Stillman shares research showing that people who say they are busy are perceived to be more important.  People know the “I’m busy” humblebrag is compelling and they use it liberally.  I think people only say “I’m busy” because others are saying it too.  Kind of like straight people drawing attention to the fact they’re straight, or women’s rights activists saying they don’t call themselves feminists.  If there weren’t these crowd-sourced self-impositions to look busy and conform to norms, would we still be grabbing for labels that allow us to fit in and be validated?  Surely it would be easier to bring our best to the workplace and be our usual, weird selves.

Even as people increasingly say they are busy, the evidence suggests otherwise.  In another article, Stillman reports U.S. data showing that people are sleeping more and finding more time to watch television compared to a decade ago.  That article is from two years ago, when people were still watching television instead of being addicted to their phones.  On average, people are not busier.  “It’s not entirely surprising that we fit in all [that]… leisure — the average full-time workweek is a moderate 42 hours.”

Busy People Are Not Always Giving Their Best

In those cases where people are truly busy, it’s not a good thing.  Beyond a certain point, people suffer cognitive overload.  In an Inc.com article from June 2018, Wanda Thibodeaux interviews Fouad ElNaggar, the chief executive of an employee experience portal called Sapho.  ElNaggar cites oft-quoted research that people “…check email 47 times a day… And it takes an average of 25 minutes to get back on task after being interrupted.  They experience an endless tidal wave of beeps that require an acknowledgement or response…”

ElNaggar references research showing that people compensate for the barrage of interruptions by working faster.  This leaves people stressed-out “…and subsequently, focus, concentration, and creativity – all tank.”  These are not people who have gotten into the zone and produced a lot of exceptional work.  These are people who are controlled by clients, superiors, Facebook friends, and advertising algorithms coming out of Silicon Valley.  These are people who have become unimportant.

He asserts that responsibility for this problem sits with leadership, but notes that individual employees need to share some of the blame.  He encourages individuals to take control of their calendars and decline meaningless meetings, assign narrow windows to handle email (i.e., not all day long), and keep the cell phone out of the bedroom.

However, this rests on two debatable assumptions.  The first is that we have enough control over our work-day to make these trade-offs; only leaders who give employees autonomy can expect them to change how they work for the better.  The second is that ElNaggar’s remedies imply you can become more effective by being less busy.

How Productive People Differ from Busy People

In an article from February of 2018, Larry Kim asserts productive people have a mission in their lives, have few priorities, and focus on clarity before action.  “Busy” people want to look like they have a mission, have many priorities, and focus on action regardless of clarity.

Productive people want others to be effective, and busy people want others to be busy.  The two lists of behaviours and attitudes are not mutually exclusive, but you get a sense of two different styles.

Described in this manner, people who say “I’m busy” are not actually drawing attention to their importance.  Rather, they are broadcasting that they lack focus, have no control, and are short on self-management.  “I’m busy” is a malfunctioning humblebrag, as it serves as a backhanded compliment that insults the self.

But it might be early days for this realization.  You might have superiors and influential colleagues who have that busy buzz to them.  If this polarity between productivity and busyness comes into public view, it’s not going to look good for the busy-bees.

The biggest revelation from Kim’s article is that “Productive people make time for what is important.”  Productive people are all about mission, priorities, and focus, and that allows them to target their time and effort.  If you have ten minutes to spare to get “important” work done, that important work is to consider your values and your mission, and create a fresh draft of your priorities that puts everything into perspective.

People might not see you breaking a sweat, but with time you will deliver better results.  But remember, it looks way better when there’s no boasting.  And that will go a lot further after we’ve outed the “I’m busy” call of the meek.

Mini-Me Recruiting: Always Funny, Always Uncomfortable

Mini Me and Me (a.k.a. Verne Troyer).  Photo courtesy of Bit Boy.

Who hasn’t wanted to clone themselves, especially when deep into a project that leaves a weekend in tatters?  Dr. Evil of Austin Powers fame hilariously and awkwardly created Mini-Me as his right-hand man.  While Mini-Me failed to carry out Dr. Evil’s plans for world domination, he succeeded in illustrating a major problem in human resources that needs more scrutiny than ever.  The actor Verne Troyer – who played Mini-Me – immortalized an uncomfortable concept.

The hiring of mini-mes in organizations is a problem behaviour caused by two cognitive fallacies.  One is the affinity bias, our liking of people similar to ourselves.  The other is the mere-exposure effect, where we like things simply because we have been exposed to them before.  In reading about cognitive fallacies, it becomes clear that the majority of them are variants of the “availability heuristic,” our tendency to over-value thoughts that come to mind easily.  If we choose what’s comfortable, we reproduce our own status quo.

However, it’s usually the case that an employer needs a diverse team.  Even the most excellent leaders need people who have different strengths.  In an article at entrepreneur.com, George Deeb asserts:

“Maybe you don’t need a ‘glass half full’ optimist like yourself… Maybe you need a ‘glass half empty’ realist, who will bring a sense of caution to your investment decisions. Or, you may need a similar ‘A-Type Personality’ to lead your sales team efforts… But, maybe a ‘B-Type Personality’ may be a better fit to manage your more introverted team of technology developers. …Maybe what you really need is the opposite of yourself. You need your Anti-Me to help keep yourself organized, on plan and in check. It really comes down to what you see as your personal strengths and weaknesses, and filling in any voids in your skill-sets.” (Emphasis added)

Equity and Inclusion in Hiring Decisions

The most visible consequence of unconscious bias is that organizations hire and promote people in the same demographic category as the hiring manager, increasing the momentum behind historic privilege.  In an article in the Guardian in 2016, Matthew Jenkin notes that the context of a selection interview will have an outsized impact on who is chosen.  If the context is white and middle-class, candidates who are white and middle class will be favoured.

Bias goes beyond blockbuster items like race and social class. Hobbies, personal experiences, and how we dress can be factors too. If the leadership of an organization is “all of one type” it is a reliable sign that the leadership has lost all curiosity, has no self-doubt, and does not take evidence seriously.  The leadership is not reading the news, and if they are, they are only reading it in print.

This is not the mindset of leaders who will make an organization successful in the near future.  Yes, we must achieve indicators of diversity, but we must also foster receptiveness to new information, a curiosity about diverse ideas, and ways in which an individual can be excellent in a manner that might be considered weird.

Why Structured Interviews Matter

The professional association in the UK, the Chartered Institute of Personnel and Development (CIPD), released a paper in 2015 entitled A Head for Hiring: The Behavioural Science of Recruitment and Selection. It looked at, amongst other things, the role of unstructured interviews.  The authors found a study that fed research participants a combination of good evidential information, plus random irrelevant information from an unstructured interview.  The research subjects upgraded the importance of the random irrelevant information and discounted the good information.  “This can be seen as evidence of sense-making – our tendency to identify patterns or detect trends even when they are non-existent.”

It’s not just the interviewers who are at risk of making bad judgment calls. The CIPD paper identified cognitive fallacies in the mind of the interviewee that caused them to self-select away from promising job matches.  And walking into an unfamiliar environment, where they feel like an outsider, can cause job candidates to underperform because of the additional stress.  When people are using their brains, they are vulnerable to issues of cognitive load in which a complex environment exhausts their brain prior to facing decisions.  Those coming from a different context face disadvantage in an environment that might seem “normal” to the host.

Solutions in Diversity Hiring

What is the remedy for these problems?  For one, structured interviews are key, as they narrow the range of evidence to information that is relevant.  We must also actively seek contrary evidence: not taking things at face value, and looking for information outside of what is familiar and comfortable.  There is also diversity representation.  Charles Hipps, CEO of e-recruitment company WCN, quoted in the Guardian article, “…suggests having team members from the particular group you are trying to attract present during the recruitment process – whether that’s meeting and greeting candidates or on the interview panel.”  Structure a diverse context and it will set a more balanced comfort level and reduce cognitive load.

Employers are also starting to get hard-core, using new tools to improve the selection process.  The Guardian article spoke with one company, Elevate, that “uses algorithms to score every candidate’s CV, previous work experience, skills and education, and assesses their suitability for a role. It then ranks candidates much like Google’s search results…”   Another company, Joinkoru, conducts validated pre-hire assessments which provide candidate scores that are less sensitive to the candidate’s similarity to current employees.  It is also feasible to do blind selection in the process of creating a shortlist, in a manner that obscures the name and sex of the candidate.
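
None of these vendors publish the details of their scoring methods, so purely as a toy illustration of the general idea – a weighted, blind ranking built only from job-relevant evidence – a sketch might look like this.  The criteria, weights, and field names below are my own inventions, not Elevate’s or any other vendor’s actual methodology.

```python
# Toy illustration only: hypothetical criteria and weights, not any vendor's method.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str                # present in the data, but hidden at shortlisting time
    sex: str                 # present in the data, but hidden at shortlisting time
    skills_matched: int      # required skills found in the CV
    skills_required: int
    years_experience: float
    assessment_score: float  # 0-100 from a validated pre-hire assessment

def score(c: Candidate) -> float:
    """Build a score from job-relevant evidence only."""
    skill_fit = c.skills_matched / max(c.skills_required, 1)
    experience = min(c.years_experience / 10.0, 1.0)   # cap the benefit of long tenure
    return 0.5 * skill_fit + 0.2 * experience + 0.3 * (c.assessment_score / 100.0)

def blind_shortlist(candidates: list[Candidate], top_n: int = 5) -> list[int]:
    """Rank candidates without ever reading name or sex; return anonymous indices."""
    ranked = sorted(range(len(candidates)), key=lambda i: score(candidates[i]), reverse=True)
    return ranked[:top_n]
```

The particular weights don’t matter; the point is that the demographic fields exist in the data but never enter the score.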

Not all of these tools are perfect, and indeed there are emerging risks that algorithms can carry forward the historic bias of past human behaviours.  The rise of the racist robots is a concern.  We might not be creating cloned versions of ourselves (yet), but we are at serious risk of creating artificial intelligence which has flaws identical to those of our broader society.

And the technology can be expensive.  Doctor Evil is the only one selling it, and he’s going to charge you (pinky to mouth) one million dollars.

Shift in Job Market Doesn’t Need to Be a Nightmare

Melbourne Zombie Shuffle 162.  Photo courtesy of Fernando de Sousa.

Are you a little scared of the future? I think we all are. And for good reason.

There’s so much to think about these days, especially with technology disrupting our jobs. But if you have watched a few horror films, you’ll notice things become far less scary when you understand what’s really going on.  For me, my shoulders relaxed a little and I reached for popcorn again after I read a report from the World Economic Forum about job transitions.

The report reveals next-job opportunities for employees displaced by economic and technological disruption.

The U.S. labour market will see a structural loss of 1.4 million jobs over the next 10 years, according to the Bureau of Labor Statistics.  However, the report also cites a structural growth of 12.4 million new jobs.  On average, the job market will be better off.

However, let’s set aside the average for a moment and focus on the 1.4 million individuals who will be put out of work.

The report analyzed a thousand job descriptions representing the majority of the American workforce and looked for similarities in skills, abilities, qualifications, and the work itself.  The job-matching methodology was created by Burning Glass Technologies, a firm specializing in labour market analytics harnessing big data and artificial intelligence.

Using the 10-year labour market forecast, they identified the job families where the largest number of jobs would disappear, identified other job families forecast for growth, and mapped-out how people could transition from lost jobs into new jobs.
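
The report doesn’t publish the Burning Glass algorithm, but the core idea – scoring how similar a declining occupation is to possible target occupations – can be sketched with a simple skill-overlap measure.  The skill lists below are invented for illustration; the real methodology also weighs abilities, qualifications, and the work itself.

```python
# Minimal, assumed sketch of skill-overlap similarity between occupations.
def jaccard(a: set[str], b: set[str]) -> float:
    """Share of the combined skill set that two occupations have in common."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

secretary = {"scheduling", "correspondence", "record keeping", "customer service"}
targets = {
    "concierge": {"scheduling", "customer service", "local knowledge"},
    "recycling coordinator": {"record keeping", "logistics", "regulatory reporting"},
}

for job, skills in targets.items():
    print(f"secretary -> {job}: similarity {jaccard(secretary, skills):.2f}")
```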

Production and Office & Administration jobs are projected to be the hardest hit.  In every other area, fewer job losses are expected, and the new-but-different jobs created within a job family greatly exceed the jobs lost.

Jobs in Production (which includes the beleaguered manufacturing sector) have a high similarity to emerging jobs in Construction and Extraction; Installation, Maintenance and Repair; and Transportation.  Positions in Office & Administration have a high similarity to emerging jobs in Business and Financial Operations.  And a large number of handy and hard-working people can always find a job in custodial or food services.

But if you lost your job, would you want to be a barista?

The Desirability of Job Transitions

Thankfully, the report considers whether people’s next jobs are desirable.  A significant drop in pay won’t motivate employees to seek reskilling.  Stability is also a top concern.  Re-skilling and moving can be expensive investments, so some transition opportunities might be rejected simply because they look too unstable.

Desirability isn’t all in the mind of the employee.  Governments want a successful transition to achieve a good return on their investment in training programs.  They don’t want to undermine their tax base with a low-wage workforce.  And some governments are also concerned about the experience of workers as voters.  Employers need successful transitions too, as they fear a workforce of demoralized, dissatisfied, and under-productive employees.

The report factored-in all these concerns and categorized viable job transitions as those that have high similarity, stable long-term prospects, and wages that are equal or better than the previous job.

They found plenty of opportunities:

 “…our analysis is able to find ‘good-fit’ job transitions for the vast majority of workers currently holding jobs experiencing technological disruption — 96%, or nearly 1.4 million individuals…  Interestingly, the majority of ‘good-fit’ job transition options — 70% — will require the job mover to shift into …a new job family.”
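
Stated as a rule, that “good-fit” test – high similarity, stable long-term prospects, and equal or better pay – reads roughly as below.  The field names and the similarity threshold are my own stand-ins for whatever cut-offs the authors actually used.

```python
# Sketch of the report's "good-fit" filter as described above; thresholds assumed.
def good_fit(similarity: float, target_outlook: str,
             current_wage: float, target_wage: float) -> bool:
    return (similarity >= 0.9
            and target_outlook == "growing"
            and target_wage >= current_wage)
```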

Job Transition Pathways

One of the benefits of this sophisticated model was that the authors of the report were able to extend the career transitions from a one-time change into “a full chain of job transition pathways” covering three jobs.

For example, a secretary can downshift into becoming a concierge, then come out ahead as a recycling coordinator. Each new job has a solid 90% similarity score relative to the prior job, but the salary bounces from $36k to $31k to $50k.

There is a similar trade-off for the transition from cashier to barista to food service manager.  So yes, you might still want to become a barista.  Employees could come out further ahead if they could see these pathways and plan accordingly.
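
As a quick back-of-the-envelope check on that secretary pathway, here is the arithmetic using the salary figures quoted above:

```python
# Net effect of the secretary -> concierge -> recycling coordinator pathway,
# using the salaries quoted in the report ($36k -> $31k -> $50k).
pathway = [("secretary", 36_000), ("concierge", 31_000), ("recycling coordinator", 50_000)]

start = pathway[0][1]
lowest = min(salary for _, salary in pathway)
final = pathway[-1][1]
print(f"temporary dip: {lowest - start:+,} per year")       # -5,000
print(f"net gain at the end: {final - start:+,} per year")  # +14,000
```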

Job Transitions are Different for Women

There are mixed results based on the sex of the worker.  On the minus side for women, it is estimated that 57% of the disruption will affect women.  Women also have fewer job transition options: “Without reskilling… professions that are predominantly female and at risk of disruption have only 12 job transition options while at-risk male-dominated professions have 22 options.”

But women also have a better chance at job transitions that result in increased wages.  Of those experiencing labour disruption, 74% of women have a good match into higher-paying jobs, while the equivalent number for men is 53%.

This difference may contribute to a “potential convergence in women and men’s wages,” but this impact would obviously need to be blended with those economic forces that don’t favour women.  By which I mean, most economic forces.

Men and women alike benefit significantly from reskilling efforts, which roughly quadruple the new job options available.  With reskilling, opportunities for women jump from 12 job options to 49, and opportunities for men jump from 22 options to 80.

A Change in Societal Mindset is Required

The report recommends societal changes in order to make this all viable:

“…what will be required is nothing less than a societal mindset shift for people to become creative, curious, agile lifelong learners, comfortable with continuous change.”

On the public policy side, there is an additional shift in mindset for corporations and government:  pick up the tab or everyone is toast.

The main item that would empower this change is a comprehensive re-skilling program funded at full scale.  Displaced workers need to take some responsibility and show some initiative. But nobody in their right mind is suggesting that the cost of all this should be borne by anyone other than business and government.

While the consequences of inaction are dire for individuals and society, the path forward is becoming better understood.  It’s that part in the scary movie where they can see the way out.  And for that reason, it’s not so scary any more, and might even be fun to watch.

Curiosity is Key. Ask Me How.

CIMG5944.  Photo courtesy of Tim Sheerman-Chase.

At work, do you sometimes feel guilty about indulging your curiosity?  Well, it turns out curiosity is a bigger benefit to your workplace than you might have expected.

Zander Lurie, CEO of SurveyMonkey, asserts that curiosity is the attribute we most desperately need in today’s corporate environment.  He provides data co-created by SurveyMonkey showing that curiosity is significantly under-valued.  Senior leaders “…are speaking more and more about the importance of curiosity, recognizing it as the ultimate driver of success.”

This opinion is consistent with the finding that the best leaders are good learners.  The rules keep changing because of technology, political disruption, and demographic shifts.  Your excellence in past years may be irrelevant to the future, whereas your ability to learn-forward from your current state is critical.  You can keep pace with moving goal posts.

In Lurie’s data, executives mostly think there are no barriers to asking questions in their organizations.  But there’s a problem: employees think otherwise.  I think executives are gripped by wishful thinking.  They wish they had a culture in which information flowed freely upwards while decisions kept moving in the direction of their own voice.  And then they talk a good line about a two-way exchange of information and decision making.  But the sincerity is perceived to be lacking.

Citing research from Stanford’s Carol Dweck, Lurie asserts that

“The Culture of Genius is largely to blame. In this type of company culture some minds are seen as inherently more brilliant than others, and others are intimidated to question things and speak up as a result. It can create a toxic environment that’s stifling curiosity and has many employees doubting whether they ‘have what it takes.’”

In the process of his article Lurie references an interesting academic paper from 2014 by Matthias Gruber, Bernard Gelman, and Charan Ranganath.  To spare you the polysyllabic details: curiosity improves learning.  This finding is sensitive to the learner’s innate curiosity about a topic (i.e. intrinsic motivation), which implies that we cannot always prescribe what others ought to learn.  It’s a nuance in workplace learning, as organizations often have a list of prescribed skills and attributes (i.e. competencies) that they perceive will determine organizational success.  But if they impose this learning obligation, they might get inferior results.

The learners who are best for an organization may be those who are already fascinated by the topic-area where an organization needs growth.  Identifying and cultivating a pre-existing fascination may be more of a recruitment-and-selection question than a performance-appraisal thing.  It poses some touchy questions about leadership style: do leaders have to hang back in those cases where the employee is already growing into a challenge of their choosing?  What shall we do with the performance scorecard, core competencies, and the mandatory learning modules?  Where’s the part where the leader “causes” important things to happen?

If a leader wants to “drive” high performance in learning, I think they would need to be good at spurring intrinsic motivation.  This has to be the hardest of soft skills.  I have a son who is fascinated by police, and there is a game he plays (in Roblox) where he’s required to write a report for every arrest he makes.  If I could just get him to write his reports with proper grammar, spelling, and punctuation, he would be producing a robust volume of writing every day under his own motivation.  But he didn’t seem to care when I last suggested this, so I had to back off.  I’ll try again next week.

The paper by Gruber and co. also finds that when learners are engaged in their curiosity they remember random trivial information in the surrounding environment.  You may have experienced this yourself: that moment you learned that one amazing thing… you can recall the room you were in, who you were with, and the weather that day.

This is notable because in business analytics it’s understood that information is data in a meaningful context.  All happenings are sensitive to the history, geography, economy, and culture in which they occur.  We don’t really get to decide what’s important and what’s trivial.  The large-and-small of every situation co-determine one another, such that tactics are just as important as strategy.  Given the research, it’s fortunate that brains remember the core experience as well as the context, as this gives us a natural opportunity to combine science and story.

Lurie makes compelling suggestions on how to turn curiosity into a strategic resource.  Make questions central to your daily work.  Encourage transparency.  Ensure the environment is safe for this exploratory behaviour.  Ensure diversity at all levels, to signal that all perspectives are cherished.  Direct this curiosity towards contact with customers.  “Celebrate prudent risks that fail – otherwise you will create a culture where employees are risk averse, thereby limiting your upside.” (Emphasis added)

Most intriguing is that Lurie asserts that since Artificial Intelligence will allow robots to out-do us on efficiency and quality, “Being curious is our best defense.”  As we name compelling human instincts that cannot be imitated by robots, future careers become increasingly evident.  Decide for yourself what you think is interesting and share your discoveries with executives and clients.

The robots won’t have a clue.

Information is the New Sugar

pie. Photo courtesy of chad glenn.

On Pi Day, are you able to resist temptation?

The bright colours?

The sweet flavours?

Maybe once a year it’s good for you.  But what if you were force-fed sweets every day?  That’s what’s happening today with information.

In an article in Wired, author Zeynep Tufekci makes a comparison to food when describing the addictive power of information.

“…within the next few years, the number of children struggling with obesity will surpass the number struggling with hunger. Why? When the human condition was marked by hunger and famine, it made perfect sense to crave condensed calories and salt. Now we live in a food glut environment, and we have few genetic, cultural, or psychological defenses against this novel threat to our health.”

The author compares our food behaviours to our current addictions to highly processed data:

“Humans are a social species, equipped with few defenses against the natural world beyond our ability to acquire knowledge and stay in groups that work together. We are particularly susceptible to glimmers of novelty, messages of affirmation and belonging, and messages of outrage toward perceived enemies. These kinds of messages are to human community what salt, sugar, and fat are to the human appetite.”

There was a time when humans desperately needed food and new information.  Once those needs were satisfied, industry’s ability to exploit our lingering sense of need, pushing unhealthy variants and volumes, became the next big threat.

With food, it is helpful to seek out existing traditions in which things have been figured out already.  Healthy people eat in a manner that resembles the cuisine of their grandparents, rejecting processed foods and fad diets alike.  To quote Michael Pollan, the food writer: “Eat food. Not too much. Mostly plants.”  So, if we were to seek healthy and viable traditions in the free flow of information, where would we turn?

Pi Day is a great place to start.  In the late nineties, I stayed at the home of a family friend named Larry Shaw, a science educator at the San Francisco Exploratorium.  During this trip Larry handed me a slice of pie on March 14.  I didn’t figure out until years later that he was the creator of Pi Day.  Larry looked like a hippie, and he had a great sense of fun.  But at heart he was close to a serious movement to empower people to disagree with those in power, and to express that disagreement through free speech.

We watched a brief documentary about the Free Speech Movement.  In 1964 a man named Jack Weinberg was arrested for distributing political materials on the Berkeley campus.  Students encircled the police car Weinberg was in.  There was a 32-hour stand-off during which activist Mario Savio gave a compelling speech, saying:

“There’s a time when the operation of the machine becomes so odious — makes you so sick at heart — that you can’t take part. …you’ve got to put your bodies upon the gears and upon the wheels, upon the levers, upon all the apparatus, and you’ve got to make it stop. And you’ve got to indicate to the people who run it, to the people who own it, that unless you’re free, the machine will be prevented from working at all.”

In the era of social media and big data we are experiencing this same problem, but in reverse.  In decades past, government and industry asserted legal power and made threats against the publication of some news.  Perspectives narrowed by coercion whipped the public mood into compliance.  When protests break out today, we know about it through social media within minutes, without the support of broadcast media.  This should be the golden era of free speech.  But it’s not.

Nowadays when you see news, it is unclear whether you are receiving something accurate.  And if you are the one posting the video, Tufekci asks, “…is anyone even watching it?  Or has it been lost in a sea of posts from hundreds of millions of content producers?”  It’s not the case that accurate news is reaching the broadest audience, and it’s not the case that you as a citizen can make your voice heard.

Social media offers a community experience that is equivalent to shopping for groceries at a convenience store.

Tufekci notes that the world’s attention is overwhelmingly funnelled through Facebook, Google, YouTube, and Twitter.  These entities

“…stand in for the public sphere itself. But at their core, their business is mundane: They’re ad brokers. …they sell the capacity to precisely target our eyeballs. They use massive surveillance of our behavior, online and off, to generate increasingly accurate, automated predictions of what advertisements we are most susceptible to…”

The author makes the case that freedom of speech is not an end in its own right.  Rather, it is a vehicle through which we achieve other social goals, such as public education, respectful debate, holding institutions accountable, and building healthy communities.  Consider Savio’s “bodies upon the gears” speech and you know he wasn’t in this so you could look at food porn or cat videos.

We shall seek the best possible recipe for our knowledge.  We need to read books, watch well-produced documentaries, and talk to trustworthy friends who are knowledgeable on the right topic.  We must be skeptical of those in power but even more skeptical about friends who coddle us with complacent views.  Seek information that is healthy and fulfilling, and guard it like a borrowed recipe from your grandmother’s box of index cards.

And yet, enjoy small amounts of rumor and gossip, like the indulgence in a favorite slice of pie.  You still get to have fun, once in a while.  You’re still human.

Not Too Shocking – Those High Numbers from AI Job Disruption

Shocked. Photo courtesy of Mark Turnauckas.

Can you think of a time you took advantage of a new technology, and in the process got way more work done?  We’re all going to need more stories like this in order to stay ahead of the game.

I’ll never forget my first exposure to a pirated version of Microsoft Excel.  I was in graduate school in 1994 and a young woman in my class, Bev, handed me a stack of eight floppy disks held together with a blue elastic band.  She told me Excel was way better than what I was using.  Six months later I had finished an entire graduate thesis based on clever charts and tables I had created using new software.  Six months after that, I was at a firm in one of the towers in Toronto’s downtown core with experienced consultants lining up at my cubicle, waiting for some solid analysis.  My mind had co-evolved around the technology, and I was valued.

For many months I was the only analyst on a team that had four consultants.  When new technologies are brought in, sometimes one person can do the work of several peers.  And this appears to be a concern today with incoming technologies, such as artificial intelligence, internet of things, and analytics.

There has been some excitement lately about McKinsey’s report that 800 million jobs will be eliminated worldwide by technology.  Reading the content of the report – not just the media coverage – I can assure you that it’s far less dramatic.

First, the 800 million figure was the upper end of a forecast range, and the authors recommend considering the mid-point of that range, which is 400 million jobs.  Those 400 million jobs are proportional to 15% of current work activities in the global labour market.  These job losses are not expected to be immediate, as this is a forecast out to 2030 – twelve years from now.  This means the forecast is closer to 30-35 million jobs lost per year, which seems far more modest on a planet with 7.6 billion inhabitants.
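
For what it’s worth, here is the arithmetic behind that framing, using the figures as reported:

```python
# Rough arithmetic on the McKinsey figures as reported above.
midpoint_jobs_lost = 400_000_000     # mid-point of the forecast range to 2030
years = 12                           # forecast horizon from the time of writing
world_population = 7_600_000_000

per_year = midpoint_jobs_lost / years
print(f"jobs lost per year: {per_year:,.0f}")                 # ~33 million
print(f"as a share of world population: {per_year / world_population:.2%} per year")
```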

But it gets better.  Of the 400 million jobs lost, only 75 million jobs will be eliminated altogether.  The remaining job losses will be in cases where parts of our jobs will be eliminated.  About 30% of “constituent” work will be automated for 60% of occupations.  That is, there will be bots taking care of the more mundane parts of our jobs.  It remains to be seen whether this shift will result in 30% less employment, or if our outputs will just be more efficient.  There may be a line-up at your own desk, with senior people increasingly reliant on your own unique, human-machine hybrid.

This technological revolution will have more dramatic impacts on industrialized economies such as Canada, the U.S. and Europe.  New technologies have a cost of implementation, and cost savings are needed to justify the investment.  A lot of cost savings can be found in eliminating expensive jobs.  But in the developing world, wages are lower and the gains of the new technology won’t always outweigh the cost.  The trade-off between hiring people and bringing in new technology often tips towards employing people in those places where wages are low.  It’s in the industrialized world where we will see the most change.

In my opinion (not necessarily McKinsey’s), this will have an impact on political optics.  Jobs will appear to be eliminated in industrialized economies and then magically reappear in the developing world.  But the back-story is that technology allows work to be done with fewer employees and more machines in industrialized countries.  And those western workplaces will have competition from countries where it is not optimal to bring in new technologies.  The jobs created in developing countries will look like the same jobs that used to exist in the West.  But that’s not what’s going on.  Developing economies are just briefly immune to the more-expensive technology, for as long as those countries have low wages.

McKinsey also reviewed the history of technological change and found that there tends to be a net gain from new technologies.  The technology benefits someone — the buyer, investor, or some new profession or trade.  That someone spends money in a manner that creates different jobs, often by taking advantage of yet another new technology.  Those 400 million lost jobs are likely to be the downside of a net gain from technology.

This raises the difficult issue of things getting better on average.  As I described in an earlier post, if one million jobs are eliminated and a million-plus-one jobs are created, this is a net gain of one job.  In the minds of economists, this is considered progress.  However, looking at the blow-back from voters in industrialized countries, it appears that we must now pay very close attention to the millions who were on the downside of this net-gain.  And perhaps you know some of these people.

McKinsey was all over this issue:

“Midcareer job training will be essential, as will enhancing labour market dynamism and enabling worker redeployment.  These changes will challenge current educational and workforce training models…  Another priority is rethinking and strengthening transition and income support for workers caught in the cross-currents of automation.” (p. 8)

Within the human resources crowd, we are experienced at either enduring push-back from unions, or anticipating labour’s response with meaningful policies and initiatives.  But regardless of whether you are sympathetic to the underclass, or just trying to implement a new technology as quickly as possible, you can see that society’s success at adapting to this change will hinge on the personal experience of those who have lost.

Looking around us, it seems like we are all trying to get our footing, trying to figure out that one special thing that sets us apart.  You might not be told ahead of time what that thing should be.  In fact, you might need to figure it out entirely by yourself.  But those who are always working on their angle will have a better shot than those who are relying on prior wins.

Sure, there might be an employer who is loyal enough to set you up for success, or a program or union that will help with the job transition.  But as we take turns eliminating each other’s jobs, you might want to hold onto a dash of selfishness.  If you can bot-boss your way into a superior level of productivity, you might have a shot at being that one valued employee on the upside of a turbulent net-gain.

Either as a society, or as an individual, you need to write yourself into a story where you reached for the power cord and taught the corporate machine to work for you.