Tag Archives: influence

We Must Stop Fooling Ourselves and Get Our Facts Straight


GUEST POST from Greg Satell

Mehdi Hasan’s brutal takedown of Matt Taibbi was almost painful to watch. Taibbi, a longtime muckraking journalist of some renown, was invited by Elon Musk to review internal communications that came to be known as the Twitter Files and made big headlines with accusations regarding government censorship of social media.

Yet as Hasan quickly revealed, Taibbi got basic facts wrong, either not understanding what he was looking at, doing sloppy work or just plainly being disingenuous. What Taibbi was reporting as censorship was, in fact, a normal, deliberative process for flagging problematic content, most of which was not taken down.

He looked foolish, but I could feel his pain. In both of my books, I had similarly foolish errors. The difference was that I sent out sections to be fact-checked by experts and people with first-hand knowledge of events before I published. The truth is that it’s not easy to get facts straight. It takes hard work and humility to get things right. We need to be careful.

A Stupid Mistake

Some of the most famous business stories we hear are simply not accurate. Gurus and pundits love to tell you that, after inventing digital photography, Kodak ignored the market. Nothing could be further from the truth. In fact, its EasyShare line of cameras was a top seller. The company also made big investments in quality printing for digital photos. The problem was that it made most of its money developing film, a business that completely disappeared.

Another popular fable is that Xerox failed to commercialize the technology developed at its Palo Alto Research Center (PARC), when in fact the laser printer developed there saved the company. What also conveniently gets left out is that Steve Jobs was able to get access to the company’s technology to build the Macintosh because Xerox had invested in Apple and then profited handsomely from that investment.

But my favorite mistold myth is that of Blockbuster, which supposedly ignored Netflix until it was too late. As Gina Keating, who covered the story for years at Reuters, explains in her book Netflixed, the video giant moved relatively quickly and came up with a successful strategy, but the CEO, John Antioco, left after a fight with investor Carl Icahn and the strategy was reversed.

Yet that’s not exactly how I told the story. For years I reported that Antioco was fired. I even wrote it up that way in my book Cascades until I contacted the former CEO to fact-check it. He was incredibly generous with his time, corrected me and then gave me additional insights that improved the book.

To this day, I don’t know exactly why I made the mistake. In fact, as soon as he pointed it out I knew I was wrong. Somehow the notion that he was fired got stuck in my head and, with no one to correct me, it just stayed there. We like to think that we remember things as they happened, but unfortunately our brains don’t work that way.

Why We Get Fooled

We tend to imagine that our minds are some sort of machines, recording what we see and hear, then storing those experiences away to be retrieved at a later time, but that’s not how our brains work at all. Humans have a need to build narratives. We like things to fit into neat patterns and fill in the gaps in our knowledge so that everything makes sense.

Psychologists often point to a halo effect, the tendency for an impression created in one area to influence opinion in another. For example, when someone is physically attractive, we tend to infer other good qualities and when a company is successful, we tend to think other good things about it.

The truth is that our thinking is riddled with subtle yet predictable biases. We are apt to be influenced not by the most rigorous information, but by what we can most readily access. We make confounding errors that confuse correlation with causality and then look for information that confirms our judgments while discounting evidence to the contrary.

I’m sure that both Matt Taibbi and I fell into a number of these pitfalls. We observed a set of facts, perceived a pattern, built a narrative and then began filling in gaps with things that we thought we knew. As we looked for more evidence, we seized on what bolstered the stories we were telling ourselves, while ignoring contrary facts.

The difference, of course, is that I went and checked with a primary source, who immediately pointed out my error and, as soon as he did, it broke the spell. I immediately remembered reading in Keating’s book that he resigned and agreed to stay on for six months while a new CEO was being hired. Our brains do weird things.

How Our Errors Perpetuate

In addition to our own cognitive biases, a number of external factors conspire to perpetuate our beliefs. The first is that we tend to embed ourselves in networks of people whose experiences and perspectives are similar to our own. Scientific evidence shows that we conform to the views around us and that the effect extends out to three degrees of relationships.

Once we find our tribe, we tend to view outsiders with suspicion and are less likely to scrutinize allies. In one study, adults who were randomly assigned to groups of “leopards” and “tigers” showed hostility toward out-group members in fMRI scans. Research from MIT suggests that when we are around people we expect to agree with us, we don’t check facts as closely and are more likely to share false information.

In David McRaney’s book How Minds Change, he points out that people who manage to leave cults or reject long-held conspiracy theories first build alternative social networks. Our associations form an important part of our identity, so we are loath to change opinions that signal inclusion in our tribe. There are deep evolutionary forces that drive us to be stalwart citizens of the communities we join.

Taibbi was, for years, a respected investigative journalist at Rolling Stone magazine. There, he had editors and fact checkers to answer to. Now, as an independent journalist, he has only the networks that he chooses to give him feedback and, being human like all of us, he subtly conforms to a set of dispositions and perspectives.

I probably fell prey to similar influences. As someone who researches innovation, I spend a lot of time with people who regard Netflix as a hero and Blockbuster as something of a bumbler. That probably affected how I perceived Antioco’s departure from the company. We all have blind spots and fall prey to the operational glitches in our brains. No one is immune.

Learning How To Not Fool Ourselves

In one of my favorite essays, the physicist Richard Feynman wrote, “The first principle is that you must not fool yourself — and you are the easiest person to fool. So you have to be very careful about that.” He goes on to say that simply being honest isn’t enough; you also need to “bend over backwards” to provide information so that others may prove you wrong.

So the first step is to be hyper-vigilant, aware that your brain has a tendency to fool you. It will seize on the most readily available data and detect patterns that may or may not be there. Then it will seek out evidence that confirms those initial hunches while disregarding anything to the contrary.

This is especially true of smart, accomplished people. Those who have been right in the past, who have proved the doubters wrong, are going to be less likely to see the warning signs. In many cases, they will even see opposition to their views as evidence they are on the right track. There’s a sucker born every minute and they’re usually the ones who think that they’re playing it smart.

Checking ourselves isn’t nearly enough; we need to actively seek out other views and perspectives. Some of this can be done with formal processes such as pre-mortems and red teams, but much of it is simply acknowledging that we have blind spots, building the habit of reaching out to others and improving our listening skills.

Perhaps most of all, we need to have a sense of humility. It’s far too easy to be impressed with ourselves and far too difficult to see how we’re being led astray. There is often a negative correlation between our level of certainty and the likelihood of us being wrong. We all need to make an effort to believe less of what we think.

— Article courtesy of the Digital Tonto blog
— Image credit: 1 of 1,050+ FREE quotes for your meetings & presentations at http://misterinnovation.com

Subscribe to Human-Centered Change & Innovation Weekly
Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Focus on Shaping Networks Not Opinions


GUEST POST from Greg Satell

Anybody who has ever been married or had kids knows how difficult it can be to convince even a single person. To persuade dozens or hundreds — much less thousands or millions — to change their mind about something important seems like a pipe dream. Yet that doesn’t stop people from spending significant time and energy to do just that.

In fact, there is a massive industry dedicated to shaping opinions. Professionals research attitudes, identify “value propositions,” craft messages and leverage “influencers” in the hopes that they can get people to change their minds. Yet despite the billions of dollars invested each year, evidence of consistent success remains elusive.

The truth is that the best indicator of what people do and think is what the people around them do and think. Instead of trying to shape opinions, we need to shape networks. That’s why we need to focus our efforts on working to craft cultures rather than wordsmithing slogans. To do that, we need to understand the subtle ways we influence each other.

The Influencer Myth

Malcolm Gladwell’s blockbuster book, The Tipping Point, popularized his “Law of the Few,” which he stated as: “The success of any kind of social epidemic is heavily dependent on the involvement of people with a particular and rare set of social gifts.” This re-energized earlier ideas about opinion leaders, the supposedly rare people who somehow have outsize influence on others.

Perhaps not surprisingly, the communications industry quickly jumped to promote the idea of secret “influentials” living among us. Clearly, if you’re looking to shape opinions, being able to identify such people would be incredibly valuable and, it goes without saying, firms who could claim an expertise in leveraging those powers could earn outsized fees.

Yet the evidence that these people actually exist is remarkably thin. Even the original opinion leader research found that influence was highly contextual. One recent study of e-mail chains found that highly connected people weren’t necessary to produce a viral cascade; another, based on Twitter, found that they aren’t even sufficient. So-called “influentials” are only slightly more likely to produce viral chains.

Duncan Watts, co-author of both studies and a pioneer in the science of networks, told me, “The Influentials hypothesis is a theory that can be made to fit the facts once they are known, but it has little predictive power. It is at best a convenient fiction; at worst a misleading model. The real world is much more complicated.”

The Framingham Heart Study

While there is little evidence to suggest that there are special people secretly influencing our attitudes and decisions, there is abundant evidence that completely normal people exert influence all the time. We may ask our nephew about what app to download, or a co-worker about where to go for dinner. We all have people in our lives that we go to for advice about particular things.

Decades of scientific research suggest that the best indicator of what we think and do is what the people around us think and do. A famous series of studies performed in the 1950s—replicated countless times since—found that when confronted with an overwhelming majority opinion, people will conform to the majority even if it is obviously wrong.

More recent research indicates that the effect applies not only to people we know well, but extends even to second- and third-degree relationships. So not only our friends, but the friends of their friends as well—many of whom we may never have met—influence us. This effect applies not only to our opinions, but also to things like smoking and obesity, and to behaviors related to cooperation and trust.

The evidence is, in fact, overwhelming. Working to shape opinions is bound to be a fruitless exercise unless we are able to shape the networks in which ideas, attitudes and behaviors form. Fortunately, there are some fairly straightforward ways to do that.

Starting With A Majority

When we’re passionate about an idea, we want it to spread. We want to tell everyone and, for reasons that are not quite clear, we especially want to tell the opposition. Some strange quirk embedded in human nature makes us want to try to convince those who are most hostile to the proposition. We want to convince skeptics.

As should be clear by now, that’s a very bad idea. An idea in its early stages is, almost by definition, not fully formed. It hasn’t been tested and doesn’t have a track record. You also lack experience in countering objections. Taking an idea in its infancy into hostile territory almost guarantees failure.

The simple alternative is to start with a majority, even if that majority is only three people in a room of five. You can always expand a majority out, but once you’re in the minority you’re going to get immediate pushback. Go out and find people who are as enthusiastic as you are, who are willing to support your idea, to strengthen it and help troubleshoot.

That’s how you can begin to gain traction and build a sense of shared purpose and mission. As you begin to work out the kinks, you can embark on a keystone project, show some success, build a track record and accumulate social proof. As you gain momentum, you will find that there is no need to chase skeptics. They will start coming to you.

Small Groups, Loosely Connected, But United By A Shared Purpose

The biggest misconception about change is that once people understand it, they will embrace it and so the best way to drive change forward is to explain the need for change in a convincing and persuasive way. Change, in this view, is essentially a communication exercise and the right combination of words and images is all that is required.

Even assuming that it is practical to convince people that way, by the same logic they can just as easily have their minds changed right back by counter-arguments. So even successfully shaping opinions is, at best, a temporary fix. Clearly, if we are going to bring about sustainable change, we need to shape not just opinions, but networks as well.

In my book Cascades, I explained how small groups, loosely connected but united by a shared purpose drive transformational change. It happens gradually, almost imperceptibly, at first. Connections accumulate under the surface, barely noticed, as small groups slowly begin to link together and congeal into a network. Eventually things hit a tipping point.

The good news is that decades of research suggest that tipping point is much smaller than most people think. Everett Rogers’ “S-curve” research estimated it at 10%-20% of a system. Erica Chenoweth’s research calculated the tipping point to be at 3.5% of a society. Damon Centola at the University of Pennsylvania suggests the tipping point to be at 25% of an organization.

I would take each of these numbers with a grain of salt. The salient point here is that nowhere does the evidence suggest we need anything close to 51% support for change to take hold. Our job as leaders is to cultivate networks, help them connect and inspire them with a sense of shared values and shared purpose.

— Article courtesy of the Digital Tonto blog
— Image credit: Pixabay
