Four Principles of Successful Digital Transformation

GUEST POST from Greg Satell

When Steve Jobs and Apple launched the Macintosh with great fanfare in 1984, it was to be only one step in a long journey that began with Douglas Engelbart’s Mother of All Demos and the development of the Alto at Xerox PARC more than a decade before. The Macintosh was, in many ways, the culmination of everything that came before.

Yet it was far from the end of the road. In fact, it wouldn’t be until the late 90s, after the rise of the Internet, that computers began to have a measurable effect on economic productivity. Until then, personal computers were mainly expensive devices for automating secretarial work and letting kids play video games.

The truth is that innovation is never a single event, but a process of discovery, engineering and transformation. Yet what few realize is that it is the last part, transformation, that is often the hardest and the longest. In fact, it usually takes about 30 years to go from an initial discovery to a major impact on the world. Here’s what you can do to move things along.

1. Identify A Keystone Change

About a decade before the Macintosh, Xerox invented the Alto, which had many of the features that the Macintosh later became famous for, such as a graphical user interface, a mouse and a bitmapped screen. Yet while the Macintosh became legendary, the Alto never really got off the ground and is now remembered, if at all, as little more than a footnote.

The difference in outcomes had much less to do with technology than it had to do with vision. While Xerox had grand plans to create the “office of the future,” Steve Jobs and Apple merely wanted to create a cool gadget for middle class kids and enthusiasts. Sure, they were only using it to write term papers and play video games, but they were still buying.

In my book, Cascades, I call this a “keystone change,” based on something my friend Talia Milgrom-Elcott told me about ecosystems. Apparently, every ecosystem has one or two keystone species that it needs to thrive. Innovation works the same way: you first need to identify a keystone change before a transformation can begin.

One common mistake is to immediately seek out the largest addressable market for a new product or service. That’s a good idea for an established technology or product category, but when you have something that’s truly new and different, it’s much better to find a “hair on fire” use case: a problem that someone needs solved so badly that they are willing to put up with early glitches and other shortcomings.

2. Indoctrinate Values, Beliefs And Skills

A technology is more than just a collection of transistors and code, or even a set of procedures; it needs specific values and skills to make it successful. For example, to shift your business to the cloud, you need to give up control of your infrastructure, which requires a completely new mindset. That’s why so many digital transformations fail. You can’t create a technology shift without a mind shift as well.

For example, when the Institute for Healthcare Improvement began its quest to save 100,000 lives through evidence-based quality practices, it spent significant time preparing the ground beforehand, so that people understood the ethos of the movement. It also created “change kits” and made sure the new procedures were easy to implement to maximize adoption.

In a similar vein, Facebook requires that all new engineers, regardless of experience or expertise, go through its engineering bootcamp. “Beyond the typical training program, at our Bootcamp new engineers see first-hand, and are able to infer, our unique system of values,” Eddie Ruvinsky, an Engineering Director at the company, told me.

“We don’t do this so much through training manuals and PowerPoint decks,” he continued, “but through allowing them to solve real problems working with real people who are going to be their colleagues. We’re not trying to shovel our existing culture at them, but preparing them to shape our culture for the future.”

Before you can change actions, you must first transform values, beliefs and skills.

3. Break Through Higher Thresholds Of Resistance

Growing up in Iowa in the 1930s, Everett Rogers noticed something strange in his father’s behavior. Although his father loved electrical gadgets, he was hesitant to adopt hybrid seed corn, even though it had higher yields. In fact, his father only made the switch after he saw his neighbor’s hybrid seed crop thrive during a drought in 1936.

This became the basis for Rogers’ now-familiar diffusion of innovations theory, in which an idea first gets popular with a group of early adopters and then only later spreads to other people. Later, Geoffrey Moore explained that most innovations fail because they never cross the chasm from the early adopters to the mainstream.

Both theories have become popular, but are often misunderstood. Early adopters are not a specific personality type, but people with a low threshold of resistance to a particular idea or technology. Remember that Rogers’s father was an early adopter of electrical gadgets, but was more reticent with seed corn.

As network theory pioneer Duncan Watts explained to me, an idea propagates through “easily influenced people influencing other easily influenced people.” So it’s important to start a transformation with people who are already enthusiastic, work out the inevitable kinks and then move on to people slightly more reticent, once you’ve proved success in that earlier group.
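The threshold idea above can be sketched as a toy model (the function, thresholds and numbers here are illustrative assumptions, not taken from Watts’s or Rogers’s actual work): give each person a resistance threshold, the fraction of the population that must already have adopted before they will join, and watch how far the cascade spreads.

```python
def cascade_size(thresholds):
    """Toy Granovetter-style threshold cascade.

    Each value is the fraction of the population that must already have
    adopted before that person will. Returns how many eventually adopt.
    """
    n = len(thresholds)
    order = sorted(thresholds)  # the least resistant adopt first
    adopted = 0
    # The next-most-reticent person adopts only if the current adopter
    # fraction has reached their threshold; otherwise the cascade stalls.
    while adopted < n and order[adopted] <= adopted / n:
        adopted += 1
    return adopted

# Seeding with zero-threshold enthusiasts lets adoption build through
# steadily more reticent people until everyone joins:
print(cascade_size([0.0, 0.0, 0.1, 0.2, 0.4, 0.6]))  # → 6

# Skipping the enthusiasts and going straight to the mainstream stalls
# immediately, because no one's threshold is met at the start:
print(cascade_size([0.2, 0.2, 0.4, 0.6]))  # → 0
```

The second call is the point of the passage: the same mainstream population that eventually adopts in the first example never moves at all without the low-threshold early adopters to prove success first.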

4. Focus On The Network, Not The Nodes

Perhaps the biggest mistake that organizations commit when trying to implement a new technology is to try to push everything from above, either through carrots, like financial incentives, or sticks, like disciplinary action for noncompliance. That may give senior management the satisfaction of “taking action,” but can often backfire.

People are much more willing to adopt something new if they feel like it’s their idea. The Institute for Healthcare Improvement, for example, designated selected institutions to act as “nodes” to help spread its movement. These weren’t watchdogs, but peers, early adopters who could help their colleagues adopt the new procedures effectively.

In a similar vein, IBM has already taken significant steps to drive adoption of quantum computing, a technology that won’t be commercially available for years. First, it created the Q Experience, an early version of its technology available through the cloud for anyone to use. It has also set up its Q Network of early adopter companies, which are working with IBM to develop practical applications for quantum computing.

To date, tens of thousands have already run hundreds of thousands of experiments on Q Experience and about a dozen companies have joined the Q Network. So while there is still significant discovery and engineering to be done, the transformation is already well underway. It always pays to start early.

The truth is that transformation is always about the network, not the nodes. That’s why you need to identify a keystone change, indoctrinate the values and skills that will help you break through higher thresholds of resistance and continuously connect with a diverse set of stakeholders to drive change forward.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Unsplash

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to join 17,000+ leaders getting Human-Centered Change & Innovation Weekly delivered to their inbox every week.

Irrelevant Innovation

GUEST POST from John Bessant

Why change is not always a good thing….

Forget about the ice truckers who haul their precious cargoes across frozen lakes and tundra in the Arctic Circle. Or those heroes who service remote islands in the Pacific or who fly into inaccessible airstrips in the rainforests. They are doing a tough job, undoubtedly — but we should accept that perhaps the hardest haulage challenge in the world has to be that of getting a seven-year-old back to school after the spring break. Motivating muscles to power little legs school-ward (even if the journey is downhill) and placing a smile of anticipation on her face at the prospect of six hours experiencing the joys of learning is not an easy task.

So one of my many desperate attempts to put a spring back in her step (if not into the broader British climate) was to suggest we invent some crazy new things as we trudged our way along. Come up with some ideas for wild inventions: the less practical and the more outlandish, the better.

The exercise worked in terms of smoothing the school journey and distracting a daughter. But it also got me thinking: we spend so much of our time thinking about important innovation, but maybe we should spare a thought for what might be called ‘irrelevant innovation’? And explore round the edges of this phenomenon: is it all wacky stuff or are there circumstances where it has more to offer? Is it a matter of framing? Are we missing an innovation trick or two by dismissing such ideas too early?

Innovation Typology

So here’s a suggested outline typology, a first shot at mapping the territory — feel free to add your own examples and categories….

1. WTF?!!!

These are the ideas that leap out at you from the screen or jump up from the page with a fistful of questions. Like: what were they thinking of, who dreamed this up (and what were they on when they did so), who on earth would want this, or maybe just a pure, simple and very large why? For example, patenting the cheese-flavoured cigarette? Or the musical flame-thrower? Sometimes a closer look might reveal the originator’s tongue firmly wedged in their cheek: these are elaborate jokes and nudges to remind us not to take innovation life too seriously. But all too often they have the stamp of sincerity about them — someone really believes that what the world needs now is their invention. Like, for example, the urban window baby cage, in which (for high-rise apartment dwellers) your child can get plenty of fresh air by being suspended outside the window, hundreds of feet off the ground…

These are easy to spot and throw into the rubbish bin — but maybe we shouldn’t be too quick to apply our BS filters and dismiss them. After all, history reminds us that sometimes we need visionaries, those who can see into the future and bring back wild ideas which become part of that future. Apple’s famous ‘Think different’ ad campaign had Richard Dreyfuss turning our collective heads towards ‘the crazy ones… the misfits, the rebels, the troublemakers — the round pegs in the square holes. The ones who see things differently…’ Which echoes the great playwright George Bernard Shaw’s observation that ‘all progress depends on unreasonable men’. Trouble is that the line between crazy and visionary is often vanishingly thin.

Think about Nikola Tesla, who did a lot more than lend his name to a car brand; without his insights we wouldn’t have much of the electricity generation technology we rely on today, not to mention valuable innovations around radio, lighting, transportation, etc. But we didn’t get earthquake-generating machines, thought cameras, supersonic airships, ‘death-beams’ or artificial tidal waves — which may be a good thing. Melissa Schilling, in her excellent book Quirky, classes people like Tesla as exactly that, and the word captures their character traits well. It’s also worth noting that we tend to label ordinary folk who come up with oddball stuff as variations on crazy — but if the ideas originate from billionaires who’ve built their fortune on innovation we use the more forgiving ‘eccentric’ descriptor…

2. Bouncing back off the wall.

You can almost see the creative moment: late night, fuelled by questionable alcohol or other stimulants, that point where the conversation explodes around a key wild thought. Like ‘let’s convince people that what they really need is a… pet rock!’ Innovations of this kind start life as a crazy idea but somehow along the way they acquire a momentum of their own. A community of users — or perhaps co-conspirators — emerges which brings the thing to life and creates its own use case. Gary Dahl’s madcap thought about pet rocks led to him selling over 10,000 of them every day; at the height of the craze several tons of them, nearly 2 million in all, had been adopted. (You can still buy them today if you’re after a low-maintenance companion.) Or how about changing your eating habits and improving your digestion by using a ‘slow fork’ next time you sit down to a meal? Or picking up a ‘no-phone’, which looks like the real thing but actually has zero functionality inside? Or the ‘selfie toaster’, which produces toast with your image on it?

3. Following the Yellow Brick Road

Sometimes innovations build on well-established trajectories but lead us to unexpected and irrelevant places. Packaging offers plenty of examples — it’s become a huge industry of central importance in food retailing and distribution, helping to preserve integrity, freshness and safety. But take a closer look at the contents of your supermarket trolley (or your home delivery order). Do we really need our bananas shrink-wrapped and encased in plastic trays? Or whole nuts inside plastic cartons? It took Nature several million years to evolve some useful natural protection; do we really need to update it? Do we need a personal pocket water spray when we could splash ourselves at the sink? Or leaf blowers that serve to create miniature sandstorms?

4. On second thoughts…..

Confession time: in my research on ‘wacky inventions’ I came across several Japanese sites featuring oddball innovations, including a miniature umbrella which you could wear as a hat. Who would ever really want something like that, and why? Some rapid reframing was in order when my wife not only bought one enthusiastically but then proceeded to deploy it in the garden, demonstrating its considerable advantages over hats (which fall off) or hooded jackets (which lock your arms up like a straitjacket and obscure your vision). This device keeps her dry enough for even the most delicate gardening tasks — and made me rapidly revise my estimate of it!

Innovations like these might appear unnecessary but sometimes there’s more to them — beauty (or at least value) really is in the eye of the beholder and maybe we need to practise a little reframing? Maybe the ‘floor cleaning onesie’ (a baby outfit which polishes your floors while your offspring are crawling around) isn’t such a bad idea after all?

5. String and sealing wax creations.

Necessity, or sometimes frustration, is a very fecund mother of invention, and this plays out big-time in the world of user innovation. As extensive research has shown, users are responsible for a significant amount of product and process innovation. Studies suggest over 20% of new products, and an even higher proportion of process innovations, originate in the hands of users, because they are actively seeking a solution to a problem which bothers them. Couple this with a tolerance for imperfection: they will experiment with prototypes which work even if they look a bit odd and lack design elegance. So many of those early hacks and minimum viable workarounds might look crazy but could be the start of something which becomes a mainstream innovation. Think of where many new sports (like skateboarding) originated, or where childcare innovations (like collapsible buggies or disposable diapers) began, and the oddball user is often clearly in view…

6. Seemed like a good idea at the time…

Sometimes (back to trajectories) we can extrapolate trends to create apparently interesting opportunities and then go on to innovate something irrelevant. The wonderful Museum of Failure in Sweden (and online) has plenty of examples, including a sizeable number from big companies. Anticipating the time-poor commuters of big cities like New York, and recognising the nutritional challenges of a diet of snatched snacks, the food giant Gerber came up with a line of quality adult foods which could be consumed quickly from a jar. Sort of spooning up adult baby food, in grown-up flavours like ‘Mediterranean vegetables’… Perhaps not surprisingly, it didn’t take off.

And despite Clive Sinclair having proved his innovation skills in the field of home computers, where his ZX80 range opened up the mass market for the product in Europe, his venture into electromobility, the C5, became a byword for how not to do innovation. At some point some kind of ‘reality distortion field’ seems to come into play for innovators — an experience well documented in the excellent history of the Segway personal transportation revolution that never quite happened…

Image: the Clive Sinclair C5 (Wikipedia)

7. Wrong place, wrong time

Timing, in innovation as much as in stand-up comedy, is everything. And sometimes the great idea on which many people have worked arrives perfectly formed and well thought out, but at totally the wrong moment. Take the Bristol Brabazon, originally conceived as a breakthrough aeroplane design to exploit the anticipated huge growth in long-haul international air travel in the post-war period. Based on a design for a giant long-range bomber approved by the Ministry of Aviation for development in 1943, it took shape in consultation with the UK national airline, BOAC. Like many projects it took on a life of its own; the budget rapidly escalated, with the construction of new facilities to accommodate such a large plane and, at one stage, the demolition of an entire village in order to extend the runway at Filton, near Bristol. Many unnecessary features were included — for example, the mock-up contained ‘a most magnificent ladies’ powder room with wooden aluminium-painted mirrors and even receptacles for the various lotions and powders used by the modern young lady’. The prototype took six-and-a-half years to build and involved major technical crises with wing and engine design, but eventually it flew, and very well. The only problem was that the character of the postwar aircraft market was very different from that envisaged by the technologists, and in 1952, after flying less than 1,000 miles, the project was abandoned at considerable cost to the taxpayer.

8. Coming too early to the party

Sometimes it’s the other way around: innovations arriving ahead of, rather than behind, their time and looking around in embarrassment at the handful of other early-bird party guests, trying to interest them. Markets that have yet to materialise or, very often, technologies that have yet to mature. Step forward Apple’s Newton, or Google Glass? These are examples where the particular embodiment of the innovation didn’t quite make it and appeared unnecessary or irrelevant — but where the learning acquired through such failure has proved invaluable in shaping future successful direction(s).

9. Blind spots

And of course we should spare a thought for otherwise great ideas which suffer from a lack of insight into the context in which they might find themselves. For example, there are plenty of cases where a simple and apparently useful name can turn out to have unfortunate consequences when placed in a different linguistic or cultural zone. Think of French kids growing up happily drinking bottles of a fizzy drink with the unfortunate (in English-speaking contexts) name of ‘Pschitt’, or their Ghanaian counterparts who enjoy a draught of Pee Cola (not so popular with tourists).

Everett Rogers spent his lifetime researching adoption and diffusion of innovations and one of the cardinal lessons he drew out of thousands of studies was the need to think carefully about compatibility — how well does your innovation fit into the context in which you’re planning to place it?

The moral of this story? First, creativity is a powerful motivator, not least when your primary aim is getting recalcitrant children to school. We’re (fortunately) hard-wired for it, and our imaginations sometimes lead us to come up with, and even try out, crazy stuff. (And, as the Darwin Awards regularly demonstrate, there is an element of natural selection involved which helps us avoid the really bad ideas!)

But not every wild idea is worthless; one of the early lessons I learned about creativity was the importance of what Tudor Rickards called ‘stepping stones’ — oddball ideas in themselves which serve to take our minds down different pathways and may lead to somewhere useful.

And framing matters, in two directions. First, we need to hammer home the compatibility lesson Everett Rogers taught us: innovations don’t exist in a vacuum, and we need to think about how well they fit the context into which we’re placing them.

But second, how far can we adapt the frame we place around an innovation, how far are we willing to stretch our own thinking and behaviour to accommodate it? Think of the science-fiction images of ideas like a smart wristwatch which wakes you, talks to you, enables communication, acts as a map and compass combined — and also tells you the time. Literally incredible, unbelievable — until we all started to buy and wear smart watches….

But perhaps we should also think of those innovations which started out as important, relevant and useful things offering significant positive impact, but which — like DDT and many others — later turned out to have negative consequences. ‘Responsible innovation’ is the term for an approach which involves carefully considering what innovations might do, trying to anticipate their possible unwanted side effects, and making sure we have the capacity to shape (and, if necessary, reshape) them for good. In the exploding world of innovation possibilities which AI is bringing, this looks like an essential rather than an optional approach to take.

Image credits: Dall-E via Microsoft CoPilot, Wikipedia

Bad Questions to Ask When Developing Technology

GUEST POST from Mike Shipulski

I know you’re trying to do something that has never been done before, but when will you be done?

I don’t know. We’ll run the next experiment then decide what to do next. If it works, we’ll do more of that. And if it doesn’t, we’ll do less of that. That’s all we know right now.

I know you’re trying to create something that is new to our industry, but how many will we sell?

I don’t know. Initial interviews with customers made it clear that this is an important customer problem. So, we’re trying to figure out if the technology can provide a viable solution. That’s all we know right now.

No one is asking for that obscure technology. Why are you wasting time working on that?

Well, the voice of the technology and the S-curve analyses suggest the technology wants to move in this direction, so we’re investigating this solution space. It might work and it might not. That’s all we know right now.

Why aren’t you using best practices?

If it hasn’t been done before, there can be no best practice. We prefer to use good practice or emergent practice.

It doesn’t seem like there’s been much progress. Why aren’t you running more experiments?

We don’t know which experiments to run, so we’re taking some time to think about what to do next.

Will it work?

I don’t know.

That new technology may obsolete our most profitable product line. Shouldn’t you stop work on that?

No. If we don’t obsolete our best work, someone else will. Wouldn’t it be better if we did the obsoleting?

How many more people do you need to accelerate the technology development work?

None. Small teams are better.

Sure, it’s a cool technology, but how much will it cost?

We haven’t earned the right to think about the cost. We’re still trying to make it work.

So, what’s your solution?

We don’t know yet. We’re still trying to formulate the customer problem.

You said you’d be done two months ago. Why aren’t you done yet?

I never said we’d be done two months ago. You asked me for a completion date and I could not tell you when we’d be done. You didn’t like that answer so I suggested that you choose your favorite date and put that into your spreadsheet. We were never going to hit that date, and we didn’t.

We’ve got a tight timeline. Why are you going home at 5:00?

We’ve been working on this technology for the last two years. This is a marathon. We’re mentally exhausted. See you tomorrow.

If you don’t work harder, we’ll get someone else to do the technology development work. What do you think about that?

You are confusing activity with progress. We are doing the right analyses and the right thinking and we’re working hard. But if you’d rather have someone else lead this work, so would I.

We need a patented solution. Will your solution be patentable?

I don’t know because we don’t yet have a solution. And when we do have a solution, we still won’t know because it takes a year or three for the Patent Office to make that decision.

So, you’re telling me this might not work?

Yes. That’s what I’m telling you.

So, you don’t know when you’ll be done with the technology work, you don’t know how much the technology will cost, you don’t know if it will be patentable, or who will buy it?

That’s about right.

Image credit: Unsplash

Six Steps to Creating a Brand Experience Personality

GUEST POST from Shep Hyken

Two weeks ago, I contributed an article that compared the different concert experiences I had with two rock legends, Bob Dylan and Ringo Starr. The title of the article summed up the point I was trying to make: Transactions versus Experiences

I want to take it a step further this week. Last week’s content was meant to get you thinking. Now, I want you to take action on the content. So, here are six ways to create an experience personality that will transform your company or brand from merely providing products and services to doing so with personality:

  1. Your Company’s Personality: I don’t care what you sell. It could be military equipment or comic books. Every company has a personality, and these personalities run the gamut from serious to whimsical. What are the adjectives that customers use to describe you? How would you like them to describe you? These are two great questions to ask as you start to explore your company’s personality.
  2. Communicate Your Company’s Personality: Once you know it, don’t keep it a secret. When you know the perception you want customers to have of your organization, empower your employees to deliver on the personality.
  3. Top-Down Personality: If you want employees on the front line to deliver on the company’s personality, it must be modeled from the top down. In other words, leaders must practice the behaviors they want their employees to practice. The personality comes from the top and makes its way through the entire organization, eventually being felt by the customers.

Image: Shep Hyken brand experience personality cartoon

  4. Manage Every Moment: I have always been a huge fan of Jan Carlzon’s Moments of Truth concept, in which every interaction a customer has with a company is an opportunity for them to form an impression. These interactions include advertising, websites, people-to-people, and more. Find ways to instill the personality into all of these interactions.
  5. Get Feedback: There is only one way to know for sure that you’re delivering on your company’s personality experience: ask your customers.
  6. Be Consistent: The only way for your experience personality to become a reality is for the experience to be consistent and predictable. It can’t be an engaging experience this time and something other than engaging next time. When customers like the experience personality, they will want to experience more of it! Consistency counts!

As you adopt these strategies, your customers will become familiar and comfortable with the experience personality you portray. Take the time to work through these steps, get everyone on board and in alignment with the personality you want to be known for, and create the experience that gets customers to say, “I’ll be back!”

Image Credits: Pixabay, Shep Hyken

Why Organizations Struggle with Innovation

GUEST POST from Howard Tiersky

We all know the world is changing rapidly. It’s clear that in order for organizations to remain relevant to the next generation of customers, and even the next generation of technology, we must adapt, evolve and transform. The field is littered with once-great companies that failed to do this: Blackberry, Nokia, Kodak, Borders, Western Union, Blockbuster, Polaroid.

But accepting major change, or even in some cases small changes, isn’t easy for large companies. At Innovation Loft we’ve worked with scores of major brands on their efforts to conceive, create and launch new products, enter new markets, redefine their value propositions and distribution strategies, and address various types of transformations. We’ve seen some spectacular successes and some tragic near misses. In watching these innovation stories unfold, we’ve concluded that there are three key reasons why innovations fail.

Three Key Reasons Innovations Fail:

  1. The Wrong Idea
  2. Failure to Execute
  3. Sabotage!

It’s important to keep these three domains of risk in mind when approaching any innovation project, and a lot of our work at Innovation Loft is focused on how to manage and mitigate risks in each of these three categories. Let’s look at these one at a time:

1. The Wrong Idea

Change is not always good. New is not always popular. How can you tell the right ideas from the wrong ones? Here are a few practices that can make a big difference.

Focus on Customer Needs

It may seem like Apple has built its success on delivering capabilities customers “didn’t know they needed.” And that may be true in the sense that if you had asked customers, they might not have articulated a desire for an iPod or an iPad. However, if you had observed consumers in their day-to-day interactions back then, the challenge of dealing with dozens of CDs, and the decision about which ones to bring, clearly created a “pain point.” Fast forward a few years: people trying to curl up with their laptop in bed to watch a movie was clearly awkward, and watching a movie on a small iPhone was also sub-optimal. Apple identified gaps it could fill. Many unsuccessful ideas lack a clear customer value proposition and are based on assumptions about a benefit consumers will eventually realize.

Test and Iterate

Think of product development as a spiral. Test the simplest, lowest-cost version of your product (even if it’s a paper mockup) to get early feedback from users. Continue that process each step of the way, through launch and beyond, to really understand how consumers are using your product and where it may need improvement.

Pivot

Ultimately, don’t fall in love with your idea. Focus on the value you can create for your customers. Even with the first two points in this list, you can still find yourself launching the wrong idea. That’s the risk of innovation. In a large corporate environment, it’s important to set the expectation up front that there will be flexibility on redefining the product, even substantially, as the project goes on. While this approach may not be consistent with typical enterprise “capital budgeting” processes, it’s critical to the success of innovative projects.

2. Failure to Execute

Even if you have the right idea, you can fail to execute. Effective execution is measured by quality, speed, and communication.

Quality: Does the product fulfill the vision? An initial version of a product may not be as feature-rich as future releases (the original iPhone did not allow copy and paste, let alone the downloading of apps!). The key test is not comprehensive features but doing a few things very well.

Speed: In a world of innovation, we are always in competition. At the initial launch of Android, it was clearly behind the curve compared to iOS. Over time, Android was able to catch up and eventually exceed iOS sales. The two remain locked in an arms race for higher standards and better capabilities, and the timing of improvements clearly has a substantial impact. Nevertheless, Android’s story demonstrates that even with a late start, one can catch up. Kyocera and Nokia were in the market with smartphones several years before Apple.

Communication: Peter Drucker said, “Business has just two functions: innovation and marketing.” The two must go hand-in-hand. Apple’s genius has been the marriage of a great product with great communication.

3. Sabotage

Companies are designed to resist change. Classic business books prescribe clear roles and clear processes for how organizations operate. But this resistance to change is misplaced when it comes to innovation. We’ve seen many great projects killed in infancy, or even after launch and initial success, by areas of an organization whose interests would be threatened by the success of that transformation.

If a new product or project is truly going to be transformational for your company, expect it to have enemies. These enemies’ very survival (or their perception of it) may be at stake. Many innovative products that were on the path to “saving the company” are killed through internal sabotage. As soon as there is any misstep in an innovation initiative — as there always is — forces are ready to pounce and convince the powers-that-be that it’s time to “put it out of its misery.” Can you imagine Apple killing the iPhone over Antennagate or the Apple Maps debacle?

How can you avoid sabotage? One tactic is trying to gain as much organizational alignment as possible during each step of the innovation process. Don’t assume that because a solution seems “obvious” to your team, others will automatically support it. Involving key executives, in addition to as many parts of the organization as possible, will garner more support. Give team members the chance to participate and feel ownership of the initiative. In the words of Harry Truman:

“It’s amazing what you can accomplish if you don’t care who gets the credit.”

So how do you figure out the right answer, get everyone on the same page, and focus on a common innovation goal? At FROM, we use a specific model to approach the process of identifying the most relevant opportunity areas for innovation, and to build group consensus around the best approach. You’ll have to adapt it to your situation, but the model should provide a good starting framework.

This article originally appeared on the Howard Tiersky blog
Image Credits: Unsplash

Subscribe to Human-Centered Change & Innovation Weekly: Sign up here to get Human-Centered Change & Innovation Weekly delivered to your inbox every week.






AI Strategy Should Have Nothing to do with AI


GUEST POST from Robyn Bolton

You’ve heard the adage that “culture eats strategy for breakfast.”  Well, AI is the fruit bowl on the side of your Denny’s Grand Slam Strategy, and culture is eating that, too.

1 tool + 2 companies = 2 strategies

On an Innovation Leader call about AI, two people from two different companies shared stories about what happened when an AI notetaking tool unexpectedly joined a call and started taking notes. In both stories, everyone on the calls was surprised, uncomfortable, and a little bit angry that even part of the conversation had been recorded and transcribed (understandable, because both calls were about highly sensitive topics).

The storyteller from Company A shared that the senior executive on the call was so irate that, after the call, he contacted people in Legal, IT, and Risk Management.  By the end of the day, all AI tools were shut down, and an extensive “ask permission or face termination” policy was issued.

Company B’s story ended differently.  Everyone on the call, including senior executives and government officials, was surprised, but instead of demanding that the tool be turned off, they asked why it was necessary. After a quick discussion about whether the tool was necessary, when it would be used, and how to ensure the accuracy of the transcript, everyone agreed to keep the note-taker running.  After the call, the senior executive asked everyone using an AI note-taker on a call to ask attendees’ permission before turning it on.

Why such a difference between the approaches of two companies of relatively the same size, operating in the same industry, using the same type of tool in a similar situation?

1 tool + 2 CULTURES = 2 strategies

Neither storyteller dove into details or described their companies’ cultures, but from other comments and details, I’m comfortable saying that the culture at Company A is quite different from the one at Company B. It is this difference, more than anything else, that drove Company A’s draconian response compared to Company B’s more forgiving and guiding one.  

This is both good and bad news for you as an innovation leader.

It’s good news because it means that you don’t have to pour hours, days, or even weeks of your life into finding, testing, and evaluating an ever-growing universe of AI tools to feel confident that you found the right one. 

It’s bad news because even if you do develop the perfect AI strategy, it won’t matter if you’re in a culture that isn’t open to exploration, learning, and even a tiny amount of risk-taking.

Curious whether you’re facing more good news than bad news?  Start here.

8 cultures = 8+ strategies

In 2018, Boris Groysberg, a professor at Harvard Business School, and his colleagues published “The Leader’s Guide to Corporate Culture,” a meta-study of “more than 100 of the most commonly used social and behavior models” that “identified eight styles that distinguish a culture and can be measured.” I’m a big fan of the model, having used it with clients and taught it to hundreds of executives, and I see it actively defining and driving companies’ AI strategies*.

Results (89% of companies): Achievement and winning

  • AI strategy: Be first and be right. Experimentation is happening on an individual or team level in an effort to gain an advantage over competitors and peers.

Caring (63%): Relationships and mutual trust

  • AI strategy: A slow, cautious, and collaborative approach to exploring and testing AI so as to avoid ruffling feathers

Order (15%): Respect, structure, and shared norms

  • AI strategy: Given the “ask permission, not forgiveness” nature of the culture, AI exploration and strategy are centralized in a single function, and everyone waits on the verdict

Purpose (9%): Idealism and altruism

  • AI strategy: Torn between the undeniable productivity benefits AI offers and the myriad ethical and sustainability issues involved, strategies are more about monitoring than acting.

Safety (8%): Planning, caution, and preparedness

  • AI strategy: Like Order, this culture takes a centralized approach. Unlike Order, it hopes that if it closes its eyes, all of this will just go away.

Learning (7%): Exploration, expansiveness, creativity

  • AI strategy: Slightly more deliberate and guided than Purpose cultures, this culture encourages thoughtful and intentional experimentation to inform its overall strategy

Authority (4%): Strength, decisiveness, and boldness

  • AI strategy: If the AI strategies from Results and Order had a baby, it would be Authority’s AI strategy – centralized control with a single-minded mission to win quickly

Enjoyment (2%): Fun and excitement

  • AI strategy: It’s a glorious free-for-all with everyone doing what they want.  Strategies and guidelines will be set if and when needed.

What do you think?

Based on the story above, what culture best describes Company A?  Company B?

What culture best describes your team or company?  What about your AI strategy?

*Disclaimer: Culture is an “elusive lever” because it is based on assumptions, mindsets, social patterns, and unconscious actions. As a result, the eight cultures aren’t MECE (mutually exclusive, collectively exhaustive), and multiple cultures often exist in a single team, function, and company. Bottom line: the eight cultures are a tool, not a law (and I glossed over a lot of stuff from the report).

Image credit: Wikimedia Commons


DNA May Be the Next Frontier of Computing and Data Storage


GUEST POST from Greg Satell

Data, as many have noted, has become the new oil, meaning that we no longer regard the information we store as merely a cost of doing business, but a valuable asset and a potential source of competitive advantage. It has become the fuel that powers advanced technologies such as machine learning.

A problem that’s emerging, however, is that our ability to produce data is outstripping our ability to store it. In fact, an article in the journal Nature predicts that by 2040, data storage will consume 10–100 times the expected supply of microchip-grade silicon, using current technology. Clearly, we need a data storage breakthrough.

One potential solution is DNA, which is a million times more information dense than today’s flash drives. It is also more stable, more secure, and uses minimal energy. The problem is that it is currently prohibitively expensive. However, a startup that has emerged out of MIT, called CATALOG, may have found the breakthrough we’re looking for: low-cost DNA storage.

The Makings Of A Scientist-Entrepreneur

Growing up in his native Korea, Hyunjun Park never planned on a career in business, much less the technology business, but expected to become a biologist. He graduated with honors from Seoul National University and then went on to earn a PhD from the University of Wisconsin. Later he joined Tim Lu’s lab at MIT, which specializes in synthetic biology.

In an earlier time, he would have followed an established career path, from PhD to post-doc to assistant professor to tenure. These days, however, there is a growing trend for graduate students to get an entrepreneurial education in parallel with the traditional scientific curriculum. Park, for example, participated in both the Wisconsin Entrepreneurial Bootcamp and Start MIT.

He also met a kindred spirit in Nate Roquet, a PhD candidate who, about to finish his thesis, had started thinking about what to do next. Inspired by a talk given by the Chief Science Officer at a seed fund, IndieBio, the two began to talk in earnest about starting a company together based on their work in synthetic biology.

As they batted around ideas, the subject of DNA storage came up. By this time, the advantages of the technology were well known, but it was not considered practical, costing hundreds of thousands of dollars to store just a few hundred megabytes of data. However, the two did some back-of-the-envelope calculations and became convinced they could do it far more cheaply.

Moving From Idea To Product

The basic concept of DNA storage is simple. Essentially, you just encode the ones and zeros of digital code into the T’s, G’s, A’s and C’s of genetic code. However, stringing those genetic molecules together is tedious and expensive. The idea that Park and Roquet came up with was to use enzymes to alter strands of DNA, rather than building them up piece by piece.
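To make the basic idea concrete, here is a minimal sketch of the naive “two bits per base” encoding. It is illustrative only: it is not CATALOG’s enzymatic method, and real systems add error correction and addressing on top of any such mapping.

```python
# Illustrative sketch: map each pair of bits to one of the four bases.
# This shows the basic digital-to-genetic encoding idea only; it is NOT
# CATALOG's approach, which avoids synthesizing strands base by base.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn bytes into a DNA strand, two bits per base (four bases per byte)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> bytes:
    """Recover the original bytes from a strand."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

print(encode(b"Hi"))          # -> CAGACGGC
print(decode("CAGACGGC"))     # -> b'Hi'
```

The density advantage comes from the physical size of the molecule, not from the mapping itself; the expensive part, as the article notes, has been writing the strand at all.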

Contrary to popular opinion, most traditional venture capital firms, such as those that populate Sand Hill Road in Silicon Valley, don’t invest in ideas. They invest in products. IndieBio, however, isn’t your typical investor. They give only a small amount of seed capital, but offer other services, such as wet labs, entrepreneurial training and scientific mentorship. Park and Roquet reached out to them and found some interest.

“We invest in problems, not necessarily solutions,” Arvind Gupta, Founder at IndieBio told me. “Here the problem is massive. How do you keep the world’s knowledge safe? We know DNA can last thousands of years and can be replicated very inexpensively. That’s a really big deal and Hyunjun and Nate’s approach was incredibly exciting.”

Once the pair entered IndieBio’s four-month program, they found both promise and disappointment. Their approach could dramatically reduce the cost of storing information in DNA, but not nearly quickly enough to build a commercially viable product. They would need to pivot if they were going to turn their idea into an actual business.

Scaling To Market

One flaw in CATALOG’s approach was that the process was too complex to scale. Yet they found that by starting with just a few different DNA strands and attaching them together, much like a printing press pre-arranges words in a book, they could come up with something that was not only scalable, but commercially viable from a cost perspective.

The second problem was thornier. Working with enzymes is incredibly labor intensive and, being biologists, Park and Roquet didn’t have the mechanical engineering expertise to make their process feasible. Fortunately, an advisor, Darren Link, connected the pair to Cambridge Consultants, an innovation consultancy that could help them.

“We started looking at the problem and it seemed that, on paper at least, we could make it work,” Richard Hammond, Technology Director and Head of Synthetic Biology at Cambridge Consultants, told me. “Now we’re about halfway through making the first prototype and we believe we can make it work and scale it significantly. We’re increasingly confident that we can solve the core technical challenges.”

In 2018 CATALOG introduced the world to Shannon, its prototype DNA writer. In 2022 CATALOG announced its DNA computation work at the HPC User Forum. But CATALOG isn’t without competition in the space. For example, Western Digital’s LTO-9 from 2022 can store 18 TB per cartridge. CATALOG, for its part, is partnering with Seagate “on several initiatives to advance scalable and automated DNA-based storage and computation platforms, including making DNA-based platforms up to 1000 times smaller.” That should make the process competitive for archival storage, such as medical and legal records, as well as storing film databases at movie studios.

“I think the fact that we’re inventing a completely new medium for data storage is really exciting,” Park told me. “I don’t think that we know yet what the true potential is because the biggest use cases probably don’t exist yet. What I do know is that our demand for data storage will soon outstrip our supply and we are thrilled about the possibility of solving that problem.”

Going Beyond Digital

A generation ago, the task of improving data storage would have been seen as solely a computer science problem. Yet today, the digital era is ending and we’re going to have to look further and wider for solutions to the problems we face. With the vast improvement in genomics, which is far outpacing Moore’s law these days, we can expect biology to increasingly play a role.

“Traditionally, information technology has been strictly the realm of electrical engineers, physicists and coders,” Gupta of IndieBio told me. “What we’re increasingly finding is that biology, which has been honed for millions of years by evolution, can often point the way to solutions that are more robust and, potentially, much cheaper and more efficient.”

Yet this phenomenon goes far beyond biology. We’re also seeing similar accelerations in other fields, such as materials science and space-related technologies. We’re also seeing a new breed of investors, like IndieBio, that focus specifically on scientist entrepreneurs. “I consider myself a product of the growing ecosystem for scientific entrepreneurs at universities and in the investor community,” Park told me.

Make no mistake. We are entering a new era of innovation and the traditional Silicon Valley approach will not get us where we need to go. Instead, we need to forge greater collaboration between the scientific community, the investor community and government agencies to solve problems that are increasingly complex and interdisciplinary.

— Article courtesy of the Digital Tonto blog and previously appeared on Inc.com
— Image credits: Pixabay







Video Killed More Than the Radio Star


by Braden Kelley

If you are a child of the eighties, you will remember when MTV went live 24 hours a day with music videos on cable television on August 1, 1981, with the broadcast of “Video Killed the Radio Star” by the Buggles.

But I was thinking the other day about how video (or, taken more broadly, streaming media – including television, movies, gaming, social media, and the internet) has killed far more things than just radio stars. Many activities have seen substantial declines as people stay home and engage in these forms of entertainment – often by themselves – where in the past they would leave their homes for more human-to-human interaction.

The ten declines listed below have not only reshaped the American landscape – literally – but have also fed declines in the mental health of modern nations. Without further ado, here is the list:

1. Bowling Alleys:

Bowling alleys, once bustling with players and leagues, have faced challenges in recent years. The communal experience of bowling has been replaced by digital alternatives, impacting the industry.

2. Roller Skating Rinks:

Roller skating rinks, which were once popular hangout spots for families and teens, have seen declining attendance. The allure of roller disco and skating parties has waned as people turn to other forms of entertainment.

3. Drive-In Movie Theaters:

Drive-in movie theaters, iconic symbols of mid-20th-century entertainment, have faced challenges in recent decades. While they once provided a unique way to watch films from the comfort of your car, changing lifestyles and technological advancements have impacted their popularity.

4. Arcade Game Centers:

In the ’80s and ’90s, video game arcades were buzzing hubs of entertainment. People flocked to play games like Pac-Man, Street Fighter, and Mortal Kombat. Traditional arcade game centers, filled with pinball machines, classic video games, and ticket redemption games, have struggled to compete with home gaming consoles and online multiplayer experiences. The convenience of playing video games at home has led to a decline in arcade visits. Nostalgia keeps some arcades alive, but they are no longer as prevalent as they once were.

5. Miniature Golf Courses:

Mini-golf courses, with their whimsical obstacles and family-friendly appeal, used to be popular weekend destinations. However, the rise of digital entertainment has impacted their attendance. The allure of playing a round of mini-golf under the sun has faded for many.

6. Indoor Trampoline Parks:

Indoor trampoline parks gained popularity as a fun and active way to spend time with friends and family. However, the pandemic and subsequent lockdowns forced many of these parks to close temporarily. Even before the pandemic, the availability of home trampolines and virtual fitness classes reduced the need for indoor trampoline parks. People can now bounce and exercise at home or virtually, without leaving their living rooms.

7. Live Music Venues:

Live music venues, including small clubs, concert halls, and outdoor amphitheaters, have struggled due to changing entertainment preferences. While some artists and bands continue to perform, the rise of virtual concerts and streaming services has affected attendance. People can now enjoy live music from the comfort of their homes, reducing the need to attend physical venues. The pandemic also disrupted live events, leading to further challenges for the industry.

8. Public Libraries (In-Person Visits):

Public libraries, once bustling with readers and community events, have seen a decline in in-person visits. E-books, audiobooks, and online research resources have made it easier for people to access information without physically visiting a library. While libraries continue to offer valuable services, their role has shifted from primarily physical spaces to digital hubs for learning and exploration – and a place for latchkey kids to go and wait for their parents to get off work.

10. Shopping Malls:

Once bustling centers of retail and social activity, shopping malls have faced significant challenges in recent years. Various technological shifts have contributed to their decline, including e-commerce and online shopping, social media and influencer culture, and changing demographics and urbanization. Shopping malls are yet another place where parents no longer drop off the younger generation for the day.

And if that’s not enough, here is a bonus one for you:

11. Diners, Malt Shops, Coffee Shops, Dive Bars/Taverns, Neighborhood Pubs (UK) and Drive-In Burger Joints:

If you’re a child of the seventies or eighties, you probably tuned in to watch Richie, Potsie, Joanie, Fonzie and Ralph Malph gather every day at Al’s. Unfortunately, many of the more social and casual drinking and dining places are experiencing declines as changes in diet, habit and technology have kicked in. Demographic changes (aging out of nostalgia) and the rise of food delivery apps and takeout culture have helped sign their death warrant.

Conclusion

In the ever-evolving landscape of entertainment, video and streaming media have reshaped our experiences and interactions. As we bid farewell to once-thriving institutions, we recognize both the convenience and the cost of this digital transformation. For example, the echoes of strikes and spares have faded as digital alternatives replace the communal joy of bowling. As we navigate this digital era, let us cherish what remains and adapt to what lies ahead. Video may have transformed our world, but the echoes of lost experiences linger, urging us to seek balance in our screens and our souls. As these once ubiquitous gathering places disappear, consumer tastes change and social isolation increases, will we as a society seek to reverse course or evolve to some new way of reconnecting as humans in person? And if so, how?

What other places and/or activities would you have added to the list?
(sound off in the comments)

p.s. Be sure to follow both my personal account and the Human-Centered Change and Innovation community on LinkedIn.

Image credit: Pixabay








Time is Not Fundamental


GUEST POST from Geoffrey A. Moore

For all my life I have been taught that time is the fourth dimension in a space-time continuum. I mean, for goodness sake, Einstein said this was so, and all of physics has followed his lead. Nonetheless, I want to argue that, while the universe may indeed have four dimensions, time is not one of them, nor is it a fundamental element of reality.

Before you think I have really jumped off the deep end, let me just say that my claim is that motion is a fundamental element of reality, and it is the one that time is substituting for. This is based simply on observation. That is, we can observe and measure mass. We can observe and measure space. We can observe and measure energy. We can observe and measure motion. Time, on the other hand, is simply a tool we have developed to measure motion. That is, motion is fundamental, and time is derived.

Consider where our concept of time came from. It started with three distinct units—the day, the month, and the year. Each is based on a cyclical motion—the earth turning around its axis, the moon encircling the earth, the earth and moon encircling the sun. All three of these cyclical motions have the property of returning to their starting point. They repeat, over and over and over. That’s how they came to our attention in the first place.

If we call this phenomenon cyclical time, we can contrast it with linear time. The latter is time we experience as passing, the one to which we apply the terms past, present, and future. But in fact, what is passing is not time but motion, motion we are calibrating by time. That is, we use the cyclical units of time to measure the linear distance between any given motion and a reference location.

As I discuss in The Infinite Staircase, by virtue of the Big Bang, the Second Law of Thermodynamics, and the ongoing rush to greater and greater entropy, the universe is inherently in motion. Some of that motion gets redirected to do work, and some of that work has resulted in life emerging on our planet. Motion is intrinsic to our experience of life, much more so than time. As babies we have no sense of time, but we immediately experience mass, space, energy, and motion.

Because mass, space, energy, and motion are core to our experience, we have developed tools to help us engage with them strategically. We can weigh mass and reshape it in myriad ways to serve our ends. We can measure space using anything as a standard length and create structures of whatever size and shape we need. We can measure energy in terms of temperature and pressure and manipulate it to move all kinds of masses through all kinds of spaces. And we can measure motion through space by using standard units of time.

The equation for so doing is typically written as v = d/t. This equation makes us believe that velocity is a concept derived from the primitives of distance and time. But a more accurate way of looking at reality is to say t = d/v. That is, we can observe distance and motion, from which we derive time. If you have a wristwatch with a second hand, this is easily confirmed. A minute consists of a hand traveling through a fixed angular distance, 360°, at a constant velocity set by convention – in this case by the International System of Units, these days atomically calibrated by a specified number of oscillations of cesium. Time is derived by dividing a given distance by a given velocity.
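The wristwatch example can be written out as a quick numeric check of t = d/v (a sketch, with the conventional rate made explicit):

```python
# The second hand sweeps a fixed angular distance at an angular
# velocity fixed by convention; time falls out as the ratio t = d/v.
d = 360.0  # angular distance of one full sweep, in degrees
v = 6.0    # angular velocity by convention: 6 degrees per second
t = d / v  # time, derived from distance and motion
print(t)   # -> 60.0, i.e., one minute recovered without measuring time directly
```

Nothing in the calculation requires time as an input: only an observed distance and an observed (conventionally fixed) motion.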

OK, so what? Here the paths of philosophy and physics diverge, with me being able to pursue the former but not the latter. Before parting, however, I would like to ask the physicists in the room, should there be any, a question: If one accepted the premise that motion was the fourth dimension, not time, such that we described the universe as a continuum of spacemotion instead of spacetime, would that make any difference? Specifically, with respect to Einstein’s theories of special and general relativity, are we just substituting terms here, or are there material consequences? I would love to learn what you think.

At my end, I am interested in the philosophical implications of this question, specifically in relation to phenomenology, the way we experience time. To begin, I want to take issue with the following definition of time served up by Google:

a nonspatial continuum that is measured in terms of events which succeed one another from past through present to future.

From my perspective, this is just wrong. It calls for using events to measure time. The correct approach would focus on using time to measure motion, describing the situation as follows:

an intra-spatial continuum that can be measured in terms of time as one event succeeds another from a position of higher energy to one of lower energy.

The motive for this redefinition is to underscore that the universe is inherently in motion, following the Second Law of Thermodynamics, perpetually seeking to cool itself down by spreading itself out. We here on Earth are born into the midst of that action, boats set afloat upon a river, moving with the current on the way to a sea of ultimate cool. We can go with the flow, we can paddle upstream, we can even divert the river of entropy to siphon off energy to do work. The key point to register is that motion abides, inexorably following the arrow of entropy, moving from hot to cold until heat death is achieved.

If motion is a primary dimension of the universe, there can be no standing still. Phenomenologically, this is quite different from the traditional time-based perspective. In a universe of space and time, events have to be initiated, and one can readily imagine a time with no events, a time when nothing happens, maybe something along the lines of Beckett’s Waiting for Godot. In a universe of space and motion, however, that is impossible. There are always events, and we are always in the midst of doing. A couch potato is as immersed in events as a race car driver. Or, to paraphrase Milton, they also move who only stand and wait.

A second consequence of the spacemotion continuum is that there is no such thing as eternity and no such thing as infinity. Nothing can exist outside the realm of change, and the universe is limited to whatever amount of energy was released at the Big Bang. Now, to be fair, from a phenomenological perspective, the dimensions of the universe are so gigantic that, experientially, they might as well be infinite and eternal. But from a philosophical perspective, the categories of eternity and infinity are not ontologically valid. They are asymptotes, not entities.

Needless to say, all this flies in the face of virtually every religion that has ever taken root in human history. As someone deeply committed to traditional ethics, I am grateful to all religions for supporting ethical action and an ethical mindset. If there were no other way to secure ethics, then I would opt for religion for sure. But we know a lot more about the universe today than we did several thousand years ago, and so there is at least an opportunity to forge a modern narrative, one that can find in secular metaphysics a foundation for traditional values. That’s what The Infinite Staircase is seeking to do.

That’s what I think. What do you think?

Image Credit: Pixabay







Great People Are the Reason Companies Become Great


GUEST POST from Mike Shipulski

You can look at people’s salaries as a cost that must be reduced. Or, you can look at their salaries as a way for them to provide for their families. With one, you cut, cut, cut. With the other, you pay the fairest wage possible and are thankful your people are happy.

You can look at healthcare costs the same way – as a cost that must be slashed or an important ingredient that helps the workers and their families stay healthy. Sure, you should get what you pay for, but do you cut costs or do all you can to help people be healthy? I know which one makes for a productive workforce and which one is a race to the bottom. How does your company think about providing good healthcare benefits? And how do you feel about that?

You can look at training and development of your people as a cost or an investment. And this distinction makes all the difference. With one, training and development is minimized. And with the other, it’s maximized to grow people into their best selves. How does your company think about this? And how do you feel about that?

You can look at new tools as a cost or as an investment. Sure, tools can be expensive, but they can also help people do more than they thought possible. Does your company think of them as a cost or an investment? And how do you feel about that?

Would you take a slight pay cut so that others in the company could be paid a living wage? Would you pay a little more for healthcare so that younger people could pay less? Would you be willing to make a little less money so the company can invest in the people? Would your company be willing to use some of the profit generated by cost reduction work to secure the long-term success of the company?

If your company’s cost structure is higher than the norm because it invests in the people, are you happy about that? Or, does that kick off a project to reduce the company’s cost structure?

Over what time frame does your company want to make money?

When jobs are eliminated at your company, does that feel more like a birthday party or a funeral?

Are you proud of how your company treats its people, or are you embarrassed?

I’ve heard that people are the company’s most important asset, but if that’s the case, why is there so much interest in reducing the number of people who work at the company?

In the company’s strategic plan, five years from now are there more people on the payroll or fewer? And how do you feel about that?

Image credit: Unsplash
