Innovation: Why We Need It

When we talk about creating new processes or ways of doing things, we’re talking about change. Another word for change is innovation. And people usually want to innovate when something needs to be more effective or efficient.

Steve Jobs believed, and Sir Richard Branson believes, that innovation is connecting the dots. In other words, it's seeing how A, B, and C are connected, and also seeing that something new is created when the dots are connected differently. It's not always easy to see the dots, though, and according to innovation thought leaders, businesses suffer the consequences.

What keeps innovation thought leaders up at night?

That question was asked at the Annual Meeting of the World Economic Forum in 2012. Participants at the Creative Workplace session, including IDEO CEO Tim Brown and OpenDNS CEO David Ulevitch, said they worried about:

  • How organizations can build creative, engaging, and energizing work environments
  • How organizations can build a culture of successful failure, and
  • How organizations can manage energy and time for resilience and performance

What do innovation thought leaders say about innovation?

They believe innovation has never been a higher business priority than it is today. Tweets aggregated by @imaginatik during the 2014 Open Innovation Festival offer clues about how some companies innovate:

  • Taking a forward-looking approach helps us get away from the “what happened?” question later @CloroxCo
  • Apply design thinking to drive user-focus and a “build to learn” attitude @intel
  • I’ve disregarded the fact that being inquisitive can get you in trouble @innoalchemist

Yet most companies struggle to innovate consistently. Some of the leading thinkers on the topic of innovation say mindsets need to change if innovation is going to occur.

Jerald Hage, co-director of the Center for Innovation at the University of Maryland, said this:

Companies that want technology innovations should give their STEM professionals (people with science, technology, engineering, and mathematics backgrounds) a say in decisions.

CEOs across the board should avoid a penny-pinching mindset and stop fixating on productivity and cost. That mindset leads people to think that the best way to reduce costs is to fire workers.

And Hal Gregersen, senior affiliate professor of leadership at the graduate business school INSEAD, had this to say:

Building a culture of productive innovation—where employees practice the five discovery skills [associating, questioning, observing, networking, and experimenting]—takes time. Just opening the floor to suggestions can lead to a flood of low-value propositions.

Renée Mauborgne and W. Chan Kim, co-authors of Blue Ocean Strategy, say that it's important to:

Shake up the status quo with a dose of “harsh reality.”

They say that people's perceptions are often contrary to the facts. In other words, the dots are not connected correctly. We must first see the dots, and then connect them in ways that create new efficiencies and help people be creative and productive in their work environments.

Technology and Training: Do We Understand This Partnership?

Corporations in the U.S. spend $156 billion annually to train employees. So when I read the headline "So Much Training, So Little to Show for It" in a recent online Wall Street Journal (WSJ) article,[1] I wanted to know why Eduardo Salas, a professor of organizational psychology at the University of Central Florida and a program director at its Institute of Simulation and Training, had reached that conclusion. He uses data on employee learning gathered in 2011 by the American Society for Training and Development (ASTD), as well as research claiming that 90 percent of new skills are lost within a year, to point out that businesses spend a lot of money on training that doesn't stick. One myth, according to Salas, is that "technology will solve all training problems … that a mobile app or computer game is the solution to learning."

Salas believes that organizations need to rely more on the science of learning and training instead of acting on what he refers to as “myths” that lead people to make poor decisions about training. That’s a strong message! Is he right?

Before we launch into further discussion about myths and training, let’s look at how myths develop. A myth is the result of stories—real or imagined—that are repeated over and over for years—centuries and millennia—until hundreds of versions of a story exist. Myths are not created by one individual, but by many storytellers. Each time a story is retold, it changes. Myths usually do contain a grain of truth, however, and that may be what keeps them alive.

Let’s assume there’s a grain of truth in this myth about technology solving training problems. What is it?


A Quest for Clarity

What is technology? Well, technology means different things to different people. The word comes from the Greek techne, which means "art, skill, and cunning at hand." In this article, technology means "the purposeful application of knowledge in the design, production, and use of goods and services and in the organization of human activities." Businessdictionary.com divides technology into five categories:

  • Tangible technology (e.g., blueprints and models)
  • Intangible technology (e.g., problem solving and training)
  • Automated and intelligent technology (e.g., telepresence robots and some computers)
  • Semi-automated, partially intelligent technology (e.g., computer tablets and smartphones)
  • Labor-intensive technology (e.g., software programs and most water sprinklers for the yard)

If we assume that technology is the purposeful application of knowledge to produce goods and services that people use, then we're right to think that technology will solve all training problems. If, on the other hand, we assume that technology is a computer and software, a robot, a tablet or smartphone, or the TV remote, then we can be fairly certain that technology will not solve all training problems. How we use technology appears to be the key factor in whether or not it can solve training problems.

In the future, the training industry might employ intelligent, decision-making, problem-solving robots to train people, but for the present and foreseeable future, all indications point to reliance on partially intelligent and labor-intensive technology to create and deliver training. So how did this particular myth (or we could call it a “half-truth”) get started?

The Backstory

A full history of the partnership between technology and learning is far too long to include here. Briefly, however, it was B. F. Skinner, a behaviorist psychologist, who introduced the concept of using technology to enable people to teach themselves. He developed a teaching machine, a mechanical device that presented material in a structured, organized manner. And in 1957, Skinner published a paper about the teaching machine that launched widespread interest in programmed instruction.[2]

There’s an excellent discussion on teaching machines in Thomas Gilbert’s book Human Competence: Engineering Worthy Performance (originally published in 1978).[3] Gilbert’s storytelling style and humor provide good reading and fascinating insights into the genesis of eLearning. But eLearning as we know it today wouldn’t appear until 38 years later.

By 1995, computer use was more widespread and the Internet was becoming mainstream. During that year, a small group of learning professionals with a passion for technology and learning was listening to Brandon Hall, who today heads one of the leading eLearning research companies. Back then, he was showing the group the latest technologies and computer applications from his CD-ROM library. Everyone was excited about the level of instructional design in the various applications, and they started talking to others about what they'd seen.

In 1996, ASTD held one workshop on Internet-based training. The next year, Training Magazine published the first article on "Internet-Based Training," and Elliott Masie founded the TechLearn Conference. In the span of two years, a new industry was born. At seminars, at sales demonstrations, and in magazines, everybody was talking about eLearning!

At that time, predictions about the impact of eLearning included:

  • eLearning will replace all classroom training
  • eLearning courseware is easy and cheap to design and develop
  • eLearning technologies are easy to use and integrate
  • eLearning technology infrastructure is easy to implement
  • Employees from almost all cultures like eLearning
  • eLearning is mostly about how to use technology
  • Significant cost savings can be made by adoption of eLearning
  • The eLearning market will at least quadruple annually

People learned quickly that automated technology—primarily computers, software, and the Internet—had inherent limitations that prevented wide distribution of eLearning. Then, as now, technology did not match people's expectations.

The problems that developed in the 1990s, when people tried to merge technology and learning to produce eLearning, persist today. The bold predictions about the impact eLearning would have on training don't match the state of eLearning today. Here's what we know for sure as eLearning development nears the 30-year mark:

  • Only a few organizations have moved most of their classroom-based curriculum to the Internet
  • High-quality, well-designed eLearning requires a significant investment of both time and money in assessment, scoping, design, development, and deployment
  • eLearning technology integration has proven to be more difficult than imagined
  • Connectivity and speed issues have been ongoing challenges due to a lack of high bandwidth
  • eLearning is not so much about technology as it is about learning that uses technology

So Salas is right when he says that we need to rely on the science of learning and training rather than the technology to deliver training. In the 67 years since Skinner introduced the teaching machine, advances in technology have been phenomenal. Technology is making it possible to do things that we couldn't even imagine 10 years ago, such as MOOCs that bring together tens of thousands of people from around the world to learn with one another at the same time. But we've also made mistakes due to assumptions and misunderstandings about learning and about technology.

Today, we have a better understanding of how people learn as well as how technology can support learning. Dr. Salas notes that vendors make sure training is flashy and engaging, with lots of bells and whistles that employees find "fun and interesting for a few hours." He argues, however, that visually interesting presentations are not enough: for learning to transfer back to the job, participants need opportunities and time to practice in the work environment. They also need very clear and precise learning objectives, and clear feedback on performance. When these things are not present, it's unlikely that the training experience will transfer back to the job, or that it will stick.

It's true that many people LOVE shiny new gadgets with lots of "bells and whistles." We're eager consumers of the next new thing to come along. But that's only one side of our infatuation with technology. The other side is that we live in a complex world and have to manage wild problems with limited time and resources. So when someone says they have a solution to a problem and then brings out a sample to cement the impression, we get interested.

Mobile apps and computer games are examples of this type of problem-solving technology. We need to be open-minded consumers, though, and try to understand the need before we commit to a course of action. Mobile apps are designed to remember and do things repeatedly and easily. They complement our human intelligence and free our minds for complex problem-solving and decision-making tasks.[4] But they're not the solution to everything learning-related.

Those stories and predictions are often not accurate, just like the stories and predictions that spread about eLearning. Digital devices are portable, and we've all seen and heard stories about the amazing things mobile apps can do, so it's easy to believe that mobile technology reduces the need to take people away from their jobs for training. The research, however, indicates that mobile technology is not the answer for every learning need. There's convincing evidence that the learning environment is a key factor in learning and retention, and the science of learning tells us that we can hold information in short-term memory for only a short time. So Salas is also right about what is needed to make training stick. To get better ROI for training, we need to practice what the research has proven and provide learners with time and opportunities to integrate new skills and knowledge with their prior experiences.


[2] Gilbert, Thomas F. (2007). Human Competence: Engineering Worthy Performance, Tribute Edition. San Francisco: Pfeiffer, p. 297.

[3] Thomas F. Gilbert's book Human Competence: Engineering Worthy Performance was first published in 1978. New editions are available.

[4] Quinn, Clark N. (2011). Designing mLearning. San Francisco: Pfeiffer.

Front-line Workers: Ensuring Their Success


Front line is a term often associated with military action. It refers to the most advanced tactical combat units, those situated in the zones, or borderlands, where the potential for conflict is highest. The term front line derives from frontier, a word the French used to describe the "prow of a ship or the front rank of an army."[1]

Today's front-line workers hold the most important and active positions in a job or industry. They're in the zones where conflict and interactions are most likely to happen: on sales and manufacturing floors, in customer service departments, and behind store counters. They're the on-the-scene news reporters covering developing stories, the teachers in classrooms, and the sports teams' defensive lines. What front-line workers do, how they engage with customers, and how they perform in their jobs contributes directly to an organization's success or failure. This is not always recognized.

In a recently published article from the Korn Ferry Institute, Michael Hyter wrote that organizations "underestimate the value contributed by the 'vital many.'" Hyter goes on to say that each individual is capable of contributing more, and that while "developing future managers and executives is crucial, an overemphasis on that segment of the workforce means too many individuals plateau at levels of contribution far below what they could deliver."

Peter Drucker, a leader in the development of management education, said in the late 1980s that "The more knowledge-based an institution becomes, the more it depends on the willingness of individuals to take responsibility for contribution to the whole, for understanding the objectives, the values, the performance of the whole, and for making themselves understood by…the other knowledge people in the organization."[2] He also said, "The productivity of the newly dominant groups in the workforce, knowledge and service workers, will be the biggest and toughest challenge facing managers."[3] Drucker seems to be describing the challenges front-line managers face, as well.

In most industries, the shift from manual labor to knowledge and service work is complete. But as most of the workforce knows from experience, the fast pace required just to keep up with roles and responsibilities leaves little or no time to learn "the next new thing" or improve an existing skill. Yet learn we must, or suffer the consequences of being left behind. This is, I believe, a primary reason that managing productivity is such a huge challenge.

Excellence at Work Can Be the Norm

How can we help front-line workers be more productive and enable them to meet organizational and individual goals? We need to create frameworks for continuous improvement and give workers time and opportunities to learn and apply new concepts and skills; then we need to support their performance.

Continuous improvement has its roots in the lean work system that Toyota developed at the end of World War II in response to severe financial and material limitations. Its focus is on value generation and growth. Today, lean principles have expanded beyond the manufacturing industry and have become a philosophy of lean management.[4]

In the lean philosophy, value originates with the customer. This is the philosophy behind the lean just-in-time practice of filling customer requests through a series of synchronized steps designed to improve efficiencies and avoid waste in manufacturing. We see this philosophy at work in coffee shops that fill individual coffee orders and when airlines adjust flights and seat capacity to manage passenger travel and maximize capital investments.

When knowledge and service workers are performing well in their front-line positions, they contribute to their organization’s success by building relationships and taking actions that deliver value to the customer.

Creating Environments Where Success Can Grow

To support the organization's valued relationships with clients and customers, workers also need to be in a growth state of continuous improvement. Organizations can support workers' continuous improvement through:

  • Education that broadens and deepens their knowledge of the organization so they understand how actions impact outcomes
  • Training to reduce the variability in their performance
  • Respect for them and the tasks they perform
  • High-quality, relevant feedback from within the organization that is both positive and constructive

These investments in people make sense because they promote a culture of continuous improvement, increased productivity, and engaged workers.

Supporting Performance

Training, education, and skill-building activities are seldom, if ever, just one-and-done activities. They're ongoing, and they require ongoing support. Clark Quinn, author of Designing mLearning, said, "…the amount of change—amount of information, complexity of solutions, and speed of competitive response—means that just executing against prepared plans is no longer sufficient…continual innovation will be required." Of course, we all realize this, which may be one reason we look at all the "shiny new gadgets" when they become available. We hope some of these gadgets can help us meet our needs.

Interestingly, front line also means "cutting edge." In that context, it's where we look for the most recent scientific information, innovations, state-of-the-art tools, and the best techniques to help us close the gaps and make progress toward goals. This is the front line we need to be aware of when we design and develop the education and training used to prepare front-line workers for their roles and responsibilities. But we need to go beyond awareness and understand how to choose appropriate techniques and tools from among all the bright, shiny objects available to us.

Technology provides us with new and better tools that make it easier to accomplish certain tasks, but technology is only part of any solution. Quinn suggests that learning and development professionals should "draw on updated models of cognition and learning, and emerging technology capabilities, to propose new approaches to achieving our goals."

Quinn spoke to a group of technical training professionals in Minneapolis last week. One of my takeaways from his presentation was that mLearning ("any activity that allows individuals to be more productive when consuming, interacting with, or creating information, mediated through a compact digital portable device that the individual carries on a regular basis")[5] is here to stay, but we need to do the right things with it. That will be the topic of the next post on Pinnacle Point of View.


[1] www.etymonline.com

[2] Drucker, Peter (2003). The New Realities, Revised ed. New Jersey: Transaction Publishers.

[3] Drucker, Peter (1992). Managing for the Future. New York: Routledge.

[4] Knuf, J. and Lauer, M. (2006). "Normal Excellence: Lean Human Performance Technology and the Toyota Production System" in James A. Pershing (Ed.), Handbook of Human Performance Technology, 3rd ed. (pp. 717-742). San Francisco: Pfeiffer.

[5] eLearning Guild 360 Mobile Learning Research Report, 2007 in Quinn, Clark N. (2011). Designing mLearning. San Francisco: Pfeiffer.

Is Seeing Learning, or Is Seeing Entertainment?


A picture is worth a thousand words, and people do learn better from words and graphics than from words alone. In situations where learners are novices, where content is complex (such as describing what astronauts wear in space), or where the rate of presentation is not in the learner's control, learning can be improved by using visuals in ways that reduce cognitive load.[1] But when, where, and why to use visuals, what kind of visuals to use, and how to use them are all important factors that we shouldn't ignore if we want—and expect—learning to be effective.

Cognitive load is the amount of effort required of working memory, and working memory has very limited capacity. The fact that we have trouble remembering more than a handful of numbers at a time (the classic estimate is seven, plus or minus two; more recent estimates are closer to four) is an example of working memory limits. And information that enters working memory has a short shelf-life. If too much information comes in during a short period of time, it will be lost. For information and new knowledge to "stick," the brain needs time to process and organize it so it can be retrieved from long-term memory and used when needed. If the processing and organizing doesn't take place, the information becomes useless brain clutter—fragments that aren't connected in meaningful ways to other information—or it's simply forgotten.
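
To make those limits concrete, here's a minimal toy model in Python. It's only a sketch of the ideas above, not a validated cognitive model; the four-item capacity and the all-or-nothing "rehearsal" rule are simplifying assumptions made for this post.

```python
from collections import deque

# Illustrative working-memory capacity. Recent research estimates about
# four items; the classic figure is seven, plus or minus two.
WM_CAPACITY = 4

def study(items, rehearsed):
    """Toy simulation: items flow through working memory; only items the
    learner has time to rehearse are encoded into long-term memory."""
    working = deque()
    long_term, lost = [], []
    for item in items:
        if len(working) == WM_CAPACITY:      # working memory is full...
            displaced = working.popleft()    # ...so the oldest item is pushed out
            if displaced not in long_term:
                lost.append(displaced)       # never encoded: it's gone
        working.append(item)
        if item in rehearsed:                # processing time was available
            long_term.append(item)
    # whatever remains in working memory decays unless it was encoded
    lost.extend(i for i in working if i not in long_term)
    return long_term, lost

# Eight facts arrive quickly, but the learner only has time to rehearse three.
facts = [f"fact_{n}" for n in range(8)]
kept, gone = study(facts, rehearsed={"fact_0", "fact_3", "fact_7"})
print("remembered:", kept)  # ['fact_0', 'fact_3', 'fact_7']
print("lost:", gone)        # the five unrehearsed facts
```

The point of the sketch is the bottleneck: no matter how much information arrives, only what gets processing time survives.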

There are two factors that determine a learner's potential cognitive load: their prior knowledge and the complexity of the content. If the learner has little prior knowledge but the content is simple, their cognitive load is lower. If the learner has little or no prior knowledge and the content is complex, their cognitive load is greater. For this reason, we need to understand the learner's prior knowledge. Then, the challenge is to understand the why, when, where, what, and how of using visuals to address their needs and better enable them to execute on the business's strategy. To keep this post focused, only the "why" factor is discussed here.
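
This two-factor relationship can also be sketched in a few lines of code. The example below is purely illustrative: the 0-to-1 scales, the formula, and the thresholds are assumptions made for this post, not a validated instrument.

```python
def estimate_cognitive_load(prior_knowledge, content_complexity):
    """Illustrative only: load rises with content complexity and falls
    as prior knowledge increases. Inputs are assumed 0.0-1.0 scales."""
    load = content_complexity * (1.0 - prior_knowledge)
    if load < 0.25:
        return "lower"
    if load < 0.50:
        return "moderate"
    return "greater"

# A novice facing complex content carries the greatest load...
print(estimate_cognitive_load(prior_knowledge=0.1, content_complexity=0.9))  # greater
# ...while the same content is far less taxing for an experienced learner.
print(estimate_cognitive_load(prior_knowledge=0.8, content_complexity=0.9))  # lower
# Simple content keeps the load low even for a novice.
print(estimate_cognitive_load(prior_knowledge=0.1, content_complexity=0.2))  # lower
```

What matters is the interaction: the same content can impose very different loads depending on what the learner already knows.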

Why: To Engage, Explain, Educate, or Entertain?

As already stated, visuals are used to engage and explain. The show business industry—live theater, movies, television, Disney’s empire, for example—is successful because it provides engaging entertainment, and most people, especially children, love to be entertained! The desire to engage learners has morphed into a demand for training that entertains. This is the motivation behind “edutainment,” a hybrid of education and entertainment that uses entertaining content to educate.[2]

There's a spectrum of edutainment. At one end is the eLearning module that's very entertaining (engaging) and contains only a small amount of educational content. At the other end is the course that's very educational and contains only a small dose of entertainment. One factor that determines where a course falls on the spectrum is budget. If the entertainment value is high, with high-definition graphics and high-level animations, the cost to produce the course is also high. If the content is primarily educational and simple line drawings are used to add visual interest, the production cost is considerably lower.

The purpose of edutainment is to attract and hold the learner’s attention by engaging their emotions.[3] And it works! We begin teaching children before they enter elementary school that learning is fun and entertaining. But there’s a downside to entertaining rather than educating; it’s what happens when the balance of entertainment and education is skewed too much toward fun and entertaining. People not only learn less, their ability to learn actually decreases.

Entertainment, and this includes videos, pushes content to passive learners and requires only their attention in return. This applies to adults as well as to children.

A research study on the effect of using video-based cases instead of text-based cases to teach medical students revealed that video-based cases disrupt the deep critical thinking needed to solve problems.[4] The medical students preferred the video-based cases and felt watching videos was a better use of their time, because it takes longer to read a case than to watch a video about it. But the outcome of replacing text-based cases with video-based cases was negative. This is not an isolated finding.

A meta-analysis on the use of multimedia formats in distance learning showed that courses reporting the highest level of student engagement and interest also showed lower measures of achievement. The shift from deep to superficial thinking, when video replaces text, suggests that learners may be distracted by the content. Extraneous (entertaining) content and information competes for cognitive resources in working memory and disrupts the process of organizing and remembering information.[5]

In summary, visuals—illustrations, photographs, videos, and animations—can enhance learning and reduce cognitive load when they’re used in the right context and at the right time. The instructional designer’s task is to understand the context and the learner, and to use this information to design an appropriate solution to help enable learners to perform tasks that are aligned with business strategy.

In upcoming posts, best practices for how, when, where, and what type of graphics to use will be discussed. If you want to dig into this topic on your own in the meantime, we recommend the following:

  • Graphics for Learning by Ruth Colvin Clark and Chopeta Lyons
  • The Art of Explanation by Lee LeFever
  • The Back of the Napkin by Dan Roam
  • Multimedia for Learning by Stephen M. Alessi and Stanley R. Trollip

[1] Clark, Ruth Colvin and C. Lyons (2004). Graphics for Learning: Proven Guidelines for Planning, Designing, and Evaluating Visuals in Training Materials. San Francisco: Pfeiffer, p. 105.
[3] Okan, Zühal (2003). "Edutainment: Is Learning at Risk?" British Journal of Educational Technology, 34(3), 255-264.
[4] Basu Roy, R., and McMahon, G. (2012). "Video-based cases disrupt deep critical thinking in problem-based learning." Medical Education, 46(4), 426-435.
[5] Ibid.

Too Much Information Interferes with Thinking

Ask anyone if having too much information is a problem and you'll hear the same answer: Yes! It's a problem that has many of us crying "Uncle!" And we do so for good reason. It's hard to think straight when information zings at us on hundreds of channels and dozens of devices all day long.

Neuroscientist Angelica Dimoka of the Center for Neural Decision Making at Temple University suspected that something biological happens when people are overstimulated with information. To verify her suspicion, she used an fMRI machine to study the brain activity of people who were bidding on items in an auction, a taxing task. She found that as the information load increased, with more items offered at a faster pace, activity also increased in the prefrontal cortex of the brain, the part responsible for emotional control and decision making. But then something strange happened. As bidders were given more and more information, activity in the prefrontal cortex suddenly fell off. Bidders started making stupid mistakes and bad choices.

It's absolutely true—the research shows it—that when people receive too much information, their decisions make less and less sense. Too much information interferes with people's ability to sustain focus, make sound decisions, and minimize stress: three important skills in life and in the workplace.
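
Dimoka's finding describes an inverted-U shape: decision quality improves as information arrives, up to a point, and then collapses. The toy curve below makes that shape easy to see; the formula and the "sweet spot" of five items are illustrative assumptions for this post, not values fitted to her fMRI data.

```python
def decision_quality(info_items, sweet_spot=5.0):
    """Toy inverted-U relating information load to decision quality.
    Quality peaks at the (assumed) sweet spot and degrades on both sides."""
    return max(0.0, 1.0 - ((info_items - sweet_spot) / sweet_spot) ** 2)

for n in [1, 3, 5, 8, 12]:
    print(f"{n:>2} items -> quality {decision_quality(n):.2f}")
# Quality climbs (0.36, 0.84) to a peak at 5 items (1.00),
# then falls off (0.64) and bottoms out (0.00) as load keeps rising.
```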

How We Can Help People Think Better

As workplace learning professionals, we need to keep that fact in mind as we design learning opportunities for people. People do need nearly continuous learning to stay relevant in their jobs, because change has accelerated as more information has become available. More information, more change—it's a vicious cycle. But we can acknowledge and work with the limits of the human brain.

The once-alluring theory that we use only 10 percent of our brain's capacity has been dispelled as a myth. According to John Henley, a neurologist at the Mayo Clinic in Rochester, Minnesota, brain imaging shows that over the course of a day, you and I use 100 percent of our brains. Even when we're asleep or resting, our brains are busy. So even if we want to activate more brain power, we can't just try harder to overcome what many have assumed (incorrectly) is a deficiency in motivation.

What we can do is own up to our limits. Our brains struggle to take in and organize all the information that comes at us, and our nervous systems get overstimulated. The popular term for too much information is "information overload." According to neuroscientists, the more accurate term for having too much information pushed on us in too short a time is "cognitive overload." Conscious intellectual activity wears us out, because while the brain weighs only about three pounds, it uses 20 percent of the body's available energy.

Limit Cognitive Load for Better Learning

One of the best, although not easiest, ways to manage cognitive load for learners is to limit what they are exposed to in any given learning situation. We can do this by limiting information to only what is essential and not asking people to process extraneous information. As designers of learning, we need to be the filter for what's best and leave out the rest. Granted, designing learning with this thought in mind is challenging, because we must make deliberate decisions about what to leave in and what to omit. It's easy to succumb to the notion that everything has value when, in fact, that's not true.

Performance Support

We can also minimize cognitive load for people by embedding performance support in their work environments. The notion of performance support isn't new. Job aids, those unobtrusive and accessible support items, have been used for decades. Designed to provide just-in-time information, such as what step to take next, who to call, or what the equivalent weights and measures are, job aids help people complete important tasks. They're used primarily as reminders that supplement training, because not everything that's learned in training "sticks."

Until recently, performance support has been considered an add-on to learning courses—an option. Thinking about the real value of performance support is evolving rapidly, though. Some workplace learning professionals, like Bob Mosher, chief global learning evangelist at Ontuitive, believe we should replace training with performance support. Others take a more moderate view. Allison Rossett, professor of educational technology at San Diego State University, believes that performance support and training should be an integrated package.

There’s room in workplace learning for both approaches. Sometimes, performance support is enough. Other times, embedding performance support in training makes a lot of sense. This is when we need to really discriminate and design for what is essential to know or do—to manage information and cognitive load. When we do that, we can increase the probability that important information will stick and that workers will know when, where, and why to access performance support, because it’s been designed to work that way.