December 2010


Last night I heard a story on NPR about the creative partnership of Steve Jobs and Jonathan Ive.

Some excerpts I found noteworthy:

The two men seemed to agree on a basic philosophy about design and products.

This is critical to any partnership.  In fact, you can’t have an effective partnership without this.  A partnership may work in the short term without it, but partnerships that last need to be aligned philosophically.

Ive says when he thinks about design he thinks about details that matter to users.

I believe this has a lot to do with Apple’s success.  Most companies view product development from the developer’s point of view and don’t give enough consideration to users.  All you have to do is look at where the innovation is coming from to see evidence of this.  Who is creating the “hot” products?  Where is the buzz coming from?  Who is thriving in this economy?  How does this happen?  Being hip and creating buzz is not the point of Apple’s approach, but it helps get your ideas to market and ultimately improves sales.  In the story Ive is quoted as saying that a lack of consideration for users shows that designers don’t care.

Both Ive and Jobs believe that the hardware must work seamlessly with the software.  The iPod and the iPhone are part of a system that includes a music store, a video store, and a book store.

In retrospect this may not seem as revolutionary as it really was.  Apple was simultaneously creating a device that would completely change the way consumers enjoy music, television, and movies and creating a mechanism that enabled users to seamlessly purchase media for this device.  The device was not created separately from the store.  They were developed at the same time.  This is a great example of design thinking.  Continuing to ask questions.  Continuing to challenge assumptions.  Continuing to recognize and create new opportunities.

In case you can’t tell, I’m a big fan of this kind of thinking.

Yesterday NPR presented a story on the positive effects of video games.  According to research cited in the story, “video gamers show improved skills in vision, attention and certain aspects of cognition.”  This also has a positive impact on real-world skills involving “attention, speed, accuracy, vision and multitasking.”

I read a book a few years ago that suggests there are other benefits to the design of today’s video games.  In his book Everything Bad Is Good for You, Steven Johnson suggests that elements of our modern culture are making us smarter.  One example he cites is video games.  Below are some quotes from his book.

Start with the basics: far more than books or movies or music, games enable you to make DECISIONS.  Novels may activate our imagination, and music may conjure up powerful emotions, but games force you to decide, to choose, to prioritize.  All the intellectual benefits of gaming derive from this fundamental virtue, because learning how to think is ultimately about learning to make the right decisions: WEIGHING EVIDENCE, ANALYZING SITUATIONS, CONSULTING YOUR LONG-TERM GOALS, AND THEN DECIDING.
p. 40 (Emphasis mine)

It’s not WHAT you’re thinking about when you’re playing a game, it’s THE WAY you’re thinking that matters.
Here’s John Dewey, in his book Experience and Education: ‘Perhaps the greatest of all pedagogical fallacies is the notion that a person learns only that particular thing he is studying at the time.  COLLATERAL THINKING in the way of formation of enduring attitudes, of likes and dislikes, may be and often is much more important than the spelling lesson or lesson in geography or history that is learned.  For these attitudes are fundamentally what count in the future.’
p. 41 (Emphasis mine)

If you stopped playing in the early 90s, or if you only know about games from secondhand accounts, you’d probably assume that the mid-game objectives would sound something like this: Shoot that guy over there! Or: Avoid the blue monsters! Or: Find the magic key!
But interrupt a player in the middle of a Zelda quest, and ask her what her objectives are, and you’ll get a much more interesting answer.  Interesting for two reasons: first, the sheer number of objectives simultaneously at play; and second, the nested, hierarchical way in which those objectives have to be mentally organized.
pp. 48-49

What does this mean for learning?  I believe learners respond more positively to content when they are actively engaged with it rather than passively receiving information.  While there is a place for traditional training where the instructor lectures, learners WANT to be challenged.  This is particularly true for millennials, who grew up in the age of video games.  This does not mean organizations need to make a large investment in technology.  It means learning opportunities need to reflect real-world situations where right answers are not always clear.

A popular way to design this kind of learning is problem-based learning.  In this approach learners are presented with a problem that does not have a clear solution or path to a solution.  Learners must work individually or as a team to solve the problem.  As they work toward a solution they must find information that helps them achieve their goal.

Here is a post I wrote on problem-based learning.
Here is the Wikipedia page on problem-based learning.  It provides links to other sites if you want to learn more.

While I don’t agree with everything in this article, the basic point is noteworthy: “creating and maintaining an effective culture of commitment and engagement takes effort from leaders who work closely with employees.”  In supporting this point, the author makes some other points I agree with.

Here is an example: “Leaders need to work on creating excitement and enthusiasm,” Hunter said. “Be clear about the future you want to create. Be clear about what’s in it for everybody in the company. Build a sense of team. Have people feel like they’re a part of what’s going on; include them; have them feel acknowledged and appreciated. When you do that, that’s the formula for success.”

One of the most precious organizational assets is knowledge.  How well leaders optimize and convert knowledge to sustained employee performance is one measure of success, reflected by employees’ knowledge management capabilities, their ability to learn, support of their own performance and collaboration with experts.

The quote above is from an article in the current edition of Chief Learning Officer titled The Learning Ecosystem.  I recommend reading the entire article.  It is thorough, well thought out, and provides real-world examples of organizations that overcame obstacles to implement a learning ecosystem that meets their needs.

The challenge to organizational leaders is fostering a culture that empowers individuals, rewards sharing, and encourages transparency.  There is no recipe for doing this.  The first step is for leaders to recognize the needs of the organization, how it could work better, and the benefits of greater collaboration.

Throughout the article the author sprinkles references to ISPI standards.  If you find one, mention it in the comments.

…one common occurrence that warns you that a performance discrepancy may be lurking around is the announcement that takes some form of “We’ve got a training problem.”  Someone has detected a difference between what is desired and what is actually happening.

But statements such as “We’ve got to train/teach…” are pits into which one can pour great amounts of energy and money unproductively.  Such statements talk about solutions, not problems.  Training (teaching, instruction) is a solution, a remedy – a procedure used to achieve desired results.  It implies transferring information to change someone’s state of knowledge or ability to perform.

But lack of information is often not the problem.

Robert Mager/Peter Pipe, Analyzing Performance Problems, p. 8

If lack of information is not the problem, how do you find out what the problem is?  One of the first questions one should ask when looking at a performance problem is, “What does ‘right’ look like?”  In performance terms “right” is defined as optimal performance.  Actual performance is the current way the work is being done.  When starting a project, one should not assume the problem lies in the “actual” way the work is being done.  One should also not assume that training will resolve any perceived problem.  The problem may have a cause nobody has considered that cannot be resolved through training alone.

A fundamental step in understanding a performance problem is documenting the actual and optimal performance.  The difference between actual and optimal performance is the performance gap.  In many organizations it is likely that optimal performance has been documented at some point.  This can be in the form of a manual or an existing training course.  One should consider that the problem may be caused by a lack of accountability to existing standards or failure to send people through existing training.  Reviewing existing documentation can provide a good starting point for any needs analysis.  It provides valuable background and a reference point from which to begin data collection.  However, one should also consider other factors that may be causing a performance problem.
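To make the gap concrete, here is a minimal sketch in Python of what documenting a performance gap might look like.  The metric names and values are hypothetical; in practice the optimal figures come from documented standards and the actual figures from observation and measurement.

    # A minimal sketch of documenting a performance gap.
    # All metric names and values are hypothetical examples; real figures
    # come from standards, documentation, and observation.

    # Optimal performance, as documented in standards or existing training
    optimal = {"orders_processed_per_hour": 12.0, "first_call_resolution_pct": 85.0}

    # Actual performance, as observed on the job
    actual = {"orders_processed_per_hour": 9.0, "first_call_resolution_pct": 70.0}

    # The performance gap is the difference between optimal and actual
    for metric, target in optimal.items():
        observed = actual[metric]
        gap = target - observed
        print(f"{metric}: optimal={target}, actual={observed}, gap={gap}")

Note that a sketch like this only documents the size of the gap; it says nothing about the cause, which is the harder question.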

The focus of this post is how to document and analyze a performance gap.  I will cover how to close a gap in a later post.

There are many ways to approach a gap analysis.  To understand an issue effectively, a combination of analysis techniques is usually required.  Below are seven types of analysis ISPI recognizes, with a brief sketch after the list of how they might be organized into an analysis plan:

  • Job or Task Analysis—Identifies the required activities, information, processes used, and outputs produced and then compares that to actual practice.
  • Process Analysis—Identifies the cycle time compared to process time; time at task compared to time on rework, waiting, or checking; resources consumed and the cost of those resources; and what drives activity (customer or product requirements).
  • Work Environment Analysis—Identifies and evaluates the effectiveness and efficiency of feedback, the reward and incentive system, information and communication systems, work and process designs, and work tools and equipment.
  • User or Audience Analysis—Identifies current expectations, perceptions, physical capability and capacity, and knowledge and skills.
  • Communication Systems Analysis—Identifies and evaluates the availability, capability, capacity, upgradability, and cost to use and maintain.
  • Market Analysis—Identifies the size, competition, growth, current and potential constraints or limitations, organizational expectations, initiatives, capabilities, and capacity.
  • Data System Analysis—Identifies and evaluates the capability, capacity, availability, upgradability, and cost to use and maintain.
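Here is the sketch mentioned above: a hypothetical example of how a few of these analyses might be organized into a simple plan.  The analysis names come from the list; the guiding questions are paraphrases of the descriptions, and the selection itself is invented for illustration.

    # A hypothetical analysis plan drawn from the ISPI list above.
    # The analysis names come from the list; the guiding questions are
    # paraphrases, and the selection is invented for illustration.
    analysis_plan = [
        ("Job or Task Analysis",
         "What activities, information, and outputs does the job require, "
         "and how does that compare to actual practice?"),
        ("Work Environment Analysis",
         "Do feedback, rewards, information systems, and tools support "
         "the desired performance?"),
        ("User or Audience Analysis",
         "What do performers currently expect, perceive, and know?"),
    ]

    for name, question in analysis_plan:
        print(f"{name}: {question}")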

Speaking from experience, it is unlikely that a performance specialist would possess all the skills required to successfully complete each of these forms of analysis.  This is one reason why ISPI emphasizes partnerships.  If the stakeholders on a project feel strongly that any of the methods listed above are necessary to gather data, they should also be prepared to employ a specialist to complete the data collection.  Although the work can be done by a layperson, it will add time and may impact the overall quality of the data.  Ultimately the decision is a matter of cost and the benefits derived from the expenditure.

Regardless of who is conducting the analysis, the participants play a critical role in the quality of the data.  Participants should be representative of the entire team in knowledge, experience, and responsibilities.  A control group can be helpful in validating and clarifying data.

Access is another critical success factor.  Being able to observe and interact with job performers in their work environment may be necessary.  However, striking the right balance between observing and interacting can be a challenge.  In my experience it is very important to be open with the participants about your needs and expectations.  It can be unsettling for an employee to be observed.  This is where it is critical to have a sponsor who can communicate with employees and address their concerns.  Interacting with participants may influence your data, so it should be approached with caution.  Direct contact with participants can also reduce their productivity, which may in turn degrade the quality of the data.

Once optimal performance is documented, the analysis should attempt to validate it while documenting the actual performance of workers.  This is where the control group is useful.  It is possible that changes have taken place since the previous standards were written that require an organization to revisit its expectations (optimal performance).  Finding out how to compensate for the gap is where the needs analysis can get complicated.  It would be a mistake to assume the gap is caused by a lack of knowledge.  Unfortunately, many organizations treat all performance issues this way.  If the problem is caused by performers lacking knowledge, it is logical to assume the solution is traditional training.  However, there may be another cause.  But that is a discussion for a future post.

I am continuing my review of ISPI’s performance standards.  The remaining standards follow a systematic process that is familiar to instructional designers, who refer to it as ADDIE (Add-ee).  This process consists of five steps or phases: Analysis, Design, Development, Implementation, and Evaluation.  It has stood the test of time and remains the basis for most instructional design efforts and discussions.  ISPI differs slightly with the addition of a standard for requirements.

Needs assessment or analysis is ISPI’s fifth performance improvement standard (the A in ADDIE).  This is a vast topic; entire books have been written on it.  Although it would be impossible to provide a thorough description of needs analysis in this space, I will attempt to provide enough information to help you understand the factors to consider and the procedures to follow in a successful analysis effort.

In my experience, the needs assessment begins with standard two, context.  In my post on that standard I wrote, “identifying and discussing potential barriers will help design an intervention that will achieve the desired outcome.” Barriers to performance are often indicators of what is causing a performance problem.  Although one must be careful not to draw conclusions based on incomplete information, the perspective gathered early in the process can help plan subsequent data collection and analysis.

Robert Mager is one of the best-known authors in performance assessment and instructional development.  In his book, Analyzing Performance Problems, he provides a process for analyzing a performance problem.  The first step in his process is to describe the performance problem.  While I agree that this is a prerequisite for a successful needs analysis, it is not always a fast or simple process.  It is often difficult to describe the actual problem but easy to describe the symptoms.  That is why it is critical to follow a systematic analytic process.  Since there are many approaches to needs analysis, I will take several posts to describe how to conduct a successful needs assessment.