ISPI Standards


As a Certified Performance Technologist, I often find it hard to explain exactly what I do.  Consequently, I'm always looking for examples of performance improvement in practice.  In the past week I've been working on a project that brings together many of the core elements of performance improvement.

Let me first summarize the project.  The company I work for is looking to expand its business into a new area.  We have pilot tested the approach to fine-tune our process and learn the best way to engage the market and carry out our plan.  The pilot team included representatives from our marketing, merchandising, and store operations departments.  I was added to the team after several pilot tests had been conducted (this happens in most projects I work on, even though I tell my colleagues they would get more value from my participation if I were included earlier).  After the pilot period was over, we began to craft the policy that would govern our new endeavor.

Through the end of the pilot period, I created and maintained a training guide on our internal website to help our pilot stores plan and conduct their events.  The guide was considered a draft and was being pilot tested along with the processes.  When the pilot period ended, work began on the final policy that would govern and guide the new initiative.  A key stakeholder on the project, who also initiated the effort, drafted the initial policy and gave it to me to review.

I took the draft and reviewed it with a colleague who has experience in the area of the new initiative.  As we reviewed the draft we documented areas of concern and noted issues that needed clarification.  With this information I set out to resolve the issues with the members of the team who either owned or had insight into each subject.  This is a key difference between training development and performance improvement.  The first three of ISPI's performance standards are focusing on results, taking a systemic view, and adding value.  Training development is fundamentally the transfer of content from one form to another, without regard for results, which departments are involved, or the contribution made by the developer.  I do not mean to cast aspersions on training developers.  I am simply drawing contrasts between training development and performance improvement.

As I worked through the issues, we reduced the number of people who needed to be notified of events, eliminated unnecessary or redundant reports, confirmed the involvement of part-time employees with HR, reviewed how products are handled and tracked, and addressed how to handle exceptions that are likely to arise (such as special requests).

Embedded in this process were the remaining performance improvement standards: determining (or anticipating) the cause of potential performance issues, ensuring the process is feasible and can be easily implemented, and building in mechanisms to monitor the success of the process and policy.

We are still working to finalize the policy, and I am considering how best to communicate the procedure to the chain.  This will follow the more familiar steps of instructional design (Analysis, Design, Development, Implementation, and Evaluation).  However, I am confident the training we create will be more effective due to my early involvement.

What are the benefits of having a CPT on this project?  I don't know what everyone else thinks, but here is my take.  First, I was able to work with all the stakeholders to identify and address conflicts or confusion.  Second, I could take an objective approach to the process (I often refer to myself as Switzerland).  Third, performance improvement focuses on results (ISPI standard #1).  Too often the training and policy for a new initiative focus on the activity involved, and the actual reason for the effort is lost.

There is so much I agree with packed into this post by Seth Godin.

The space matters

It might be a garage or a sunlit atrium, but the place you choose to do what you do has an impact on you.

More people get engaged in Paris in the springtime than on the 7 train in Queens. They just do. Something in the air, I guess.

Pay attention to where you have your brainstorming meetings. Don’t have them in the same conference room where you chew people out over missed quarterly earnings.

Pay attention to the noise and the smell and the crowd in the place where you’re trying to overcome being stuck. And as Paco Underhill has written, make the aisles of your store wide enough that shoppers can browse without getting their butts brushed by other shoppers.

Most of all, I think we can train ourselves to associate certain places with certain outcomes. There’s a reason they built those cathedrals. Pick your place, on purpose.

The first standard in human performance technology is "focus on results."  Based on that standard, the first question you should always ask yourself is, "What am I trying to accomplish?"  The second question should be, "How do I go about accomplishing it?"

Too many of us view our days as a series of items on a to-do list, measuring our success by the number of items we cross off.  Don't go through your day blindly moving from meeting to meeting, task to task, email to email.  Approach everything you do with intentionality.  Be fully present.

ISPI's eighth standard for performance improvement is development.  As you may have figured, this phase of a project deals with the actual creation of training materials.  According to ISPI it can be much more than training: "The output is a product, process, system, or technology."  This array of solutions underscores a key difference between human performance technology (HPT) and instructional design.  Where instructional design tends to focus exclusively on training and learning, HPT looks at the way work is done and the tools that are used.

As I have written before, a common mistake is to start with this step.  I stand by that position: starting with development usually results in unfocused materials and unmet expectations.  However, a recent trend in learning development is rapid prototyping, an approach that combines elements of analysis, design, and development.  By combining these into one process, the team is able to refine its expectations, goals, objectives, and materials as its understanding of the project grows.

In a traditional approach, commonly referred to as the waterfall method, a great deal of meeting time and resources is spent drafting goals and objectives.  The expectation is that each step flows flawlessly into the next.  Unfortunately, experience has shown this is not always the case.  When the team finally reaches the development step, there is no room to reconsider decisions made earlier.  If something was overlooked or the focus changes, significant rework is required, resulting in delays and cost overruns.

Rapid prototyping, also called iterative prototyping, puts the emphasis where it belongs: creating a solution that improves performance.  Through iterations, the design of the learning intervention is refined, expectations are clarified, and outcomes are documented.  The designer solicits feedback from stakeholders and subject matter experts to confirm acceptable design elements and to set direction for future iterations.
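To make the loop concrete, here is a minimal sketch of an iterative prototyping cycle in Python.  It is an illustration only: the names (Prototype, gather_feedback, iterate) are my own, and in practice each "iteration" is a human review session, not a function call.

    from dataclasses import dataclass, field

    @dataclass
    class Prototype:
        """A draft learning deliverable; a stand-in for real materials."""
        version: int
        notes: list[str] = field(default_factory=list)

    def gather_feedback(proto: Prototype) -> tuple[bool, str]:
        """Stand-in for a stakeholder/SME review session."""
        approved = proto.version >= 3  # pretend the third pass is acceptable
        return approved, f"review notes for v{proto.version}"

    def iterate(max_rounds: int = 5) -> Prototype:
        proto = Prototype(version=1)      # rough first pass, cheap to discard
        for _ in range(max_rounds):       # time-box keeps the team focused
            approved, notes = gather_feedback(proto)
            proto.notes.append(notes)
            if approved:
                return proto              # expectations and materials converged
            proto = Prototype(version=proto.version + 1, notes=proto.notes)
        return proto                      # time-box exhausted; escalate

    print(iterate().version)  # -> 3 under these toy assumptions

The point of the sketch is the shape of the process: draft cheaply, review, keep the notes, and be willing to throw the draft away.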

The strength of an iterative approach is that it emphasizes what is important (the actual learning materials) and focuses the team's energy there.  It is also less labor intensive.

What are the keys to success when applying an iterative methodology?

  • Create.  Don't just be creative.  An idea is only useful if it results in something.
  • Discard.  Be willing to throw out unproductive ideas.
  • Learn.  An iterative approach is exploratory, and prototypes are disposable.
  • Focus.  Appoint a timekeeper or reality checker.  If the group has spent too much time on an idea without producing anything usable, it is that person's job to say so.

A key word in this series is "design."  It would be understandable to ask, "When is he going to get around to discussing design?"  I'm almost there.  Before I do, I want to review where we have been in previous posts.

Whenever an organization is dealing with a performance problem, the focus must be on results (ISPI standard #1).  You may think training is required.  You may discuss who is or is not doing what they're supposed to do.  Some may suggest investing in new software or systems.  Resist these temptations.  Focusing your efforts on results will put your discussions into the right context.  This will enable you to collect the right information, understand the true cause of the problem, and come up with a solution that will achieve the desired results.

Focusing on improved results sets the tone for the entire effort.  You must also consider the situation or context (ISPI standard #2) and decide what resources are required to effectively achieve the desired results.  With a clear understanding of the context and the right people on the project (ISPI standard #4), it is time to do a detailed analysis of the problem (ISPI standard #5).  Your preliminary research and partnerships will help.  Throughout all of this, resist the temptation to draw conclusions too soon.  Patterns will emerge.  Solutions will seem appropriate and attractive.  Wait until you have all the data and have analyzed it before you draw conclusions.  Let the data reveal the true nature of the problem and what is causing it (ISPI standard #6).

I can't tell you the number of times I have seen an organization decide training will solve a problem without any idea what results they are trying to achieve.  They see that something is going wrong, so they automatically assume training will fix it.  What do you train on?  Who decides what the training should be about?  How will you know the participants got what they're supposed to get?  To find the right answers, you have to ask the right questions.  Here are some ideas to get started (a small sketch of how to keep them as a reusable checklist follows the list).

  • What are your expectations (be specific)?
  • What does “wrong” look like?
  • What does “right” look like?
  • Who is doing it "right"?
  • Why is this person doing it "right"?
  • Why can they do it “right” and others can’t?
  • How will we know we have achieved our goal?
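For teams that want to reuse these questions across interviews, here is a minimal sketch of such a checklist in Python.  The structure and names (QUESTIONS, record_interview) are my own illustration, not an ISPI-prescribed format.

    QUESTIONS = [
        "What are your expectations (be specific)?",
        'What does "wrong" look like?',
        'What does "right" look like?',
        'Who is doing it "right"?',
        'Why is this person doing it "right"?',
        'Why can they do it "right" and others can\'t?',
        "How will we know we have achieved our goal?",
    ]

    def record_interview(stakeholder: str, answers: list[str]) -> dict:
        """Pair each question with one stakeholder's answers for later comparison."""
        if len(answers) != len(QUESTIONS):
            raise ValueError("provide one answer per question")
        return {"stakeholder": stakeholder,
                "responses": dict(zip(QUESTIONS, answers))}

Collecting answers in the same shape from every stakeholder makes it easier to spot where expectations conflict.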

This process does not have to take months to complete.  Depending on the scope of the situation, performance improvement can be achieved in weeks or possibly even days.  Do not automatically assume that this process will consume a lot of time and resources.  It is not unusual for this process to save money.

This is a time-tested approach to achieving a successful outcome.  It isn't always glamorous, but it works.  With the review behind us, let's get on with the discussion of ISPI standard #7, design.

To resume my overview of ISPI's performance standards, this post focuses on cause analysis.  According to ISPI, "some causes are obvious, such as new hires [who] lack the required skills to do the expected task."  Training should be a part of the solution for onboarding new hires.  However, one should not assume training will resolve every need a new hire, or any other employee, has.

What is implied in the quote above, but not explicitly stated, is that underperformance can be caused by factors that are not fully resolved through training.  These can include organizational priorities, lack of access to resources or individuals, inadequate or defective tools, poor morale, ineffective incentives, or ill-defined processes.

Settling on a solution because it seems right, or because it is the way you have always done things, is a common mistake.  I encourage you to look beyond the obvious and explore what else could be causing a performance problem.  This may take some effort and will definitely be challenging, but you will be well on your way to a solution that will actually improve performance.

…one common occurrence that warns you that a performance discrepancy may be lurking around is the announcement that takes some form of “We’ve got a training problem.”  Someone has detected a difference between what is desired and what is actually happening.

But statements such as “We’ve got to train/teach…” are pits into which one can pour great amounts of energy and money unproductively.  Such statements talk about solutions, not  problems.  Training (teaching, instruction) is a solution, a remedy – a procedure used to achieve desired results.  It implies transferring information to change someone’s state of knowledge or ability to perform.

But lack of information is often not the problem.

Robert Mager/Peter Pipe, Analyzing Performance Problems, p. 8

If lack of information is not the problem, how do you find out what the problem is?  One of the first questions to ask when looking at a performance problem is, "What does 'right' look like?"  In performance terms, "right" is defined as optimal performance.  Actual performance is the current way the work is being done.  When starting a project, one should not assume the problem lies in the "actual" way the work is being done.  Nor should one assume that training will resolve any perceived problem.  The problem may have a cause nobody has considered, one that cannot be resolved through training alone.

A fundamental step in understanding a performance problem is documenting the actual and optimal performance.  The difference between the two is the performance gap.  In many organizations, optimal performance has likely been documented at some point, perhaps in a manual or an existing training course.  Consider whether the problem is caused by a lack of accountability to existing standards or by a failure to send people through existing training.  Reviewing existing documentation can provide a good starting point for any needs analysis: it provides valuable background and a reference point from which to begin data collection.  However, one should also consider other factors that may be causing a performance problem.
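To make the idea of a gap concrete, here is a minimal sketch in Python.  The metrics and numbers are invented for illustration; real gap analyses mix quantitative measures like these with qualitative findings.

    # Documented (optimal) standards versus observed (actual) performance.
    optimal = {"orders_per_hour": 12, "error_rate_pct": 1.0, "rework_hours_per_week": 2}
    actual  = {"orders_per_hour": 9,  "error_rate_pct": 4.5, "rework_hours_per_week": 7}

    # The performance gap is the difference between actual and optimal.
    gap = {metric: actual[metric] - optimal[metric] for metric in optimal}

    for metric, delta in gap.items():
        print(f"{metric}: {delta:+}")
    # orders_per_hour: -3          (actual falls short of the standard)
    # error_rate_pct: +3.5         (errors exceed the standard)
    # rework_hours_per_week: +5    (rework exceeds the standard)

The arithmetic is trivial; the hard work is agreeing on what "optimal" is and measuring "actual" honestly.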

The focus of this post is how to document and analyze a performance gap.  I will cover how to close a gap in a later post.

There are many ways to approach a gap analysis.  To understand an issue effectively, a combination of analysis techniques is usually required.  Below are seven types of analysis ISPI recognizes:

  • Job or Task Analysis—Identifies the required activities, information, processes used, and outputs produced and then compares that to actual practice.
  • Process Analysis—Identifies the cycle time compared to process time; time at task compared to time on rework, waiting, or checking; resources consumed and the cost of those resources; and what drives activity (customer or product requirements).
  • Work Environment Analysis—Identifies and evaluates the effectiveness and efficiency of feedback, the reward and incentive system, information and communication systems, work and process designs, and work tools and equipment.
  • User or Audience Analysis—Identifies current expectations, perceptions, physical capability and capacity, and knowledge and skills.
  • Communication Systems Analysis—Identifies and evaluates the availability, capability, capacity, upgradeability, and cost to use and maintain.
  • Market Analysis—Identifies the size, competition, growth, current and potential constraints or limitations, organizational expectations, initiatives, capabilities, and capacity.
  • Data System Analysis—Identifies and evaluates the capability, capacity, availability, upgradeability, and cost to use and maintain.

Speaking from experience, it is unlikely that one performance specialist would possess all the skills required to complete each of these forms of analysis successfully.  This is one reason ISPI emphasizes partnerships.  If the stakeholders on a project feel strongly that any of the methods listed above is necessary to gather data, they should also be prepared to engage a specialist to complete the data collection.  Although the work can be done by a layperson, doing so will add time and may reduce the overall quality of the data.  Ultimately, the decision is a matter of cost and the benefits derived from the expenditure.

Regardless of who is conducting the analysis, the participants play a critical role in the quality of the data.  Participants should be representative of the entire team in knowledge, experience, and responsibilities.  A control group can be helpful in validating and clarifying data.

Access is another critical success factor.  Being able to observe and interact with job performers in their work environment may be necessary.  However, striking the right balance between observing and interacting can be a challenge.  In my experience, it is very important to be open with the participants about your needs and expectations.  It can be unsettling for an employee to be observed.  This is where it is critical to have a sponsor who can communicate with employees and address their concerns.  Interacting with participants may influence your data, so it should be done with caution.  Direct contact with participants can also reduce their productivity, which may in turn skew the data.

Once optimal performance is documented, the analysis should attempt to validate it while documenting the actual performance of workers.  This is where the control group is useful.  It is possible that changes have taken place since the standards were written that require an organization to revisit its expectations (optimal performance).  Determining how to close the gap is where the needs analysis can get complicated.  It would be a mistake to assume the gap is caused by a lack of knowledge, yet many organizations treat all performance issues this way.  If the problem is caused by performers lacking knowledge, it is logical to assume the solution is traditional training.  However, there may be another cause.  But that is a discussion for a future post.

I am continuing my review of ISPI's performance standards.  The remaining standards follow a systematic process that is familiar to instructional designers, who refer to it as ADDIE ("Add-ee").  This process consists of five steps or phases: Analysis, Design, Development, Implementation, and Evaluation.  It has stood the test of time and remains the basis for most instructional design efforts and discussions.  ISPI differs slightly, adding a standard for requirements.

Needs assessment, or analysis, is ISPI's fifth performance improvement standard (the A in ADDIE).  This is a vast topic; entire books have been written on it.  Although it would be impossible to provide a thorough description of needs analysis in this space, I will attempt to provide enough information to help you understand the factors to consider and the procedures to follow in a successful analysis effort.

In my experience, the needs assessment begins with standard two, context.  In my post on that standard I wrote, “identifying and discussing potential barriers will help design an intervention that will achieve the desired outcome.” Barriers to performance are often indicators of what is causing a performance problem.  Although one must be careful not to draw conclusions based on incomplete information, the perspective gathered early in the process can help plan subsequent data collection and analysis.

Robert Mager is one of the best-known authors in performance assessment and instructional development.  In his book Analyzing Performance Problems, he provides a process for analyzing a performance problem.  The first step in his process is to describe the performance problem.  While I agree that this is a prerequisite for a successful needs analysis, it is not always a fast or simple process.  It is often difficult to describe the actual problem but easy to describe the symptoms.  That is why it is critical to follow a systematic analytic process.  Since there are many approaches to needs analysis, I will take several posts to describe how to conduct a successful needs assessment.
