Thursday, January 31, 2013

The Brain - Yesterday and Today


Theory Basis: Learning Theory

I recently attended ASTD's Brain-Based Learning Virtual Summit and had an opportunity to review a computer-based course about the Global Harmonization Standard initiative being implemented in the realm of hazard communications.  In this blog, like a good Tom Clancy novel, I intend to connect the two experiences to provide you with a new filter for thinking about designing instruction and advice about the "rational" step in an instructional process.

Way Back When - Neural Science

Once upon a time the survival of the human race depended on how quickly one could recognize danger and react to it.  Hence, nature favored a brain that would process emotions fast and first.  It is just such a brain that we inherit today.


The Quick Mind Survived!
The way our brain works has important implications for us today and affects how we learn.  Basically, anytime you or I perceive a threat, the function of the logical (thinking) part of the brain decreases proportionally.  The amygdala is the gateway of information into the brain; the hippocampus is the brain's memory manager, hence retention.  The amygdala is like a 3-way control valve (see graphic below).  In general, the more a threat is perceived, the more the brain's functions shift away from logic and memory and toward survival.  Conversely, the lower the perceived threat, the more the brain can function toward learning and retention.


The Brain is Constantly Assessing the Environment for Threats
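For readers who like to think in code, here is a purely illustrative toy sketch of the "3-way valve" idea described above. The numbers and the simple linear trade-off are my own simplification for illustration only, not a neuroscience result:

```python
def brain_allocation(threat_level):
    """Toy model of the 'valve': as perceived threat rises (0.0 to 1.0),
    resources shift away from learning/retention and toward survival.
    Purely illustrative -- the linear trade-off is an assumption."""
    # Clamp the input to the 0.0-1.0 range.
    threat_level = max(0.0, min(1.0, threat_level))
    survival = threat_level          # more threat -> more survival mode
    learning = 1.0 - threat_level    # less threat -> more capacity to learn
    return {"survival": round(survival, 2), "learning": round(learning, 2)}

print(brain_allocation(0.2))  # low perceived threat: mostly learning
print(brain_allocation(0.9))  # high perceived threat: mostly survival
```

The takeaway matches the valve graphic: design conditions that keep the perceived threat low so the "learning" share stays high.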
New Model

Dr. David Rock of the NeuroLeadership Institute developed a model based on research centered on the activity of the brain, specifically the amygdala, under various environmental conditions.  There's a good chance Dr. Rock couldn't find any saber-toothed tigers, but he did create situations that relate to being in a learning environment.  He found five factors that influence whether we are mentally moving toward or away from being able to learn and retain information.  These factors make up the SCARF model.  SCARF stands for: Status, Certainty, Autonomy, Relatedness, and Fairness.

Referring to the diagram below, as we process information from the environment we are unconsciously filtering it for a level of threat or reward.  A condition mildly leaning in the direction of “Toward” is the best for learning.
 
The NeuroLeadership SCARF Model
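One concrete (and admittedly simplistic) way to picture the model is to rate each SCARF factor on a small scale and sum the ratings into an overall "toward/away" lean. The -2 to +2 scale and the summing are my own assumptions for illustration, not part of Dr. Rock's published model:

```python
# The five SCARF factors from Dr. Rock's model.
FACTORS = ("status", "certainty", "autonomy", "relatedness", "fairness")

def scarf_score(ratings):
    """Sum per-factor ratings (-2 strong 'away'/threat to +2 strong
    'toward'/reward). A mildly positive total corresponds to the
    condition the post describes as best for learning."""
    missing = [f for f in FACTORS if f not in ratings]
    if missing:
        raise ValueError(f"missing factors: {missing}")
    return sum(ratings[f] for f in FACTORS)

# Hypothetical lesson: clear objectives raise certainty and status a bit.
lesson = {"status": 1, "certainty": 1, "autonomy": 0,
          "relatedness": 1, "fairness": 0}
print(scarf_score(lesson))  # 3 -> mildly leaning "toward"
```

Again, the point is not the arithmetic but the habit: consciously ask where each design decision pushes each factor.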

Application

I like to do a bit of a reality check when I come across a new model by looking for consistency with currently accepted applications and practices.  In the case of the SCARF model, it seems to be in alignment.  For example, the application of "Adult Learning Theory" (Knowles) and its #1 premise, that adults have a need to know why they should learn something, matches up well with the "Certainty" factor of the model.

As we approach the development phase of a design process, we can use the SCARF model to evaluate our strategy or method by asking, "Will the result of implementing this strategy be perceived by the learner as a threat or a reward, and to what degree?"  The answer should influence how we implement it.

Example: Event of Instruction – Introduction

Strategy: Sharing Learning Objectives

The value of sharing the learning goal or objectives can't be emphasized enough (see my blog, April 2012, Learning Objectives – The Rosetta Stone of ISD).  We can now add another reason why.  Applying the SCARF model, providing the students with "Certainty" about what the future holds and increasing "Status" with the reward of success will shift the SCARF continuum in the "Toward" direction, favoring learning.

But how to share objectives seems to be a challenge.  And there is lots of advice:

“Only rarely will designers express the objectives to the learners in the same form that were used when designing instruction.” (Smith & Ragan)

“A list of thirty or forty technically worded objectives would be likely to shatter the learner’s confidence. A list of three or four global objectives written in the learners’ language would tend to build confidence.” (Dick & Carey)

“Of course, if objectives are to be communicated effectively, they must be put into words (or pictures if appropriate) that the student can readily understand.” (Gagne’, Briggs & Wager)

“Also, when using the cognitive approach, it is important to state the objective in terms the learner will understand, rather than a formally stated objective.” (Foshay, Silber, & Stelnicki)

“Lists of objectives are not motivating… listing them at the beginning of each module of instruction isn’t a very effective thing to do.” (Michael Allen)

“The more concrete and verifiable what you want the learners to be able to do and say, the more easily you can identify their successes or shortcomings.” (Stolovitch & Keeps)

“Trainees must clearly understand them (objectives), or they are of limited use” (DOE Handbook-1078-94)

“Use sticky objectives. A “sticky objective” simply shifts the timeframe of the performance from at the conclusion of training to on the job.” (Barbara Carnes)

“Just dumping in objectives to satisfy auditors is an affront to the profession!” (Cj Stape)

Applying SCARF Model

Please take a moment and watch the video capture of an e-Learning course introducing a lesson and sharing an objective. (Please give it a few seconds to load, and maybe even watch it twice.)


video
 
Now let’s determine the degree the objective will increase certainty or create uncertainty.

·        Is it clear as to when success will be achieved?
·        Is there enough detail to remove uncertainty?
·        Is there any jargon or concepts unknown to the learner?
·        Does it seem achievable to the uninitiated?
·        Imagine a test question developed to measure this objective. Is it evident what the answer would be?
·        Ask someone unfamiliar with the subject whether they would be certain of what is expected.
My Evaluation & Opportunities for Improvement

In my opinion, this objective is ambiguous. Ambiguity activates brain regions that process “threat” because it leaves the learner uncertain, and that depresses learning.

Besides the blatant disregard for the “Coherence Principle,” or a feeble attempt to wake up the student, the objective lacks any details on the quality and content of the discussion.  When I read it I asked myself, “What specific properties? How many? At what level of expertise will the discussion be held? How much do I really need to know?”

I would question whether the potential audience is familiar with the term “pictogram”. I also had the advantage of seeing the next two screens, which defined what a pictogram is.  One could deduce that if the term needs to be defined, the prospective audience doesn’t know what it means, and the term shouldn’t be used until after it is defined.  So I would say it contains jargon.

My version:

Working with the given objective (because I really question its validity as supporting a task from a job analysis; it kind of fails the real-world practice test), my version would be:

In this lesson the details about the new signs (pictograms) will be explained. When finished, you’ll be able to discuss the topic of pictograms by describing the qualities of the border, symbols and background that are used to design one.  You’ll be able to do this as easily as you can identify a stop sign when driving.


Example of a Pictogram
Your Turn

Evaluate my objective for the qualities of certainty and share your evaluation in a comment.

Best regards,

Cj

References:

Allen, M. W. (2003). Michael Allen’s Guide to e-Learning: Building interactive, fun, and effective learning programs for any company. Hoboken, NJ: John Wiley & Sons.

Carnes, B. (2012). Making Learning Stick: Techniques for easy and effective transfer of technology-supported training. Alexandria, VA: ASTD Press.

Dick, W., & Carey, L. (1996). The systematic design of instruction (4th ed.). New York, NY: Harper Collins.

Department of Energy Handbook 1078-94, A systematic approach to training. (1994). Page 17. Springfield, VA: U.S. Department of Commerce, Technology Administration, National Technical Information Service.

Gagne, R., Briggs, L., & Wager, W. (1992). Principles of instructional design (4th ed.). Orlando, FL: HBJ.

Knowles, M. (1996). Adult Learning. In Robert L. Craig (Ed.), The ASTD Training and Development Handbook (pp. 253-264). New York, NY: McGraw-Hill.

Rock, D. (2009). Your Brain at Work: Strategies for overcoming distraction, regaining focus, and working smarter all day long. New York, NY: HarperCollins.

Smith, P. L., & Ragan, T. J. (1993). Instructional Design. Upper Saddle River, NJ: Prentice-Hall.

Stolovitch, H. D., & Keeps, E. J. (2002). Telling Ain’t Training. Alexandria, VA: ASTD Press.

Mayer, R. E. (Ed.). (2005). The Cambridge Handbook of Multimedia Learning. New York, NY: Cambridge University Press.

 

Wednesday, January 2, 2013

Land of Confusion

Theory Basis: Communication

It’s traditional at this time of year to make resolutions to better our lives.  We can apply this tradition to our professional lives as well.  I am resolving to communicate better by clarifying, and where needed dispelling, the buzzwords in our profession, and I encourage you to join me.

The greatest enemy of communication is the illusion of it.  This enemy is to be battled not only during a training activity but in the business end of ID as well.  This blog is about concepts and terms related to ID that are commonly used and frequently misunderstood, and some strategies to combat the enemy.

We often mistake fluency (we both use the words easily) for comprehension (we both have the same meaning for the words).  The trap is guessing that because I understand what something means to me, everyone else shares the same meaning.

To avoid guessing, we need to listen carefully to key words or phrases (think buzz words) that are related to the instructional design process as we talk to customers, clients and subject matter experts (SMEs).

The need for clarity is especially critical when developing requirements within mandating documents.  Life’s irony arrives when an analytically gifted assessor shows up to see how well you are living up to the requirements you helped to create and he or she says, “That’s not what that means!”  The following are examples from my work-life.

What were they thinking?

The ADDIE “Model”

Your customer says, “I want you to develop the training using the ‘ADDIE Model’.”   As the old movie line goes, “Be afraid. Be very afraid!”   If the devil is in the details, this is the devil incarnate.   The “ADDIE Model” is merely a colloquial term used to describe a systematic approach to instructional development, virtually synonymous with instructional systems development.  There is no original, fully elaborated model.  So how is one to implement a model with no delineated processes?

One thing is consistent, as Michael Molenda (1) shares “What everyone agrees on is that ADDIE is an acronym referring to the major processes that comprise the generic ISD process: Analysis, Design, Development, Implementation, and Evaluation.”

“The” Systematic Approach to Training

For clarity, a systematic approach to training is one of many instructional design models; for example, the Department of Energy (DOE) model in DOE Handbook 1078-94, “A” Systematic Approach to Training.  DOE recognizes that there is more than one way, as it also offers DOE Handbook 1074-95, Alternative Systematic Approaches to Training.

The Training and Development Handbook (2) says it in a pretty straightforward way: “The fact is, there is no single, universally accepted instructional design model.”  If you’re going to agree to such a commitment, get clarification as to which model the customer has in mind (if any).

Andrews & Goodson (3) conducted a comparative analysis of instructional design models. Of the 60 possible models they started with, 40 contained sufficient theoretical underpinnings, purpose and use, and a degree of documentation to meet the generally agreed-upon “major processes” of an ID model.

University and college instructional design degree programs seem to favor the Dick & Carey ID model. If you ever consider expanding an ID career outside of Hanford, you might consider familiarizing yourself with this model.

“Formal training”

Formal training refers to the students’ attire: men must wear a white shirt and tie; women, a dress that falls below the knee.  Absurdity aside, you will not find a generally accepted definition for “formal” training.  A Site committee chairperson was once asked what formal training meant when discussing requirements for a safety-related program, and the reply was, “completion of the training is auditable in the Training Records System”.  Other meanings include: implemented with a stand-up classroom methodology, includes a written examination, or developed using an instructional design model.  Find out if you are going to need to be a good tailor! (Pun intended.)

“Active learning element”

Active learning was a buzz term for a movement to steer instruction away from the stand-up lecture.  According to Wikipedia, “Active learning is an umbrella term that Jose Castillo invented and it refers to several models of instruction that focus the responsibility of learning, on learners. Bonwell and Eison (1991) popularized this approach to instruction. This "buzz word" of the 1980s became their 1990s report to the Association for the Study of Higher Education (ASHE). In this report they discuss a variety of methodologies for promoting "active learning."”  My take: get away from the lecture. But this leaves a plethora of choices; find out which one(s) your customer has in mind.

“Hands-on”

OK, let’s make it clear we are not talking about the Reiki treatment method.  A Site safety program once required “hands-on” training for beryllium workers.  I thought it would be great to have all the students make an ashtray from beryllium.  What a great way for the learners to get to know the physical properties of that metal!  It turns out the intention was to include practice donning and doffing protective equipment. If you can substitute “practice with feedback” for hands-on, I think you will be in close proximity to the desired outcome.  Further, a good learning objective will define the level of proficiency to be attained.  Beware of defining an instructional method before the instructional objectives have been determined.


Training to “understanding”

Understanding is considered one of those taboo words when developing learning objectives because understanding is a state of mind, not a behavior, and is neither measurable nor observable.  All kinds of alarms should go off in your head if your customer or requirement specifies achievement of understanding.  Get your subject matter expert (SME) or interpretive authority on board and get expectations clarified before doing anything else.

“Effective training”

The goal to achieve “effective training” begs for some quantifiable level of achievement.  The clarification the customer needs to provide is: to what degree? Make sure it is measurable, and make sure those in authority sign off on the metric that will determine whether your training is effective enough.

“Consistent training is critical…”

This statement has the same flaw as “practice makes perfect”.   Just as practice can make a correct or an incorrect behavior permanent, consistent training can be consistently good or consistently bad. Considering the infinite number of possible human behaviors, relatively few must be consistent; calling 911 for emergency help might be one example.  It is almost like saying there is only one way to teach and one way to learn.  Beware of business or political agendas hidden behind such claims, which may be contaminated by bias.  I submit that behaviors that demand consistency will be nearly self-evident during the analysis phase of instructional design.

Battle Strategies

A General Method for Clarification – No Guessing

When we listen to someone talk, the brain is constantly making assumptions – hundreds of them.  Each word, gesture, inflection, and tone of voice is interpreted, but not always as the speaker intended.  We usually are not aware of the fact we are selecting one meaning from a number of possibilities.

Once the “buzz word bell” starts ringing in your ears, make a physical or mental note to get clarification.  I suggest using one of the following:

Just to be sure we are talking about the same thing, in the context of your situation, what does __________________ mean to you?

Or: Do you mind if I ask, when you say ______________, what does that mean to you in the context of what you are trying to achieve?

Test the Depth of Knowledge

Ask a question about a detail surrounding the buzzword.  For example, if the customer wants you to apply “adult learning principles,” you could ask, “So I can address your needs, which adult learning principle(s) do you think is/are the most important?” In this way you can find out whether there are any expectations and how to meet them.  If there aren’t any, suggest a couple, such as designing the training to maximize the learners’ need to be self-directing.

Endorse Your Design Model

Get buy-in for your ID process. I suggest it is nonsensical to define which instructional methods are to be employed (“hands-on”) before an analysis is conducted and instructional objectives are developed.  If you have an internal procedure or process, get support for its application; it may even be mandatory to follow it. After all, a systematic approach is the only method proven to lead to effective and efficient training.

Best in the New Year,

Cj

 

References:

1. Molenda, M. (2003). In Search of the Elusive ADDIE Model. Performance Improvement, May/June 2003. Available online at http://www.comp.dit.ie/dgordon/courses/ilt/ilt0004/insearchofelusiveaddie.pdf

2. Training and Development Handbook: A Guide to Human Resource Development (1987). Robert L. Craig, Editor in Chief. Third Edition. American Society for Training and Development. New York, NY: McGraw-Hill. Page 199.

3. In Anglin, G. J. (1995). Instructional Technology: Past, Present and Future. Second Edition. Englewood, CO: Libraries Unlimited.