
Int J Dev Disabil. 2020; 66(5): 358–369.

Abstract

There has been rapid growth in the number of behavior analysts and interventionists in the world today. With this growth, it is imperative to ensure that each behavior analyst and interventionist receives quality training. The training should be comprehensive (i.e. training multiple areas of behavior analysis) and should not conclude until the trainee is able to perform each behavior analytic procedure with a high degree of fidelity. The purpose of this study was to evaluate the effectiveness of a training package to train four participants to implement multiple behavior analytic procedures. Results of a multiple baseline design across participants indicate that each participant improved their implementation of behavioral intervention. Additionally, the participants maintained their skills over time.

Keywords: Applied behavior analysis, staff assessment, staff training, training

Recently the Centers for Disease Control and Prevention estimated that 1 in every 54 children will be diagnosed with Autism Spectrum Disorder (ASD; Maenner et al. 2020). ASD is defined by deficits in social communication and social interaction skills as well as restricted, repetitive behaviors (American Psychiatric Association 2013). As such, it is not uncommon for individuals diagnosed with ASD to display deficits in language, social behavior, academic behavior, and self-help skills. These deficits are commonly coupled with aberrant behavior such as stereotypic behavior, aggression, pica, and elopement (Didden et al. 2012, Kanne and Mazurek 2011, Richards et al. 2012). Comprehensive intervention that systematically addresses these deficits and excesses is required for individuals diagnosed with ASD to make meaningful improvements and achieve a high quality of life.

For many individuals diagnosed with ASD, making meaningful improvements and achieving a high quality of life requires early, comprehensive, and intensive behavioral intervention (EIBI; Eldevik et al. 2012, Leaf et al. 2011, Lovaas 1987, McEachin et al. 1993, Reichow 2012). Comprehensive behavioral intervention consists of the implementation of a variety of procedures including, but not limited to, discrete trial teaching (DTT; Leaf et al. 2019), shaping (e.g. Cihon et al. 2019a), behavioral skills training (BST; e.g. Shireman et al. 2016), the teaching interaction procedure (TIP; Cihon et al. 2017), and social skills groups (e.g. Leaf et al. 2017b).

Researchers have consistently found that the implementation of comprehensive behavior analytic intervention results in improvements in language, social behavior, academic skills, and self-help skills, as well as the reduction of aberrant behavior (Lovaas 1987, Howard et al. 2005, Howard et al. 2014). While this approach to intervention can produce positive, life-altering outcomes for individuals diagnosed with ASD, it requires that interventionists are well trained and implement the procedures with a high degree of fidelity (Fryling et al. 2012, St Peter Pipkin et al. 2010). For an interventionist to implement procedures with a high degree of quality and fidelity, it could take numerous hours of intensive training working directly with individuals diagnosed with ASD.

Fortunately, numerous empirical investigations have evaluated methods to effectively train individuals to implement behavior analytic procedures with a high degree of fidelity (Deliperi et al. 2015, Lavie and Sturmey 2002, Pence et al. 2012, Roscoe and Fisher 2008). For example, Bishop and Kenzer (2012) used behavioral skills training to train 11 behavioral interventionists to implement preference assessments. The behavioral skills training occurred within a group instructional format and the participants implemented preference assessments in one hour sessions working directly with individuals diagnosed with ASD. The results were positive in that the participants learned how to implement preference assessments. In another example, Nosik and Williams (2011) evaluated competency-based instruction with modeling and written feedback to improve discrete trial teaching and backward chaining procedures with four interventionists. The results of a multiple baseline design showed the participants improved their implementation of discrete trial teaching and backward chaining. Unfortunately, the implementation of these procedures occurred with confederates as opposed to actual individuals diagnosed with ASD. Nonetheless, numerous additional studies have shown other training modalities to be effective, including video modeling (e.g. Deliperi et al. 2015), written instructions (e.g. Graff and Karsten 2012), and the TIP (e.g. Green et al. 2020).

Although the research on staff training is robust, there are some limitations with the current research base that future research could address. First, there are several studies in which researchers measured interventionist performance with confederates (e.g. adults pretending to be individuals diagnosed with ASD; Deliperi et al. 2015, Leaf et al. 2019, Leaf et al. 2020, Pence et al. 2012, Roscoe and Fisher 2008). This does not allow one to determine if the procedures were effective in training participants to work with actual individuals diagnosed with ASD. Second, participant performance is commonly measured in short duration periods (e.g. 5-min sessions or 10 trials; Bolton and Mayer 2008, Catania et al. 2009, Leaf et al. 2019, Leaf et al. 2020, Nosik and Williams 2011, Nosik et al. 2013). This does not closely represent the context in which interventionists commonly work with individuals diagnosed with ASD, which is commonly several hours per day. Third, much of the research involves training only one skill (e.g. prompting, discrete trial teaching) as opposed to multiple skills, which would be more representative of the context in which interventionists commonly work (with some notable exceptions; e.g., Weinkauf et al. 2011). Finally, the majority of studies fail to measure or report training time (Deliperi et al. 2015, Leaf et al. 2019, Leaf et al. 2020, Pence et al. 2012, Roscoe and Fisher 2008). This prevents obtaining information on how long training actually takes. Collectively, these limitations represent a failure to take into account the terminal context in which interventionists commonly find themselves working, and more research on staff training with this context in mind is warranted.

In addition to the literature on staff training, certification organizations (i.e. the Behavior Analyst Certification Board; BACB) have set standards for the direct implementation of behavioral intervention and supervision. Relevant to the present study, the BACB recently published the 2nd edition of the RBT task list (Behavior Analyst Certification Board 2016), which outlines skill sets that interventionists need to competently demonstrate to meet minimum standards. This task list includes numerous skills including, but not limited to, discrete trial teaching, preference assessment, and data collection. Although the task list represents some basic skills, there are numerous important skills that an interventionist has to demonstrate that are not included on these task lists (e.g. cultural sensitivity, soft skills, curriculum). Additionally, the skills on the task lists are not operationally defined (e.g. prompting and fading could mean least-to-most, most-to-least, constant time delay, and/or no-no prompting), which creates difficulties in determining how and what to train interventionists on with respect to the task list.

To date, there are over 97,082 certified professionals from the BACB internationally (Behavior Analyst Certification Board, n.d.). Although there is continuous growth in the number of certificants, the growth has not translated to countries outside of the USA. For example, only 217 certified professionals from the BACB reside in China, 35 in Japan, 35 in Malaysia, and 5 in Thailand (Behavior Analyst Certification Board, n.d.). Thus, clinical practice is underrepresented in Asia, resulting in little extension of best training practices in this region of the world.

The purpose of the present study was to expand upon the current research on staff training and help inform clinical practice by evaluating the effectiveness of a staff training package with four newly hired interventionists. Specifically, this study expanded upon current research by targeting multiple skills simultaneously and evaluating adult participant mastery of these skills within the context in which the participants would work (i.e. with individuals diagnosed with ASD across an extended time period).

Methods

Design

A non-concurrent multiple baseline across participants design (Watson and Workman 1981) was used to evaluate the effects of the training package. This design was used as the target skills of the current study were likely to be irreversible. All participants began in baseline, followed by the introduction of the intervention condition in a staggered fashion as soon as improvement was observed. Within a non-concurrent multiple baseline design functional control is demonstrated when changes in the dependent variable occur only after the introduction of the intervention and not during baseline for the other participants (Watson and Workman 1981).

Participants

Adult participants

Four adult participants took part in this study. Each participant was recruited to be employed by a private agency that provides behavioral intervention services for individuals diagnosed with ASD in Hong Kong. Participants were recruited through websites, social media, and conferences. Once participants were hired by the agency, they were informed about the study, told that participation was optional, and signed consent to participate. The first four participants who signed consent were included and no one was excluded from the study. Thus, the participants in this study are representative of interventionists who would be hired within the agency.

Kate was a 23-year-old female with a master’s degree in Foundations of Clinical Psychology. Kate had no prior experience in behavior analysis with or without individuals diagnosed with ASD. However, Kate did take a course on behavior analysis in her undergraduate studies. Additionally, Kate volunteered at a daycare center for children diagnosed with ASD prior to the start of this study.

Jill was a 23-year-old female with a bachelor’s degree in Social Sciences in Psychology. She had 1.5 years of experience providing behavior intervention for individuals diagnosed with ASD in an agency other than where this study took place. She was responsible for teaching a variety of skills including learning how to learn, communicative, and social skills in one-to-one and group contexts.

Nadia was a 23-year-old female with a master’s degree in Psychology. Nadia had no prior experience in behavior analysis but had volunteered in a speech clinic that provided intervention for individuals diagnosed with ASD, working under the supervision of the chief speech therapist. Her responsibilities included setting up the classroom for lessons and assisting with oral motor and sentence forming classes.

Sam was a 26-year-old male with a bachelor’s degree in Psychology. Sam had previous experience working with individuals diagnosed with ASD. First, Sam was an intern in a behavior analytic clinic for 1 month where he received didactic training on the diagnosis of ASD and behavioral intervention and was able to observe sessions. Second, Sam was an intern at a university in the United States for 6 months where he learned about behavioral intervention. Finally, Sam was a teaching assistant at a primary school that served individuals diagnosed with ASD.

Children diagnosed with ASD

All four participants worked with three different children diagnosed with ASD throughout training. The children were clients of the agency whose caregivers provided consent for them to receive intervention from the trainees. As such, they were not recruited for this study but had been receiving services from prior trainees and continued to receive services throughout this study.

The first child, Alex, was a 4-year, 10-month-old boy at the time of the study. Alex had an Autism Diagnostic Observation Schedule (ADOS) overall total of 17 (high level of impairment). Alex had received 3 h of behavioral intervention per day for 15 months prior to the start of the study. Alex displayed a variety of aberrant behaviors (e.g. stereotypy, aggression, self-injury), could make simple requests, did not display any learning how to learn skills (e.g. waiting, relinquishing reinforcers, responding to corrective feedback), displayed limited social and play behaviors, and could attend for up to 5 min. Alex participated in the training for all participants and during mastery probes for Kate.

The second child, Jackson, was a 3-year, 7-month-old boy at the time of the study. Jackson had received 6 h of behavioral intervention per day for a year prior to the study. Jackson engaged in stereotypic and aggressive behaviors, displayed learning how to learn behaviors, had social interest but did not play with peers, used a Picture Exchange Communication System as his primary form of communication, and could attend for up to 10 min. Jackson participated in the training for all participants and during mastery probes for Jill, Nadia, and Sam.

The third child, Misty, was a 2-year, 7-month-old girl. Misty had a Vineland-3 Adaptive Behavior composite score of 62 and an ADOS total of 19 (high level of impairment). Misty had received 6 h of behavioral intervention per day for a year prior to the study. Misty engaged in several aberrant behaviors (e.g. screaming, aggression, stereotypy), did not display any learning how to learn behaviors, did not display appropriate social behaviors with peers, was non-vocal, and could attend for less than 1 min. Misty was not included in any of the probes as her parents did not consent to her taking part in probes.

Adult and child pairing

All four adult participants (i.e. Kate, Jill, Nadia, and Sam) worked with all three child participants (i.e. Alex, Jackson, and Misty) as part of training. For probe sessions (described below), each adult participant worked with one child participant. During these probes Kate worked with Alex, and Jill, Nadia, and Sam worked with Jackson.

Trainers

Two adults served as trainers during all intervention sessions. Elaine trained Kate and Jill, and Alan trained Nadia and Sam. Elaine was a 30-year-old female with a master’s degree in Applied Behavior Analysis. She was a BCBA with 8 years of experience providing direct behavioral intervention and 6 months of experience providing supervision for individuals diagnosed with ASD. She had been the trainer of the training program for about 6 months by the time Kate and Jill began the training. Before becoming the trainer, she had received training from a Clinical Psychologist in providing behavioral intervention. In addition, she had experience in conducting workshops for interventionists and parents on behavioral intervention for children diagnosed with ASD.

Alan was a 28-year-old male with a bachelor’s degree in Psychology, 4.5 years of experience providing direct behavioral intervention, and 3 months of experience providing supervision for individuals diagnosed with ASD. Before becoming the trainer, he had experience providing parent training on behavioral intervention for children diagnosed with ASD. In addition, he had received training from two BCBAs and consultants of the center in which the study occurred. It should be noted that, within the philosophy of the program, holding a BCBA credential does not by itself make one qualified to provide supervision, and it is our contention that non-BCBAs can provide supervision equivalent to that of a BCBA (see Leaf et al. 2008).

Setting and materials

All sessions took place in a therapy room which measured 5.5 by 5 meters. The room had three child-sized tables, six child-sized chairs, four adult-sized chairs, a whiteboard on the wall, toys, and instructional materials. Before each session, the trainer provided each participant with the student’s file, which contained their behavior plan and data sheets for all the target skills the participant was required to teach during the session. In addition, potential reinforcers selected based on each student’s preference and age (e.g. iPad, fishing games, musical instruments), and materials required to teach the target skills (e.g. token board, flash cards, and figurines), were put inside a plastic box beside the table.

Dependent measures and measurement

The primary dependent variable was the adult participants’ display of 38 targeted skills that a direct-line interventionist would likely need to display to effectively work with individuals diagnosed with ASD. Six items measured the use of engagement strategies, six items measured the use of reinforcement, six items measured discrete trial teaching, three items measured prompting and prompt fading, six items measured mand training (communication temptations, a term used as a descriptor for mand training), two items measured maximizing the child’s progress, and nine items measured behavior management (contact corresponding author for the probe sheet and operational definitions).
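For clarity, the item counts per domain listed above sum to the 38 targeted skills. The brief Python sketch below simply tallies those counts; the dictionary keys paraphrase the domains named in the text and are illustrative only.

```python
# Item counts per probe domain as reported in the text; they should sum to 38.
probe_items = {
    "engagement strategies": 6,
    "use of reinforcement": 6,
    "discrete trial teaching": 6,
    "prompting and prompt fading": 3,
    "mand training (communication temptations)": 6,
    "maximizing the child's progress": 2,
    "behavior management": 9,
}

assert sum(probe_items.values()) == 38  # total number of probed skills
```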

The researcher rated the participants’ performance of behavioral intervention via formal probes. Formal probes were conducted during baseline, intervention, and maintenance. During baseline, formal probes occurred daily, and during intervention formal probes occurred once per week. Formal probes took place in the same room as the training. During each probe, the researcher observed the participant working with the assigned child diagnosed with ASD. The probes lasted 30 min and neither the trainer nor the researcher provided any training, feedback, or prompting during any of the probes. During the probe the adult participant had access to the child’s clinical notebook, instructional materials, and potential reinforcers. The probe started with the researcher stating, “To the best of your ability you are going to work with [Child’s Name] for 30 min. [Child’s Name] is [Years old]. We have provided all the instructional materials that [Child's Name] needs as well as toys that they might like. Here is their binder (while handing over the child’s clinical notebook). Please take 5 min to look it over before I bring them in. Unfortunately, I cannot answer any questions about the procedures to use or what you should work on. Go ahead and read about [Child’s Name] now for 5 min.” After 5 min the researcher stated, “I am now bringing in [Child’s Name]. You will be working with them for 30 min. During this time I cannot answer questions, and I will not provide any instruction, advice, or feedback. If, however, I feel that you or the child are in any danger, I will stop the probe immediately. Have fun and work with them to the best of your ability.” The participant then worked with the child for 30 min.

During each formal probe the researcher scored the adult participants’ behaviors on a Likert scale (see Tables 1–2). The researcher scored an item as 0 if the participant rarely/never displayed the skill, 1 if the participant sometimes displayed the skill, or 2 if the participant frequently/often/always displayed the skill during the probe. The researchers calculated the sum of scores obtained divided by the total score possible and multiplied by 100%. A dual criterion was used to determine mastery for the adult participants. This consisted of first reaching two consecutive days with a total score above 80%. Once this criterion was met, the clinic supervisor was asked to observe a formal probe. If the clinic supervisor determined the adult participant’s performance was passable, the performance was considered mastered. If the clinic supervisor determined the adult participant’s performance was not passable, intervention continued. The clinic supervisor continued to observe formal probes until the dual mastery criterion was met.
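To make the probe scoring arithmetic concrete, the Python sketch below computes the overall percentage from the 0–2 ratings and checks the first prong of the dual criterion (two consecutive probes above 80%). It is a minimal illustration under the description above; the function names and example ratings are hypothetical, and the supervisor’s judgement (the second prong) is not modeled.

```python
from typing import List


def probe_percentage(item_scores: List[int]) -> float:
    """Overall probe score: sum of the 0-2 Likert ratings divided by the
    maximum possible score (2 points per item), expressed as a percentage."""
    max_possible = 2 * len(item_scores)
    return sum(item_scores) / max_possible * 100


def meets_score_criterion(probe_history: List[float], threshold: float = 80.0) -> bool:
    """First prong of the dual criterion: the two most recent probes both
    exceed the threshold. The second prong (clinic supervisor sign-off)
    is a clinical judgement and is not modeled here."""
    return len(probe_history) >= 2 and all(p > threshold for p in probe_history[-2:])


# Hypothetical example: 38 items rated 0-2 during one formal probe.
scores = [2] * 30 + [1] * 6 + [0] * 2
history = [72.4, 84.2, probe_percentage(scores)]
print(round(probe_percentage(scores), 1))   # 86.8
print(meets_score_criterion(history))       # True
```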

Table 1

Description of modules

Domain | Order for Kate and Jill | Order for Nadia and Sam | Objective | Performance Evaluation
ABA Programs | Throughout | Throughout | Understand the purpose of and how to implement different programs | Yes
Work Attitude | 1 | 1 | Understand AP’s expectations towards staff and ethical standards | Yes
ASD & ABA | 2 | 2 | Symptoms of ASD & history of ABA | No
Engagement | 3 | 3 | Engage students in an enthusiastic and child-friendly manner. Always be aware of students’ safety. | Yes
Reinforcement | 4 | 4 | Understanding techniques for reinforcement sampling & reinforcement development. | Yes
DTT | 5 | 5 | Utilizing DTT in therapy. Understanding the different components of a trial. | Yes
Prompting | 6 | 6 | Learn how to use different types of prompts, prompting hierarchies, use of expanded trials, and be aware of not using inadvertent prompts | Yes
Communication Temptations | 7 | 7 | Learning how to do CT and catching the correct timing (i.e. ensuring that intent is established). | No
Maximizing Progress | 8 | 8 | Techniques to run a session smoothly (e.g. positioning of materials, appropriate length of each work round), naturalistic teaching & activity-based teaching, interspersing of programs. | Yes
Behavior Management | 9 | 9 | Respondent vs. operant behavior, FBA, reactive and proactive procedures. | Yes
Data Collection | Throughout | 10 | Learn to collect data accurately using different types of data collection methods (e.g. probe data, percentage data, trial-by-trial data) | No
Solicit Attention | 10 | 11 | Learn techniques to help improve attention: behavior momentum, rainbow tokens, choice of program | Yes
Self-Help & Task Analysis | 11 | 12 | Learn how to analyze and break down tasks + different types of chaining | No
Play | 12 | 13 | Teaching the concept of play | No
Group Teaching | 13 | N/A | Teaching the concept of group teaching | No
Social Skills | 14 | 14 | Understanding basic social skill programs & social skills taxonomy | No

Table 2.

Performance evaluation.

Domain  Number  Behavior  Score (0 = Rarely/Never, 1 = Sometimes, 2 = Frequently)
Work Attitude 1 Attendance  
  2 Punctuality  
  3 Safeguard Confidentiality  
  4 Child-friendliness  
  5 Initiation (solicits feedback, initiating help appropriately)  
  6 Attentiveness  
  7 Responsiveness  
  8 Responsible  
  9 Receptiveness (accepts feedback, discusses feedback in a respectful manner)  
  10 Represents agency positively  
Engagement 11 Attentive to student all the time  
  12 Predict student’s behavior and prevent him/her from getting hurt  
  13 Use student’s preferences  
  14 Enthusiastic tones and volumes  
  15 Enthusiastic facial expressions  
  16 Making interesting comments  
  17 Not asking questions  
  18 Pair social and tangible  
  19 Present toys in different ways  
  20 Let student participate  
  21 Identify and use salient points of preferences  
  22 Make a change when student is not engaged  
Reinforcement Development 23 Able to identify likes and dislikes  
  24 No unnecessary demand and corrective feedback during reinforcement sampling  
  25 Use of a variety of reinforcers  
  26 Pairing of socials with tangible reinforcers  
  27 Develop reinforcers appropriately  
  28 Use of age-appropriate reinforcers  
DTT 29 Presentation of materials is clear  
  30 Instructions are clear, no repeating of instructions  
  31 Issue instructions according to student’s level  
  32 Vary instructions  
  33 Artificial attention cues are not provided  
  34 Behavior expected is clearly defined  
  35 Consistent with expectations of tasks  
  36 Provided feedback for each trial  
  37 Provide information feedback  
  38 Provided appropriate time for responses, feedback, and inter-trial interval  
  39 Vary feedback  
Use of Reinforcement 40 Reinforcement is delivered after the desired behavior (correct timing)  
  41 Reinforcement was provided consistently  
  42 Bribery was not used  
  43 Differential reinforcement was utilized  
Prompting 44 Able to identify skill deficits and behavioral deficits of a student  
  45 Plan prompts in advance  
  46 Timing of prompts is optimal  
  47 Able to avoid prolonged failure  
  48 Level of prompts is optimal  
  49 Fading of prompts is appropriate  
  50 Use expanded trials appropriately  
  51 Inadvertent prompts are not provided  
Communication Temptations 52 Able to identify appropriate target  
  53 Able to identify student’s preferences  
  54 Able to attempt to establish intent (physical environment and avoid grabbing)  
  55 Provide prompt at the right time  
  56 Do not ask question  
  57 Deliver reinforcement immediately upon occurrence of target behavior  
  58 Re-articulate and expand language as necessary  
  59 Set up repeated practices  
Maximizing Progress 60 Session length is appropriate (duration of breaks is appropriate)  
  61 Sequence of task presented is appropriate  
  62 Session is ended on a pattern of success  
  63 Behavior momentum is created  
  64 A good balance of play is incorporated into the overall program  
  65 Therapy is as natural as possible  
  66 Facilitated generalization as quickly as possible  
  67 Tasks are adjusted based on student’s behaviors and performance  
  68 Enthusiastic  
  69 A mastered program is not repeated more than necessary  
  70 Different programs are interspersed  
  71 Preferred and interesting materials are used  
Behavior Management 72 Stay calm when behavior problems occur  
  73 Able to identify function of behavior problems  
  74 Behavioral procedures suggested by supervisor are implemented consistently  
  75 Behavior tokens are delivered contingently  
  76 Provide reinforcement for absence of excessive behavior (DRO)  
  77 Provide clear, informational feedback on desirable behavior accurately  
  78 Provide just enough response blocking  
Behavior Management 79 Provide minimal attention to excessive behaviors  
  80 Rainbow tokens  
  81 Wait Program  
Soliciting Attention 84 Good attention behaviors are reinforced  
  85 Artificial attention cues are not used (e.g. pointing at eyes to ask student to look)  
  86 Timing of onset of trials is optimal  

Second, we evaluated how long it took each of the four participants to complete the training package. Finally, we examined the number of programs and targets introduced and mastered for one child (Jackson) who was part of the training. This provided a sample of the progress the child made while working with the adult participants in this study. Unfortunately, we were not able to provide information on each child’s progress as consent was not obtained to use their data as part of an empirical write-up.

Baseline and maintenance

Each session of the baseline and maintenance conditions consisted of a single formal probe (described above). During the baseline condition the participant came to the setting and the researchers implemented a full probe. When the full probe was completed the participant would then leave the setting for the day. During the maintenance condition the participant came to the setting and the researchers implemented a full probe. When the full probe was completed the participant would then stay in the setting and provide clinical intervention to individuals diagnosed with ASD.

Procedure (independent variable)

General

Each participant received a training package to learn how to implement comprehensive behavioral intervention for individuals diagnosed with ASD. Participants received training 5 days a week for 9 h per day. Each day was broken into three, 3-hour sessions (referred to as morning, midday, and late day sessions). During the morning and midday sessions participants worked directly with the child participants. During the late day sessions, the participants had educational opportunities (e.g. didactic training, readings, video presentations) about ABA and ASD.

The training for all participants consisted of modules on topics related to behavioral intervention and ASD. Table 1 provides information on the modules, the order of the modules, the objectives of the modules, and whether each was directly probed during training. Kate and Jill had a total of 16 modules and Nadia and Sam had a total of 15 modules. These differences were based on the individualized needs of the adult and child participants. The participants went through each module in a stepwise fashion with two exceptions. First, for all participants there was a module on ABA programs which was interwoven across the other modules. Second, data collection was targeted throughout every module for Kate and Jill but was a separate module for Nadia and Sam.

Hands-on experience

During the morning and midday sessions, each of the participants received hands-on training working directly with their assigned child participants. All of the hands-on training took place in the training room (described above), which included the participant, the trainer, and three children diagnosed with ASD. During each module, the trainer focused on training the participant on the behaviors that corresponded to the module. For example, during the prompting module the trainer focused on the different prompt types that can be used (e.g. point, verbal, physical guidance), how to effectively use prompts, and how to effectively fade prompts.

There were a variety of different teaching modalities that occurred during the hands-on training. First, demonstrations of targeted behaviors were used as models (e.g. the trainer engaged in the targeted behavior while the participant observed). Demonstrations occurred in two different ways: 1) the trainer demonstrated targeted behaviors within a module (i.e. concepts currently targeted) and across different modules (i.e. concepts previously targeted) and discussed the demonstration with the participant; 2) senior level staff demonstrated targeted behaviors while the trainer sat with the participant and explained what was occurring in the demonstration. Second, the trainer provided the participant an opportunity to work for a short period (i.e. an average of 10 min) with one of the three children. During this time, the trainer instructed the participant to focus on a specific concept(s). Prompts and feedback were provided based upon the participant’s behavior during this work period. Prompts generally consisted of verbal, gestural, and model prompts. Third, the participant worked with one of the children for longer periods of time (i.e. an average of 1 hr). During this time, the trainer occasionally observed the participant and provided prompts and feedback on their implementation of behavioral intervention. Finally, every 2.5 weeks the trainer provided the participant a performance evaluation (see Table 2). This performance evaluation used a 3-point Likert scale and allowed the trainer to provide the participant with objective data on their performance and the participant to self-evaluate his or her performance during training (i.e. the participant could review the evaluation form and their scores).

Educational training

During late day sessions, each of the participants received educational opportunities dependent upon the current module(s). The educational training was not a commercial product but rather training created by the agency and determined by the trainers. During the educational opportunities the trainer provided didactic training, readings, and activities that corresponded with the module. Since the modules were designed to build upon each other, there were sessions during which the trainer targeted behaviors spanning more than one module.

There were a variety of modalities that occurred during these educational opportunities. First, the trainer provided lectures on the concepts within a module (contact first author for PowerPoints). For example, if the module was on prompting the trainer described various prompt types, prompting systems, and ways to effectively fade prompts. Second, the participants watched videos related to the module topic. During these videos the trainer explained what occurred and how the participant could implement these concepts within their work. Third, the participants were provided with research articles and curricular books related to the module topic to read. Finally, the participants took different open-ended written exams about the module. Each written test consisted of between 6 and 9 questions. The participant was required to score 70% or above on the test prior to moving within or across modules. All educational opportunities were used to support the concepts within the module and the participants’ demonstration of behavior analytic skills when working directly with individuals diagnosed with ASD.

Interobserver agreement

The experimenter and an independent observer scored the adult participants’ behaviors during 33%, 38.46%, 33%, and 33% of probes for Kate, Jill, Nadia, and Sam, respectively. The independent observer was trained on the operational definitions of the 38 tasks. Interobserver agreement (IOA) was calculated by totaling the number of times the experimenter and the independent observer agreed on the scoring of a task, dividing by the total number of agreements plus disagreements, and multiplying by 100%. Mean IOA was calculated as the sum of IOAs divided by the total number of IOAs. The mean IOA for Kate, Jill, Nadia, and Sam was 87.15%, 91.46%, 82.64%, and 88.26%, respectively. Given that the scoring was on a 3-point Likert scale, which included some subjectivity in scoring, anything over 80% IOA is often considered satisfactory (Kazdin 1977).
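As a minimal sketch of the item-by-item agreement calculation described above, the Python snippet below assumes each observer assigns a 0–2 rating to every task; the variable names and example ratings are illustrative only.

```python
from typing import List


def interobserver_agreement(rater_a: List[int], rater_b: List[int]) -> float:
    """Exact-agreement IOA: the number of items scored identically divided by
    the total number of items (agreements plus disagreements), times 100."""
    if len(rater_a) != len(rater_b):
        raise ValueError("Both observers must score the same set of items")
    agreements = sum(a == b for a, b in zip(rater_a, rater_b))
    return agreements / len(rater_a) * 100


# Hypothetical ratings for one probe (0 = rarely/never, 1 = sometimes, 2 = frequently).
experimenter = [2, 1, 2, 0, 2, 1, 2, 2]
observer     = [2, 1, 1, 0, 2, 1, 2, 2]
print(round(interobserver_agreement(experimenter, observer), 2))  # 87.5
```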

Results

Adult participants

Results of the participants’ implementation of behavioral intervention during performance probes are depicted in Figure 1. Consecutive probes are depicted along the x-axis and the overall percentage correct is depicted along the y-axis. Each panel represents a different participant.

Figure 1. Participant performance during probes.

Kate had two formal probes during the baseline condition, receiving 28.3% of points possible during each probe. During the intervention condition Kate showed an immediate increase in the number of skills she engaged in correctly. The intervention condition lasted 11 probes, and it took a total of 531 h to complete the training. We evaluated Kate’s implementation of behavioral intervention 7 days, 15 days, and 22 days following the completion of training. During this maintenance condition, Kate’s performance probes were all above 90%.

Jill had four formal probes during the baseline condition, receiving an average of 14.5% (range, 10 to 20%). During the intervention condition Jill showed an immediate increase in the number of skills she engaged in correctly. The intervention condition lasted 6 probes, and it took a total of 252 h to complete the training. We evaluated Jill’s implementation of behavioral intervention 8 days, 15 days, and 22 days following the completion of training. During this maintenance condition, Jill’s performance probes were all above 97%.

Nadia had six performance probes during the baseline condition, receiving an average of 34.8% (range, 31 to 38%). During the intervention condition Nadia showed an immediate increase in the number of skills she engaged in correctly. The intervention condition lasted 11 probes, and it took a total of 468 h to complete the training. We evaluated Nadia’s implementation of behavioral intervention 7 days, 14 days, and 21 days following the completion of training. During this maintenance condition, Nadia’s performance probes were all above 85%.

Sam had six performance probes during the baseline condition, receiving an average of 41.5% (range, 34 to 55%). During the intervention condition Sam showed an immediate increase in the number of skills he engaged in correctly. The intervention condition lasted 11 probes, and it took a total of 468 h to complete the training. We evaluated Sam’s implementation of behavioral intervention 7 days, 14 days, and 21 days following the completion of training. During this maintenance condition, Sam’s performance probes were all above 85%.

Sample of child participant

Figure 2 provides the clinical data for Jackson throughout the course of the study. His data are representative of all child participants. Along the x-axis are the various teaching programs that were directly taught to Jackson. Along the y-axis is the number of targets in each of the programs. The black bars represent the number of targets mastered and the gray bars represent the number of targets introduced.

Figure 2. Sample child participant data, representative of all children’s performance.

Within the course of this study, Jackson worked on 12 different programs (e.g. expressive labels, playing basketball). These programs targeted learning how to learn skills, communication skills, and language development. Working on vowels and different ways to play basketball had the fewest targets (n = 5) while expressive labels had the greatest number of targets (n = 162). Jackson was introduced to a total of 318 targets, of which he mastered 302 (95%) in the course of this study. In 8 of the 12 programs, Jackson mastered all targets. Jackson mastered 95% of targets introduced in the scripted role-playing program, 87.5% of targets introduced in the describing program, 79.1% of targets introduced in the negation program, and 68.4% of targets introduced in the communication temptations program.

Discussion

The purpose of this study was to evaluate a comprehensive training package to train four professionals to effectively implement behavioral intervention for individuals diagnosed with ASD. The results of the study demonstrated that prior to the training package none of the four participants correctly implemented behavioral intervention with the child participants. Once the training package was implemented, all four participants showed an immediate improvement and continued to improve throughout training. Furthermore, all four participants continued to correctly implement behavioral intervention following training. The successful training of these four participants occurred with an average of 429 hr of training. The results expand the current state of the research and have clinical implications for those who work with individuals diagnosed with ASD.

This study helps improve upon the current research on training interventionists in several ways. First, this study expanded upon the ways in which trainees are usually evaluated in the research literature. Within the training research it is common for trainees to be evaluated in short sessions (e.g. 5 min or 10 trials) and/or evaluated implementing behavioral intervention with a confederate rather than an individual diagnosed with ASD (see Leaf et al. 2019, Leaf et al. 2020). In this study, however, the trainees were evaluated for longer durations and evaluated working with a child diagnosed with ASD. As such, this study expands upon the research as it provides an assessment of trainee behavior which more closely represents real world situations.

A second way the current study expanded upon the current training research was the scope of the training. In numerous training studies the purpose is to train the participants to implement one component of behavioral intervention (e.g. preference assessments, discrete trial teaching, or functional assessment). Although it is important to train professionals to correctly implement specific behavior analytic procedures, in real world settings an interventionist is more likely to be required to implement a wide range of procedures/interventions. This study provided an example of how to train professionals on multiple procedures simultaneously rather than on one specific procedure.

A third way the present study expanded upon the current research base was by evaluating a training package that consisted of multiple modalities rather than a single training technique (e.g. BST). Implementing multiple training modalities allows the trainer the flexibility to individualize training based upon the trainee’s needs. This flexibility could result in more effective, efficient training as well as a more preferred method of training for the trainees. Future research could collect direct social validity data to assess if this hypothesis is correct.

The results of this study have several clinical implications. First, there are several task lists (e.g. Behavior Analyst Certification Board 2016) outlining the minimum repertoires for behavior analysts at different levels. Unfortunately, the skills on these task lists are usually not operationally defined, nor is there an objective criterion for evaluating them (Leaf et al. 2017a). This study provided a description of the skills a trainee should display during comprehensive behavioral intervention for individuals diagnosed with ASD and an objective method for evaluating trainees. Objective measurement of these skills will help ensure trainees are competent in the implementation of behavior analytic procedures. Second, this study provided clinicians with a method of training that uses multiple modalities, which permits the trainer to be flexible in selecting methods that are most effective, and likely preferred, for the trainee. This may align with what has been described as a progressive approach to ABA (Leaf et al. 2016), albeit applied to staff training. That is, a progressive approach to ABA places the main source of control on the context and behavior of the individual receiving intervention. In the case of this study, multiple training modalities placed the main source of control over the trainer’s behavior in the behavior of the trainee.

Despite the overall effectiveness of the intervention, there are some limitations of the study that are worth noting. First, no treatment fidelity measures of trainer behavior were obtained. Treatment fidelity data allows one to ensure the observed improvement was due to the intervention being implemented as designed. Data of this sort become more difficult to obtain when interventions require clinical judgement or in-the-moment assessment (Cihon et al. 2019a, 2019b), which was the case in the present study. That is, the trainers made changes to the intervention based upon what would be most beneficial for the adult participants. Furthermore, intervention occurred for long periods of time (i.e. 9 hr). Despite this limitation, the results across adult participants and with multiple trainers may minimize the concerns related to treatment fidelity. Nonetheless, future researchers could address this limitation by collecting treatment fidelity samples and developing measures of clinical judgement.

Second, the time required for the adult participants to reach the dual mastery criterion was considerably longer than documented in other training studies. These results are inconsistent with the 40 hr of training that may be considered the standard based upon certification requirements (e.g. RBT; Behavior Analyst Certification Board 2016). Rather, the results of the present study align more closely with previous commentaries that have questioned the 40 hr training requirement (e.g. Leaf et al. 2017a). That is, if mastery is to be performance based, as in the present study, it seems likely that more than 40 hr of training will be required for a trainee to demonstrate competency in a wide variety of skills.

Third, we were only able to provide data for one child participant (i.e. Jackson) in this study. Thus, it is not known what progress the other two child participants made. We were unable to present data on the other two child participants because consent was not obtained to provide their data and because child data was a supplementary measure. Nevertheless, this is a limitation of the study and future researchers should provide all child data. Fourth, all child participants were young and therefore we do not know what the effects of training would be when working with an older population.

Finally, all participants made large improvements by the first formal probe, followed by gradual improvements observed throughout the rest of intervention. The reasons for this trend, observed across all participants, remain unknown. It is possible that some initial skills that do not require in-the-moment assessment are easier to acquire (e.g. using age appropriate reinforcers) while skills that require in-the-moment assessment require more training (e.g. changing presumed reinforcers when they are not producing the desired behavior change). Future researchers could conduct an analysis of the time to mastery for skills that require and do not require in-the-moment assessment. This analysis could provide critical information on the time required when training more complex skills.

Despite the aforementioned limitations we hope this study helps inspire additional studies evaluating staff training methods that closely align to the terminal context. That is, studies that shift from more decontextualized settings to those that evaluate interventionist behavior for longer periods of time (e.g. 1–3 hr), a variety of skills, and several clients and contexts. These kinds of studies will be necessary to continue to progress behavior analytic training technology and document its effectiveness in real world, applied settings.

Disclosure statement

No potential conflict of interest was reported by the author(s).

References

  • American Psychiatric Association . 2013. Diagnostic and statistical manual of mental disorders. 5th ed. Washington, DC: Author. [Google Scholar]
  • Behavior Analyst Certification Board . (n.d.). BACB certificant data. Available at: .
  • Behavior Analyst Certification Board . 2016. Registered behavior technician™ (RBT®) task list. Available at: .
  • Bishop, M. R. and Kenzer, A. L.. 2012. Teaching behavioral therapists to conduct brief preference assessments during therapy sessions. Research in Autism Spectrum Disorders, 6, 450–457. [Google Scholar]
  • Bolton, J. and Mayer, M. D.. 2008. Promoting the generalization of paraprofessional discrete trial teaching skills. Focus on Autism and Other Developmental Disabilities, 23, 103–111. [Google Scholar]
  • Catania, C. N., Almeida, D., Liu-Constant, B. and DiGennaro Reed, F. D.. 2009. Video modeling to train staff to implement discrete-trial instruction. Journal of Applied Behavior Analysis, 42, 387–392. [PMC free article] [PubMed] [Google Scholar]
  • Cihon, J. H., Ferguson, J. L., Leaf, J. B., Leaf, R., McEachin, J. and Taubman, M.. 2019. a. Use of a level system with flexible shaping to improve synchronous engagement. Behavior Analysis in Practice, 12, 44–51. [PMC free article] [PubMed] [Google Scholar]
  • Cihon, J. H., Ferguson, J. L., Milne, C. M., Leaf, J. B., McEachin, J. and Leaf, R.. 2019. b. A preliminary evaluation of a token system with a flexible earning requirement. Behavior Analysis in Practice, 12, 548–556. [PMC free article] [PubMed] [Google Scholar]
  • Cihon, J. H., Weinkauf, S. M. and Taubman, M.. 2017. Using the teaching interaction procedure to teach social skills for individuals diagnosed with autism spectrum disorder. In: Leaf J. B., ed. Handbook of social skills and autism spectrum disorder: Assessment, curricula, and intervention. AG Switzerland: Springer International Publishing, pp.313–323. [Google Scholar]
  • Deliperi, P., Vladescu, J. C., Reeve, K. F., Reeve, S. A. and DeBar, R. M.. 2015. Training staff to implement a paired‐stimulus preference assessment using video modeling with voiceover instruction. Behavioral Interventions, 30, 314–332. [Google Scholar]
  • Didden, R., Sturmey, P., Sigafoos, J., Lang, R., O’Reilly, M. F. and Lancioni, G. E.. 2012. Nature, prevalence, and characteristics of challenging behavior. In: Functional assessment for challenging behaviors. New York, NY: Springer, pp. 25–44. 10.1007/978-1-4614-3037-7_3 [CrossRef] [Google Scholar]
  • Eldevik, S., Hastings, R. P., Jahr, E. and Hughes, J. C.. 2012. Outcomes of behavioral intervention for children with autism in mainstream pre-school settings. Journal of Autism and Developmental Disorders, 42, 210–220. doi: 10.1007/s10803-011-1234-9 [PMC free article] [PubMed] [CrossRef] [Google Scholar]
  • Fryling, M. J., Wallace, M. D. and Yassine, J. N.. 2012. Impact of treatment integrity on intervention effectiveness. Journal of Applied Behavior Analysis, 45, 449–453. [PMC free article] [PubMed] [Google Scholar]
  • Graff, R. B. and Karsten, A. M.. 2012. Evaluation of a self-instruction package for conducting stimulus preference assessments. Journal of Applied Behavior Analysis, 45, 69–82. [PMC free article] [PubMed] [Google Scholar]
  • Green, D. R., Ferguson, J. L., Cihon, J. H., Torres, N., Leaf, R., McEachin, J., Rudrud, E., Schulze, K. and Leaf, J. B.. 2020. The teaching interaction procedure as a staff training tool. Behavior Analysis in Practice, 13, 421–433. [PMC free article] [PubMed] [Google Scholar]
  • Howard, J. S., Sparkman, C. R., Cohen, H. G., Green, G. and Stanislaw, H.. 2005. A comparison of intensive behavior analytic and eclectic treatments for young children with autism. Research in Developmental Disabilities: A Multidisciplinary Journal, 26, 359–383. [PubMed] [Google Scholar]
  • Howard, J. S., Stanislaw, H., Green, G., Sparkman, C. R. and Cohen, H. G.. 2014. Comparison of behavior analytic and eclectic early interventions for young children with autism after three years. Research in Developmental Disabilities, 35, 3326–3344. [PubMed] [Google Scholar]
  • Kanne, S. M. and Mazurek, M. O.. 2011. Aggression in children and adolescents with ASD: Prevalence and risk factors. Journal of Autism and Developmental Disorders, 41, 926–937. [PubMed] [Google Scholar]
  • Kazdin, A. E. 1977. Artifact, bias, and complexity of assessment: The ABCs of reliability. Journal of Applied Behavior Analysis, 10, 141–150. [PMC free article] [PubMed] [Google Scholar]
  • Lavie, T. and Sturmey, P.. 2002. Training staff to conduct a paired-stimulus preference assessment. Journal of Applied Behavior Analysis, 35, 209–211. [PMC free article] [PubMed] [Google Scholar]
  • Leaf, R. B., Taubman, M. T., McEachin, J. J., Leaf, J. B. and Tsuji, K. H.. 2011. A program description of a community-based intensive behavioral intervention program for individuals with autism spectrum disorders. Education and Treatment of Children, 34, 259–285. doi: 10.1353/etc.2011.0012 [CrossRef] [Google Scholar]
  • Leaf, J. B., Aljohani, W. A., Milne, C. M., Ferguson, J. L., Cihon, J. H., Oppenheim-Leaf, M. L., McEachin, J. and Leaf, R.. 2019. Training behavior change agents and parents to implement discrete trial teaching: A literature review. Review Journal of Autism and Developmental Disorders, 6, 26–39. [Google Scholar]
  • Leaf, J. B., Leaf, R., McEachin, J., Taubman, M., Ala'i-Rosales, S., Ross, R. K., Smith, T. and Weiss, M. J.. 2016. Applied behavior analysis is a science and, therefore, progressive. Journal of Autism and Developmental Disorders, 46, 720–731. [PubMed] [Google Scholar]
  • Leaf, J. B., Leaf, R., McEachin, J., Taubman, M., Smith, T., Harris, S. L., Freeman, B. J., Mountjoy, T., Parker, T., Streff, T., Volkmar, F. and Waks, A.. 2017. a. Concerns about the Registered Behavior Technician™ in relation to effective autism intervention. Behavior Analysis in Practice, 10, 154–163. [PMC free article] [PubMed] [Google Scholar]
  • Leaf, J. B., Leaf, J. A., Milne, C., Taubman, M., Oppenheim-Leaf, M., Torres, N., Townley-Cochran, D., Leaf, R., McEachin, J. and Yoder, P.. 2017. b. An evaluation of a behaviorally based social skills group for individuals diagnosed with autism spectrum disorder. Journal of Autism and Developmental Disorders, 47, 243–259. [PubMed] [Google Scholar]
  • Leaf, J. B., Milne, C., Aljohani, W. A., Ferguson, J. L., Cihon, J. H., Oppenheim-Leaf, M. L., McEachin, J. and Leaf, R.. 2020. Training change agents how to implement formal preference assessments: A review of the literature. Journal of Developmental and Physical Disabilities, 32, 41–18. [Google Scholar]
  • Leaf, R., Taubman, M., Bondy, A. and McEachin, J.. 2008. To BCBA or Not to B? In: Leaf R., McEachin J. and Taubman M., ed. Sense and nonsense in the behavioral treatment of autism: It has to be said. 1st ed. DRL Books Inc: New York, NY, pp.49–54. [Google Scholar]
  • Lovaas, O. I. 1987. Behavioral treatment and normal educational and intellectual functioning in young autistic children. Journal of Consulting and Clinical Psychology, 55, 3–9. [PubMed] [Google Scholar]
  • Maenner, M. J., Shaw, K. A., Baio, J., Washington, A., Patrick, M., DiRienzo, M., Christensen, D. L., Wiggins, L. D., Pettygrove, S., Andrews, J. G., Lopez, M., Hudson, A., Baroud, T., Schwenk, Y., White, T., Rosenberg, C. R., Lee, L.-C., Harrington, R. A., Huston, M., Hewitt, A., Esler, A., Hall-Lande, J., Poynter, J. N., Hallas-Muchow, L., Constantino, J. N., Fitzgerald, R. T., Zahorodny, W., Shenouda, J., Daniels, J. L., Warren, Z., Vehorn, A., Salinas, A., Durkin, M. S. and Dietz, P. M.. 2020. Prevalence of autism spectrum disorder among children aged 8 years - autism and developmental disabilities monitoring network, 11 sites, United States, 2016. Morbidity and Mortality Weekly Report. Surveillance Summaries (Washington, D.C.: 2002), 69, 1–12. [PMC free article] [PubMed] [Google Scholar]
  • McEachin, J. J., Smith, T. and Lovaas, O. I.. 1993. Long-term outcome for children with autism who received early intensive behavioral treatment. American Journal of Mental Retardation, 97, 359–372; discussion 373–391. [PubMed] [Google Scholar]
  • Nosik, M. R. and Williams, L.. 2011. Component evaluation of a computer based format for teaching discrete trial and backward chaining. Research in Developmental Disabilities, 32, 1694–1702. [PubMed] [Google Scholar]
  • Nosik, M. R., Williams, L., Garrido, N. and Lee, S.. 2013. Comparison of computer based instruction to behavior skills training for teaching staff implementation of discrete-trial instruction with an adult with autism. Research in Developmental Disabilities, 34, 461–468. 10.1016/j.ridd.2012.08.011 [PubMed] [CrossRef] [Google Scholar]
  • Pence, S. T., St Peter, C. C. and Tetreault, A. S.. 2012. Increasing accurate preference assessment implementation through pyramidal training. Journal of Applied Behavior Analysis, 45, 345–359. [PMC free article] [PubMed] [Google Scholar]
  • Reichow, B. 2012. Overview of meta-analyses on early intensive behavioral intervention for young children with autism spectrum disorders. Journal of Autism and Developmental Disorders, 42, 512–520. [PubMed] [Google Scholar]
  • Richards, C., Oliver, C., Nelson, L. and Moss, J.. 2012. Self-injurious behaviour in individuals with autism spectrum disorder and intellectual disability. Journal of Intellectual Disability Research : Jidr, 56, 476–489. [PubMed] [Google Scholar]
  • Roscoe, E. M. and Fisher, W. W.. 2008. Evaluation of an efficient method for training staff to implement stimulus preference assessments. Journal of Applied Behavior Analysis, 41, 249–254. [PMC free article] [PubMed] [Google Scholar]
  • Shireman, M. L., Lerman, D. C. and Hillman, C. B.. 2016. Teaching social play skills to adults and children with autism as an approach to building rapport. Journal of Applied Behavior Analysis, 49, 512–531. [PubMed] [Google Scholar]
  • St Peter Pipkin, C., Vollmer, T. R. and Sloman, K. N.. 2010. Effects of treatment integrity failures during differential reinforcement of alternative behavior: A translational model. Journal of Applied Behavior Analysis, 43, 47–70. [PMC free article] [PubMed] [Google Scholar]
  • Watson, P. J. and Workman, E. A.. 1981. The non-concurrent multiple baseline across-individuals design: An extension of the traditional multiple baseline design. Journal of Behavior Therapy and Experimental Psychiatry, 12, 257–259. [PubMed] [Google Scholar]
  • Weinkauf, S. M., Zeug, N. M., Anderson, C. T. and Ala’i-Rosales, S.. 2011. Evaluating the effectiveness of a comprehensive staff training package for behavioral interventions for children with autism. Research in Autism Spectrum Disorders, 5, 864–871. [Google Scholar]

