
Monitoring Executive Functioning for Improvement

  • Writer: ConnectedMTSS
  • Jan 30, 2021
  • 5 min read

At the high school level*, I estimate that I hear the term “executive functioning” (EF) hourly. I have been curious to ask those using the term for a definition; my prediction is that I would receive a slightly different variation from each person I asked. Furthermore, I estimate seeing the diagnosis of “executive functioning disorder” (EFD) at least once or twice a month. Without getting controversial or offering my opinion: based on the five minutes of searching I conducted, executive functioning disorder is not in the DSM-5.


Cirino and Willcutt (2017) provided a brief background on EF in the introduction to a special issue of the Journal of Learning Disabilities. They defined EF as “domain-general control processes important for managing goal-directed behaviors.” Taking that a bit further, the authors also reference Lezak (1983) in identifying four component processes of EF: “goal formation, planning, carrying out goal-directed plans, and effective performance.” EF is how our students plan, organize, and complete work.


Whether EFD is real, or how significantly students are impacted by it, is not my purpose here. However, I have noticed that the range of goals on IEPs and measures of behavior related to EF is astonishingly vast, and the quality of those goals and measures is equally variable. Plus, I have seen that some professionals simply lack the background and/or time to develop defensible goals and measures of behavior related to EF. I intend to provide a brief, defensible measure and a framework for how it could be used to monitor and help students improve their academic behaviors.


Those I worked with in the past will likely roll their eyes at this, because I mentioned DBR to each one of them, repeatedly. Direct Behavior Ratings (DBRs) are behavior monitoring tools that are flexible, quick, efficient, reliable, and valid. Behavior is operationalized, and respondents indicate the degree to which they observed the target behavior during a specific window of time. For example, the “Big 3” behaviors listed on DBR are engagement, respect, and disruption. A teacher could be asked to estimate the percentage of time a student was engaged during U.S. History on a scale from 1 to 10, with 1 indicating 0-10% (rarely) and 10 indicating the student was engaged almost 100% of the time. Some teachers will push back, saying that DBR seems like a very inconsistent or imprecise measure. If used only once or twice, DBR is unreliable; the reliability comes from using DBRs repeatedly and collecting behavioral data over a few weeks.
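To make the rating format concrete, here is a minimal Python sketch of what a single DBR record might look like. The class, field names, and the percentage mapping are my own illustration under the 1-10 scale described above, not part of the official DBR forms.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record for one DBR rating; field names are illustrative,
# not taken from the official DBR materials.
@dataclass
class DBRRating:
    student: str
    behavior: str        # e.g., "Engagement", "Respect", "Disruption"
    class_period: str    # e.g., "U.S. History"
    rating_date: date
    score: int           # 1-10 scale described above

    def __post_init__(self):
        if not 1 <= self.score <= 10:
            raise ValueError("DBR score must be between 1 and 10")

    def approx_percent_of_time(self) -> str:
        # Rough translation of the 1-10 rating back to a percentage band,
        # following the example in the text (1 ~ 0-10%, 10 ~ 90-100%).
        low = (self.score - 1) * 10
        return f"{low}-{low + 10}% of the observed period"

# Example: a teacher rates a student's engagement during U.S. History.
rating = DBRRating("Student A", "Engagement", "U.S. History", date(2021, 1, 25), 7)
print(rating.approx_percent_of_time())   # -> "60-70% of the observed period"
```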


The disclaimer here is that DBRs are not recommended for high-intensity behaviors such as aggression or violence. Those are a different class of behavior that calls for different measures and interventions.


There appears to be a wide range of behaviors and definitions for EF. However, it seems logical that behaviors that interfere with a student planning how to complete a task, sustaining focus, or sustaining effort could lead to declines in academic skill acquisition or performance. Conversely, increasing the behaviors that enable planning, organizing, and work completion could enable academic success.


Another disclaimer: the NCII tools chart lists the recommended grades for DBR as K-8. My interpretation is that DBR is recommended as a widely used tool in those grades (e.g., Tier 2 and 3). I am proposing DBR for high school use if students are conferenced with privately and included in the process of defining behaviors, setting goals, and reviewing progress. Self-reported behavior ratings are also recommended to be completed privately, which is likely a good idea in grades K-8 too. With those caveats, DBR at the high school level would likely be limited to a few students, most of them with IEPs. With this in mind, here is a proposed plan for how to use DBR adjacent to EF in a secondary setting.


  1. Operationalize the Target Behavior. Keep the behaviors relatively simple and make sure the student knows and understands the target behaviors. For high school, conference, discuss, and share the results privately to maintain buy-in. When starting, it might be acceptable, for the sake of efficiency, to adopt the DBR behaviors of Engagement, Respect, and Disruption. Although these may not be the perfect fit for every student, the DBR researchers developed those definitions after extensive research and analysis. If students demonstrate engaged behavior, grades or academic performance will likely increase. Thinking back to Psych 101, consider developing target behaviors that are observable, measurable, and repeatable AND in line with the definition of EF. Seek out behaviors that increase planning, organization, and work completion.

    1. For other behavior examples, consider the Academic Enablers Handout based on DiPerna’s work from 2006.

  2. Collect baseline data: For an acceptable baseline, it is recommended to collect 7-10 data points and average them into a baseline score (a brief sketch after this list illustrates the baseline, goal, and graphing steps).

  3. Set a Goal for each target behavior: For positive behaviors, the target is 8 or above and for negative behaviors (disruption), the goal is 2 or below (Chafouleas, Kilgus, & Hernandez, 2009).

  4. Establish a measurement schedule: From personal experience, adherence to completing any form or rating tends to lose fidelity over a sustained period of time. Ideally, the schedule may include having a teacher or teachers respond to DBRs (online or paper) twice a week. It is also highly recommended that the scores/data be entered as soon as possible to avoid the dreaded backlog. This is one advantage of online forms (e.g., creating a DBR in Google or Microsoft Forms) or the DBR Connect product.

  5. Graph the data and provide updates to the student. Feedback can be powerful, and graphs can spark meaningful conversations. Regular updates can be impactful.

  6. When scores are consistently at or above 8 (or at or below 2 for negative behaviors), consider modifying the target behavior or reducing the number of goals, provided other indicators agree (e.g., grades, ODRs).
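For those who want to see the arithmetic and the graph spelled out, here is a minimal Python sketch of steps 2, 3, 5, and 6. The scores, function names, and the choice of four consecutive ratings as “consistently” are my own illustrative assumptions; in practice, the ratings might be exported from a Google or Microsoft Form or from DBR Connect.

```python
import statistics
import matplotlib.pyplot as plt

# Hypothetical DBR scores for one target behavior (1-10 scale), oldest first.
engagement_scores = [4, 5, 4, 6, 5, 6, 7, 7, 8, 8]

def baseline(scores, n_points=8):
    """Step 2: average the first 7-10 ratings to establish a baseline."""
    if not 7 <= n_points <= 10:
        raise ValueError("Use 7-10 data points for an acceptable baseline")
    return statistics.mean(scores[:n_points])

def goal_met(scores, positive=True, streak=4):
    """Step 6: are the most recent ratings consistently at/above 8 (positive
    behaviors) or at/below 2 (disruption)? The 4-rating streak is an assumption."""
    window = scores[-streak:]
    return all(s >= 8 for s in window) if positive else all(s <= 2 for s in window)

print(f"Baseline: {baseline(engagement_scores):.1f}")
print(f"Goal met on the last 4 ratings: {goal_met(engagement_scores)}")

# Step 5: graph the ratings against the goal line for the student conference.
plt.plot(range(1, len(engagement_scores) + 1), engagement_scores,
         marker="o", label="Engagement")
plt.axhline(8, linestyle="--", label="Goal (8 or above)")
plt.ylim(0, 10)
plt.xlabel("Rating occasion")
plt.ylabel("DBR score (1-10)")
plt.legend()
plt.show()
```

The same script can simply be rerun as new ratings come in, so the graph shared at the student conference stays current.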


If gaps in the data occur, re-establish the baseline using the procedure described above. Averaging 7-10 data points can also be used when certain team members do not complete their DBRs.


Although DBR is generally recommended for use in grades K-8, I could imagine it being modified a bit to benefit high school students. Target behaviors can be tailored to EF-related behavior, both to communicate expectations to students and to give them feedback about whether those behaviors are seen in class. DBR offers the flexibility to change behaviors and monitoring schedules to fit a student’s unique needs. Furthermore, case managers may appreciate the efficiency and defensibility inherent in DBRs and the ability to keep parents and students informed through progress reports or regular updates. I have been a broken record extolling the value of DBRs for years now. I see incredible value in using this tool in ways that could benefit students and increase efficiency and effectiveness at the high school level.


Please let me know.


Are you using DBR?

If so, how is it going?

What grade level and for what behaviors?

Would you consider this at the high school and why or why not?


@connectedmtss


*I serve in suburban high schools. I acknowledge this may not apply in other settings as easily.




 
 
 
