Complete the charts in the Word document (two tables for each article) in APA 7th edition format using the information provided in the articles.
The topic of our research is:
Increasing on-task/time-on-task behavior during distance learning in elementary-level gifted students who are also diagnosed with ADHD.
At the top of each table, include the reference for the article. An example is included in the Word document.
Recommended Citation
Riley, Jessica L.; McKevitt, Brian; Shriver, Mark D.; and Allen, Keith D., "Increasing On-Task Behavior Using Teacher Attention Delivered on a Fixed-Time Schedule" (2011). Psychology Faculty Publications. 217. https://digitalcommons.unomaha.edu/psychfacpub/217
Increasing On-Task Behavior Using Teacher Attention Delivered
on a Fixed Time Schedule
Jessica L. Riley, MS and Brian C. McKevitt, PhD1
Mark D. Shriver, PhD and Keith D. Allen, PhD2
1J. L. Riley & B. C. McKevitt, Department of Psychology, University of Nebraska at Omaha,
Omaha, NE. USA.
2M. D. Shriver & K. D. Allen, Department of Psychology, Munroe-Meyer Institute, University of
Nebraska Medical Center, Omaha, NE. USA.
Correspondence concerning this manuscript should be addressed to Brian C. McKevitt,
PhD, Department of Psychology, University of Nebraska at Omaha, 6001 Dodge St., Omaha, NE
68182. Phone: 402-554-2498. Fax: 402-554-2556. E-mail: bmckevitt@unomaha.edu.
Abstract
The effectiveness of fixed-time delivery of attention to increase the on-task behavior of 2
students in general education was examined. The teacher in this study provided attention to
students on a 5-min fixed-time schedule and responded to students in her typical manner between
cued intervals. An ABAB withdrawal design was used to test the effects of the intervention. The
results of this study indicate that a fixed-time schedule of attention was effective in increasing
students’ on-task behavior and decreasing their off-task behavior. Implications of the study for
research and practice are discussed.
Key Words: Fixed time schedule, reinforcement, teacher attention, intervention, on-task
behavior
Increasing On-Task Behavior Using Teacher Attention Delivered
on a Fixed Time Schedule
Classroom management is an important component of effective teaching (Algozzine,
Ysseldyke, & Elliott, 1997). Teachers must be able to keep students engaged in academic tasks,
as well as employ strategies to reduce classroom disruptions and inappropriate classroom
behaviors. Managing inappropriate behaviors and classroom disruptions is time-consuming and
takes away from valuable instructional time and time that students can engage in academic
behaviors. Students who frequently engage in off-task and inappropriate behaviors disrupt the
classroom environment and hinder learning. Teachers working with these students need effective
strategies to increase student compliance and engaged academic behaviors, as well as to reduce
off-task and inappropriate classroom behaviors and disruptions (Alberto & Troutman, 2009).
The implementation of fixed-time (FT) delivery of teacher attention has been found to be
effective in reducing off-task and disruptive behavior in school settings (Austin & Soeda, 2008;
Jones, Drew, & Weber, 2000). Austin and Soeda (2008) found that the delivery of attention
using a FT schedule was effective in reducing the off-task behaviors of two typically developing
third grade students in a general education classroom. After a functional analysis determined
attention to be maintaining both students’ disruptive behavior, the researchers worked with the
classroom teacher to identify a feasible schedule of reinforcement. During the FT attention
condition, the teacher provided the students with attention every 4 min. The teacher praised the
students for on-task behavior and redirected the students if they were off-task. The teacher was
asked to ignore the students’ appropriate and inappropriate behavior that occurred between the
intervals (i.e., extinction).
Austin and Soeda (2008) found immediate and sustained reductions in the percentage of
intervals off-task using this schedule of FT reinforcement. During the condition in which the
teacher implemented the FT schedule of reinforcement delivery, the students’ off-task behaviors
decreased; following a return to baseline the percentage of intervals off-task for both students
increased, but once the teacher resumed FT reinforcement delivery, the off-task behaviors of the
students immediately decreased. The results were maintained within a different setting and also
when the intervention was implemented by a different teacher.
Austin and Soeda (2008) conceded that extinction was likely a contributing factor to the
intervention’s effectiveness. Given that the teacher ignored the problematic behavior of the
students during the intervention condition, the effects of the intervention may have been
attributed to extinction for off-task behaviors between intervals and increased attention for
students’ on-task behaviors at cued intervals. Another potential explanation offered for the
decrease in off-task behaviors was differential reinforcement. The teacher responded with praise
for on-task behaviors or a brief redirection for off-task behaviors. The researchers reported that
there may have been an increased association between engaging in on-task behaviors and the
delivery of teacher praise.
A FT schedule of attention delivery requires teacher attention be delivered at certain pre-
determined time intervals. FT attention delivery is presumed to work because an individual’s
motivation to engage in problematic behavior to obtain attention may be reduced as a result of
the freely available reinforcer (Lalli, Casey, & Kates, 1997). Given that the delivery of attention
is based on time and not occurrences of a target behavior, FT attention delivery is sometimes
referred to as noncontingent reinforcement or NCR (Waller & Higbee, 2010). However, to be
classified as NCR, the stimulus delivered on the time-based schedule should be unrelated to the
target behavior and not contingent upon its occurrence (Alberto & Troutman, 2009; Kazdin,
2001). In schools, it may be unlikely that teachers will provide students with comments that are
unrelated to the delivery of desired target behaviors. In fact, teachers are commonly taught to
make behavior-specific praise statements (Alberto & Troutman, 2009). Therefore, providing
teachers with cues to make statements related to praising desired behavior or redirecting
undesired behavior may be considered more acceptable than typical NCR delivery used with
students with more severe disabilities that involves neutral statements.
Furthermore, extinction is a common component of NCR. Teachers are instructed to
ignore problematic behaviors between cued intervals of attention delivery (Alberto & Troutman,
2009). However, teachers may find it challenging and unacceptable to ignore problem
behaviors between intervals of attention, especially when those problem behaviors disrupt the
learning environment. Allowing teachers to address behaviors between intervals of fixed-time
attention might serve to make the intervention more acceptable.
FT attention delivery is an intervention that can be reasonably implemented by a
classroom teacher without a significant amount of time and cost (Austin & Soeda, 2008). The
density of the schedule should depend on the frequency of the problematic behaviors. Leaner
schedules, which may be more feasible for teachers, may be used in the classroom for students
whose off-task or disruptive behavior disrupts their learning or the learning of other students in
the class, while denser schedules may be used with students with more severe behaviors.
Mautone et al. (2009) reported that the time and resources required to implement
interventions, as well as the complexity of the intervention, are characteristics that may impact
the treatment integrity of interventions implemented in the classroom. Educators prefer
interventions that are time efficient, minimally intrusive, and that increase appropriate classroom
behaviors and skills (Elliott, Witt, & Kratochwill, 1991). Given that FT reinforcement delivery
may be a time and resource efficient intervention for teachers to implement in their classroom to
address disruptive behaviors of students, teacher acceptability of the intervention may increase.
Teacher acceptability of the intervention is critical for treatment integrity and intervention
success (Mautone et al., 2009). Austin and Soeda (2008) indicated that involving the teacher in
decisions about the schedule of reinforcement may also have increased acceptability of the
intervention.
The purpose of this study was to examine the effectiveness of FT attention delivery for increasing on-task behavior and producing a co-occurring reduction in off-task behavior in general education students. Studies applying FT attention to a general education population are
uncommon but important because FT can be a highly manageable intervention for a general
education teacher to implement (Austin & Soeda, 2008). The present study replicates the study
conducted by Austin and Soeda (2008) thereby adding to the existing evidence for the
effectiveness of a FT schedule of attention as an acceptable method for increasing on-task
behavior and decreasing off-task behavior in a general education setting. This study also
expands on Austin and Soeda’s (2008) study by examining if FT attention delivery is effective in
reducing off-task behaviors without the inclusion of extinction between intervals.
Method
Participants and Setting
Data were collected from two elementary school students and their classroom teacher in a
school located in a small Midwestern U.S. city. One 7-year-old female student, Sally, and one 7-year-old male student, Joey, were identified by their teacher as displaying off-task behavior,
including not remaining seated or keeping hands to self during large-group instruction, talking to
peers, calling out, getting out of their seats, disrupting instruction and disturbing other students.
The teacher reported that both students had the academic skills to complete coursework but that
these students engaged in more off-task behavior than their classroom peers. Joey had been
diagnosed with attention-deficit/hyperactivity disorder and was taking medication during the
course of the study. The classroom teacher had been teaching elementary-age students for nine
years.
The school in which the study took place had 225 students in grades Kindergarten
through fifth, with 60% of the school’s population eligible for free or reduced-price lunch. This
study took place in the classroom of the teacher and student participants. The classroom had 11
girls and 10 boys at the time of the study.
Recording and Reliability
The variables of interest that were recorded are included in the Behavioral Observation of
Students in Schools (BOSS) developed by Shapiro (2004). These variables included: (a) active
engaged time, defined as “those times when the student is actively attending to the assigned
work” (b) passive engaged time, defined as “those times when the student is passively attending
to assigned work” (c) off-task motor behaviors, defined as “any instance of motor activity that is
not directly associated with an assigned academic task” (d) off-task verbal behaviors, defined as
“any audible verbalizations that are not permitted and/or are not related to an assigned academic
task” (e) off-task passive behaviors, defined as “those times when a student is passively not
attending to an assigned academic activity for a period of at least three consecutive seconds” and
(f) teacher-directed instruction, defined as “those times when the teacher is directly instructing
the class or individuals within the class” (Shapiro, 2004, pp. 38-41). Shapiro designed the BOSS
to be used for classroom observations of independent seatwork, small group, or other
instructional events. Active and passive engaged time were reported as on-task behaviors in the
results of this study, while off-task motor, verbal, and passive behaviors were coded as off-task
behaviors. The behavioral categories were combined by adding the intervals together.
Direct observations of student on-task and off-task behavior were conducted by the first
author and trained graduate students using a 15-s momentary time sampling recording procedure.
A timer was used to indicate the time intervals when observations by the researchers occurred
and were recorded. At the beginning of the cued interval, the observer(s) looked at the behavior
of the target student or teacher and placed a mark in the appropriate box on the scoring sheet.
Direct observations took place three to four times a week during 30-min sessions in the
classroom during both teacher-led large-group and small-group instruction, as well as
independent and partner work. Observers rotated between each student participant after 5 min of
recording so that each student was observed for a total of 15 min. On some days during
observation sessions, students had to leave for other activities so the observation session was
limited to 12.5 to 13 min per student. Data were recorded for 17 sessions during the fall semester
of the academic school year.
A frequency count of teacher attention towards each student participant was also
recorded; teacher attention included teacher praise and redirection. Teacher praise towards the
student was recorded if the teacher said something positive to the student. Examples of teacher
praise included: “Great job (student name)” and “I like how hard you are working (student
name).” A redirection of student behavior made by the teacher was recorded if the teacher told
the student that he/she should not be engaging in a certain behavior and/or should be engaging in
a different behavior or task. Examples of redirection included: “(Student name), you should be
working on your paper,” “(Student name), you should not be talking to your friends right now,”
and “Keep your hands to yourself (student name).” Examples of teacher talk that were not scored
include: “(Student name), please read the next section out loud” and “(Student name), point out
on the board which word most correctly fits this sentence.”
The observations were conducted by the first author and a secondary observer for
interrater agreement checks. Graduate students were trained to use the 15-s momentary time
sampling procedure and how to code behaviors using the variables from the BOSS. The training
included teaching and reviewing the procedures, providing examples of behaviors that could be
coded, and also practicing the coding procedures. Trained graduate students simultaneously, but
independently, observed and recorded on-task and off-task behavior of the target students and the
praise and redirection statements made by the teacher during 23.5% of the sessions across all
phases of the study. Interrater agreement was calculated by dividing the number of intervals in which the two observers' codes agreed by the total number of intervals recorded during the
observation session. The mean interrater reliability for on-task and off-task behavior was 95%,
with a range of 80.77% to 100% accuracy. Interrater reliability for teacher praise and redirection
statements was 100%.
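Stated as a formula (the counts in the example below are hypothetical and illustrate only the arithmetic, not values reported for this study):

$$\text{Interrater agreement (\%)} = \frac{\text{intervals in which both observers recorded the same code}}{\text{total intervals recorded in the session}} \times 100$$

For example, with 15-s momentary time sampling across a 15-min observation of one student (60 intervals), agreement on 57 of 60 intervals would yield (57/60) × 100 = 95%.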
Treatment Integrity
To assess procedural integrity during intervention conditions, data were collected to
monitor the delivery of teacher attention to the target students. The delivery of teacher attention
had to have been provided every 5 min. Due to the teacher’s classroom responsibilities, such as
providing direct instruction to the large group or talking with another student, the teacher was
allowed to provide attention to the students up to 5 s after her cue and still have it coded as a
correct delivery. Treatment integrity was calculated by dividing the number of intervals in which the teacher provided attention by the number of intervals in which the teacher was cued, then multiplying by 100%. Treatment integrity during 8 of the 9 sessions in the intervention
conditions was 100%; one intervention session was implemented with 80% integrity.
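Expressed as a formula (the cue counts below are illustrative only; the study reports only the resulting percentages):

$$\text{Treatment integrity (\%)} = \frac{\text{cued intervals in which attention was delivered within 5 s of the cue}}{\text{total cued intervals}} \times 100$$

For instance, responding to every cue in a session with six cues gives (6/6) × 100 = 100%, whereas missing one of five cues gives (4/5) × 100 = 80%.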
Procedure
Informed consent was obtained from the teacher participant and the parents of the student
participants. The author met with the teacher participant to explain the benefits of using FT with
students who display off-task behavior. The author then described the FT procedure, including
the teacher’s role during the baseline and intervention conditions and a brief description of the
data collection procedures and interrater observations.
Functional assessment. The teacher was interviewed using the Functional Behavioral
Assessment Interview Form developed by Gable, Quinn, Rutherford, and Howell (1998). The
purpose of the interview was to develop hypotheses regarding the function of off-task and
disruptive behavior of the student participants. The information obtained during the interview
included a description of the problem behavior, when and where the target behavior was most
and least likely to occur, and what happened before, during, and after the problem behavior.
Direct observations of the student participants were then conducted by the author using anecdotal
recording. Data were recorded on the occurrence and description of the problem behavior, and
what occurred directly before and after the problem behavior of each student participant. The
teacher and the first author used interview and observation information to generate a hypothesis
that the off-task behavior of each student participant was maintained by teacher and peer
attention.
Baseline. During the baseline phase, no changes in teacher behavior were made; the
teacher was asked to respond to the students in her typical manner which consisted of brief
reprimands or redirections for off-task or inappropriate behavior and praise for on-task and
desirable behavior. Direct observations took place by the first author and secondary observer to
record the on-task and off-task behavior of each student participant. Observations of student
behavior occurred during large and small group teacher-led instruction and independent work
time.
FT attention. The teacher and the author agreed upon a 5-min schedule of reinforcement
which was based upon the baseline levels of teacher attention for each participating student. The
teacher provided attention to each student about every 15 min during baseline. The teacher
agreed that providing the students attention every 5 min seemed feasible and would increase
rates of teacher attention to three times the rate originally provided in the baseline conditions.
Austin and Soeda (2008) chose a 4-min schedule of reinforcement in their study; this was
effective in decreasing off-task behaviors of students. Considering the rates of the students’ off-
task behaviors, we hypothesized that an even leaner schedule of reinforcement could be effective
in decreasing these behaviors.
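In terms of rates per minute, this schedule change can be summarized as follows (these figures simply restate the 15-min baseline and 5-min intervention schedules described above):

$$r_{\text{baseline}} \approx \frac{1 \text{ delivery}}{15 \text{ min}} \approx 0.07 \text{ per min}, \qquad r_{\text{FT}} = \frac{1 \text{ delivery}}{5 \text{ min}} = 0.20 \text{ per min} \approx 3 \times r_{\text{baseline}}$$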
The teacher was cued to provide attention to the student participants using a MotivAider®
device that was set to vibrate every 5 min. Upon receiving the cue, the teacher provided brief,
individual attention to each student participant; the students were praised for on-task behavior
and the teacher redirected off-task behavior. The teacher was asked to alternate providing
attention to each student first. During the intervention condition, the teacher was allowed to
redirect and reinforce student participants’ behavior between intervals as she typically would.
Experimental Design and Data Analysis
This study utilized an ABAB withdrawal design with both student participants in order to
assess the presence of a functional relationship between intervention implementation and student
behavior. During the baseline (A) condition, the teacher was asked to respond to the student
participants’ behavior in her usual manner. During the FT attention (B) condition, the teacher
provided attention to the students’ behavior following a vibratory cue at fixed intervals. Between
the cued intervals, the teacher responded to the students in her typical manner. The initial
intervention phase was followed by a withdrawal of the intervention and a return to baseline (A).
The FT attention intervention condition (B) was then reinstated.
The on-task and off-task behaviors of each student were recorded in the same way during
the baseline and intervention phases to allow for an analysis of the effects of the intervention.
This design demonstrates the relations between the implementation of the intervention and the
changes in target behavior (Tankersley, Harjusola-Webb, & Landrum, 2008). In addition, rates
of teacher praise and redirections were recorded for each condition to analyze the effect of the
FT attention intervention on increasing the unprompted rates of teacher attention.
Results
The on-task behaviors of each student are presented in Figure 1. On- and off-task
behaviors were mutually exclusive in the coding system, thus only on-task behaviors are
displayed. For Sally, there is a clear and immediate increase in on-task behaviors at the start of
the intervention (M=90.43%; range 80% to 96.7%) relative to baseline (M=69.5%; range 50% to
91.7%). This increase in the percentage of intervals of on-task behavior is maintained throughout
the intervention condition. There is an immediate decrease in on-task behavior with a return to
baseline (M=69.58%; range 61.7% to 83.3%), followed by an increase in on-task behaviors when
the intervention is reinstated (M=91.7%; range 83.3% to 96.7%). The mean percentage of intervals of off-task behavior was greater in the first baseline condition (M=30.95%; range 8.3% to 50%) and the second baseline condition (M=30.43%; range 16.7% to 38.3%) than in the first intervention condition (M=9.58%; range 3.3% to 20%) and the second intervention condition (M=8.3%; range 3.3% to 16.7%).
For Joey, there is a distinct increase in on-task behaviors at the start of the intervention,
which is sustained throughout the intervention condition (M=80%; range 76.7% to 86.7%)
relative to baseline (M=59.85%; range 51.9% to 75%). There is a decrease in the percentage of
intervals of on-task behavior after a return to baseline (M=62.5%; range 48.3% to 71.7%). On-
task behaviors then increase when the intervention condition is reinstated (M=80.05%; range
71.7% to 88.5%). The mean percentage of intervals of off-task behavior was greater in the first baseline condition (M=40.15%; range 25% to 48.1%) and the second baseline condition (M=37.5%; range 28.3% to 51.7%) than in the first intervention condition (M=20%; range 13.3% to 21.7%) and the second intervention condition (M=18.7%; range 11.5% to 28.3%).
The rate of teacher attention (i.e., praise and redirection statements) made per minute
during baseline and intervention conditions is presented in Figure 2. The rate of praise statements
made by the teacher to Sally increased during intervention conditions when compared to baseline
conditions. The mean rate of praise statements for Sally during the first baseline condition
(M=.03; range 0 to .10) increased during the intervention condition (M=.17; range .13 to .20).
Praise statements decreased during the second baseline condition (M=.02; range 0 to .07) and
then increased again during the second intervention condition (M=.15; range .07 to .20).
Likewise, the number of redirections for Sally increased during the intervention conditions when
compared to baseline, although redirections occurred less frequently than praise statements in the
intervention conditions. The mean rate of redirection statements during the first baseline
condition (M=.04; range 0 to .10) increased during the first intervention condition (M=.10; range
.03 to .17). In the second baseline condition, the mean rate of redirections again decreased
(M=.02; range 0 to .03), then increased in the second intervention condition (M=.13; range 0 to
0.13).
The same pattern of increases occurred for Joey; the mean rate of praise statements in the
first baseline (M=0.01; range 0 to .03) and in the second baseline (M=0.02; range 0 to .03)
increased in the first intervention condition (M=.18; range .13 to .20) and in the second
intervention condition (M=.15; range .12 to .20). The mean rate of redirections for Joey in the
first baseline (M=.05; range .03 to .07) and in the second baseline (M=.07; range 0 to .13) also
increased in the first intervention condition (M=.09; range .03 to .17) and in the second
intervention condition (M=.09; range .03 to .20). Like Sally, praise statements occurred more
frequently than redirections for Joey.
Figure 3 depicts the rates of total teacher attention (i.e., praise and redirection statements
combined) during baseline and intervention conditions. As expected, rates of teacher attention
for each student increased considerably following the introduction of FT because of the cued
prompts provided to the teacher. However, a question remained whether the teacher provided
attention beyond the cued prompts, thus demonstrating that she did not engage in extinction
between teacher prompts. For the intervention, the teacher was asked to provide attention to the
students every 5 min, or at a rate of .20 times per min (as depicted by the horizontal line in
Figure 3). Because the rates of attention provided exceed .20 per min during the intervention
conditions, this graph illustrates that the teacher also provided attention to each student between
the cued intervals and thus did not engage in extinction. In fact, when we subtracted the teacher
attention provided on cue from the total teacher attention provided (as illustrated in Figure 3), we
found that the rates of teacher attention provided without cues in baseline (M=.07 for Joey and
M=.06 for Sally) were almost identical to those provided without cues during the intervention (M
=.08 for Joey and M= .06 for Sally), further demonstrating that the teacher proceeded with her
typical routine between cued prompts.
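The decomposition underlying Figure 3 can be written as follows, where the cued rate reflects the 5-min schedule described above:

$$r_{\text{unprompted}} = r_{\text{total}} - r_{\text{cued}}, \qquad r_{\text{cued}} = \frac{1}{5 \text{ min}} = 0.20 \text{ per min}$$

Given the reported unprompted means of about .08 (Joey) and .06 (Sally) per min during intervention, the implied total rates are roughly .28 and .26 per min, which is why the intervention data points in Figure 3 sit above the .20 reference line.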
Discussion
The findings of this study indicate that the provision of FT attention was effective in
increasing the on-task behaviors of both student participants. With an increase in the percentage
of intervals of student engagement in on-task behaviors, there was a decrease in the occurrence
of incompatible off-task and disruptive behaviors. This study demonstrates that FT attention
delivery can be an effective strategy used to increase the on-task behaviors and decrease the off-
task behaviors of typically-developing students. These findings are similar to the results of the
Austin and Soeda (2008) study in which off-task behaviors of students decreased upon the
implementation of FT.
In this study, FT attention delivery was differentiated from NCR because the attention
provided by the teacher was related to the behaviors she observed. Commonly, teachers make
behavior specific statements, so the provision of attention related to observed behaviors in this
FT intervention has practical applicability for teachers. Furthermore, the extinction component
typically required when implementing FT or NCR was removed for this study, also enhancing its
practical application. Even without the inclusion of extinction between cued intervals, FT was
still effective in increasing the on-task behavior of the student participants. Teachers may be
more willing to implement an intervention which would allow them to respond to student
behaviors between cued intervals (e.g., by redirecting disruptive behavior during instruction).
The teacher in this study stated that she liked the intervention and that it was easy to use.
She believed that this intervention was not a distraction to learning and teaching in her
classroom. She reported that using the Motivaidor® cueing device reminded her to provide
reinforcement to the students in the study and also to other students in the classroom. The teacher
understood how important it was to provide more frequent attention to these students and also to
provide more positive reinforcement by praising the students for desirable behaviors. She
reported that she is trying to build in more reinforcement for all the students in her class.
This study also indicates that a relatively lean schedule of reinforcement is sufficient in
increasing the on-task behaviors of students. Research by Lalli et al. (1997) also found that lean
schedules of reinforcement were effective in decreasing aggressive behaviors, as long as the
schedule did not result in reinforcement deprivation. The schedule of reinforcement used in this
study was determined after identifying baseline levels of teacher attention towards the students.
The teacher in this study provided teacher attention about every 15 min to each student; in order
to prevent deprivation, the delivery of reinforcement during the intervention conditions was
increased to once every 5 min for each student. This schedule of reinforcement was acceptable to the
teacher, and increased the amount of teacher attention each student received in the intervention
conditions relative to baseline conditions.
We hypothesize that the provision of teacher attention on a fixed-time schedule of
reinforcement served to weaken the contingency between problem behaviors of the students and
existing reinforcement. Teacher attention became freely available to these students without their
engagement in problematic behaviors. The rate of teacher attention was increased to prevent
deprivation and to provide students with more frequent access to reinforcement. The students
received teacher praise for positive behaviors more often, and the teacher maintained the ability
to redirect the students if they were off-task.
However, because the teacher’s comments were specific to the behaviors she observed, it
may be the case that increases in on-task behavior resulted from a reinforcing effect of praise,
and decreases in off-task behavior resulted from a punishing effect of redirection statements.
These effects may have occurred independent of an existing contingency that the FT intervention
attempted to remove, especially given the lack of extinction between cued intervals. Further
research is needed to address possible mechanisms for the intervention effects observed in
relation to the content of teacher verbal statements.
The inclusion of an extinction component may be a more essential component of FT if
the target students more frequently engage in off-task behavior or engage in more disruptive
classroom behaviors. The off-task behaviors of the students in this study (e.g. out-of-seat,
fidgeting with objects, etc.) may have been less disruptive to the class compared to other off-task
behaviors (e.g. yelling out answers, running around the room, throwing objects, etc.). Future
research may need to identify the conditions when FT without an extinction component may be
more effective than FT with extinction between cued intervals.
Increasing the rate of attention delivery in this study increased the frequency of teacher
praise and redirection statements for each student. Thus, the FT intervention also served to
increase the number of positive teacher-student interactions, even though redirections continued.
Burnett (2002) found that students who reported positive relationships with their teachers
perceived the overall classroom environment as positive. Although not specifically measured, the
researcher noted anecdotally that the frequency of teacher praise for other students in the
classroom also increased. During FT, the teacher also responded positively to other students in
the class. The teacher tried to make it less noticeable that she was targeting the student
participants during the study, so she provided praise to other students for engaging in on-task
behaviors. Implementing FT, even if the intervention is directed at only a few students, may
increase the frequency of positive teacher statements towards other students in the classroom
which may lead to a more positive classroom environment.
One limitation of this study was the length of intervention implementation. Future studies
may be conducted to identify strategies to maintain the effects of the intervention over time.
Similarly, there were only four data points per condition and a lack of stability in baseline which
limits interpretation of changes between phases. Another limitation of this study was the class
activity during the time of observation. Observations took place during the same time of day
during most, but not all, sessions. Class activities varied during all the sessions including
combinations of small and large group teacher-led instruction, as well as individual and partner
work. The teacher identified that student participants engaged in more off-task behaviors during
large group teacher-led instruction, so the first author tried to match observations during this
time. However, the different activities observed may have contributed to the variability in data.
Future studies may try to isolate observations during the times when students are engaging in
more off-task behaviors to further identify the effectiveness of FT attention delivery.
An additional limitation of the study was the lack of a functional analysis to accurately
identify a true function of the problematic behavior. While this lack of a functional analysis may
be considered a limitation, the procedures used replicate common practice in schools (i.e.,
interview and observation techniques to hypothesize function; Crone & Horner, 2003). In
addition, there is empirical evidence that descriptive functional assessment procedures may
produce accurate hypotheses of function (e.g., Alter, Conroy, Mancil, & Haydon, 2008; Lerman,
Hovanetz, Strobel, & Tetreault, 2009; Taylor & Romanczyk, 1994).
Researchers need to continue to work to determine a model for identifying schedules of
reinforcement delivery most effective for promoting positive student behavior and learning in
classrooms. This would help teachers and other education professionals determine the most
appropriate schedule of reinforcement delivery when using FT interventions in schools. FT
interventions with lean schedules of reinforcement may enhance teacher acceptability, thus
promoting the use of this promising intervention.
References
Alberto, P. A., & Troutman, A. C. (2009). Applied behavior analysis for teachers. Upper Saddle
River, NJ: Pearson.
Algozzine, B., Ysseldyke, J., & Elliott, J. (1997). Strategies and tactics for effective instruction.
Longmont, CO: Sopris West.
Alter, P. J., Conroy, M. A., Mancil, G., & Haydon, T. (2008). A comparison of functional
behavior assessment methodologies with young children: Descriptive methods and
functional analysis. Journal of Behavioral Education, 17, 200-219.
Austin, J. L., & Soeda, J. M. (2008). Fixed-time teacher attention to decrease off-task behaviors
of typically developing third graders. Journal of Applied Behavior Analysis, 41, 279-283.
Burnett, P. C. (2002). Teacher praise and feedback and students’ perceptions of the classroom
environment. Educational Psychology, 22, 5-16.
Crone, D. A., & Horner, R. H. (2003). Building positive behavior support systems in schools.
New York: Guilford.
Elliott, S. N., Witt, J. C., & Kratochwill, T. R. (1991). Selecting, implementing, and evaluating
classroom interventions. In G. Stoner, M. Shinn, & H. Walker (Eds.). Interventions for
achievement and behavior problems (pp. 99-135). Bethesda, MD: National Association
of School Psychologists.
Gable, R. A., Quinn, M. M., Rutherford, R. B., & Howell, K. W. (1998). Functional behavioral
assessments and positive behavioral interventions. Preventing School Failure, 42, 106 –
119.
Jones, K. M., Drew, H. A., & Weber, N. L. (2000). Noncontingent peer attention as treatment for
disruptive classroom behavior. Journal of Applied Behavior Analysis, 33, 343-346.
Kazdin, A. E. (2001). Behavior modification in applied settings. Belmont, CA:
Wadsworth/Thomson Learning.
Lalli, J. S., Casey, S. D., & Kates, K. (1997). Noncontingent reinforcement as treatment for
severe problem behavior: Some procedural variations. Journal of Applied Behavior
Analysis, 30, 127-137.
Lerman, D. C., Hovanetz, A., Strobel, M., & Tetreault, A. (2009). Accuracy of teacher-collected
descriptive analysis data: A comparison of narrative and structured recording formats.
Journal of Behavioral Education, 18(2), 157-172.
Mautone, J. A., DuPaul, G. J., Jitendra, A. K., Tresco, K. E., Junod, R.V., & Volpe, R. J. (2009).
The relationship between treatment integrity and acceptability of reading interventions
for children with attention-deficit/hyperactivity disorder. Psychology in Schools, 46, 919-
931.
Shapiro, E.S. (2004). Academic skills problems: Direct assessment and intervention. New York:
Guilford.
Tankersley, M., Harjusola-Webb, S., & Landrum, T.J. (2008). Using single-subject research to
establish the evidence base of special education. Intervention in School and Clinic, 44,
83-90.
Taylor, J. C., & Romanczyk, R. G. (1994). Generating hypotheses about the function of student
problem behavior by observing teacher behavior. Journal of Applied Behavior Analysis,
27, 251-265.
Waller, R. D., & Higbee, T. S. (2010). The effects of fixed-time escape on inappropriate and
appropriate classroom behavior. Journal of Applied Behavior Analysis, 43, 149-153.
Figure 1. Percentage of intervals in which on-task behaviors were observed for Sally (top) and
Joey (bottom) during baseline and intervention conditions.
Figure 2. Rate of teacher attention per minute during baseline and intervention conditions.
Teacher attention is depicted as either praise statements or redirection of student behavior.
Figure 3. Rates of teacher attention during baseline and intervention conditions for each student.
The teacher was prompted to provide attention to the students at a rate of .20 (as depicted by the
horizontal line). During the FT conditions, the teacher also provided unprompted attention to the
students between cued intervals.
J Appl Behav Anal. 2010 Fall; 43(3): 547–551.
doi: 10.1901/jaba.2010.43-547
PMCID: PMC2938950
PMID: 21358918
USE OF PEER-MEDIATED INTERVENTION IN CHILDREN WITH
ATTENTION DEFICIT HYPERACTIVITY DISORDER
Alicia N Grauvogel-MacAleese
UNIVERSITY OF NEVADA, RENO
Michele D Wallace
Chris Ninness, Action Editor
Address correspondence to Michele D Wallace, Charter College of Education, Division of Special Education and
Counseling, California State University, Los Angeles KH C1064, Los Angeles, California 90032, e-mail:
mwallac@calstatela.edu
Received 2007 Aug 25; Accepted 2009 Jan 1.
Copyright Society for the Experimental Analysis of Behavior, Inc.
Abstract
The present experiment extended and replicated the use of functional analysis and a peer-mediated
intervention to decrease disruptive behavior displayed by children diagnosed with attention deficit
hyperactivity disorder in an afterschool program. After determining that the participants displayed off-
task behavior maintained by peer attention via a functional analysis, peer-implemented differential
reinforcement of other behavior with extinction was effective in reducing participants’ off-task
behaviors. The use of peers as behavior-change agents is discussed, as are avenues for future research.
Keywords: attention deficit hyperactivity disorder, differential reinforcement, functional analysis, peer
mediation
Approximately two million children in the United States are estimated to have attention deficit
hyperactivity disorder (ADHD; National Institute of Mental Health, 2006). Furthermore, it is estimated
that 80% of children diagnosed with ADHD exhibit a variety of behavior problems (Cantwell & Baker,
1991). Research has demonstrated that the most efficacious strategy to decrease or eliminate behavior
problems is to develop an intervention based on the identified function of the behavior (Carr & Durand,
1985). In addition to using functional analyses to guide intervention, research has demonstrated that
peer attention can be a functional reinforcer for some children with ADHD, and the use of peer-
mediated interventions can decrease behavior problems for these children (e.g., Flood, Wilder, Flood,
& Masuda, 2002). However, most applications of behavioral assessments and peer-mediated
interventions of behavior problems exhibited by children with ADHD have been conducted exclusively
in analogue educational settings using single interventions (e.g., extinction alone).
The purposes of the current study were (a) to replicate and extend functional analysis procedures using
peers in an afterschool program and (b) to replicate and extend peer-mediated interventions for
problem behavior maintained by peer attention using multiple-component contingencies (e.g., both
differential reinforcement and extinction).
METHOD
Participants and Setting
Three participants with ADHD, Scott (8-year-old boy), Zane (6-year-old boy), and Drew (10-year-old
boy), and their respective peers, Howey (9-year-old boy), Brian (7-year-old boy), and Jeffery (10-year-
old boy), participated in the study. Zane was the only participant who was taking medication for his
ADHD at the time of the study. Participants chose peers as children with whom they would like to
work during homework time, and we ensured that the staff deemed the selected peers as good role
models. All sessions were 5 min in duration and were conducted in the homework setting of the
afterschool program.
Data Collection, Interobserver Agreement, and Procedural Integrity
Off-task behavior was defined as talking about subjects unrelated to homework (all participants),
leaving or falling out of his seat (all participants), wandering around the room (Scott and Drew),
leaving the homework area (Zane), hiding behind objects (Zane), and crawling under the tables (Drew).
Trained observers scored a response on a data sheet broken into 10-s intervals if the participant
engaged in off-task behavior during any portion of the 10-s interval. The observers used a stopwatch to
identify the 10-s intervals. Data are presented as percentage of intervals, which was calculated by
dividing the intervals in which off-task behavior was scored by the total number of intervals (30) and
converting the ratio to a percentage.
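Written out, the calculation is (the example count is illustrative, not a value reported for a specific session; the 30 intervals follow from scoring 5-min sessions in 10-s intervals):

$$\text{Off-task behavior (\%)} = \frac{\text{10-s intervals containing off-task behavior}}{30 \text{ intervals per session}} \times 100$$

For example, off-task behavior scored in 20 of the 30 intervals of a session would be recorded as (20/30) × 100 ≈ 67%.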
Interobserver agreement was evaluated by having a second observer independently record data during
29%, 57%, and 33% of sessions for Scott, Zane, and Drew, respectively. Agreement was calculated by
dividing the number of agreements (intervals in which both observers recorded the target behavior) by the sum of agreements and disagreements and converting the ratio to a percentage. Mean
agreement was 95% for both Scott and Zane (range, 80% to 100%) and 89% for Drew (range, 73% to
100%) across all conditions.
Procedural integrity data were collected for the peers’ responses during the functional analysis and
treatment sessions. The observer scored whether the peer responded correctly or incorrectly during the
interval as specified by the condition. The observer scored a correct response when the peer provided
attention contingent on off-task behavior during the peer-attention condition, provided attention
noncontingently during the control condition, and ignored off-task behavior and provided attention for
on-task behavior during treatment sessions, whether it occurred independently or was prompted via the
vibrating pager (Anglesea, Hoch, & Taylor, 2008). The observer scored an incorrect response if the
peer delivered attention when he should not have or if he ignored the prompt to deliver attention. The
procedural integrity measures were converted to a percentage correct after dividing the number of
intervals in which the peer responded correctly by the total number of intervals. Procedural integrity
was recorded for 100% of all sessions and follow-up, resulting in a mean of 99%, 97%, and 94% for
Scott, Zane, and Drew, respectively.
Peer and Staff Training
Before functional analysis and treatment sessions, the investigator used role play and modeling to teach
the peers how to respond during the various conditions (as described above for correct responses). The
training phase continued until peers reached an 80% accuracy criterion. In addition, the peers wore a
concealed vibrating pager during all sessions, which was used to prompt the peer if he did not respond
correctly.
Staff were trained prior to the start of the functional analysis by using role playing and modeling.
During the functional analysis, if the staff did not implement the correct contingency, the investigator
verbally prompted the correct response (e.g., to provide attention or to remove homework).
Functional Analysis
Antecedents and consequences correlated with the attention, play, and demand conditions (Iwata,
Dorsey, Slifer, Bauman, & Richman, 1982/1994) were presented in a multielement design with the
addition of a peer-attention condition. Worksheets included the participant’s homework and were
assigned to a condition based on the results of a preference assessment (e.g., the worksheet the
participant identified as the least preferred was assigned to the demand condition, and the others were
assigned to the staff-attention and peer-attention conditions). During the play condition, word finds and
crossword puzzle worksheets were used.
The peer was present during all functional analysis conditions but interacted only during the peer-
attention and control conditions. During the peer-attention condition, if the target behavior occurred,
the peer provided brief (approximately 10 s) attention (e.g., laughing at the joke, going under the table
with the target participant, walking over to the bleachers). During the staff-attention condition, staff
provided attention contingent on the target behavior. In the demand condition, the participant was
allowed to escape work for 30 s contingent on the target behavior (neither the peer nor the staff
member interacted with him during this break). After 30 s, the staff member instructed the participant
to get back to work. In the control condition, the peer delivered noncontingent attention (approximately
once every 30 s and included the peer talking about the worksheet activity) and ignored any off-task
behaviors.
Treatment Evaluation
A multiple baseline design across participants, with a reversal for one participant (Scott), was used to
evaluate the treatment intervention. Baseline sessions were identical to the peer-attention conditions of
the functional analysis for all three participants and included the three peer-attention sessions.
During the treatment phase, the peer provided statements of praise and help if the participant was on
task. If the participant engaged in off-task behavior, the peer discontinued praise and help until the
participant was on task again (i.e., extinction). During baseline and treatment, the worksheet consisted
of the homework assigned by the teacher.
RESULTS AND DISCUSSION
Results of the functional analyses are depicted in Figure 1. For all participants, the functional analyses
indicated that off-task behavior was sensitive to attention from peers (Ms = 67%, 70%, and 61% for
Scott, Zane, and Drew, respectively).
Figure 1. Percentage of 10-s intervals of off-task behavior for Scott, Zane, and Drew during the functional analysis.
Results of the treatment analysis are depicted in Figure 2. During baseline, all participants engaged in
high levels of off-task behavior (Ms = 67% and 89% for Scott, 76% for Zane, and 63% for Drew).
When peers implemented differential reinforcement, off-task behavior immediately decreased for all
three participants and remained low (Ms = 16% and 12% for Scott, 13% for Zane, and 9% for Drew).
Follow-up sessions were conducted for Drew a month after the last treatment session, and off-task
behavior remained low (M = 1%). It should be noted that the pager was not used to prompt the peer
during these follow-up sessions. After the intervention, the participant, peer, and a staff member
completed a social validity questionnaire regarding the intervention. Results across individuals and
questions were positive (see Table 1).
Figure 2. Percentage of 10-s intervals of off-task behavior for Scott, Zane, and Drew during baseline and differential reinforcement.
Table 1. Social Validity Questionnaire Results
Results from the current study demonstrated that it is feasible to conduct both functional analyses and a
peer-mediated intervention in an afterschool program. In addition, it was demonstrated that the peers
were capable of accurately implementing a differential reinforcement procedure, including reinforcing
the absence of problem behavior as well as ignoring problem behavior. However, difficulties were
encountered that are worth mentioning. First, there were several uncontrolled situations that occurred
during both the functional analysis conditions and the treatment phase due to the natural setting. Twice
during the peer-attention condition of Scott’s functional analysis, another peer interacted briefly with
him when he engaged in off-task behavior. The investigators determined that this was not a serious
confounding effect because it was in line with the contingencies for that condition (i.e., peer attention
delivered contingent on off-task behavior). A similar situation occurred once with Drew during
treatment, and the staff moved the children down two seats. In addition, during the 26th session of
treatment, Zane’s sister corrected one of his math problems, and he ran away. Subsequently, she was
trained in how to implement the intervention. Future research may investigate the feasibility of training
all peers in the extinction component of treatment.
Besides the difficulties associated with the application of procedures in natural settings, using a peer to
mediate the behavior of another child can be difficult. For example, it requires the peer to identify
specific behaviors and respond appropriately. In the current study, we created scenarios in which the
peer practiced with the investigator on how to respond. In addition, the use of the vibrating pager was
useful for the purpose of training the peer as well as an effective prompt during intervention. Although
this proved to be an effective way to overcome the difficulties of having peers implement the
intervention, a systematic fading procedure was not used to eliminate the pager and warrants future
research.
One could argue that the length of sessions used in this study was too short (only 5 min); however, it
should be noted that several sessions occurred during each homework period, and participants were
often on task for the duration of the homework period (up to 30 min total) during treatment.
Nevertheless, future research should run extended sessions to evaluate if this type of treatment can be
in place for longer periods of time.
Although the purpose of the current study was to decrease off-task behavior in children with ADHD
using peers, the effects of the procedure on peers were not evaluated. Although there are several
potential benefits for the peers, one could argue that the time the peer spent with the other child during
treatment may result in the peer completing less work (anecdotally, this was not observed in the current
study). Thus, future research should evaluate the effects of this type of procedure on the peer.
Acknowledgments
The research presented in this article was completed in partial fulfillment of thesis requirements for the
MA degree by the first author. We thank Erin Pitts, Mandy McClanhan, and Daniel Sutich for their
assistance in conducting this project, as well as Reno Parks and Recreation for their support.
Eur Child Adolesc Psychiatry (2017) 26:1471–1481
https://doi.org/10.1007/s00787-017-1006-y
ORIGINAL CONTRIBUTION
Time‑on‑task effects in children with and without ADHD:
depletion of executive resources or depletion of motivation?
Tycho J. Dekkers1,2 · Joost A. Agelink van Rentergem1 · Alette Koole1 ·
Wery P. M. van den Wildenberg1,3 · Arne Popma2,4,5 · Anika Bexkens6,7 ·
Reino Stoffelsen2,4 · Anouk Diekmann2,8 · Hilde M. Huizenga1,3,9
Received: 13 June 2016 / Accepted: 17 May 2017 / Published online: 23 May 2017
© The Author(s) 2017. This article is an open access publication
Abstract Children with attention-deficit/hyperactivity disorder (ADHD) are characterized by deficits in their
executive functioning and motivation. In addition, these children are characterized by a decline in performance
as time-on-task increases (i.e., time-on-task effects). However, it is unknown whether these time-on-task effects
should be attributed to deficits in executive functioning or to deficits in motivation. Some studies in typically
developing (TD) adults indicated that time-on-task effects should be interpreted as depletion of executive
resources, but other studies suggested that they represent depletion of motivation. We, therefore, investigated,
in children with and without ADHD, whether there were time-on-task effects on executive functions, such as
inhibition and (in)attention, and whether these were best explained by depletion of executive resources or
depletion of motivation. The stop-signal task (SST), which generates both indices of inhibition (stop-signal
reaction time) and attention (reaction time variability and errors), was administered in 96 children (42 ADHD,
54 TD controls; aged 9–13). To differentiate between depletion of resources and depletion of motivation, the SST
was administered twice. Half of the participants were reinforced during second task performance, potentially
counteracting depletion of motivation. Multilevel analyses indicated that children with ADHD were more affected
by time-on-task than controls on two measures of inattention, but not on inhibition. In the ADHD group,
reinforcement only improved performance on one index of attention (i.e., reaction time variability). The current
findings suggest that time-on-task effects in children with ADHD occur specifically in the attentional domain,
and seem to originate in both depletion of executive resources and depletion of motivation. Clinical implications
for diagnostics, psycho-education, and intervention are discussed.
Keywords ADHD · Executive functioning · Depletion · Time-on-task · Reinforcement · Inhibition
Electronic supplementary material The online version of this
article (doi:10.1007/s00787-017-1006-y) contains supplementary
material, which is available to authorized users.
* Tycho J. Dekkers
t.j.dekkers@uva.nl
1 Department of Psychology, University of Amsterdam,
Nieuwe Achtergracht 129B, 1018 WS Amsterdam, The
Netherlands
2 Department of Forensic Psychiatry and Complex Behavioral
Disorders, Academic Center for Child and Adolescent
Psychiatry, De Bascule, Rijksstraatweg 145, 1115
AP Duivendrecht, The Netherlands
3 Amsterdam Brain and Cognition Center, University
of Amsterdam, Amsterdam, The Netherlands
4 Department of Child and Adolescent Psychiatry, VU
University Medical Center Amsterdam, Amsterdam, The
Netherlands
5 Faculty of Law, Institute of Criminal Law and Criminology,
Leiden University, Leiden, The Netherlands
6 Department of Developmental and Educational Psychology,
Leiden University, Wassenaarseweg 52, 2333 AK Leiden,
The Netherlands
7 Department of Child and Adolescent Psychiatry, GGZ
Delfland, Center for Psychiatry, Amsterdam, The Netherlands
8 Practice for Individual, Couple, and Family Therapy
and Center for Training, De Kontekst, Van Breestraat 147HS,
1071 ZL Amsterdam, The Netherlands
9 Research priority Area Yield, University of Amsterdam,
Amsterdam, The Netherlands
Introduction
Children with attention-deficit/hyperactivity disorder
(ADHD) are characterized by inattention, hyperactivity,
and/or impulsivity, which lead to problems in multiple
domains. For example, children with ADHD have more
academic problems [1] and adverse health outcomes [2],
report lower quality of life [3], and usually have one or
more comorbid psychiatric diagnoses [4]. Several models
explaining ADHD have been proposed (see [5, 6]). One
influential model is the dual pathway model, in which
ADHD is characterized by deficits in both executive and
motivational systems [7].
With regard to the executive pathway, several meta-
analyses indicate that children with ADHD are impaired
on multiple executive functions (EF) [8–11]. For example,
response inhibition, which is regarded as one of the core,
higher order executive functions [12, 13], has repeat-
edly shown to be implicated with ADHD [11, 14]. On a
more basic pre-executive level, attention is a crucial pre-
requisite of executive functioning [13], and associations
between ADHD and attentional problems are consistently
reported (ranging from problems in sustaining attention
on lab tasks to real life attention problems [9, 14]).
With respect to the motivational pathway, many empir-
ical studies as well as theoretical models suggest aberrant
motivation in children with ADHD (see [15] for an over-
view). Some models propose that children with ADHD
have a higher reward sensitivity than controls (i.e., larger
improvement in performance related to reward; [16, 17]),
but experimental findings for this account are mixed [18].
However, a recent meta-analysis on reinforcement effects
on inhibition in ADHD indicated that (1) a large major-
ity of children, both with and without ADHD, benefited
from reinforcement and (2) this reinforcement effect was
stronger for ADHD (large effect size) than for controls
(medium effect size), suggesting differential reward sen-
sitivity between groups [16]. The authors note that only
24% of the studies found significant group × reinforce-
ment interactions in this direction, which is in line with
the mixed findings that were mentioned previously.
EF performance in children with ADHD is often characterized by a stronger decrease in performance over
time (time-on-task) as compared to TD controls [19]. It
has been argued that these time-on-task effects originate
in difficulties sustaining attention, which is a typical,
although not specific [20], feature of ADHD [14, 21]. In
accordance with the dual pathway model, this time-on-
task effect can be caused by degraded (EF) resources, but
it may also be possible that decreased levels of motiva-
tion explain this decrease in performance. It was shown
that time-on-task effects on working memory in ADHD
could be partly counteracted with reinforcement [22],
suggesting that they should at least partly be attributed to
decreased motivation. However, to our knowledge, it has
never been tested before whether this is also the case for
response inhibition and attention. Therefore, the current
study investigates whether time-on-task effects on inhi-
bition and attention in children with ADHD can be rem-
edied by increasing motivation.
Dual pathway models of ADHD do not directly speak
to the role of motivation on time-on-task effects. However,
the effect of motivation on time-on-task effects is central in
the literature on resource depletion in healthy adults. Some
resource depletion theorists argue that self-control capacities,
a concept highly related to EF [23], are limited, and conse-
quently, self-control performance degrades after successive
attempts (for reviews, see [24, 25]). However, others have
argued that a decline in motivational resources (i.e., “reduced
motivation to attain task goals” [26]) can also explain time-
on-task effects [27, 28], as these effects appear to be weaker
if participants are motivated [26, 29].
To sum up, the current study combines dual pathway mod-
els of ADHD and resource depletion models of time-on-task
effects in healthy adults, to assess the origin of time-on-task
effects in children with ADHD. That is, we test whether chil-
dren with ADHD are more affected by time-on-task effects
than TD children. To investigate the nature of these time-on-
task effects, depletion of resources and depletion of motiva-
tion were disentangled. Children with and without ADHD
performed twice on the stop-signal task (SST; [30, 31]),
which yields a measure of response inhibition and more
indirect measures of (in)attention. In the second task, participants were assigned to either a reinforced or a non-reinforced
condition.
First, we hypothesize degraded performance of children
with ADHD as compared to TD children in the first task and
in the second task without reinforcement (effects of group)
[8–11]. Second, we hypothesize degraded performance on
the second task without reinforcement as compared to the
first task (effect of time-on-task), and we expect this effect to
be larger in children with ADHD than in TD controls (time ×
group interaction; [19]). Third, we hypothesize a better per-
formance on the second task with reinforcement as compared
to the second task without reinforcement (effect of reinforce-
ment), and we hypothesize children with ADHD profit more
from reinforcement than TD controls (reinforcement × group
interaction; [16, 22]).
Method
Participants
ADHD participants were recruited from an academic
outpatient mental healthcare center and TD control
participants were recruited from elementary schools.
In the ADHD group, children were included when they
were diagnosed with ADHD (all subtypes), according
to the assessment by expert psychologists or psychia-
trists from the academic outpatient mental healthcare
center, following DSM-IV-TR criteria [32]. There was no
exclusion based on other disorders. Children in the con-
trol group were included only when their primary care-
takers confirmed that there was no ADHD diagnosis. In
total, our sample consisted of 111 children aged between
9 and 13 years. 54 children with ADHD (45 boys, mean
age 11.2 years, SD = 1.04) and 57 children without
ADHD (27 boys, mean age 11.8 years, SD = 0.68) were
included.
Participants using stimulant medication were instructed not to take their medication on the day of testing,
to reach total washout [33]. Informed consent was
obtained from primary caretakers of all children. All pro-
cedures were in accordance with the ethical standards of
the institutional research committee and with the 1964
Helsinki declaration and its later amendments.
Materials
Response inhibition and, indirectly, attention were meas-
ured with the standard stop-signal task (SST; [30, 31]),
which is a reliable indicator of inhibition in children
with ADHD [34]. Several studies showed associations
between the SST and a wide range of real life behaviors,
e.g., associations with classroom observations of chil-
dren with ADHD [35], with teacher ratings of inattention
[36], with observations as well as classroom measures of
hyperactivity and inattention [37], and with inattention
measured by both parents and teachers [38].
The SST was administered twice (i.e., T1 and T2), both
administrations consisted of one practice block and four
experimental blocks of 56 trials each. In this task, chil-
dren have to press one out of two marked buttons on the
keyboard, corresponding to green go signals appearing in
the center of the screen, as fast and accurately as possible
(i.e., the go task). In T1 [X] required a left response and
[O] required a right response; these were replaced by [H]
and [S], respectively, at T2 to prevent learning effects. A
choice between two response keys is necessary to create
time to process a potential stop signal. This presentation
order was counterbalanced across participants (the first,
third, fifth, etc. participant was assigned to [XO] at T1
and [HS] at T2, and for the second, fourth, etc. participant,
this was reversed). In 25% of the cases, the green go sig-
nal turned red, indicating that the response tendency had
to be inhibited. To ensure that participants succeeded in inhibiting their response in 50% of the cases, the time
between the go signal and the stop signal was adaptive.
That is, if stopping was successful, the interval between
the go signal and the stop signal (i.e., the stop-signal
delay) of the following stop trial was increased by 50 ms,
making it harder to inhibit. On the other hand, if the par-
ticipant failed to stop, the stop-signal delay of the follow-
ing stop trial was shortened by 50 ms, making inhibition
easier. Accordingly, the stop-signal reaction time (SSRT)
can be estimated. This reflects the estimated mean time
required to inhibit responses to stop signals, imply-
ing that a short SSRT indicates good inhibitory capaci-
ties. SSRT was calculated according to the integration
method and the race model [41]. Assuming independence
between go and stop processes, the finishing time of the
stop process bisects the go RT distribution. Given that the
response could not be stopped successfully on nth per-
cent of all stop trials, SSRT is calculated by subtracting
the mean stop-signal delay from the go RT that represents
the nth percentile of go RTs (i.e., the finishing time of the
stop process).
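To make the adaptive tracking rule and the integration-method SSRT computation described above concrete, the following sketch shows one possible implementation in Python. It is an illustrative reconstruction of the description in this section, not the authors' task code; the function and variable names are invented for the example.

```python
import numpy as np

def update_ssd(ssd, stopped, step=50, minimum=0):
    # Adaptive tracking: +50 ms after a successful stop (harder),
    # -50 ms after a failed stop (easier), aiming at ~50% inhibition.
    return max(ssd + step, minimum) if stopped else max(ssd - step, minimum)

def estimate_ssrt(go_rts, ssds, responded_on_stop):
    """Integration-method SSRT under the race model.

    go_rts: reaction times (ms) on go trials
    ssds: stop-signal delays (ms) used on the stop trials
    responded_on_stop: True where the response was NOT inhibited
    """
    go_rts = np.sort(np.asarray(go_rts, dtype=float))
    p_respond = float(np.mean(responded_on_stop))  # proportion of failed stops
    # The go RT at this quantile approximates the finishing time of the
    # stop process; subtracting the mean SSD gives the SSRT.
    idx = max(int(np.ceil(p_respond * len(go_rts))) - 1, 0)
    return go_rts[idx] - float(np.mean(ssds))
```

As in the text, a shorter SSRT from such an estimate would be read as better inhibitory capacity.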
Furthermore, the SST provides an index of choice errors
(i.e., pressing the wrong button), omission errors on go tri-
als (i.e., no response within the response frame), and reac-
tion time (including its variance). More omission errors,
slower mean RT, and higher RT variability (defined as the
standard deviation of all RTs of a participant within the
SST) are associated with problems in the domain of atten-
tion [39]. More specifically, omission errors on a Go/NoGo
task are related to symptoms of inattention in ADHD as
reported by both caregivers and teachers [40]. Moreover,
ADHD is characterized by attentional lapses, which gener-
ate reaction time distributions with a positive skew, leading
to increased mean reaction times and higher RT variability
[41, 42]. Relatedly, attention lapses appear to be related to
errors and variability of reaction times, which is referred to
as state instability [43]. Therefore, omission errors, RT and
RT variability were taken as indirect measures of the basic
attentional processes.
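As a companion to the definitions above, the sketch below shows how the five dependent measures could be computed from one participant's trial-level go data. The input names are hypothetical, and the exact scoring rules of the original task software are not reported here.

```python
import numpy as np

def sst_outcomes(go_rts, go_responded, go_correct, ssrt):
    """go_rts: RTs (ms) of go trials that received a response;
    go_responded: bool per go trial (False = omission error);
    go_correct: bool per responded go trial (False = choice error);
    ssrt: stop-signal reaction time estimated from the stop trials."""
    go_rts = np.asarray(go_rts, dtype=float)
    return {
        "mean_rt": go_rts.mean(),
        "rt_variability": go_rts.std(ddof=1),  # within-subject SD of RTs
        "omission_pct": 100.0 * (1.0 - float(np.mean(go_responded))),
        "choice_error_pct": 100.0 * (1.0 - float(np.mean(go_correct))),
        "ssrt": ssrt,
    }
```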
Procedure
To induce time-on-task effects, the SST was administered
twice in succession. Before every block, children were
instructed to respond as fast and accurately as possible to go
trials and to withhold their response if the signal turned red
(i.e., not to press a button). To avoid adoption of a waiting
strategy, participants were also told that it was not allowed
to wait for the signal to become red. In the second task,
participants were randomly assigned to either a reinforcing
or a non-reinforcing condition. In the reinforcing condition,
children were told that they could earn coins which could
be used to “buy” a present at the end of the task if they had
earned at least ten coins. Reinforcement was not aimed at
any specific aspect of the SST, and children were instructed
to respond as fast and accurately as possible. Although the
present was emotionally appealing for the participants, the
monetary value was about 0.50 euro. To motivate partici-
pants, the box with presents was already shown before the
beginning of the second task. After each block, children
were informed on the screen about the amount of earned
coins. Feedback on the amount of earned coins was manip-
ulated; the cumulative amount of coins shown after each
block was, respectively, 2, 5, 7, and 9 or 10.1 There was no possibility of losing in either condition. The duration of one
administration of the SST was approximately 16 min; the
duration of the entire session ranged from 45 to 60 min.
The time between the end of the first and the beginning of
the second administration of the SST was approximately
2 min.
Data analysis
A repeated measures design with one within-subjects fac-
tor (time, T1 and T2) and two between-subjects factors
(ADHD vs. TD controls; reinforcement vs. no reinforce-
ment at T2) was used. Note, however, that the design is
not fully crossed, as none of the participants was rein-
forced during the first task. A fully crossed design would
have led to power difficulties, given the limited availabil-
ity of participants. Therefore, the current data could not
be analyzed with a regular repeated measures analysis,
but were analyzed with a multilevel analysis with time as
a first level variable and group and reinforcement as sec-
ond level variables [44]. Age was added as covariate, as
it might be related to executive functioning [45]. Gender
and intelligence were not added as covariates, because
a higher proportion of boys and a lower average intelli-
gence level are typical features of ADHD as compared to
controls [46], and are, therefore, inherently not suitable as
covariates [47] (see Appendix 1 for the complete multi-
level model).
Five dependent variables were derived from the SST
(see “Materials” section). Note that for the omission and
choice errors, the square root of the raw scores was ana-
lyzed, because percentages generally are not normally
distributed.
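The article does not report the software used for this step, but the multilevel setup described above (time as the level-1 variable, group and reinforcement as level-2 variables, age as covariate, and square-root-transformed error percentages) could be sketched roughly as follows in Python with statsmodels. The data-frame column names are assumptions made for the example, not taken from the article.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def fit_time_on_task_model(df: pd.DataFrame, outcome: str = "rt_variability"):
    # Two rows per child (T1 and T2); 'time', 'adhd', 'reinforced' are 0/1
    # indicators and 'age' is the covariate. A random intercept per child
    # captures the nesting of repeated administrations within participants.
    formula = f"{outcome} ~ time * adhd + reinforced * adhd + age"
    model = smf.mixedlm(formula, data=df, groups=df["subject"])
    return model.fit()

# Error percentages would be square-root transformed before fitting, e.g.:
# df["omission_sqrt"] = np.sqrt(df["omission_pct"])
```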
1 The end score was manipulated to either 9 or 10. This in order to
prevent the spread of the rumor that all children had won a present
regardless of their performance, as TD controls usually had class
together. After all data were gathered, children that ended up with 9
points received a present as well.
Results
Exclusion of participants
15 participants were excluded from the analyses (12 boys, 3
girls; mean age 11.2 years, SD = 1.18). Four of them were
excluded due to procedural errors. In addition, 11 partici-
pants were excluded because of aberrant performance: one
ADHD participant refused to continue with the second task;
one ADHD participant made choice errors on more than
50% of the go trials (at T1), indicating difficulty in under-
standing the task; one ADHD participant had a very abnor-
mal response pattern in which RT variability was larger than
mean RT (at T2 without reinforcement); four participants
were excluded, because SSRT was below 100 ms, indicating
that SSRT estimation was not reliable for these participants
(one ADHD participant at T1; two ADHD participants at
T2, without reinforcement; and one TD participant at T2
with reinforcement); four participants were excluded,
because the standardized value of at least one of the out-
come measures was more than three standard deviations
from the average (one ADHD participant had abnormally slow reaction times at T1 as well as T2 without
reinforcement;2 one ADHD participant had an abnormally high SSRT at T1;3 one ADHD participant had an
abnormally high SSRT at T2,4 without reinforcement; and one TD participant had an abnormally high SSRT at
T2,5 without reinforcement).
Interestingly, 9 of the 11 participants excluded because
of aberrant performance were diagnosed with ADHD. Most
of these participants were excluded because of aberrant
performance in the second task without reinforcement: of
the seven exclusions based on performance at T2, two were
reinforced, and five were not.
Demographics
After exclusion, the sample consisted of 96 participants
(42 ADHD, 54 TD children). Differences between the
ADHD group and the TD control group were significant
with regard to age. The mean age was 11.2 (SD = 1.0)
and 11.8 (SD = 0.68) years for ADHD and TD controls,
respectively, t(94) = 3.21, p = 0.002. Therefore, age
was added as covariate in all further analyses. Groups
also differed in gender (83% vs. 46% boys in ADHD and
TD, respectively: χ2 (1) = 13.8, p < 0.001). For extra analyses with gender added as covariate, see Appendix 2.6
2 Mean reaction times of this participant were 1288 and 1219 ms at T1 and T2, respectively.
3 SSRT at T1 of this participant was 678 ms.
4 SSRT at T2 of this participant was 677 ms.
5 SSRT at T2 of this participant was 538 ms.
In the ADHD group, 31 children were diagnosed with
the combined type, 5 with the inattentive type, 2 with the
hyperactive-impulsive type, and 4 children were diag-
nosed with ADHD not otherwise specified. With regard
to comorbidity, 24 children had no comorbid diagnosis,
7 children had a comorbid learning disorder, 5 had a
comorbid disruptive behavior disorder (i.e., conduct dis-
order, oppositional defiant disorder, or disruptive behav-
ior disorder not otherwise specified), 3 had a comorbid
parent–child relational problem, 2 had a comorbid mood
disorder, and 1 had a comorbid communication disorder.
Effects of group: ADHD vs. TD controls (T1)
To test our first hypothesis that EF on T1 is impaired in
children with ADHD as compared to TD controls, we per-
formed ANCOVAs (and thus not a multilevel analysis)
for all outcome measures, controlling for age differences between groups. The assumption of parallel
regression lines was not violated for any of the five outcome measures (p > 0.05). ANCOVAs revealed group
differences at baseline only for reaction time variability [F (1, 93) = 10.8,
p = 0.001] and omission errors [F (1, 93) = 4.5, p = 0.04].
No differences were found for SSRT [F (1, 93) = 2.7,
p = 0.11], mean reaction time [F (1, 93) = 0.001,
p = 0.98], and choice errors [F (1, 93) = 2.9, p = 0.10]
(see Table 1 for means and standard deviations at T1).
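For illustration only, such an age-adjusted baseline comparison can be expressed as an ordinary linear model (ANCOVA). The column names below are hypothetical and the snippet is not the authors' analysis script.

```python
import statsmodels.formula.api as smf

def baseline_group_ancova(df, outcome="ssrt"):
    # Group (ADHD vs. TD, coded 0/1) comparison at T1, controlling for age.
    return smf.ols(f"{outcome} ~ adhd + age", data=df).fit()
```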
After the baseline assessment at T1, participants were
randomly assigned to the second task with or without
6 Note however that the addition of gender as covariate yielded no
differences in the results of the multilevel analyses (see extra analyses
in Appendix 2).
reinforcement. At baseline, there were no significant differences on any of the five outcome measures between children
assigned to the reinforced and non-reinforced second task.
This was tested by adding the reinforcement condition as an
additional between factor to the ANCOVAs.
Multilevel analyses (also see Table 2)
Effects of group: ADHD vs. TD controls (T2: not
reinforced)
When comparing ADHD and TD control groups at T2
(without reinforcement), the multilevel analysis indicated
that participants with ADHD had a larger reaction time
variability [t (119.62) = −5.04, p < 0.001] and made more
omission errors [t (124.88) = −4.70, p < 0.001]. No differ-
ences between ADHD and TD controls at T2 were found
for SSRT, mean reaction time, and choice errors.
Time‑on‑task effects in ADHD (T2 vs. T1: not reinforced)
Children with ADHD were characterized by increased
reaction time variability [t (102.07) = −2.25, p = 0.027]
and more omission errors [t (96.86) = −2.64, p = 0.010]
on the second task without reinforcement, as compared to
the first task. No differences between T2 and T1 in ADHD
were found for SSRT, mean reaction time, and choice errors.
Time-on-task effects were thus only found on indices of
attention, but not on inhibition.
Time‑on‑task effects in ADHD vs. TD controls (T2 vs. T1:
not reinforced)
A significant time × group interaction was observed for
reaction time variability [t (102.28) = 2.80, p = 0.006].
Table 1 Group means at T1 and at T2, with and without reinforcement
Measure | TD (T1) | TD (T2) − | TD (T2) + | ADHD (T1) | ADHD (T2) − | ADHD (T2) +
Number of participants | 54 | 28 | 26 | 42 | 16 | 26
Reaction time in ms (SD) | 730 (175) | 670 (130) | 712 (228) | 730 (161) | 741 (98) | 767 (206)
Reaction time variability in ms (SD) | 203 (52) | 192 (44) | 177 (49) | 237 (48) | 254 (45) | 230 (58)
% choice errors (SD) | 3.49 (3.91) | 3.38 (3.46) | 3.27 (3.18) | 5.90 (5.34) | 4.76 (3.76) | 5.24 (4.64)
% omission errors (SD) | 2.38 (3.13) | 0.98 (1.34) | 2.40 (3.74) | 3.83 (3.74) | 5.84 (5.73) | 5.91 (6.17)
SSRT in ms (SD) | 254 (65) | 260 (53) | 246 (62) | 282 (75) | 269 (54) | 243 (54)
Symbols + and – represent conditions with and without reinforcement. Note that this table depicts the % of choice and omission errors, whereas the square root is used in the analyses. Note that the number of ADHD participants in the conditions without and with reinforcement is unequal, as a relatively high number of participants that did not receive feedback were excluded because of aberrant performance. Reaction time variability reflects the mean within-subject variability in RTs, whereas the SD of reaction time reflects the standard deviation of the mean RTs of all subjects.
TD typically developing control group, ADHD attention-deficit/hyperactivity disorder, ms milliseconds, SSRT stop-signal reaction time, SD standard deviation
As shown above, in the ADHD group, reaction time variability increased on the second as compared to the first
task. The positive t value for the interaction effect indi-
cates that this effect was less pronounced in the TD group.
Visual inspection of the effect (Fig. 1) suggests that in
the TD group, reaction time variability even decreased in
the second as compared to the first task. Similarly, a sig-
nificant time × group interaction was observed for omis-
sion errors [t (97.02) = 3.70, p < 0.001]. In children with
ADHD, the amount of omission errors increased on the
second as compared to the first task (see analyses above),
whereas visual inspection suggests that the amount of
omission errors in the TD control group even decreased
(Fig. 1). No time × group interactions were found for
SSRT, mean reaction time, and choice errors. To sum up,
this indicates larger time-on-task effects in the ADHD
group than in the TD control group on indices of atten-
tion, but not inhibition.
Effects of reinforcement in ADHD (T2: reinforced vs. not
reinforced)
Children with ADHD who were vs. were not rein-
forced were characterized by lower RT variability [t
(91.95) = −2.73, p = 0.008]. Reinforcement did not influ-
ence SSRT, mean reaction time, omission errors, and choice
errors. This indicates that children with ADHD performed
better when they were reinforced as compared to when they
were not reinforced, but only with regard to RT variability.
Effects of reinforcement in ADHD vs. TD controls (T2:
reinforced vs. not reinforced)
No significant group × reinforcement effect was found for any of the measures, indicating that reinforcement equally influ-
enced children with and without ADHD.
Table 2 Overview of all effects in the multilevel model
Variable | Group (γ01) | Time (γ10) | Group × time (γ11) | Reinforcement (γ02) | Group × reinforcement (γ03) | Age (γ04)
SSRT | B = 8.2 (16.2), p = 0.61 | B = 23.1 (12.6), p = 0.07 | B = −30.7 (16.1), p = 0.06 | B = −9.8 (14.3), p = 0.50 | B = −7.4 (18.9), p = 0.70 | B = −0.90 (0.55), p = 0.11
Mean RT | B = −58.0 (49.1), p = 0.24 | B = 7.8 (32.5), p = 0.81 | B = 58.6 (40.8), p = 0.15 | B = 56.8 (40.4), p = 0.16 | B = −2.4 (53.2), p = 0.96 | B = 0.008 (1.66), p = 0.996
RT Var. | B = −71.7 (14.2), p < 0.001*** | B = −23.0 (10.2), p = 0.03* | B = 36.1 (12.9), p < 0.01** | B = −33.9 (12.4), p < 0.01** | B = 24.1 (16.4), p = 0.14 | B = 0.13 (0.47), p = 0.79
Omission errors | B = −1.40 (0.30), p < 0.001*** | B = −0.54 (0.20), p = 0.01* | B = 0.95 (0.26), p < 0.001*** | B = −0.18 (0.25), p = 0.48 | B = 0.38 (0.33), p = 0.25 | B = 0.004 (0.01), p = 0.72
Choice errors | B = −0.18 (0.27), p = 0.50 | B = 0.11 (0.20), p = 0.58 | B = −0.29 (0.26), p = 0.27 | B = 0.07 (0.24), p = 0.78 | B = −0.34 (0.31), p = 0.28 | B = −0.001 (0.01), p = 0.23
SSRT stop-signal reaction time, RT reaction time, Var. variability. B (SE) represents the unstandardized estimate with its standard error; γ01 is the group effect at T2 without reinforcement, γ10 the time effect in ADHD without reinforcement, γ11 the interaction effect between group and time without reinforcement, γ02 the reinforcement effect at T2 in ADHD, γ03 the interaction effect of group and reinforcement at T2, and γ04 the effect of age in boys with ADHD at T2 receiving no reinforcement.
* p < 0.05, ** p < 0.01, *** p < 0.001
Fig. 1 Reaction time variability as a function of time, group, and
reinforcement condition. a effect of group at baseline (T1); b effect
of group at T2 (non-reinforced); c effect of time in ADHD (non-rein-
forced); d time × group interaction (non-reinforced); e effect of rein-
forcement in ADHD; f group × reinforcement interaction. *p < 0.05,
**p < 0.01, ***p < 0.001. ADHD attention-deficit/hyperactivity dis-
order, TD typically developing control group, ms milliseconds, n.s.
not significant, RT reaction time
Discussion
Dual pathway models of ADHD and depletion theories were combined to investigate potential depletion of
executive resources and depletion of motivation in children with and without ADHD. The SST was administered
twice in school-aged children with and without ADHD, in
which half of the participants were reinforced during the
second task. We hypothesized (1) degraded performance of
children with ADHD as compared to typically developing
(TD) children on the first task and the second task with-
out reinforcement. Moreover, we expected (2a) degraded
performance on the second task without reinforcement, as
compared to the first task, and (2b) we expected this effect to be larger in children with ADHD than in TD
children. We expected (3a) improved performance on the second task with reinforcement, as compared to the
second task without reinforcement, and (3b) we expected that
children with ADHD profited more from reinforcement
than TD children. Performance was measured at two lev-
els, with response inhibition as core higher order execu-
tive function, and, at a more basic pre-executive level [13], attentional indices such as reaction time (RT), RT variability,
and errors.
With respect to the first hypothesis, at baseline, groups
differed with regard to reaction time (RT) variability and
omission errors: children with ADHD were characterized
by larger RT variability and made more omission errors
than children without ADHD. On the second task without
reinforcement, groups differed on RT variability and omis-
sion errors: children with ADHD had larger RT variabil-
ity and made more omission errors than children without
ADHD.
No difference between groups was found for SSRT.
Although the typical finding of inhibitory differences
between children with and without ADHD is quite robust
[9, 16, 48, 49], there are other studies that did not find this
difference either [50–53].
Potential explanations might relate to participant charac-
teristics. For example, only a subgroup of children with
ADHD is characterized by inhibitory deficits [54–56]. Fur-
thermore, comorbidity profiles seem to play a role. For
instance, no inhibitory differences were found between
control children and children with only ADHD (i.e., with-
out comorbid disorders; [52]). In our sample, comorbidity
occurred less frequently than generally described in
ADHD: 57% of our ADHD group had no comorbid disor-
der, whereas other literature indicates that approximately
two-thirds of the children with ADHD have at least one
other comorbid disorder [4]. In line with this explanation,
the average SSRT at baseline in our sample was 254 ms for
controls, 274 ms for children with ADHD without comor-
bid disorders, and 293 ms for children with ADHD and comorbid disorder(s). An additional analysis showed that children
with ADHD without comorbid disorder(s) did not differ
from controls with regard to SSRT at T1 [F (1,76) = 1.4,
p = 0.24], whereas children with ADHD and comorbid
disorder(s) had higher SSRTs than controls [F (1,70) = 4.7,
p = 0.03].7
Relatedly, Daugherty and colleagues [51] explain differ-
ences between studies in terms of severity of the clinical
group, where more severe ADHD groups are most likely to
show pronounced inhibitory dysfunctioning. In sum, lower
comorbidity rates as well as lower severity of ADHD symp-
toms (e.g., we included four children with ADHD not oth-
erwise specified) might be an explanation for the absence
of inhibition effects between groups.
However, increased RT variability in children with
ADHD as compared to TD control children was apparent
both at baseline and on the second administration without
reinforcement. This concurs with a large body of litera-
ture, arguing that increased RT variability is a typical and
robust finding in ADHD (see [57] for an excellent exten-
sive meta-analytic review). Some theorists even argue that
RT variability is a causal mechanism in the existence of
ADHD (e.g., Default Mode Network Model [21]), while
others regard this variability as the result of other under-
lying mechanisms, such as behavioral inhibition [14, 57].
With respect to the second hypothesis, the expected
time-on-task effect in ADHD was partially confirmed: RT
variability and omission errors increased on the second task
as compared to the first in the ADHD group. In line with
our hypothesis, this time-on-task effect on RT variability
and omission errors was larger in the ADHD than in the TD
control group (in fact, the TD control group did not dete-
riorate at all). RT variability and omission errors have been
linked to problems with sustained attention and attentional
lapses [39, 40, 42, 58]. All together, these results indicate
that time-on-task effects in ADHD mainly seem to occur
within the domain of basic attention, and not on response
inhibition.
With respect to the third hypothesis, the effect of rein-
forcement in ADHD was partly as expected: reinforced
children with ADHD had a smaller RT variability as com-
pared to children with ADHD who were not reinforced in
the second task, which might indicate that reinforcement
prevented attentional lapses. This finding is in line with a
recent meta-analytic review that found small improvements
in RT variability as a result of external reinforcement [57].
However, no effect of reinforcement was found on any of the
other outcome indices. This implies that, among all out-
come variables, RT variability might be particularly sensi-
tive to the effects of reinforcement. Furthermore, discordant
to expectations, reinforcement effects did not differ between
groups for all indices, implying that children with ADHD
did not profit more from reinforcement than their typically
7 Note however that children with ADHD with and without comor-
bidity did not differ significantly on SSRT at baseline.
1478 Eur Child Adolesc Psychiatry (2017) 26:1471–1481
1 3
developing peers. This contradicts several studies, reporting
that children with ADHD profited more from reinforcement
(see [16] for a meta-analysis), but concurs with others that
also did not find such differences between ADHD and typi-
cally developing controls [53, 59–63]. Moreover, despite the
overall significant effect, the same meta-analysis showed
that only 24% of the studies found a larger reward sensitiv-
ity in ADHD as compared to controls [16].
One explanation for the limited effects of reinforce-
ment could be that reinforcement in the current study was
not strong enough. Although the gift was emotionally
appealing, it was only worth 50 eurocents. Children with
ADHD are assumed to have an elevated reward thresh-
old [18, 64], and as shown by Dovis and colleagues [22],
only relatively large rewards (above threshold) motivated
children with ADHD enough to improve their perfor-
mance on EF tasks, whereas small rewards did not exert
any influence on performance. A second explanation
for the absence of pronounced motivation effects is that
several studies have pointed out that only a minority of
children with ADHD shows abnormal sensitivity to rein-
forcement [65, 66], possibly explaining the limited rein-
forcement effects at group level in the current study.
To summarize, stronger time-on-task effects were found
in children with ADHD as compared to children without
ADHD on indices of basic attention (i.e., RT variability
and omission errors). In terms of depletion theories, rein-
forcement prevented a time-on-task effect on RT variabil-
ity, implying that the time-on-task effect in the non-rein-
forced condition could be, at least partly, explained by a
depletion of motivation. On the other hand, reinforcement
did not affect the time-on-task effect on omission errors,
implying that this time-on-task effect could be driven by a
depletion of executive resources. Finally, no time-on-task
effects were found on other indices, among which inhibi-
tion. This implies that, compared to lower level attentional
capacities, higher order executive functions, such as inhi-
bition, seem to be less susceptible to the effects of deple-
tion of executive resources and depletion of motivation.
In the current study, slower mean RTs and higher RT
variability (i.e., derived from a Gaussian model with two
parameters: mean and variance) were interpreted as
potential indicators of attention lapses. However, as
attention lapses produce a skew in RT distributions, an
ex-Gaussian model might be more appropriate, in which
a third parameter (τ) indexes this skew specifically [67]
(note, however, that the τ parameter could be interpreted
as an indicator for many different cognitive processes
[68]). Therefore, all multilevel analyses were also per-
formed for τ as an index for attention lapses.8 The only
significant finding was related to reinforcement: more attentional lapses were reported when participants were
not reinforced than when they were reinforced at T2 [t(91.24) = −2.02, p < 0.05]. Therefore, this additional
analysis partly supports our conclusion that problems (in this case attention lapses) originating in a depletion
of motivation can be counteracted with reinforcement.
8 These analyses were performed using the retimes package in R.
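Footnote 8 indicates that the authors fitted τ with the retimes package in R; purely as an illustration of the same idea, an ex-Gaussian fit and its τ parameter could be obtained in Python as sketched below (this is not the authors' pipeline).

```python
import numpy as np
from scipy.stats import exponnorm

def fit_exgaussian_tau(rts):
    """Fit an ex-Gaussian (normal + exponential) distribution to one
    participant's RTs and return tau, the parameter indexing the slow tail.
    scipy parameterizes it as exponnorm(K, loc=mu, scale=sigma) with
    K = tau / sigma, so tau = K * sigma."""
    k, mu, sigma = exponnorm.fit(np.asarray(rts, dtype=float))
    return k * sigma
```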
The current results should be considered in the light
of four limitations. First, 15 children were excluded, 11
of them because of aberrant performance. Nine of those
11 were diagnosed with ADHD, and a majority dropped
out on the second task (mostly without reinforcement).
Conceivably, these excluded participants had the most pro-
nounced symptoms, and effects were larger if data from
these children were taken into account (see Appendix
3 for results without excluding any of the participants).
These testing difficulties demonstrate an obvious limita-
tion of the depletion paradigm, in which multiple monoto-
nous tasks are administered in succession, in samples with
ADHD. This is in line with a recent meta-analytic review,
which described an increase in core ADHD symptomatol-
ogy (i.e., hyperactivity) in highly cognitively demanding
situations as compared to situations with low cognitive
demands [69]. On the other hand, the current depletion paradigm might be a promising way of testing, mirroring
the daily life routine at school and creating a situation in which ADHD symptoms come to light more easily.
Second, it would have been preferable to diagnose ADHD participants by means of a semi-structured
clinical interview, as the currently used clinical diagnoses
might be more lenient and potentially included some chil-
dren with ‘subclinical’ ADHD. Relatedly, clinical diagno-
ses might be subject to biases [70]. However, it should be
noted that stricter diagnostics and exclusion of subclinical
ADHD participants would logically result in larger and not
smaller differences between typically developing controls
and ADHD subjects. This limitation could also apply to the
control group, in which potential neuropsychiatric prob-
lems could have been missed. However, the average SSRT
of the control group (at T1, SSRT = 254 ms) was consist-
ent with typical control group SSRTs reported in a meta-
analysis ([11]; the mean SSRT in control groups on similar SSTs was 284 ms, k = 24, n = 937). Hence, inhibitory
problems in the control group seem unlikely.
Third, the absence of reinforcement effects is interpreted
as evidence in favor of executive over motivational deple-
tion. However, self-evidently, another explanation for the
absence of those effects could be that statistical power was
not high enough. Therefore, these results should be inter-
preted with caution.
Fourth, the validity of the SSRT as a pure index of
response inhibition is subject to debate. This SSRT deficit
might reflect general attentional or cognitive processes that
go beyond purely inhibitory processes [8, 49]. However, on
a more theoretical level, the current SST is, in our opinion,
the best index to assess response inhibition. First, it directly
taps into the construct of interest and measures inhibi-
tion over a longer time period. Second, a meta-analysis on
response inhibition differences between ADHD and con-
trols reviewing 41 studies showed that between group dif-
ferences (i.e., ADHD groups showing stronger inhibitory
deficits) were most pronounced when responses, as in the
current study, were spatially noncompatible [11]. Note, however, that some of the studies included in that
meta-analysis also reported no group differences in SSRT, as in the cur-
rent study. Third, an additional advantage of the SST is that
it derives both measures for inhibition and for attention.
The current study has several implications, for future
research as well as for clinical practice. EF was assessed
under depleting circumstances, which matches most chil-
dren’s daily school routine, in which they have to use their
EFs over prolonged periods of time. This is not to say that
we advocate the SST as an ecologically valid EF instru-
ment, as laboratory EF tasks do not correlate well with real
life measures [71] and the task that was used in the current
study was more repetitive than regular schoolwork.
Our results showed that attentional problems in ADHD
became more apparent in the second task. Moreover, the
relatively large drop-out in our study potentially indicates
that core psychiatric symptoms might come to light more easily after a certain amount of time. Hence, we
recommend that future research further investigate the diagnostic value of
this paradigm in clinical samples.
With regard to treatment of ADHD, psycho-education to
children with ADHD and their parents and teachers should
emphasize the tendency for problems to increase after pro-
longed exertion. If children recognize depletion, they can
be taught to adjust their behavior accordingly, for example,
by taking a short break or switching to less demanding tasks.
Furthermore, we suggest that therapists keep their sessions short and offer short breaks to prevent attentional lapses.
If time-on-task effects occur, for example, during school
work, this can be partly counteracted with reinforcement.
The current approach was adopted to distinguish
between depletion of executive resources and depletion of
motivation. However, recent studies show that children with ADHD display heterogeneous patterns of deficits,
some showing deficits mainly in executive functioning, whereas in others mainly motivational aberrations are
observed [56,
65, 66, 72, 73]. Consequently, the use of a design compar-
ing ADHD and control groups limits the possibility to elu-
cidate this heterogeneity within the ADHD group. A more
personalized approach, in which individual deficits could
be assessed and, eventually, treated, would be a promising
addition to the literature and is in line with current trends
in youth mental healthcare ([73]; see [74] for an extensive
review on personalized interventions for youth).
In sum, children with ADHD are more affected by time-on-task effects than TD controls, as shown on measures
of inattention (RT variability and omission errors), but not
inhibition. These time-on-task effects seem to originate in
a depletion of executive resources as well as a depletion
of motivation. Offering external reinforcement is a prom-
ising way to compensate for depletion of motivation and,
consequently, to prevent attention lapses in children with
ADHD. The depletion paradigm offers both a new per-
spective on diagnostic assessment of ADHD and provides
further clues for optimizing treatment of children with
ADHD.
Acknowledgements We would like to thank participating schools,
children, and their parents for participation. Furthermore, we would
like to express our gratitude to De Bascule, Academic Center for
Child and Adolescent Psychiatry for the recruitment of children with
ADHD.
Compliance with ethical standards
Funding This research is supported by a VICI grant (453-12-005)
of the last author from the Netherlands Organization for Scientific
Research (NWO). JAR is supported by grant MaGW (480-12-015)
awarded by the Netherlands organization for scientific research
(NWO). The funding source had no role in the study design, collec-
tion, analysis or interpretation of the data, writing the manuscript, nor
the decision to submit the paper for publication.
Conflict of interest All authors declare that they have no conflict of
interest.
Research involving human participants All procedures performed
were in accordance with the ethical standards of the institutional
research committee and with the 1964 Helsinki declaration and its later
amendments or comparable ethical standards.
Informed consent Informed consent was obtained from primary
caretakers of all children included in the study.
Open Access This article is distributed under the terms of the Crea-
tive Commons Attribution 4.0 International License (http://crea-
tivecommons.org/licenses/by/4.0/), which permits unrestricted use,
distribution, and reproduction in any medium, provided you give
appropriate credit to the original author(s) and the source, provide a
link to the Creative Commons license, and indicate if changes were
made.
References
1. Daley D, Birchwood J (2010) ADHD and academic perfor-
mance: why does ADHD impact on academic performance and
what can be done to support ADHD children in the classroom?
Child Care Hlth Dev 36:455–464
2. Nigg JT (2013) Attention-deficit/hyperactivity disorder and
adverse health outcomes. Clin Psychol Rev 33:215–228
3. Klassen AF, Miller A, Fine S (2004) Health-related qual-
ity of life in children and adolescents who have a
diagnosis of attention-deficit/hyperactivity disorder. Pediatrics
114:e541–e547
4. Jensen PS, Hinshaw SP, Kraemer HC, Lenora N, Newcorn JH,
Abikoff HB et al (2001) ADHD comorbidity findings from the
MTA study: comparing comorbid subgroups. J Am Acad Child
Psy 40:147–158
5. Sonuga-Barke EJ, Coghill D (2014) Editorial perspective: laying
the foundations for next generation models of ADHD neuropsy-
chology. J Child Psychol Psyc 55:1215–1217
6. Coghill DR, Seth S, Matthews K (2014) A comprehensive
assessment of memory, delay aversion, timing, inhibition, deci-
sion making and variability in attention deficit hyperactivity
disorder: advancing beyond the three-pathway models. Psychol
Med 44:1989–2001
7. Sonuga-Barke EJ (2003) The dual pathway model of AD/HD:
an elaboration of neuro-developmental characteristics. Neurosci
Biobehav R 27:593–604
8. Alderson RM, Rapport MD, Kofler MJ (2007) Attention-deficit/
hyperactivity disorder and behavioral inhibition: a meta-ana-
lytic review of the stop-signal paradigm. J Abnorm Child Psych
35:745–758
9. Willcutt EG, Doyle AE, Nigg JT, Faraone SV, Pennington BF
(2005) Validity of the executive function theory of attention-def-
icit/hyperactivity disorder: a meta-analytic review. Biol Psychiat
57:1336–1346
10. Martinussen R, Hayden J, Hogg-Johnson S, Tannock R (2005) A
meta-analysis of working memory impairments in children with
attention-deficit/hyperactivity disorder. J Am Acad Child Psy
44:377–384
11. Huizenga HM, van Bers BM, Plat J, van den Wildenberg WP,
van der Molen MW (2009) Task complexity enhances response
inhibition deficits in childhood and adolescent attention-deficit/
hyperactivity disorder: a meta-regression analysis. Biol Psychiat
65:39–45
12. Miyake A, Friedman NP (2012) The nature and organization of
individual differences in executive functions four general conclu-
sions. Cur Dir Psyc Sci 21:8–14
13. Barkley RA (2012) Executive functions What they are, how they
work, and why they evolved. Guilford Press, New York
14. Barkley RA (1997) Behavioral inhibition, sustained atten-
tion, and executive functions: constructing a unifying theory of
ADHD. Psychol Bull 121:65–94
15. Luman M, Tripp G, Scheres A (2010) Identifying the neurobiol-
ogy of altered reinforcement sensitivity in ADHD: a review and
research agenda. Neurosci Biobeh Rev 34:744–754
16. Ma I, van Duijvenvoorde A, Scheres A (2016) The interac-
tion between reinforcement and inhibitory control in ADHD: a
review and research guidelines. Clin Psyc Rev 44:94–111
17. Luman M, van Meel CS, Oosterlaan J, Geurts HM (2012)
Reward and punishment sensitivity in children with ADHD:
validating the sensitivity to punishment and sensitivity to reward
questionnaire for children (SPSRQ-C). J Abn Child Psychol
40:145–157
18. Luman M, Oosterlaan J, Sergeant JA (2005) The impact of rein-
forcement contingencies on AD/HD: a review and theoretical
appraisal. Clin Psychol Rev 25:183–213
19. Johnson KA, Kelly SP, Bellgrove MA, Barry E, Cox M, Gill M,
Robertson IH (2007) Response variability in attention deficit
hyperactivity disorder: evidence for neuropsychological hetero-
geneity. Neuropsychologia 45:630–638
20. Swaab-Barneveld H, De Sonneville L, Cohen-Kettenis P, Gielen
A, Buitelaar J, van Engeland H (2000) Visual sustained attention in
a child psychiatric population. J Am Acad Child Psy 39:651–659
21. Sonuga-Barke EJ, Castellanos FX (2007) Spontaneous atten-
tional fluctuations in impaired states and pathological conditions:
a neurobiological hypothesis. Neurosci Biobehav R 31:977–986
22. Dovis S, van der Oord S, Wiers RW, Prins PJM (2012) Can moti-
vation normalize working memory and task persistence in chil-
dren with attention-deficit/hyperactivity disorder? The effects of
money and computer-gaming. J Abnorm Child Psych 40:669–681
23. Hofmann W, Schmeichel BJ, Baddeley AD (2012) Executive
functions and self-regulation. Trends Cogn Sci 16:174–180
24. Muraven M, Baumeister RF (2000) Self-regulation and deple-
tion of limited resources: does self-control resemble a muscle?
Psychol Bull 126:247–259
25. Baumeister RF, Vohs KD, Tice DM (2007) The strength model
of self-control. Cur Dir Psychol Sci 16:351–355
26. Hagger MS, Wood C, Stiff C, Chatzisarantis NL (2010) Ego
depletion and the strength model of self-control: a meta-analy-
sis. Psychol Bull 136:495–525
27. Inzlicht M, Schmeichel BJ (2012) What is ego depletion?
Toward a mechanistic revision of the resource model of self-
control. Perspect Psychol Sci 7:450–463
28. Huizenga HM, van der Molen MW, Bexkens A, Bos MG, van
den Wildenberg WP (2012) Muscle or motivation? A stop-sig-
nal study on the effects of sequential cognitive control. Front
Psychol 3:126
29. Muraven M, Slessareva E (2003) Mechanisms of self-control
failure: motivation and limited resources. Pers Soc Psychol B
29:894–906
30. Logan GD, Cowan WB (1984) On the ability to inhibit
thought and action: a theory of an act of control. Psychol Rev
91:295–327
31. Logan GD (1994) On the ability to inhibit thought and action:
a users guide to the stop signal paradigm. In: Dagenbach D,
Carr TH (eds) Inhibitory processes in attention, memory, and
language. Academic Press, San Diego, pp 189–239
32. American Psychiatric Association (2000) Diagnostic and sta-
tistical manual of mental disorders, 4th revised edition (DSM-
IV-TR). American Psychiatric Association, Washington, DC
33. Greenhill LL (1998) Childhood attention deficit hyperactivity
disorder: Pharmacological treatments. In: Nathan PE, Gorman
J (eds) A guide to treatments that work. Oxford University
Press, New York, pp 42–64
34. Soreni N, Crosbie J, Ickowicz A, Schachar R (2009) Stop sig-
nal and conners’ continuous performance tasks test—retest
reliability of two inhibition measures in ADHD children. J
Attention Disord 13:137–143
35. Solanto MV, Abikoff H, Sonuga-Barke E, Schachar R, Logan
GD, Wigal T, Turkel E (2001) The ecological validity of delay
aversion and response inhibition as measures of impulsivity
in AD/HD: a supplement to the NIMH multimodal treatment
study of AD/HD. J Abnorm Child Psych 29:215–228
36. Tillman CM, Thorell LB, Brocki KC, Bohlin G (2007) Motor
response inhibition and execution in the stop-signal task:
development and relation to ADHD behaviors. Child Neu-
ropsychol 14:42–59
37. Pliszka SR, Borcherding SH, Spratley K, Leon S, Irick S
(1997) Measuring inhibitory control in children. J Dev Behav
Pediat 18:254–259
38. Nigg JT (1999) The ADHD response-inhibition deficit as
measured by the stop task: replication with DSM–IV com-
bined type, extension, and qualification. J Abnorm Child Psych
27:393–402
39. Winstanley CA, Eagle DM, Robbins TW (2006) Behavioral
models of impulsivity in relation to ADHD: translation between
clinical and preclinical studies. Clin Psychol Rev 26:379–395
40. Bezdjian S, Baker LA, Lozano DI, Raine A (2009) Assessing
inattention and impulsivity in children during the Go/NoGo task.
Brit J Dev Psychol 27:365–383
41. Leth-Steensen C, Elbaz ZK, Douglas VI (2000) Mean response
times, variability, and skew in the responding of ADHD
children: a response time distributional approach. Acta Psychol
104:167–190
42. Vaurio RG, Simmonds DJ, Mostofsky SH (2009) Increased
intra-individual reaction time variability in attention-deficit/
hyperactivity disorder across response inhibition tasks with dif-
ferent cognitive demands. Neuropsychol 47:2389–2396
43. Doran SM, van Dongen HPA, Dinges DF (2001) Sustained
attention performance during sleep deprivation: evidence of state
instability. Arch Ital Biol 139:253–267
44. Snijders TAB, Bosker RJ (1999) Multilevel analysis: an intro-
duction to basic and advanced multilevel modeling. Sage, Thou-
sand Oaks, CA
45. Zelazo PD, Craik FI, Booth L (2004) Executive function across
the life span. Acta Psychol 115:167–183
46. Frazier TW, Demaree HA, Youngstrom EA (2004) Meta-analysis
of intellectual and neuropsychological test performance in atten-
tion-deficit/hyperactivity disorder. Neuropsychol 18:543–555
47. Dennis M, Francis DJ, Cirino PT, Schachar R, Barnes MA,
Fletcher JM (2009) Why IQ is not a covariate in cognitive stud-
ies of neurodevelopmental disorders. J Int Neuropsych Soc
15:331–343
48. Oosterlaan J, Logan GD, Sergeant JA (1998) Response inhibi-
tion in AD/HD, CD, comorbid AD/HD + CD, anxious, and
control children: a meta-analysis of studies with the stop task. J
Child Psychol Psyc 39:411–425
49. Lijffijt M, Kenemans JL, Verbaten MN, van Engeland H (2005)
A meta-analytic review of stopping performance in attention-def-
icit/hyperactivity disorder: deficient inhibitory motor control? J
Abnorm Psychol 114:216–222
50. Demurie E, Roeyers H, Wiersema JR, Sonuga-Barke E (2013) No
evidence for inhibitory deficits or altered reward processing in
ADHD: data from a new integrated monetary incentive delay go/
no-go task. J Attention Disord. doi:10.1177/1087054712473179
51. Daugherty TK, Quay HC, Ramos L (1993) Response persevera-
tion, inhibitory control, and central dopaminergic activity in
childhood behavior disorders. J Genet Psychol 154:177–188
52. Jennings JR, van der Molen MW, Pelham W, Debski KB, Hoza B
(1997) Inhibition in boys with attention deficit hyperactivity dis-
order as indexed by heart rate change. Dev Psychol 33:308–318
53. Scheres A, Oosterlaan J, Sergeant JA (2001) Response execution
and inhibition in children with AD/HD and other disruptive dis-
orders: the role of behavioural activation. J Child Psychol Psyc
42:347–357
54. Sonuga-Barke EJ (2002) Psychological heterogeneity in AD/
HD—a dual pathway model of behaviour and cognition. Beh
Brain Res 130:29–36
55. Crosbie J, Schachar R (2014) Deficient inhibition as a marker for
familial ADHD. Am J Psychiat 158:1884–1890
56. Nigg JT, Willcutt EG, Doyle AE, Sonuga-Barke EJ (2005)
Causal heterogeneity in attention-deficit/hyperactivity disorder:
do we need neuropsychologically impaired subtypes? Biol Psy-
chiat 57:1224–1230
57. Kofler MJ, Rapport MD, Sarver DE, Raiker JS, Orban SA, Fried-
man LM, Kolomeyer EG (2013) Reaction time variability in
ADHD: a meta-analytic review of 319 studies. Clin Psychol Rev
33:795–811
58. O’Connell RG, Bellgrove MA, Dockree PM, Robertson IH
(2004) Reduced electrodermal response to errors predicts poor
sustained attention performance in attention deficit hyperactivity
disorder. NeuroReport 15:2535–2538
59. Oosterlaan J, Sergeant JA (1998) Effects of reward and response
cost on response inhibition in AD/HD, disruptive, anxious, and
normal children. J Abnorm Child Psych 26:161–174
60. Shanahan MA, Pennington BF, Willcutt EW (2008) Do moti-
vational incentives reduce the inhibition deficit in ADHD? Dev
Neuropsychol 33:137–159
61. Desman C, Petermann F, Hampel P (2008) Deficit in response
inhibition in children with attention deficit/hyperactivity dis-
order (ADHD): impact of motivation? Child Neuropsychol
14:483–503
62. Wodka EL, Mark Mahone E, Blankner JG, Gidley Larson JC,
Fotedar S, Denckla MB, Mostofsky SH (2007) Evidence that
response inhibition is a primary deficit in ADHD. J Clin Exp-
Neuropsyc 29:345–356
63. Stevens J, Quittner AL, Zuckerman JB, Moore S (2002) Behav-
ioral inhibition, self-regulation of motivation, and working mem-
ory in children with attention deficit hyperactivity disorder. Dev
Neuropsychol 21:117–139
64. Haenlein M, Caul WF (1987) Attention deficit disorder with
hyperactivity: a specific hypothesis of reward dysfunction. J Am
Acad Child Psy 26:356–362
65. Sonuga-Barke E, Bitsakou P, Thompson M (2010) Beyond the
dual pathway model: evidence for the dissociation of timing,
inhibitory, and delay-related impairments in attention-deficit/
hyperactivity disorder. J Am Acad Child Psy 49:345–355
66. Dovis S, Van der Oord S, Huizenga HM, Wiers RW, Prins
PJM (2015) Prevalence and diagnostic validity of motivational
impairments and deficits in visuospatial short-term memory and
working memory in ADHD subtypes. Eur Child and Adoles Psy
24:575–590
67. Lacouture Y, Cousineau D (2008) How to use MATLAB to fit the
ex-Gaussian and other probability functions to a distribution of
response times. Tutorials Quant Meth Psychol 4:35–45
68. Matzke D, Wagenmakers EJ (2009) Psychological interpreta-
tion of the ex-Gaussian and shifted Wald parameters: a diffusion
model analysis. Psychon B Rev 16:798–817
69. Kofler MJ, Raiker JS, Sarver DE, Well EL, Soto EF (2016) Is
hyperactivity ubiquitous in ADHD or dependent on environmen-
tal demands? Evidence from meta-analysis. Clin Psychol Rev
46:12–24
70. Garb HN (2006) The conjunction effect and clinical judgment. J
Soc Clin Psychol 25:1048–1056
71. Toplak ME, West RF, Stanovich KE (2013) Practitioner review:
do performance-based measures and ratings of executive func-
tion assess the same construct? J Child Psychol Psyc 54:131–143
72. Fair DA, Bathula D, Nikolas MA, Nigg JT (2012) Distinct neu-
ropsychological subgroups in typically developing youth inform
heterogeneity in children with ADHD. PNAS 109:6769–6774
73. De Zeeuw P, Weusten J, van Dijk S, van Belle J, Durston S
(2012) Deficits in cognitive control, timing and reward sensitiv-
ity appear to be dissociable in ADHD. PLoS ONE 7:e51416
74. Ng MY, Weisz JR (2016) Annual research review: building a
science of personalised intervention for youth mental health. J
Child Psychol Psyc 57:216–236
- Time-on-task effects in children with and without ADHD: depletion of executive resources or depletion of motivation?
Abstract
Introduction
Method
Participants
Materials
Procedure
Data analysis
Results
Exclusion of participants
Demographics
Effects of group: ADHD vs. TD controls (T1)
Multilevel analyses (also see Table 2)
Effects of group: ADHD vs. TD controls (T2: not reinforced)
Time-on-task effects in ADHD (T2 vs. T1: not reinforced)
Time-on-task effects in ADHD vs. TD controls (T2 vs. T1: not reinforced)
Effects of reinforcement in ADHD (T2: reinforced vs. not reinforced)
Effects of reinforcement in ADHD vs. TD controls (T2: reinforced vs. not reinforced)
Discussion
Acknowledgements
References
Learning and Instruction 44 (2016) 128–143
Contents lists available at ScienceDirect
Learning and Instruction
journal homepage: www.elsevier.com/locate/learninstruc

Off-task behavior in elementary school children
Karrie E. Godwin a, *, Ma. V. Almeda b, Howard Seltman c, Shimin Kai b, Mandi D. Skerbetz d, Ryan S. Baker b, Anna V. Fisher a
a Carnegie Mellon University, Department of Psychology, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA
b Teachers College Columbia, Department of Human Development, 525 W 120th Street, New York, NY 10027, USA
c Carnegie Mellon University, Department of Statistics, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA
d South Fayette Township School District, 3680 Old Oakdale Road, McDonald, PA 15057, USA

Article info
Article history: Received 14 August 2014; Received in revised form 17 April 2016; Accepted 23 April 2016; Available online 31 May 2016
Keywords: Off-task behavior; Attention; Time on-task; Instructional design

* Corresponding author. Carnegie Mellon University, Department of Psychology & PIER, 336E Baker Hall, 5000 Forbes Avenue, Pittsburgh, PA 15213, USA. E-mail address: kegodwin@andrew.cmu.edu (K.E. Godwin).

http://dx.doi.org/10.1016/j.learninstruc.2016.04.003
0959-4752/© 2016 Elsevier Ltd. All rights reserved.
Abstract
This paper reports results from a large-scale observational study investigating attention allocation during
instructional activities in elementary school students (kindergarten through fourth-grade). In Study 1, 22
classrooms participated while a more diverse sample of 30 classrooms participated in Study 2. This work
investigated temporal patterns in children’s attention allocation by collecting observational data on
children’s on- and off-task behaviors at three different time points (i.e., beginning, middle, and end of the
school year) [Study 1]. We also investigated whether patterns of attention allocation changed as a
function of student characteristics (gender, grade-level, SES), teachers’ instructional design choices
(instructional format and duration of an instructional activity), and school type (private, parochial, public
charter schools) [Studies 1 & 2]. Children’s patterns of attention allocation fluctuated over the course of
the school year. Female students were found to be more on-task compared to male students. On-task
behavior tended to decline as the instructional duration increased. The lowest rates of on-task
behavior were observed while children were engaged in whole-group instructional formats. An effect
of school type was found with higher proportions of on-task behavior observed in parochial schools.
However, the effect of grade-level was equivocal across studies. These findings can begin to form a
foundation for the development of research-based guidelines for instructional design aimed to support
engagement among students in elementary classrooms.
Loss of instructional time due to off-task behavior is a well-
established problem in educational settings, recognized by re-
searchers (e.g., Carroll, 1963; Karweit & Slavin, 1981; Lee, Kelly, &
Nyre, 1999) and practitioners (e.g., Lemov, 2010) for over a hun-
dred years (cf. Currie, 1884 as cited in Berliner, 1990). Off-task
behavior has been documented to negatively impact academic
achievement, although the magnitude of this impact is unstable
across studies (for reviews see Frederick & Walberg, 1980;
Goodman, 1990). Off-task behavior is an indicator that students’
attention is not focused on the instructional activity. The link be-
tween the quality of attention and task performance has also been
documented in the cognitive psychology literature (e.g., Choudhury
& Gorman, 2000; Dixon & Salley, 2007; DeMarie-Dreblow & Miller,
1988). Despite considerable prior research on off-task behavior,
designing effective, easy to implement, and scalable interventions
to reduce off-task behavior has been challenging. Roberts (2002)
suggests that many existing interventions may be unsuccessful
because they do not take into sufficient account the conditions that
lead to off-task behavior. The goal of the present study is to eluci-
date some of the factors involved in off-task behavior in elementary
school settings.
1. Off-task behavior in elementary school students
There is a variety of reasons why loss of instructional time oc-
curs in schools; these reasons include but are not limited to:
weather (e.g., snow days), sudden onset interruptions (e.g., an-
nouncements over the loudspeakers), and special events. However,
student inattentiveness (i.e., off-task behavior during instructional
time) has been found to be the biggest factor that accounts for loss
of instructional time (Karweit & Slavin, 1981). Prior research esti-
mates that elementary school students spend between 10% and 50%
Delta:1_given name
Delta:1_surname
Delta:1_given name
Delta:1_surname
Delta:1_given name
Delta:1_surname
Delta:1_given name
mailto:kegodwin@andrew.cmu.edu
http://crossmark.crossref.org/dialog/?doi=10.1016/j.learninstruc.2016.04.003&domain=pdf
www.sciencedirect.com/science/journal/09594752
http://www.elsevier.com/locate/learninstruc
http://dx.doi.org/10.1016/j.learninstruc.2016.04.003
http://dx.doi.org/10.1016/j.learninstruc.2016.04.003
http://dx.doi.org/10.1016/j.learninstruc.2016.04.003
of their time off-task in regular education classrooms (e.g., Fisher
et al., 1980; Karweit & Slavin, 1981; Lee et al., 1999; Lloyd &
Loper, 1986). Inattention or off-task behavior is a serious chal-
lenge educators face. In fact, off-task behavior has been identified
as one of the most common reasons for student referrals (Roberts,
2001). While eliminating all off-task behavior is not a realistic
expectation, reducing rates of off-task behavior is an important goal
given the challenges that off-task behavior causes for classroom
management as well as the potential implications of off-task
behavior on academic achievement.
A large number of prior studies have examined off-task behavior
in elementary school students; however, the generalizability of
prior work is limited due to its relatively narrow scope. For
example, some studies observed a few classrooms within a single
grade level (e.g., Lahaderne, 1968; Samuels & Turnure, 1974). Other
studies involved a large number of classrooms (e.g., 18 to 25
classrooms), but only observed a small subset of students within
each classroom (e.g., Fisher et al., 1980; Karweit & Slavin, 1981).
Indeed, the wide range in estimates of off-task behavior reported in
the literature may be partially attributed to the relatively small
sample sizes utilized in prior research, as small samples are more
susceptible to the influence of extreme data points. In order to
establish a more comprehensive understanding of children’s on
and off-task behaviors during early and middle childhood, research
examining children’s patterns of attention allocation on a larger
scale and across multiple grade-levels is clearly needed.
The present work makes a contribution to the field due to its size
and scope: this work includes a large sample size both in terms of
the number of classrooms which were recruited (e.g., Study 1: 22
classrooms, Study 2: 30 classrooms) as well as the number of
children within each classroom who were observed (i.e., all stu-
dents in attendance were included in the study). In contrast to
some prior work which tended to focus on one particular grade or a
small range of grade-levels, this work conducts observations across
a broad range of grade-levels in elementary school (i.e., kinder-
garten through fourth-grade).
A second contribution of this work pertains to the detailed
coding scheme that was employed to provide a more nuanced
examination of the sources of off-task behavior common in
elementary school settings. Previous work examining off-task
behavior in classrooms tended to treat off-task behavior as a uni-
tary construct (e.g., Carnine, 1976; Frederick, Walberg, & Rasher,
1979; Karweit & Slavin, 1981). Consequently, the sources of chil-
dren’s off-task behavior remain underspecified. In the present
study we delineate common types of off-task behavior including
off-task peer interactions, self-distraction, and off-task behaviors
directed towards aspects of the classroom environment. Identifying
common types of off-task behavior in elementary school settings is
critical as interventions targeting inattention will be successful only
to the extent that they adequately address the source of children’s
off-task behavior. The types of off-task behavior measured in the
present work were based on the results of a teacher survey. Thirty
elementary school teachers were asked to rate the frequency of
students’ off-task behaviors on a scale from 1 to 4, where 1 in-
dicates that the behavior occurs rarely and 4 indicates that the
behavior is very frequent. Peers (M = 3.21, SD = 0.62) and self-distractions (M = 2.62, SD = 0.90) were identified by teachers as frequent sources of off-task behavior. Additionally, walking around the classroom (or being out of one’s seat) was identified as a frequent off-task behavior by 14 of the teachers (M = 2.50, SD = 0.85). Off-task behavior relating to the environment was identified as another common source of distraction (M = 1.83, SD = 0.85).
room environment is of particular interest because of the hypoth-
esized link between off-task behavior and visual design features of
elementary school classrooms (e.g., Fisher, Godwin, & Seltman,
2014; Godwin & Fisher, 2011). Primary classrooms often contain
large amounts of stimulating sensory displays intended to increase
children’s motivation and engagement (Barrett, Zhang, Moffat, &
Kobbacy, 2012; Tarr, 2004; Thompson & Raisor, 2013). However,
there is no empirical evidence demonstrating that this design
choice increases motivation and engagement. By contrast, large
amounts of stimulating displays in classrooms have been described
as “visual bombardment” (Bullard, 2010, p. 110) and a “cacophony
of imagery” (Tarr, 2004, p. 1). Barrett and colleagues (Barrett et al.,
2012) recently reported that, contrary to their initial hypothesis,
high amounts of color (i.e., the degree and manner in which color
was utilized in the classroom walls, furniture, and displays) was
negatively associated with elementary school children’s achieve-
ment scores (although note that a follow-up study by Barrett,
Davies, Zhang, and Barrett (2015) reported that very low amounts
of color are also negatively associated with achievement, suggest-
ing that there may be a level of visual stimulation that is optimal for
classroom settings). Furthermore, there is recent experimental
evidence supporting the notion that highly decorated learning
environments may actually promote off-task behavior in young
children and thereby decrease learning (Fisher et al., 2014; Godwin
& Fisher, 2011).
Despite a large number of studies documenting rates of off-task
behavior in elementary school students, there has been limited
research examining the factors associated with off-task behavior.
The present work aims to address this gap in the literature by
conducting an exploratory study which examines four main
research questions: (1) Do patterns of attention allocation change
over the course of the school year? (2) Are student characteristics
(e.g., gender, grade-level, and SES) related to children’s attention
allocation patterns? (3) Are instructional design strategies (e.g.,
instructional format and duration of instructional activity) related
to children’s tendency to engage in on and off-task behavior? (4) Is
school-type related to children’s attention allocation patterns?
Below we briefly discuss how each of these factors may be related
to patterns of attention allocation in elementary school students.
There has been limited investigation of the variability in chil-
dren’s patterns of attention allocation as a function of time (Martin
et al., 2015). The idea that students’ attentional capacity fluctuates
over the course of the school day is a common belief in education
circles. Indeed, teachers’ report that they modify their instruction
in response to fluctuations in students’ levels of attention by
avoiding challenging instructional activities following lunch or at
the end of the school day (for discussion see Muyskens & Ysseldyke,
1998; Ammons, Booker, & Killmon, 1995). Observational research
examining the effect of time of day on school children’s classroom
behaviors has found that inappropriate behaviors are more
frequent in the afternoon compared to the morning (Muyskens &
Ysseldyke, 1998). Similar findings have been obtained with chil-
dren who have attention deficit disorders (e.g., Antrop, Roeyers, &
De Baecke, 2005; Zagar & Bowers, 1983). Furthermore, studies us-
ing performance-based measures of attention (e.g., paper and
pencil visual search tasks in which participants are asked to locate
and cross out a target object from a group of distractors) have found
that performance on tests of attention is highest in the mid-
morning and declines mid-day, although there is some variability
in the observed attention patterns for preschool children and stu-
dents in primary grades (e.g., Janvier & Testu, 2007). Although
levels of attention have been found to oscillate over the course of
the school day, it is currently an open question if and how patterns
of attention allocation change across the school year. Specifically,
the proportion of on-task behavior as well as the prevalence of
different types of off-task behavior may fluctuate as children
become more familiar with their teacher, school rules, peers, and
their classroom environment. For example, self-distractions may be
more common in the beginning of the school year, but as time
progresses children may become better acquainted with their
classmates, leading to more off-task behavior directed towards
peers in the middle and end of the school year. Another possible
outcome is that children may habituate to their classroom visual
environment. Therefore, off-task behavior directed at the visual
features of the classroom may decrease from the beginning of the
school year compared to the end of the school year.
Student characteristics also likely influence children’s patterns
of attention allocation. For example, prior work suggests that males
exhibit more off-task behavior compared to females (Marks, 2000;
Matthews, Ponitz, & Morrison, 2009). Consequently, it is of interest
to investigate whether gender differences emerge in a large sample
of elementary school children and to investigate whether the
specific types of off-task behavior children engage in vary as a
function of gender. Grade-level is another factor that may
contribute to children’s tendency to engage in off-task behavior as
prior research has documented that the ability to engage in selec-
tive sustained attention improves with age (e.g., Bartgis, Thomas,
Lefler, & Hartung, 2008; for review see; Fisher & Kloos, 2016;
White, 1970). Additionally, it is possible that specific types of off-
task behavior may be more prevalent in younger grade-levels
(e.g., self-distraction) while other types of distraction may be more
pervasive across grade-levels (e.g. peers). Furthermore, rates of on
and off-task behavior may vary as a function of SES. Prior work has
found that executive function skills are typically weaker in children
from a lower socioeconomic background (e.g., Wiebe et al., 2011);
consequently, these children may have greater difficulty inhibiting
distractions and thus be more likely to engage in off-task behaviors.
Instructional design choices (i.e., average duration of an
instructional activity and instructional format) may also be related
to off-task behavior in elementary school children. For example, the
duration of an instructional activity may influence children’s ability
to attend to the ongoing instruction. Specifically, children may be
better able to maintain a state of focused attention when instruc-
tional activities are shorter in duration; this may be particularly
true for younger children who are still developing the ability to
efficiently regulate their attention. This possibility is consistent
with studies suggesting that in laboratory settings the duration of
focused attention increases gradually with development, from
approximately 4-min in 2- and 3-year-old children to over 9-min in
5- and 6-year-old children (for review see Fisher & Kloos, 2016).
Surprisingly, this issue has not been investigated systematically in
genuine education settings. Consequently, there is a dearth of
evidence-based guidelines that teachers can utilize to inform their
instructional design choices. Teachers typically have considerable
autonomy when determining how instructional time is allotted
(Rettig & Canady, 2013). At the same time, better insight into what
the optimal durations of instructional activity might be for main-
taining high rates of on-task behavior in elementary schools would
be valuable information for educators.
It is also possible that some types of instructional format (e.g.,
whole-group instruction, small-group instruction, etc.) are more
likely to be associated with higher rates of off-task behavior than
other instructional formats. There is evidence indicating that small-
group instruction (when groups are formed on the basis of student
ability) is more effective than whole-group instruction with regards
to student achievement (for reviews see Lou et al., 1996; Kulik,
1992). However, the size of this effect is relatively small and vari-
ability in effect sizes across individual studies is substantial. For
example, Lou et al. (1996) reported that the effect size of small-
group versus whole-group instruction ranged from −1.96 to 1.52,
with an average effect size of 0.17. Thus, some researchers have
argued that the large variability in effect sizes severely limits the
degree to which it can be concluded that small-group instruction is
superior to whole-group instruction (Prais, 1998, 1999). The pos-
sibility that the type of instructional format is related not only to
achievement but also to off-task behavior has to our knowledge
been largely unexplored (see Goodman, 1990); however, some
prior research has documented higher rates of student engagement
during teacher led activities (BTES as cited in Goodman, 1990; Good
& Beckerman, 1978; Ponitz & Rimm-Kaufman, 2011). The present
work provides a nuanced examination of the relationship between
off-task behavior and specific instructional formats (e.g., individual
work, small-group or partner work, whole-group instruction at
desks, whole-group instruction while sitting on the carpet) in
elementary school students. Obtaining evidence to evaluate this
possibility is important because it can empower teachers to choose
instructional formats that are likely to optimize children’s attention
allocation.
Lastly, rates of on and off-task behavior as well as the types of
off-task behavior that children engage in may vary as a function of
school type (e.g., public schools, private schools, parochial schools).
For example, schools may have different group cultures or norms
regarding student behavior. These shared expectations may influ-
ence children’s patterns of attention allocation.
2. The present study
Within this paper, we report the results of two studies. Study 1
investigates the temporal patterns in children’s attention allocation
[Research Question 1]. Additionally, Study 1 examines whether
student characteristics and specific instructional design choices are
associated with patterns of attention allocation in elementary
school children, both in terms of the overall amount of on and off-
task behavior but also the form which off-task behavior takes
[Research Questions 2 and 3 respectively]. Data for Study 1 contains
a relatively homogeneous set of schools, as all participating schools
were part of the same public charter school organization. Study 2
investigates whether the results obtained in Study 1 regarding the
role of student characteristics and instructional design strategies on
children’s selective sustained attention can be generalized to a
more diverse sample of schools. In Study 2, we collected data from a
more heterogeneous sample of schools that varied in terms of the
socioeconomic status of the student population [Research Question
3] as well as school type (i.e., public charter schools, private schools,
and parochial schools) [Research Question 4].
2.1. Study 1: Temporal patterns of on- and off-task behavior across
the school year
In order to address Research Question 1, Study 1 examined
temporal patterns in children’s attention allocation by collecting
observational data on children’s on- and off-task behavior over the
course of the school year. To this end, we conducted observations at
three different time points: the beginning, middle, and end of the
school year. For each time point the proportion of on- and off-task
behavior was modeled in order to determine if children’s attention
patterns fluctuated or remained stable throughout the school year.
Additionally, we examined the possibility that the prevalence of
certain types of off-task behavior may change over time. We also
examined whether children’s patterns of attention allocation
changed as a function of student characteristics [Research Question
2] or based on instructional design strategies [Research Question 3].
2.2. Method
2.2.1. Participants
Twenty-two classrooms participated in Study 1. Participating
classrooms were selected from 5 charter schools located in or near a
Northeastern medium-sized city in the United States of America.
Five grade-levels were recruited: kindergarten through fourth-
grade.1 The distribution across the five grade-levels was as follows:
5 kindergarten classrooms, 4 first-grade classrooms, 5 second-
grade classrooms, 2 third-grade classrooms, and 6 fourth-grade
classrooms. The average class size was 21 students (10 males, 11
females). However, due to absences the average number of children
observed in a single observation session was 18.9 children. The
number of children observed per session ranged from 15 to 22
children.
2.2.2. Design and procedure
The observation sessions were staggered across three time pe-
riods (Time 1: October 2011–December 2011, Time 2: February
2012–April 2012, Time 3: May 2012–June 2012). In order to
minimize measurement error and obtain more stable estimates of
on- and off-task behavior for each classroom, two observation
sessions were conducted during each time period for a total of six
observations per classroom. However, due to scheduling con-
straints, in four of the 22 classrooms only five observation sessions
were conducted. Thus, a total of 128 observation sessions were
conducted in Study 1. The average delay between observation
sessions within a single time period was 3.6 days (the delay ranged
from 1 to 14 calendar days). The average delay across time periods
was 90.5 days. Each observation session lasted approximately one
hour. The average number of observations per session was 346.13
and the average number of observations per child within a session
was 19.27. For the purposes of the analyses reported below, chil-
dren in each session were treated as a different set of students
because student identifiers could not be collected. As a result, it was
not possible to link observations across the six sessions. Therefore,
a total of 2402 student-session pairs were observed. A student-
session pair refers to a specific student observed by a coder
within a specific session. However, treating the children within
each session as a different set of students artificially inflates sta-
tistical power. In order to mitigate this concern, a more conserva-
tive alpha level was used in the analyses reported below.
Specifically, the alpha level was adjusted to 0.0083 (the commonly
accepted alpha level of 0.05 was divided by 6, the total number of
observations, in order to more closely approximate the true size of
the sample).
2.2.3. Coding on- and off-task behaviors
All coders were trained in the Baker-Rodrigo Observation Method
Protocol (BROMP) for coding behavioral data in field settings
(Ocumpaugh, Baker, & Rodrigo, 2012). All coders received extensive
training consisting of coding videotapes and live observation ses-
sions. Inter-rater reliability was established prior to the study
proper. Kappa values ranged from 0.79 to 0.84. This level of reli-
ability is in line with past classroom research coding off-task
behavior, and exceeded the 0.75 threshold to which Fleiss (1981)
refers to as “excellent” in field settings.
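For readers who want to reproduce this kind of reliability check, the short sketch below (our own illustration, not the authors' code) computes Cohen's kappa from two coders' parallel observation records; the coder lists and labels are hypothetical.

# Minimal sketch (not the authors' code): inter-rater agreement between two
# coders who labeled the same sequence of observed children.
from sklearn.metrics import cohen_kappa_score

# Hypothetical parallel records using BROMP-style behavior labels.
coder_a = ["on", "on", "peer", "self", "on", "env", "on", "on", "peer", "on"]
coder_b = ["on", "on", "peer", "self", "on", "on", "on", "on", "peer", "on"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # the study reports kappas of 0.79 to 0.84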
Field coding was conducted using the HART app for Android
handheld computers (Baker et al., 2012), which enforces the
BROMP protocol. Children were observed using a round-robin
coding strategy, in order to reduce the tendency of observers to
attend to more salient instances of off-task behavior. In the round-robin coding strategy, each child present in the classroom was observed individually in a prescribed order determined at the beginning of each session. Each time a child was observed, the observation lasted for up to 20 s. The first unambiguous behavior observed during the 20 s period was recorded. Quick glances were considered ambiguous behaviors, and coders were instructed to wait for an unambiguous behavior to occur (i.e., a behavior that was sustained long enough for the coder to identify and code the behavior based on the coding scheme described below). If a behavior was noted before 20 s elapsed, the coder proceeded to the next child, and a new 20 s observation period began. This process was repeated for the duration of the session; in this way each child was observed multiple times throughout the observation session.
1 Due to the nature of the IRB approval for this study, no identifying information, including date of birth, was collected. Consequently, we are unable to provide the mean age per grade-level. However, according to the National Center for Education Statistics (2001), the average age at school entry for U.S. kindergarten children is 5.5 years.
2 Two observation sessions were excluded from analysis due to disruptions that occurred during the observation session.
Coders observed the children using peripheral vision or side-
glances in order to avoid looking directly at the student being
observed. This technique makes it less apparent to the child that
s(he) is being observed. This procedure has successfully and reliably
captured students’ behavior in prior work which assessed behavior
and affect in middle and high school students (e.g., Baker, 2007;
Baker, D’Mello, Rodrigo, & Graesser, 2010; Ocumpaugh et al., 2012).
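To make the round-robin sampling routine concrete, the following sketch (a simplified illustration under our own assumptions, not the BROMP/HART software) cycles through a class roster in a fixed order and records one behavior code per child per pass; the roster, the stand-in coder function, and the observation count are all hypothetical.

# Simplified illustration of round-robin momentary time sampling.
# The coder observes each child in a fixed rotation and records the first
# unambiguous behavior seen within a (up to) 20-second window.
from itertools import cycle

def round_robin_codes(roster, observe, n_observations):
    """Collect n_observations (child, code) pairs by rotating through the roster."""
    records = []
    for child in cycle(roster):
        if len(records) >= n_observations:
            break
        records.append((child, observe(child)))
    return records

# Toy usage: 18 children and a deterministic stand-in for the human coder.
roster = [f"child_{i:02d}" for i in range(1, 19)]
fake_coder = lambda child: "on" if int(child[-2:]) % 3 else "peer"
session = round_robin_codes(roster, fake_coder, n_observations=60)
on_task_rate = sum(code == "on" for _, code in session) / len(session)
print(f"{on_task_rate:.0%} of observations coded on-task")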
Coders first classified children’s behavior as on- or off-task using
primarily the direction of the child’s gaze. If the child was looking at
the teacher (or classroom assistant), the instructional activity, and/
or the relevant instructional materials, they were categorized as
being on-task. If the child was looking elsewhere, they were cate-
gorized as being off-task. Contextual clues (i.e., teacher in-
structions) were also taken into consideration when distinguishing
between on- and off-task behaviors. For example, if a child was
instructed to discuss an idea with a partner, coders would classify
conversing with another peer as on-task unless the coders could
clearly discern that the conversation was unrelated to the task.
If the child was classified as off-task, the type of off-task
behavior was recorded. Five3 mutually exclusive categories of off-
task behavior were logged: (1) Self-distraction, (2) Peer distraction,
(3) Environmental distraction, (4) Walking, or (5) Other. Self-
distraction entailed engagement with something on the child’s own
body, such as an article of clothing or an appendage, as well as
episodes in which the child would close their eyes. Peer distraction
was defined as interacting with or looking at another student when
not directed to do so. Environmental distractions include interacting
with or looking at any object in the classroom that was not related
to the task at hand. Walking was operationalized as a student
physically walking around the classroom when it was not consid-
ered appropriate for the task. Other distractions included student
behavior that was off-task but did not clearly align with the five
aforementioned categories. A sixth category Unknown was also
included to capture rare instances in which it was unknown
whether the child was on- or off-task, and it was impossible or inappropriate for the observer to relocate in order to obtain a better view of the child. Unknown was also used when students left the classroom for various reasons (e.g., to use the restroom). Since the category Unknown is not informative in terms of children’s patterns of attention allocation it was excluded from the analyses. The category Unknown accounted for 5% of the total observations coded.
3 Mind wandering, also referred to as “Stimulus independent thought” (Killingsworth & Gilbert, 2010, p. 932) and daydreaming (see Smallwood, Fishman, & Schooler, 2007), can be considered another form of off-task behavior. However, in the present study mind wandering was not included as a category of off-task behavior due to methodological concerns. This particular form of off-task behavior is not readily observable. Instead, mind wandering is typically assessed using thought sampling (Smallwood et al., 2007). However, it is unclear whether young children possess the metacognitive capabilities to self-report the occurrence or frequency with which they experience mind wandering. Relying on self-report may also be particularly problematic since mind wandering can occur without awareness (see Schooler et al., 2011; Smallwood et al., 2007). Additionally, young children may be particularly susceptible to demand characteristics which may diminish the accuracy of thought sampling procedures. A thorough discussion of mind wandering is beyond the scope of the present paper; however, interested readers can refer to Killingsworth and Gilbert (2010), Smallwood et al. (2007) or Schooler et al. (2011) for discussion of recent findings in the mind wandering literature.
As discussed previously, instructional format was included as a
variable in order to examine whether certain instructional formats
elicit differential amounts of off-task behavior. Six different
instructional formats were coded: (1) individual work (when stu-
dents are working on an activity individually, for instance when the
teacher directs the students to complete a worksheet or other ac-
tivity by themselves or without any help from classmates), (2)
small-group work (when students are divided into smaller groups
and each group works on an activity independently of the other
groups; this category includes small-group work, partner work, and
centers), (3) whole-group instruction at desks (when students are
sitting at their desk or table while the teacher instructs the whole
class in an activity), (4) whole-group instruction while sitting on
the carpet (when students are sitting on the carpet or floor while
the teacher instructs the whole class in an activity), (5) dancing, and
(6) testing. It is important to note that observations were scheduled
during instructional time; however, on rare occasions testing and
dancing were observed resulting in a limited quantity of data
gathered during these formats. As a result, testing and dancing
were excluded from the analyses.
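As a convenience for anyone tabulating these observations themselves, the coding categories and instructional formats described above can be represented in a small data structure; the sketch below is our own restatement of the scheme (category names follow the paper, but the structure does not come from the study).

# Our own restatement of the observation coding scheme described above.
from enum import Enum

class OffTaskType(Enum):
    SELF = "self-distraction"
    PEER = "peer distraction"
    ENVIRONMENT = "environmental distraction"
    WALKING = "walking"
    OTHER = "other"
    UNKNOWN = "unknown"  # excluded from the analyses (about 5% of observations)

INSTRUCTIONAL_FORMATS = [
    "individual work",
    "small-group work",          # includes partner work and centers
    "whole-group at desks",
    "whole-group on the carpet",
    "dancing",                   # rarely observed; excluded from the analyses
    "testing",                   # rarely observed; excluded from the analyses
]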
2.3. Results study 1
Overall, the percentage of off-task behavior averaged across all
observations (29.30%) was within the range of previously reported
estimates of off-task behavior in elementary school students (i.e.,
10%e50%). In the present study, the three most common types of
off-task behavior observed included: Peer distractions, Environment
distractions, and Self-distractions. These three sources of distraction
accounted for 85% of children’s off-task behaviors. The percentages
of children’s on- and off-task behaviors are reported in Table 1.
A three-level hierarchical logistic regression was performed
with observations nested within students, nested within class-
rooms using SAS PROC GLIMMIX. Random intercepts for students
and classroom were included in the model. Separate models were
run for on-task behavior and the three most common types of off-
task behavior (peer distractions, environmental distractions, and self-
distractions). For all models the tests of the covariance parameters
were significant, indicating that both random intercepts are needed
(all ps < 0.0001). All off-task models are conditional on being off-
task. Fixed effects included Time of Year (beginning, middle, and
end of the school year), Gender (Males, Females), Grade-level
(Kindergarten, First, Second, Third, and Fourth grades), and
Instructional Format (individual work, small-group, whole-group
instruction at desks, whole-group instruction while sitting on the
carpet). Using this approach our results reflect the effect of each
independent variable controlling for the others and the correlations
induced by the hierarchy.
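The modeling approach can also be sketched in code. The paper fit these models in SAS PROC GLIMMIX; the snippet below is only a rough Python approximation of the same structure (a binomial mixed model with random intercepts for classroom and student plus the four fixed effects), and the file name and column names are assumptions rather than the study's data.

# Rough approximation of the three-level logistic model (the paper used SAS
# PROC GLIMMIX; this is not the authors' code).
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Assumed long format: one row per observation with columns on_task (0/1),
# time_of_year, gender, grade, instr_format, student_id, classroom_id
# (student_id assumed unique across classrooms).
df = pd.read_csv("observations.csv")  # hypothetical file name

model = BinomialBayesMixedGLM.from_formula(
    "on_task ~ C(time_of_year) + C(gender) + C(grade) + C(instr_format)",
    # Random intercepts for classroom and for student.
    {"classroom": "0 + C(classroom_id)", "student": "0 + C(student_id)"},
    data=df,
)
result = model.fit_vb()  # variational Bayes estimation
print(result.summary())
# Per the paper, effects were judged against an adjusted alpha of
# 0.05 / 6 = 0.0083 to offset treating sessions as independent samples.

The same formula would be refit with each off-task category (conditional on being off-task) as the outcome to mirror the separate off-task models reported below.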
Table 1
Percentages of on- and off-task behaviors in Study 1.

On-task behavior: 70.70%    Off-task behavior: 29.30%

Sources of off-task behavior (percentages of all off-task observations):
Peer distractions 44.12%; Environmental distractions 24.74%; Self-distractions 15.91%; Walking off-task 3.07%; Other distractions 12.15%
2.3.1. Question 1: Do patterns of attention allocation change over
the course of the school year?
2.3.1.1. Time of year and on-task behavior
2.3.1.1.1. On-task behavior. Controlling for gender, grade-level,
and instructional format, a significant effect of time of year on
rates of on-task behavior was found (F(2, 2341) = 26.16,
p < 0.0001). In general, on-task behavior was found to decline at the
end of the school year with children engaging in significantly less
on-task behavior at the end of the school year than at the beginning
of the school year (t(2399) = 3.13, p = 0.0018, Odds ratio
[OR] = 1.13, 95% multiplicity-adjusted confidence interval
[CI] = [1.03, 1.23]) or the middle of the school year (t(2431) = 7.17,
p < 0.0001, OR = 1.31, 95% CI = [1.20, 1.44]). Thus, children were 1.13
times more likely to engage in on-task behavior at the beginning of
the school year compared to the end of the school year and 1.31
times more likely to be on-task in the middle of the year compared
to the end of the school year. However, higher rates of on-task
behavior were found in the middle of the school year compared
to the beginning of the school year (t(2216) = −4.32, p < 0.0001,
OR = 1.17, 95% CI = [1.07, 1.27]).
2.3.1.2. Time of year and the sources of off-task behavior. Next, we
examined temporal changes in the three most common types of
off-task behavior: peer distractions, environmental distractions, and
self-distractions. A significant effect of Time was found for all three
types of off-task behavior. However, fluctuations across time were
not uniform. The results for each type of off-task behavior are
described below.
2.3.1.2.1. Peer distractions. Controlling for gender, grade-level,
and instructional format, peer off-task behaviors, relative to all
other types of off-task behavior, were found to vary across time
(F(2, 2117) = 7.20, p = 0.0008). Children exhibited a significantly
higher rate of peer off-task behavior, relative to all other types of
off-task behavior, at the middle of the school year compared to the
beginning of the school year (t(2073) = −3.43, p = 0.0006,
OR = 1.21, 95% CI = [1.06, 1.38]) and the end of the school year
(t(2151) = 3.12, p = 0.0019, OR = 1.20, 95% CI = [1.04, 1.38]). Spe-
cifically, children were 1.21 (OR) times more likely to engage in peer
distractions in the middle of the year compared to the beginning of
the year and 1.20 (OR) times more likely to engage in peer dis-
tractions in the middle of the year compared to the end of the
school year. There was no significant difference in the frequency of
peer off-task behavior, relative to all other off-task behaviors, be-
tween the beginning and end of the school year (t(2133) = −0.15,
p = 0.88, OR = 0.99, 95% CI = [0.86, 1.14]).
2.3.1.2.2. Environmental distractions. Controlling for gender,
grade-level, and instructional format, a significant effect of time of
year was found on rates of off-task behavior directed towards the
environment relative to all other off-task behaviors (F(2,
2331) = 8.08, p = 0.0003). In general, off-task behavior directed
toward the environment increased over time, as a fraction of all off-
task behaviors, with children exhibiting a significantly higher rate
of environment based off-task behaviors (relative to all other types
of off-task behavior) at the end of the school year compared to the
beginning of the school year (t(2430) = −3.69, p = 0.0002,
OR = 1.33, 95% CI = [1.10, 1.59]). Children were 1.33 (OR) times more
likely to engage in environmental distractions at the end of the
school year compared to the beginning of the school year. Similarly
children were 1.27 (OR) times more likely to engage in off-task
behavior directed toward the environment in the middle of the
school year compared to the beginning of the school year
(t(2384) = −3.22, p = 0.0013, OR = 1.27, 95% CI = [1.06, 1.52]). There
was no significant difference in rates of environmental distractions
(relative to all other off-task behaviors) between the middle and
end of the school year (t(2195) = −0.58, p = 0.56).
2.3.1.2.3. Self-distractions. Controlling for gender, grade-level,
and instructional format, a significant effect of time of year was
found on rates of self-distraction relative to all other types of off-
task behaviors (F(2, 2337) = 32.35, p < 0.0001). The results for
the proportion of Self-distractions across time points were similar to
the observed pattern of results for Environmental distractions. The
frequency of self-distractions relative to all other types of off-task
behavior increased significantly across all three time points (all
ps < 0.0013). For instance, self-distractions were 1.86 (OR) times
more likely at the end of the school year compared to the beginning
of the school year (t(2400) ¼ �7.98, p < 0.0001, OR ¼ 1.86, 95%
CI ¼ [1.54, 2.24]).
2.3.2. Question 2: Are student characteristics related to children’s
attention allocation patterns?
In order to evaluate the putative relationship between student
characteristics and children’s attention allocation, the fixed effects
for gender and grade-level were examined. The results are reported
below for on-task behavior as well as the three most common types
of off-task behavior (peer distractions, environmental distractions,
and self-distractions).
2.3.2.1. Effect of gender
2.3.2.1.1. Gender and on-task behavior. Controlling for grade-
level, instructional format, and time of year there was a signifi-
cant effect of gender (F(1, 2264) = 58.46, p < 0.0001). Females were
more likely to engage in on-task behavior than males, and this
difference was statistically significant (t(2264) = 7.65, p < 0.0001,
OR = 1.26, 95% CI = [1.19, 1.34]). Female students engaged in 1.26
(OR) times more on-task behaviors than males. For example, at the
end of the school year, in an average classroom, in the whole-group
desk format, the average fourth-grade girl has 1.98 on-task behaviors
for every 1 off-task behavior, while the average boy has 1.57 on-task
behaviors for every 1 off-task behavior.
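The odds reported in the preceding sentence can be translated into proportions and an odds ratio with a few lines of arithmetic; this is our own worked illustration of the published values, not additional data.

# Worked illustration of the reported odds (girls 1.98:1 on-task, boys 1.57:1).
odds_girls, odds_boys = 1.98, 1.57

p_girls = odds_girls / (1 + odds_girls)  # about 0.66 of observations on-task
p_boys = odds_boys / (1 + odds_boys)     # about 0.61 of observations on-task
odds_ratio = odds_girls / odds_boys      # about 1.26, matching the reported OR

print(f"girls on-task {p_girls:.0%}, boys on-task {p_boys:.0%}, OR = {odds_ratio:.2f}")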
2.3.2.1.2. Gender and peer-distractions. Controlling for grade-
level, instructional format, and time of year there was a signifi-
cant effect of gender on the rates of peer-distraction (F(1,
2048) = 21.37, p < 0.0001). Relative to all other off-task behaviors,
females were more likely to engage in peer off-task behaviors
compared to males, and this difference was statistically significant
(t(2048) = 4.62, p < 0.0001, OR = 1.24, 95% CI = [1.13, 1.36]). Thus
when a female is off-task, her off-task behavior is 1.24 times more
likely to be an off-task behavior directed towards her peers than for
a male in the same grade-level, classroom, instructional format, and
time of year.
2.3.2.1.3. Gender and environmental distractions.
Controlling for grade-level, instructional format, and time of year
there was a significant effect of gender on rates of off-task behavior
directed toward the environment (F(1, 2369) = 10.75, p = 0.001).
Relative to all other off-task behaviors, males were 1.22 [1.08, 1.38]
(OR and 95% CI) times more likely to engage in environmental dis-
tractions compared to females, and this difference was statistically
significant (t(2369) = 3.28, p = 0.001).
2.3.2.1.4. Gender and self-distractions. Contrary to the results for
peer distractions and environmental distractions there was no
significant effect of gender on the rate of self-distractions relative to
all other off-task behaviors after controlling for grade-level,
instructional format, and time of year (F(1, 2249) = 0.20, p = 0.65).
2.3.2.2. Effect of grade-level.
The analyses for grade-level revealed no significant effect of
grade-level for on-task behavior (F(4, 17) = 1.32, p = 0.30) and no
significant effect of grade-level for each type of off-task behavior
(peer distraction p = 0.16, environmental distraction p = 0.41, self-
distraction p = 0.68) after controlling for gender, instructional
format, and time of year.
2.3.3. Question 3: Are instructional design choices related to
children’s attention allocation?
In order to evaluate the role of instructional design strategies on
children’s tendency to engage in on and off-task behaviors, the
fixed effects for instructional format were examined. The results
are reported below.
2.3.3.1. Instructional format and on-task behavior. Controlling for
gender, grade-level, and time of year there was a significant effect
of instructional format on the rates of on-task behavior (F(3,
4803) = 8.35, p < 0.0001). Recall that significance is based on the
adjusted alpha level of 0.0083 in which the commonly accepted
alpha level of 0.05 was divided by 6, the total number of observa-
tions. The p-values are also adjusted for multiple comparisons us-
ing the Tukey-Kramer method. In general, on-task behavior was
most likely to occur during activities that took place in a small-
group format (small-group vs. whole-group carpet:
t(4803) = 4.97, adj. p < 0.0001; small-group vs. whole-group desk:
t(4803) = 2.97, adj. p = 0.016 [marginally significant based on the
more conservative alpha level]; however, the contrast between
individual and small-group instruction was not statistically signif-
icant adj. p = 0.03). On-task behavior was 1.23 [1.11, 1.37] (OR and
95% CI) times more likely in the small-group format compared to
whole-group instruction on the carpet and 1.12 [1.02, 1.24] (OR and
95% CI) times more likely compared to whole-group instruction at
desks. There was no significant difference in the rates of on-task
behavior across the remaining instructional formats: individual,
whole-group at desks, and whole-group on the carpet (all adj.
ps � 0.08).
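The alpha adjustment described above is simple arithmetic. As a rough illustration (not the authors' code; the function and variable names below are hypothetical), the following Python sketch derives the adjusted thresholds used in Study 1 (0.05 divided by 6) and Study 2 (0.05 divided by 2) and checks a reported p-value against them.

```python
# Bonferroni-style correction described in the text: the conventional
# alpha of 0.05 is divided by the number of observation sessions per
# classroom (6 in Study 1, 2 in Study 2). Names here are illustrative.

def adjusted_alpha(base_alpha: float, n_sessions: int) -> float:
    """Return the more conservative alpha level."""
    return base_alpha / n_sessions

def label(p_value: float, alpha: float) -> str:
    """Classify a (Tukey-Kramer adjusted) p-value against the threshold."""
    return "significant" if p_value < alpha else "not significant"

study1_alpha = adjusted_alpha(0.05, 6)   # ~0.0083
study2_alpha = adjusted_alpha(0.05, 2)   # 0.025

# Example from the text: small-group vs. whole-group desk, adj. p = 0.016.
print(round(study1_alpha, 4), label(0.016, study1_alpha))
print(round(study2_alpha, 4), label(0.016, study2_alpha))
```

Under the Study 1 threshold of roughly 0.0083, the small-group vs. whole-group desk contrast (adj. p = 0.016) falls short of significance and is treated as marginal, exactly as noted in the text.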
2.3.3.2. Instructional format and peer distractions. Controlling for gender, grade-level, and time of year there was a significant effect of instructional format on rates of peer off-task behavior (F(3, 3914) = 70.87, p < 0.0001). Adjusted for multiple comparisons (using the Tukey-Kramer method) relative to all other off-task behaviors, peer distractions were most likely to occur during activities that took place in an individual instructional format or small-group format compared to whole-group instruction at desks or on the carpet (all adj. ps < 0.0001). There was no significant difference in the frequency of peer distractions between individual and small-group formats (t(3914) = 1.60, adj. p = 0.38); nor was there a significant difference in the frequency of peer distractions (relative to other off-task behaviors) between the two whole-group instructional formats: whole-group instruction on the carpet and whole-group instruction at desks (t(3914) = 1.84, adj. p = 0.26). Peer off-task behaviors were approximately twice as likely during individual or small-group instruction compared to whole-group instruction with odds ratios ranging from 1.70 (small-group vs. whole-group instruction on the carpet) to 2.16 (individual vs. whole-group instruction at desks).
2.3.3.3. Instructional format and environmental distractions.
Controlling for gender, grade-level, and time of year there was a
significant effect of instructional format on rates of environmental distractions (F(3, 3914) = 28.58, p < 0.0001). Adjusted for multiple comparisons (using the Tukey-Kramer method) relative to all other off-task behaviors environmental distractions were least likely to occur during activities that took place in an individual instructional format or small-group format; all adj. ps < 0.0001. There was no significant difference in the rates of environment based off-task behavior, relative to all other off-task behaviors, across these two instructional formats (t(3914) = -1.50, adj. p = 0.44, small-group:individual OR = 1.19, 95% CI = [0.88, 1.58]). Relative to all other off-task behaviors, environmental distractions were most likely to occur during whole-group instruction at desks and whole-group instruction on the carpet. Specifically, children were 2.07 [1.60, 2.68] (OR and 95% CI) times more likely to exhibit environmental distractions during whole-group instruction on the carpet than during individual instruction and environmental distractions were 1.75 [1.36, 2.26] (OR and 95% CI) times more likely during whole-group instruction on the carpet than during small-group work. Similarly, students were 1.94 [1.51, 2.51] (OR and 95% CI) times more likely to engage in environmental distractions during whole-group instruction at desks compared to individual instruction and 1.64 [1.29, 2.09] (OR and 95% CI) times more likely to engage in environmental distractions during whole-group instruction at desks compared to small-group work. There was no significant difference in the rates of environment based off-task behavior across the two whole-group formats (t(3914) = 0.74, adj. p = 0.88, desk:carpet OR = 0.94, CI = [0.75, 1.18]).
2.3.3.4. Instructional format and self-distractions. Controlling for gender, grade-level, and time of year there was a significant effect of instructional format on rates of self-distraction (F(3, 3914) = 139.14, p < 0.0001). Adjusted for multiple comparisons (using the Tukey-Kramer method) relative to all other types of off-task behavior, self-distractions were highest during whole-group instruction on the carpet compared to all other instructional formats (all adj. ps < 0.0001). For example, self-distractions were 5.24 [3.95, 6.90] (OR and 95% CI) times more likely during whole-group instruction on the carpet than during small-group work. The second highest rates of self-distraction, relative to all other types of off-task behavior, occurred during whole-group instruction at students' desks compared to small group and individual instructional formats (both adj. ps < 0.0001). There was no significant difference in the frequency of self-distractions, relative to all other off-task behaviors, between small-group and individual instructional formats (t(3914) = -0.90, adj. p = 0.81, small-group:individual OR = 1.13, 95% CI = [0.80, 1.58]).
2.4. Discussion study 1
Several novel findings have emerged from Study 1. First, the
results from Study 1 indicate that children’s patterns of attention
allocation are not stable across the school year. Variations in the
proportion of both on-task and off-task behavior were observed
as a function of the time of year (i.e., the beginning, middle, or end
of the school year). The findings suggest that on-task behavior
declines by the end of the school year. The frequency with which
children engaged in different sources of off-task behavior was
found to fluctuate over time. Notably, peer off-task behavior was
found to increase during the middle of the school year while self-
distractions and environmental distractions increased at the end of
the school year. The latter finding is notable because it contradicts
the possibility that over the course of the school year children
habituate to their environment and engage in less off-task
behavior related to the environment; Study 1 suggests the
opposite to be the case. Recall that in the present work we
observed children twice at each time point in order to minimize
aberrations and obtain more stable estimates of children’s
behavior. However, future research should aim to replicate these
findings using a greater density of observations at each time
point.
Second, children’s patterns of attention allocation also varied as
a function of certain student characteristics (i.e., gender). For
example, in line with prior research, females tended to engage in
more on-task behavior than males. Interestingly, there were also
differences in the types of off-task behaviors that children tended
to engage in as a function of gender with males being more likely to
engage in off-task behavior directed toward the environment
(relative to other off-task behaviors) and females engaging in more
peer off-task behaviors (relative to other off-task behaviors). In
contrast to the results for gender, there was no significant effect of
grade-level on children’s on or off-task behaviors.
Third, the results from study 1 also suggest that instructional
design choices are related to children’s attention allocation, as rates
of on and off-task behavior varied as a function of the instructional
format (e.g., small-group, individual, etc.). On-task behavior was
most likely to occur during small-group instruction. Different
instructional formats also appear to elicit different types of off-task
behavior. For instance, peer distractions were most common in in-
dividual or small-group instructional formats, whereas environ-
mental distractions were least likely to occur during these two
instructional formats but were more likely to occur during whole-
group instruction.
Taken together the results from Study 1 indicate that children’s
patterns of attention allocation are influenced by external factors
such as time of year and instructional design strategies; however,
there is also some evidence to suggest that certain student char-
acteristics such as gender influence the rates of children's on-task behavior and, to some extent, the form that off-task behavior takes.
2.5. Study 2: Assessing the generalizability of the relationship
between student characteristics, instructional design strategies, and
attention allocation
Study 1 contained a relatively homogeneous set of schools, as all
5 of the participating schools were part of the same public charter
school organization. Thus, it is an empirical question as to whether
the results obtained in Study 1 would replicate with a more diverse
sample. The goal of Study 2 was therefore to examine whether the
findings obtained in Study 1 could be generalized to other schools
and student populations. Consequently, in Study 2 a more hetero-
geneous sample of schools was recruited. The schools varied in
terms of the socioeconomic status (SES) of the student population
as well as school type (i.e., public charter schools, private schools,
and parochial schools). Specifically, Study 2 investigated how pat-
terns of attention allocation change as a function of student char-
acteristics (gender, grade-level, SES) [Research Question 2],
instructional design strategies (i.e., instructional format and
average duration of an instructional activity) [Research Question 3],
and school type (i.e., public charter schools, private schools, and
parochial schools) [Research Question 4].
2.6. Method
2.6.1. Participants
Thirty classrooms participated in Study 2. The classrooms were
recruited from 9 schools which included: 4 charter schools, 3 pri-
vate schools, and 2 parochial schools. Data were also collected
based on the percentage of students belonging to low-income
families. The distribution of schools in each SES quartile was as
follows: 5 schools had 0–25% low-income students; 0 schools had 25–50% low-income students; 3 schools had 50–75% low-income students; 1 school had 75% and above low-income students.
Students were recruited from five grade-levels: kindergarten
through fourth-grade. The distribution across the five grade-levels
was as follows: 7 kindergarten classrooms, 7 first-grade classrooms,
7 second-grade classrooms, 3 third-grade classrooms, and 6 fourth-
grade classrooms. Information regarding the class size and gender
distribution was provided by 20 out of 30 participating teachers.
The average class size was 18.71 students (9.25 males, 9.95 fe-
males). Due to absences, the average number of children observed
in a single observation session was 18.58 children. The number of
children observed per session ranged from 14 to 23.
2.6.2. Design and procedure
Each classroom was observed two times during the school year,
resulting in a total of 60 observation sessions. The observation
sessions occurred between October 2012 and December 2012. The
average delay between observation sessions within a single time
period was 3.8 calendar days (the delay ranged from 1 to 10 days).
Each observation session lasted approximately one hour. The
average number of observations per session was 263.93 and the
average number of observations per child within a session was
16.17. A total of 1113 student-session pairs were observed. Recall
that a student-session pair refers to a specific student observed by a
coder within a specific session. As mentioned in Study 1, treating
the children within each session as a different set of students
artificially inflates statistical power. As a result, a more conservative
alpha level was employed; specifically the alpha level was adjusted
to 0.025 (the standard alpha level of 0.05 was divided by 2, the
number of observations, in order to more closely approximate the
true sample size).
Coding of on-task behavior was identical to that in Study 1.
Coding of off-task behavior was modified in one important way:
Based on reports from coders, we split the environmental distrac-
tions category into two separate categories to distinguish between
distinct types of off-task interactions with the classroom environ-
ment. Specifically, in Study 2 environmental distractions were
defined more narrowly as looking at or interacting with elements of
the classroom visual design (e.g., charts, posters, etc.); inappro-
priate use of objects that were a part of the assigned task (e.g.,
playing with a pen instead of using it for its intended purpose) were
coded as off-task behavior related to school supplies (whereas in
Study 1 such instances were coded as environmental distractions).
The decision to code supplies separately from environmental dis-
tractions was based on the idea that interventions that target the
classroom visual environment may be more feasible than in-
terventions that address the inappropriate use of school supplies.
Presumably the classroom environment can be streamlined (i.e.,
extraneous posters and charts can be removed) while school sup-
plies are by definition tools that students need to complete their
assignment and are therefore less amenable to intervention.
Furthermore, prior laboratory research has indicated that the
classroom visual environment (charts, posters, educational
displays, etc.) can be a source of distraction for young children
(Fisher et al., 2014). Thus, it was of interest to determine whether
this finding would generalize from the laboratory to real-world
classrooms. Therefore, a total of six categories of off-task behavior
were coded: (1) Self-distraction, (2) Peer distraction, (3) Environ-
mental distraction, (4) Supplies, (5), Walking, and (6) Other. Similar to
Study 1, the category Unknown was utilized when coders could not
establish whether a child was on- or off-task. Since the category
Unknown was not informative in terms of children’s patterns of
attention allocation it was excluded from the analyses. The category
Unknown accounted for 3% of the total observations.
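To make the revised coding scheme concrete, the short sketch below is our illustration only (the constant and function names are hypothetical, not part of the study materials); it lists the six Study 2 off-task categories and shows how a Study 1 "environmental" code maps onto the narrower Study 2 distinction between the visual environment and supplies.

```python
# Off-task coding scheme used in Study 2 (illustrative constants only).
# Relative to Study 1, the former "environmental" category is split into
# "environment" (classroom visual design: charts, posters, etc.) and
# "supplies" (inappropriate use of task materials, e.g., playing with a pen).
OFF_TASK_CATEGORIES = (
    "self",          # self-distraction
    "peer",          # peer distraction
    "environment",   # classroom visual design
    "supplies",      # misuse of task-related objects
    "walking",       # walking off-task
    "other",
)

def recode_study1_to_study2(category: str, involves_task_object: bool) -> str:
    """Hypothetical helper: map a Study 1 'environmental' code onto the
    narrower Study 2 scheme, which separates supplies from the visual
    environment; all other categories pass through unchanged."""
    if category == "environmental" and involves_task_object:
        return "supplies"
    if category == "environmental":
        return "environment"
    return category

print(recode_study1_to_study2("environmental", involves_task_object=True))  # supplies
```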
In addition to gender and grade-level, SES was also included as a
student characteristic. For each school, SES was based on the per-
centage of students from low-income families, obtained from the
Pennsylvania Department of Education for the 2012–2013 aca-
demic year. As such, lower values represent schools in a more
affluent community.
Predictor variables pertaining to instructional design strategies
included the instructional format and the average duration of an
instructional activity. The same four instructional formats were
included: (1) individual work, (2) small-group or partner work, (3)
whole-group instruction at desks, (4) whole-group instruction
while sitting on the carpet. The average duration of an instruc-
tional activity (sec) was operationalized as the total duration of an
observation session divided by the number of activities (defined as
the number of transitions between instructional activities plus
one). Duration of an instructional activity was included in order to
investigate whether children were better able to maintain a state
of focused attention when instruction consisted of small blocks of
activities versus instructional activities that occurred over a longer
duration. Transitions were noted every time the teacher paused
instruction to change from one activity to another (e.g., tran-
sitioning from working on a math problem to listening to a short
story). In many cases, transitions coincided with a change in
instructional format (e.g., switching from whole-group instruction
to small-group instruction); however this was not always the case
as transitions could occur without a change in instructional format
(e.g., with children rotating from one small-group activity to
another). Transitions were frequently marked by the teacher
asking the children to get out new instructional materials (e.g.,
“Please get out your math binders”) or requesting that students
change locations (e.g., “Please put your notebooks away and come to
the carpet”).
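The operationalization of activity duration described above is a single division; the sketch below is a minimal illustration (the function name and example values are ours, not the authors').

```python
def average_activity_duration(session_duration_sec: float, n_transitions: int) -> float:
    """Average duration of an instructional activity (sec): total session
    duration divided by the number of activities, where the number of
    activities equals the number of observed transitions plus one."""
    return session_duration_sec / (n_transitions + 1)

# Example: a one-hour observation session in which the teacher paused
# instruction four times to change activities contains five activities,
# for an average activity duration of 720 s (12 min).
print(average_activity_duration(3600, 4))  # 720.0
```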
2.7. Results study 2
Similar to Study 1, and to previous reports in the literature
(Karweit & Slavin, 1981; Lee et al., 1999), 26.42% of children’s
observed behaviors were coded as off-task. In line with the patterns
of attention allocation observed in Study 1, the most common types
of off-task behavior were: peer distractions (49.07%), self-distrac-
tions (11.56%), environmental distractions (12.10%) and supplies
(16.86%); see Table 2 for the percentages of children’s on and off-
task behaviors.
Table 2
Percentages of on- and off-task behaviors in Study 2.

On-task behavior: 73.58%    Off-task behavior: 26.42%

Sources of off-task behavior (as a percentage of all off-task observations):
Peer distractions 49.07%; Environmental distractions 12.10%; Supplies distractions 16.86%; Self-distractions 11.56%; Walking off-task 3.17%; Other distractions 7.24%
In order to investigate changes in the students’ patterns of
attention allocation based on student characteristics, instructional
design strategies (i.e., format and average duration of an instruc-
tional activity), and school type, a three-level hierarchical logistic
regression was performed with observations nested within stu-
dents, nested within classrooms using SAS PROC GLIMMIX.
Random intercepts for students and classroom were included in the
model. Separate models were run for on-task behavior and the four
most common types of off-task behavior (peer distractions, envi-
ronmental distractions, self-distractions, and supplies). The tests of
the covariance parameters were significant in all but one model
(self-distractions) indicating that both random intercepts are
needed (all other ps < 0.0001). All off-task models are conditional
on being off-task. Fixed effects included: Gender, Grade-level, SES,
Instructional Format, Instructional Duration, and School Type
(private, charter, parochial).
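For readers who want to see the model structure in code, a minimal sketch follows. The paper fit three-level hierarchical logistic regressions in SAS PROC GLIMMIX with random intercepts for students and classrooms; the Python/statsmodels version below is only a simplified, fixed-effects approximation of the same linear predictor (it omits the random intercepts, so its standard errors would be too small), and the data file and column names are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per momentary observation of a
# child, with a binary outcome (1 = on-task, 0 = off-task) and the
# predictors named in the text. Column names are illustrative only.
df = pd.read_csv("observations.csv")

# Simplified single-level logistic regression with the same fixed effects
# as the Study 2 on-task model (gender, grade-level, SES, instructional
# format, activity duration, school type). The published analysis adds
# random intercepts for student and classroom, which are omitted here.
model = smf.logit(
    "on_task ~ C(gender) + C(grade) + ses + C(instr_format)"
    " + duration_min + C(school_type)",
    data=df,
)
result = model.fit()
print(result.summary())

# Exponentiated coefficients correspond to the odds ratios reported in the text.
print(np.exp(result.params).round(2))
```

Fitting the full three-level model would require a mixed-effects logistic routine, as in PROC GLIMMIX; the sketch is only meant to show how the fixed effects enter the model.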
2.7.1. Question 2: Are student characteristics related to children’s
attention allocation patterns?
2.7.1.1. Effect of gender
2.7.1.1.1. Gender and on-task behavior. Consistent with the
findings of Study 1, there was a significant effect of gender after
controlling for SES, grade-level, instructional format, duration, and
school type (F(1, 1062) = 23.96, p < 0.0001). Females had significantly higher rates of on-task behavior than males (t(1062) = 4.89, p < 0.0001). The corresponding odds ratio is 1.26 with 95% CI [1.15, 1.37], indicating a 26% higher ratio of on-task behavior for females compared to males. As an example, for an average private school classroom of third graders at the beginning (i.e., duration = 0) of whole-group instruction at desks, an average female of average SES has 2.07 on-task behaviors for every one off-task behavior, while an average male has only 1.65 on-task behaviors for every one off-task behavior.
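The odds figures in this example can be checked with a couple of lines of arithmetic; the sketch below (ours, purely illustrative) converts the reported on-task odds into proportions and recovers the female-to-male odds ratio, up to rounding.

```python
# On-task odds reported in the text for the baseline example
# (average third-grade classroom, private school, start of
# whole-group instruction at desks, average SES).
odds_female = 2.07   # on-task behaviors per one off-task behavior
odds_male = 1.65

# Odds convert to proportions as odds / (1 + odds).
prop_female = odds_female / (1 + odds_female)   # ~0.674, i.e. ~67% on-task
prop_male = odds_male / (1 + odds_male)         # ~0.623, i.e. ~62% on-task

# The ratio of the two odds reproduces, up to rounding, the reported
# odds ratio of 1.26 for females vs. males.
odds_ratio = odds_female / odds_male            # ~1.25
print(round(prop_female, 3), round(prop_male, 3), round(odds_ratio, 2))
```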
2.7.1.1.2. Gender and peer distractions. As in Study 1, there was a significant effect of gender on rates of peer off-task behavior after controlling for SES, grade-level, instructional format, duration, and school type (F(1, 865) = 7.41, p = 0.007). Peer distractions were more frequent among females than among males (t(865) = 2.72, p = 0.007). Females were 1.24 [1.06, 1.45] (OR and 95% CI) times more likely to engage in peer off-task behavior (vs. other off-task behaviors) compared to males.
2.7.1.1.3. Gender and environmental distractions. In contrast to Study 1, there was no significant effect of gender on rates of environmental distractions, as a fraction of all other off-task behaviors, after controlling for SES, grade-level, instructional format, duration, and school type (F(1, 962) = 1.60, p = 0.21).
2.7.1.1.4. Gender and self-distractions. Based on the more conservative alpha value of 0.025, the effect of gender on self-distraction rates (as a fraction of all off-task behavior) was marginally significant, controlling for SES, grade-level, instructional format, duration, and school type (F(1, 970) = 4.14, p = 0.04). Self-distractions were more frequent among males than among females (t(970) = -2.03, p = 0.04). Males were 1.27 [1.01, 1.59] (OR and 95% CI) times more likely to engage in self-distractions compared to females. Note this pattern of results is not consistent with Study 1, in which no significant gender difference for self-distractions was found.
2.7.1.1.5. Gender and supplies. A significant effect of gender on children's off-task behavior directed toward supplies, controlling for SES, grade level, instructional format, duration, and school type, was found (F(1, 990) = 16.71, p < 0.0001). Off-task behavior directed towards supplies was more frequent for males than for females (t(990) = -4.09, p < 0.0001). Specifically, males were 1.54 [1.25, 1.90] (OR and 95% CI) times more likely to engage in off-task behavior oriented toward supplies compared to females.
2.7.1.2. Effect of grade-level
2.7.1.2.1. Grade-level and on-task behavior. In contrast to Study
1, a significant effect of grade-level on children’s on-task behavior
was found after controlling for gender, SES, instructional format,
duration, and school type (F(4, 25) = 3.84, p = 0.01). Adjusting for multiple comparisons (using the Tukey-Kramer method), third graders engaged in significantly less on-task behavior compared to students in fourth grade (t(23) = 3.38, adj. p = 0.02, OR = 0.41, 95% CI = [0.19, 0.83]). Third graders also tended to exhibit lower rates of on-task behavior compared to first graders; however, this difference was only marginally significant based on the more conservative alpha level of 0.025 selected previously (t(25) = 3.04, adj. p = 0.04, OR = 0.45, 95% CI = [0.20, 0.97]). All other grade-level comparisons were not statistically significant (all adj. ps ≥ 0.23).
2.7.1.2.2. Grade-level and peer distractions. After controlling for gender, SES, instructional format, duration, and school type, the effect of grade-level on peer distractions, relative to all other off-task behaviors, was not statistically significant (F(4, 21) = 1.78, p = 0.17).
2.7.1.2.3. Grade-level and environmental distractions. After controlling for gender, SES, instructional format, duration, and school type, the effect of grade-level on environmental distractions, relative to all other off-task behaviors, was not statistically significant (F(4, 21) = 1.25, p = 0.32).
2.7.1.2.4. Grade-level and self-distractions. Controlling for gender, SES, instructional format, duration, and school type, the effect of grade-level on self-distractions, relative to all other off-task behaviors, was statistically significant (F(4, 1083) = 5.71, p = 0.0002). Adjusting for multiple comparisons (using the Tukey-Kramer method), fourth graders tended to engage in more self-distractions (as a fraction of all off-task behavior) than both kindergarteners (t(1334) = 3.71, adj. p = 0.002; OR = 2.22, 95% CI = [1.24, 4.00]) and first graders (t(1220) = -4.44, adj. p < 0.0001; OR = 2.36, 95% CI = [1.39, 4.00]). All other grade-level comparisons were not statistically significant (all adj. ps ≥ 0.15).
2.7.1.2.5. Grade-level and supplies. After controlling for gender, SES, instructional format, duration, and school type, the effect of grade-level on children's off-task behavior directed toward supplies, relative to all other off-task behaviors, was not statistically significant (F(4, 21) = 0.37, p = 0.83).
2.7.1.3. Effect of SES.
No significant effect of SES was found on children's patterns of attention allocation after controlling for student characteristics, instructional design strategies, and school type (on-task p = 0.28, peer distractions p = 0.29, environmental distractions p = 0.70, self-distractions p = 0.83, supplies p = 0.90).
2.7.2. Question 3: Are instructional design strategies related to
children’s attention allocation?
2.7.2.1. Effect of instructional format.
A graphical overview of the distribution of on-task and off-task
behavior across instructional formats is provided in Fig. 1.
2.7.2.1.1. Instructional format and on-task behavior. After con-
trolling for gender, grade-level, SES, duration, and school type, a
significant effect of instructional format on children’s rates of on-
task behavior was found (F(3, 2169) = 27.61, p < 0.0001). Compared to all other instructional formats, the highest rates of on-task behavior occurred during individual and small-group instruction; all adj. ps ≤ 0.0001. The largest OR was 1.62 [1.30, 2.02] for small-group compared to whole-group instruction on the carpet, and the smallest OR was 1.51 [1.28, 1.77], for individual compared to whole-group instruction at desks. There was no significant difference in rates of on-task behavior occurring between individual and small-group formats (t(2169) = -0.83, adj. p = 0.84, individual:small-group OR = 1.07, 95% CI = [0.86, 1.33]). On-task behavior was least likely to occur during whole-group instruction and there was no significant difference between rates of on-task behavior during whole-group instruction on the carpet and whole-group instruction at desks (t(2169) = -0.04, adj. p = 1.00, desk:carpet OR = 1.00, 95% CI = [0.83, 1.22]).
Fig. 1. Depicts the full distribution of on and off-task behavior as it differs across instructional formats for the baseline group in Study 2 (i.e., male, third-grade, average SES, average classroom, private school). Please refer to the text for a discussion regarding the statistical significance of the differences displayed here.
2.7.2.1.2. Instructional format and peer distractions. After controlling for gender, grade-level, SES, duration, and school type, there was a significant effect of instructional format on children's rates of peer distractions, relative to all other types of off-task behavior (F(3, 1015) = 47.09, p < 0.0001). As in Study 1, peer distractions were most frequent during individual instruction and small-group formats, followed by whole-group instruction on the carpet and whole-group instruction at desks. All formats were significantly different from each other (all adj. ps ≤ 0.0001; whole-group instruction on the carpet vs. whole-group at desks was marginally significant, t(1087) = 2.69, adj. p = 0.04) with the exception of individual and small-group formats, in which there was no significant difference in the rates of peer distractions, relative to all other off-task behaviors, across these two instructional formats (t(645) = 0.75, adj. p = 0.88). The largest difference was individual vs. whole-group at desks with an OR of 3.11 with 95% CI [2.32, 4.18].
2.7.2.1.3. Instructional format and environmental distractions. After controlling for gender, grade-level, SES, duration, and school type, there was a significant effect of instructional format on children's rates of environmental distractions, relative to all other types of off-task behavior (F(3, 1582) = 18.69, p < 0.0001). Adjusting for multiple comparisons (using the Tukey-Kramer method) environmental distractions (relative to all other off-task behaviors) were most frequent during whole-group instruction on the carpet and whole-group instruction at desks (all adj. ps < 0.0001), and there was no significant difference between these two group instructional formats (adj. p = 0.99). Environmental distractions were almost 3 times more likely to occur in whole-group formats compared to any other instructional format (ORs ranged from 2.72 to 2.82). Furthermore, there was no significant difference between rates of environmental distractions during individual instruction or small-group formats (adj. p = 1.0).
2.7.2.1.4. Instructional format and self-distractions. After controlling for gender, grade-level, SES, duration, and school type, there was a significant effect of instructional format on children's rates of self-distractions, relative to all other types of off-task behavior (F(3, 1582) = 45.34, p < 0.0001). Adjusting for multiple comparisons (using the Tukey-Kramer method), self-distractions (relative to all other off-task behaviors) were most frequent during whole-group instruction on the carpet (all adj. ps ≤ 0.007), followed by whole-group instruction at desks (both adj. ps ≤ 0.0001), and least likely to occur during individual and small-group instruction, which were not significantly different from each other (t(1582) = 1.49, adj. p = 0.44). The OR for whole-group instruction on the carpet to whole-group instruction at desks is 1.68 [1.11, 2.54], and the ORs for whole-group instruction at desks to individual and small-group are 3.13 [1.95, 5.05] and 4.48 [2.73, 7.35] respectively.
2.7.2.1.5. Instructional format and supplies. After controlling for gender, grade-level, SES, duration, and school type, there was a significant effect of instructional format on children's rates of off-task behavior oriented towards supplies, relative to all other types of off-task behavior (F(3, 1582) = 10.32, p < 0.0001). Adjusting for multiple comparisons (using the Tukey-Kramer method) supplies distractions (relative to all other off-task behaviors) were most frequent during whole-group instruction at desks and during small-group instruction (all adj. ps ≤ 0.002 with the exception of the contrast between the small-group format and individual instruction, which was marginally significant based on the more conservative alpha level of 0.025; t(1582) = -2.70, adj. p = 0.04). There was no significant difference in the rates of off-task behavior directed toward supplies (as a fraction of all off-task behaviors) between the small-group format and whole-group instruction at desks (t(1582) = -0.77, adj. p = 0.87). Additionally there was no significant difference between whole-group instruction on the carpet and individual instructional formats (t(1582) = 1.30, adj. p = 0.57). Off-task behavior directed towards supplies was more than twice as likely in small-group (OR = 2.22, 95% CI = [1.25, 3.97]) and in whole-group instruction at the desks (OR = 2.51, 95% CI = [1.50, 4.20]) than in whole-group instruction on the carpet. Similarly, compared to the individual instructional format, off-task behavior directed towards supplies was 1.69 [1.02, 2.80] (OR and 95% CI) times more likely in small-group formats and 1.91 [1.30, 2.81] (OR and 95% CI) times more likely in whole-group instruction at the desks.
2.7.2.2. Effect of instructional duration.
Recall that the average duration of an instructional activity (sec)
was operationalized as the total duration of an observation session
divided by the number of activities (defined as the number of
transitions between instructional activities plus one). In many
cases, transitions coincided with a change in instructional format;
however, transitions could occur without a change in instructional
format (e.g., rotating from one small-group activity to another). The
duration of activities observed ranged from 6.7 to 39.7 min with a
median duration of 12.8 min, 25% of the activity durations were less
than 10.6 min and 25% of the activity durations were longer than
17.1 min.
2.7.2.2.1. Duration and on-task behavior. After controlling for
gender, grade-level, SES, instructional format, and school type, a
significant effect of instructional duration (sec) on children’s rates
of on-task behavior was found (F(1, 954) = 7.04, p = 0.0081). The slope estimate is -0.0174 (95% CI [-0.0301, -0.0047]) log odds units per additional minute of activity length. For example, comparing an activity of 10 min in length to one of 30 min in length, the estimated OR for on-task behavior is 1.42 times higher for the shorter instructional activity (i.e., 10 min) compared to the longer instructional activity (i.e., 30 min). This finding suggests that elementary school children are better able to maintain a state of focused attention when instruction consists of relatively short (e.g., 10 min) blocks of instructional activities compared to instructional activities that occur over a longer duration (e.g., 30 min or longer). This finding is consistent with laboratory studies which point to a gradual increase in the duration of focused attention during early childhood (Ruff & Lawson, 1990; Sarid & Breznitz, 1997). See Fig. 2 for a graphical depiction of the effect of instructional duration on on-task behavior across the different instructional formats.
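To see where the 1.42 figure comes from, the brief calculation below (ours, for illustration) converts the reported per-minute slope into an odds ratio for a 20-minute difference in activity length.

```python
import math

# Reported slope: -0.0174 log-odds units per additional minute of activity.
slope_per_min = -0.0174

# Comparing a 10-min activity to a 30-min activity (a 20-min difference),
# the log-odds advantage of the shorter activity is -slope * 20.
delta_min = 30 - 10
log_odds_diff = -slope_per_min * delta_min      # 0.348

# Exponentiating gives the odds ratio reported in the text (~1.42).
odds_ratio = math.exp(log_odds_diff)
print(round(odds_ratio, 2))  # 1.42

# The same conversion for the CI endpoints [-0.0301, -0.0047] gives an
# OR range of roughly 1.10 to 1.83 for the 10- vs. 30-min comparison.
print(round(math.exp(0.0301 * 20), 2), round(math.exp(0.0047 * 20), 2))
```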
2.7.2.2.2. Duration and peer distractions. After controlling for gender, grade-level, SES, instructional format, and school type, the effect of instructional duration (sec) on children's rates of peer off-task behavior, as a fraction of all other off-task behaviors, was not statistically significant based on the more conservative alpha level of 0.025 (F(1, 392) = 2.91, p = 0.09).
2.7.2.2.3. Duration and environmental distractions. After controlling for gender, grade-level, SES, instructional format, and school type, the effect of instructional duration (sec) on rates of environmental distractions, as a fraction of all other off-task behaviors, was not significant (F(1, 731) = 2.31, p = 0.13).
2.7.2.2.4. Duration and self-distractions. After controlling for gender, grade-level, SES, instructional format, and school type, the effect of instructional duration (sec) on rates of self-distractions, as a fraction of all other off-task behaviors, was marginally significant based on the more conservative alpha level of 0.025 (F(1, 1582) = 4.89, p = 0.027). However the estimate was negative (-0.033 log odds units per additional minute of activity length) suggesting that as the duration of a lesson increased the rate of self-distractions decreased.
2.7.2.2.5. Duration and supplies. After controlling for gender, grade-level, SES, instructional format, and school type, the effect of instructional duration (sec) on rates of off-task behavior directed towards supplies, as a fraction of all other off-task behaviors, was not significant (F(1, 770) = 0.07, p = 0.79).
Fig. 2. Displays the fraction of on-task behaviors for each instructional format at two
durations (10 min and 30 min) for the baseline group in Study 2 (i.e., an average third-
grade male with average SES in a private school in an average classroom). Error bars
represent the 95% Confidence Intervals. Note that females in the same grade, class-
room, SES, and school type show an analogous pattern.
2.7.3. Question 4: Is school type related to children’s attention
allocation?
2.7.3.1. Effect of school type.
Recall that three different types of schools were recruited for the
present study: Parochial schools, Private schools, and Charter
Schools (public). The analyses below examine potential differences
in the rates of children’s on-task behavior as a function of School
Type as well as differences in the frequency of four types of off-task
behavior: Peer distractions, Environment distractions, self-distrac-
tions, and Supplies.
2.7.3.1.1. School type and on-task behavior. After controlling for
gender, grade-level, SES, instructional format, and duration, a
marginally significant effect of school type on children’s rates of on-
task behavior was found; F(2, 24) = 3.81, p = 0.04 (based on the more conservative alpha level of 0.025). Parochial schools had higher proportions of on-task behavior than private schools (t(24) = 2.73, p = 0.01). Children attending parochial schools were 2.10 [1.07, 4.14] (OR and 95% CI) times more likely to engage in on-task behavior compared to children attending private schools. For example, for an average third-grade boy of average SES in whole-group instruction at desks with the median activity duration in an average classroom at a parochial school the expected on-task rate is 2.77 on-task behaviors for every 1 off-task behavior, while a corresponding boy from a private school would have 1.32 on-task behaviors per one off-task behavior. Rates of on-task behavior were not significantly different between charter schools and private schools nor was there a significant difference between charter schools and parochial schools (both ps ≥ 0.36).
2.7.3.2. School type and sources of off-task behavior. There was no significant effect of school type on any of the sources of off-task behavior (conditional upon being off-task) after controlling for gender, grade-level, SES, instructional format, and duration (peer distractions p = 0.38, environmental distractions p = 0.87, self-distractions p = 0.67, supplies p = 0.67).
2.8. Discussion study 2
Even in a more heterogeneous sample, gender remained an
important student characteristic that was related to children’s
patterns of attention allocation: Female students engaged in more
on-task behavior than male students. Additionally, there were
gender differences regarding the types of off-task behavior that
each gender tended to engage in. For instance, peer distractions
were more common in female students whereas male students
were more likely to engage in self-distractions and off-task behavior
directed towards supplies.
A significant effect of grade-level on children’s on-task behavior
was found in Study 2. Third graders engaged in less on-task
behavior than first or fourth graders. Additionally, effects of
grade-level were obtained for specific types of off-task behavior
(fourth graders engaged in more self-distractions compared to first
graders and kindergarteners) while no effect of grade was found for
peer distractions, environment, or supplies. These findings differ from
those obtained in Study 1 in which grade-level was not a significant
predictor of children's attention allocation.
In Study 2 a new student characteristic, SES, was added to the
model. However, at least in the present sample, no significant effect
of SES was found on children’s patterns of attention allocation after
controlling for student characteristics, instructional design strate-
gies, and school type. One possibility is that SES is confounded with
school type. As such the unique contribution of SES may be minimal
once school type is taken into account. Consequently, we conducted
a follow-up analysis in which school type was dropped from the
model of children’s on-task behavior. The effect of SES on children’s
on-task behavior controlling for gender, grade-level, instructional
format, and duration remained non-significant (F(1, 28) = 1.73, p = 0.20).
Similar to Study 1, instructional format was found to be a
consistent predictor of children’s attention allocation in Study 2.
Instructional format was a significant predictor of on-task behavior
with the highest rates of on-task behavior occurring during indi-
vidual and small-group instruction. Instructional format also
influenced how children went off-task. For instance, whole-group
instruction elicited higher rates of off-task behavior directed to-
ward the environment as well as more self-distractions, while peer
distractions were most frequent during individual and small-group
formats.
The effect of the duration of an instructional activity on chil-
dren’s attention allocation revealed that on-task behavior be-
comes less frequent as the length of the instructional activity
increases. While duration of an activity was a significant predictor
of whether or not children would go off-task, in general the
duration of an instructional activity did not influence the type of
off-task behavior children would engage in, with the exception
of self-distractions which were more frequent during shorter
instructional activities.
In Study 2, three different types of schools participated: paro-
chial schools, private schools, and public charter schools. A signif-
icant effect of school type was found for children’s rates of on-task
behavior with higher rates of on-task behavior evident in parochial
compared to private schools. However, in the present sample
school type was not an important determinant of the type of off-
task behavior that children would engage in as there was no sig-
nificant effect of school type on any of the sources of off-task
behavior (conditional upon being off-task).
3. General discussion
The present study provides a systematic examination of specific
factors that may influence elementary school students’ on- and off-
task behavior. The following factors were explored: Time of year,
student characteristics (gender, grade-level, SES), instructional
design strategies (instructional format and duration of instructional
activity), and school type (private, parochial, public charter
schools). The following findings emerged from the present work
(see Table 3 for a summary of key findings).
First, children’s pattern of attention allocation is not uniform
across Time of year [Research Question 1; Study 1]. Specifically,
children’s on-task behavior was found to decline by the end of
the school year. Additionally, the three most common types of
off-task behavior (Peer, Environment, and Self Distractions) were
all found to oscillate over the course of the school year with peer
off-task behavior increasing in the middle of the year and envi-
ronmental and self-distractions both increasing by the end of the
school year.
Second, certain student characteristics influenced children’s
on-task and off-task behaviors [Research Question 2].
Gender
was a significant predictor of children’s patterns of attention
allocation across studies. In both Study 1 and Study 2, our find-
ings revealed that female students were on-task significantly
more than male students. These findings align with previous
research suggesting that in elementary school females are
consistently more engaged than males (Marks, 2000). It is
possible that in this and prior studies the amount of off-task
behavior in females was under-estimated if females tend to
engage in off-task behaviors that are less noticeable than off-task
behaviors in males. However, our findings do not support this
possibility. Specifically, females tended to engage in more peer
off-task behaviors than males. Arguably, off-task peer
interactions are a highly noticeable type of off-task behavior;
therefore, it appears unlikely that this study under-estimated the
rates of off-task behaviors in females. The effects for males were
more equivocal. In study 1 males tended to engage in more
environment based off-task behavior while in Study 2 males
tended to engage in greater rates of self-distractions and off-task
behavior directed towards supplies than female students. It is
possible that these inconsistencies emerged because in Study 2
we separated off-task behavior directed to the visual environment
from off-task behaviors involving supplies, with the latter cate-
gory driving the gender differences observed in Study 1.
Variations in the absolute proportion of on-task behavior and
off-task behavior across grade-levels were not uniform across the
present studies. While results from Study 1 indicate no signifi-
cant effect of grade-level on students’ patterns of attention
allocation, Study 2 found some evidence that grade-level influ-
enced children’s rates of on and off-task behavior. For example,
third graders were least likely to engage in on-task behavior
(compared to first and fourth grade students), while higher rates of self-distractions were found among fourth graders (compared to kindergarteners and first graders). The inconsistent effect of
grade-level across studies may be due in part to the under-
sampling of third-grade classrooms in both reported studies.
While we attempted to recruit a representative sample of
different grade-levels, third grade teachers were considerably
less likely to volunteer to participate in the study than teachers
in other grade-levels. One possible explanation for third grade teachers' reluctance to volunteer is the added pressure that third grade teachers encounter with the onset of standardized testing in third grade. Overall, the reported results suggest
that in elementary school grade-level is an unstable factor in
influencing the prevalence of students’ on-task and off-task
behaviors.
In the present study socioeconomic status (SES) did not influ-
ence children’s patterns of attention allocation. One possible
explanation for this result is that SES is confounded with school
type. However, this possibility was not supported in a follow up
analysis in which school type was removed from the analysis and
the effect of SES remained non-significant. Additionally, it is
possible that any influence from SES may be obscured by under-
sampling of very low SES schools. Future research will need to
more fully evaluate the influence of SES on attention allocation
particularly in these more vulnerable populations.
Third, we also examined the relationship between teachers’
instructional design choices and children’s on- and off-task
behavior [Research Question 3]. Specifically, we investigated
two components of instructional design: type of instructional
format (e.g., individual work, small-group work, whole-group
work, etc.) and the average duration (sec) of an instructional
activity. We found that instructional format influenced the
overall rate of on-task behavior as well as the form that off-task
behavior takes in both a homogeneous sample of charter schools
(Study 1) and within a heterogeneous sample of private, paro-
chial, and charter schools (Study 2). Thus, the effect of instruc-
tional format appears to be generalizable to a wide range of
schools and student populations.
Across both Study 1 and Study 2 we observed a consistent
association between the type of instructional format and on-task
behavior. Specifically, higher rates of on-task behavior were
found during small-group work. These results are largely
consistent with the findings that small-group instruction is
associated with better learning outcomes compared to whole-
group instruction (Lou et al., 1996): a greater proportion of on-task behavior for small-group instructional activities might mediate the relationship between instructional type and learning outcomes.
Table 3
An overview of the key findings from Study 1 and Study 2. Consistent effects across both studies are highlighted in blue (with lighter blue highlighting effects that were partially consistent across studies).

Time of Year (Beginning = T1, Middle = T2, End = T3)
- On-task: Study 1: less on-task behavior at the end of the year (T3 < T1 and T2; T2 > T1). Study 2: –
- Peer off-task behavior: Study 1: more peer off-task behavior in the middle of the year (T2 > T1 and T3). Study 2: –
- Environmental off-task behavior: Study 1: environmental off-task behavior increased over the year (T3 and T2 > T1). Study 2: –
- Self-distractions: Study 1: self-distractions increased over the year (T1 < T2 < T3). Study 2: –

Student Characteristics
Gender
- On-task: Study 1: females more on-task (F > M). Study 2: females more on-task (F > M).
- Peer off-task behavior: Study 1: females more peer off-task behavior (F > M). Study 2: females more peer off-task behavior (F > M).
- Environmental off-task behavior: Study 1: males more environmental off-task behavior (M > F). Study 2: ns.
- Self-distractions: Study 1: ns. Study 2: males more self-distractions (M > F)+.
- Supplies off-task behavior: Study 1: –. Study 2: males more supplies off-task behavior (M > F).
Grade Level
- On-task: Study 1: ns. Study 2: 3rd graders less on-task behavior (3rd < 1st+ and 4th).
- Self-distractions: Study 1: ns. Study 2: 4th graders more self-distraction (4th > K and 1st).

Instructional Design Strategies
Instructional Format (Individual = Indv, Small-Group = SG, Whole-Group Carpet = WG Carpet, Whole-Group Desk = WG Desk)
- On-task: Study 1: on-task most likely in SG (SG > WG Carpet / WG Desk+). Study 2: on-task most likely in SG and Indv (SG / Indv > WG Carpet / WG Desk).
- Peer off-task behavior: Study 1: peer off-task most likely in SG and Indv (SG / Indv > WG Carpet / WG Desk). Study 2: peer off-task most likely in SG and Indv (SG / Indv > WG Carpet / WG Desk).
- Environmental off-task behavior: Study 1: environmental off-task most likely during WG (WG Desk / WG Carpet > SG / Indv). Study 2: environmental off-task most likely during WG (WG Carpet / WG Desk > SG / Indv).
- Self-distractions: Study 1: self-distractions most likely during WG Carpet (WG Carpet > all other formats; WG Desk > SG / Indv). Study 2: self-distractions most likely during WG Carpet (WG Carpet > all other formats; WG Desk > SG / Indv).
- Supplies off-task behavior: Study 1: –. Study 2: supplies off-task behavior most likely during WG Desk and SG (WG Desk / SG > WG Carpet; WG Desk / SG+ > Indv).
Average Duration of Instructional Activity
- On-task: Study 1: –. Study 2: on-task behavior declines during longer instructional activities (negative association).
- Self-distractions: Study 1: –. Study 2: self-distractions declined during longer instructional activities (negative association)+.

School-Based Factor
School Type
- On-task: Study 1: –. Study 2: more on-task behavior in parochial schools (Parochial > Private).

Note. The symbol '+' denotes a marginally significant result; "ns" denotes a non-significant effect; "–" denotes that the factor or behavior was not examined in that study.
Certain instructional formats were also found to elicit
different types of off-task behavior. For example, across Study 1
and Study 2, whole-group instruction (e.g., whole-group in-
struction on the carpet and at the desks) was associated with
more off-task behavior directed toward the environment. There-
fore, it is possible that streamlining the classroom visual envi-
ronment may help reduce off-task behaviors in this common
instructional format. Overall, the finding that certain types of
instructional format are associated with more on-task behavior
than others indicates that further research is necessary and
important given the potential implications for instruction.
However, it is worth noting that instructional duration likely
varies across different instructional formats; hence, differences
attributed to the instructional format may also reflect effects of
the duration of an instructional activity.4 Additional research is
necessary to further explore this possibility.
In regard to the effect of the average duration of an instructional
activity, we found that on-task behavior became less frequent as
the length of the instructional activity increased. Thus, children
may be better able to maintain a state of focused attention when
instruction consists of small blocks of instructional activities versus
instructional episodes that occur over a longer duration (cf. Ruff &
Lawson, 1990; Sarid & Breznitz, 1997). This finding is in line with
prior empirical work with adults in which attention was found to
wane over time (for review see Middendorf & Kalish, 1996): at the
beginning of a lecture, most adult students were able to attend to
the lecture for up to 18 min before a lapse in attention was
observed; however, by the end of the lecture period adults’ atten-
tion span had decreased to 3–4 min segments (Johnstone & Per-
cival, 1976 as cited in Middendorf & Kalish, 1996). However, to our
knowledge no prior studies have examined the effects of duration
of instructional activity on children’s on-task behavior. Attention
regulation skills are certainly not as developed in elementary
school children as in adults. In this light, it is interesting to note that
in the present study the median duration of an instructional ac-
tivity was 12.8 min, and 25% of the instructional activity durations
were longer than 17.1 min. It is possible that instructional duration
interacts with other factors (e.g., activity type, difficulty level, topic
interest, novelty, etc.), which may collectively contribute to chil-
dren’s ability to maintain a state of selective sustained attention.
Further research is needed to provide educators with evidence-
based guidelines on the optimal length of an instructional activity
for children at each grade-level; however, the outcomes of this
study suggest that long instructional activity durations (i.e., 30 min)
may be suboptimal in elementary grades.
Lastly, the relationship between patterns of children’s attention
allocation and School Type was investigated [Research Question 4].
A marginally significant effect of school type was found, as school
type influenced students’ rates of on-task behavior. Specifically,
students from parochial schools had higher rates of on-task
behavior (compared to students attending private schools). How-
ever, the type of school (private, parochial, charter) children attend
was not found to be a determining factor of the source of children’s
off-task behavior. It is not clear what aspects of parochial schools
explain the observed patterns of increased on-task behavior found
in this sample.
4 In the present study, the average duration of an instructional activity is not
linked with instructional format. Thus within the current data set it is not possible
to determine whether the duration of activities within particular instructional
formats vary systematically. As mentioned previously, a change in an instructional
activity often coincided with a change in instructional format; however, this was
not always the case as an activity could change while the instructional format
remained constant (e.g., when children rotate from one center to another the
format remains small-group even though the instructional activity changes).
4. Limitations and future directions
The present work reported a number of novel findings; never-
theless there are also important limitations that should be raised
and addressed in future research. First, although eye gaze is a
common measure of visual attention (see Henderson & Ferreira,
2004; Just & Carpenter, 1976 for reviews), it is admittedly not a
perfect measure of attention. For example, it is possible for students
to appear to be on-task or attending to the teacher while they are
actually daydreaming. Conversely students may be looking else-
where while still listening to the teacher’s instruction. In the latter
case one could argue that the student is not off-task as some
amount of attention is still being attributed to the learning activity.
However, the student described above is by definition in a state of
divided attention which has been demonstrated in the prior liter-
ature to be less optimal for performance than a state of selective
sustained attention (e.g., Craik, Govoni, Naveh-Benjamin, &
Anderson, 1996). To address this limitation, in addition to eye gaze
future work may benefit from utilizing multiple measures of
attention such as activity level or gross motor movements (e.g.,
Milich, 1984), and body posture (e.g., D'Mello & Graesser, 2010).
Finally, it would also be useful to corroborate visual eye gaze
measures with performance-based measures of attention (e.g., K-
CPT: Conners & Staff, 2001; Track-It: Fisher, Thiessen, Godwin,
Kloos, & Dickerson, 2013) and teacher or parent reports (e.g.,
CBCL: Achenbach & Rescorla, 2001).
Second, children’s patterns of attention allocation may vary as a
function of the characteristics of the learning task. For example,
children’s ability to maintain a state of selective sustained attention
may be greater in self-directed learning activities compared to
learning contexts that are more structured (e.g., Geary, 2011).
Similarly, the difficulty level of the instructional task may interact
with students’ patterns of attention allocation. For instance,
attention may decline when the difficulty level of the task exceeds a
student’s ability level (e.g., Shernoff, Csikszentmihalyi, Schneider, &
Shernoff, 2003; Imai, Anderson, Wilkinson, & Yi, 1992). Future
research should explore in more depth how these factors modulate
students’ patterns of attention allocation in classroom learning
environments.
Inattention or off-task behavior is a significant problem in
educational settings as inattention reduces students’ opportunities
to learn. Therefore, it is imperative that researchers isolate ante-
cedents of off-task behavior in order to identify potential avenues
for intervention. The present work provides a detailed exploration
of elementary school children’s attention allocation patterns and
highlights how the time of year, student characteristics (e.g.,
gender, grade-level, SES), instructional design strategies (e.g.,
instructional format and duration of an instructional activity), and
school-based factors (e.g., school type) contribute to children’s on
and off-task behaviors in classroom settings. The present findings
are a first step in providing empirical evidence to inform in-
terventions that aim to better engage students.
Acknowledgements
We thank Megan Miller, Laura Pacilio, and Jessica Meeks for
their help collecting data. We also thank the children, parents, and
teachers who made this project possible. The work reported here
was supported by a Graduate Training Grant awarded to Carnegie
Mellon University by the Department of Education (R305B090023)
and by the Institute of Education Sciences, U.S. Department of Ed-
ucation (R305A110444). The opinions expressed are those of the
authors and do not represent views of the Institute or the U.S.
Department of Education.
References
Achenbach, T., & Rescorla, L. (2001). The manual for the ASEBA school-age forms &
profiles. Burlington, VT: University of Vermont, Research Center for Children,
Youth, and Families.
Ammons, T. L., Booker, J. L., Jr., & Killmon, C. P. (1995). The effects of time of day on
student attention and achievement (ERIC Document no. ED 384 592).
Antrop, I., Roeyers, H., & De Baecke, L. (2005). Effects of time of day on classroom
behavior in children with ADHD. School Psychology International, 26(1), 29e43.
Baker, R. S. (2007). Modeling and understanding students’ off-task behavior in
intelligent tutoring systems. Proceedings of ACM CHI 2007: Computer-Human
Interaction, 1059e1068.
Baker, R. S. J. D., D’Mello, S. K., Rodrigo, M. M. T., & Graesser, A. C. (2010). Better to be
frustrated than bored: the incidence, persistence, and impact of learners’
cognitive-affective states during interactions with three different computer-
based learning environments. International Journal of Human-Computer
Studies, 68(4), 223e241.
Baker, R. S. J. D., Gowda, S. M., Wixon, M., Kalka, J., Wagner, A. Z., Salvi, A., et al.
(2012). Towards sensor-free affect detection in cognitive tutor algebra. In Pro-
ceedings of the 5th international conference on educational data mining (pp.
126e133). Worcester, MA: International Educational Data Mining Society.
Barrett, P., Davies, F., Zhang, Y., & Barrett, L. (2015). The impact of classroom design
on pupils’ learning: final results of a holistic, multi-level analysis. Building &
Environment, 89, 118e133.
Barrett, P., Zhang, Y., Moffat, J., & Kobbacy, K. (2012). A holistic, multi-level analysis
identifying the impact of classroom design on pupils’ learning. Building &
Environment, 59, 678e689.
Bartgis, J., Thomas, D. G., Lefler, E. K., & Hartung, C. M. (2008). The development of
attention and response inhibition in early childhood. Infant and Child Devel-
opment, 17, 491e502.
Berliner, D. C. (1990). What’s all the fuss about instructional time? The nature of time
in schools: Theoretical concepts, practitioner perceptions. New York: College Press.
Bullard, J. (2010). Creating environments for learning: Birth to age eight. Prentice Hall.
Carnine, D. W. (1976). Effects of two teacher-presentation rates on off-task behavior,
answering correctly, and participation. Journal of Applied Behavior Analysis, 9,
199e206.
Carroll, J. (1963). A model of school learning. Teachers College Record, 64, 723e733.
Choudhury, N., & Gorman, K. S. (2000). The relationship between sustained
attention and cognitive performance in 17-24 month old toddlers. Infant and
Child Development, 9, 127e146.
Conners, C. K., & Staff, M. (2001). Conners’ Kiddie continuous performance test (K-
CPT): Computer program for windows technical guide and software manual. Tor-
onto, ON: MHS.
Craik, F. I. M., Govoni, R., Naveh-Benjamin, M., & Anderson, N. D. (1996). The effects
of divided attention on encoding and retrieval processes in human memory.
Journal of Experimental Psychology, 125(2), 159e180.
DeMarie-Dreblow, D., & Miller, P. H. (1988). The development of children’s strate-
gies for selective attention: evidence for a transitional period. Child Develop-
ment, 59, 1504e1513.
Dixon, W. E., & Salley, B. J. (2007). “Shh! we’re tryin’ to concentrate”: attention and
environmental distracters in novel word learning. The Journal of Genetic Psy-
chology, 167(4), 393e414.
D’Mello, S., & Graesser, A. (2010). Mining bodily patterns of affective experience
during learning. In A. Merceron, P. Pavlik, & R. Baker (Eds.), Proceedings of the
third international conference on educational data mining (pp. 31e40).
Fisher, C. W., Berliner, D. C., Filby, N. N., Marliave, R., Cahen, L. S., & Dishaw, M. M.
(1980). Teaching behaviors, academic learning time, and student achievement.
In C. Denham, & A. Liberman (Eds.), Time to learn: A review of the beginning
teacher evaluation study (pp. 7e22).
Fisher, A. V., Godwin, K. E., & Seltman, H. (2014). Visual environment, attention
allocation, and learning in young children: when too much of a good thing may
be bad. Psychological Science, 25(7), 1362e1370.
Fisher, A. V., & Kloos, H. (2016). Development of selective sustained attention: the
role of executive functions. In J. A. Griffin, P. McCardle, & L. Freund (Eds.), Ex-
ecutive function in preschool-age children: Integrating measurement, neuro-
development, and translational research (pp. 215e237). Washington, DC, US:
American Psychological Association.
Fisher, A., Thiessen, E., Godwin, K., Kloos, H., & Dickerson, J. (2013). Assessing se-
lective sustained attention: evidence from a new paradigm. Journal of Experi-
mental Child Psychology, 114(2), 275e294.
Fleiss, J. L. (1981). Statistical methods for rates and proportions (2nd ed.). New York:
John Wiley.
Frederick, W. C., & Walberg, H. J. (1980). Learning as a function of time. Journal of
Educational Research, 73, 183e194.
Frederick, W. C., Walberg, H. J., & Rasher, S. P. (1979). Time, teacher comments, and
achievement in urban high schools. Journal of Educational Research, 73(2),
63e65.
Geary, K. E. (2011). The impact of choice on child sustained attention in the preschool
classroom (Unpublished Thesis). Baton Rouge, Louisiana: Louisiana State
University.
Godwin, K., & Fisher, A. (2011). Allocation of attention in classroom environments:
consequences for learning. In L. Carlson, C. Hölscher, & T. Shipley (Eds.), Pro-
ceedings of the 33rd annual conference of the cognitive science society (pp.
2806e2811). Austin, TX: Cognitive Science Society.
Good, T. L., & Beckerman, T. M. (1978). Time on task: a naturalistic study in sixth-
grade classrooms. The Elementary School Journal, 78(3), 192e201.
Goodman, L. (1990). Time and learning in the special education classroom. SUNY
Press.
Henderson, J., & Ferreira, F. (2004). The interface of language, vision, and action: Eye
movements and the visual world. New York, NY: Taylor & Francis.
Imai, M., Anderson, R. C., Wilkinson, I. A. G., & Yi, H. (1992). Properties of attention
during reading lessons. Journal of Educational Psychology, 84(2), 160e173.
Janvier, B., & Testu, F. (2007). Age related differences in daily attention patterns in
preschool, kindergarten, first grade & fifth grade pupils. Chronobiology Inter-
national, 24(2), 327e343.
Just, M., & Carpenter, P. (1976). Eye fixations and cognitive processes. Cognitive
Psychology, 8, 441e480.
Karweit, N., & Slavin, R. E. (1981). Measurement and modeling choices in studies of
time and learning. American Educational Research Journal, 18(2), 157e171.
Killingsworth, M. A., & Gilbert, D. T. (2010). A wandering mind is an unhappy mind.
Science, 330, 932.
Kulik, J. A. (1992). An analysis of research on ability grouping: Historical and
contemporary perspectives. In Research-based decision-making series. Storrs, CT:
University of Connecticut. National Research Center on the Gifted and Talented
(ERIC Document Reproduction Service No. ED 350 777).
Lahaderne, H. M. (1968). Attitudinal and intellectual correlates of attention: a study
of four sixth-grade classrooms. Journal of Educational Psychology, 59(5),
320e324.
Lee, S. W., Kelly, K. E., & Nyre, J. E. (1999). Preliminary report on the relation of
students’ on-task behavior with completion of school work. Psychological Re-
ports, 84, 267e272.
Lemov, D. (2010). Teach like a champion: 49 techniques that put students on a path to
college. San Francisco, CA: Jossey-Bass.
Lloyd, J. W., & Loper, A. B. (1986). Measurement and evaluation of task-related
learning behaviors: attention to task and metacognition. School Psychology
Review, 15(3), 336e345.
Lou, Y., Abrami, P. C., Spence, J. C., Poulsen, C., Chambers, B., & d’Apollonia, S. (1996).
Within-class grouping: a meta-analysis. Review of Educational Research, 66(4),
423e458.
Marks, H. (2000). Student engagement in instructional activity: patterns in
elementary, middle, and high school years. American Educational Research
Journal, 37(1), 153e184.
Martin, A. J., Papworth, B., Ginns, P., Malmberg, L.-E., Collie, R. J., & Calvo, R. A.
(2015). Real-time motivation and engagement during a month at school: every
moment of every day for every student matters. Learning and Individual Dif-
ferences, 38, 26e35.
Matthews, J. S., Ponitz, C. C., & Morrison, F. J. (2009). Early gender differences in self-
regulation and academic achievement. Journal of Educational Psychology, 101(3),
689e704.
Middendorf, J., & Kalish, A. (1996). The ‘change-up’ in lectures. The National Teaching
and Learning Forum, 5(2), 1e4.
Milich, R. (1984). Cross-sectional and longitudinal observations of activity level and
sustained attention in a normative sample. Journal of Abnormal Child Psychology,
12(2), 261e276.
Muyskens, P., & Ysseldyke, J. E. (1998). Student academic responding time as a
function of classroom ecology and time of day. The Journal of Special Education,
31(4), 411e424.
Ocumpaugh, J., Baker, R. S. J. D., & Rodrigo, M. M. T. (2012). Baker-Rodrigo observation
method protocol (BROMP) 1.0. Training manual version 1.0. Technical Report.
Manila, Philippines: Ateneo Laboratory for the Learning Sciences.
Ponitz, C. C., & Rimm-Kaufman, S. E. (2011). Contexts of reading instruction: im-
plications for literacy skills and kindergarteners’ behavioral engagement. Early
Childhood Research Quarterly, 26, 157e168.
Prais, S. J. (1998). Raising schooling attainments by grouping pupils within each
class. National Institute Economic Review, 165, 83e88.
Prais, S. J. (1999). Within-class grouping: a rejoinder (response to article by Philip C.
Abrami et al. in this issue p. 105). National Institute Economic Review, 169,
109e110.
Rettig, M. D., & Canady, R. L. (2013). Elementary school scheduling: Enhancing in-
struction for student achievement. NY: Routledge.
Roberts, M. (2001). Off-task behavior in the classroom: Applying FBA and CBM.
Retrieved from: http://www.nasponline.org/communications/spawareness/Off-
Task%20Behavior .
Roberts, M. (2002). Research in practice: practical approaches to conducting func-
tional analyses that all educators can use. The Behavior Analyst Today, 3, 83e87.
Ruff, H. A., & Lawson, K. R. (1990). Development of sustained, focused attention in
young children during free play. Developmental Psychology, 26, 85e93.
Samuels, S. J., & Turnure, J. E. (1974). Attention and reading achievement in first-
grade boys and girls. Journal of Educational Psychology, 66(1), 29e32.
Sarid, M., & Breznitz, Z. (1997). Developmental aspects of sustained attention
among 2- to 6-year-old children. International Journal of Behavioral Develop-
ment, 21, 303e312.
Schooler, J. W., Smallwood, J., Christoff, K., Handy, T. C., Reichle, E. D., & Sayette, M. A.
(2011). Meta-awareness, perceptual decoupling and the wandering mind.
Trends in Cognitive Science, 15(7), 319e326.
Shernoff, D. J., Csikszentmihalyi, M., Schneider, B., & Shernoff, E. S. (2003). Student
engagement in high school classrooms from the perspective of flow theory.
School Psychology Quarterly, 18(2), 158e176.
Smallwood, J., Fishman, D. J., & Schooler, J. W. (2007). Counting the cost of an absent
mind: mind wandering as an underrecognized influence on educational performance. Psychonomic Bulletin & Review, 14(2), 230e236.
Tarr, P. (2004). Consider the walls. Young Children, 59(3), 88e92.
Thompson, S. D., & Raisor, J. M. (2013). Meeting the sensory needs of young chil-
dren. Young Children, 68(2), 34e43.
White, S. (1970). Some general outlines of the matrix of developmental changes between 5 and 7 years. Bulletin of the Orton Society, 20, 41e57.
Wiebe, S. A., Sheffield, T., Nelson, J. M., Clark, C. A. C., Chevalier, N., & Andrews
Espy, K. (2011). The structure of executive function in 3-year-old children.
Journal of Experimental Child Psychology, 108(3), 436e452.
Zagar, R., & Bowers, N. D. (1983). The effect of time of day on problem solving and
classroom behavior. Psychology in the Schools, 20, 337e345.
Computers in Human Behavior 86 (2018) 174–180
Full length article
Effect of uninterrupted time-on-task on students’ success in Massive
Open Online Courses (MOOCs)
Youngjin Lee
University of Kansas, 1122 W. Campus Rd. (Room 413) Lawrence, KS 66045, USA
Article info
Article history: Available online 23 April 2018
Keywords: Massive open online course (MOOC); Log file analysis; Academic success; Self-regulated learning (SRL); Time-on-task
E-mail address: yjlee@ku.edu
https://doi.org/10.1016/j.chb.2018.04.043
Abstract
This study investigated the relationship between uninterrupted time-on-task and academic success of
students enrolled in a Massive Open Online Course (MOOC). The variables representing uninterrupted
time-on-task, such as number and duration of uninterrupted consecutive learning activities, were mined
from the log files capturing how 4286 students tried to learn Newtonian mechanics concepts in a MOOC.
These variables were used as predictors in the logistic regression model estimating the likelihood of
students getting a course certificate at the end of the semester. The analysis results indicate that the
predictive power of the logistic regression model, which was assessed by Area Under the Precision-Recall
Curve (AUPRC), depends on the value of off-task activity threshold time, and the likelihood of students
getting a course certificate increases as students were doing more uninterrupted learning activities over
a longer period of time. The findings from this study suggest that a simple count of learning activities,
which has been used as a proxy for time-on-task in previous studies, may not accurately describe student
learning in the computer-based learning environment because it does not reflect the quality, such as
uninterrupted durations, of those learning activities.
Since Carroll (1963) proposed a model of school learning, time-
on-task, the amount of time students are actively engaged in
learning, has been considered an important variable that can
explain academic success of students. Earlier studies conducted
with undergraduate students in traditional face-to-face courses
found that self-reported days of study per week (Allen, Lerner, &
Hinrichsen, 1972) and self-reported hours of study per week
(Wagstaff & Mahmoudi, 1976) were positively correlated with the
GPA of students. Similarly, Wagner, Schober, and Spiel (2008) re-
ported that secondary school students who spent more time
working on homework obtained a better grade.
Although self-reported surveys are the most common data collection method in research investigating time-on-task, they inherently contain errors because students report their time-on-task after completing the learning tasks rather than while performing them. Creating a journal of time-on-task while
while performing them. Creating a journal of time-on-task while
performing the learning task can improve the accuracy of self-
reported time-on-task to some extent. This approach, however,
may generate a different type of error because creating a journal
entry disrupts the natural flow of learning processes of students. To
address these issues, researchers have turned to analyzing the log files of computer-based learning environments to estimate students’ time-on-task. Since computer-based learning
environments, such as Massive Open Online Courses (MOOCs), can
capture learning behaviors of students in detail without inter-
rupting their learning processes, it is relatively easy to estimate the
time-on-task of students. Wellman and Marcinkiewicz (2004) used
a frequency of accessing Web pages of an online course as a proxy
for the time-on-task of college students, and found that it was
positively correlated with the achievement of students measured
by pre- and post-tests. Similarly, Cho and Shen (2013) reported that
the amount of time spent in the Learning Management System
(LMS) is positively correlated with the total points students earned
in the course although they did not explain how the LMS computed
the time-on-task values they analyzed in their study.
Even though log file analysis allows researchers to better esti-
mate the time-on-task of students in computer-based learning
environments, it also presents a new challenge. If students stop
using the learning environment, engage in an alternative task for a
while, and return to what they were doing, the log file would not be
able to recognize this off-task activity because the learning session
in the system is preserved as long as the Web browser window
remains open. In order to address this issue, time-on-task values
longer than a pre-determined threshold, which typically ranges
from 10 to 60 min, are often discarded from the analysis (Ba-Omar
& Petrounias, 2007; del Valle & Duffy, 2007; Munk & Drlík, 2011;
Wise, Speer, Marbouti, & Hsiao, 2012). Although Kovanović et al.
(2015)’s experiments show that the threshold value for deter-
mining off-task activities has an impact on the subsequent data
analysis, many studies analyzing log files of computer-based
learning environments did not take into account off-task activ-
ities when they examined the relationship between time-on-task
and students’ academic success. In addition, most studies
focusing on time-on-task were descriptive in nature, and did not
investigate whether time-on-task can predict the academic success
of students.
The study reported in this paper analyzed the log files of a
MOOC to infer uninterrupted time-on-task of students by excluding
off-task activities from the analysis, and examined the effect of
uninterrupted time-on-task on the academic success of students in
the MOOC. More specifically, this study has two research questions:
(1) Does the threshold value determining off-task activities have an
impact on the predictive power of the model estimating the like-
lihood of students getting a course certificate in the MOOC?; and
(2) What is the relationship between uninterrupted time-on-task
inferred from the log files of a MOOC and students’ success in
getting a course certificate? The rest of the paper is organized as
follows. After reviewing related work, the method section describes
the MOOC and its log files analyzed in this study, and explains in
detail how the predictive models estimating the likelihood of stu-
dents getting a course certificate were developed. The results sec-
tion compares the predictive power of the developed models, and
examines the relationship between uninterrupted time-on-task
and the likelihood of students getting a course certificate, fol-
lowed by discussion.
1. Related work
1.1. Completion rate and attrition in MOOCs
Completion rate and attrition in MOOCs have been extensively
studied in recent years. Breslow et al. (2013) examined 154,763 students who signed up for a physics MOOC in spring 2012.
They found that 15% of registrants tried to complete the first
homework assignment, 6% of them passed the midterm exam, and
only 5% of them earned the course certificate at the end of the
semester. Similarly, Ho et al. (2014) reported that about 5% of reg-
istrants were able to earn the course certificate when they exam-
ined the completion rate of students enrolled in seventeen
HarvardX and MITx MOOCs between fall 2012 and summer 2013.
Jordan (2015) investigated the completion rates of more than a hundred MOOCs with different grading schemes and varying lengths of study. She found that the completion rate of MOOCs
varied widely (from 0.7% to 52.1%), depending on the length of
study (higher completion rates for shorter MOOCs), start date
(higher completion rates for recent MOOCs), and assessment type
(higher completion rates when MOOCs adopt an automatic grading
scheme). When Goldberg et al. (2015) investigated the relationship
between participants’ level of education and engagement in their
completion of a MOOC on dementia, they found that 38% of regis-
trants were able to complete the course, and the discussion board
activity was a significant predictor for their course completion.
Crossley, Paquette, Dascalu, McNamara, and Baker (2016) examined
whether students’ online activity and the language they produce in
the discussion forum can predict the course completion. Of 320
students who completed at least one graded assignment and pro-
duced at least 50 words in the discussion forum, 187 students were able to complete the course successfully (completion rate = 58%). In Hone and El Said (2016)’s survey study of 379 participants, 32% of
survey respondents were able to complete an entire course, and
there was no significant difference in the course completion rate by
gender, level of study (undergraduate or postgraduate) or MOOC
platform used by students. In Pursel, Zhang, Jablokow, Choi, and
Velegol (2017)’s study, 5.6% of students who registered for a
MOOC on creativity and innovation were able to complete the
course with a statement of accomplishment. They also found that
registering after the course launch date is negatively correlated
with the course completion while the frequency of watching a
video, posting a message and providing a comment are positively
correlated with the course completion.
Several studies indicate that study time or time management is
the most common reason for disengaging from MOOCs. Nawrot and
Doucet (2014) reported that about 70% of the survey respondents
(N = 508) indicated that bad time management (e.g., bad time or-
ganization, conflicting real life responsibilities, too much time
consuming course, left behind due to illness or work) was the main
reason for their MOOC withdrawal decision. Kizilcec and Halawa
(2015) analyzed the survey responses of 1968 students who were
sampled from twenty MOOCs to investigate reasons for attrition.
Their analysis revealed four reasons for disengaging: time issues,
course difficulty, format and content, and goals and expectations
(in the order of significance). 84% of survey respondents mentioned
that they did not have enough time for the course, and half of those
respondents also indicated that they are easily distracted from the
course. Zheng, Rosson, Shih, and Carroll (2015)’s interview with 18
students taking a MOOC revealed that high workload, challenging
course content, lack of time, lack of pressure, no sense of commu-
nity, social influence and lengthy course start-up were the factors
relevant to course drop-out. Similarly, Eriksson, Adawi, and Stöhr
(2016) found that the learner’s ability to find and manage time
was the most frequently mentioned reason for course drop-out
when they interviewed 34 students selected from two MOOCs
with a different completion rate. Other reasons for disengagement
include the learner’s perception of course content, the learner’s
perception of course design, and the learner’s social situation and
characteristics.
1.2. Effect of time-on-task on academic performance in computer-
based learning environments
In the last decade, researchers studied the effect of time-on-task
on academic performance of students learning in the computer-
based learning environment. Lustria (2007) found that college
students who spent more time in using interactive Web sites
containing health related information performed better in the
comprehension test. Louw, Muller, and Tredoux (2008) examined
the importance of various predictors such as pre-existing level of
mathematics ability, degree of access to computers outside of
school, time spent on computers both inside and outside of school
and on the computer-based tutoring system used in the study,
computer literacy, confidence in using information technology,
motivation for mathematics, degree of enjoyment of learning
mathematics, intention to study after school, language used at
home and parental encouragement. Of these predictors, they found
that time spent on the computer-based tutoring environment was
the strongest variable predicting the academic success of high
school students participating in their study. When Krause, Stark,
and Mandl (2009) investigated how 137 college students learned
statistics in the computer-based learning environment, they found
that the time-on-task is significantly positively correlated with the
learning outcome of students. Macfadyen and Dawson (2010)’s log
file analysis of an LMS showed that number of log-ins and time
spent in the LMS can explain more than 30% of variance in the final
grade of college students. Cho and Shen (2013) reported that time-
on-task logged in the LMS, along with effort regulation, can predict
students’ academic achievement in the course. Goldhammer et al.
(2014) investigated how the effect of time-on-task on the
learning performance in the computer-based learning environment
is moderated by task difficulty and student skill using linear mixed
models. When students were solving problems in the computer-
based learning environment, time-on-task increased with task
difficulty, and was positively related to the performance of stu-
dents. However, the positive effect of time-on-task decreased as
students’ skill levels increased. Landers and Landers (2015)
studied the effect of time-on-task in the gamified learning envi-
ronment. They reported that college students who learned in the
gamified learning environment spent much more time using the
learning environment than students who used non-gamified
learning environment, which in turn improved their academic
performance in the course. Although these studies reported time-
on-task as an important predictor for academic success of stu-
dents learning in the computer-based learning environment, none
of them took into account off-task activities of students, which may
have an impact on the predictive power of time-on-task.
2. Method
2.1. Context
This study analyzed the log files that captured how 12,981
students who signed up for a MOOC called “edX 8.MReVx Me-
chanics Review (hereafter, 8.MReVx)” interacted with various
learning resources during the Summer 2014 semester. 83% of
12,981 registrants were male and 17% of them were female. Age of
registrants varied from 15 to 75: 45% of them were under 26, 39% of
them were between 26 and 40, and 16% of them were age 41
and above. 25% of registrants had an advanced degree, 35% of them
had a college degree, and 38% of them had a secondary diploma or
less. Geographic distribution included the US (27% of registrants),
India (18%), UK (3.6%), Brazil (2.8%) and others.
8.MReVx, which was offered by MIT, is designed to provide a
comprehensive overview of Newtonian mechanics and greater
expertise with problem-solving. 8.MReVx provides various learning
resources, such as e-texts, videos, discussion boards, wiki, check-
points, weekly homework problems, quizzes, midterm exam and
final exam, to help students learn Newtonian mechanics concepts.
Students did not need to use an external resource, such as a textbook,
because the course was designed to be self-contained; it provided
all required information through its e-texts and wiki. Checkpoints
are easier problems embedded within e-texts as formative as-
sessments whereas homework problems, quizzes, midterm exam
and final exam are more difficult problems provided throughout
the 12-week long semester as summative assessments. Students
were given a week to work on formative and summative assess-
ment problems which were due on Sunday at midnight. The
achievement of students was determined by checkpoints (8%),
homework problems (34%), quizzes (36%), midterm exam (7%) and
final exam (16%), and students who scored more than 60% of the
maximum possible points received a course certificate. For further
explorations of course structure and available learning resources,
see the archived course at https://courses.edx.org/courses/MITx/8.MReVx/2T2014/info.
2.2. Procedures
Of 12,981 students who registered in 8.MReVx, this study
focused on students who solved at least one assessment problem
(N = 4286) because problem solving is the most important learning
activity for earning a certificate in this course; in 8.MReVx, all
available points were allocated to solving formative and summative
assessment problems. Of these 4286 students who solved at least
one assessment problem, only 434 students earned a course cer-
tificate at the end of the semester.
Fig. 1. An example of 8.MReVx log files analyzed in the study.

Fig. 1 shows a snippet of log files analyzed in this study. Each
row is a timestamped database transaction representing a specific
learning activity experienced by one particular student enrolled in
the MOOC. One can easily see that the student solved a problem
([check_problem_correct] in row 14 in Fig. 1), watched the same
video twice ([play_video] in row 15 and 17 in Fig. 1), and then
solved another problem ([check_problem_correct] in row 19 in
Fig. 1). In total, there were 12,981 log files like the one shown in
Fig. 1 because each log file captured what one student did while
taking the course. These log files were imported into an SQLite
(https://www.sqlite.org) database, and Python (https://www.python.org) and R (https://www.r-project.org) scripts were devel-
oped to pre-process the merged log files in the imported database.
Pre-processing of the merged log files consists of the following
steps. First, examine “Event” and “Resource Name” columns in the
database to create a list of unique problems available in the entire
course. Second, for each student in the database, examine how
many problems he or she tried to solve. Third, select a subset of
students who attempted to solve at least one problem. Fourth,
compute the time-on-task for all learning activities of selected
students (e.g., [play_video] in row 17 in Fig. 1) by subtracting its timestamp value (e.g., 6/24/14 16:05:21.327 in row 17 in Fig. 1) from
the timestamp value of the subsequent learning activity (e.g., 6/24/
14 16:13:28.982 in row 18 in Fig. 1). Finally, using the computed
time-on-task values, create variables representing the frequency
and the duration of uninterrupted time-on-task such as (1) number
of learning chunks per week (NLC); (2) number of learning activities
per week (NLA); (3) duration of all learning chunks per week (TLC);
and (4) median of duration of learning chunks per week (MedTLC).
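As a concrete illustration of these pre-processing steps, a minimal Python sketch is given below. The paper notes that Python and R scripts were developed but does not publish them, so the file name, column names and event labels used here are assumptions rather than the actual 8.MReVx schema.

import pandas as pd

# Minimal sketch of the pre-processing pipeline (assumed file and column names).
log = pd.read_csv("8mrevx_log.csv", parse_dates=["timestamp"])
log = log.sort_values(["student_id", "timestamp"])

# Keep only students who attempted at least one assessment problem.
solvers = log.loc[log["event"].str.contains("check_problem"), "student_id"].unique()
log = log[log["student_id"].isin(solvers)]

# Time-on-task of an activity = time until the same student's next logged activity
# (a student's last activity has no following event and is left as NaN).
log["time_on_task"] = (
    log.groupby("student_id")["timestamp"].shift(-1) - log["timestamp"]
).dt.total_seconds()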
Figure 2 explains how these variables are defined and computed
in this study. Each row in Fig. 2 represents a specific learning ac-
tivity, such as solving a problem or watching a video, one particular
MOOC student experienced during one week period. A learning
chunk is defined as a group of consecutive learning activities not
interrupted by an off-task activity, an activity whose time-on-task
value is larger than a predetermined threshold value (e.g., 10, 30
or 60 min). Fig. 2 indicates that the student had three learning
chunks this week (NLC = 3) that are separated by two off-task activities. Number of learning activities per week is the count of individual learning activities whose time-on-task value is smaller than the predetermined off-task activity threshold time (NLA = 10 in
Fig. 2). Duration of all learning chunks can be obtained by summing up the durations of the individual learning chunks observed in the week (i.e., TLC = TLC,1 + TLC,2 + TLC,3; TLC,1 = TOT1 + TOT2 + TOT3; TLC,2 = TOT4 + TOT5 + TOT6; TLC,3 = TOT7 + TOT8 + TOT9 + TOT10). Finally, the median of the duration of learning chunks (MedTLC = Median({TLC,1, TLC,2, TLC,3})) captures, on average, how long students were engaged in meaningful learning activities before getting distracted by an off-task action.

Fig. 2. Learning chunk (LCi), off-task activity, time-on-task (TOTi), and duration of learning chunk (TLC,i) identified in the log file.
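Building on these definitions, the sketch below derives the four variables for one student-week from per-activity time-on-task values (for example, the time_on_task column computed in the earlier sketch). The function name and the handling of trailing activities are illustrative assumptions rather than the authors' implementation.

from statistics import median

def chunk_features(time_on_task, threshold_min=60):
    """Compute NLC, NLA, TLC and MedTLC for one student-week.

    time_on_task: per-activity time-on-task values in seconds, in chronological
    order. An activity whose value exceeds the off-task threshold (10, 30 or
    60 min in the paper) is treated as an off-task activity that closes the
    current learning chunk.
    """
    threshold = threshold_min * 60
    chunks, current, n_activities = [], [], 0
    for tot in time_on_task:
        if tot <= threshold:
            current.append(tot)
            n_activities += 1            # NLA counts only on-task activities
        else:
            # An off-task activity ends the current learning chunk.
            if current:
                chunks.append(sum(current))
            current = []
    if current:                          # close the final chunk, if any
        chunks.append(sum(current))
    return {"NLC": len(chunks),                           # number of learning chunks
            "NLA": n_activities,                          # number of learning activities
            "TLC": sum(chunks),                           # total chunk duration (s)
            "MedTLC": median(chunks) if chunks else 0.0}  # median chunk duration (s)

For the pattern shown in Fig. 2 (three groups of 3, 3 and 4 on-task activities separated by two off-task activities), this sketch returns NLC = 3 and NLA = 10, matching the worked example in the text.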
Log transform was applied to these variables in order to mitigate
heavy skewness and incompatible ranges. Table 1 summarizes
descriptive statistics of unscaled and log-transformed variables
pre-processed with different off-task activity threshold values.
2.3. Predictive models estimating likelihood of students getting
course certificate
In order to answer the research questions, three logistic
regression models (base, main effect and interaction) were devel-
oped. The response variable of the logistic regression model was whether or not students earned a course certificate at the end of the semester.
Table 1
Mean, median, standard deviation and interquartile range of unscaled and log-
transformed variables representing the frequency and the duration of uninter-
rupted time-on-task.
Variable Mean Median SD IQR
Off-task threshold = 10 min
NLC 8.75 5.00 10.13 9.00
NLA 112.48 66.00 121.73 135.25
TLC 6826.12 3626.35 7947.68 8041.12
MedTLC 658.48 491.57 618.97 543.15
log(NLC) 0.82 0.78 0.37 0.60
log(NLA) 1.83 1.83 0.46 0.75
log(TLC) 3.55 3.56 0.54 0.79
log(MedTLC) 6.12 6.20 0.93 1.08
Off-task threshold = 30 min
NLC 5.40 3.50 5.51 5.00
NLA 116.35 67.94 126.34 137.62
TLC 10638.27 5265.48 13310.41 12161.47
MedTLC 1501.39 1159.74 1371.65 1461.46
log(NLC) 0.69 0.65 0.31 0.43
log(NLA) 1.84 1.84 0.46 0.75
log(TLC) 3.70 3.72 0.59 0.83
log(MedTLC) 6.88 7.06 1.08 1.28
Off-task threshold = 60 min
NLC 4.53 3.00 4.24 4.00
NLA 117.30 68.00 127.53 138.75
TLC 13134.35 6316.93 17047.45 14899.79
MedTLC 2029.12 1583.05 1914.13 2152.61
log(NLC) 0.65 0.60 0.28 0.37
log(NLA) 1.85 1.84 0.46 0.75
log(TLC) 3.77 3.80 0.61 0.86
log(MedTLC) 7.18 7.37 1.14 1.37
The base model predicts a probability for getting a course
certificate based on two explanatory variables, number of learning
chunks per week (NLC) and number of learning activities per week
(NLA). The base model uses frequencies of uninterrupted learning
activities as a proxy for the time-on-task of MOOC students as was
done in the previous study. The main effect model includes two
additional duration-based explanatory variables, duration of all
learning chunks per week (TLC) and median of duration of learning
chunks per week (MedTLC), in order to investigate whether the
duration-based time-on-task variables proposed in this study can
improve the predictive power of logistic regression model. Finally,
the interaction model adds interaction terms to the main effect
model in order to examine the importance of interactions between
main effect explanatory variables. To investigate the effect of off-
task threshold value on the predictive power of logistic regres-
sion model, each model was fit to data sets pre-processed with
three different off-task activity threshold values frequently used in
educational data mining research (10, 30 and 60 min). Table 2
summarizes the resulting nine predictive models compared in
this study in terms of the off-task threshold value and the
explanatory variables.
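The structure of these nine models can be reproduced in outline with the statsmodels formula interface. The snippet below is only a sketch fitted to synthetic data: the column names, sample size and generating coefficients are invented for illustration (predictor scales loosely follow Table 1), and it is not the authors' actual code.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for one pre-processed data set (one row per student).
rng = np.random.default_rng(0)
n = 500
train_df = pd.DataFrame({
    "log_NLC": rng.normal(0.65, 0.28, n),
    "log_NLA": rng.normal(1.85, 0.46, n),
    "log_TLC": rng.normal(3.77, 0.61, n),
    "log_MedTLC": rng.normal(7.18, 1.14, n),
})
# Arbitrary outcome so the sketch runs end to end (not the paper's data).
p = 1 / (1 + np.exp(-(-5.5 + 2.0 * train_df["log_NLA"] + 1.0 * train_df["log_NLC"])))
train_df["certificate"] = rng.binomial(1, p)

formulas = {
    "base":        "certificate ~ log_NLC + log_NLA",
    "main_effect": "certificate ~ log_NLC + log_NLA + log_TLC + log_MedTLC",
    "interaction": ("certificate ~ log_NLC + log_NLA + log_TLC + log_MedTLC"
                    " + log_NLC:log_NLA + log_NLC:log_TLC + log_NLC:log_MedTLC"),
}
models = {name: smf.logit(f, data=train_df).fit(disp=0) for name, f in formulas.items()}
print(models["interaction"].summary())  # coefficient table analogous to Table 4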
3. Results
3.1. Effect of off-task threshold value on predictive power of logistic
regression model
In order to investigate whether an off-task activity threshold
value has an impact on the predictive power of logistic regression
model, data sets pre-processed with three different off-task activity
threshold values (10, 30 and 60 min) were divided into training and
test sets, each logistic regression model (base, main effect and
interaction) was fit to the three training sets, and the predictive
power of each model was evaluated on the three test sets. When
creating training and test sets, which consist of 80% and 20% of the
original data sets, stratified random sampling was used to ensure
that the ratio of positive (students who earned a course certificate)
to negative instances (students who did not earn a course certifi-
cate) in both sets is similar. Since the test sets were not used in
building logistic regression models, which were fit only to the
training sets, they can serve as future unseen data, providing an unbiased measure of the model’s predictive power.
Accuracy and Area Under the receiver operating characteristic
Curve (AUC) are the two most frequently used performance metrics when comparing the predictive power of binary classification models.
However, accuracy and AUC are not appropriate performance
measurements for the data set analyzed in this study because it is heavily imbalanced toward negative instances. Because only 434 students, out of the 4286 students who solved at least one problem, got a course certificate, the proportion of negative instances (students who did not earn a course certificate) is approximately 0.90 in both
training and test sets. Therefore, a simple model that always pre-
dicts everyone will not get a course certificate will achieve a 90%
accuracy. Despite the high accuracy, however, such a model is
useless because it does not provide any meaningful information
about who will get a course certificate. Likewise, AUC is not an
appropriate performance measure because it would give an overly
optimistic result based on the high percentage of correct classifi-
cations of the majority class. In order to address these issues, this
study used Area Under the Precision-Recall Curve (AUPRC) (Davis &
Goadrich, 2006) when comparing the predictive power of logistic
regression model.
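A hedged sketch of this evaluation protocol, using scikit-learn on placeholder data with roughly the same class imbalance, might look as follows; average_precision_score is used here as a standard estimate of AUPRC, which may differ slightly from the interpolation described in Davis and Goadrich (2006).

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import average_precision_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Placeholder data with roughly the paper's class imbalance (434 of 4286
# students earned a certificate); X stands in for the log-transformed predictors.
rng = np.random.default_rng(1)
X = rng.normal(size=(4286, 4))
y = rng.binomial(1, 434 / 4286, size=4286)

# 80/20 split with stratified sampling so the positive/negative ratio is
# similar in the training and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

scores = LogisticRegression(max_iter=1000).fit(X_train, y_train).predict_proba(X_test)[:, 1]
print("AUC  :", roc_auc_score(y_test, scores))
# average_precision_score estimates the area under the precision-recall curve (AUPRC).
print("AUPRC:", average_precision_score(y_test, scores))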
Table 3 compares AUC and AUPRC of nine logistic regression
models tested in this study. As explained above, AUC values do not
change much because of the large number of negative instances in the data.
Table 2
Nine logistic regression models compared in the study.
Model Type Off-task Threshold Explanatory Variables
M1 Base 10 min log(NLC), log(NLA)
M2 Base 30 min log(NLC), log(NLA)
M3 Base 60 min log(NLC), log(NLA)
M4 Main effect 10 min log(NLC), log(NLA), log(TLC), log(MedTLC)
M5 Main effect 30 min log(NLC), log(NLA), log(TLC), log(MedTLC)
M6 Main effect 60 min log(NLC), log(NLA), log(TLC), log(MedTLC)
M7 Interaction 10 min log(NLC), log(NLA), log(TLC), log(MedTLC), log(NLC):log(NLA), log(NLC):log(TLC), log(NLC):log(MedTLC)
M8 Interaction 30 min log(NLC), log(NLA), log(TLC), log(MedTLC), log(NLC):log(NLA), log(NLC):log(TLC), log(NLC):log(MedTLC)
M9 Interaction 60 min log(NLC), log(NLA), log(TLC), log(MedTLC), log(NLC):log(NLA), log(NLC):log(TLC), log(NLC):log(MedTLC)
Note. “:” in the explanatory variable name indicates an interaction between the corresponding main effect variables.
Table 4
Summary of logistic regression analysis for variables predicting students' success in getting a course certificate.
Explanatory Variable Estimate Standard Error p-value
log(NLC) 1.37 0.33 <0.0001*
log(NLA) 2.27 0.32 <0.0001*
log(TLC) 0.78 0.44 0.078
log(MedTLC) 0.62 0.24 0.011*
log(NLC):log(NLA) −1.45 0.25 <0.0001*
log(NLC):log(TLC) 0.30 0.24 0.21
log(NLC):log(MedTLC) 0.24 0.16 0.12
Note. *p < .05; ":" in the explanatory variable name indicates an interaction between the corresponding main effect variables.
All nine logistic regression models show a very similar
AUC value ranging from 0.922 to 0.938 regardless of the off-task
activity threshold value or the explanatory variables included in
the model. On the other hand, AUPRC values are comparable only in
the same model type (base, main effect and interaction). In addi-
tion, prediction models including more explanatory variables ach-
ieve a larger AUPRC value. On average, AUPRC of the main effect
models is about 32% larger than that of the base models, indicating
that the two duration-related time-on-task variables proposed in
this study (TLC and MedTLC) are important in discriminating stu-
dents who earned a course certificate from students who did not.
On the other hand, the difference between main effect and inter-
action models is smaller. On average, AUPRC of the interaction
models is approximately 4% larger than that of the main effect
models.
3.2. Relationship between uninterrupted time-on-task and
students’ success in MOOC
This study elects to examine the importance of explanatory
variables in the interaction model with 60-min threshold (M9) in
relation to the student’s success in getting a course certificate
because this model achieved the largest AUPRC value. The results
from logistic regression analysis suggest that the more learning
activities (NLA) and learning chunks (NLC) students have and the
longer their average learning chunk (MedTLC) is, the more likely
they will earn a course certificate; when log(NLA), log(NLC) and
log(MedTLC) increase by one standard deviation, the log odds of
students getting a course certificate increase by 2.27, 1.37 and 0.62,
respectively (see Table 4).
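To make these coefficients easier to read, the log-odds increases can be converted into multiplicative changes in the odds; the arithmetic below is an illustrative conversion, not a result reported in the paper.

import math

# Illustrative conversion of the log-odds increases in Table 4 into
# multiplicative changes in the odds of earning a certificate.
for name, beta in [("log(NLA)", 2.27), ("log(NLC)", 1.37), ("log(MedTLC)", 0.62)]:
    print(f"{name}: odds multiplied by about {math.exp(beta):.1f}")
# log(NLA) ~9.7x, log(NLC) ~3.9x, log(MedTLC) ~1.9x per one-SD increase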
Table 3
AUC and AUPRC of nine logistic regression models compared in this study.
Model AUC AUPRC
M1 0.925 0.547
M2 0.923 0.522
M3 0.922 0.517
M4 0.935 0.701
M5 0.931 0.696
M6 0.932 0.692
M7 0.936 0.704
M8 0.934 0.731
M9 0.938 0.739

Of three interactions, only the interaction between number of learning chunks per week and number of learning activities per week, log(NLC):log(NLA), is statistically significant. The large, negative value of the interaction coefficient indicates that students are less likely to get a course certificate if they are engaged in many learning activities in a large number of learning chunks. Rather, their chance of getting a course certificate will increase when they are engaged in many learning activities in fewer learning chunks. This is a sensible result because number of learning chunks is also proportional to the frequency of off-task activities and students are less likely to be engaged in meaningful learning when they are frequently distracted by other tasks.
4. Discussion and future works
Uninterrupted time-on-task is closely related to Self-Regulated
Learning (SRL) because it involves allocating time and effort to
improve learning performance. Students with higher SRL ability
would have longer time-on-task that is uninterrupted by off-task
activities than students who do not possess enough SRL ability.
Previous studies found that SRL is important for students to be
successful especially in the computer-based learning environment
where there is no instructor or peers who can guide and support
their learning processes (Puzziferro, 2008; Sun & Rueda, 2011).
These studies used survey instruments, such as Motivated Strate-
gies for Learning Questionnaire (MSLQ) (Duncan & McKeachie,
2005), which include questions measuring SRL ability of students
(e.g., I make good use of study time for this course). However, self-
reported survey instruments may not be able to estimate uninter-
rupted time-on-task accurately because they rely on students'
perceptions of their self-regulatory processes aggregated over
more than one learning activity (Zimmerman, 2008). One contri-
bution of this study is that it operationalized SRL ability of students
in terms of observable learning behaviors and examined how these
learning behaviors are correlated with the academic performance
of students by analyzing log files of a computer-based learning
environment.
This study found that students who did more learning activities
and had more and longer learning chunks per week were more
likely to get a course certificate, which is not a surprising result.
However, what is interesting is the interaction between number of
learning activities and number of learning chunks per week; the
likelihood of getting a course certificate increases when the same
number of learning activities occurred in fewer learning chunks.
Since most LMSs provide statistics on how many times students
accessed a specific learning resource in the course, number of
learning activities students are engaged in (e.g., number of clicks on
the Web page containing a specific instructional material) is often
used as a proxy for their time-on-task. The findings from this study
suggest that frequency of learning activities alone does not provide
enough information about how students self-regulate their
learning and their time-on-task. Rather, it is important to examine
how learning activities are grouped to form more meaningful
learning experiences as students are interacting with learning re-
sources in the computer-based learning environment.
In most MOOCs (and other learning environments), important
learning activities have a due date and time. For instance, students
taking 8.MReVx were required to complete all assigned learning
activities by Sunday at midnight each week. A recent study that
examined a problem completion rate of students enrolled in a
MOOC found that students who were able to successfully complete
the course started working on their weekly homework problems
very early (Lee, 2018), suggesting that uninterrupted time-on-
task of students may change over the one-week assignment cycle.
Thus, as future work, it would be meaningful to investigate
whether incorporating daily uninterrupted time-on-task into the
prediction model can improve its predictive power.
This study focused on the relationship between the amount of
uninterrupted time-on-task and students’ success in acquiring a
course certificate in the MOOC. Although getting a course certificate
is important, especially from the perspective of MOOC instructors
and providers, not all people take MOOCs to earn a certificate
(Kizilcec, Piech, & Schneider, 2013). Therefore, it would be impor-
tant to investigate how uninterrupted time-on-task is related to
other variables of success, such as participation in discussion or
problem solving in subsequent weeks. Similarly, it would be
interesting to compare uninterrupted time-on-task of students
having a different intention of enrollment.
In this study, the variables representing uninterrupted time-on-
task were averaged over weeks. As a result, these variables do not
capture how the uninterrupted time-on-task of MOOC students is
changing over the course of the semester. As future work, we plan
to extend the statistical model developed in this study into a
multilevel model in which regression coefficients vary by week. By
comparing logistic regression coefficients from each week, we will
be able to better understand how the effect of uninterrupted time-
on-task changes over time, and how it is related to the success of
students in the MOOC.
5. Limitations of study
Learning is a complex cognitive activity that can take many
different forms depending on the subject matter. Solving home-
work and quiz problems on the physics concepts covered each
week was the most important learning activity for students
enrolled in the MOOC examined in this study. Although problem
solving is not limited to mathematics and science, because any
higher-order thinking can be considered a problem solving activity
(Veresov, 2004, 2010; Vygotsky, 2012), solving mathematics or
science problems in the computer-based learning environment is
different from applying higher-order thinking skills in other subject
domains, such as literature or social studies, emphasizing different
pedagogies (e.g., socio-constructivism). First of all, mathematics or
science problems students are trying to solve in MOOCs almost
always have one correct answer whereas problem solving in other
subject domains may have more than one correct answer. Second,
while solving mathematics or science problems in MOOCs, students
are allowed to submit their answer only a few times, and their
incorrect answers are often penalized, which is usually not the case
in solving problems in other subject domains. In MOOCs on
mathematics or science, moreover, students are expected to solve
problems without getting external help because of the honor code
of MOOCs. On the other hand, students enrolled in MOOCs on
literature or social studies are expected to develop their under-
standing by engaging in meaningful discussions with their peers
and the instructor. Problem solving in this case must be studied in
connection with higher mental functions, such as abstract thinking,
logical memory and voluntary attention (Veresov, 2004, 2010;
Vygotsky, 2012), by carefully observing the entire learning pro-
cesses of students. Consequently, the findings from this study may
not be generalized to other subject domains in which collaboration
with other people is encouraged because different learning ac-
tivities and pedagogies would affect how students interact with
learning resources in MOOCs.
In this study, only 434 out of 12,981 registrants were able to get
a course certificate, which falls on the lower end of the range of
completion rates Jordan (2015) reported in her study. Since stu-
dents taking MOOCs with a higher completion rate may show
different learning behaviors, the findings from this study may not
be applicable to such MOOCs, either. It would be interesting to
conduct a similar study examining uninterrupted time-on-task of
students who are learning in the MOOC with a higher completion
rate.
In determining uninterrupted time-on-task, this study used off-
task activity threshold values frequently used in educational data
mining research (10, 30 and 60 min). Since the choice of these
values is not based on a strong theoretical foundation, care should
be taken when interpreting the findings from this study. Depending
on the nature of content knowledge being acquired and the peda-
gogies employed in the course, a different off-task activity threshold
value may yield the best predictive power for the model.
More in-depth replication studies in different subject domains are
warranted to better understand the effect of off-task activity
threshold values on the predictive power of the model incorpo-
rating uninterrupted time-on-task as a predictor variable.
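As a concrete illustration of the thresholding discussed above, the sketch below (hypothetical timestamps and function names, not the study's code) groups consecutive logged activities into one learning chunk whenever the gap between them stays under the chosen off-task threshold; the resulting chunk count and per-chunk durations correspond conceptually to NLC, TLC, and MedTLC.

```python
from datetime import datetime, timedelta
from statistics import median

def split_into_chunks(timestamps, threshold=timedelta(minutes=60)):
    """Group sorted activity timestamps into uninterrupted learning chunks."""
    chunks = []
    for ts in sorted(timestamps):
        if chunks and ts - chunks[-1][-1] <= threshold:
            chunks[-1].append(ts)          # gap below threshold: same chunk
        else:
            chunks.append([ts])            # gap too long (off-task): new chunk
    return chunks

events = [datetime(2014, 6, 1, 9, 0), datetime(2014, 6, 1, 9, 20),
          datetime(2014, 6, 1, 11, 0), datetime(2014, 6, 1, 11, 5)]
chunks = split_into_chunks(events, threshold=timedelta(minutes=30))
durations = [(c[-1] - c[0]).total_seconds() / 60 for c in chunks]
print(len(chunks), sum(durations), median(durations))  # NLC-, TLC-, and MedTLC-like values
```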
Acknowledgement
The author thanks Prof. David Pritchard at MIT for providing the
log files of 8.MReVx analyzed in this study.
References
Allen, G. J., Lerner, W. M., & Hinrichsen, J. J. (1972). Study behaviors and their relationships to test anxiety and academic performance. Psychological Reports, 30(2), 407–410. https://doi.org/10.2466/pr0.1972.30.2.407
Ba-Omar, H., & Petrounias, I. (2007). A framework for using Web usage mining to personalise e-learning. In Proceedings of the seventh IEEE international conference on advanced learning technologies (pp. 937–938). https://doi.org/10.1109/ICALT.2007.13
Breslow, L., Pritchard, D. E., DeBoer, J. D., Stump, G. S., Ho, A. D., & Seaton, D. T. (2013). Studying learning in the worldwide classroom: Research into edX's first MOOC. Research Practice in Assessment, 8, 13–25.
Carroll, J. B. (1963). A model of school learning. Teachers College Record, 64(8), 723–733.
Cho, M.-H., & Shen, D. (2013). Self-regulation in online learning. Distance Education, 34(3), 290–301. https://doi.org/10.1080/01587919.2013.835770
Crossley, S., Paquette, L., Dascalu, M., McNamara, D. S., & Baker, R. (2016). Combining click-stream data with NLP tools to better understand MOOC completion. In Proceedings of the sixth international conference on learning analytics & knowledge (pp. 6–14). https://doi.org/10.1145/2883851.2883931
Davis, J., & Goadrich, M. (2006). The relationship between Precision-Recall and ROC curves. In Proceedings of the 23rd international conference on machine learning (pp. 233–240). https://doi.org/10.1145/1143844.1143874
Duncan, T. G., & McKeachie, W. J. (2005). The making of the Motivated Strategies for Learning Questionnaire. Educational Psychologist, 40(2), 117–128. https://doi.org/10.1207/s15326985ep4002_6
Eriksson, T., Adawi, T., & Stöhr, C. (2016). “Time is the bottleneck”: A qualitative study exploring why learners drop out of MOOCs. Journal of Computing in Higher Education, 29(1), 133–146. https://doi.org/10.1007/s12528-016-9127-8
Goldberg, L. R., Bell, E., King, C., O'Mara, C., McInerney, F., Robinson, A., et al. (2015). Relationship between participants' level of education and engagement in their completion of the understanding dementia massive open online course. BMC Medical Education, 15(1), 60–67. https://doi.org/10.1186/s12909-015-0344-z
Goldhammer, F., Naumann, J., Stelter, A., Tóth, K., Rölke, H., & Klieme, E. (2014). The time on task effect in reading and problem solving is moderated by task difficulty and skill: Insights from a computer-based large-scale assessment. Journal of Educational Psychology, 106(3), 608–626. https://doi.org/10.1037/a0034716
Hone, K. S., & Said El, G. R. (2016). Exploring the factors affecting MOOC retention: A survey study. Computers & Education, 98(C), 157–168. https://doi.org/10.1016/j.compedu.2016.03.016
Ho, A. D., Reich, J., Nesterko, S. O., Seaton, D. T., Mullaney, T., Waldo, J., et al. (2014). HarvardX and MITx: The first year of open online courses, fall 2012-summer 2013. SSRN Electronic Journal. https://doi.org/10.2139/ssrn.2381263
Jordan, K. (2015). Massive open online course completion rates revisited: Assessment, length and attrition. The International Review of Research in Open and Distributed Learning, 16(3). https://doi.org/10.19173/irrodl.v16i3.2112
Kizilcec, R. F., & Halawa, S. (2015). Attrition and achievement gaps in online learning. In Proceedings of the second ACM conference on learning @ scale (pp. 57–66). https://doi.org/10.1145/2724660.2724680
Kizilcec, R. F., Piech, C., & Schneider, E. (2013). Deconstructing disengagement: Analyzing learner subpopulations in massive open online courses. In Proceedings of the third international conference on learning analytics and knowledge (pp. 170–179). https://doi.org/10.1145/2460296.2460330
Kovanović, V., Gašević, D., Dawson, S., Joksimović, S., Baker, R., & Hatala, M. (2015). Does time-on-task estimation matter? Implications for the validity of learning analytics findings. Journal of Learning Analytics, 2(3), 81–110. https://doi.org/10.18608/jla.2015.23.6
Krause, U.-M., Stark, R., & Mandl, H. (2009). The effects of cooperative learning and feedback on e-learning in statistics. Learning and Instruction, 19(2), 158–170. https://doi.org/10.1016/j.learninstruc.2008.03.003
Landers, R. N., & Landers, A. K. (2015). An empirical test of the theory of gamified learning: The effect of leaderboards on time-on-task and academic performance. Simulation & Gaming, 45(6), 769–785. https://doi.org/10.1177/1046878114563662
Lee, Y. (2018). Using self-organizing map and clustering to investigate problem-solving patterns in the massive open online course: An exploratory study. Journal of Educational Computing Research (in press). https://doi.org/10.1177/0735633117753364
Louw, J., Muller, J., & Tredoux, C. (2008). Time-on-task, technology and mathematics achievement. Evaluation and Program Planning, 31(1), 41–50. https://doi.org/10.1016/j.evalprogplan.2007.11.001
Lustria, M. L. A. (2007). Can interactivity make a difference? Effects of interactivity on the comprehension of and attitudes toward online health content. Journal of the American Society for Information Science and Technology, 58(6), 766–776. https://doi.org/10.1002/asi.20557
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for educators: A proof of concept. Computers & Education, 54(2), 588–599. https://doi.org/10.1016/j.compedu.2009.09.008
Munk, M., & Drlík, M. (2011). Impact of different pre-processing tasks on effective identification of users' behavioral patterns in Web-based educational system. Procedia Computer Science, 4, 1640–1649. https://doi.org/10.1016/j.procs.2011.04.177
Nawrot, I., & Doucet, A. (2014). Building engagement for MOOC students. In Proceedings of the 23rd international conference on world wide Web (pp. 1077–1082). https://doi.org/10.1145/2567948.2580054
Pursel, B. K., Zhang, L., Jablokow, K. W., Choi, G. W., & Velegol, D. (2017). Understanding MOOC students: Motivations and behaviours indicative of MOOC completion. Journal of Computer Assisted Learning, 32(3), 202–217. https://doi.org/10.1111/jcal.12131
Puzziferro, M. (2008). Online technologies self-efficacy and self-regulated learning as predictors of final grade and satisfaction in college-level online courses. American Journal of Distance Education, 22(2), 72–89. https://doi.org/10.1080/08923640802039024
Sun, J. C.-Y., & Rueda, R. (2011). Situational interest, computer self-efficacy and self-regulation: Their impact on student engagement in distance education. British Journal of Educational Technology, 43(2), 191–204. https://doi.org/10.1111/j.1467-8535.2010.01157.x
del Valle, R., & Duffy, T. M. (2007). Online learning: Learner characteristics and their approaches to managing learning. Instructional Science, 37(2), 129–149. https://doi.org/10.1007/s11251-007-9039-0
Veresov, N. (2004). Zone of proximal development (ZPD): The hidden dimension? In A. Ostern, & R. Helia-Ylikallio (Eds.), Languages as culture: Tensions in time and space (pp. 13–30).
Veresov, N. (2010). Introducing cultural historical theory: Main concepts and principles of genetic research methodology. Cultural-Historical Psychology, 2010(4), 83–90.
Vygotsky, L. (2012). Thoughts and language. Cambridge, MA: MIT Press.
Wagner, P., Schober, B., & Spiel, C. (2008). Time students spend working at home for school. Learning and Instruction, 18(4), 309–320. https://doi.org/10.1016/j.learninstruc.2007.03.002
Wagstaff, R., & Mahmoudi, H. (1976). Relation of study behaviors and employment to academic performance. Psychological Reports, 38(2), 380–382. https://doi.org/10.2466/pr0.1976.38.2.380
Wellman, G. S., & Marcinkiewicz, H. (2004). Online learning and time-on-task: Impact of proctored vs. un-proctored testing. Journal of Asynchronous Learning Networks, 8, 93–104.
Wise, A. F., Speer, J., Marbouti, F., & Hsiao, Y.-T. (2012). Broadening the notion of participation in online discussions: Examining patterns in learners' online listening behaviors. Instructional Science, 41(2), 323–343. https://doi.org/10.1007/s11251-012-9230-9
Zheng, S., Rosson, M. B., Shih, P. C., & Carroll, J. M. (2015). Understanding student motivation, behaviors and perceptions in MOOCs. In Proceedings of the 18th ACM conference on computer supported cooperative work & social computing (pp. 1882–1895). https://doi.org/10.1145/2675133.2675217
Zimmerman, B. J. (2008). Investigating self-regulation and motivation: Historical background, methodological developments, and future prospects. American Educational Research Journal, 45(1), 166–183. https://doi.org/10.3102/0002831207312909
International
Review of Research in Open and Distributed Learning
Volume 18, Number 2
April – 2017
Analysis of Time-on-Task, Behavior Experiences, and
Performance in Two Online Courses with Different
Authentic Learning Tasks
Sanghoon Park
University of South Florida
Abstract
This paper reports the findings of a comparative analysis of online learner behavioral interactions, time-
on-task, attendance, and performance at different points throughout a semester (beginning, during, and
end) based on two online courses: one course offering authentic discussion-based learning activities and
the other course offering authentic design/development-based learning activities. Web log data were
collected to determine the number of learner behavioral interactions with the Moodle learning management
system (LMS), the number of behavioral interactions with peers, the time-on-task for weekly tasks, and the
recorded attendance. Student performance on weekly tasks was also collected from the course data.
Behavioral interactions with the Moodle LMS included resource viewing activities and
uploading/downloading file activities. Behavioral interactions with peers included discussion postings,
discussion responses, and discussion viewing activities. A series of Mann-Whitney tests were conducted to
compare the two types of behavioral interactions between the two courses. Additionally, each student’s
behavioral interactions were visually presented to show the pattern of their interactions. The results
indicated that, at the beginning of the semester, students who were involved in authentic
design/development-based learning activities showed a significantly higher number of behavioral
interactions with the Moodle LMS than did students involved in authentic discussion-based learning
activities. However, in the middle of the semester, students engaged in authentic discussion-based learning
activities showed a significantly higher number of behavioral interactions with peers than did students
involved in authentic design/development-based learning activities. Additionally, students who were given
authentic design/development-based learning activities received higher performance scores both during
the semester and at the end of the semester and they showed overall higher performance scores than
students who were given authentic discussion-based learning activities. No differences were found between
the two groups with respect to time-on-task or attendance.
Keywords: authentic learning task, behavioral experience, online learning, Web log data, time-on-task
Introduction
The number of online courses has been growing rapidly across the nation in both K-12 and higher education.
According to the U.S. Department of Education’s National Center for Education Statistics (NCES),
approximately half of all K-12 school districts nationwide (55%) had students enrolled in at least one online
course (National Center for Education Statistics [NCES], 2011). In higher education, more than 7.1 million
students are taking at least one online course (Allen & Seaman, 2014). These numbers are projected to grow
exponentially as more universities are striving to meet the increasing demand for online courses. Online
courses are expected to provide formal learning opportunities at the higher education level using various
learning management platforms (Moller, Foshay, & Huett, 2008; Shea & Bidjerano, 2014; Wallace, 2010).
Consequently, E-learning systems, or learning management systems (LMSs), are being advanced to provide
students with high quality learning experiences and high quality educational services in their online courses
(Mahajan, Sodhi, & Mahajan, 2016).
Although the quality of an online learning experience can be defined and interpreted differently by the
various stakeholders involved, previous studies identified both time flexibility and authentic learning tasks
as two key factors affecting successful online learning. Time flexibility has been regarded as the most
appealing option for online learning (Romero & Barberà, 2011) as it allows online learners to determine the
duration, pace, and synchronicity of the learning activities (Arneberg et al., 2007; Collis, Vingerhoets, &
Moonen, 1997; Van den Brande, 1994). Recently, Romero and Barberà (2011) divided time flexibility into
two constructs, instructional time and learner time, and asserted the need for studies that consider the time
attributes of learners, such as time-on-task quality. Authentic tasks form the other aspect of successful
online learning. Based on the constructivist learning model, online students learn more effectively when
they are engaged in learning tasks that are relevant and/or authentic to them (Herrington, Oliver, & Reeves,
2006). Such tasks help learners develop authentic learning experiences through activities that emulate real-
life problems and take place in an authentic learning environment (Roblyer & Edwards, 2000). Authentic
learning activities can take many different forms and have been shown to provide many benefits for online
learners (Lebow & Wager, 1994). For example, authentic tasks offer the opportunity to examine the task
from different perspectives using a variety of available online resources. Additionally, authentic tasks can
be integrated across different subject areas to encourage diverse roles and engage expertise from various
interdisciplinary perspectives (Herrington et al., 2006). To maximize the benefits of authentic tasks,
Herrington, Oliver, and Reeves (2006) suggested a design model that involves three elements of authentic
learning: tasks, learners, and technologies. After exploring the respective roles of the learner, the task and
the technology, they concluded that synergy among these elements is a strong contributor to the success of
online learning. Therefore, online learning must be designed to incorporate authentic learning tasks that
are facilitated by, and can be completed using, multiple types of technologies (Parker, Maor, & Herrington,
2013).
In summary, both time flexibility and authentic learning tasks are important aspects of a successful online
learning experience. However, few studies have investigated how online students show different behavioral
interactions during time-on-task with different authentic learning tasks, although higher levels of online
activity have been found to be associated with better final grades and greater student satisfaction (Cho & Kim,
2013). Therefore, the purpose of this study was to compare behavioral interactions, time-on-task,
attendance, and performance between two online courses employing different types of authentic tasks.
Web Log Data Analysis
Web log data analysis or Web usage analysis is one of the most commonly used methods to analyze online
behaviors using electronic records of system-user interactions (Taksa, Spink, & Jansen, 2009). Web logs
are the collection of digital traces that provide valuable information about each individual learner’s actions
in an online course (Mahajan et al., 2016). Many recent LMSs, such as CANVAS, or the newly upgraded
LMSs, such as Blackboard or Moodle, offer various sets of Web log data in the form of learning analytics.
The data usually contain course log history, number of views for each page, number of comments,
punctuality of assignment submission, and other technology usage. Web log files also contain a list of user
actions that occurred during a certain period of time (Grace, Maheswari, & Nagamalai, 2011). This vast
amount of data allows instructors and researchers to find valuable information about learners’ online
course behaviors, such as how many times per day and how often students log in, how many times and how
often they post to discussion boards, how many students submit assignments on time, how much time they
spend on each learning task, etc. Web log data also provides personal information about online learners,
such as each student’s profile and achievement scores, and each student’s behavioral interaction data, such
as content viewing, discussion posting, assignment submission, writing, test taking, task performances, and
communications with peers/instructor (Mostow et al., 2005). The data can be presented in the form of
visualization to support students and instructors in the understanding of their learning/teaching
experiences. Therefore, the Web log analysis method offers a promising approach to better understand the
behavioral interactions of online learners at different points during the semester. Researchers can use Web
log data to describe or make inferences about learning events without intruding on the learning event or
directly eliciting data from online learners (Jansen, Taksa, & Spink, 2009). Although Web log
data is a source of valuable information for understanding online behaviors, researchers must
interpret the data with a fair amount of caution because Web log data
can be misleading. For example, an online student might appear to be online for a longer time than she/he
actually participated in a learning activity. Therefore, prior to conducting the Web log analysis, a researcher
needs to define the type of behavioral data to be analyzed based on the research questions and
articulate the situational and contextual factors of the log data. Using the timestamps showing when the
Web log was recorded, the researcher can observe behaviors at a certain point in time and
determine the validity of the online behavior (Jansen et al., 2009).
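The sketch below is a hedged illustration of this kind of timestamp check (the column names, file name, and 30-minute idle cutoff are assumptions for illustration, not details from this study): it sums the gaps between a user's consecutive logged actions while discarding long idle gaps, so that time spent merely logged in is not mistaken for time actually spent on a learning activity.

```python
import csv
from datetime import datetime

IDLE_CUTOFF_MIN = 30  # gaps longer than this are treated as idle time, not time-on-task

def apparent_active_minutes(rows):
    """Sum inter-action gaps per user, ignoring gaps longer than the idle cutoff."""
    per_user = {}
    for row in rows:
        per_user.setdefault(row["user"], []).append(datetime.fromisoformat(row["timestamp"]))
    active = {}
    for user, times in per_user.items():
        times.sort()
        gaps = [(b - a).total_seconds() / 60 for a, b in zip(times, times[1:])]
        active[user] = sum(g for g in gaps if g <= IDLE_CUTOFF_MIN)
    return active

with open("moodle_log.csv", newline="") as f:   # hypothetical export with user, timestamp, action columns
    print(apparent_active_minutes(list(csv.DictReader(f))))
```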
Behavioral Interactions
Previous studies have shown the benefits of analyzing Web log data to understand the online learning
behaviors of students. Hellwege, Gleadow, and McNaught (1996) conducted a study of the behavioral
patterns of online learners while studying a geology Web site and reported that learners show a pattern of
accessing the most recent lecture notes prior to accessing the Web site materials. Sheard, Albrecht, and
Butbul (2005) analyzed Web log files and found that knowing when students access various resources helps
instructors understand the students’ preferred learning patterns. While analyzing log data to investigate
learning effectiveness, Peled and Rashty (1999) found that the most popular online activities were, in
general, passive activities, such as retrieving information, rather than contributing activities. Dringus and
Ellis (2005) reported on how to analyze asynchronous discussion forum usage data to evaluate the progress
of a threaded discussion. Several recent studies showed a positive link between students’ online activities
and their final course grades. For example, Valsamidis and Democritus (2011) examined the relationship
between student activity level and student grades in an e-learning environment and found that the quality
of learning content is closely related to student grades. Also, Dawson, McWilliam, and Tan (2008) found
that when students spend more time in online activities and course discussions, they earned higher final
grades. Similarly, Macfadyen and Dawson (2010) reported that the number of message postings, email
correspondences, and completed assessments were positively correlated with students’ final course grades.
Most recently, Wei, Peng, and Chou's (2015) study showed positive correlations of the number of
discussion postings, reading material viewings, and logins with students' final exam scores. Although the
previous studies utilized Web log data to investigate the relationships between students’ behavioral
interactions and learning achievement, few studies examined how online students’ behavioral interactions
are different at different phases of online learning when involved in two types of authentic learning tasks.
Online Learning Experience
The overall online learning experience consists of continuous behavioral interactions that are generated
while completing a series of learning tasks (Park, 2015). Therefore, an examination of the nature of the
learning tasks and the influences of the learning tasks on student behaviors is needed. The examined short-
term learning experiences can then be combined to create a big picture of the online learning experience
within a course. According to Veletsianos and Doering (2010), the experience of online learners must be
studied throughout the semester due to the long-term nature of online learning programs. To analyze the
pattern of behavioral interactions, this study employed time and tasks as two analysis frames because both
time and tasks form essential dimensions of a behavioral experience, as shown in Figure 1. An online
learning experience begins at the starting point (first day of the course) and ends at the ending point (last
day of the course). In between those two points, a series of learning tasks are presented to provide learners
with diverse learning experiences. As the course continues, the learner continues to interact with learning
tasks and eventually accumulates learning experiences by completing the learning tasks (Park, 2016).
Students' learning experiences build on previous learning tasks because learning tasks are not
separated from each other, as shown in the spiral area in Figure 1. Hence, to analyze behavioral interactions
in online learning, both the type of learning tasks and the time-on-task must be analyzed simultaneously.
In this paper, the researcher gathered and utilized Web log data to visualize the behavioral interaction
patterns of online learners during the course of a semester and to compare the behavioral interactions
between two online courses requiring different types of learning tasks.
Figure 1. Online learning experience – time and tasks.
Research Questions
1. Do online learners' behavioral interactions with Moodle LMS differ between a course employing
authentic discussion-based learning tasks and a course employing authentic design/development-based
learning tasks?
2. Do online learners’ behavioral interactions with peers differ between a course employing authentic
discussion-based learning tasks and a course employing authentic design/development-based
learning tasks?
3. Does online learners’ time-on-task differ between a course employing authentic discussion-based
learning tasks and a course employing authentic design/development-based learning tasks?
4. Does online learners’ attendance differ between a course employing authentic discussion-based
learning tasks and a course employing authentic design/development-based learning tasks?
5. Does online learners’ academic performance differ between a course employing authentic
discussion-based learning tasks and a course employing authentic design/development-based
learning tasks?
Method
Setting
In this study, the researcher purposefully selected two online courses as units of analysis. The two courses
were selected because of the different learning approach that each course employed to design
authentic learning activities and the extent to which technology was used. Course A activities were designed
based on the constructivist learning approach, while Course B activities were designed based on the
constructionist learning approach. Both the constructionist approach and constructivist approach hold the
basic assumption that students build knowledge of their own and continuously reconstruct it through
personal experiences with their surrounding external realities. However, the constructionist approach is
different from the constructivist approach in that constructionist learning begins with a view of learning as
a construction of knowledge through constructing tangible projects or creating digital artifacts (Kafai, 2006;
Papert, 1991). The title of Course A was Program Evaluation, in which the major course activities consisted
of textbook reading, weekly online discussions, and a final evaluation plan proposal. Students enrolled in
this course were expected to read the textbook, participate in weekly discussion activities, and complete a
program evaluation plan. Course B was titled Instructional Multimedia Design/Development, which
consisted of a series of hands-on tasks to design and develop multimedia materials using different
multimedia authoring programs. Students were required to review related literature and tutorials on
multimedia design during the semester and to create audio-based, visual-based, and Web-based
multimedia materials through a series of hands-on activities. The comparison of course requirements, key
learning activities, authentic tasks, and technology use between the two online courses is presented in Table
1.
Table 1
Comparison of Course Requirements, Key Learning Activities, Authentic Tasks, and Technology use
Between Courses
Course* requirements
Course A: Textbook reading & online discussion
Course B: Multimedia design/development

Key learning activities
Course A: Program evaluation overview; document review, online discussion; textbook reading, article review, online discussion; quizzes; evaluation plan progress report; final evaluation plan
Course B: Audio-based learning module design/development; visual learning module design/development; personal Website development; instructional Web-based learning module design/development; usability testing report

Authentic tasks**
Course A: Students were guided to a real-world scenario and presented with contextualized data for weekly discussions. Discussion topics were ill-defined and open to multiple interpretations. Students were given a week for each discussion topic. Students were encouraged to use a variety of related documents and Web resources. Students were required to create a course outcome (a program evaluation plan proposal) that could be used in their own organization.
Course B: Students were guided to design and create instructional multimedia materials to solve a performance problem that they identified in their own fields. Students had to determine the scope of each multimedia project to solve the unique performance problems they had identified. Students were encouraged to try different multimedia programs and apply various design principles related to their own projects. Students were required to create a Web-based learning module that could be used as an intervention to solve the identified performance problem in their own organizations.

Technology use
Course A: Students utilized the following technology to share their ideas and insights via weekly discussions based on the constructivist learning approach: Moodle LMS, online discussions, Web resources, MS-Word, MS-PowerPoint
Course B: Students utilized the following technology to design and create instructional multimedia materials based on the constructionist learning approach: Moodle LMS, online discussions, Web resources, multimedia design programs, audio development tools, visual material development tools, instructional multimedia Web design tools
Note.
* A course in this study refers to a general online class that delivers a series of lessons and learning tasks
(online lectures, readings, assignments, quizzes, design and development activities, etc.).
** Authentic tasks were designed based on 10 characteristics of authentic activities/tasks defined by
Herrington et al. (2006).
Both courses were delivered via Moodle LMS and met the Quality Matters (QM) standards. Moodle is an
open-source LMS that helps educators create online learning courses. It has been used as a popular
alternative to proprietary commercial online learning solutions and is distributed free of charge under open
source licensing (Romero, Ventura, & Garcia, 2008). QM specifies a set of standards used to certify the
quality of an online course (www.qualitymatters.org). Both courses A and B in this study met the required
standards for high quality online course design after a rigorous review process by two certified QM
reviewers.
Participants
As two courses with different online learning tasks were purposefully selected, 22 graduate students who
were enrolled in two 8-week-long online courses participated in this study. Twelve students were enrolled
in Course A, and 10 students were enrolled in Course B. Excluding four students (two from each course who
dropped due to personal reasons), the data reported in this paper concern 18 participants, 10 students (4
male and 6 female) in Course A and 8 students (all female) in Course B, with a mean age of 32.60 years (SD
= 5.76) and 35.25 years (SD = 9.66), respectively. The average number of online courses the study
participants had taken previously was 11.40 (SD = 4.88) for Course A and 11.38 (SD = 12.28) for Course B,
indicating no significant difference between the two courses. However, it should be noted that the number
of students who had not previously taken more than 10 online courses was higher in Course B (five
participants) than in Course A (three participants). Fifteen of the 18 participants were teachers: five taught
elementary school, five taught middle school, and five taught high school. Of the remaining three
participants, one was an administrative assistant, one was a curriculum director, and one was an
instructional designer.
Figure 2. Example of Web log data screen.
Data
In this study, the researcher examined behavioral interactions by utilizing students’ Web log data acquired
from Moodle LMS used in this study (Figure 2). The obtained sets of data were significant for this study
because they contained timestamp-sequenced interaction activities that are automatically recorded for each
student with pre-determined activity categories such as view discussion, post discussion, view resources,
etc. Hence, it clearly showed the type of activities each student followed in order to complete a given online
learning task. The researcher also ensured the accuracy of data by following the process to decide the
validity of the online behavior (Jansen et al., 2009). First, the researcher clearly defined the type of
behavioral data to be analyzed based on the research questions (Table 2), and second, the researcher
articulated the situational and contextual factors of the log data by cross-examining the given online tasks
and recorded student activities. Lastly, the researcher checked the timestamps for each activity to confirm
the time and the length of data recorded. The data were then converted to Excel file format and computed
based on three semester phases for further analysis. These phases were phase 1 (beginning of the semester),
phase 2 (during the semester), and phase 3 (end of the semester). An example of a Web log data screen is
presented in Figure 2. The data show online learner behaviors in chronological order. Based on the type of
behavioral activities, the researcher identified two categories of behavioral interactions that affect task
completion: interactions with the Moodle LMS and interactions with peers. Table 2 presents the two
categories of behavior interactions and example behaviors for each category.
Table 2
Categories of Behavioral Interactions and Description
Categories of behavioral
interaction
Behavior description
(Operational definition)
Interactions with Moodle
LMS
Quiz taking
(# of times quiz participation – quiz completion and submission)
Resource viewing
(# of visits to the Resource page)
File uploading/downloading
(# of visits to files page – file uploading and file downloading)
Interactions with peers
Discussion viewing
(# of times discussion viewed)
Discussion posting
(# of times discussion posted – making initial posts)
Discussion responding
(# of times discussion responded – making comments or replies)
Among the identified behaviors, quiz taking was excluded from the analysis because it was a behavioral
interaction that only applied to Course A. Student attendance and time-on-task were collected from
recorded attendance data and each student’s weekly reported time-on-task. Weekly performance scores
were also collected from the course instructors and from the students with student permission. Due to the
different grading systems, task scores from the two courses were converted to a 1 (minimum) to 100
(maximum) scale and combined based on the corresponding weeks for each phase.
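The aggregation step described above could be implemented along the following lines; this is an illustrative sketch with assumed column names and file name rather than the study's actual processing script. It maps each logged action to one of the two behavioral-interaction categories, assigns weeks to the three phases, and counts occurrences per student. Weekly task scores could then be rescaled to a common 100-point range before being combined per phase.

```python
import pandas as pd

LMS_ACTIONS = {"resource viewing", "file uploading", "file downloading"}
PEER_ACTIONS = {"discussion viewing", "discussion posting", "discussion responding"}
PHASES = {2: "phase1", 3: "phase1", 4: "phase2", 5: "phase2", 6: "phase2", 7: "phase3", 8: "phase3"}

log = pd.read_csv("weblog.csv")    # assumed columns: student, week, action
log["category"] = log["action"].map(
    lambda a: "LMS" if a in LMS_ACTIONS else ("peer" if a in PEER_ACTIONS else None))
log["phase"] = log["week"].map(PHASES)

counts = (log.dropna(subset=["category"])
             .groupby(["student", "phase", "category"])
             .size()
             .unstack(fill_value=0))     # one row per student/phase, columns "LMS" and "peer"
print(counts.head())
```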
Results
Collected data were analyzed to answer each of the five research questions. Table 3 displays the descriptive
statistics of behavioral interactions with the Moodle LMS, behavioral interactions with peers, time-on-task,
attendance, and performance for each of the two courses.
A series of Mann-Whitney tests (Field, 2013), the non-parametric equivalent of the independent samples t-
test, were used to compare the two types of behavioral interactions, time-on-task, attendance, and
performance between the two courses. The Mann-Whitney test was selected for use in this study because
the data did not meet the requirements for a parametric test, and the Mann-Whitney test has the advantage
of being suitable for small samples of subjects (i.e., between five and 20 participants) (Nachar, 2008).
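As a brief illustration of the comparisons reported below (invented data, not the study's values), scipy's mannwhitneyu can be paired with the effect size r = z / sqrt(N) that accompanies each U statistic in this paper:

```python
import numpy as np
from scipy.stats import mannwhitneyu, norm

course_a = np.array([18, 22, 15, 24, 20, 17, 19, 23, 21, 16])   # e.g., phase-1 LMS interactions
course_b = np.array([30, 28, 41, 35, 27, 33, 25, 37])

u_stat, p_value = mannwhitneyu(course_a, course_b, alternative="two-sided")
z = norm.isf(p_value / 2)                        # z score implied by the two-sided p-value
r = z / np.sqrt(len(course_a) + len(course_b))   # effect size r = z / sqrt(N)
print(f"U = {u_stat:.1f}, p = {p_value:.3f}, r = {r:.2f}")
```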
RQ1: Do online learners’ behavioral interactions with the Moodle LMS differ between
a course employing authentic discussion-based learning tasks and a course employing
authentic design/development-based learning tasks?
The average number of behavioral interactions with the Moodle LMS between the two courses was
compared using the Mann-Whitney test. Among the three phases compared, the average number of Moodle
LMS interactions in phase 1 (weeks 2/3) was significantly different, as revealed in Figure 3. In phase 1, the
average number of Moodle LMS interactions in Course B (M = 32.00, Mdn = 31.50) was significantly higher
than the average number of Moodle LMS interactions in Course A (M = 19.60, Mdn = 20.00), U = 12.00, z
= – 2.50, p < .05, r = -.59, thus revealing a large effect size (Field, 2013). In phases 2 and 3, however, the
average number of Moodle LMS interactions was not significantly different between the two
courses.
Nonetheless, the total number of Moodle LMS interactions between the two courses was significantly
different as the total number of Moodle LMS interactions in Course B (M = 74.13, Mdn = 73.50) was
significantly higher than the total number of Moodle LMS interactions in Course A (M = 59.70, Mdn =
65.50), U = 17.50, z = – 2.01, p < .05, r = -.47, indicating a medium to large effect size.
Figure 3. Average number of behavioral interactions with the Moodle LMS for each phase of the semester
for two courses.
Table 3
Descriptive Statistics of Behavioral Interactions, Time, Attendance, and Performance
Values are presented as M (SD) for Course A (n = 10) and Course B (n = 8).

Behavioral interactions a
LMS interactions: Phase 1 (Weeks 2/3): Course A 19.60 (5.36), Course B 32.00 (9.37); Phase 2 (Weeks 4/5/6): Course A 23.10 (6.72), Course B 19.63 (7.15); Phase 3 (Weeks 7/8): Course A 17.00 (5.77), Course B 22.50 (12.09); All three phases: Course A 59.70 (10.46), Course B 74.13 (19.28)
Peer interactions: Phase 1: Course A 64.50 (24.04), Course B 86.75 (42.60); Phase 2: Course A 126.30 (57.88), Course B 58.25 (25.39); Phase 3: Course A 65.70 (41.17), Course B 48.25 (22.38); All three phases: Course A 256.50 (99.69), Course B 193.25 (69.77)
Attendance b: Phase 1: Course A 9.90 (2.99), Course B 10.75 (2.82); Phase 2: Course A 15.20 (3.05), Course B 13.50 (4.47); Phase 3: Course A 11.70 (2.21), Course B 9.75 (4.20); All three phases: Course A 36.80 (7.57), Course B 34.00 (9.15)
Time-on-task c: Phase 1: Course A 375.00 (53.59), Course B 700.63 (549.52); Phase 2: Course A 564.00 (155.27), Course B 1919.38 (1928.10); Phase 3: Course A 252.50 (140.34), Course B 360.00 (304.26); All three phases: Course A 1191.50 (314.80), Course B 2980.00 (2713.65)
Performance
Task score d: Phase 1: Course A 185.43 (11.02), Course B 195.00 (4.47); Phase 2: Course A 241.25 (29.71), Course B 283.59 (21.37); Phase 3: Course A 79.00 (15.23), Course B 96.43 (1.66); All three phases: Course A 505.68 (46.31), Course B 575.02 (26.03)
Note.
a Average number of interactions per phase
b Average number of course participations per phase (logins)
c Time presented in minutes
d Scores ranging from 0 (minimum) to 200 (maximum) in phase 1, from 0 (minimum) to 300 (maximum) in phase 2, and from 0 (minimum) to 100 (maximum) in phase 3
RQ2: Do online learners’ behavioral interactions with peers differ between a course
employing authentic discussion-based learning tasks and a course employing
authentic design/development-based learning tasks?
The average number of behavioral interactions with peers for the two courses was compared using the
Mann-Whitney test. Among the three phases, the average number of interactions with peers in phase 2
(weeks 4/5/6) was significantly different, as evidenced in Figure 4. In phase 2, the average number of peer
interactions in Course A (M = 126.30, Mdn = 111.50) was significantly higher than the average number of
peer interactions in Course B (M = 58.25, Mdn = 59.50), U = 7.00, z = – 2.93, p < .01, r = -.69, thus revealing
a large effect size. However, the average number of peer interactions was not significantly different between
the two courses in phases 1 and 3, nor was the total number of peer interactions between the two courses
significant.
Figure 4. Average number of behavioral interactions with peers for each phase of the semester for the two
courses.
The findings for both research questions 1 and 2 show the statistical comparisons of Moodle LMS
interactions and peer interactions between two online courses involving different types of authentic
learning tasks. To help understand the exact type of behavioral interactions and possible patterns, the
researcher visualized each student’s behavioral interaction pattern, as shown in Figures 5, 6, and 7. Each
category of students’ behavioral interactions was imported into an Excel spreadsheet with different color
themes (Figure 5).
Figure 5. Legend of color themes.
Figure 6. Behavioral interaction pattern for each individual student in Course A.
Figure 7. Behavioral interaction pattern for each individual student in Course B.
Blue colors represent a student’s course exploration activities, such as course viewing and other user
viewing. Brown colors represent a student’s interactions with the Moodle LMS, and green colors represent
a student’s interactions with peers. Each square in the pattern graph represents one occurrence of the case.
Each pattern line represents the total behavioral interactions that occurred in each week. Through visual
representations of behavioral interactions, different patterns were identified in the two courses. Most of the
students in Course A showed a ( ) shape of behavioral patterns, while students in Course B
showed a ( ) shape of behavioral patterns. In other words, students in Course A tended to
show more behavioral interactions as they moved toward the end of the semester, while students in Course B
showed higher behavioral interactions in the first week of the semester and also at the end of the semester.
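One way to produce a comparable pattern graph programmatically is sketched below; note this is a heatmap-style approximation (weekly counts per category for a single student, with invented numbers) rather than the per-occurrence colored squares built in Excel for Figures 6 and 7.

```python
import numpy as np
import matplotlib.pyplot as plt

categories = ["course exploration", "LMS interactions", "peer interactions"]
weeks = [f"Week {w}" for w in range(1, 9)]
counts = np.random.default_rng(1).integers(0, 15, size=(len(categories), len(weeks)))  # invented data

fig, ax = plt.subplots()
im = ax.imshow(counts, aspect="auto")   # one row per category, one column per week
ax.set_xticks(range(len(weeks)))
ax.set_xticklabels(weeks, rotation=45, ha="right")
ax.set_yticks(range(len(categories)))
ax.set_yticklabels(categories)
fig.colorbar(im, label="number of interactions")
plt.tight_layout()
plt.show()
```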
RQ3: Does online learners’ time-on-task differ between a course employing authentic
discussion-based learning tasks and a course employing authentic
design/development-based learning tasks?
Time-on-task for weekly authentic tasks for the two courses was compared using the Mann-Whitney test.
No significant differences were found in any of the three phases or for the entire semester (Figure 8).
Figure 8. Average time-on-task (in minutes) for each phase for the two courses.
RQ4: Does online learners’ attendance differ between a course employing authentic
discussion-based learning tasks and a course employing authentic
design/development-based learning tasks?
Attendance for weekly authentic tasks for the two courses was compared using the Mann-Whitney test. No
significant differences were found in any of the three phases or for the entire semester (Figure 9).
Figure 9. Average attendance for each phase for the two courses.
RQ5: Does online learners’ academic performance differ between a course employing
authentic discussion-based learning tasks and a course employing authentic
design/development-based learning tasks?
The average task score between the two courses was compared using the Mann-Whitney test. Among the
three phases compared, the average scores in phases 2 and 3 were significantly different, as displayed in
Figure 10. In phase 2, the average score in Course B (M = 283.59, Mdn = 290.00) was significantly higher
than the average score in Course A (M = 241.25, Mdn = 240.47), U = 9.00, z = -2.76, p < .01, r = -.65,
revealing a large effect size. In phase 3, the average score in Course B (M = 96.43, Mdn = 96.43) was
significantly higher than the average score in Course A (M = 79.00, Mdn = 85.00), U = 7.50, z = -2.94, p
< .01, r = -.69, indicating a large effect size. However, the task scores were not significantly different in
phase 1. The total task scores for the entire semester for the two courses were significantly different. The
total score for Course B (M = 575.02, Mdn = 585.43) was significantly higher than the total score for Course
A (M = 505.68, Mdn = 495.47), U = 8.00, z = -2.85, p < .01, r = -.67, indicating a large effect size.
Figure 10. Average score in each phase for the two courses.
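As a worked sketch of the statistics reported above, the snippet below runs a two-sided Mann-Whitney U test and then recovers an approximate z (via the normal approximation, without a tie correction) and the effect size r = z / sqrt(N). The score lists are placeholders, not the study's data, so the output will not reproduce the reported values.

import math
from scipy.stats import mannwhitneyu

def mann_whitney_with_r(group_a, group_b):
    n1, n2 = len(group_a), len(group_b)
    u, p = mannwhitneyu(group_a, group_b, alternative="two-sided")
    mu_u = n1 * n2 / 2                                 # mean of U under H0
    sigma_u = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)  # SD of U under H0
    z = (u - mu_u) / sigma_u                           # sign depends on group order
    r = z / math.sqrt(n1 + n2)                         # |r| >= .5 counts as a large effect
    return u, z, p, r

# Placeholder phase-2 task scores for the two courses (10 and 8 students).
course_a = [240, 238, 255, 230, 248, 236, 250, 245, 229, 242]
course_b = [290, 285, 278, 292, 281, 288, 276, 284]
u, z, p, r = mann_whitney_with_r(course_a, course_b)
print(f"U = {u:.2f}, z = {z:.2f}, p = {p:.3f}, r = {r:.2f}")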
In addition to the Mann-Whitney test comparisons, Spearman’s rank-order correlations were run to examine the weekly relationships among all 18 students’ behavioral interactions, time-on-task, and performance; the significant correlations are reported in Table 4.
Table 4
Significant Correlations Between Behavioral Interactions, Time-on-Task, and Performance

Week      Correlation                                   rs(16)   p value*
Week 2    Discussion viewing – Discussion response      .794     .000
Week 3    Discussion viewing – Discussion response      .639     .004
Week 4    Discussion viewing – Discussion posting       .742     .000
          Resource viewing – Discussion posting         .632     .005
          Resource viewing – Discussion viewing         .631     .005
Week 5    Discussion viewing – Discussion posting       .599     .009
          Resource viewing – Discussion posting         .792     .000
          Resource viewing – Discussion viewing         .703     .001
Week 6    File uploading – Score                        .650     .003
          File uploading – Discussion posting           .732     .001
Week 7/8  File uploading – Discussion posting           .622     .006
          Discussion viewing – Discussion response      .661     .003

Note. * All correlations are significant at the .01 level (2-tailed).
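For readers who want to reproduce this kind of weekly analysis, the short sketch below computes one rank-order correlation in the style of Table 4 with scipy. The per-student counts are illustrative placeholders, not the study's data; with the study's 18 students, the reported degrees of freedom are n - 2 = 16.

from scipy.stats import spearmanr

# Hypothetical weekly counts for 18 students (placeholders, not study data).
discussion_viewing  = [12, 5, 9, 14, 3, 8, 11, 7, 6, 10, 15, 4, 9, 13, 2, 8, 12, 6]
discussion_response = [6, 2, 4, 8, 1, 3, 5, 4, 2, 5, 9, 1, 4, 7, 0, 3, 6, 2]

rho, p = spearmanr(discussion_viewing, discussion_response)
print(f"rs(16) = {rho:.3f}, p = {p:.3f}")  # df = 18 - 2 = 16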
Although no overall significant correlations were found between time-on-task and behavioral interactions, or between performance scores and behavioral interactions, several noticeable patterns emerged among the behavioral interactions themselves. For example, during the first half of the semester, strong positive correlations were found between discussion viewing and discussion response/discussion posting
behaviors. Then, during the second half of the semester, resource viewing, discussion posting, and file uploading behaviors showed strong positive correlations. In week 6 especially, students who showed more file-uploading behavior together with discussion posting earned higher scores.
Discussion
Time flexibility and authentic tasks are two factors that affect the success of an online learning experience.
However, the type of behavioral interactions students exhibit at different points when they are involved in
different types of authentic tasks is not well understood. Accordingly, this study attempted to analyze and
visualize the behavioral interactions of online learners at different times during the semester and compare
the occurrences of these behavioral interactions in two online courses. The study found that online students exhibited different behavioral interactions when involved in two different types of authentic online learning activities. Students in authentic design/development-based learning activities demonstrated more
behavioral interactions with the Moodle LMS at the beginning of the course, whereas students in authentic
discussion-based learning activities exhibited more behavioral interactions with peers during the middle of
the semester. Overall, attendance and time-on-task did not differ between the two courses. Understanding
time flexibility as the capacity to spend time-on-task at different times of the day and week (Romero &
Barberà, 2011), the results indicate that students are likely to engage in behavioral interactions with the Moodle LMS early in the course when the given tasks involve authentic design/development learning activities.
This finding could be viewed from the perspective of student time management. In other words, students
in the design/development course tried to understand the scope of the design/development projects early
in the semester so they could plan the design/development of the multimedia materials for the semester.
This notion is supported by their attendance and time-on-task (Figures 8 and 9). Although students in
Course B did not actively participate in behavioral interactions with peers in the middle of the semester,
they attended the course regularly and devoted considerable time to the given tasks, comparable to students in Course A. Given the different behavioral interaction patterns found in the different authentic
online tasks, the findings support the importance of designing technological learning resources at different
points of the semester depending on the type of authentic learning tasks and on the needs of the student
(Swan & Shih, 2005).
Another important finding of this study is that, according to Spearman’s rank-order correlation coefficients, the correlations between student performance and each type of student behavioral interaction were not significant. This evidence raises the possibility that behavioral interactions act as an intermediate variable and suggests that more indicators must be examined to understand the factors affecting student performance in online learning. In fact, many online learning analytics focus on behavioral indicators rather than on the psychological aspects of learning, such as cognitive involvement, academic emotions, and motivation.
Therefore, we must seek ways to incorporate a different methodology to approach the online learning
experience in a holistic way. For example, the experience sampling method (ESM) combined with learning
analytics would be a good alternative method to analyze the multiple dimensions of the online learning
experience.
Conclusion
This study analyzed the behavioral interactions of online learners and compared the differences in
behavioral interactions for two online courses, each with different authentic learning tasks. The first course was designed from a constructivist approach and the second from a constructionist perspective, and the analysis showed that students in each course experienced different behavioral interactions during the semester. The findings imply that when designing an online course that involves
authentic learning tasks, instructional designers need to consider optimizing learners’ behavioral
interaction sequence to maximize their learning effectiveness. For example, interactions with peers should
be encouraged when designing an online course based on the constructivist belief (Lowes, Lin, & Kinghorn, 2015). Unlike previous studies, however, this study did not find a direct relationship between the
behavioral interactions, whether with the Moodle LMS or with peers, and performance scores. Previous studies, such as Davies and Graff (2005), also reported no relationship between discussion forum participation and final course grades. As discussed, behavioral interactions could be an intermediate variable affected by students’ cognitive involvement and motivation; thus, their psychological online learning experiences also need to be
considered when analyzing students’ Web log data. There are several limitations to this study. First, the
behavioral interaction data collected using Web logs are limited only to internal data stored in the Moodle
LMS server. External communication data, such as email correspondences or conference calls, were not
included in the data analysis. Second, although the study was conducted using two purposefully selected
courses to provide a rich description of the behavioral pattern for each individual student, future
researchers wanting to make generalizations about the findings of this study will need to increase the
number of participants. Third, this study only analyzed the behavioral patterns of online learners, and thus,
there is a need to examine how these behavior patterns are related to other learning experiences, such as cognitive processing and affective states. Such a holistic approach to understanding learning experiences will
help researchers obtain a more comprehensive picture of the interactions among the cognitive processes,
affective states, and behavioral patterns. With the meticulous analysis of the individual learner’s learning
experience, we can gain deeper insight into ways to design the optimal online learning experience.
References
Allen, I. E., & Seaman, J. (2014). Grade change: Tracking online education in the United States. Babson
Park, MA: Babson Survey Research Group and Quahog Research Group. Retrieved
from http://www.onlinelearningsurvey.com/reports/gradechange
Arneberg, P., Keegan, D., Guardia, L., Keegan, D., Lõssenko, J., Fernández Michels, P., & Zarka, D. (2007). Analyses of European mega providers of e-learning. Bekkestua: NKI Forlaget.
Cho, M. H., & Kim, B. J. (2013). Studentsʼ self-regulation for interaction with others in online learning
environments. The Internet and Higher Education, 17, 69–75. doi:
http://dx.doi.org/10.1016/j.iheduc.2012.11.001
Collis, B., Vingerhoets, J., & Moonen, J. (1997). Flexibility as a key construct in European training:
Experiences from the Tele Scopia project. British Journal of Educational Technology, 28(3),
199–217.
Davies, J., & Graff, M. (2005). Performance in e-learning: Online participation and student grades.
British Journal of Educational Technology, 35(4), 657–663.
Dawson, S., McWilliam, E., & Tan, J.P.L. (2008). Teaching smarter: How mining ICT data can inform and
improve learning and teaching practice. In Hello! Where are you in the landscape of educational
technology? Proceedings ascilite Melbourne 2008. Retrieved from
http://www.ascilite.org.au/conferences/melbourne08/procs/dawson
Dringus, L. P., & Ellis, T. (2005). Using data mining as a strategy for assessing asynchronous discussion
forums. Computers & Education, 45, 141-160. doi:10.1016/j.compedu.2004.05.003
Field, A. P. (2013). Discovering statistics using IBM SPSS Statistics (4th ed.). London: Sage Publications.
Grace, L. J., Maheswari, V., & Nagamalai, D. (2011). Web log data analysis and mining. Proceedings of Advanced computing: First international conference on computer science and information technology (pp. 459–469). Bangalore, India.
Hellwege, J., Gleadow, A., & McNaught, C. (1996). Paperless lectures on the web: An evaluation of the
educational outcomes of teaching Geology using the Web. Proceedings of 13th Annual Conference
of the Australian Society for Computers in Learning in Tertiary education (ASCILITE ’96), (pp.
289-299). Adelaide, Australia.
Herrington, J., Oliver, R., & Reeves, T.C. (2006). Authentic tasks online: A synergy among learner, task
and technology. Distance Education, 27(2), 233-248.
Jansen, B. J., Taksa, I., & Spink, A. (2009). Research and methodological foundations of transaction log analysis. In B. J. Jansen, A. Spink, & I. Taksa (Eds.), Handbook of research on Web log analysis. Hershey, PA: IGI Global.
Kafai, Y. B. (2006). Playing and making games for learning: Instructionist and constructionist perspectives for game studies. Games and Culture, 1(1), 36-40.
Lebow, D., & Wager, W. W. (1994). Authentic activity as a model for appropriate learning activity: Implications for emerging instructional technologies. Canadian Journal of Educational Communication, 23(3), 231-244.
Lowes, S., Lin, P., & Kinghorn, B. (2015). Exploring the link between online behaviours and course performance in asynchronous online high school courses. Journal of Learning Analytics, 2(2), 169–194.
Macfadyen, L. P., & Dawson, S. (2010). Mining LMS data to develop an “early warning system” for
educators: A proof of concept. Computers & Education, 54(2), 588–599.
Mahajan, R., Sodhi, J. S., & Mahajan, V. (2016). Usage patterns discovery from a Web log in an Indian e-
learning site: A case study. Education and Information Technologies, 21(1), 123-148. doi:
10.1007/s10639-014-9312-1
Moller, L., Foshay, W., & Huett, J. (2008). The evolution of distance education: Implications for instructional design on the potential of the web (Part 2: Higher education). TechTrends, 52(4), 66-70.
Mostow, J., Beck, J., Cen, H., Cuneo, A., Gouvea, E., & Heiner, C. (2005). An educational data mining tool
to browse tutor–student interactions: Time will tell! Proceedings of the workshop on educational
data mining, Pittsburgh, USA (pp. 15–22).
Nachar, N. M. (2008). The Mann-Whitney U: A test for assessing whether two independent samples come from the same distribution. Tutorials in Quantitative Methods for Psychology, 4(1), 13-20.
National Center for Education Statistics. (2011). Distance education courses for public elementary and secondary school students: 2009–10. Washington, DC: U.S. Department of Education, Government Printing Office. Retrieved from http://gsehd.gwu.edu/documents/gsehd/resources/gwuohs-onlineresources/reports/ies-nces_distanceeducationcourses-20092010
Park, S. (2015). Examining learning experience in two online courses using Web logs and experience
sampling method (ESM). In B. Hokanson, G. Clinton, & M. W. Tracey (Eds.), The design of
learning experience: Creating the future of educational technology. New York: Springer.
Park, S. (2016). Analyzing and comparing online learning experiences through micro-level analytics.
Journal of Educational Technology Development and Exchange, 8(2), 55-80.
Papert, S. (1991). Preface. In I. Harel & S. Papert (Eds.), Constructionism: Research reports and essays, 1985-1990 (p. 1). Norwood, NJ: Ablex Publishing.
Parker, J., Maor, D., & Herrington, J. (2013). Authentic online learning: Aligning learner needs, pedagogy
and technology. In Special issue: Teaching and learning in higher education. Issues In
Educational Research, 23(2), 227-241.
Peled, A., & Rashty, D. (1999). Logging for success: Advancing the use of WWW logs to improve computer-mediated distance learning. Journal of Educational Computing Research, 21(3).
Roblyer, M. D., & Edwards, J. (2000). Integrating educational technology into teaching (2nd Ed.). Upper
Saddle River, New Jersey: Prentice-Hall, Inc.
Romero, C., Ventura, S., & Garcia, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384.
Romero, M., & Barberà, E. (2011). Quality of e-learners’ time and learning performance beyond quantitative time-on-task. The International Review of Research in Open and Distance Learning, 12(5).
Shea, P., & Bidjerano, T. (2014). Does online learning impede degree completion? A national study of
community college students. Computers & Education, 75(2), 103-111.
Sheard, J., Albrecht, D., & Butbul, E. (2005). ViSION: Visualization student interactions online.
Proceedings of the Eleventh Australasian World Wide Web Conference (pp. 48–58).
Swan, K., & Shih, L. F. (2005). On the nature and development of social presence in online course
discussions. Journal of Asynchronous Learning Networks, 9, 115–136. Retrieved from
http://sloanconsortium.org/publications/jaln_main
Taksa, I., Spink, A., & Jansen, B. J. (2009). Web log analysis: Diversity of research methodologies. In B. J. Jansen, A. Spink, & I. Taksa (Eds.), Handbook of research on Web log analysis. Hershey, PA: IGI Global.
Valsamidis, S., & Democritus, S. K. (2011). E-learning platform usage analysis. Interdisciplinary Journal of E-Learning and Learning Objects, 7, 185–204.
Van den Brande, L. (1994). Flexible and distance learning. Chichester: John Wiley & Sons.
Veletsianos, G., & Doering, A. (2010). Long-term student experiences in a hybrid, open-ended and problem-based Adventure Learning program. Australasian Journal of Educational Technology, 26(2), 280-296.
Wallace, R. M. (2010). Online learning in higher education: A review of research on interactions among teachers and students. Education, Communication & Information, 3(2), 241–280. doi: 10.1080/14636310303143
Wei, H.C., Peng, C., & Chou, C. (2015). Can more interactivity improve learning achievement in an online
course? Effects of college students’ perception and actual use of a course-management system on
their learning achievement. Computers & Education, 83, 10–21.
Websites, Books, Independent Studies: 6 Journal Articles Summarized Here
APA Citation REQUIRED (Refer to APA Writer’s Manual, 6th ed.)
Fill out both areas for 1 Article on each page (6 pages)
Sample Citation in APA 6th edition:
Arbelo, F. (2016). Pre-entry doctoral admission variables and retention at a Hispanic Serving
Institution. International Journal of Doctoral Education, 11, 269 – 284.
http://www.informingscience.org/Publications/3545
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Selection
Explanation
Source: Primary or Secondary
Information Classification: (Self-contained study/ Research findings / Professional Association/ Unanalyzed Data / Compiled Statistics, etc.)
How and why is this information pertinent to your selected topic?
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Issues / Topics Covered
Author(s):
Research Question(s) addressed:
Research Subjects: (pre-K, 9th graders, elementary school students, etc.)
Research setting: (Public school, 3rd grade class, Charter school, adult learning center, etc.)
Methodology:
Findings:
Conclusions:
Special Circumstances/Limitations:
Future Implications:
2nd Article
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Selection
Explanation
Source: Primary or Secondary
Information Classification:
(Self-contained study/ Research findings / Professional Association/ Unanalyzed Data / Compiled Statistics, etc.)
How and why is this information pertinent to your selected topic?
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Issues / Topics Covered
Author(s):
Research Question(s) addressed:
Research Subjects: (pre-K, 9th graders, elementary school students, etc.)
Research setting: (Public school, 3rd grade class, Charter school, adult learning center, etc.)
Methodology:
Findings:
Conclusions:
Special Circumstances/Limitations:
Future Implications:
3rd Article
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Selection
Explanation
Source: Primary or Secondary
Information Classification:
(Self-contained study/ Research findings / Professional Association/ Unanalyzed Data / Compiled Statistics, etc.)
How and why is this information pertinent to your selected topic?
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Selection
Explanation
Issues / Topics Covered
Author(s):
Research Question(s) addressed:
Research Subjects: (pre-K, 9th graders, elementary school students, etc.)
Research setting:
(Public school, 3rd grade class, Charter school, adult learning center, etc.)
Methodology:
Findings:
Conclusions:
Special Circumstances/Limitations:
Future Implications:
How and why is this information pertinent to your selected topic?
4th Article
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Selection
Explanation
Source: Primary or Secondary
Information Classification:
(Self-contained study/ Research findings / Professional Association/ Unanalyzed Data / Compiled Statistics, etc.)
How and why is this information pertinent to your selected topic?
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Selection
Explanation
Issues / Topics Covered
Author(s):
Research Question(s) addressed:
Research Subjects: (pre-K, 9th graders, elementary school students, etc.)
Research setting:
(Public school, 3rd grade class, Charter school, adult learning center, etc.)
Methodology:
Findings:
Conclusions:
Special Circumstances/Limitations:
Future Implications:
How and why is this information pertinent to your selected topic?
5th Article
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Selection
Explanation
Source: Primary or Secondary
Information Classification:
(Self-contained study/ Research findings / Professional Association/ Unanalyzed Data / Compiled Statistics, etc.)
How and why is this information pertinent to your selected topic?
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Selection
Explanation
Issues / Topics Covered
Author(s):
Research Question(s) addressed:
Research Subjects: (pre-K, 9th graders, elementary school students, etc.)
Research setting:
(Public school, 3rd grade class, Charter school, adult learning center, etc.)
Methodology:
Findings:
Conclusions:
Special Circumstances/Limitations:
Future Implications:
How and why is this information pertinent to your selected topic?
6th Article
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Citation here:
Selection
Explanation
Source: Primary or Secondary
Information Classification:
(Self-contained study/ Research findings / Professional Association/ Unanalyzed Data / Compiled Statistics, etc.)
How and why is this information pertinent to your selected topic?
Academic Journal Articles:
APA Citation (Refer to APA Writer’s Manual, 6th ed.)
Selection
Explanation
Issues / Topics Covered
Author(s):
Research Question(s) addressed:
Research Subjects: (pre-K, 9th graders, elementary school students, etc.)
Research setting:
(Public school, 3rd grade class, Charter school, adult learning center, etc.)
Methodology:
Findings:
Conclusions:
Special Circumstances/Limitations:
Future Implications:
How and why is this information pertinent to your selected topic?