WK 2 CAEX: Assignment: Critical Reading and Researching Main Ideas
In this Week’s Discussion, you reviewed reading strategies and examined your own research process. Now you will put these concepts and insights into practice for this Week’s Assignment.
To prepare for this Assignment:
- Review the Learning Resource on Determining the Main Points of a Reading.
- Choose one of the preselected journal articles in this week’s Learning Resources.
- Applying the concepts from this week’s resources, critically read the journal article, taking notes or engaging in any research methods that you would like to try.
The Assignment:
Select three related main points from the journal article (excluding the Abstract), and directly quote them.
In addition to the directly quoted main points, compose 1–2 sentences of rationale for each main point. Use these sentences to explain the reason you selected each point from the journal article. Consider the following questions in your rationale:
- Why are these three specific points the author’s main ideas?
- What makes them stand out to you as a critical reader?
- How are these main points related to one another?
Resources to choose from are attached below.
Online class size, note reading, note writing and collaborative discourse
Mingzhu Qiu & Jim Hewitt & Clare Brett
Received: 25 March 2010 / Accepted: 14 June 2012 / Published online: 22 July 2012
© International Society of the Learning Sciences, Inc.; Springer Science+Business Media, LLC 2012
Abstract Researchers have long recognized class size as affecting students’ performance in
face-to-face contexts. However, few studies have examined the effects of class size on exact
reading and writing loads in online graduate-level courses. This mixed-methods study
examined relationships among class size, note reading, note writing, and collaborative
discourse by analyzing tracking logs from 25 graduate-level online courses (25 instructors
and 341 students) and interviews with 10 instructors and 12 graduate students. The quan-
titative and qualitative data analyses were designed to complement each other. The findings
from this study point to class size as a major factor affecting note reading and writing loads
in online graduate-level courses. Class size was found positively correlated with total
number of notes students and instructors read and wrote, but negatively correlated with
the percentage of notes students read, their note size and note grade level score. In larger
classes, participants were more likely to experience information overload and students were
more selective in reading notes. The data also suggest that the overload effects of large
classes can be minimized by dividing students into small groups for discussion purposes.
Interviewees felt that the use of small groups in large classes benefited their collaborative
discussions. Findings suggested 13 to 15 as an optimal class size. The paper concludes with
a list of pedagogical recommendations and suggestions for new multimedia software
features to enhance collaborative learning in online classes.
Keywords Class size · Note reading · Note writing · Collaborative discourse · Mixed methods study
Computer-Supported Collaborative Learning (2012) 7:423–442
DOI 10.1007/s11412-012-9151-2
M. Qiu (*) : J. Hewitt : C. Brett
Department of Curriculum, Teaching, and Learning, Ontario Institute for Studies in Education,
University of Toronto, Toronto, Canada
e-mail: mingzhu.qiu@utoronto.ca
J. Hewitt
e-mail: jim.hewitt@utoronto.ca
C. Brett
e-mail: clare.brett@utoronto.ca
The study discussed here¹ examined the relationship between class size and note reading loads, note writing loads, and collaborative discussions in online graduate-level courses at a Canadian institute, using the Web Knowledge Forum (WebKF) software. Specifically, it investigated three questions: “How
do different class sizes affect students’ and instructors’ participation in note reading and note
writing?” “What are students’ and instructors’ opinions about note reading and writing loads
related to class sizes?” “How do students and instructors make sense of online cooperation
and collaboration across different class sizes?” The findings from this study point to class
size as a major factor affecting note reading and writing loads in online graduate-level
courses. Although the specific findings of this study are not individually surprising to people
experienced with CSCL instruction, the discussion of their implications may contain a
perspective that could usefully be made available to the CSCL research and practitioner
community.
Class size has long been recognized as a factor affecting students’ achievement in face-to-
face instructional contexts, but has been little investigated in online courses. Some research
has shown that online class size certainly has important effects on information overload in
computer conferencing courses (Hewitt and Brett 2007; Lipponen and Lallimo 2004).
However, few studies have examined the effects of online class size on exact note reading
and writing loads and collaborative discourse, especially with mixed methods.
In face-to-face courses, students learn by attending class, listening to the instructors’
lectures and participating in discussions with classmates. They contribute by talking to share
ideas and opinions. In online courses, discussions are still primarily text-based. As a basic
precondition, online learners have to read the messages, ask questions, comment on mes-
sages, and answer questions (Hron and Friedrich 2003). Students read instructors’ and classmates’ notes, and contribute by writing their own notes. Because note reading and
writing are fundamental online activities (Davie 1988), we can analyze these operations to
investigate how much students “listen” (read notes), and how much students contribute
(write notes) in their online discussions. More importantly, we can investigate how class size
correlates with students’ and instructors’ note reading and writing practices and their
perspectives. However, “online teaching should not be expected to generate larger revenues
by means of larger class sizes at the expense of effective instructional or faculty over-
subscription” (Tomei 2006, p. 531). Online education will continue to shape the way some
people learn in the 21st century (Wuensch et al. 2008). While e-learning systems have
improved with time, they still have some problems that need to be resolved in order to
achieve a truly stimulating and realistic learning experience (Monahan et al. 2008).
Class size and challenges in online learning
There is a growing tendency for instructors who previously taught face-to-face classes to
teach online despite insufficient knowledge of online teaching. For example, Moore and
Kearsley (1996) found that some “distance education courses were developed and delivered
in a very piece-meal and unplanned fashion” (p. 6); a similar situation still exists. The
present study’s literature review found no set principles or detailed guidance for
instructors and students about how to cope with different situations and workloads
in different sizes of online classes. Educators need to build pedagogy or instructional
strategies to enhance the online educational experience for instructors and students alike (Xu and Morris 2007).
¹ The study is discussed in detail in Qiu (2009), on which this article is based.
Crucial to the success of online learning is active student participation and interaction with
both peers and instructors (Sutton 2001). A common approach to encourage student participa-
tion is some overt reward or punishment system (Masters and Oberprieler 2004). However,
such systems also create an authority structure which has a large impact on subsequent learning
and collaborative learning activities (Hubscher-Younger and Narayanan 2003), and may not be
effective in some online situations. For example, Bender (2003) found that one commonly reported feeling in computer-mediated communication is a sense of being overwhelmed, brought on by a large
class size. Potentially, according to Hewitt and Brett (2007), the perception of information
overload could have a number of negative consequences, such as heightened student anxiety,
which can interfere with the amount of attention that participants dedicate to online learning.
This leaves shy students, especially those who lack confidence or withdraw upon rejection of
their initial ideas, with little chance to participate in discussions, a situation which may lead to
depersonalization and deindividuation (Bordia 1997). Hewitt et al. (2007) also found that CMC
students habitually engaged in practices like scanning, skimming, or reading new notes, and
that larger classes had higher “scanning” rates due to an increased information load.
To overcome such problems, Hron and Friedrich (2003) argue, appropriate class sizes
should be set in order to ensure for each class a minimum critical mass for participation
without overload, to reach the goals associated with collaborative learning, and to make it
easier to establish social presence and encourage greater interactivity (Aragon 2003). Studies
of class size for online courses should examine the optimum class size for quality education
and establish a discussion-board size that allows meaningful discourse (Frey and Wojnar
2004). Optimal class sizes “must be sufficiently large to encourage activity, but not so large
that the sense of group connectedness is lost” (Colwell and Jenks 2004, p. 7).
Online conferencing usually takes more time (Clouder et al. 2006), and a major challenge
in online learning settings is how to structure asynchronous online discussions in order to
engage students in meaningful discourse (Gilbert and Dabbagh 2005). Educational research-
ers need to find technologies which best contribute to making collaborative online learning
effective (Xu and Morris 2007). Hutchinson (2008) suggests that “the more effective
deployment of existing technologies may be part of the solution” (p. 357). The majority of
online education systems are still mainly text-based (Wuensch et al. 2008) with insufficient
features to allow effective, interactive discourse. Dohn (2009) studied some discrepancies
that lead to theoretical tensions and practical challenges when Web 2.0 practices are utilized
for educational purposes. In addition, advanced multimedia applications, such as graphs,
audio, and video are not much used, though some experts have suggested a movement “from
e-learning to m-learning” using streaming synchronous audio and video technologies (e.g.,
Keegan 2002).
Constructivism, knowledge building, cooperation, collaboration and class size
Social constructivism, knowledge building, cooperative learning, and collaborative learning
theories support the idea that students can learn from each other. They believe that expla-
nation leads to deeper understanding and stress that the goal for students is to build
knowledge and negotiate meaning in a learning community. How people learn is strongly
influenced by social context, which in turn is the product of the interaction of individual
differences (Bransford et al. 1999). Knowledge building can be considered as deep con-
structivism that involves making a collective inquiry into a specific topic, and coming to a
deeper understanding through interactive questioning, dialogue, and continuing improve-
ment of ideas. When learners are effectively motivated and actively try to achieve their
learning goals, deeper levels of thinking and learning are promoted (Scardamalia and
Bereiter 1994). This notion is consistent with Bruner’s (1986) observation that learning is
an active social process. Studies on teaching from a Vygotskian perspective (1978) empha-
size creating more advanced social learning opportunities for students. Boettcher (1999)
states that knowledge has the best chance of flourishing in an environment that is rich,
supportive, encouraging, and enthusiastic.
Cohen (1994) stresses that cooperative learning can stimulate the development of higher-
order thinking skills and that cooperative groups are particularly beneficial “in developing
harmonious interracial relations in desegregated classrooms” (p. 17). Students who receive individual feedback about their cooperative group mates noticeably increase their cooperation rate in
comparison to those receiving no feedback (Kimmerle and Cress 2008). However, cooper-
ative groups differ from collaborative groups; the former tend to have a “divide and
conquer” mentality, where the group divides the work into chunks that can be done
independently (Graham and Misanchuk 2004). By contrast, collaboration involves the
mutual engagement of participants in a coordinated effort to solve the problem together
(Roschelle and Teasley 1995).
The commonsense starting point in Computer-Supported Collaborative Learning is that
learning is social in nature (Jones et al. 2006). Collaboration is especially important in online
learning (Pena 2004), where the learners tend to be isolated without the usual social support
systems found in on-campus or classroom-based instruction. Since the purpose of collabo-
rative groups is to achieve consensus and shared classroom authority (Bruffee 1999),
individual accountability becomes central to ensuring that all the participants in the group
develop by learning collaboratively (Hutchinson 2008). In classrooms that adopt a collab-
orative approach, the basic challenge shifts from learning in the conventional sense to the
construction of collective knowledge (Scardamalia and Bereiter 2006; 2003). Hakkaranen
(2009) argued that “knowledge advancement is not just about putting students’ ideas into the
centre but depends on corresponding transformation of social practices of working with
knowledge.” (p. 213) With collaborative learning, the control of learning is turned over to
the students and the learning environment is student-centric. Learning takes place in a
meaningful, authentic context and is a social, collaborative activity, in which peers play an
important role in providing encouragement (Neo 2003). In order to establish and maintain an online
learning community, the learning environment needs to be effectively designed to provide
students with opportunities to practice collaboration, critical thinking, and teamwork skills
that are increasingly valuable in the information age (Kerka 1996). Though its benefits are
widely known, collaborative learning remains rarely practiced, particularly at the university
level (Roberts 2004).
Proper online instructional strategies could guide meaningful online discussion between
or among peers who co-construct knowledge, allowing learners to share and refine meaning
with peers in a social context (Tao and Gunstone 1999). Some writers (e.g., Weigel 2002)
have argued that combining traditional courses with online collaboration represents a
significant step forward in higher education. Laurillard (2008) argued that “New technolo-
gies invariably excite a creative explosion of new ideas for ways of doing teaching and
learning, although the technologies themselves are rarely designed with teaching and
learning in mind.” (p. 5) Online technology enables the transfer of content and feedback
(Neo 2003). Properly deployed, the technology can support and enhance learning, the
acquisition of knowledge, and the development of intellectual analysis and skills in the
information age (Collins and Halverson 2009), rather than serving merely as an added
medium for transmitting information. It can be very productive to marry appropriate
instructional strategies with online technology (Ingram and Hathorn 2004).
Researchers have proposed a number of different optimal sizes for online classes. Based
on their own online teaching experience, Aragon (2003) proposed 30 as an upper limit on
class size. This matches Bi’s (2000) suggestion that to optimize and allow for effective
feedback, fewer than 30 students should be enrolled in each class. Roberts and Hopewell
(2003) suggested that faculty keep the size of the class to 20 students, to allow for more
“workable” loads. This size is manageable without overwhelming instructors or minimizing their effectiveness. Rovai (2002) argued that to guarantee effective online engagement
and interactions, 8–10 students were required. However, in general, students in smaller
classes tended to learn more (Glass and Smith 1979).
Method
Creswell (2005) states that “Mixed methods designs are procedures for collecting, analyzing,
and linking both quantitative and qualitative data in a single study or in a multiphase series
of studies” (p. 53). He points out that all research methods have limitations and that, in mixed-methods research, the biases inherent in any single method can be neutralized or cancelled by the biases of other methods. Morse (2003) argues that the major strength of mixed methods
research is that it allows research to develop as “comprehensively and completely as
possible” (p. 189). In other words, the fundamental principle of mixed method research is
to collect multiple sets of data using different research methods in such a way that the
resulting mixture or combination has complementary strengths and non-overlapping weak-
nesses (Johnson and Christensen 2004). Results from one method can help develop or
inform the other method (Greene et al. 1989) and provide insight into different levels or
units of analysis (Tashakkori and Teddlie 2003). Mixed methods help researchers develop a
fuller understanding of the issues under investigation.
This study adopted a mixed methods design, using results from quantitative data analyses
and from qualitative interviews. Specifically, it used a mixed methods design in order to: (1)
develop stronger claims to test the hypothesis that different class sizes do affect note reading
and note writing; (2) examine the research questions from multiple perspectives, thus
providing greater diversity of positions and values; (3) understand online graduate-level
discussion loads more insightfully; and (4) develop more comprehensive, more complete,
and more enriched portraits of online graduate level discourse.
This study adopted purposeful criteria (Strauss and Corbin 1998) for selecting both quan-
titative and qualitative samples with maximum variation in the sampling of interview partic-
ipants, taking into account the notion that participants must have experience (Morgan et al.
1998) of online group discussions in different sizes of classes. The samples for both quantitative
and qualitative data analyses were drawn from one Canadian institute, because of its diversity of
graduate online courses, its history of online education, its experienced faculty members and the
software (Web Knowledge Forum) used for threaded online discussions. Many studies suffer
from high attrition or otherwise wind up using statistical analyses with inadequate sample sizes
(Schoech 2000), which violate the underlying assumptions of the statistical methods. Here, the
sample for the quantitative analyses in this study was made larger than those for most
quantitative computer-mediated communication studies described in the literature (Schoech
2000). This study analyzed tracking logs from 25 graduate-level online courses (from fall 2003
to summer 2004) that used the Web Knowledge Forum software (25 instructors and 341 students) and
semi-structured interviews with 10 instructors and 12 graduate students who had diverse
backgrounds and extensive online teaching and learning experience. The actual class sizes in
this study ranged from 6 to 22 for the quantitative data and from 6 to 25 for the interviews.
The quantitative and qualitative data analyses were designed to complement each
other. In the quantitative data analysis, a number of issues central to ensuring
maximum statistical power in the study were considered in order to minimize the
risk of Type II errors and to sufficiently protect against Type I errors with a
significance level of at least .05. We used two-tailed tests in the analysis, which
meant we required a larger sample in order to maximize the study’s power. The
sample size—341 students and 25 instructors in 25 courses—was large enough to
produce effective statistical power. First we conducted data cleaning and checking to
ensure the quality of the dataset. The descriptive statistical analyses compared means,
standard deviations, maximum, and minimum values of variables from the 25 course
datasets concerning note reading and note writing. We employed Pearson correlation, one-way ANOVA, t-tests, ANCOVA, and multiple regression analyses.
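The study’s own analysis scripts are not published; purely as an illustration of the kind of two-tailed Pearson correlation described above, the Python sketch below correlates class size with the average number of notes students read, using a subset of the per-class values reported in Table 1 of the Appendix (scipy’s pearsonr returns a two-sided p-value by default).

```python
# Illustrative sketch only: the study's own analysis code is not published.
# The values below are a subset of the per-class figures reported in Table 1 (Appendix):
# whole-class size and the average number of notes each student read.
from scipy.stats import pearsonr

class_size     = [6, 8, 10, 11, 16, 17, 19, 20, 22]
avg_notes_read = [271.20, 273.29, 478.56, 276.40, 571.20, 703.63, 775.56, 781.42, 783.48]

# Two-tailed Pearson correlation, the kind of test used in the paper's quantitative analysis
r, p = pearsonr(class_size, avg_notes_read)
print(f"r = {r:.3f}, p = {p:.4f}")  # positive r: students in larger classes read more notes in total
```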
The qualitative data analysis followed the principles and practices that Tesch (1990)
identified for grounded theory. As Denzin and Lincoln (2005) pointed out, “Grounded
theory is probably the most widely employed interpretive strategy in the social sciences
today” (p. 204). Following Tesch’s principles, the inductive analysis of the qualitative data
started with the sorting of transcripts and developing a coding scheme and a description
using a sample transcript. This was followed by the coding and typology development of
themes. Interview data analysis moved from a detailed, fine-grained analysis of the data
(open coding) towards successively more general categories (axial coding), themes, and
theories (selective coding). Memoing and diagramming began with initial analysis and
continued throughout the research process.
Comparisons of results from both quantitative and qualitative methods were carried
out at every stage of the cross-track analysis procedure. Verifications of the analyses
were planned and conducted with all possible methods (e.g., triangulation, negative
case analysis, peer review, member checks, and external audits) in order to guarantee
reliability and validity.
Results
Class size and note reading
Both quantitative and qualitative data analyses suggested that class size plays a pivotal role
in supporting or impeding note reading. Statistical analyses (see Table 1 in Appendix) found
that class size was positively correlated with the total number of notes students read (from
330 to 900 notes; r = 0.777, p < 0.001). As class size increased, students read significantly more notes. However, class size was negatively correlated with the percentage of notes students read (from 90 % to 49 %; r = −0.801, p < 0.01); they read a significantly smaller proportion of the notes as class size increased. As class size increased, instructors also read significantly more notes (from 320 to 1,300 notes; r = 0.902, p < 0.001). However, the percentage of notes they read was not significantly correlated with class size (with an average of 82 %). (See Figs. 1 and 2.)
In interviews, problems reported in small classes were slow discussions, not enough
information to read and less diversity of ideas. In large classes, both instructors and students
often encountered information overload. Student interviewees knew that graduate students
were expected to read a lot and have deeper discussions. However, in online graduate
courses, the reading load comprises articles plus notes. If the students were not reading
others’ notes, they were not participating and not learning, especially because they had to
read a substantial number of messages before they could contribute their own. As class size
increased, most students in large classes started to feel that there was always “a lot to read”.
When the number of notes that students were meant to read increased beyond a certain point,
the percentage of notes they actually read declined, mainly because of information overload.
They reported that information overload was mainly caused by increased numbers of
students; so students in larger classes were particularly vulnerable to information overload.
When they logged on and saw all those unread notes, they sometimes became disheartened.
They felt that they could not read so many messages closely. Besides, students did not all
have the same amount of time to deal with their course work; an excessive reading load was
particularly difficult for those students who had full-time jobs or had to log on later in the
week. The students in the study admitted that they used a variety of compensatory strategies
to cope with overload: selective reading (by topic or author), scanning through messages
quickly, skimming some messages, skipping reading some messages completely, or simply
ignoring large numbers of messages. The consequences were significant: If students were
not closely attending to each other’s notes in large classes, they might miss important
information and collaborative learning might not be realized, contrary to some instructors’
intention of putting all students in one large class so that they could be exposed to more
information. The findings also implied that letting students choose which notes they wanted
to read was not an ideal strategy. For example, students could select notes by reading the
note titles only. In such a case, they still might miss important information in notes with less
attractive titles.
Fig. 1 Correlation between class size and total notes each student read (x-axis: class size; y-axis: average number of notes students read). The colors on the figures represent classes of small, large, and large with subgroups.
Fig. 2 Correlation between class size and percentage of notes students read (x-axis: class size; y-axis: percentage of notes students read). The colors on the figures represent classes of small, large, and large with subgroups.
Class size and note writing
The main learning for online students comes not only from reading other people’s notes but
also from having to construct their own ideas in their own notes. Writing is essential for
learning, even more so than reading, as Instructor 3 stated. Generally speaking, a larger
number of notes is supposed to further students’ understanding of the discussion and provide
information and knowledge for the target learning. It also indicates active learning in the
class. The findings suggest that class size may have played a key role in the quantity and
quality of instructors’ and students’ note writing (see Tables 2 and 3). Increased class size was positively correlated with a larger total number of notes written in a class, with a larger average number of notes written per student (from 50 to 80 notes; r = 0.498, p < 0.01) and per instructor (from 12 to 461 notes; r = 0.554, p < 0.01), and with higher note Flesch-Kincaid Reading Ease Scores by students (r = 0.517, p < 0.01). Yet larger class size correlated negatively with students’ note sizes (r = −0.613, p < 0.001) and students’ note Flesch-Kincaid Grade Level Scores (r = −0.555, p < 0.01), but not with instructors’. Thus, class size relates not only to overall note quantity but also to students’ note length and writing style. As class size increased, only students tended to write shorter notes with simpler vocabulary (see Figs. 3, 4, 5, and 6).
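The Flesch-Kincaid Reading Ease and Grade Level measures used above are standard readability formulas; the paper does not say how they were extracted from the forum notes, so the sketch below is only a generic illustration of the published formulas, with a crude vowel-group syllable heuristic.

```python
# Generic illustration of the standard Flesch-Kincaid formulas; the paper does not
# describe its own extraction tool, and the syllable heuristic here is approximate.
import re

def count_syllables(word):
    # Rough heuristic: count groups of consecutive vowels (minimum one syllable per word).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid(text):
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / sentences
    syllables_per_word = syllables / len(words)
    reading_ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    grade_level = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return reading_ease, grade_level

ease, grade = flesch_kincaid("Shorter notes with simpler words score as easier to read.")
print(f"Reading Ease = {ease:.1f}, Grade Level = {grade:.1f}")
```

Under these formulas, shorter sentences and fewer syllables per word raise the Reading Ease score and lower the Grade Level score, which is consistent with the pattern the study reports for students' notes in larger classes.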
The reason is unclear: one possibility, as some interviewees stated, is that students only
had a certain amount of time to read and write notes. When they were facing information
overload, they had less time to think about using more academic words and writing longer
notes. They chose simpler vocabulary and wrote shorter notes in order to keep the dialogue going. Several students reported that when they “were competing” for participation marks in a larger class, they paid more attention to their number of notes and chose easier ways to convey their ideas than writing longer notes with more academic phrasing; one student participant explained the same trade-off in an interview. The statistical analyses showed that larger class sizes meant more total notes and hence more notes to respond to. The results revealed that a student in a class of less
than 10 students would write approximately 50 notes on average, while a student in a class
of more than 16 wrote close to 80. More students produced more topics, and more topics
might inspire more notes. Competition to establish students’ status in the large classes was
also reported to have encouraged more note-writing. Instructors, accordingly, also wrote
more notes as the number of students increased in a class. However, the note size, the Flesch-
Kincaid Reading Ease Score and Grade Level Score of instructors’ notes did not change
significantly as class size increased. Consequently, when class size increased, it influenced
students’ note writing behaviors more. A large number of classmates appeared to “force”
students to write shorter notes to save time and to “beat” their classmates in number of notes
for participation marks. With limited time spent on a larger number of notes, note quality declined.
Fig. 3 Correlation between class size and total notes by a student (x-axis: class size; y-axis: total notes a student wrote). The colors on the figures represent classes of small, large, and large with subgroups.
To some extent, it is believed that the more notes that the students write, the more
productive the class discussion will be and the more the students will learn. In the small
classes in this study, sometimes less information was produced and the discussion tended to
slow down, especially when instructors did not participate actively. Thus, instructors’
participation became even more important in small classes. Strategies instructors adopted
to encourage note writing and keep the class discussion going might not always work as
intended. From the interviews, most instructors said that they had a participation requirement
—usually 2 to 3 notes per week. However, some students said that they tried to exceed the
minimum requirement for postings only in order to secure a good participation mark. Such
note-writing for quantity might reduce the quality of the notes, which then did not contribute
much knowledge to the learning community but added to information overload. Information
overload was also reported to be correlated with unsuitable content and lengthy notes, because both added to the time it took to read a note. Discussions were arguably helped by shorter and to-
the-point notes. Long rambling notes tended to lose readers and confuse the discourse.
Especially in larger classes, some students reported that when they opened a lengthy note
with copy-and-paste contents, an off-topic note, or a note like a mini-essay, they tended to
skim it without really reading it carefully or else skip it entirely.
Instructors’ presence and facilitation affected how students interacted. The findings sug-
gested that frequency of instructors’ note writing was associated with students’ note-writing
activities.
Fig. 4 Correlation between class size and average note size by students (x-axis: class size; y-axis: average note size in words). The colors on the figures represent classes of small, large, and large with subgroups.
Fig. 5 Correlation between class size and note Reading Ease Score by students (x-axis: class size; y-axis: Reading Ease Score). The colors on the figures represent classes of small, large, and large with subgroups.
Instructors often found it hard to draw a line between participating too much and
not enough. Students perceived instructors’ not writing “enough” notes as “absence”. It
tended to discourage students’ note writing and even stop the discourse. Some students
complained that their instructors ‘disappear’ this way, especially in smaller classes or
subgroups, even though the instructors were actually reading the students’ notes; the
instructors just did not respond as much. That perception was another reason for instructors
to write more in small classes. Otherwise, the discussion tended to slow down or stop due to
the lack of stimuli and the students’ perception that the instructor was neglectful. Students
felt that instructors, in addition to reading notes or facilitating the discussion, should “teach”
by writing a proper number of notes to “lead the discussion” instead of just giving answers to
questions or not participating. But it could also be a problem if instructors were “too active”
in writing. Some instructors felt that very active note writing (e.g., answering most ques-
tions) was perceived as their “dominating the discussion”. If instructors did dominate
discussions, the students tended to respond to their instructors more than to their peers,
thereby losing opportunities to collaborate with their peers, especially in larger classes, and
perhaps even halting the discussion. Instructors found different ways to participate in
discussions by writing notes. For example, some wrote comment notes, bridged ideas by
writing convergent notes, summarized at the end of a session, or guided students to take over
and summarize the discussions. Instructors’ summary notes were welcomed because they
helped students get a whole picture of the issues under discussion.
The study also found that note-writing assessments could powerfully encourage and
guide students’ note-writing activities, affecting how students interact. Some instruc-
tors assessed students’ participation by requiring a certain number of notes (usually
two to three) weekly, though some students did not feel comfortable at “being forced
to write”. Some instructors counted the total number of notes students wrote and gave
a specific mark for that. However, any quota system sometimes produced excess note
writing to gain participation marks, with a concomitant decline in quality and meaning.
In contrast, some instructors assessed note writing by quality, monitoring the content
of students’ notes. These instructors valued notes into which students had put a lot of
thought and which advanced the discussion. This study suggested that setting require-
ments for high-quality notes would help in reducing information overload, particularly
in larger classes. Nevertheless, most students felt that standards for high-quality notes
were not as objective as judging by number of notes, and often involved unclear
requirements or rubrics. To avoid bias, most of the instructors assessed students’ note
writing by both quantity and quality, with a rubric heavily oriented toward quality. This method appeared to be more effective. However, this study found that most instructors’ assessment of note writing had not taken class size into consideration.
Fig. 6 Correlation between class size and note Grade Level Score by students (x-axis: class size; y-axis: Grade Level Score). The colors on the figures represent classes of small, large, and large with subgroups.
Discussions
Using mixed methods helped this study to arrive at an essential finding: that different sizes
of classes led to different reading and writing loads for students and instructors alike.
The students’ and instructors’ feedback and opinions are essential and pertinent. Both
students and instructors felt that a class of eight or fewer would not have enough stimuli,
perspectives or interaction for a proper discussion, while a class of 18 or more, at least for a
graduate-level course, would make a single conversation difficult and would become
overwhelming and less manageable for both students and instructors. Apparently, the
participants’ ideal, manageable class size would be about 13 to 15. This size allows students
to have a good sense of their peers and to read and respond to other participants’ contribu-
tions, while maintaining enough stimuli and diversity. For some small classes in this study,
information was limited to about 360 notes on average plus course reading materials. However,
the knowledge that students gain from such courses is restricted to the background knowl-
edge of the limited number of members. The students felt that having peers from varied
backgrounds would contribute to more diverse discussions and learning experiences. They
favored being exposed to more ideas than would have been possible with a more homoge-
neous small learning community.
However, complaints about information overload came mainly from larger classes,
especially those with whole-class discussion setups. In the study, students in large
classes had workloads of reading more than 1,700 notes on average plus course reading materials. As a result, students complained that it was impossible for them to
digest the huge amount of information in large classes. Some of them felt lost in the
crowd. Thus, most students reported that they had frustrating and exhausting learning
experiences in large whole-class discussions. Students would welcome the design of
subgroup discussions embedded in large classes, because it allows them more inter-
actions with their peers and an escape from mass, large whole-class discussions. They
felt less frustration with more intimate, more focused discourse in small groups, in
which they could experience the formation of a sense of an online learning commu-
nity among the members.
This study found that students’ learning experiences varied with instructors’ online
teaching experiences and strategies in different sizes of classes. Small whole-class
discussions worked well and received positive feedback from students, according to one instructor who has taught only small classes in her 5 years of online teaching experience and consequently can maintain the strategy of whole-class discussions.
One new instructor has whole-class discussions in her large online class and is
distressed that there are more dropouts than in her face-to-face classes. She has never
thought of utilizing the subgroup strategy, because she does not have solid informa-
tion about the different workloads in different sizes of classes. She plans to use large
whole-class discussions again in her next online course. She says she has noticed that
her one-on-one note responding practice in large whole-class discussions has weak-
ened student participation. She also noticed that in her large class students tend to
have fewer opportunities to “talk” with their peers or to initiate discussions. Three
instructors use the large whole-class discussion strategy for its benefits of diversity.
Computer-Supported Collaborative Learning 433
These three instructors usually have large classes. Their strategy was to let students
choose which notes to read or respond to. Two of them had not thought of dividing
students into subgroups, while one felt that subgroup discussions might limit students’
exposure to diverse ideas. Students in large classes like theirs complained about
information overload more. Five out of the 10 instructors interviewed use the sub-
group strategy to reduce information overload in large classes and to provide students
with small intimate learning environments. Before the interviews, all of these five
instructors had taught online graduate-level courses with different class sizes for more
than 9 years; among them are pioneers in online teaching at the institute and in the
world. On the basis of their years of online teaching experiences, when they have
small classes, they usually adopt a whole-class discussion format and participate more
actively as a member in the class. When they have large classes, they usually
introduce the class members and course contents in whole-class settings. Later, for
certain weeks they divide students into subgroups, aiming to promote focused, in-
depth discussions. The subgroups’ insights are reported back to benefit large whole-
class discussions. To preserve the advantages of diversity in large classes, these
instructors rotate the students through different subgroups and make the subgroup
discussions public to the whole class. When assigning students to subgroups, they
group or mix students with different skills, professions, genders, and characters. They
allow students to choose subgroups on the basis of topics, contents or interests. Their
students appreciated the strategies these instructors used to deal with reading and
writing loads in different sizes of classes, reporting that their learning experiences
were thereby made more satisfactory.
Recommendations
The study arrived at a list of pedagogical recommendations, suggestions for new software features, and a call for applying multiple educational theories that may help remedy problems relating to class size in online courses.
1. Pre-informing the Participants. Using orientation video or audio clips and detailed rubrics to pre-inform students of the likely reading and writing loads in different sizes of classes may help students prepare for reading and writing notes. It may also provide students with an initial understanding of the expectations. Tutorials seem necessary to provide instructors and students with information about possible problems due to different class sizes.
2. Providing Proper Guidance. This study found that instructors’ presence and facilitation affect students’ note reading and writing. Instructors’ pre-structuring of discussions can significantly increase the number of times students challenge each other. Proper instructor participation may reduce students’ anxiety about being left to continue the discussion on their own, especially in subgroups. “Supervision behind the scenes” needs to become “visible” to let students know that instructors are reading their notes.
3. Assigning Appropriate Workloads. Both the quantitative and qualitative data analyses suggest that instructors’ expectations for students’ participation need to be adjusted to fit different class sizes in order to achieve effective collaborative discourse. This study suggests that the required number of notes should be higher in small classes than in large ones in order to guarantee participation and class energy. Notes in small classes can be expected to be better-quality and longer. It may be more satisfactory to assess note writing by both quantity and quality, with an emphasis on quality. Requiring high-quality notes may reduce information overload and achieve better discussions. Standards should set out how to write “good” notes with proper length and “come-to-the-point” contents.
4. Segmenting the Semester. Instructors can segment the semester to achieve different goals and to meet different needs by combining whole-class and subgroup discussions to manage discourse, to reduce information overload in large classes, and to bring insights back to the whole class.
5. Utilizing Multimedia Technologies. Large class size and text-only communication create heavy reading and writing loads. It can be helpful to use multimedia (e.g., audio, video, graphics, or even animation) to introduce the course and the weekly discussion topics and to get to know the class members, especially in large classes, in order to humanize the learning environment.
6. Creating Coherent Environments. Findings from this study suggest that a class of 13 to 15 graduate students is an ideal size. Instructors may need strategies to manage classes smaller or larger than the ideal size in order to achieve collaborative discourse. In small classes, keeping all the students in one group may increase participant accountability and encourage participation, thus compensating for the lack of information and supporting a coherent learning environment. In larger classes, dividing students into subgroups during certain weeks appears an effective strategy for creating opportunities for coherent discussion environments.
7. Enhancing Individual Learning. Individual learners care more about what they can learn from a course and what they can apply in their future work. An ideal class size is one that serves the purpose of supporting individual learning. The quantity and quality of note reading and writing should be designed to benefit individual learners who have different interests as well as to allow learning in subgroups. Requiring students to write a certain number of notes based on course reading materials may create a collection of ideas that leads to cooperative and/or collaborative discussions. Asking students to write convergent notes can lead students to read notes in related discussions. Assigning students to summarize subgroup discussions will help individual students gain an overall view of the discourse. Appointing students as discussion leaders in subgroups may help them learn better through leading.
8. Creating New Software Features. The heavy text-based reading and writing loads in the large classes in this study may be reduced by creating functions that use audio and video technologies or by creating links to “invite” existing computer-based multimedia technologies, such as webinars, to enhance social presence. It would be helpful to create functions that allow students to choose which notes to read: for instance, searching (by keywords or topics), browsing (for notes in other groups), checking (note length), marking (important convergent or summary notes), filtering (by topics), tailoring (references or quoted contents), and linking (convergent notes). A minimal sketch of such note-triage functions follows this list.
9. Applying Multiple Theories. Online learning is a complex learning process. Existing theories supporting and guiding online education tend to direct online work and learning from their own individual perspectives. However, instructors who follow a single theory, hoping that it will solve all the problems they encounter, might find it difficult to explain some issues arising in their online classes. Holistic application of several theories could balance out the biases of any single theory.
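Recommendation 8 names its note-triage functions only in the abstract; as a loose illustration of what such a helper might look like, not a description of Web Knowledge Forum’s actual features, the Python sketch below filters notes by keyword and length. Every name in it is hypothetical.

```python
# Hypothetical illustration of the note-triage functions sketched in Recommendation 8.
# Nothing here describes Web Knowledge Forum's real interface; all names are invented.
from dataclasses import dataclass

@dataclass
class Note:
    author: str
    title: str
    body: str
    word_count: int

def select_notes(notes, keyword=None, max_words=None):
    """Keep notes that mention a keyword and/or stay under a length cap."""
    selected = []
    for note in notes:
        text = f"{note.title} {note.body}".lower()
        if keyword is not None and keyword.lower() not in text:
            continue
        if max_words is not None and note.word_count > max_words:
            continue
        selected.append(note)
    return selected

notes = [
    Note("Student A", "Convergent note on class size", "Summary of our subgroup discussion ...", 180),
    Note("Student B", "Copied mini-essay", "A long pasted excerpt on an unrelated topic ...", 950),
]
print([n.title for n in select_notes(notes, keyword="class size", max_words=300)])
```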
Conclusions
The findings from this study point to class size as a major factor affecting note reading and writing loads in online classes. However, it is not necessarily true that smaller classes
have better class discussions and larger classes have worse ones. Both optimal class size and
effective organizational strategies, such as appropriate group configuration, contribute to
more interactive and productive online conferencing.
When the class size is too small, students may not have access to sufficient information;
the instructor’s participation usually determines whether a small-class discussion will be
successful or not. As class size increases, note reading load for both students and instructors
increases greatly. When class size increases beyond an optimal size, information overload
may “kick in” and students’ complaints arise. Instructors’ note-reading activities in
larger classes are not readily visible; therefore, some students think that their
instructors often are not participating in discussions, especially in subgroup discus-
sions. Instructors’ responding to notes appropriately often seems to encourage stu-
dents’ note writing.
As class size increases, note-writing load increases accordingly. Both students and
instructors tend to write more notes of shorter length and with fewer academic words.
Discussions become more like dialogues. However, assessment of note writing has an
impact on quantity and quality of student note-writing behaviors.
Different class sizes played an important role in students’ learning experiences and the
amount of information the students learned. Instructors’ teaching experiences in different sizes
of classes lead to their developing different strategies to cope with different class situations,
which then may affect students’ learning experiences. This study found that splitting larger
classes into subgroups serves as a strategy to reduce information overload and to encourage
focused, in-depth small group discussions. Finally, the study found that class size and group
configuration affect how collaborative the online discourse becomes: larger classes tend to
be more cooperative and less collaborative.
The findings from this study may have implications for both practitioners and
researchers. They could serve as a base for researchers to further explore the issue
of class size and seek optimal patterns of group configuration to achieve more fruitful
online conferencing. Nevertheless, a number of concerns suggest a variety of addi-
tional questions for further research. There is a need to clarify the definition and
processes of effective online collaboration in order to support productive whole class
and subgroup discussions. Another area requiring further research is the exploration of other potential technologies, especially existing multimedia, to reduce text-only communication and to support collaborative
online discussions. Further research is recommended to look at the issue in a macro
context by inviting more samples from other institutes globally as well as more micro
studies of single classes and subgroups. Studies are needed to compare text-only online collaborative discourse with discourse utilizing multimedia technologies.
Many online courses intended as collaborative learning environments are not
effective due to the failure to consider class size and note reading and writing loads.
Some experienced online instructors do utilize effective strategies but keep these
stored in their own mental “attics” rather than broadcasting them to benefit other
online instructors and students. As a result, some online students and instructors,
especially new ones, tend to participate in discussions mechanically without noticing
that some of the problems they encounter may be caused by class size and by the note reading and writing loads inherent in purely text-based online communication. We need to take class size into consideration and place more emphasis on effective student learning with appropriate strategies. Any instructor who is blind to this point may pay a heavy price: their students’ dissatisfaction with, or even failure in, online learning.
Many factors affect the success of online graduate-level discourse; class size is
only one of them. This study does not aim to provide final answers to some questions
or define recipes for instructional design. Rather, it opens up a suggestive window by
pointing out practices and opinions from some representative participants. It is to be
hoped that it contributes in some modest measure to the future understanding and support of effective online learning, and that its fundamental conclusions hold true
not only for online courses in the institute examined but also for online courses in
many other institutes.
Appendixes
Table 1 Percentage of notes read, average number of notes read, or total number of notes read by a
participant, a student, or an instructor in the 25 courses
Whole Class Students Instructors
ID Size All Notes Size % Avg. Size % Total
1 6 325 5 83.45 271.20 1 72.62 236
2 8 344 7 79.44 273.29 1 81.10 279
3 8 298 7 83.94 250.14 1 86.58 258
4 8 727 7 75.14 546.29 1 42.78 311
5 8 247 7 75.94 187.57 1 87.85 217
6 9 462 8 85.90 396.88 1 86.80 401
7 9 456 8 71.35 325.38 1 73.68 336
8 10 679 9 70.48 478.56 1 74.96 509
9 11 307 10 90.03 276.40 1 87.95 270
10 11 388 10 80.08 310.70 1 98.20 381
11 16 1,284 15 44.49 571.20 1 72.51 931
12 16 1,148 15 74.86 859.33 1 85.28 979
13 17 1,240 16 56.74 703.63 1 85.97 1,066
14 17 2,155 16 62.02 1336.62 1 63.16 1,361
15 17 1,885 16 66.73 1257.94 1 82.33 1,552
16 17 1,171 16 49.16 575.69 1 86.25 1,010
17 18 1,614 17 56.78 916.41 1 73.61 1,188
18 19 1,146 18 67.68 775.56 1 91.36 1,047
19 19 1,128 18 57.83 652.33 1 76.42 862
20 19 1,993 18 58.54 1166.78 1 86.35 1,721
21 20 1,308 19 59.74 781.42 1 87.39 1,143
22 20 1,597 19 54.26 866.53 1 94.55 1,510
23 20 2,194 19 57.74 1266.89 1 89.11 1,955
24 21 1,525 20 57.06 870.10 1 93.84 1,431
25 22 1,404 21 55.80 783.48 1 96.51 1,355
ID = Class ID. Size = Total number of participants, students, or instructors in a class. All Notes = All notes written in a class. % = Percentage of the average number of notes all participants, students, or instructors read in each class. Avg. = Average number of notes all participants or students read in each class. Total = All notes instructors read in a class.
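As a check on how the legend’s percentages relate to the other columns (a reader’s verification, not a calculation reported by the authors), the Class 1 row reproduces as: students’ % = Avg. / All Notes × 100 = 271.20 / 325 × 100 ≈ 83.45 %, and the instructor’s % = 236 / 325 × 100 ≈ 72.62 %.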
Table 2 Percentage of notes written, average number of notes written or total notes written by all participants,
students, or instructors in 25 courses
Whole Class Students Instructors
ID Size Total Avg. Size % Total Avg. Size % Total
1 6 325 54.17 5 74.15 241 29.00 1 25.85 84
2 8 344 43.00 7 81.40 280 40.00 1 18.60 64
3 8 298 37.25 7 88.93 265 37.86 1 11.07 33
4 8 727 90.88 7 91.20 663 94.71 1 8.80 64
5 8 247 30.88 7 82.19 203 29.00 1 17.81 44
6 9 462 51.33 8 89.39 413 51.63 1 10.61 49
7 9 456 50.67 8 76.32 348 43.50 1 23.68 108
8 10 679 67.90 9 83.80 569 63.22 1 16.20 110
9 11 307 27.91 10 83.39 256 25.60 1 16.61 51
10 11 388 35.27 10 96.91 376 37.60 1 3.09 12
11 16 1,284 80.25 15 91.04 1,169 77.93 1 8.96 115
12 16 1,148 71.75 15 88.24 1,013 67.53 1 11.76 135
13 17 1,240 72.94 16 84.44 1,047 65.44 1 15.56 193
14 17 2,155 126.76 16 93.50 2,015 125.94 1 6.50 140
15 17 1,885 110.88 16 89.50 1,683 105.44 1 10.50 198
16 17 1,171 68.88 16 94.02 1,101 68.81 1 5.98 70
17 18 1,614 89.67 17 71.44 1,153 67.82 1 28.56 461
18 19 1,146 60.32 18 91.54 1,049 58.28 1 8.46 97
19 19 1,128 59.37 18 80.32 906 50.33 1 19.68 222
20 19 1,993 104.89 18 91.07 1,815 100.83 1 8.93 178
21 20 1,308 65.40 19 86.85 1,136 59.79 1 13.15 172
22 20 1,597 79.85 19 90.48 1,445 76.05 1 9.52 152
23 20 2,194 109.70 19 91.57 2,009 105.74 1 8.43 185
24 21 1,525 72.62 20 90.03 1,373 68.65 1 9.97 152
25 22 1,404 63.82 21 92.95 1,305 62.14 1 7.05 99
ID = Class ID. Size = Total number of participants, students, or instructors in a class. % = Percentage of the average number of notes all participants, students, or instructors wrote in each class. Avg. = Average number of notes all participants or students wrote in each class. Total = All notes students or instructors wrote in a class.
Table 3 Average size, reading ease score, or grade level score of notes by a participant, a student, or an
instructor in the 25 courses
Whole Class Students Instructors
ID Size Ease Grade Size Ease Grade Size Ease Grade
1 484.29 44.12 15.50 518.65 41.75 16.33 312.49 55.96 11.35
2 329.14 53.80 12.29 317.97 53.76 12.39 407.34 54.11 11.59
3 340.85 41.54 12.64 347.95 41.15 12.76 291.15 44.28 11.77
4 168.81 58.88 8.95 179.07 57.43 9.33 97.02 69.04 6.31
5 391.37 50.84 11.06 312.87 52.05 10.90 940.80 42.38 12.19
6 308.38 52.18 10.55 314.67 52.26 10.61 258.08 51.54 10.06
7 304.11 50.69 11.11 334.01 47.80 11.81 64.92 73.84 5.50
8 175.21 60.24 9.12 184.39 58.72 9.51 92.59 73.90 5.62
9 477.28 47.67 11.52 501.53 47.07 11.70 234.77 53.69 9.75
10 254.90 50.17 11.33 199.04 48.43 11.77 813.42 67.49 7.03
11 199.57 57.28 9.88 199.31 56.98 9.97 203.47 61.81 8.46
12 204.48 47.47 11.41 202.71 47.93 11.36 230.92 40.64 12.22
13 219.40 44.61 12.31 223.38 43.82 12.52 155.77 57.36 9.01
14 135.95 65.36 7.88 141.06 64.44 8.09 54.29 79.99 4.53
15 264.79 53.75 10.73 270.22 53.88 10.66 177.93 51.61 11.79
16 225.09 55.59 10.17 229.60 55.98 10.07 148.39 48.98 11.93
17 188.47 56.74 9.76 186.28 56.47 9.82 227.81 61.61 8.74
18 210.14 59.62 9.51 212.58 59.28 9.58 166.27 65.74 8.28
19 210.48 59.32 9.34 213.65 59.67 9.21 153.33 52.90 11.65
20 195.94 49.68 11.10 198.41 49.08 11.25 149.08 61.09 8.26
21 119.35 60.64 8.69 116.40 61.08 8.53 166.63 53.58 11.28
22 183.34 54.75 10.60 183.62 54.71 10.60 178.00 55.61 10.61
23 235.47 65.07 8.73 233.29 65.18 8.67 276.98 62.97 9.89
24 212.37 56.07 10.27 211.82 56.05 10.26 223.36 56.49 10.63
25 185.13 59.49 9.35 183.85 59.53 9.35 211.85 58.76 9.50
ID = Class ID. Size = Average note size (in words) by a participant, a student, or an instructor in a class. Ease = Note Reading Ease Score of notes by a participant, a student, or an instructor in a class. Grade = Average Note Grade Level Score of notes by a participant, a student, or an instructor in a class.
References
Aragon, S. R. (2003). Creating social presence in online environment. New Directions for Adult and
Continuing Education, 100, 57–68.
Bender, T. (2003). Discussion-based online teaching to enhance student learning. Sterling: Stylus Publishing.
Bi, X. (2000). Instructional design attributes of web-based courses. Athens: Ohio State University (ERIC
Document Reproduction Service No. ED 448746).
Boettcher, J. V. (1999). What does knowledge look like and how can we help it grow? Syllabus Magazine, 13(2),
64–65.
Bordia, P. (1997). Face-to-face versus computer-mediated communication: A synthesis of the experimental
literature. The Journal of Business Communication, 34, 99–120.
Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and
school. Washington: National Academy Press.
Bruffee, K. A. (1999). Collaborative learning: Higher education, interdependence, and the authority of
knowledge (2nd ed.). Baltimore: John Hopkins University Press.
Bruner, J. S. (1986). Acts of meaning. Cambridge: Harvard University Press.
Clouder, L., Dalley, J., Hargreaves, J., Parkes, S., Sellars, J., & Toms, J. (2006). Electronic reconstruction of group
dynamics from face-to-face to an online setting. Computer-Supported Collaborative Learning, 1(4), 467–480.
Cohen, E. G. (1994). Restructuring the classroom: Conditions for productive small groups. Review of
Educational Research, 64(1), 3–35.
Collins, A., & Halverson, R. (2009). Rethinking education in the age of technology: The Digital Revolution
and Schooling in America. New York: Columbia University, Teachers College Press.
Colwell, J. L., & Jenks, C. F. (2004). The upper limit: The issues for faculty in setting class size in online
courses. Retrieved September 17, 2008 from http://www.ipfw.edu/tohe/Papers/Nov%2010/
015__the%20upper%20limit
Creswell, J. (2005). Educational research: Planning, conducting, and evaluating quantitative and qualitative
research. Upper Saddle River: Merrill.
Davie, L. (1988). Facilitating adult learning through computer-mediated distance education. Journal of
Distance Education, 3(2), 55–69.
Denzin, N. K., & Lincoln, Y. S. (2005). The sage handbook of qualitative research (3rd ed.). Thousand Oaks:
Sage Publications.
Dohn, N. B. (2009). Web 2.0: Inherent tensions and evident challenges for education. International Journal of
Computer Supported Collaborative Learning, 4(3), 343–363.
Frey, B. A., & Wojnar, L. C. (2004). Successful synchronous and asynchronous discussions: Plan, implement,
and evaluate. Retrieved September 17, 2008 from http://www.educause.edu/ir/library/pdf/MAC0426
Gilbert, P. K., & Dabbagh, N. (2005). How to structure online discussions for meaningful discourse: A case
study. British Journal of Educational Technology, 36(1), 5–18.
Glass, G., & Smith, M. (1979). Meta-analysis of research on class size and achievement. Educational
Evaluation and Policy Analysis, 1, 2–16.
Graham, C. R., & Misanchuk, M. (2004). Computer-mediated learning groups: Benefits and challenges to
using groupwork in online learning environments. In T. S. Roberts (Ed.), Online collaborative learning:
Theory and practice (pp. 181–202). Hershey: Information Science Publishing.
Greene, J. C., Caracelli, V. J., & Graham, W. F. (1989). Toward a conceptual framework for mixed-method
evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255–274.
Hakkaranen, K. (2009). A knowledge-practice perspective on technology-mediated learning. International
Journal of Computer Supported Collaborative Learning, 4(2), 213–231.
Hewitt, J., & Brett, C. (2007). The relationship between class size and online activity patterns in asynchronous
computer conferencing environments. Computers & Education, 49, 1258–1271.
Hewitt, J., Brett, C., & Peters, V. (2007). Scan rate: A new metric for the analysis of reading behaviors in
asynchronous computer conferencing environments. American Journal of Distance Education, 21(4), 1–
17.
Hron, A., & Friedrich, H. F. (2003). A review of web-based collaborative learning: Factors beyond technol-
ogy. Journal of Computer Assisted Learning, 19, 70–79.
Hubscher-Younger, T., & Narayanan, N. H. (2003). Authority and convergence in collaborative learning.
Computers & Education, 41, 313–334.
Hutchinson, D. (2008). Teaching practices for effective cooperative learning in an online learning environ-
ment (OLE). Journal of Information Systems Education, 18(3), 357–366.
Ingram, A. L., & Hathorn, L. G. (2004). Methods for analyzing collaboration in online communications. In T.
S. Roberts (Ed.), Online collaborative learning: Theory and practice (pp. 215–241). Hershey: Informa-
tion Science Publishing.
440 M. Qiu et al.
http://www.ipfw.edu/tohe/Papers/Nov%2010/015__the%20upper%20limit
http://www.ipfw.edu/tohe/Papers/Nov%2010/015__the%20upper%20limit
http://www.educause.edu/ir/library/pdf/MAC0426
Johnson, B., & Christensen, L. (2004). Educational research: Quantitative, qualitative, and mixed
approaches. Boston: Pearson Education.
Jones, C., Dirckinck-Holmfeld, L., & Lindstrom, B. (2006). A relational, indirect, meso-level approach to
CSCL design in the next decade. International Journal of Computer Supported Collaborative Learning, 1
(1), 35–56.
Keegan, D. (2002). The future of learning: from eLearning to mLearning. ZIFF Papier, 119, Fern-University Hagen.
Kerka, S. (1996). Distance learning, the Internet, and the World Wide Web. Columbus, OH: ERIC Clearinghouse
on Adult, Career, and Vocational Education. (ERIC Document Reproduction Service No. ED395214)
Kimmerle, J., & Cress, U. (2008). Group awareness and self-presentation in computer-supported information
exchange. International Journal of Computer Supported Collaborative Learning, 3(1), 85–97.
Laurillard, D. (2008). The pedagogical challenges to collaborative technologies. International Journal of
Computer Supported Collaborative Learning, 4(1), 5–20.
Lipponen, L., & Lallimo, J. (2004). Assessing applications for collaboration: From collaboratively usable
applications to collaborative technology. British Journal of Educational Technology, 35(4), 433–442.
Masters, A., & Oberprieler, G. (2004). Encouraging equitable online participation through curriculum
articulation. Computers & Education, 42, 319–332.
Monahan, T., McArdle, G., & Bertolotto, M. (2008). Virtual reality for collaborative e-learning. Computers &
Education, 50(4), 1339–1353.
Moore, M. G., & Kearsley, G. (1996). Distance education: A systems view. Belmont: Wadsworth.
Morgan, D. L., Krueger, R. A., & King, J. A. (1998). Focus group kit. Thousand Oaks: Sage Publications.
Morse, J. M. (2003). Principles of mixed methods and multimethod research design. In A. Tashakkori (Ed.),
Handbook of mixed methods in social & behavioral research: Principles of mixed methods and multi-
method research design (pp. 189–208). Thousand Oaks: Sage Publications.
Neo, M. (2003). Developing a collaborative learning environment using a web-based design. Journal of
Computer Assisted Learning, 19, 462–473.
Pena, C. M. (2004). The design and development of an online, case-based course in a teacher preparation
program. Journal of Interactive Online Learning, 3(2), 1–18. Retrieved on September 20, 2008 from
http://www.ncolr.org/jiol/issues/PDF/3.2.4 .
Qiu, M. (2009). A mixed methods study of class size and group configuration in online graduate course
discussions. Open library published doctoral dissertation, University of Toronto, Toronto, Ontario, Canada.
Roberts, T. S. (2004). Online collaborative learning: Theory and practice. Hershey: Information Science Publishing.
Roberts, M. R., & Hopewell, T. M. (2003). Web-based instruction in technology education. Council on
Technology Teacher Education, 52nd Yearbook: Selecting instructional strategies for technology educa-
tion. McGraw Hill, Glencoe.
Roschelle, J., & Teasley, S. (1995). The construction of shared knowledge in collaborative problem solving. In
C. O’Malley (Ed.), Computer-supported collaborative learning (pp. 69–97). New York: Springer.
Rovai, A. P. (2002). Building sense of community at a distance. International Review of Research in Open and
Distance Learning, 3(1). Retrieved at March. 2, 2010 from http://www.irrodl.org/index.php/irrodl/article/
view/79/152
Scardamalia, M., & Bereiter, C. (1994). Computer support for knowledge-building communities. The Journal
of the Learning Sciences, 3(3), 265–283.
Scardamalia, M., & Bereiter, C. (2003). Knowledge building. In Encyclopedia of education, (2nd ed.,
pp.1370–1373). New York: Macmillan Reference.
Scardamalia, M., & Bereiter, C. (2006). Knowledge building: Theory, pedagogy, and technology. In K. Sawyer
(Ed.), Cambridge handbook of the learning sciences (pp. 97–118). New York: Cambridge University Press.
Schoech, D. (2000). Teaching over the internet: Results of one doctoral course. Research on Social Work
Practice, 10(4), 467–486.
Strauss, A., & Corbin, J. (1998). Basics of qualitative research: Grounded theory procedures and techniques
(2nd ed.). Thousand Oaks: Sage Publications.
Sutton, L. A. (2001). The principle of vicarious interaction in computer-mediated communications. Interna-
tional Journal of Educational Telecommunications, 7(3), 223–242.
Tao, P. K., & Gunstone, R. F. (1999). The process of conceptual change in force and motion during computer-
supported physics instruction. Journal of Research in Science Teaching, 36(7), 859–882.
Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social & behavioral research. Thousand
Oaks: Sage Publications.
Tesch, R. (1990). Qualitative research: Analysis types and software tools. Basingstoke: Falmer.
Tomei, L. A. (2006). The impact of online teaching on faculty load: Computing the ideal class size for online
courses. Journal of Technology and Teacher Education, 14(3), 531–541.
Vygotsky, L. S. (1978). Mind in society. Cambridge: Harvard University Press.
Computer-Supported Collaborative Learning 441
http://www.ncolr.org/jiol/issues/PDF/3.2.4
http://www.irrodl.org/index.php/irrodl/article/view/79/152
http://www.irrodl.org/index.php/irrodl/article/view/79/152
Weigel, V. B. (2002). Deep learning for a digital age: Technology’s untapped potential to enrich higher
education. San Francisco: Jossey-Bass.
Wuensch, K. L., Aziz, S., Ozan, E., Kishore, M., & Tabrizi, M. H. N. (2008). Pedagogical characteristics of
online and face-to-face classes. International Journal on E-Learning, 7(3), 523–532.
Xu, H., & Morris, L. V. (2007). Collaborative course development for online courses. Innovative Higher
Education, 32(1), 35–47.
442 M. Qiu et al.
Copyright of International Journal of Computer-Supported Collaborative Learning is the property of Springer
Science & Business Media B.V. and its content may not be copied or emailed to multiple sites or posted to a
listserv without the copyright holder’s express written permission. However, users may print, download, or
email articles for individual use.
Journal of the Scholarship of Teaching and Learning, Vol. 12, No.2, June 2012, pp. 27 – 38.
APA, Meet Google: Graduate students’ approaches to learning
citation style
Nancy Van Note Chism1 and Shrinika Weerakoon2
Abstract: Inspired by Perkins’ Theories of Difficulty concept, this exploratory
study examined the learning patterns of graduate students as they grappled with
using the style sheet of the American Psychological Association (APA). The
researchers employed task performance analysis of three APA formatting tasks,
interviews, and observation during a “think aloud” task to gather information on
students’ misconceptions and successes. The study was able to document in detail
how a group of Internet-savvy students approach the use of a style sheet.
Learning APA style was found to be a matter of overcoming both conceptual blocks and personal style preferences. Once understanding of genre and of conventions that may be inconsistent with prior experience and with each other is attained, motivation, patience, persistence, and attention to detail are also needed to achieve high levels of performance.
Keywords: citation error, skill learning, APA style, graduate students
I. Introduction.
What seemed to be a straightforward task in a doctoral proseminar—formatting references in
American Psychological Association (APA) style—turned into a frustrating experience for both
instructor and students. Puzzled by the poor performance of talented students on a routine
exercise involving correcting bibliographic citations, we undertook a study of the reasons for
these challenges.
Given the context, it was easy to avoid assuming that the students were lazy or
unmotivated or did not consider the task important, possibilities that normally came to mind
when encountering students’ citation errors. This group of students was clearly eager to show
their proficiency, take their new program seriously, and impress their peers and instructor. They
were in the early phase of graduate student adjustment (Weidman, Twale, & Stein, 2001),
lacking confidence, but trying very hard to show the institution that it had not made a mistake in
accepting them into the program. Within the group, there was a competitive ethos, a seriousness
about study and grades that is characteristic of new doctoral students. Thus, inquiring into the
difficulties that serious students have with an apparently simple task was at the heart of this
study. We framed our main research question as “What factors are associated with errors that
new graduate students make in using APA style in citation lists?”
Learning a citation style like APA is important as it helps in academic and research
activities such as retrieving documents for verification of data and building credibility as
author(s) (Faunce & Soames, 2001; Spivey & Wilks, 2004; Sweetland, 1989). Citation styles,
such as APA, have evolved through peer-consulted agreements within discipline-oriented
1 Professor of Higher Education and Student Affairs at the Indiana University School of Education, ES3150, 902 West New York
Street, Indianapolis, IN 46202, nchism@iupui.edu.
2 Senior Lecturer, Staff Development Centre, University of Colombo, P.O. Box 1490, Colombo, Sri Lanka, shrinika@gmail.com.
communities of practice. Such agreed norms and ethics in research and publishing need to be
followed by authors (Waytowich, Onwuegbuzie, & Jiao, 2006) to ensure the continuation of agreed practice, and therein lies another reason for learning citation styles.
II. Conceptual Framework.
Initially, we were inspired by the work of David Perkins (2008), who advocates abandoning
common initial reactions to student mistakes, such as blaming the learner (for laziness, poor
study habits, etc.), settling for a formulaic fix (teach harder, use repetition, etc.), and focusing on
the topic rather than the symptom and the symptom rather than the cause (how is the difficulty
manifesting itself and what is it about the nature of the task and students’ actions that are
connected to the difficulty?). The approach aims at reaching a deeper understanding of student
difficulties that transcends immediate applications to the problem at hand—in our case, students’
inability to apply APA style to citation lists. Perkins’ ideas are aligned with the scholarship on
scientific misconceptions and other approaches that center on cognitive bottlenecks.
When we continued to search for a specific type of conceptual blockage that might be
connected with applying APA style, however, we recognized, with the help of a peer, that style
sheets are arbitrary in some ways and are patterns to follow, rather than internally consistent
logical systems. For this reason, we were grateful for Perkins’ advice about seeking deeper
causes and wanted to continue to be sensitive to cognitive areas, but turned from the idea of
conceptual blockages to a theoretical framework of social cognitive theory, which focuses on
skill learning, or the application of rules to particular situations. Svinicki (2004) describes the
task of social cognitive learning as students’ creation of mental images of the sequences involved
in making the desired application of a particular skill. She emphasizes the importance of
modeling, practice, and feedback in this kind of learning. In addition, we included in our
framework considerations of motivation and transfer, which Svinicki stresses are intertwined in
skills learning. In such an exploratory study, we wanted to keep ourselves open to entertaining
several theories that might be applicable.
III. Literature Review.
An initial search of the literature failed to reveal a study that focused on the specific topic of
learning the APA style sheet. We did discover studies that focused on rates of citation errors in
general (Garfield, 1990; Sweetland, 1989) and specifically in Medicine (Asano, Mikawa,
Nishina, Maekawa, & Obara, 1995) and in Education (Jiao, Onwuegbuzie, & Waytowich, 2008;
Waytowich, Onwuegbuzie, & Jiao, 2006). These studies found that very high rates of errors are
common. Across these studies, citation errors ranged from 22% to 51% in the samples studied, with
most around 30%. Garfield found that many errors occurred when authors copied from other
citations rather than the original document. These studies argue that one of the reasons for poor
citation style and inattention to accuracy is that these skills are not formally taught. Two of these
studies explored the relationship between personal characteristics and performance on APA
citation style tasks. Waytowich, Onwuegbuzie, and Jiao found that student perfectionism is
associated with high performance on citation style tasks. They documented the disconcerting
finding that performance actually deteriorated rather than improved over time as graduate
students advanced, suggesting that perhaps complacency or lack of correction by other
instructors is to blame. Jiao, Onwuegbuzie, and Waytowich found an association between
library anxiety and APA citation style performance. In this piece, they claim that the Waytowich,
Onwuegbuzie, and Jiao study is the first to explore relationships between APA errors and author
characteristics, documenting the lack of studies on this topic.
Turning to social cognitive theory, we found Svinicki’s summaries (2004, 2010) helpful
in applying early work on social learning theory by Bandura (1986) to college teaching. In
teaching intellectual skills, Svinicki stresses the roles of “cognitive apprenticeship” (Collins,
Brown, & Newman, 1989) and prior knowledge, emphasizing that as learners watch another
demonstrate a new skill, they construct a mental model of the process and then with practice and
feedback, are able to make applications to other like instances. Awareness of their own processes
(metacognition) aids in this activity. Svinicki also discusses the importance of such motivational
theories as expectancy/value theory (Eccles, 1983), which alerted us to explore whether the
students felt the task worthwhile and felt confident that they could master APA style.
A final strand of literature that seemed of possible relevance was the literature on
learning styles. In particular, we looked at the idea of field independence and field dependence
(Witkin, Moore, & Goodenough, 1977) as a possible explanation for why some students were
more prone to notice details than others. This theory posits that people vary in the extent to
which they tend to perceive the overall “big picture” (field independent) or notice the smaller
components (field dependent).
The relative lack of prior work on the issue of learning citation style, coupled with a variety of possible explanations to consider, prompted our exploratory approach. In
short, we wanted to know whether cognitive confusion, poor mental imaging, motivation,
personal style, simple lack of practice, or some combination of these factors were to blame for
APA citation style errors.
IV. Methods.
The Proseminar enrolled 12 students, one of whom assisted in designing this study. For that case,
we used only that student’s assignments, but did not include her as a study participant in subsequent data collection efforts. After obtaining Institutional Review Board approval, we
first assigned numbers to the cases of the 12 students and assembled their work products: an
initial assignment that involved identifying citation errors in a list of 26 entries, and the reference
list they submitted with the literature review assignment for the course. Each assignment was
labeled with the student’s number. A list of numbers and corresponding participant names was
kept in a separate file.
Our initial step was to analyze patterns of error on the first assignment, which involved
correction of errors in an instructor-generated reference list. We created spreadsheets for each
student and created coding categories, noting when they had failed to detect a citation error or
inserted an erroneous correction. All three coders worked independently and then reconciled
their coding. The data from these spreadsheets were aggregated and individual and group
percentages were calculated.
We next did similar coding of errors with the student-generated reference lists from the
literature review assignment, coding errors by category. Again, we coded separately and
reconciled differences. Because the lists varied in the number and type of citations used, we
calculated percentages based on error rates per citation.
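Because the student-generated lists differed in length and type, errors were normalized per citation rather than compared as raw counts. The following is a minimal sketch of that normalization (Python; the student identifiers and counts are invented for illustration and are not data from the study).

```python
# Hypothetical coded results: (number of entries, total errors found) per list.
coded_lists = {
    "student_01": (14, 25),
    "student_02": (7, 10),
    "student_03": (26, 16),
}

# Error rate per citation = total errors / number of entries, so lists of
# different lengths can be compared on the same scale.
rates = {sid: errors / entries for sid, (entries, errors) in coded_lists.items()}
group_mean = sum(rates.values()) / len(rates)

for sid, rate in rates.items():
    print(f"{sid}: {rate:.2f} errors per citation")
print(f"group mean: {group_mean:.2f} errors per citation")
```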
We next held interviews with each student on their basic approach, perceptions of the
importance of APA style use, and preferred working style. Notes from each interview were then
coded and entered into a database for analysis.
Finally, we observed students as they completed a “think aloud” task involving
composing three reference entries in APA style from source items. Students were asked to talk as
they worked, telling us their reasoning process. They were encouraged to use any resource they
would normally use. Our notes from these sessions and analysis of the resulting citations were
coded and entered into the study database. Scores on the citation task were arrived at by
agreement of two coders and calculated on the basis of error rate per citation.
V. Findings.
Our explorations yielded important understandings of the thought processes and habits that
students used in routine APA tasks, “logical mistakes” they made, their understandings of their
own thinking processes (metacognition), and preference for digital rather than print resources.
They also supported the efficacy of repeated practice in learning a skill. We will first discuss the
results of students’ performance; then, describe their strategies and other factors that affected
performance.
A. Performance Results.
Student performance on the three APA tasks analyzed for this study had been preceded by a
classroom demonstration of APA citation applications using slides archived for later student
reference and an in-class exercise requiring students to do APA tasks and receive immediate
feedback on their performance.
Error correction task. As previously noted, the first task involved noting APA style
errors and substituting the correct format in a reference list of 26 items containing 61 errors.
Students were able to do this on their own, using whatever resources they chose, following a
class session that provided an overview of common citation tasks using APA style. The
performance range on this task was quite disappointing, ranging from 20% to 64% accuracy,
with a mean of 40%, showing that only 7 of the 12 students recognized and corrected more than
half the errors in the list. Most common mistakes included: failure to recognize the genre of the
entry (journal, book chapter, etc.); incorrect punctuation with multiple authors (not using both
comma and ampersand); incorrect order of month and year when both needed to be used;
inappropriate use of capitalization/lower case and italics/Roman in titles; and incorrectly listing
city and state of publisher.
Literature review reference list. Students were asked to do a literature review with a
reference list in APA style. On this task, the number and type of citations varied according to the
sources students identified, so the error rate was calculated on the basis of errors per entry. The
number of entries ranged from 7 to 26 with an average of 14. Scores ranged from an error rate of
.6 per entry to 3.5. The mean score was 1.8 errors per citation. Common mistakes for this task
involved: upper and lower case errors in titles; punctuation after the date; punctuation with
multiple authors; order of year and month, and choice of genre. Four of the 12 students failed to
indent their lists. In comparing the list of common errors between this and the first task, it must
be remembered that the second list was student-generated while the first list was instructor-
generated, so citation tasks on the second assignment depended on the students’ choice of
references rather than a standard list. The approach was generative rather than reactive as well,
which students cited as preferable, largely because it was easier for them to identify the genre of
the source when they had the physical source before them. Nevertheless, there are substantial
commonalities across the list of frequent errors, showing that students did not transfer much
learning from the first task to the second.
Think Aloud task. Students were given two physical publications and one web URL
during the interview visit and asked to write the citations, using whatever resources they chose.
Errors per citation ranged from 0 to 3, with a mean of 1.2 per citation. Number of total errors
ranged from 0 to 10, with a mean of 3.6. Most common mistakes involved genre identification,
retrieval language, and capitalization/italicization of titles.
Patterns across tasks. Given that these tasks were performed over the course of two
semesters (from the original assignment to the interview), and assuming that students had other
opportunities to use APA style in their coursework, we were curious to know what patterns of
improvement occurred. Did students seem to learn from their mistakes? To assess this, we
looked at data on initially-high error rate tasks, comparing performance on the first, second, and
last tasks per student and then in the aggregate. These citation activities involved the following:
• Identifying genre of source (recognizing correctly that the source is a book, journal, etc.)
• Using APA retrieval language (to cite retrieval of World Wide Web sources)
• Appropriately using Roman/Italics or capitalization/lower case, as called for by the
situation
• Using correct punctuation for sources having multiple authors
Table 1 shows the average per item performance for the class on these items. Since the second
task involved student-generated lists, a few of which did not use retrieval language or atypical
genres, the group average does not reflect the performance of each student as evenly as the other
lists. Given this condition, particularly with genre recognition errors, one can see a pattern of
improvement across all four citation activities from the first to third task. The two more common
activities (italics/capitalization of titles and use of punctuation with multiple authors) improved
most dramatically, while the less common citations, involving unusual genres and web sources,
were still associated with error rates over .5 per entry.
Table 1. Changes in Error Rate per Citation on Common Problem Citations Across Tasks.
              Use of italics/   Genre         Retrieval   Authors
              Capitalization    Recognition   Language    (& and ,)
1st Task      0.7               0.9           0.8         0.5
2nd Task*     0.3               0.5           0.8         0.3
Think Aloud   0.2               0.6           0.5         0.1
*Not as standard as the other two tasks since students chose citations to include, meaning that
they had either more or fewer citations of these kinds on which to base the error rate.
B. Factors Influencing Performance.
Working from our literature base, we looked at several potential factors influencing student
performance on the tasks: student characteristics, their strategies for locating APA style
information, their work checking behaviors, their perceptions of the value of the task (intrinsic or
extrinsic), their prior knowledge, and their metacognition.
Student characteristics. Interview data on students’ self-perceptions of their approaches
were examined for possible relationships with performance. Students were asked to rate
themselves from 1 to 10, with 10 being high, on four constructs: attentiveness to detail,
persistence with APA tasks, perfectionism, and tendency to comply with, rather than question,
directions. We grouped students by scores into the categories of high, medium, and low
performers on the tasks. Then, we compared ratings on personal attributes to level of
performance in order to explore the existence of a relationship. We found that the lowest
performers rated themselves lowest of all three groups on their tendency to comply with
directions and their attentiveness to detail, while the top performers rated themselves highest on
perfectionism and attentiveness to detail. Top and middle group students rated themselves high
on compliance relative to the low performers. Interestingly, the lower performers rated
themselves most highly on persistence, perhaps due to the time their inefficient strategies take.
Observations during the Think Aloud task showed that these students were more likely to jump
from one strategy to another and to be unfamiliar with the use of some resources, such as the
organization of the APA style manual. The results are summarized in Table 2.
Table 2. Relationship between Self-ratings and Performance. Scale from 1 to 10, with 10 = High.
Group averages    Details   Persistent   Perfectionist   Compliant   Averages
Top Performing    8.3       7.5          8.3             8.0         8.0
Mid Performing    7.6       6.5          6.5             8.3         7.2
Low Performing    7.2       7.7          7.3             6.8         7.3
Whole group       7.7       7.2          7.4             7.8         7.3
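Table 2's group averages follow from grouping students by task performance and averaging their 1-to-10 self-ratings on each construct. A minimal sketch of that aggregation (Python; the group labels, individual ratings, and grouping are invented for illustration, not the study's data):

```python
from statistics import mean

# Invented 1-10 self-ratings; each tuple is
# (details, persistent, perfectionist, compliant) for one student.
self_ratings = {
    "top": [(8, 7, 9, 8), (9, 8, 8, 8)],
    "mid": [(8, 6, 7, 9), (7, 7, 6, 8)],
    "low": [(7, 8, 7, 7), (7, 8, 8, 6)],
}

constructs = ("details", "persistent", "perfectionist", "compliant")

for group, ratings in self_ratings.items():
    # Average each construct across the students in the group,
    # then average those four values for the group's overall rating.
    per_construct = [mean(r[i] for r in ratings) for i in range(len(constructs))]
    overall = mean(per_construct)
    cells = ", ".join(f"{name}={val:.1f}" for name, val in zip(constructs, per_construct))
    print(f"{group}: {cells}, average={overall:.1f}")
```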
Strategies in composing citations. Students differed on whether they used a deductive or
inductive approach to composing citations. Differences in approach led to different kinds of
errors. Those who worked deductively immediately sought a model to accommodate the
information on their source. This method led to problems when they misidentified the genre of
the source. A frequent problem occurred with publications that are separately titled volumes in a
series. Students who saw these publications as a journal ignored the dilemma of three titles
(chapter, volume title, series title) and did not see the need to list the place of publication and
publisher. Students who worked inductively from the elements of the source information to a
model, however, were prone to make mistakes in copying the information or selecting which
elements to use. For example, one student chose the earliest of several copyright dates instead of
the most recent. Her reasoning was that one should indicate when the piece was first published.
Another copied down extraneous information, such as the publisher’s website. The choice of key
elements of information dictated choice of model, causing mishaps if the wrong elements were
chosen. For some, working from the source became a literal exercise—because the title of the
book was listed with main words capitalized, it was copied that way.
Clearly, an iterative approach, moving between information from the source and the
model entry, is required, but many students seemed unable to move back and forth. Once
information had been recorded by those using an inductive approach, they were reluctant to
abandon some pieces of information as unnecessary; conversely, once a model had been adopted
by the deductively-inclined students, they were reluctant to abandon the model because it did not
fit the information. An example of a student using the iterative process was found in the student
who said, “I try not to stop at the first thing [model] that fits, because something better may come
down the line.” Her colleagues were less likely to do so.
The importance of genre recognition became clear from the start of the study. Students’
error patterns were often related to their misperception of how the source material should be
classified. While students most easily recognized books, book chapters, and journal articles, they
had trouble with separately-titled volumes in a series, conference presentations, and electronic
sources. Choosing the wrong category of publication meant choosing a model entry that would
accommodate only some of the information available. Many students identified this major
decision point about the source—What is this?—as the most difficult aspect of using APA
citation style. Yet, the ability to make this decision readily is assumed by the Manual, and alas,
was assumed by the teacher in the context of this study.
These two problems—failure to work iteratively and misperception of genre—were
responsible for many subsequent issues.
Strategies for locating APA information. Only four students expressed a preference for
using the APA manual to locate style information, and one student outright admitted to never using the manual. All of the eight students who relied on other information sources used either the Internet or print model documents (entries in bibliographies or published sources like books
or journals). Much of the Internet use involved using Google or Google Scholar to locate sample
citations, but some involved using sites on APA style conventions published by other users,
mostly university centers. APA manual use was associated with middle and top performing
students more than lower performing students. We did not find any relationship between use of
various Internet methods and performance.
The features of online sources, such as hyperlinks and color coding, were viewed by the students as more user friendly and efficient than the print APA manual. As a less preferred
resource, students used the APA manual when they continued to have questions about
correctness after accessing other methods. For most, the manual is, as one student said, “So
dense with information that I find it overwhelming. I don’t have the time to spend more than 10
or 15 minutes to look for a citation.” The organization of the manual is not clear to students. As
they completed their think-aloud tasks, they frequently struggled with the index and flipped
through pages randomly. One student complained that there are not enough examples and that
those that are in the manual are basic rather than focused on complicated cases. Another student
called the manual “stagnant,” saying that it provides whole examples rather than building from
individual elements. In observing students using the manual, we noted that they referred only to
the examples without reading the explanatory text. They sometimes made errors of interpretation
when they did read.
Students expressed, to varying degrees, issues of “trust” in consulting sources. For
example, several placed trust in people, such as their professors, who would be able to give them
advice on a troublesome entry. They joked about “dialing a friend” while they were doing the
Think-Aloud exercise. Most students mentioned that relying on Google or other bibliographies was a risky course of action. Some students expressed the opinion that refereed journals in the field of
education can be trusted since these journals all use APA (an incorrect assumption) while others
cautioned that one should not rely on collections such as ERIC or EBSCO to provide citations in
APA format. Students generally trusted the online APA style digests that other institutions have
compiled (again not always a good assumption) but did not trust their peers to be accurate, often
saying that peer review of APA work was not helpful because their peers make as many mistakes as they do, or more. One student realized that EndNote does not always format citations
accurately in APA style. In the end, however, the APA manual is viewed as the authoritative
source and as such is the final recourse of students who are searching to resolve a difficult
citation problem. One student said, “APA is like the Bible.”
Students described sometimes using “triangulation” in formatting a difficult entry. They
arrayed a variety of examples of a given citation, some from Google Scholar, some from other
sources, to judge the “majority opinion,” or the differences between the formats used by more- or
less-trusted sources before determining which to follow. Often, they would select the version that
most closely matched the APA manual example that they judged applicable.
Work checking behaviors. We did not see strong patterns in the checking and refining behaviors of students as observed in the Think-Aloud task. Top students were more likely to check the whole entry than to check only the parts about which they were unsure, while mid-performing
students were more likely to check only troublesome parts. There was no clear pattern with
“giving up” or “settling” behaviors by performance group.
Perceptions of value of task. Although one student saw the APA assignments as
“mundane” and said that she did not invest much energy in doing the tasks, all of the others
stressed that they were motivated to perform well and gave these assignments their best effort.
They stressed their understanding of the importance of using proper citation style, sometimes to
a somewhat exaggerated level, such as the student who said, “Mistakes like this [citation format
errors] are an ‘in’ for others to question your credibility.” She added, “It’s really important to
avoid ‘public mistakes’–you really have to be careful as a scholar.” A few students stated that
they did not think they would be using APA in their work because they were aspiring to
administrative careers in which they would not be doing research and publication.
Prior knowledge. Confidence and experience were factors that influenced student
performance on APA assignments. A few participants expressed that they were highly familiar
with APA through their previous experience as either undergraduates or master’s students in
fields using APA style. One was in the middle range on the two tasks that she said she did
casually, but did very well in the Think-Aloud task. The other expressed astonishment that her
practice had many mistakes, which had never been corrected by professors in past programs. A
few participants had been away from formal schooling for several years and cited their lack of
practice with formal academic writing as a general challenge.
An issue with prior learning that was experienced by the students in this study, however,
was “unlearning” when previous practices had not conformed to APA style. For example, some
students had formerly been in disciplines that used other style sheets. Their “memories” told
them to put references in numbered lists or to spell out the author’s first name. For many,
previous instruction in writing as far back as elementary school confused them when they relied
on memory. They had been told to capitalize all major words in a title, for instance. These former
practices were deeply rooted and often prevented noticing differences with the new style.
“Logic” of errors. Interview comments often illustrated students’ reasoning in ways that
made good sense. For example, they saw contradictions between authorities:
• “The conventions of Microsoft Office are sometimes misleading, since they will do that
“little red underline” for spacing or things, when really that’s how APA wants it. It causes
me to think I’ve made a mistake.”
• “You were taught in grade school to capitalize main words in a title, to use quotation
marks around chapter titles. And leaving no space between a volume and issue number
just looks weird. Who can you trust?”
Students also yearned for consistency in searching for a way to remember conventions.
• “I would like for conventions on these to be standardized–sometimes you use a comma
and sometimes not. What’s the rationale for the difference?”
• “Why is it that the main words in the title of a book are lower case, while those in a
journal title are upper case?”
• “Why do you have to list the authors’ first name initials last at the start of an entry but
first when they are the book editors cited in a chapter citation?”
• “Sometimes page numbers are supposed to be listed with ‘pp.’ and sometimes not. Why
the difference?”
• “It says to use “&” in listing authors in citations and “and” when referring to them in text.
It would be easier to just use one or the other consistently, wouldn’t it?”
Metacognition. While some students were quite aware of their approaches to using APA
style, several appeared to be confused about the match between their stated approach and their
actions. In eliciting interview comments from students about how they went about formatting
entries, we found that students were able to describe their usual approaches, such as first writing
down the pieces of information, then looking for a model. But in several cases, the students
failed to follow this approach during the subsequent Think-Aloud task. Generally, students
whose descriptions of their approach matched our observations of them were in the higher
performing groups, leading us to think that metacognition is important in this learning task.
VI. Discussion.
Our overall appraisal of the usefulness of conceptualizing this study in terms of Theories of
Difficulty (Perkins, 2008) is positive, in that it encouraged us to look for rational explanations of
errors and misleading conceptions. What first seemed to us a routine task that did not require
much mental energy emerged as a more complicated one. While our search failed to identify one
key type of conceptual difficulty, it did lead us to explore the many factors involved in this type
of skill learning. Working from a Theories of Difficulty approach also helped us to see some
student errors as rational: they stemmed from a tendency to expect the rules to be consistent with those of previous authorities and with each other. We also learned that we made some
incorrect assumptions about prior knowledge, such as students’ ability to distinguish between a
monograph series and a journal.
We learned, in accord with social cognitive theory (Svinicki, 2004), that practice seemed
to improve performance, showing that familiarity and attentiveness to the task were important
success factors. A key recognition, however, was that while performance on some APA style
citation tasks seems to improve with practice, others require explicit repeated modeling of
elements that seem quirky, complicated, or contradictory to prior experience, which take longer
to master.
In addition, students’ self-ratings of their personal characteristics showed some
relationship to performance. This latter finding is consistent with the findings of Waytowich,
Onwuegbuzie, and Jiao (2006) with respect to the positive relationship between perfectionism
and performance on APA citation style tasks.
VII. Limitations.
The small sample size, location in one program and one doctoral course, and detailed analysis of
three tasks allowed us to explore students’ APA citation style learning in depth. Coding and
reconciling among coders was slow and labor-intensive, yet it was feasible for the sample size.
These advantages also present limitations. The sample was too small to use statistical methods
appropriately and is very context-specific. Results, therefore, can only be transferred by
individual instructors on the basis of “fit” with their population and context.
A further limitation is the comparability of the tasks used for the study. The first task
involved “working backwards”—looking at citations for errors. It used a standard list of 26
items, involving many types of citation tasks. The second involved generating citations for
documenting a paper. Here, students chose the sources and the kind of citation tasks varied from
one student to another. In the third task, the Think-Aloud task, the citation tasks were standard,
but the conditions under which the students worked—being watched and likely feeling some
time pressure—differed from those of the first two tasks. Although we have referred to these
differences in the analysis, they make interpretation more complex and tentative.
VIII. Implications.
Since the use of APA style is valuable in the literature not only of our discipline but of all fields that use this popular style sheet (e.g., Asano, Mikawa, Nishina, Maekawa, & Obara, 1995; Jiao,
Onwuegbuzie, & Waytowich, 2008; Waytowich, Onwuegbuzie, & Jiao, 2006), there are several
implications stemming from this study for us and our colleagues:
1. Instructors cannot assume that prior experience or self-discovery are adequate methods
for students to learn tasks that seem routine.
a. In the special case of APA citation style use, instructors need to pay explicit
attention to genre recognition skills. Teaching with physical specimens is called
for. Students need to know the difference between a continuously-paged journal
and one that is not, between a separately-titled volume in a series and a
multivolume work. By having students identify various types of sources and
helping them to know how to check in cases when they are not sure of the identity
of the type of source, instructors can assist them in using APA citation style.
b. Highlighting common conventions of APA style is not enough as an instructional
strategy. It is important for instructors to stress systematic search strategies by
walking students through them, noting inconsistencies and highlighting
conventions. In line with social learning theory, they also need to provide detailed
modeling and repeated practice, encouraging students to compose their own
learning journals as they encounter conventions that seem unusual or
contradictory to their thinking. Such metacognitive activities will address the
individual learning challenges in this area.
2. Colleagues in the program need to share the value of APA style use and reinforce
learning. In this study, several students who thought they were using APA style correctly
discovered that their previous instructors never pointed out APA style mistakes. It is also
common to hear that “ideas are more important than mechanics.” Instructors therefore
need to support each other in helping students by recognizing the importance of
reinforcement on style activities.
3. This study demonstrated that the scholarship of teaching and learning is important in
unraveling the causes behind student errors and improving instruction. Looking at
patterns of error systematically gave us an appreciation for specific types of errors in
APA citation style, but more fundamentally changed our approach to learning challenges,
inspiring us to look more carefully at how students approach learning tasks. Instructors
who are systematic in their explorations of student difficulties change their teaching in
intentional ways as well as help their colleagues to promote better learning.
We believe that the above issues are not limited to the use of APA style, but are issues likely to
appear in any use of a style sheet. It thus crosses academic disciplinary boundaries. We conclude
then, that mastering APA citation style is influenced by practice, conceptual issues, and personal
style preferences. Once understanding of genre and conventions that may be inconsistent with
prior experience and with each other are attained, desire, patience, persistence, and attention to
detail are also needed to achieve high levels of performance. These are the tasks involved in
socialization to the practices of given discipline; our attention to this basic task can help in
broader ways than the simple mastery of a style sheet. Gains in metacognition, attention to detail,
self-discipline, and pride in one’s work are all involved; style sheets can be the medium for
helping our students achieve these goals.
Acknowledgement
The authors acknowledge the assistance of Gretchen Harris for her participation in the data
collection and analysis conducted during the study.
References
Asano, M., Mikawa, K., Nishina, K., Maekawa, N., & Obara, H. (1995). Improvement of the
accuracy of references in the Canadian Journal of Anaesthesia. Canadian Journal of
Anaesthesia, 42(5), 370-372.
Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.
Englewood Cliffs, NJ: Prentice-Hall.
Collins, A. M., Brown, J. S., & Newman, S. E. (1989). Cognitive apprenticeship: Teaching the
crafts of reading, writing and mathematics. In L.B. Resnick (Ed.), Knowing, learning, and
instruction: Essays in honor of Robert Glaser (pp. 453-494). Hillsdale, NJ: Lawrence Erlbaum.
Eccles, J. (1983). Expectancies, values, and academic behavior. In J. T. Spence (Ed.),
Achievement and achievement motivation (pp. 75-146). San Francisco, CA: W.H. Freeman.
Garfield, E. (1990). Current comments. Essays of an Information Scientist, 13, 367-375.
Faunce, G.J., & Soames, J.R.F. (2001). The accuracy of reference lists in five experimental
psychology journals. American Psychologist, 56(10), 829-830.
Jiao, Q. G., Onwuegbuzie, A. J., & Waytowich, V. L. (2008). The relationship between citation
errors and library anxiety: An empirical study of doctoral students in education. Information
Processing and Management, 44, 948-956.
Perkins, D. (2008, October). Theories of Difficulty. Presentation at Indiana University,
Bloomington, IN. Slides also retrieved from
www.tla.ed.ac.uk/events/bjep2005/presentations/perkins.ppt.
Spivey, C.A., & Wilks, S.E. (2004). Reference list accuracy in social work journals. Research on
Social Work Practice, 14(4), 281-286.
Svinicki, M. D. (2004). Learning and motivation in the postsecondary classroom. San Francisco,
CA: Jossey-Bass.
Svinicki, M. D. (2010). A guidebook for conceptual frameworks for research in engineering
education. Retrieved from http://cleerhub.org/resources/gb-svinicki.
Sweetland, J. H. (1989). Errors in bibliographic citations: A continuing problem. The Library
Quarterly, 59(4), 291-304.
Waytowich, V. L., Onwuegbuzie, A. J., & Jiao, Q. G. (2006). Characteristics of doctoral students
who commit citation errors. Library Review, 55(3), 195-208.
Weidman, J. C., Twale, D. J., & Stein, E.L. (2001). ASHE-ERIC Higher Education Report: Vol.
28(3), Socialization of graduate and professional students in higher education: A perilous
passage? San Francisco, CA: Jossey-Bass.
Witkin, H., Moore, C., & Goodenough, D. Y. C. (1977). Field dependent and field independent
cognitive styles and their educational implications. Review of Educational Research, 47(1), 1-64.
Record: 1
Title: Co-Teaching an Online Action Research Class
Author(s): Wilson, Brent G.; Linder VanBerschot, Jennifer
Source: Canadian Journal of Learning and Technology, v40 n2 Spr 2014. 18 pp.
Availability: Full Text from ERIC Available online: https://eric.ed.gov/contentdelivery/servlet/ERICServlet?accno=EJ1030427
  Canadian Network for Innovation in Education. 260 Dalhousie Street Suite 204, Ottawa, ON K1N 7E4, Canada. Tel: 613-241-0018; Fax: 613-241-0019; e-mail: cnie-rcie@cnie-rcie.ca; Web site: http://www.cjlt.ca
Peer Reviewed: Y
ISSN: 1499-6677
Descriptors: Online Courses, Electronic Learning, Team Teaching, Teacher Collaboration, Action Research, Teaching Experience, Masters Degrees, Course Descriptions, Teaching Methods, Educational Strategies, Educational Practices, Success, Performance Factors, Scaffolding (Teaching Technique), Communities of Practice, Time Management, Instructional Improvement, Barriers, Course Objectives
Abstract: Two instructors report our experience co-teaching an action research (AR) required as part of an e-learning master’s degree. Adopting a practice-centered stance we focus on the course activities of participants (instructors and students), with particular attention to the careful crafting of course elements with the goal of achieving an excellent learning experience for students. The case narrative describes the course and ways in which we have modified the course based on a variety of considerations. We also outline problems and areas still in need of improvement. We reflect on the role of theory in our own pursuit of excellence, and the role of theory in our students’ inquiry processes. We find that theory is just another tool or resource to apply to the work, with the core concerns being the needs of students and the learning environment.
Abstractor: As Provided
Number of References: 41
Language: English
Number of Pages: 18
Education Level: Postsecondary Education; Higher Education
Publication Type: Journal Articles; Reports – Research
Journal Code: APR2018
Entry Date: 2014
Accession Number: EJ1030427
Database: ERIC
Co-Teaching an Online Action Research Class
Problem and Purpose
Two instructors report their experience co-teaching an online action research (AR) course required as part of an e-
learning master’s degree. Adopting a practice-centered stance we focus on the course activities of participants
(instructors and students), with particular attention to the careful crafting of course elements with the goal of
achieving an excellent learning experience for students. The case narrative describes the course and ways in which
we have modified the course based on a variety of considerations. We also outline problems and areas still in need of
improvement. We reflect on the role of theory in our own pursuit of excellence, and the role of theory in our students’
inquiry processes. We find that theory is just another tool or resource to apply to the work, with the core concerns
being the needs of students and the learning environment.
In just about every university, instructors and program leaders are trying to plan and deliver effective online courses
and programs. Making this happen is a matter of both theory and practice. Theory looks for the general knowledge
that applies to a wide range of cases, whereas practice attends to the lived experience of participants and the
demands of the local situation. The present case report describes how two experienced instructors teamed together
to plan and co-deliver an action research (AR) course to students in an e-learning master’s program. We frame our
method as practice-centered: describing and analyzing various needs and challenges and highlighting the design
thinking that led to critical course elements. We also reflect on the theoretical grounds for our design thinking, as well
as how our knowledge about teaching and learning informed our activity. We conclude by reflecting on the role of
ongoing professional commitment, rounds of feedback and action, and attention to detail in supporting outstanding
course and program design.
A Practice-Centered Approach
Descriptions of pedagogy are usually centered on pedagogical models or instructional theories of some kind –
problem-based learning, scaffolding theory, situated or constructivist learning, etc. A problem with such models and theories, however, is that they seriously underspecify what is needed for good instruction to happen. Precisely
because of their generality and broadly intended scope, models and theories leave out many details that are critical
to the success of any particular course. Any particular situation will fit a given theory or model to some extent – but in
other respects will need special handling and some custom design.
Wilson (2013) outlined a practice-centered approach to instructional design (ID) that foregrounds the situated and idiosyncratic details and the improvisational forms of design thinking done by teachers and designers in work situations. A practice-centered approach to instructional design is defined as:
A view of ID work framed in technical, craft, and critical terms, involving activity mediated by tools and situation,
where opportunities for innovation emerge from new technologies, ideas, and systemic tensions, as well as the
craftsmanship, character, and agency of participants. (Wilson, 2013, p. 40)
Our approach tries to respect the complexity of professional practice and decision-making. Participants are seen as
autonomous, collaborating agents engaged in real-time, real-world problem solving in pursuit of worthwhile goals but
buffeted and conditioned by myriad constraints. We try to be open to influences of cognitive learning theory and
instructional best practices – but also to technical, craft, and critical perspectives on our practice.
Rather than depending on a particular pedagogical model or theory, a practice-centered approach achieves
coherence in instruction by carefully configuring an elegant response to the problems and resources available in the
learning situation. We are promiscuous in our theorizing and willing to mix and match in a way that would raise
eyebrows in the Academy. Bricolage is a useful metaphor for the creative, sometimes improvisational mixing and
combining of instructional elements, but also of the supporting theories themselves. This approach is meant to fill in
the gaps and tell a more complete story than a typical theory-centered report would do. The coherence of a course is
not achieved by its adherence to a model, but instead in the details of implementation and delivery.
A stream within sociology called practice theory closely examines the practice of work to reveal the knowledge, aims,
and values of participants (Bourdieu, 1990; Kemmis, 2011; Postill, 2008). Activity theory frames learning in terms of
collaborative activity, working toward defined objects, using tools and rules of engagement (Engeström, 2000). Our
approach is similar to these, with the added affinity of craft theory, which shows how people devote a lifetime to
gaining knowledge and expertise toward making a high-quality product or demanding performance (Sennett, 2008).
Our practice-centered approach examines the practices of course design and delivery with a craft-like commitment
intended to achieve excellence over time. In the narrative that follows, we describe the setting, participants, activities,
and outcomes of a co-teaching experience. Our report is partly framed in terms of a narrative research paradigm
(Friesen, 2008), but only loosely so. No human-subjects approval was sought or granted, and no confidential data
are reported. Primarily the study can be read as a straightforward report of the two instructors’ collaboration. In some
ways the report is like a design rationale accompanying a newly designed course, the kind routinely recommended by
instructional design textbooks – with added detail and reflection. The project arose from the conviction that designs
for instruction are under-reported and often unacknowledged, yet critical containers of professional knowledge. We
hope in the sharing of our experience, designers and instructors of online courses may see points of connection to
their own work and transfer insights and ideas to their situations (see Wilson, 2014 for a discussion of non-traditional
forms of knowledge creation).
Description of Course
Most e-learning master’s programs require a research course, teaching students how to read and interpret research,
or how to do it themselves. For more than 15 years students at our university have completed action-research
projects as part of a master’s curriculum in education technology. We see research as an essential part of an e-learning professional’s skill repertoire and a fundamental role on the job: actively and systematically gathering data and knowledge to help guide decisions and policies in practice. The master’s degree is based on professional
standards of Association for Educational Communications and Technology (AECT), International Society for
Technology in Education (ISTE), and International Board of Standards for Training, Performance and Instruction
(IBSTPI), all of which include research, inquiry, and evaluation as core professional competencies.
The purpose of INTE 6720 Research in Instructional Technology is for learners to apply research methods in order to
analyze and improve their professional practices. Students apply AR principles to conduct research in practical
settings such as corporate training environments, academic technology and media centers, schools and classrooms,
or other settings such as home, community, or place of worship.
We approach AR as a practice of in-depth inquiry to create positive change and action. Practitioners identify a
problem or opportunity in need of further inquiry, plan the project, and collect data. By analyzing the data,
practitioners look for evidence of change, seek to understand the perspectives of others affected by the change, and
gain a deeper understanding of their professional practices. Finally, action researchers make critical reflections on
what has been learned in order to create conceptual tools for planning new actions. We encourage learners to act as
agents of change (Stringer, 2007) as they extract a narrative account of their professional environment (Friesen,
2008). Friesen explains how narrative and theoretical thinking can work together. Whereas a principle conveys
knowledge in its general form, a story works from the particular. In our course we encourage learners to use both
theory and narrative to guide their inquiries.
The course guides learners through a step-by-step process to build and implement their own project, using an
adaptation of Sagor’s (2000) AR process (pp. 3-7) – see Table 1 below.
TABLE: Table 1: Course Assignments Aligned with the AR Process
The AR Process | Cumulative Assignments in Class
1. Select a focus | Opportunity scan – students briefly describe three problems or opportunities in a work or applied context that need further understanding and action
2. Identify research questions | Problem statement – students draft the front section of their inquiry report, including a set of research questions. Proposal – students build the front section by adding a methods section, linking methods to research questions to ensure coherence
3. Collect data | Students typically begin collecting data soon after feedback on their proposal
4. Clarify theories | Literature review – students report findings from a search of the literature and Web-based best practices
5. Analyze data | Students continue data collection and analysis
6. Report results | Findings section – students report empirical findings, usually organized by research question. Final report – students revise all sections based on peer and instructor feedback and pull everything together, including a conclusion section with action plans and recommendations
7. Take informed action | Presentation – students report their findings via multimedia presentations, either to a live audience or published on the Web; oftentimes they include a discussion of subsequent action that will be taken as a result of the AR. Reflection – students celebrate their work and reflect on their learning
As a complex undertaking, AR needs some kind of support for novices. We break the work into sequential steps that
build incrementally throughout the semester toward a final completed report (see the Appendix for an example of a
weekly overview). Extensive peer and instructor feedback is provided at each step, leading to corrections and
adjustments along the way. The final product is a high-quality, comprehensive AR report. Learners understood that the schedule was adaptable, but they also expressed appreciation for knowing what was expected of them each week.
Periodic chapters from a required text (Stringer, 2007) provided further support, along with a series of handouts,
rubrics, optional synchronous meetings and five-minute pre-recorded mini-lectures.
Students submitted a list of burning questions related to problems faced in their settings, along with three potential
areas of inquiry during the first week of class. From that list and with instructor guidance, they chose one topic and
moved forward in articulating research questions and creating a proposal.
Proposal submission was a major milestone and quality-control marker. Students included a questions-and-methods table, ensuring coherence between the ends and means of inquiry. As data collection ensued, learners conducted a
review of literature (and best practices found on the Web) as a first round of answering their research questions.
Students then submitted a draft findings section that provided qualitative and quantitative evidence bearing on
research questions. Each section was integrated into a final report. Examples of reports from our co-taught term
(Spring 2011) include Adams (2011), DeNio (2011), Harding (2011), and Shipman (2011), all recent alumni who have
published their studies as part of their graduation portfolios.
After teaching the course at least twice individually prior to co-teaching, we determined that it was essential to
provide learners with a variety of examples throughout the course. Part of the weekly discussion routine involved
reflecting on one or two examples of research, usually student-written reports. Questions for discussion always
correlated with the reading and the overall weekly objectives. We made a concerted effort to identify a wide array of research reports, including formal publications; qualitative, quantitative, and mixed-methods approaches; and settings ranging from K-12, higher education, and libraries to religious congregations, home life, corporate training, and more.
Through conversations around these diverse cases, learners formulated stable concepts of AR and found examples
suitable for emulation in their own work. Equally important, they developed the skill of critiquing AR.
As co-instructors, we communicated a lot behind the scenes so that students perceived us as a united front.
Oftentimes with two instructors, students get confused or take sides when they hear different perspectives and
interpretations (Wiesenberg, 2004). In spite of occasional differences, students seemed to accommodate both of us
as we each offered support and guidance. Because we were both in agreement on approach and terminal goals, it
was easy to take individual initiative when needed and support decisions made by the other instructor. We brought
different strengths to the course and learned by observing each other’s approach.
In spite of our planning, we appreciate Holly, Arhar and Kasten’s (2005) adoption of the metaphor “yellow brick road”
for the title of their book on AR. The AR spirit of adventure applies to our co-teaching as well. While we referred to
several AR textbooks to build the course, we acknowledge the journey that both learners and instructors commit to
as they engage in the action inquiry process.
Key Elements Contributing to Course Success
Instructor and course ratings for the co-taught course ranged between 5.5 and 5.7 (out of 6.0). This was in the 90th percentile of university-taught courses and better than either of us had accomplished
with our individual teaching. In this section we review particular course elements that contributed to student learning
and a successful experience for instructors and students alike. Much of the literature on student success builds on
Vincent Tinto’s work in the 1980s regarding student retention (Tinto, 1993); online students face special challenges but
share a common need for care, engagement, and supports specifically addressing their needs.
Supporting Students
“It is important that distance educators determine the most effective means of introducing students to the online
environment, supporting their assimilation to the virtual learning community and sustaining their motivation as online
learners” (Motteram & Forrester, 2005, p. 284). Boettcher and Conrad (2010) listed 10 best practices for teaching
online, the first of which is being present at the course site. “Being present at the course site is the most fundamental
and important of all the practices” (Boettcher & Conrad, 2010, p. 37). They go on to explain what this means: checking in with the class daily. Learners want to know that someone is at the other end, and they expect a response soon after an email is sent or a question is posted.
One of the obvious advantages of having two instructors in an online course is the additional support that can be
provided to students. While one instructor is grading papers, the other one can focus on engaging in the weekly
discussion and planning for the next week’s activities. Even with the additional support, early interventions with
under-performing students can be a challenge. We provided feedback on major assignments typically within a week,
and updated grades on a weekly basis. A quick scan of student performance allowed us to keep tabs on students
who needed additional attention.
Community of Learners
Russo and Campbell (2004) explored students’ perceptions of social presence in an asynchronous online course.
They determined that frequency of interaction, responsiveness, nonverbal communication, and tone all contributed to
the sense that other participants were real and involved in the course. For online classes, non-verbal cues and tone remain important for synchronous sessions, group conferencing, and recorded feedback. The authors encouraged
learners to communicate with each other about topics outside of the focused academic conversation because it
allowed for the creation of a stronger sense of community. These conversations may start at the beginning of a
course when each participant is introduced. Gunawardena, Nolla, Wilson, Lopez-Islas, Ramirez-Angel and Megchun-
Alpizar (2001) also support informal interaction to facilitate the development of social presence. We offered
threaded discussions for formal and informal discussions. The Virtual Café offered social conversations about
classes, travel, professional engagements, etc. While not heavily used, the forum did provide a comfortable area for
sharing and networking. The formal discussions occurred weekly and focused on either (a) discussion of the AR
examples, or (b) sharing progress and drafts of work on their personal AR projects. These formal discussions proved to be both
beneficial and detrimental to the online classroom. On one hand, they helped build community and encouraged co-
learning. On the other hand, they required an inordinate amount of reading, which can be extremely demanding for
anyone, especially a learner who is not a native speaker of the language of instruction. Replacing asynchronous
discussion boards with Web 2.0 tools such as Wallwisher, VoiceThread, etc. may help to mitigate fatigue that can set
in over the long term with text-based discussions.
The following table provides an overview of several key elements in the course, which we believe contributed to its
success.
TABLE: Table 2: Key Elements Contributing to the Success of the Course
Course Element | Comment
Use of student AR reports as primary text | Most readings were drawn from a diverse pool of student-written reports; students noted elements to emulate in different papers.
Activity checklists – single point of access for weekly activities | Procedural checklists are surprisingly valuable to students seeking to organize their weekly activities.
Frequent short mediated lectures using Jing | Online lectures can be extremely boring – so we kept them short, serving an orientation purpose more than detailed information conveyance.
Incremental completion of complex project with iterative feedback cycles | The multiple submissions of a growing project provided scaffolding for the overall complex task, thus allowing more sophistication than if done in a single submission.
Rich feedback on submitted work – scanned handwritten comments; Track Changes; audio comments | Detailed guidance via personalized feedback steered student performance in a productive direction (especially at first); it also conveyed a sense of caring to students. Audio-recorded voice was especially valuable.
Peer critique on submitted work | Work was routinely submitted to a shared site – so students could see and respond to each other’s submissions. Small-team critiquing helped establish high performance norms before the instructors ever reviewed the documents.
Instructor demos of simple reflection assignments at beginning and end | First-week introductions and end-of-course reflections were meant to be simple and personal; instructors participating in the same assignment helped model expectations.
Optional live sessions for each graded assignment | Tension surrounds online assignments – live sessions helped clarify expectations and guidance for those needing extra support. All sessions were recorded and posted on the course site so learners could refer to them as needed.
Style guide for professional reports based on CARP graphic-design principles – APA for citations but not for style | American Psychological Association (APA) style formatting (double-spaced text, centered headings, etc.) is useful for journal editors but deadly for readers. All assignments are submitted as professional reports using Contrast, Alignment, Repetition, and Proximity (CARP) graphic-design principles – to improve communication and readability.
Supporting students doing out-of-context inquiry | Students in a work setting were outliers with special challenges; extra measures were taken to help these students succeed (one project was recently published in a refereed journal).
Direct student outreach | Instructors would often head off problems by direct phone calls and other means of outreach.
Challenges
Although responding to challenge is what design is all about, the two issues below deserve special mention.
Requirements
One trend we have noticed, as we develop best practices in guiding students to complete an AR project, is that the
projects themselves are increasing in scope. Over the years the average word count has grown to around 7,000 to
8,000 words. In response, we have encouraged smaller scope and shorter reports. These conversations must
happen during the proposal phase so learners do not take on more work than they can possibly complete in a
semester. Length of paper is not in itself a virtue. Longer papers typically provide more details, but they can try a
reader’s patience. Students could see this occasionally as they read assigned reports from other students. Just as we
try to use students’ time well in class, we encourage students to consider their readers’ time as they complete their
projects and write their papers.
Time Management
Time management in online courses has long been recognized as a continuing challenge for both instructors and
students (Dunlap, 2005; Hara & Kling, 1999). Romero and Barbera (2011) surveyed 48 students enrolled in a
graduate online course and found that students who committed time to focus on studies were more likely to be
successful in class. Interestingly, students who reported studying in the mornings had the highest levels of success.
Bozarth, Chapman, and LaMonica (2004) recommended offering an orientation class for learners new to the online
environment, with one of the primary foci of the course being time management.
In the syllabus we explain to students that taking an online course requires them to take more responsibility for
structuring their time. Learners cannot depend on live meetings to structure their time and workload, as they might in
a face-to-face setting. In contrast to the practices of some colleagues, we tell students to expect to spend only six to
nine hours per week on the course. This includes the time to complete assigned readings, as well as any group
activity or discussion work that week. We believe that professionals need to be careful guardians of their time – and
learn not to put inordinate amounts of time into their studies. We do our best to set expectations that learners will
work hard but that we are respectful of their time and will not waste a minute. In more recent courses, we have
offered a video tour of the course with guidance for learners on how to save time navigating the course and engaging
in course discussions.
Bender, Wood and Vredevoogd (2004) investigated the time required to teach in an online environment versus in a
face-to-face environment. When considering the number of students enrolled in both courses, they determined that
instructors tended to spend almost double the amount of time in the asynchronous online course. Their research was
conducted in 2002, before many of the social media tools were available and certainly before online instructors
started understanding the interaction required to successfully connect with students (LaPointe and Gunawardena,
2004).
Time management for us as instructors is equally challenging. Before the class is available to students, all lessons
have been drafted, each week is planned and rubrics are available. This helps us focus on the learners and their
needs throughout the semester. However, even with all that pre-work and planning, lessons need to be modified to fit
the individual classroom needs, papers need feedback and discussions need to be monitored. Co-teaching helps
remedy this issue.
Ongoing Improvements
Co-teaching the class was a rare occurrence – we each are so busy with work and other assignments. But the
evolution of the course continues through our independent teaching of different sections. We continue to gather
student feedback both in the form of unsolicited student correspondence and more formal faculty course
questionnaire data. In the future, we would like to compare the feedback from one semester to the next to determine the extent to which students are satisfied with the course after changes are implemented.
Table 3 offers a wish list of changes under consideration. Some have been implemented in subsequent offerings of the course, and some remain to be done. Most of these points are noteworthy for their lack of innovativeness or theoretical sophistication. Even so, this does not signify a poorly taught course in need of obvious revisions, nor a lack of grounding in the course’s design or development. Every instantiated course is a work in progress. The process of empirical tryout and noticing of needs is a critical part of effective instructional design. The needs outlined above are very feasible and indeed have been largely integrated into continuing offerings of the course. That ongoing improvement cycle offers perhaps the best opportunity for excellence for online instructors (for a related discussion about “improvement science,” see Bryk et al., 2013).
Two additional needs of students warrant mention, but they are not course elements exactly, and have no simple fix.
* Professional voice. Students sometimes struggle to find a professional voice suitable for technical reports. We work
closely with students as they pursue a direct, honest, first-person active, yet professional and credible voice in their
papers. Repeated cycles of feedback are time-intensive but the best way we know of to help with this. Referring
students to university-sponsored writing labs has also helped.
* Critical stance. Students need to find that balance between relevance and rigor in their planning and thinking.
Additionally, they need to develop and maintain a critical stance as they weigh evidence and assign value, with
particular attention to social-justice impacts (Wilson, 2012). This is a continuing issue and goal for our instruction. Again, there is no easy answer, but it remains a priority for the course.
TABLE: Table 3: Contemplated Changes for Future Sections
Contemplated Change | Comments
Single point of access for course documents, rubrics, and assigned readings | Resources are fairly well organized, but improvements could be made – for example, providing a single calendar with embedded download links to all readings and media. With each cycle of offering, material is better organized, based on student feedback and recommendations.
Monitoring of 3- to 5-person teams and feedback given to team members | In spite of two instructors, we failed to monitor the quality of ongoing feedback given within feedback teams. In subsequent courses, more effort has been made to review all peer feedback in the process of providing feedback on individual papers. This takes time when reviewing drafts, but saves time in grading, and most importantly results in a higher-quality final draft.
Dual submission of course questions – email and shell | The LMS does not have a notification-to-email option – so queries to the Assignments area may go unnoticed. In subsequent courses, students have been instructed on where to post questions. This creates a central location the instructor can always check first so that all questions are addressed immediately. Phone numbers and emails are also provided in case an immediate response is needed.
Efficiencies for instructors and students | Further efficiencies are needed, eliminating low-value activities, reducing extraneous cognitive load, and avoiding “scope creep” of final reports. Students should be rewarded for succinctness and managing scope successfully. In subsequent courses, instructors have been more diligent about providing in-depth feedback on the scope of the research proposal during the initial planning phase of the project.
Student-led discussions of readings | Students are very focused on their own projects, but engagement could be increased by student turn-taking as discussion leader; it also gives them practice in taking a leadership role in an online course.
Live sessions – scheduling | We routinely polled students about good times for live sessions – but times were usually Tuesday or Wednesday nights. In subsequent courses, the poll is sent one time, and optional live sessions are scheduled within those pre-planned times.
MS Word styles and Track Changes | Many students don’t know how to control formatting in MS Word (e.g., styles for headings, block text, etc.) or use Track Changes effectively. In subsequent courses, students are referred to previously recorded online tutorials for additional support.
Encouraging partners in research projects | Some of our best projects each semester tend to be from students who chose to work as a team. In subsequent courses, partnerships have been strongly encouraged.
Conclusions and Implications for Practice
The most powerful benefit of the co-teaching experience may be in the opportunity to build capacity (in the course
and in the instructors) and learn from one another. This happens when we are open to making mistakes, letting each
other take risks, and afterward reflecting on the effectiveness of whatever approach we chose to take. Maintaining
positive energy in spite of lags or challenges can be tremendously empowering in teams; our experience confirms
that as well.
The narrative above made minimal mention of theories of learning and instruction, yet they did exert an influence on
design decisions, simply because we are both knowledgeable about the field. Theories relevant to the course
include:
* Pedagogical capital (Wilson & Switzer, 2012). By consistently meeting students’ needs over time, instructors can
build a store of trust with students, which at critical points can be drawn on to challenge and motivate students to
achieve more than they normally would.
* Scaffolding of complex performance. Completing the inquiry process by breaking down the task into pieces with
iterative feedback cycles may be seen in either behaviorist or Vygotskian terms (Wood, Bruner, & Ross, 1976).
* Dramatic pacing. Courses need to maintain sustainable effort throughout the term (Duffy & Jones, 1995); principles
of dramatic pacing drawn from aesthetics can help (Parrish, 2005).
* Cognitive load. Students’ limited cognitive capacity needs to focus on material intrinsically related to the task, avoiding
unnecessary distractions. Time and cognitive load spent finding info, managing the interface, and learning class-
related performance routines should be kept to a minimum (Clark, Nguyen, & Sweller, 2005).
* Self-efficacy and self-regulation. Students who see themselves as competent and resourceful learners, and who
learn strategies for self-regulation, are more successful in academic work (Schunk, 1991).
* Caring. Instructors can show care toward students (even adults) through personal attention, respectful treatment
and personalized feedback and interactions (cf. Noddings, 2003).
* Community of inquiry. Instructors can foster a positive learning experience through selecting appropriate content,
supporting content-relevant discourse, and setting a healthy and trusting climate (Garrison, Anderson, & Archer,
2001).
* Activity theory. An online course is an activity system, with meaningful objects, tools, and division of labor.
Improvements can be made by examining tensions and contradictions in the system (Engeström, 2000).
* Worked cases. Students can learn AR practices by reviewing a diverse set of worked examples (Chi & Bassok,
1989; Merrill, 1968).
* Authentic learning. Students benefit when assigned projects are grounded in significant real-life problems
(Herrington & Herrington, 2006).
This is an eclectic mix of theoretical perspectives, with incompatibilities and contradictions among them. This violates
Hannafin et al.’s (1997) rule to ground instruction in a consistent theory. Instead, as noted in the introduction, the
grounding is in the practice itself. Throughout our collaboration, we were willing to mix and match ideas drawn from
different theories. The fidelity we sought was not to any theory, but rather to the students, the course goals, and the
situation. This is consistent with a practice-based approach to design, described above. We have tried to make our
thinking and decision-making transparent, to facilitate transfer of ideas to the reader’s own situation, depending on the needs and problems addressed.
As a means of professional development, co-teaching can be expensive (paying two instructors rather than one – or
asking instructors to work for less). Co-teaching is more participatory and less authority-driven than most methods of
professional development. It has the benefit, though, of building capacity within a course, as well as within instructors, and of fostering more innovation than traditional methods. More research is needed to fully determine the benefits and
concerns related to co-teaching, particularly in an online setting, as an institutional means for accomplishing both
ends.
References
Adams, B. (2011, April). Maintaining quality of instruction while transitioning from face-to-face to online training.
Retrieved from
http://brentadams.weebly.com/uploads/3/6/2/6/3626763/brentadams%5Faction%5Finquiry%5Fproject%5F041511 x
Bender, D. M., Wood, B. J., & Vredevoogd, J. D. (2004). Teaching time: Distance education versus classroom
instruction. American Journal of Distance Education, 18(2), 103-114.
Boettcher, J. V., & Conrad, R. M. (2010). The online teaching survival guide: Simple and practical pedagogical tips.
San Francisco, CA: Jossey Bass.
Bourdieu, P. (1990). The logic of practice (R. Nice, Trans.). Stanford, CA: Stanford University Press.
Bozarth, J., Chapman, D. D., & LaMonica, L. (2004). Preparing for distance learning: Designing an online student
orientation course. Educational Technology and Society, 7(1), 87-106.
Bryk, A., Yeager, D. S., Hausman, H., Muhich, J., Grunow, A., LeMahieu, P., & Gomez, L. (2013). Improvement
research carried out through networked communities: Accelerating learning about practices that support more
productive student mindsets. Retrieved from
http://www.carnegiefoundation.org/sites/default/files/improvement%5Fresearch%5FNICs%5Fbryk-yeager
Chi, M. T. H., & Bassok, M. (1989). Learning from examples via self-explanations. In L. B. Resnick (Ed.), Knowing,
learning, and instruction: Essays in honor of Robert Glaser (pp. 251-282). Hillsdale, NJ: Erlbaum.
Clark, R. C., Nguyen, F., & Sweller, J. (2005). Efficiency in learning. San Francisco, CA: Pfeiffer.
DeNio, D. (2011, May). Lyme disease: A hidden epidemic. Investigating the impact of social media on patient
education. Retrieved from
http://dawndenioportfolio.weebly.com/uploads/2/1/8/3/2183383/lymedisease%5Factionresearchproject
Duffy, D. K., & Jones, J. W. (1995). Teaching within the rhythms of the semester. San Francisco: Jossey-Bass.
Dunlap, J. C. (2005). Workload reduction in online courses: Getting some shuteye. Performance Improvement, 44(5),
18-25.
Engeström, Y. (2000). Activity theory as a framework for analyzing and redesigning work. Ergonomics, 43(7), 960-
974.
Friesen, N. (2008). Chronicles of change: The narrative turn and e-learning research. E-Learning and Digital Media,
5(3).
Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing
in distance education. The American Journal of Distance Education, 15(1), 1-24.
Gunawardena, C. N., Nolla, A. C., Wilson, P. L., Lopez-Islas, J. R., Ramirez-Angel, N., & Megchun-Alpizar, R. M.
(2001). A cross-cultural study of group process and development in online conferences. Distance Education, 22(1),
85-121.
Hannafin, M. J., Hannafin, K., Land, S. M. & Oliver, K. (1997). Grounded practice and the design of constructivist
learning environments. Educational Technology Research and Development, 45(3), 101-117.
Harding, D. (2011, May). Is there a benefit to using flipchart software for individual student interaction? Retrieved
from https://docs.google.com/file/d/0B%5Fl2PyVfdCVXZDEwMDM5MmUtNDRjOC00ZjEzLWE5YmQtZGI1ZDgwMGFmOGJh/edit
Hara, N. & Kling, R. (1999). Students’ frustration with a web based distance education course. First Monday, 4(12).
Retrieved from http://firstmonday.org/ojs/index.php/fm/article/view/710/620
Herrington, A., & Herrington, J. (Eds.). (2006). Authentic learning environments in higher education. Hershey, PA:
Information Science Publishing.
Holly, M. L., Arhar, J., & Kasten, W. C. (2005). Action research for teachers: Traveling the yellow brick road (2nd ed.).
Upper Saddle River, NJ: Pearson Education, Inc.
Kemmis, S. (2011). Recognising and respecting diversity in understandings of practice. In C. Kanes (ed.), Elaborating
professionalism: Studies in practice and theory, 5, 139-165.
LaPointe, D. K., & Gunawardena, C. N. (2004). Developing, testing and refining of a model to understand the
relationship between peer interaction and learning outcomes in computer-mediated conferencing. Distance
Education, 25(1), 83-106.
Merrill, M. D. (1968). Teaching concepts: An instructional design guide. Englewood Cliffs, NJ: Educational Technology
Publications.
Motteram, G., & Forrester, G. (2005). Becoming an online distance learner: What can be learned from students’
experiences of induction to distance programs? Distance Education, 26(3), 281-298.
Noddings, N. (2003). Happiness and education. Cambridge: Cambridge University Press.
Parrish, P. (2005). Embracing the aesthetics of instructional design. Educational Technology, 45(2), 16-25.
Postill, J. (2008, October 30). What is practice theory? Retrieved from http://johnpostill.com/2008/10/30/what-is-
practice-theory
Romero, M., & Barbera, E. (2011). Quality of e-learners’ time and learning performance beyond quantitative time-on-
task. The International Review of Research in Open and Distance Learning, 12(5), Retrieved from
http://www.irrodl.org/index.php/irrodl/article/view/999/1870
Russo, T. C., & Campbell, S. W. (2004). Perceptions of mediated presence in an asynchronous online course:
Interplay of communication behaviors and medium. Distance Education, 25(2), 215-232.
Sagor, R. (2000). Guiding school improvement with action research. Alexandria, VA: Association for Supervision and
Curriculum Development.
Schunk, D. (1991). Self-efficacy and academic motivation. Educational Psychologist, 26 (3/4), 207-231.
Sennett, R. (2008). The craftsman. New Haven: Yale University Press.
Shipman, J. (2011, April). Exploring impacts of interactive whiteboard use in elementary school. Retrieved from
Stringer, E. T. (2007). Action research (3rd ed.). Los Angeles, CA: SAGE Publications.
Tinto, V. (1993). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). Chicago: The
University of Chicago Press.
Wiesenberg, F. (2004). Designing and co-facilitating online graduate classes: Reflections and recommendations. Canadian Journal of University Continuing Education, 30(2), 39-57. Retrieved from
http://www.ccde.usask.ca/cjuce/articles/v30pdf/3022
Wilson, B. G. (2012). Developing a critical stance as an e-learning specialist: A primer for new professionals. In S. B.
Fee & B. Belland (Eds.), The role of criticism in understanding problem solving: Honoring the work of John C. Belland
(pp. 57-68). New York: Springer.
Wilson, B. G. (2013). A practice-centered approach to instructional design. In J. M. Spector, B. B. Lockee, S. E.
Smaldino, & M. Herring (Eds.), Learning, problem solving, and mind tools: Essays in honor of David H. Jonassen.
New York: Routledge.
Wilson, B. G. (2014, April). Knowledge creation in design-based research projects: Complementary efforts of
academics and practitioners. Presented at the meeting of the American Educational Research Association,
Philadelphia, PA.
Wilson, B. G., & Switzer, S. H. (2012, June). Pedagogical capital: Strengthening and leveraging trusting relationships
in online courses. Paper presented at EdMedia, Denver, CO.
Wood, D. J., Bruner, J. S., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17(2), 89-100.
Appendix A: Sample of an Activity Checklist and Discussion (from Week 4)
This week you submit your Problem Statement, consisting of the front section of your research project, including
introduction, purpose statement, research questions, and initial thoughts about method. A number of readings
relating to research methods are planned, beginning this week and continuing throughout the course.
1. Download revised schedule with readings. Readings with links are available on the Web; those unlinked should be
available in Doc Sharing. Feel free to look ahead at upcoming readings if you are anxious to proceed with your
inquiry project.
2. Watch the mini-lecture, in which the instructor describes how his underlying beliefs have evolved over the years.
3. Read Gary Thomas’s chapter, The Design Frame, available in three parts in Doc Sharing. This is a long and
sometimes tangential chapter; feel free to skim parts that seem less central to our purposes. Our goal in assigning
this chapter is to convey the various forms your inquiry can take – and how each form of inquiry makes different
assumptions about the world.
4. In addition to the Thomas chapter, download and read these two short items for discussion: Darcie Gudger’s reflection and Melissa Vance’s comparison of intellectual development models.
5. Prepare and submit your Problem Statement within the weekly area, using the scoring rubric to guide your work.
Submit to the common area, then look for your Group’s area (A, B, C, or D) to provide feedback to your group
members. The Problem Statement is due Sunday, but you may begin reading and critiquing group members’
submissions as soon as you see them submitted.
We have scheduled a live session for discussion about this week’s assignment (see the announcement area for more
detail). Here is the web location: https://connect.cuonline.edu/assignment-support/.
[Direction for weekly discussion]
Our discussion reflects on how our experience shapes our beliefs – and how our beliefs shape our practices.
* Download and read Darcie Gudger’s reflection (below).
* Think about your own intellectual and career development over the years.
* Then reflect on your inquiry project and how your beliefs and assumptions about learning and practice are shaping
it (and perhaps shaped by it).
Theorists of intellectual development describe how people evolve in their views of knowledge – from very black-and-white, authoritarian views to a more situated view, contingent on timing, place, and circumstance. See the attached handout below, from Melissa Vance’s dissertation.
With all these thoughts in mind, let’s consider:
* How have your experiences shaped your beliefs – and in turn, your beliefs shaped your professional practice
(including your inquiry project)?
* How would you describe your stage or progression of intellectual development? How have you progressed since
beginning work in education and training? How does this affect your choices in your career?
This work is licensed under a Creative Commons Attribution 3.0 License.
~~~~~~~~
By Brent G. Wilson, Professor of Information and Learning Technologies, University of Colorado Denver, and Jennifer Linder VanBerschot, instructional designer/project manager for Colorado State University-Global Campus and adjunct professor, University of Colorado Denver
Brent G. Wilson. Email: brent.wilson@ucdenver.edu. Brent is Professor of Information and Learning Technologies at
the University of Colorado Denver. His research addresses these questions: What is good instruction and how can
we support its creation? How can we help instructors and learners make appropriate use of tools and resources?
Jennifer Linder VanBerschot. Email: jennifer.vanberschot@ucdenver.edu. Jenna is an instructional designer/project
manager for Colorado State University-Global Campus and an adjunct professor for University of Colorado Denver.
She is interested in the use of technology and social software to promote interaction across cultures and between
international learning communities.
Source: Canadian Journal of Learning and Technology, 20140101, Vol. 40 Issue 2, 18p
Item: EJ1030427