Sunday, May 25, 2008

Assessing Social Presence In Asynchronous Text-based Computer Conferencing - Article Critique

Rourke, Anderson, Garrison, and Archer (2001) recognized the potential of computer-mediated conferencing (CMC) to achieve higher-order thinking goals. They also recognized that to achieve this potential it is important to “develop research methods that explore the nature of teaching and learning in these environments” (p. 3), use these methods in authentic contexts, and subsequently use the results to make the most of CMC technology. In this article, they address all three of those areas by developing and testing a tool that can be used to assess social presence in the CMC environment. The article begins with a review of Garrison, Anderson, and Archer’s (as cited in Rourke et al., 2001) community of inquiry model and then focuses on one element of the model, social presence. The strategy used to develop the tool is reviewed, along with the method used to test it: assessing social presence in two transcripts from computer conferencing.

Critique

The initial review of the Community of Inquiry framework serves to explain the instrumental role of social presence in the overall educational experience. This is an appropriate lead-in to a literature review that adequately debates the long-accepted view that nonverbal cues are essential to rich communication and full understanding, and that CMC therefore cannot adequately support social and affective interaction. The authors cite more recent research that challenges those beliefs and acknowledges that CMC can support such interaction in educational settings. To further justify their research, they point out some critical differences between the earlier work and more recent research on today’s CMC environment. First, the earlier research on teacher immediacy, which is the foundational work of the social presence construct, implied that the teacher was responsible for creating a social environment; in the community of inquiry framework, by contrast, social presence is a function of both learners and teachers. Second, since the CMC environment is largely text-based and asynchronous, social behaviours must be considered in that context. Both points are valid.

Methods for developing coding

The methodology used for developing the content analysis tool was logical; it relied on prior research in social presence and communication, and on careful reading of sample transcripts, to create an inclusive list of coding categories. Categories were derived from Garrison et al.’s (as cited in Rourke et al., 2001) three categories of social presence: emotional expression, open communication, and group cohesion. The researchers discuss their concerns about finding these indicators, specifically how to determine the unit of analysis and how to achieve interrater reliability. The discussion of each of these challenges presents an acceptable methodology for addressing reliability concerns and moving forward to test the model.

They included two tables to illustrate the ‘Model and Template for Assessment of Social Presence’ and a ‘Sample of Coded Text’. This was very confusing, as both appeared to be identical. Although it was clear to me how they would obtain and categorize each unit of analysis, I reread this section several times to see if there was something I was missing. I am still unclear whether one table was incorrectly inserted in the document in place of an omitted table, or whether the same table was being used to illustrate each point. In the latter case, I would suggest deleting the second table.

Methodology for testing the tool

Selected transcripts from two graduate-level courses were used. Both courses had a similar structure, which included an online discussion between the instructor, the students, and two student moderators. In both courses, the roles and expectations of students and moderators were similar; however, in one case the teacher’s role was more prevalent and interactive. This difference is acceptable since the study is not evaluating factors that influence social presence.

Three researchers worked together to code the package, engaging in constant dialogue throughout the process to set a sound protocol. The article states that, once this protocol was determined, two coders worked independently to code the two conference selections. I am unclear whether two new coders were brought in to work with the set protocol or whether they came from the original three. This is relevant to reliability, as discussions had already taken place around the units of analysis, which presents a potential bias.

Results

There were 2.5 times more instances of social presence in transcript A than in transcript B; however, transcript A also contained significantly more messages and words. To account for these differences, the researchers developed a calculation called “social presence density” to allow for a more meaningful comparison of transcripts.
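As I understand it, the density measure simply normalizes the raw count of coded instances by the length of the transcript, so that a longer transcript does not automatically appear more "social". A minimal sketch of that normalization (per 1,000 words, with hypothetical figures rather than the study's actual counts):

```python
def social_presence_density(instances, word_count):
    """Coded instances of social presence per 1,000 words of transcript."""
    return instances / word_count * 1000

# Hypothetical counts: transcript A has more raw instances,
# but transcript B is denser once length is accounted for.
density_a = social_presence_density(instances=250, word_count=30000)
density_b = social_presence_density(instances=100, word_count=8000)
print(round(density_a, 2))  # 8.33
print(round(density_b, 2))  # 12.5
```

This illustrates why a raw comparison of instance counts (250 vs. 100) can invert once transcript length is taken into account.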

Results indicated that interrater reliability for individual indicators varied according to the manifest versus latent nature of the indicator. This is not surprising, because a latent indicator such as ‘humor’ is more subjective than a manifest indicator such as ‘continuing a thread’, which either does or does not exist. They justified the lower reliability figure by citing Riffe et al. (as cited in Rourke et al., 2001), who explained that research that is “breaking new ground with concepts that are rich in analytical value may go forward with reliability levels that are somewhat below that range” (p. 14). Although I agree that new tools take time to perfect, I am disappointed that the researchers failed to implement the most obvious method of confirming their interpretations: interviewing the course participants. Since reliability was a concern, results could have been verified and qualified by speaking with participants from each of the CMC classes to cross-check the coding. Triangulation contributes to the trustworthiness of data (Glesne, 2006), and this additional information would have added to the validity of the results. This step would be effective even if implemented only in the initial stages of setting the protocol, to ensure coding was accurate, and it would be especially valuable for the latent indicators, where interrater reliability was only 0.25. Interview questions could include, “When you wrote ‘ya right!’ at the end of the phrase, what were you indicating?”
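For readers unfamiliar with how such figures are produced: the simplest interrater reliability measure is the proportion of units on which two coders assign the same code (the article reports more formal coefficients, so this is only an illustrative proxy, and the codes below are invented):

```python
def percent_agreement(codes_a, codes_b):
    """Proportion of units coded identically by two independent coders."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

# Hypothetical codes assigned to the same eight units of analysis.
# Note that the disagreements fall on the latent indicator 'humor'.
coder1 = ["humor", "none", "thread", "thread", "none",  "humor", "none", "thread"]
coder2 = ["none",  "none", "thread", "thread", "humor", "humor", "none", "thread"]
print(percent_agreement(coder1, coder2))  # 0.75
```

Even in this toy example, the two coders agree perfectly on the manifest indicator (‘thread’) and disagree only where subjective judgment about humor is required, which mirrors the pattern the authors report.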

Conclusion

The article is well written and organized, with the exception of the late statement of the main purpose of the research. In the discussion section, the authors state that the main purpose was to develop a “methodology that would identify and analyse the social presence component of educational computer conferences” (Rourke et al., 2001, p. 14). I would have preferred that this statement be provided up front, as the progression of the article made more sense and was easier to follow once I had read the purpose statement.

I accept the study’s conclusion that the “social presence density calculation provides an important quantitative description of computer conferencing environments” (Rourke et al., 2001, p. 16). As indicated, there is much to be learned about the importance of social presence in the online world and its effect on student achievement, satisfaction, and higher-order thinking, and this calculation can support further study in those areas. I also agree that the tool presented will help others critically assess the level of social presence in their classes (Rourke et al., 2001). I do feel, however, that the biggest failing of this study is that it did not qualify the quantitative results by speaking directly with course participants.

Reflection

Although I do not anticipate using this tool in any formal research, I will use it to gauge participant engagement when we introduce online programs in the workplace. The researchers have effectively illustrated how social presence can be assessed in written text, and this information can assist instructors in identifying when social presence is strong or lacking, so they may intervene as necessary.

As well, considering the categories of indicators when reading any text-based interaction may be useful in better understanding the sender’s intent. Text-based communications are used increasingly, and misunderstandings are not uncommon. Awareness of these social cues will be helpful in reviewing not only text from students but also “group reports” from subordinates that are later determined to represent the view of one individual, in situations where group cohesion is clearly non-existent.
The authors suggest areas for research that would be helpful to my personal practice in the transition to online learning. Determining the “relative influence and importance of each of the indicators on social presence” (Rourke et al., 2001, p. 15) would help differentiate between messages that represent true social presence and replies posted merely to satisfy minimum requirements, where there is a misconception that the quantity, rather than the quality, of postings counts toward evaluation. Also pertinent to my practice as an air traffic control instructor would be further evaluation of sufficient or optimal levels of social presence. Finding the right level of social presence is an existing challenge in our program, resulting from low student-teacher ratios and the power imbalance that exists because course failure results in loss of job potential. Instructors need to balance the role of pedagogical leader with that of co-learner and friend. We attempt to mitigate this situation by creating a collaborative learning environment, and these efforts will need to be extended to any online learning environment.


References:

Glesne, C. (2006). Becoming qualitative researchers: An introduction (3rd ed.). New York: Addison Wesley Longman.

Rourke, L., Anderson, T., Garrison, D. R., & Archer, W. (1999). Assessing social presence in asynchronous, text-based computer conferences. Journal of Distance Education, 14(3), 51-70.