
Reviewing the Quality of Discourse Information Measures in Aphasia

Pritchard, M., Hilari, K., Cocks, N. & Dipper, L. (2017). Reviewing the Quality of Discourse Information Measures in Aphasia. International Journal of Language and Communication Disorders, 52(6), pp. 689-732. doi: 10.1111/1460-6984.12318

Abstract

Background: Discourse is fundamental to everyday communication, and is an increasing focus of clinical assessment, intervention, and research. Aphasia can affect the information a speaker communicates in discourse. Little is known about the psychometrics of the tools for measuring information in discourse, which means it is unclear whether these measures are of sufficient quality to be used as clinical outcome measures or diagnostic tools.

Aims: The current review aimed to profile the measures used to describe information in aphasic discourse, and assess the quality of these measures against standard psychometric criteria.

Methods: A scoping review method was employed. Studies were identified through a systematic search of the Scopus, Medline, and Embase databases, and standard psychometric criteria were used to evaluate the properties of the measures.

Main contribution: The current review summarises and collates the information measures used to describe aphasic discourse, and evaluates their quality in terms of the psychometric properties of acceptability, reliability, and validity. Seventy-six studies described 58 discourse information measures, with a mean of 2.28 measures used per study (SD = 1.29, range 1-7). Measures were classified as functional measures (n = 33), which focused on discourse macrostructure, and functional and structural measures (n = 25), which combined microlinguistic and macrostructural approaches to discourse as described by Armstrong (2000). There were no reports of the acceptability of the data generated by the measures (distribution of scores, missing data). Test-retest reliability was reported for just 8/58 measures, with 3/8 exceeding 0.80. Intra-rater reliability was reported for 9/58 measures, and in every case percent agreement was reported rather than a reliability statistic. Percent agreement was also frequently reported for inter-rater reliability; only 4/76 studies reported reliability statistics, covering 12/58 measures, and these were generally high (> 0.80 for 11/12 measures). The majority of measures related clearly to the discourse production model described by Sherratt (2007), indicating content validity. 36/58 measures were used to make 41 comparisons between people with aphasia (PWA) and neurologically healthy participants (NHP), with 31/41 comparisons showing a difference between the groups. Four comparisons were made between genres: two measures showed a difference between genres and two showed no difference.
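The distinction drawn above between percent agreement and a reliability statistic can be made concrete with a chance-corrected index such as Cohen's kappa; the following worked figures are illustrative only and are not drawn from the review itself:

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is observed agreement and p_e is the agreement expected by chance. If two raters each judge a discourse feature present in 90% of samples and agree on 90% of cases, then p_e = 0.9 × 0.9 + 0.1 × 0.1 = 0.82, giving \kappa = (0.90 − 0.82)/(1 − 0.82) ≈ 0.44 — well below the conventional 0.80 benchmark despite apparently high raw agreement. This is why percent agreement alone cannot establish a measure's reliability.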

Conclusions: There is currently insufficient information available to justify the use of discourse information measures as sole diagnostic or outcome measurement tools. Yet the majority of measures are rooted in relevant theory, and there is emerging evidence regarding their psychometric properties. There is significant scope for further psychometric strengthening of discourse information measurement tools.

Publication Type: Article
Additional Information: This is the peer reviewed version of the following article: Pritchard, M., Hilari, K., Cocks, N. & Dipper, L. (2017). Reviewing the Quality of Discourse Information Measures in Aphasia. International Journal of Language and Communication Disorders, which has been published in final form at http://dx.doi.org/10.1111/1460-6984.12318. This article may be used for non-commercial purposes in accordance with Wiley Terms and Conditions for Self-Archiving.
Subjects: P Language and Literature
Departments: School of Health & Psychological Sciences > Language & Communication Science
Full text: Accepted Version (543kB)
Supplemental material: Appendix (419kB)
