Testing spelling: How does a dictation method measure up to a proofreading and editing format?

Tessa DAFFERN, Noella Mackenzie, Brian Hemmings

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

In response to increasing data-based decision making in schools comes increased responsibility for educators to consider measures of academic achievement in terms of their reliability, validity and practical utility. The focus of this paper is on the assessment of spelling. Among the methods used to assess spelling competence, tasks that require the production of words from dictation, or the proofreading and editing of spelling errors, are common. In this study, spelling achievement data from the National Assessment Program - Literacy and Numeracy (NAPLAN) Language Conventions Test (a proofreading-and-editing-based measure) and the Components of Spelling Test (CoST) (a dictation-based measure) were examined. Results of a series of multiple regression analyses (MRAs) were based on a sample of low-achieving and high-achieving spellers from the Australian Capital Territory (ACT) in Year 3 (n=145), Year 4 (n=117), Year 5 (n=133) and Year 6 (n=117). Findings indicated significant relationships between scores in the spelling domain of the NAPLAN Language Conventions Test and the phonological, orthographic and morphological subscale scores of the CoST. Further, the orthographic subscale of the CoST was generally the main predictor of NAPLAN spelling across year levels. Analysis also demonstrated that gender was not an influential factor. Implications for assessment and instruction in spelling are discussed in this paper, and the CoST is offered as a valid, reliable and informative measure of spelling performance for use in school contexts or future research projects.
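
As a rough illustration of the analytic approach described above, the sketch below shows a multiple regression of the kind reported in the study, fitted in Python with pandas and statsmodels: NAPLAN spelling scores regressed on the three CoST subscale scores, with gender included as a categorical factor. The data file and column names are hypothetical placeholders, not the study's actual dataset or model specification.

# Minimal sketch of a multiple regression analysis (MRA) of the kind described
# in the abstract. The CSV file and all column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-student records for one year level (e.g. Year 3).
df = pd.read_csv("year3_spelling_scores.csv")

# Ordinary least squares: NAPLAN spelling on the phonological, orthographic and
# morphological CoST subscale scores, plus a gender indicator.
model = smf.ols(
    "naplan_spelling ~ cost_phonological + cost_orthographic"
    " + cost_morphological + C(gender)",
    data=df,
).fit()

# Coefficients, p-values and R-squared indicate which subscale predicts NAPLAN
# spelling most strongly (the study reports the orthographic subscale was
# generally the main predictor, and that gender was not influential).
print(model.summary())

Running the same specification separately for each year level would mirror the per-cohort comparison summarised in the abstract.
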
Original language: English
Article number: 6
Pages (from-to): 1-23
Number of pages: 23
Journal: Australian Journal of Language and Literacy
Volume: 2017
Issue number: FEB
Publication status: Published - 2017

Fingerprint

literacy, language, academic achievement, school, testing, editing, spelling, dictation, educator, instruction, decision making, regression, responsibility, gender, performance, numeracy, orthographic

Cite this

DAFFERN, Tessa; Mackenzie, Noella; Hemmings, Brian. / Testing spelling: How does a dictation method measure up to a proofreading and editing format? In: Australian Journal of Language and Literacy. 2017; Vol. 2017, No. FEB. pp. 1-23.
@article{bc01aea5f7984874abe788097dd2629c,
title = "Testing spelling: How does a dictation method measure up to a proofreading and editing format?",
abstract = "In response to increasing data-based decision making in schools comes increased responsibility for educators to consider measures of academic achievement in terms of their reliability, validity and practical utility. The focus of this paper is on the assessment of spelling. Among the methods used to assess spelling competence, tasks that require the production of words from dictation, or the proofreading and editing of spelling errors, are common. In this study, spelling achievement data from the National Assessment Program - Literacy and Numeracy (NAPLAN) Language Conventions Test (a proofreading-and-editing-based measure) and the Components of Spelling Test (CoST) (a dictation-based measure) were examined. Results of a series of multiple regression analyses (MRAs) were based on a sample of low-achieving and high-achieving spellers from the Australian Capital Territory (ACT) in Year 3 (n=145), Year 4 (n=117), Year 5 (n=133) and Year 6 (n=117). Findings indicated significant relationships between scores in the spelling domain of the NAPLAN Language Conventions Test and the phonological, orthographic and morphological subscale scores of the CoST. Further, the orthographic subscale of the CoST was generally the main predictor of NAPLAN spelling across year levels. Analysis also demonstrated that gender was not an influential factor. Implications for assessment and instruction in spelling are discussed in this paper, and the CoST is offered as a valid, reliable and informative measure of spelling performance for use in school contexts or future research projects.",
keywords = "Spelling, Linguistics, Assessment",
author = "Tessa DAFFERN and Noella Mackenzie and Brian Hemmings",
year = "2017",
language = "English",
volume = "2017",
pages = "1--23",
journal = "Australian Journal of Language and Literacy",
issn = "1038-1562",
publisher = "Australian Reading Association",
number = "FEB",

}

Testing spelling: How does a dictation method measure up to a proofreading and editing format? / DAFFERN, Tessa; Mackenzie, Noella; Hemmings, Brian.

In: Australian Journal of Language and Literacy, Vol. 2017, No. FEB, 6, 2017, p. 1-23.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Testing spelling: How does a dictation method measure up to a proofreading and editing format?

AU - DAFFERN, Tessa

AU - Mackenzie, Noella

AU - Hemmings, Brian

PY - 2017

Y1 - 2017

N2 - In response to increasing data-based decision making in schools comes increased responsibility for educators to consider measures of academic achievement in terms of their reliability, validity and practical utility. The focus of this paper is on the assessment of spelling. Among the methods used to assess spelling competence, tasks that require the production of words from dictation, or the proofreading and editing of spelling errors, are common. In this study, spelling achievement data from the National Assessment Program - Literacy and Numeracy (NAPLAN) Language Conventions Test (a proofreading-and-editing-based measure) and the Components of Spelling Test (CoST) (a dictation-based measure) were examined. Results of a series of multiple regression analyses (MRAs) were based on a sample of low-achieving and high-achieving spellers from the Australian Capital Territory (ACT) in Year 3 (n=145), Year 4 (n=117), Year 5 (n=133) and Year 6 (n=117). Findings indicated significant relationships between scores in the spelling domain of the NAPLAN Language Conventions Test and the phonological, orthographic and morphological subscale scores of the CoST. Further, the orthographic subscale of the CoST was generally the main predictor of NAPLAN spelling across year levels. Analysis also demonstrated that gender was not an influential factor. Implications for assessment and instruction in spelling are discussed in this paper, and the CoST is offered as a valid, reliable and informative measure of spelling performance for use in school contexts or future research projects.

AB - In response to increasing data-based decision making in schools comes increased responsibility for educators to consider measures of academic achievement in terms of their reliability, validity and practical utility. The focus of this paper is on the assessment of spelling. Among the methods used to assess spelling competence, tasks that require the production of words from dictation, or the proofreading and editing of spelling errors, are common. In this study, spelling achievement data from the National Assessment Program - Literacy and Numeracy (NAPLAN) Language Conventions Test (a proofreading-and-editing-based measure) and the Components of Spelling Test (CoST) (a dictation-based measure) were examined. Results of a series of multiple regression analyses (MRAs) were based on a sample of low-achieving and high-achieving spellers from the Australian Capital Territory (ACT) in Year 3 (n=145), Year 4 (n=117), Year 5 (n=133) and Year 6 (n=117). Findings indicated significant relationships between scores in the spelling domain of the NAPLAN Language Conventions Test and the phonological, orthographic and morphological subscale scores of the CoST. Further, the orthographic subscale of the CoST was generally the main predictor of NAPLAN spelling across year levels. Analysis also demonstrated that gender was not an influential factor. Implications for assessment and instruction in spelling are discussed in this paper, and the CoST is offered as a valid, reliable and informative measure of spelling performance for use in school contexts or future research projects.

KW - Spelling

KW - Linguistics

KW - Assessment

UR - http://www.scopus.com/inward/record.url?scp=85018508375&partnerID=8YFLogxK

M3 - Article

VL - 2017

SP - 1

EP - 23

JO - Australian Journal of Language and Literacy

JF - Australian Journal of Language and Literacy

SN - 1038-1562

IS - FEB

M1 - 6

ER -