
What are Tests Assessing? Context-based Assessment as a Learning Tool

Renée Marshall is an International Programs Specialist at CASLS. She has experience teaching high school and university-level French and currently teaches a course for international internship program students.

When working toward my M.Ed., I noticed that the chapter tests I was giving my students often had little or no meaningful context or purpose. They consisted of fill-in-the-blank grammar questions and vocabulary translation questions, neither of which required understanding of the language or its use in realistic contexts. Even students who had done well on the grammar and vocabulary sections of the test would later have trouble creating a skit using the same words and grammar. They had to look up the same grammar and vocabulary again, often without realizing they had already been tested on that material. As Wiggins (1993) stated: “We cannot be said to ‘understand’ something, in other words, unless we can employ it wisely, fluently, flexibly, and aptly in particular and diverse contexts” (p. 207). So I asked myself: what is the point of these assessments if they don’t tell me how my students’ language proficiency is developing?

In order to answer my M.Ed. inquiry question, “What are tests assessing?,” I began giving both alternative assessments and chapter tests from the book and comparing the results. After examining three alternative assessments against three chapter tests, I noticed that in terms of grammar accuracy, nothing really changed. Students made the same grammar mistakes on the chapter tests as they did on the alternative assessments (which included an advice blog, a poetry anthology, and a video newscast). However, the presence of context and the students’ ability to use the language in real-life situations were noticeably different between the two. Students were able to successfully engage in tasks such as giving each other dating advice in the target language and evaluating whether or not that advice was good. They were creating with the language on their own and, as a result, learning the grammatical conventions of giving advice in French.

Even though grammar accuracy did not appear to improve or change from the traditional context-less tests to the context-rich alternative assessments, the students were actively learning real-world application of the language from the alternative assessments: how to use the language to accomplish a task (giving advice) in different situations and how to practice other important skills (like creating and maintaining a blog), or, as Wiggins (1993) puts it, the “real, ‘messy’ uses of knowledge in context—the ‘doing’ of a subject” (p. 207). In this way, providing a context and a purpose seemed to create an assessment that also served as a learning tool.

For a few examples of contextualizing test questions, see this week’s Activity of the Week.

References:

Wiggins, G. P. (1993). Assessing student performance: Exploring the purpose and limits of testing. San Francisco: Jossey-Bass.

Source: CASLS Topic of the Week