Graduate teaching assistants’ assessment of students’ problem formulation within model-eliciting activities
Format: Conference or Workshop Item
Language: English
Published: American Society for Engineering Education, 2010
Online Access:
http://irep.iium.edu.my/44342/1/44342_Graduate%20teaching%20assistants%E2%80%99%20assessment.pdf
http://irep.iium.edu.my/44342/
https://peer.asee.org/graduate-teaching-assistants-assessment-of-students-problem-formulation-within-model-eliciting-activities
Summary: Model-Eliciting Activities (MEAs) are open-ended engineering problems set in realistic contexts, which require teams of students to create a mathematical model to solve a client's problem. At the beginning of each MEA, students are required to answer three questions: Q1) "Who is the client?", Q2) "In one or two sentences, what does the client need?", and Q3) "Describe at least two issues that need to be considered when developing a solution for the client." These questions are designed to guide students' problem formulation. Because graduate teaching assistants (GTAs) are responsible for assessing these student responses, it is anticipated that GTAs contribute to students' ability to formulate problems. However, a cursory review of GTAs' assessment of student work indicated that some GTAs struggle to assess students' responses properly. To guide future GTA professional development with MEAs, and problem formulation in particular, this paper explores two questions in more detail: "How are the GTAs assessing students' responses to the MEA individual questions?" and "Does students' ability to answer these questions improve across the three MEAs implemented in a single semester?" Three distinct MEAs were implemented in a required first-year engineering problem-solving course in Fall 2007. Open coding and content analysis of ~500 (out of ~1500) student responses per MEA were performed to establish an expert assessment of the student responses. This expert assessment was then used to evaluate the GTAs' assessments. Results show that the GTAs' assessments of student responses were very weak, making clear that GTA professional development with problem formulation is needed. Recommendations for such professional development are put forth.