Abstract
As design educators, most of us aim to provide clear, helpful and equitable feedback to students as they develop and refine their skills. Assessing the level of achievement in a design submission is somewhat tricky, however. It is an inherently qualitative and comparative undertaking, relying on a set of relative values and drawing on both the assessor’s response and the particularities of the work itself. By contrast, many academic institutions require absolute measures of students’ success, expressed using an agreed range of values or grades. This translation can become an area of some confusion, if not dissension, for students (Ostwald and Williams, 2008). The eRubric is a prototype interactive assessment tool, developed to investigate and to bridge the gap between an informed intuitive response and an absolute measure. The tool was initially conceived and designed by the author while working with groups of tutors from various disciplinary backgrounds to deliver a large-cohort interdisciplinary design subject, where the values inherent in the undertaking soon became apparent (Tregloan and Missingham, 2010). During 2011, the eRubric was used by more than 40 design tutors to assess over 5000 student submissions. Tutors’ experiences and responses were collected via survey and interview, and inform the further development of the tool. Initial findings are presented here. The eRubric continues to be developed with the support of the Faculty of Art Design & Architecture at Monash University and the Faculty of Architecture, Building and Planning at the University of Melbourne. This paper will present the operation of the eRubric tool and findings to date. It will also discuss the development of effective rubric terms for design education, and the opportunities offered by new interface formats to support clear and informed intuitive evaluation of design work.
Keywords
design education, assessment, values, informed intuition, creativity
Citation
Tregloan, K. (2012) eRubric: Absolutely relative or relatively absolute? … striking a balance in the assessment of student design work, in Israsena, P., Tangsantikul, J. and Durling, D. (eds.), Research: Uncertainty Contradiction Value - DRS International Conference 2012, 1-4 July, Bangkok, Thailand. https://dl.designresearchsociety.org/drs-conference-papers/drs2012/researchpapers/139
eRubric: Absolutely relative or relatively absolute? … striking a balance in the assessment of student design work