Design-focused evaluation

Design-focused evaluation (DFE) is an approach to the evaluation of educational quality.

In DFE, the evaluation focuses specifically on the effectiveness of the constructive alignment in the design of a course (subject, paper, unit). DFE was developed for use in the higher education sector; however, it is also relevant in training environments and may be broadly applicable in any educational setting.[1]

History

DFE was developed by Calvin Smith, an educational researcher in Australia, to address the fact that most evaluation systems focus either on inputs (the quality of the teaching, the quality of the resources, etc.) or on outputs/outcomes (e.g. learning outcomes). Where an approach includes both inputs and outcomes, the items are kept separate (e.g. the Course Experience Questionnaire,[2] which has scales on inputs and a scale on generic skill development as an outcome). At the time, no approach focused on the alignment of the learning activities with the learning objectives.

Uses

DFE is primarily used to gather students' perceptions of the efficacy of the alignment (of teaching and learning activities with learning objectives) in the design of a course or unit of study, or part thereof. The rationale is that aligning learning activities with learning objectives is considered a hallmark of high-quality course design.[3] 'Efficacy' here therefore means 'the degree to which the teaching and learning activities supported the development of the learning outcomes'.

Approach

DFE is a survey-based approach to gathering evaluative data and therefore generates quantitative data. The approach relies heavily on how the survey items are written, and there is a clear articulation of how items can be written for the DFE approach.[4]

In short, each item is composed of two parts conjoined by a grammatical structure that does the work of '...helped me to learn...'. The first part of the item indexes the teaching and learning activity (e.g. 'the lab session on dissection...') and the second part indexes the learning objective ('...how to dissect the human torso.').
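
The following is a minimal sketch of this two-part composition, assuming a simple string template in Python; the function name, signature and default connective are illustrative assumptions rather than part of Smith's published method.

    # Illustrative sketch: compose a DFE item from its two parts.
    def compose_dfe_item(tla: str, learning_objective: str,
                         connective: str = "helped me to learn") -> str:
        """Join a teaching-and-learning-activity phrase to a learning-objective
        phrase using the connective that does the aligning work."""
        return f"{tla} {connective} {learning_objective}"

    # The dissection example from the text above:
    print(compose_dfe_item("The lab session on dissection",
                           "how to dissect the human torso."))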

One sophistication in this process is that both sides of the item structure can vary in their degree of specificity-generality (or granularity). Compare the following three items, all of which are stable on the right-hand side but vary on the left-hand side:

  • The whole 20-lecture series helped me to learn the rules of logic
  • The three guest lectures helped me to learn the rules of logic
  • The guest lecture by Dr Jeeves helped me to learn the rules of logic

These items become increasingly granular, or more specific, moving from a reference to all the lectures down to a specific lecture session. The same degrees of granularity/specificity can apply on the learning-objective side of the item construction. Consider the following three items:

  • The textbook helped me to learn the rules of logic
  • The textbook helped me to learn argument structures and forms
  • The textbook helped me to learn modus tollens

Again, the degree of specificity increases with each step.
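
As a further illustration, the sketch below (again an assumption for illustration only, not taken from the DFE literature) holds one side of the 'helped me to learn' template fixed while varying the granularity of the other, using the example phrases above.

    # Left-hand side phrases (teaching and learning activities), general to specific.
    tlas = [
        "The whole 20-lecture series",
        "The three guest lectures",
        "The guest lecture by Dr Jeeves",
    ]
    # Right-hand side phrases (learning objectives), general to specific.
    objectives = [
        "the rules of logic",
        "argument structures and forms",
        "modus tollens",
    ]

    # Vary the teaching and learning activity while the objective stays fixed.
    for tla in tlas:
        print(f"{tla} helped me to learn {objectives[0]}.")

    # Vary the learning objective while the activity stays fixed.
    for objective in objectives:
        print(f"The textbook helped me to learn {objective}.")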

This variability means the DFE approach is extremely flexible and allows the course designer to target specific aspects of the course design. It supports a more strategic approach to evaluation design and permits the selective, high-leverage use of the limited 'real estate' in any survey instrument.

A further observation about the scope of each side of a DFE item is that, as well as being more or less generic or specific, it can make an "implicit" reference to the teaching and learning activities (TLAs) or the learning objectives (LOBs), respectively. An example is a reference to 'the course' or to 'learning'. Consider the item 'the course helped me to learn': here, the reference to the course is an implicit reference to all of the teaching and learning activities the student experienced, and the reference to 'learning' implies all of the learning objectives in the course. In one sense, implicit references to either the learning objectives or the teaching and learning activities are the most generic level of reference, but DFE distinguishes between generic (or general) references to either TLAs or LOBs and implicit references.

With these three categories, a cross-tabulation of the combinations of left-hand- and right-hand-side components of DFE items can be devised. This is useful in helping novice DFE survey designers develop items (Table 1).

Table 1 - Nine combinations of specific-generic-implicit

             S     G     I
      S     SS    SG    SI
      G     GS    GG    GI
      I     IS    IG    II

In the table, the first letter of each pair gives the scope of the left-hand side (the teaching and learning activity) and the second letter the scope of the right-hand side (the learning objective); S = Specific, G = Generic, I = Implicit.
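
As a small illustration, the nine combinations in Table 1 can be enumerated programmatically. The sketch below is an assumption for illustration only; as described above, the first letter of each pair is the scope of the left-hand side and the second that of the right-hand side.

    from itertools import product

    SCOPES = {"S": "Specific", "G": "Generic", "I": "Implicit"}

    # Cross the three scope levels on each side of the item, yielding the
    # nine cells of Table 1: SS, SG, SI, GS, GG, GI, IS, IG, II.
    for lhs, rhs in product(SCOPES, repeat=2):
        print(f"{lhs}{rhs}: {SCOPES[lhs]} activity, {SCOPES[rhs]} objective")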

Some examples follow:

  • SS: The demonstration of pivot tables (S) helped me to learn how to set up the pivot for my project (S)
  • II: This course (I) was excellent (I)
  • GG: The lectures (G) helped me to understand the essential concepts of Marxism (G)
  • SI: The demonstration of pivot tables (S) was excellent (I)

References

  1. Patton, Michael Quinn (2011-08-22). Essentials of Utilization-Focused Evaluation. SAGE Publications. ISBN 978-1-4833-0697-1.
  2. Ramsden, P. (1991) A performance indicator of teaching quality in higher education: the Course Experience Questionnaire, Studies in Higher Education 16(2), pp. 129–150.
  3. Biggs, J. B. and Tang, C. (2007). Teaching for quality learning at university. Open University Press/McGraw-Hill Education.
  4. Smith, C. D. (2008). Design Focused Evaluation. Assessment & Evaluation in Higher Education, 33(6), 631–645.
