Workshop - Introduction & Motivation

The Learning & Teaching Academic Standards (LTAS) project, begun in 2009, shifted our approach to assessment. Learning outcomes had previously possessed, to some extent, an aspirational quality: they were the qualities, skills, and knowledge that we, as educators, hoped a graduate would attain by the completion of their degree. The Science LTAS, however, developed fundamental Threshold Learning Outcomes (TLOs) that described the skills, knowledge, and attitudes science bachelor degree graduates will have demonstrated by the completion of their degree.

The advent of the TLOs, and of the Chemistry TLOs (CTLOs) that followed through the efforts of the Chemistry Discipline Network in 2011, meant that pre-existing assessments were reconsidered with a new eye. This re-examination has exposed significant problems, one of which stands out as particularly important: tasks are often claimed to assess certain CTLOs when in fact they only acknowledge them, or do not assign sufficient credit to their demonstration. The problem lies in the idea of the threshold: we cannot say a student has demonstrated a certain CTLO unless, at some point in their degree, failing to demonstrate it would have meant failing a unit of study.

This is best illustrated with an example:

A pair of third year Chemistry students are undertaking a laboratory assessment together. In the task, the students must:

  1. Prepare a set of superhydrophobic surfaces by coating small glass tiles with a series of silicon-based organic polymers.
  2. Record the contact angle of water droplets on each surface.
  3. Analyse the differences in hydrophobicity of the different surfaces.
  4. Evaluate the efficacy of the different coating methods.

At the end, students write individual short reports (2 pages), with marks assigned in equal portion to each of the introduction, methods, results/discussion, conclusions, referencing, and style.

The crux of the issue is that even though this task covers a lot of ground, we can’t say for certain that it actually assesses any of the CTLOs. In fact, there’s a glaring gap: to pass the assessment, a student doesn’t actually need to carry out the experiment. There’s some presumption of group work and collaboration, but it carries no weight in the assessment. More subtly, because marks are spread equally across the report sections, a student could pass this assessment overall while still failing to demonstrate a core requirement.

Clearly, a poorly designed assessment task is a serious issue because it can:

  • Stop a ‘good’ student from demonstrating a high level of capability.
  • Prevent an ‘average’ student from meeting minimum performance requirements.
  • Allow a ‘poor’ student to obtain a passing grade without having specifically met any outcomes.

The threshold requirement sets a high bar! It can be surprisingly difficult to determine whether an assessment is truly fit-for-purpose. But don’t worry, we’ve got you covered: we’ve developed a simple tool that helps academic staff to evaluate their assessment tasks. You can read all about the process that went into developing the tool here.

What you’re working through now is an online version of the project workshop that was run across Australia between 2015 and 2017 to develop the tool. The aim is to help Chemistry academics adjust to this paradigm shift in assessment.

By the end of the workshop, which should take you no more than half an hour, you should:

  1. Have developed an understanding of what is meant by assessment and learning outcomes (FOUNDATION).
  2. Understand the basis for, know how to use, and have access to the task evaluation tool (EDUCATION).
  3. Have plenty of practice using the tool on examples from our set of (real) assessment tasks, and a sense of how your approach to assessment compares to other academics’ (CALIBRATION).

There are three sections to get through—let’s get started!
