Nine golden rules when developing an assessment tool

Here is a list of nine golden rules when designing and developing a competency-based assessment tool within the Australian VET system:

  1. Comply with principles of assessment and rules of evidence
  2. Select an appropriate number of assessment tasks
  3. Gather all specified knowledge evidence
  4. Gather the specified volume or frequency of performance evidence
  5. Gather all specified performance evidence, including evidence for each performance criterion
  6. Keep the assessment task to gather knowledge evidence separate from assessment tasks that gather performance evidence
  7. Do not ask ‘how-to’ questions
  8. Comply with the specified assessment conditions
  9. Trial and review the assessment tool before implementing it.

Ensure that the assessment tool complies with the requirements specified by the Standards for RTOs, in particular:

  • Principles of assessment
  • Rules of evidence.

ASQA has published useful information about the principles of assessment and the rules of evidence, and how to comply.

The assessment tool will always have at least two assessment tasks: one assessment task to gather the knowledge evidence, and another to gather the performance evidence.

The volume or frequency of performance evidence may be used to determine the number of assessment tasks required to gather the required performance evidence. See ‘Golden rule number 4’.

All specified knowledge evidence must be gathered.

A simple assessment strategy is to write at least one question for each item of knowledge evidence specified. Consider each bullet point listed under the heading of Knowledge Evidence as an item of knowledge evidence to be gathered.

Sometimes a bullet point may have sub-bullet points. Each sub-bullet point may require its own question, or it might be possible to use one question to gather evidence that would cover all the sub-bullet points.

The following is a simple assessment strategy that can keep the gathering of performance evidence as simple as possible for assessors to implement and for candidates to understand.

Use a number of assessment tasks equal to the volume or frequency of performance evidence specified. For example, if the performance evidence specifies that the task must be performed on three occasions, then plan for three assessment tasks to gather the specified quantity of evidence. In this example, the same assessment instrument may be able to be used for each of the three occasions (but this may not always be appropriate).

Evidence must be gathered for all specified performance evidence, and this includes gathering evidence for each performance criterion. Some Training Packages specify that evidence of foundation skills is also to be gathered; in that case, the foundation skills become assessable items.

Avoid gathering knowledge evidence while gathering performance evidence, because this will quickly complicate the assessment task. It can also interrupt the flow of performing a task if the assessor stops the candidate mid-task and starts asking them questions about what they are doing.

The following diagram shows the common assessment methods (and in brackets the evidence to be gathered by the assessment instruments).

Keep the assessment task to gather knowledge evidence separate from the assessment task or tasks used to gather performance evidence.

Note: It is a good idea to gather the knowledge evidence before gathering the performance evidence. If a candidate has insufficient knowledge, they are likely to have difficulties performing the work tasks or activities. It may be best to delay the gathering of performance evidence until the candidate has gained sufficient knowledge.

Do not ask ‘how-to’ questions to gather performance evidence. Asking a candidate to describe how they would perform a task gathers knowledge evidence, not performance evidence. Performance evidence requires the candidate to actually perform the task or tasks.

Comply with the specified assessment conditions. This may include location, facilities, equipment or resources required for assessment.

Always trial and review the assessment tool before implementing it.

The following are some checkpoints when trialling and reviewing the assessment tool:

  • Instructions to assessor are clear and written in plain English
  • Instructions to candidates are clear and written in plain English
  • If applicable, instructions to third parties are clear and written in plain English
  • Headings, sub-headings, page layout and formatting, page breaks, and white space have been used to make assessment documents easy to read and navigate
  • Assessment instruments can be used for collecting evidence and making judgements, including space provided for results, comments and feedback
  • Assessment decision-making criteria are provided, for example:
    • Sample answers for knowledge questions
    • Criteria for determining the standard of performance
  • Assessment documents are free of typos and grammatical errors
  • Assessment task titles and numbering are consistent across all documents.

The assessment tool should never be implemented before it has been trialled or piloted. This is how we find out whether the assessment tool works.

Some rules, like ‘Golden rule number 1’ should never be broken.

And sometimes you may need to break some of the rules.

The guiding principle should always be about making the assessment tool as simple as possible for assessors to implement and for candidates to understand.

TAE Tutoring service

Are you a TAE40116 Student struggling with your studies?

Some people find the ‘TAEASS502 Design and develop assessment tools’ topic difficult. My TAE Tutoring service can help you with your studies.

Do you want more information? Ring Alan Maguire on 0493 065 396 to discuss.



Training trainers since 1986