Do we need a new e-learning unit of competency?

As part of the TAE Training Package review, a new e-learning unit has been drafted. The title of the proposed unit is TAEDEL405 Plan, organise and facilitate e-learning.

PwC’s Skills for Australia is currently seeking our feedback about this draft. The following is my feedback about the unit, and I would be happy to hear your comments before I submit it.

(You may also like to read an article that I have previously posted about another proposed unit with the title of TAEASS404 Conduct e-assessment of competence.)

Types of e-learning

There are different types of e-learning. Often, e-learning is described as being either synchronous or asynchronous. The following table broadly describes two types of e-learning.

The knowledge and skills required for facilitating self-paced e-learning (asynchronous) are different to those required for facilitating group-based e-learning (synchronous). The tasks performed are also different. I think it would be best to cover these two types of e-learning with two different units of competency.

Is there a need for a new e-learning unit of competency?

Synchronous and asynchronous e-learning are different tasks, and each type of e-learning requires different knowledge and skills. I recommend having two units instead of one: one covering synchronous e-learning, and another covering asynchronous e-learning. Also, delivering synchronous e-learning to an individual is different to delivering synchronous e-learning to a group. Delivery to a group requires a higher level of skill than delivery to an individual.

The following is my recommendation. The unit codes and titles should be read as indicative for the purpose of this example.

Note: Learning to use web conferencing technology is relatively quick and easy.

TAEDEL405 Plan, organise and facilitate e-learning (draft)

Is the TAEDEL501 Facilitate e-learning unit of competency being downgraded to become the TAEDEL405 Plan, organise and facilitate e-learning unit of competency?

The TAEDEL405 Plan, organise and facilitate e-learning unit of competency seems to be a rewrite of the current TAEDEL501 Facilitate e-learning unit of competency. Therefore, it appears that the TAEDEL501 unit is being downgraded to become the TAEDEL405 unit. However, this is my assumption. Downgrading or lowering the skill level of the TAEDEL501 unit seems unnecessary and a waste of time and effort.

Feedback about the elements and performance criteria

The following table provides a summary of how the draft TAEDEL405 unit differs from the current TAEDEL501 unit.

Note: The changes to the TAEDEL501 unit are trivial or superficial.

Feedback about the foundation skills

A candidate will be required to demonstrate the ability to complete the tasks outlined in the elements, performance criteria and foundation skills (as per the Performance Evidence statement). This makes the foundation skills assessable.

Analysis and comprehensive feedback about the foundation skills would require much more time and effort. However, I believe that a critical analysis of the foundation skills should be done prior to implementation.

Feedback about the performance evidence

The following is the performance evidence.

I have many questions about the performance evidence:

  • What is one complete program of learning? What is a session? And what program of learning has only three sessions?
  • What is a synchronous e-learning session? Is it a training session delivered via a web conferencing platform?
  • Is a group of four students large enough? How many students are normally in a group attending a synchronous e-learning session? Who decides on the size of the group? From my experience, at least eight students would be a more realistic number.
  • Why are learners being referred to as students? Will this prohibit the unit from being used for community-based training providers, enterprise-based training providers and non-VET training providers who do not have ‘students’?
  • What is an asynchronous e-learning session? The concept of sessions for asynchronous e-learning seems odd.

Currently, the TAELLN411 unit adequately covers identifying and addressing LLN (and D) needs. This does not have to be integrated into another unit of competency.

I think the scope of the draft TAEDEL405 Plan, organise and facilitate e-learning unit of competency should be split over two or more units. For example, one unit covering the delivery of synchronous learning, and another unit covering the delivery of asynchronous learning. One unit of competency to address the range of e-learning types is too ambitious.

Feedback about the knowledge evidence

The following is the knowledge evidence.

Analysis and comprehensive feedback about the knowledge evidence would require much more time and effort. However, I believe that a critical analysis of the knowledge evidence should be done prior to implementation.

The breadth and depth of knowledge required for this unit seems to be at a higher AQF level. I would also like to know: what is the source document for the code of conduct related to e-learning?

Feedback about the assessment conditions

The following are the assessment conditions.

I am unsure what is meant by an ‘e-portfolio to reflect progress and collect evidence’. I hope the term ‘e-portfolio’ is removed.

In conclusion

Is there a need for a new e-learning unit of competency?

The draft TAEDEL405 Plan, organise and facilitate e-learning unit of competency duplicates the current TAEDEL501 Facilitate e-learning unit of competency. Therefore, there is no need for this new e-learning unit. The draft TAEDEL405 Plan, organise and facilitate e-learning unit of competency should be scrapped.

Instead of the proposed one new e-learning unit of competency, I would suggest looking at several new e-learning units of competency. For example:

  • Plan, organise and deliver training to an individual using web conferencing technology
  • Plan, organise and deliver training to a group using web conferencing technology
  • Plan, organise and facilitate online and self-paced learning for an individual
  • Plan, organise and facilitate online and self-paced learning for a group

If the TAEDEL405 unit is implemented, will the TAEDEL501 unit be removed from the TAE Training Package? Is the aim to downgrade the skill level of the current TAEDEL501 unit? (So many unanswered questions.)

Please let me know what you think.

Do we need a new e-assessment unit of competency?

As part of the TAE Training Package review, a new e-assessment unit has been drafted. The title of the proposed unit is TAEASS404 Conduct e-assessment of competence.

PwC’s Skills for Australia is currently seeking our feedback about this draft. The following is my feedback about the unit, and I would be happy to hear your comments before I submit it.

(You may also like to read an article that I have previously posted about another proposed unit with the title of TAEDEL405 Plan, organise and facilitate e-learning.)

A definition of e-assessment

E-assessment may mean different things to different people. I have categorised two different types of e-assessment:

  • Assessment conducted by technology
  • Assessment supported by technology

Assessment conducted by technology

Assessment conducted by technology may also be known as computer-based assessment (CBA) or e-marking. This type of e-assessment is assessment that is both delivered and marked by a computer. The computer may also provide feedback automatically to students without the involvement of an assessor (or any other person).

Benefits of conducting assessment using technology

Benefits of using technology to conduct assessments include:

  • consistency of assessment decisions
  • support for the record keeping of evidence and results
  • speed of completing assessments and providing feedback
  • reduced workload for assessors.

Limitations of conducting assessment using technology

Assessment conducted by technology can be used for assessing knowledge, but it is not an appropriate method for assessing performance, such as:

  • baking a cake
  • repairing a lawn mower
  • caring for an elderly person
  • unblocking a toilet
  • shearing sheep.

The Australian VET system requires evidence of a person’s ability to perform work tasks and activities. Therefore, this type of e-assessment should only be used to assess knowledge, and in some cases, the breadth and depth of knowledge required cannot be covered by this application of technology.

This method of computer-based assessment requires the design and development of valid test questions with answers (and sometimes responses to be sent as automatic feedback). Not every trainer will have the ability to develop effective test questions. Usually, the writing of test questions would be the responsibility of an instructional designer or resources developer.

What skills are needed to develop computer-based assessment?

An instructional designer or resources developer needs the following skills to develop computer-based assessment:

  • Write questions
  • Write answers
  • Write feedback
  • Upload the questions, answers and feedback to the computer-based assessment instrument.

The value of computer-based assessment is that there is no involvement of an assessor during the assessment process. The computer-based assessment instrument performs the assessment and records the result.

Some trainers and assessors may be involved in the development of questions, answers and feedback for computer-based assessment. However, this would be outside the scope of normal activities for most trainers and assessors.

Assessment supported by technology

It has become common for RTOs to use technology to support the assessment process. Many RTOs have implemented a learning management system (LMS), such as Moodle or Canvas. It is more than 5 years since I worked for an RTO that wasn’t using an LMS for students to submit their assessments.

What does the ‘e’ in e-assessment stand for? Does it stand for ‘electronic’ or ‘everyday’? Assessment supported by technology has become the norm. The distinction between ‘e-assessment’ and ‘assessment’ has dissolved over the past 5 years. It would be rare for the assessment process not to be supported by technology.

The Education IRC believes that ‘e-assessment’ is different to ‘assessment’. How did the Education IRC identify a need to create a ‘new’ e-assessment unit of competency? Most RTOs use technology for students to submit their evidence and to support the assessment process conducted by assessors. A very small number of exceptions to this may include:

  • VET in School programs that use print-based workbooks
  • Creative arts qualifications, such as, music or dance
  • Small RTOs that have a paper-based system.

Is it difficult to learn how to use an LMS?

No, it is not difficult to learn how to use an LMS. So far, I have had to learn three different LMS platforms. The first LMS I learnt was Moodle. I attended a 3-hour session about how to use it to perform my role as an assessor. This was enough to get me using Moodle to conduct assessments. Later, I started to work for an RTO that used Canvas. About 30 minutes was taken during my induction to show me how to log on, navigate, and use this LMS. And later again, another RTO emailed me a 5-page document showing how to log on, navigate, and use Job Ready for conducting assessments.

Learning an LMS is not difficult. The knowledge and skills required to use an LMS are too insignificant to justify the creation of a unit of competency. It is possible that using an LMS or other technologies could be identified as a foundation skill for the TAEASS402 Assess competence unit of competency. However, learning one LMS does not remove the need for a newly qualified assessor to learn a different LMS once they gain employment.

What skills are needed to conduct assessments that are supported by technology?

The skills required by an assessor to conduct assessments that are supported by technology include:

  • Prepare for assessment
  • Brief the candidate
  • Gather evidence
  • Support the candidate
  • Make the assessment decision
  • Record and report the assessment decision

These skills are the same as those covered by the current TAEASS402 Assess competence unit of competency.

TAEASS404 Conduct e-assessment of competence (draft)

Feedback about elements and performance criteria

The following are the performance criteria for the first element.

Answers to questions raised

The Training Package Developer has raised some questions. The following are my answers to the three questions raised.

The following are the performance criteria for elements 2, 3 and 4.

There is very little difference between the draft TAEASS404 Conduct e-assessment of competence unit of competency and the current TAEASS402 Assess competence unit of competency.

What happens if we remove duplicate performance criteria from the TAEASS404 Conduct e-assessment of competence unit of competency? After removing all the performance criteria that duplicates what is already in the TAEASS402 Assess competence unit of competency, we are left with five performance criteria. The following table lists the five performance criteria.

Contrary to the Case for Change, there is very little difference between conducting ‘traditional assessment’ (as described by the TAEASS402 unit) and conducting ‘e-assessment’ (as described by the TAEASS404 unit). The TAEASS404 Conduct e-assessment of competence unit of competency is basically an attempt to contextualise the TAEASS402 Assess competence unit of competency. The minor differences do not warrant the development and implementation of a new e-assessment unit of competency.

Feedback about foundation skills

A candidate will be required to demonstrate the ability to complete the tasks outlined in the elements, performance criteria and foundation skills (as per the Performance Evidence statement). This makes the foundation skills assessable.

The following are the foundation skills.

Analysis and comprehensive feedback about the foundation skills would require much more time and effort. However, some of the foundation skills are poorly described or do not describe a skill at all. For example:

Feedback about performance evidence

The following is the performance evidence.

The performance evidence is poorly written. I am an experienced assessor, and for many years I have been conducting e-assessment. Yet I find the performance evidence complicated, confusing, and difficult to comprehend. For example:

  • How is an e-assessment conducted for a group of students? When does this occur?
  • What is the size of the group of students? How is an individual’s competence assessed in a group?
  • What is the definition for an assessment task? Assessing one assessment task is far short of assessing competence.
  • Will sending an email be appropriate for communicating with candidates electronically?
  • What does ‘complete feedback for students in relation to their e-assessment completion’ mean?

Why is there a distinction being made between an ‘e-assessment tool’ and an ‘assessment tool’? Why are we using or alternating between the words ‘student’ and ‘candidate’? The performance evidence is poorly written and needs to be significantly reworked.

Feedback about knowledge evidence

Analysis and comprehensive feedback about the knowledge evidence would require much more time and effort. However, I believe that a critical analysis of the knowledge evidence should be done prior to implementation.

Feedback about assessment conditions

The following are the assessment conditions.

Why is the term ‘digital assessment tool’ being used? And why is the term ‘e-portfolio’ being used?

I think the entire Assessment Requirements need to be thoroughly reworked. But I really think that the draft TAEASS404 Conduct e-assessment of competence unit of competency should be scrapped.

In conclusion

Is there a need for a new e-assessment unit of competency? Can the current TAEASS402 Assess competence unit of competency adequately cover the requirements of conducting e-assessment?

Most assessors are using an LMS or other technology when they conduct assessments. Using an LMS or other technology has become fundamental or foundational to conducting assessments. The distinction between conducting ‘e-assessment’ or conducting ‘non-e-assessment’ is blurred. And the knowledge and skills required by assessors to conduct ‘e-assessment’ or ‘non-e-assessment’ are the same.

The current TAEASS402 Assess competence unit of competency adequately covers the requirements of conducting assessments that are supported by technology (e-assessment). At best, the draft TAEASS404 Conduct e-assessment of competence unit of competency is an attempt to contextualise the TAEASS402 Assess competence unit of competency. The minor differences do not warrant the development and implementation of a new e-assessment unit of competency.

We do not need a new e-assessment unit of competency.

Please let me know what you think.

Nine golden rules when developing an assessment tool

Here is a list of nine golden rules when designing and developing a competency-based assessment tool within the Australian VET system:

  1. Comply with principles of assessment and rules of evidence
  2. Select an appropriate number of assessment tasks
  3. Gather all specified knowledge evidence
  4. Gather the specified volume or frequency of performance evidence
  5. Gather all specified performance evidence, and this must include evidence for each performance criterion
  6. Keep the assessment task to gather knowledge evidence separate from assessment tasks that gather performance evidence
  7. Do not ask ‘how-to’ questions
  8. Comply with the specified assessment conditions
  9. Trial and review the assessment tool before implementing it.

Ensure that the assessment tool complies with the requirements specified by the Standards for RTOs, in particular:

  • Principles of assessment
  • Rules of evidence.

ASQA has published useful information about the principles of assessment and the rules of evidence, and how to comply.

The assessment tool will always have at least two assessment tasks: one assessment task to gather the knowledge evidence, and another to gather the performance evidence.

The volume or frequency of performance evidence may be used to determine the number of assessment tasks required to gather the required performance evidence. See ‘Golden rule number 4’.

All specified knowledge evidence must be gathered.

A simple assessment strategy is to write at least one question for each item of knowledge evidence specified. Consider each bullet point listed under the heading of Knowledge Evidence as an item of knowledge evidence to be gathered.

Sometimes a bullet point may have sub-bullet points. Each sub-bullet point may require its own question, or it might be possible to use one question to gather evidence that would cover all the sub-bullet points.

The following is a simple assessment strategy that can keep the gathering of performance evidence as simple as possible for assessors to implement and for candidates to understand.

Use a number of assessment tasks equal to the specified volume or frequency of performance evidence. For example, if the performance evidence specifies that the task must be performed on three occasions, then plan for three assessment tasks to gather the specified quantity of evidence. In this example, the same assessment instrument may be able to be used for each of the three occasions (but this may not always be appropriate).

There must be evidence gathered for all specified performance evidence, and this includes gathering evidence for each performance criterion. Some Training Packages specify that evidence for foundation skills is also to be gathered. In those cases, the foundation skills become assessable items.

Avoid integrating the gathering of knowledge evidence while gathering performance evidence because this will quickly complicate the assessment task. It can also interrupt the flow of performing a task if the assessor stops the candidate mid-task and starts asking them questions about what they are doing.

The following diagram shows the common assessment methods (and in brackets the evidence to be gathered by the assessment instruments).

Keep the assessment task to gather knowledge evidence separate from the assessment task or tasks used to gather performance evidence.

Note: It is a good idea to gather the knowledge evidence before gathering the performance evidence. If a candidate has insufficient knowledge, they are likely to have difficulties performing the work tasks or activities. It may be best to delay the gathering of performance evidence until the candidate has gained sufficient knowledge.

Do not ask ‘how-to’ questions to gather performance evidence. Performance evidence will require a candidate to perform the task or tasks.

Comply with the specified assessment conditions. This may include location, facilities, equipment or resources required for assessment.

Always trial and review the assessment tool before implementing it.

The following are some check points when trialling and reviewing the assessment tool:

  • Instructions to assessor are clear and written in plain English
  • Instructions to candidates are clear and written in plain English
  • If applicable, instructions to third parties are clear and written in plain English
  • Headings, sub-headings, page layout and formatting, page breaks, and white space have been used to make assessment documents easy to read and navigate
  • Assessment instruments can be used for collecting evidence and making judgements, including space provided for results, comments and feedback
  • Assessment decision-making criteria are provided, for example:
    • Sample answers for knowledge questions
    • Criteria for determining the standard of performance
  • Assessment documents are free of typos and grammatical errors
  • Assessment task titles and numbering are consistent across all documents.

The assessment tool should never be implemented before it has been trialled or piloted. This is when we find out if the assessment tool works, or not.

Some rules, like ‘Golden rule number 1’, should never be broken.

And sometimes you may need to break some of the rules.

The guiding principle should always be about making the assessment tool as simple as possible for assessors to implement and for candidates to understand.

Do you need help with your TAE studies?

Are you doing the TAE40122 Certificate IV in Training and Assessment, and are you struggling with your studies? Do you want help with your TAE studies?

Do you want more information? Ring Alan Maguire on 0493 065 396 to discuss.

Contact now!


Training trainers since 1986

VET Qualification Reform – smashing the current and complicating the future

Humpty Dumpty sat on a wall
Then along came some VET Reforms
All the VET practitioners tried really hard
But couldn’t put Humpty together again.

Humpty Dumpty is a metaphor for the specifications currently found in Training Packages. I am frustrated with politicians, bureaucrats, VET lobbyists, VET experts, and other ignorant or naïve people telling me that they are simplifying the VET system when they are actually doing the reverse. They either think I am a fool or they are deluded.

Training Packages

Many people learning about the Australian VET system for the first time are confronted with a vast array of terminology and acronyms. It has been known for a long time that the term ‘Training Package’ is a misnomer. Training Packages do not specify training, nor do they include training materials. A simple solution would be to rename it. But the proposed VET Qualifications Reform wants to smash it to bits.

Twenty-nine years after the current Australian VET system was implemented, there are still many people working in and around VET who do not understand what is meant by competency-based training and competency-based assessment. Some people don’t understand that qualifications have an occupational outcome that has been determined by industry. Also, there are some people who don’t understand units of competency.

Some people complain about how vague or ambiguous units of competency are. They want more information provided (making documents bigger). Other people complain that the Training Package documents are too big and should be reduced in size. We should acknowledge that Training Package developers have to find a balance between documenting enough detail and documenting too much.

We wouldn’t have a problem if politicians, bureaucrats, VET lobbyists, VET experts, and some VET practitioners learnt to use our current VET system. It has many good features. It isn’t that bad. But we are going to make massive and unnecessary changes.

VET Qualification Reforms

The framework for a future VET Qualifications architecture has been published by the Australian Government. It compares a boring ‘charcoal grey’ framework with a ‘colourful’ future framework. Of course we want to move to the vibrant future state.

Reference: https://www.skillsreform.gov.au/images/documents/VET_Qualification_Reform_Explanation_Notes.pdf accessed 3 September 2021

In the following diagram, I have compared the current and future states side by side, in an attempt to compare the two frameworks.

And the next diagram shows the connections between the current and future states.

It seems that we will be creating new documents, and the new document titles are likely to still confuse people. Also, it seems that we will rearrange the existing information (maybe I should use the ‘rearranging the deck chairs on the Titanic’ metaphor). I think we are adding complexity.

I could further analyse the proposed VET Qualifications Reforms, but that would be a waste of my time. Changes are going to happen anyway. So here is an illustration showing my concerns.

I may be wrong. I would like to see a complete sample of the entire set of documents that are planned to replace what we currently have. Then I will know whether I am right or wrong. I hope we get to see a sample before government ministers and their bureaucrats make the decision to implement chaos.

I can imagine the massive confusion, massive frustration, massive non-compliance, and massive costs associated with implementing the VET Reforms. Unfortunately, I cannot imagine that the future will be better than what we currently have. My greatest concern with the VET Reforms (now known as Skills Reform) is the potential of damaging, if not destroying, the entire Australian VET system.

How to conduct assessment validation (Part 1)

Introduction to assessment validation

Validation is defined as the quality review of the assessment process. It involves checking that the assessment tool produces valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the training package or VET accredited courses are met. It includes reviewing a statistically valid sample of the assessments and making recommendations for future improvements to the assessment tool, process and/or outcomes and acting upon such recommendations. [1]

Assessment validation has two distinct parts:

  • Part 1. Check the assessment tool for compliance
  • Part 2. Review a sample of the assessments.

This article covers the first part only.

If you want to know more about the second part, then I recommend reading the information published by ASQA about how to conduct assessment validation. This information covers: [2]

  • Who conducts validation?
  • Scheduling validation
  • Statistically valid sampling and randomly selecting samples to be validated
  • Effective validation
    • Reviewing assessment practice
    • Reviewing assessment judgements
  • Validation outcomes and the implementation of recommendations for improvement.

Part 1. Check the assessment tool for compliance

The assessment tool must be checked to ensure it complies with the requirements specified by the Standards for RTOs, in particular: [3]

  • Compliance with the principles of assessment and the rules of evidence
  • Compliance with the requirements specified by the training package or VET accredited course.

The following 6-step process can be used to check the assessment tool for compliance:

  • Step 1. Read the assessment requirements
  • Step 2. Review the assessment plan
  • Step 3. Review the assessment matrix (mapping)
  • Step 4. Check the details about how the knowledge evidence is planned to be gathered
  • Step 5. Check the details about how the performance evidence is planned to be gathered
  • Step 6. Check the overall quality of the assessment tool.

Step 1. Read the assessment requirements

This is a quick step to perform. You will read and re-read the unit of competency and its assessment requirements many times during the assessment validation process. During this first step, have a quick read of the assessment requirements and answer the following questions:

  1. What is the volume or frequency of performance evidence?
  2. Are the location, facilities, equipment, or other assessment conditions specified?

Step 2. Review the assessment plan

This step should also be quick. The purpose of this step is to get an overview of the planned assessment approach. During this second step, answer the following questions:

  1. Has the correct unit code and title been used?
  2. How many assessment tasks are planned?
  3. Is there a plan to gather the knowledge evidence?
  4. Do there appear to be sufficient assessment tasks for gathering the volume or frequency of performance evidence?
  5. Does the planned assessment approach seem to be simple or complex?

Note: This planned assessment approach may be found in the Training and Assessment Strategy (TAS) or other documents covering how the RTO plans to implement the delivery of the training and assessment for a unit or cluster of units.

Step 3. Review the assessment matrix (mapping)

This step should be a relatively quick step. The assessment matrix is an important document used to display how the RTO plans to gather evidence that complies with the requirements specified by the training package or VET accredited course. The assessment matrix will be used during Step 4 and Step 5 to cross-check the RTO’s planned assessment approach and the assessment instruments being used to gather evidence.

During this third step, answer the following questions:

  1. Has the correct unit code and title been used?
  2. Has the entire unit of competency and its assessment requirements been copied into the matrix? Is the number of items the same? For example, if the unit has five elements, does the matrix have five elements? Also scan the wording to ensure the matrix uses the exact wording of the unit of competency and its assessment requirements.
  3. Is there one column for each planned assessment task?
  4. Are the titles or descriptions of the assessment tasks the same in the assessment plan and assessment matrix?
  5. Is every item from the unit of competency and its assessment requirements planned to be assessed? For example, is there at least one ‘tick’ in every row?

Note: Some assessment matrices will provide information or a numerical indicator about the assessment item instead of using a ‘tick’. For example, the matrix may indicate that a piece of knowledge evidence will be gathered by Question 1.

Step 4. Check the details about how the knowledge evidence is planned to be gathered

This step requires attention to detail. The purpose is to ensure that the assessment tool will gather the required knowledge evidence. During this fourth step, answer the following questions:

  1. Is there an assessment instrument for gathering the knowledge evidence?
  2. Are the instructions to the assessor clear and concise?
  3. Are the instructions to the candidate clear and concise?
  4. Is the structure, format, and layout of the assessment instrument easy to follow? This includes headings, sub-headings, page numbers, and numbering of questions.
  5. Is there consistency between the assessment plan, assessment matrix and assessment instrument? For example, if the assessment plan states that there are 17 questions, does the assessment instrument have 17 questions?
  6. Is every item of knowledge evidence being adequately gathered? A judgement about ‘adequately’ will need to be made.

Step 5. Check the details about how the performance evidence is planned to be gathered

This step requires attention to detail, and it can take time to examine the assessment documents for compliance. The purpose is to ensure that the assessment tool will gather the required performance evidence. During this fifth step, answer the following questions:

  1. Is there one or more assessment instruments for gathering the performance evidence?
  2. Are the assessment conditions compliant with those stated in the Assessment Requirements for the unit of competency? This may include assessment location, facilities, equipment, and access to specified documents. For example, if the assessment conditions state that the assessment occurs in the workplace, then the assessment tasks must state that the evidence must be gathered from a workplace (not from a simulated workplace).
  3. Are the instructions to the assessor clear and concise?
  4. Are the instructions to the candidate clear and concise?
  5. Are the items of performance evidence clearly listed or identified?
  6. Is the structure, format, and layout of the assessment instrument or instruments easy to follow? This includes headings, sub-headings, and page numbers.
  7. Is there consistency between the assessment plan, assessment matrix and assessment instrument? For example, if the assessment matrix states that evidence for Performance Criteria 1.1 will be gathered during Assessment Task 2, then Assessment Task 2 must cover the gathering of evidence for Performance Criteria 1.1.
  8. Is every item of performance evidence being adequately gathered? A judgement about ‘adequately’ will need to be made. This includes a check that the amount of evidence being gathered is compliant with the specified volume or frequency of performance evidence.

Note: Verbs are important. For example, if a performance criterion says, ‘negotiate and agree with a supervisor’, then there needs to be evidence that the candidate has negotiated and agreed with a supervisor. Also, the letter ‘s’ is important. An item of performance evidence may specify plural rather than singular. For example, if it states ‘write reports’, then more than one written report is required as evidence.

Step 6. Check the overall quality of the assessment tool

This step can take time to examine the assessment tool for compliance, readability, and usability.

  1. Are there sample answers and assessment decision criteria for assessors?
  2. Is the structure, format, and layout of all assessment documents easy to follow?
  3. Are all instructions written clearly and concisely?
  4. Are there any grammatical, spelling or typographical errors?
  5. Is there a list of all the assessment documents required for the assessor?
  6. Does the assessment tool have all the documents required for the assessor?
  7. Is there a list of all the assessment documents required for the candidate?
  8. Does the assessment tool have all the documents required for the candidate?
  9. Has the correct unit code and title been used throughout all the assessment documents? This may include release number.
  10. Do all the assessment documents have version control information?

In conclusion

Assessment validation has two distinct parts:

  • Part 1. Check the assessment tool for compliance
  • Part 2. Review a sample of the assessments

Assessment validation can be time-consuming and mind-bending.

Preparation before an assessment validation meeting can reduce the time at the assessment validation meeting. However, you can expect a typical assessment validation meeting to require anywhere between a few hours and an entire day. The duration of the assessment validation meeting can depend on the quality of the assessment tool and number of assessment samples to be reviewed. I regularly see poor quality assessment tools, and it takes time to properly check large numbers of assessment samples.

Clear and critical thinking is required by people participating in an assessment validation meeting. There are usually many documents to be reviewed and checked. Printing paper copies of documents (or some documents) and using ‘split screens’ on computers will help when comparing information from two or more documents, such as:

  • unit of competency
  • assessment requirements
  • assessment plan
  • assessment matrix
  • assessment instructions
  • assessment instruments.

Frustration and fatigue can be experienced during long assessment validation meetings. Breaks will be needed (and sometimes chocolate helps). It is a good idea to assign an experienced VET practitioner to lead the assessment validation meeting.

References

[1] https://www.asqa.gov.au/standards/appendices/glossary accessed 2 September 2021

[2] https://www.asqa.gov.au/resources/fact-sheets/conducting-validation accessed 2 September 2021

[3] https://www.asqa.gov.au/standards/training-assessment/clauses-1.8-to-1.12 accessed 2 September 2021

Do you need help with your TAE studies?

Are you doing the TAE40122 Certificate IV in Training and Assessment, and are you struggling with your studies? Do you want help with your TAE studies?

Ring Alan Maguire on 0493 065 396 to discuss.

Contact now!


Training trainers since 1986