
As part of the TAE Training Package review, two new e-units have been drafted. PwC’s Skills for Australia is currently seeking feedback on these two draft units of competency:
- TAEASS404 Conduct e-assessment of competence (draft)
- TAEDEL405 Plan, organise and facilitate e-learning (draft)
I have published two previous articles, ‘Do we need a new e-assessment unit of competency?’ and ‘Do we need a new e-learning unit of competency?’, that present a case against further developing and implementing these two units.
In this article, I provide background to the development of these units of competency and highlight the flawed process and missed opportunities.
Background to the E-assessment project
Case for Change
The Education IRC developed a Case for Change that covers:
- Holistic review of the TAE Training Package: a review of six qualifications and 55 units of competency.
- E-assessment project: an urgent response to address an identified gap in e-assessment in the TAE Training Package involving the development of two new units of competency.
The Case for Change was approved at the AISC meeting on 19 August 2021. [1]
E-assessment project
The following is an extract from the Case for Change. [2]

The AISC approval was for the creation of two new e-assessment units of competency: one unit to cover ‘designing e-assessment’ and the other to cover ‘facilitating e-assessment’ (as per the Case for Change, see below).

The draft units of competency were released on 22 October 2021. Instead of two e-assessment units of competency, there was one e-learning unit and one e-assessment unit. This is a significant alteration to the project scope that had been approved by AISC. I have not seen any communication acknowledging or explaining this change of scope. This raises serious concerns about the decision-making and integrity of the Education IRC.
- Why was the scope changed?
- Has AISC approved the changed scope?
- Why hasn’t the changed scope been communicated and justified?
- Are members of the Education IRC aware of the changed scope? If so, why haven’t they ensured that the change was communicated?
The Case for Change identified an ‘e-assessment’ skills gap. AISC approved the creation of two new e-assessment units of competency. The Case for Change did not propose the creation of an e-learning unit of competency.
The Education IRC had considered VET sector stakeholder feedback and decided that two new e-assessment units of competency were needed. This was presented to the AISC, and subsequently approved.
- Was the original e-assessment skills gap false? If so, the Education IRC’s ability to analyse VET sector stakeholder feedback and make decisions is faulty.
- Was the original solution to create two e-assessment units of competency wrong? If so, the Education IRC’s ability to develop solutions is questionable.
- Did the Education IRC feel compelled to create two new units of competency? That is, did the Education IRC think it was better to create two units of competency than to admit it got it wrong and develop only one e-assessment unit of competency?
The Case for Change does not justify the creation of an e-learning unit of competency, and a ‘new’ e-learning unit of competency is not required. The unnecessary creation of an e-learning unit of competency has been discussed in my previous article titled ‘Do we need a new e-learning unit of competency?’.
The Case for Change states that e-assessment requires a significantly different set of skills and knowledge to traditional assessment practices.
- What are ‘traditional’ assessment practices?
- Are ‘traditional’ and ‘contemporary’ assessment practices the same thing?
- Are the skills and knowledge to conduct e-assessment ‘significantly’ or ‘slightly’ different to those needed to conduct ‘traditional’ assessment?
These questions have been explored in my previous article titled ‘Do we need a new e-assessment unit of competency?’.
Required e-assessment skills
The following is an extract from the Case for Change. [2]

Five issues have been listed by the Education IRC. These issues, or poor assessment practices, may be caused by assessors lacking the required knowledge and skills. However, they may also be caused by RTO management lacking knowledge or skills, inadequate allocation of resources for conducting quality assessment, or improper implementation of an RTO’s policies, procedures or systems. Assessors may be falsely identified as the cause of the issues, and it is then assumed that developing assessors’ knowledge and skills will resolve them. It is a waste of time and money to train assessors if the cause of the issues is unrelated to assessors or their knowledge and skills.
The following is an analysis of the five issues.

None of the five issues seems to justify the creation of new e-assessment units of competency, especially ones aimed at developing the knowledge and skills of assessors.
The Case for Change was not made publicly available until after it had been approved by AISC. This meant there was no prior opportunity to analyse and comment on the proposal or the justification for what was being proposed. The following is an extract from the Case for Change. [2]

‘Strong evidence of industry support for the Case for Change’ is not the same thing as strong evidence for change. The following peak bodies are listed as supporting, or agreeing with, the Case for Change:
- TAFE Directors Australia
- Enterprise RTO Association
- Australian Industry Group
- Independent Tertiary Education Council Australia
- Australasian Curriculum and Certification Authorities
- Regulators and state and territory training authorities (STAs).
Sadly, it seems that no one with current or detailed knowledge of the ‘contemporary assessment environment’ was consulted. The Education IRC’s justification for creating two new e-assessment units of competency is flawed. It is regrettable that no one consulted from the peak bodies, regulators, state and territory training authorities, or AISC knew enough to disagree with the false justifications and the proposed solution. The justifications for the ‘E-assessment project’ have already been shown to be false, because the project scope was altered within two months of the project being announced. This is evidence that the feedback from VET sector stakeholders, the proposal by the Education IRC, and the approval by AISC were all faulty.
Recent announcements [3] to scrap the current training package development and endorsement process will mean that those currently involved will have no accountability in the future: they will not have to face the consequences of getting things wrong or wasting government funding. I think it would be desirable to pause the current review of the TAE Training Package, including the ‘E-assessment project’, until the new Industry Cluster has been established.
Scope of the E-assessment project has changed
It is probably too late to stop the progress of the ‘E-assessment project’ that has morphed into the ‘E-learning and E-assessment project’. It should be noted that the Case for Change did not propose the creation of an e-learning unit of competency. This would indicate that the training product consultation and development process is flawed.
The scope of the ‘E-assessment project’ has been altered since it was approved by AISC. The following table shows that the proposed ‘Design e-assessment’ unit of competency has been dropped and that a ‘Plan, organise and facilitate e-learning’ unit has been created.

A ‘Design e-assessment’ unit of competency was proposed by the Case for Change but has not been pursued. I am unaware of any acknowledgement or communication about why it has silently been removed from the ‘E-assessment project’ scope. Instead, a new unit that was not proposed by the Case for Change has been created: a ‘Plan, organise and facilitate e-learning’ unit of competency has been drafted. Again, I am unaware of any acknowledgement or communication about why it has silently been added to the ‘E-assessment project’ scope. No evidence supporting the creation of a new e-learning unit of competency has been presented; the need was not identified by VET sector stakeholders and was not presented to AISC for approval.
The work roles associated with e-assessment
The following diagram illustrates five work roles that perform tasks associated with e-assessment.

Some tasks performed by a trainer and assessor may overlap with those performed by a resources designer and developer. For example, some trainers and assessors may be required to develop simple learning or assessment resources.
Some tasks performed by a resources designer and developer may overlap with those performed by an LMS administrator. For example, some resource developers may upload resources to the LMS.
Some tasks performed by an LMS administrator may overlap with those performed by a training administrator. For example, some LMS administrators may set up courses and classes.
Some tasks performed by a training administrator may overlap with those performed by a training manager or coordinator. For example, some training administrators may have the ability to assign levels of LMS authority to other people.
The above diagram deliberately has a gap between:
- Trainer and assessor / LMS administrator
- LMS administrator / Training manager or coordinator
- Resources designer and developer / Training administrator.
Crossover between these roles is rare.
The role of LMS administrator
It is important to recognise that LMS administrators are specialists who perform tasks not usually performed by trainers and assessors. The LMS administrator is the most critical role in making e-assessment work, and the knowledge and skills of the LMS administrator are far beyond what is required of users of the LMS, such as trainers and assessors. The Education IRC may wish to consider developing a qualification or skill set for the work role of LMS administrator. Some trainers and assessors may like this as a career pathway, but many would have no desire to immerse themselves deeply in the world of LMS technology.
Users of technology
Why was a new e-assessment unit required for trainers and assessors? Was it because they cannot conduct assessments, or because they cannot use the technology? Using an LMS or other technology that supports the assessment process can be treated in the same way as using Microsoft Word, Microsoft PowerPoint, Google Docs or any other application a trainer or assessor may use. Using technology to support assessment activities is a foundation skill.
The current TAEASS402 Assess competence unit of competency addresses the knowledge and skills required by trainers and assessors to conduct assessments. Therefore, the ‘real’ need is to develop the capability to use Moodle, Canvas, Job Ready, another LMS, or other technology that supports an RTO’s assessment processes. Addressing every possible technology is onerous, wasteful and impractical.
A dilemma for the VET system is whether a person should learn how to:
- use all technologies, although the VET graduate may never be required to use them all
- use one technology, although the VET graduate may have to learn a different technology when they get a job.
The VET system recognises the need for further learning after a person gains a qualification and gets a job. In regard to performing the role of trainer or assessor, most technology used to support assessment processes is relatively quick and easy to learn.
There is no need to create a ‘new’ e-assessment unit of competency for the work role of assessor. The solution could be to add a foundation skill or slightly modify the Performance Evidence requirements of the existing TAEASS402 Assess competence unit of competency.
Thinking outside the box
The Education IRC has justified the creation of new e-assessment units of competency on the basis of feedback from VET sector stakeholders saying there were skills gaps, and it has listed the issues it believes create those gaps. It is a shame that the Education IRC did not explore solutions to develop e-capability in work roles other than trainer and assessor, because the problems associated with the identified issues are unlikely to be solved by developing the skills of trainers and assessors.
A better solution would be to improve the way RTOs deliver the TAEASS402 Assess competence unit of competency. Improving the delivery of the TAEASS402 unit, and more generally the TAE40116 Certificate IV in Training and Assessment qualification (or whatever replaces it), would have a greater impact on improving the conduct of ‘e-assessment’ and the delivery of ‘e-learning’.
The Education IRC has focused on the work role of trainer and assessor. This was the wrong focus: developing the capability of work roles other than trainer and assessor would have a greater impact on improving the quality of ‘e-assessment’ and ‘e-learning’. The Education IRC seems fixated on developing the skills of trainers and assessors.
The following diagram shows various work roles that have an impact on the quality of ‘e-assessment’ and ‘e-learning’ services.

In regard to e-assessment, I think the capability development of training managers or coordinators, resource designers and developers, and training administrators should occur before the skills development of trainers and assessors.
The employment or development of talented LMS administrators is vital for implementing and maintaining quality ‘e-assessment’ and ‘e-learning’ services. The Education IRC does not seem able to think outside the box; its fixation on the work role of trainers and assessors is wrong. The Education IRC, or the VET sector stakeholders that provided feedback, lack insight into and understanding of ‘e-assessment’ and ‘e-learning’. Not every problem or issue is going to be rectified by focusing on the skills development of trainers and assessors. My past research into the quality of assessment identified the need to focus on the training manager rather than the assessor. And training is not always the solution.
The following table outlines the existing units of competency for developing e-capability and identifies the current gaps in the TAE Training Package.

The Education IRC should investigate the development of TAE qualifications or skill sets for:
- LMS administrator
- Training administrator
- Training manager or coordinator.
If a TAE qualification or skill set is developed for these work roles, it should include the skills and knowledge required to perform work tasks associated with ‘e-assessment’ and ‘e-learning’.
In conclusion
Some people may think my comments are a bit harsh. However, my comments are a critical analysis of the facts (as per the Case for Change). We have been told that feedback from industry identified a need for two e-assessment units to be created, but we have not been given the evidence that supports the development of these units of competency. And instead of two e-assessment units, one e-assessment unit and one e-learning unit are being created.
- How was the need for two e-assessment units identified?
- Who in the VET sector identified the need?
- What exactly was the identified need?
- When was the need identified?
- Why was the scope of the ‘E-assessment project’ changed?
- Who in the VET sector identified the need for creating a new e-learning unit of competency?
The Case for Change had not been made public until after the AISC had approved it. The scope of the ‘E-assessment project’ has been changed without any justification. If the Education IRC and their SSO are unable to properly develop two units of competency after more than 12 months, how can they be trusted to review the entire TAE Training Package within a shorter timeframe?
Please let me know what you think.
References
[1] https://www.skillsforaustralia.com/project-page/education-tae/ accessed 24 October 2021
[2] https://s3-ap-southeast-2.amazonaws.com/pwcau.prod.s4aprod.assets/wp-content/uploads/20210903164342/Final_TAE_2021-Case-for-Change_v2.5_Public.pdf accessed 9 November 2021
[3] https://www.dese.gov.au/skills-reform/resources/ministerial-statement-27-october-2021 accessed 7 November 2021
A transparent, reasoned, straightforward review of the history and status of the situation as of today.
I believe there is nothing to be gained by pursuing the concept, as the elephant in the room is how the assessor can be 100% certain that the assessment is the actual student’s work.
Until that is an iron-clad given, an online assessment will always be a flawed concept.
Derek Bailey