VET quality project

March 2013
Report to Industry Skills Councils, the National Skills Standards Council and the Department of Industry, Innovation, Science, Research and Tertiary Education
Allen Consulting Group Pty Ltd
ACN 007 061 930, ABN 52 007 061 930

Melbourne: Level 9, 60 Collins St, Melbourne VIC 3000. Telephone: (61-3) 8650 6000. Facsimile: (61-3) 9654 6363
Sydney: Level 1, 50 Pitt St, Sydney NSW 2000. Telephone: (61-2) 8272 5100. Facsimile: (61-2) 9247 2455
Canberra: Level 1, 15 London Circuit, Canberra ACT 2600. GPO Box 418, Canberra ACT 2601. Telephone: (61-2) 6204 6500. Facsimile: (61-2) 6230 0149
Online: Email: [email protected] Website: www.allenconsult.com.au

Disclaimer: While the Allen Consulting Group endeavours to provide reliable analysis and believes the material it presents is accurate, it will not be liable for any claim by any party acting on such information.
© Allen Consulting Group 2013
Acknowledgments

This project was undertaken by Peter Noonan and Luke Condon of the Allen Consulting Group, with significant contributions from Andrea Bateman (Bateman & Giles Pty Ltd), Associate Professor Shelley Gillis (Victoria University) and Chloe Dyson (Chloe Dyson & Associates). The project team would also like to acknowledge the significant contribution of the Project Steering Committee in guiding the project.
Contents

Executive summary vi
Chapter 1 Introduction 1
1.1 Project rationale 1
1.2 Principles 2
1.3 Delivery and assessment measures 2
1.4 Related developments 2
1.5 Methodology 4
1.6 This report 5
Chapter 2 Drivers for the development of delivery and assessment measures 6
2.1 The Training Package model and ISCs 6
2.2 Extent of change required 9
2.3 Concerns about VET quality 9
2.4 Consistency and confidence in VET outcomes 10
2.5 Ensuring quality while enabling flexibility and innovation 12
Chapter 3 Proposed quality measures 13
3.1 Rationale for proposed measures 13
3.2 Delivery measures 14
3.3 Assessment measures 15
3.4 Relationship between measures 16
3.5 Other measures considered 18
3.6 Use and importance of the measures 19
Chapter 4 Use of the measures within current VET standards 22
4.1 Recommended approach 22
4.2 Options considered 25
4.3 Steps required for the measures to take effect 25
Chapter 5 Industry Skills Council advice regarding quality measures 27
5.1 Measure definitions 27
5.2 Use of measures: profile of qualifications chosen 28
5.3 Use of measures: intended application of measures to range of units 30
5.4 Use of measures: intended method of application 31
5.5 Use of measures 39
5.6 Importance of measures 40
5.7 Implementation 41
Appendix A Detailed measures 44
Appendix B Detailed ISC feedback 52
References 57
Executive summary

Introduction

This project was commissioned by the Industry Skills Councils (ISCs) with support from the National Skills Standards Council (NSSC) and the Department of Industry, Innovation, Science, Research and Tertiary Education (DIISRTE). The project was guided by a Project Steering Committee (PSC) comprising representatives from each of the three commissioning bodies.

The project was established in response to a concern that, while ISCs are the bodies responsible for competency standard development, they currently have minimal ability to ensure that registered training organisations (RTOs) consistently interpret, deliver and assess the requirements of the standards for all units and qualifications.

The project used a combination of evidence based analysis of literature and existing practice, together with detailed consultations via an online questionnaire and face to face group meetings. This ensured a collaborative process, involving extensive opportunities for detailed input from the PSC, all ISCs and the NSSC Secretariat. The project involved the following stages:
· project initiation and scoping;
· circulation of a detailed issues paper and draft measures;
· ISC questionnaire, consultation and testing of draft measures; and
· reporting.

Drivers for the development of delivery and assessment measures [1]

Through the input provided by ISCs directly to the project, combined with the formal consultation process undertaken by the NSSC in reviewing the Standards for the Regulation of Vocational Education and Training (VET), the project has formed a clear understanding of current views of industry and other key stakeholders regarding VET quality. This project is in large part a response to those views.

Specifically, concerns about VET quality have been raised by industry directly and through ISCs. These concerns must be addressed to ensure that industry confidence in the VET system is maintained. It is significant that submissions in response to the NSSC issues paper relating to its current review suggested that reform was needed to both input (delivery) and output (assessment) standards. More broadly, the issue of quality attracted 'the strongest response in the submissions' (NSSC 2012). There was a range of specific quality concerns, including:
[1] The use of the term 'measures' in this context means: 'An action or procedure intended as a means to an end.' (Macquarie Dictionary, 3rd edition)
· the quality of VET teaching and the need for stronger requirements regarding teacher skills in the delivery and design of training and, related to this, the quality and adequacy of the Certificate IV in Training and Assessment (TAE);
· the quality of educational design (including mode of delivery, workplace learning and the 'depth and duration of training') given its importance in shaping the learning experience; and
· inconsistent understanding of 'quality', and of what is expected for training to be of adequate quality, particularly with regard to the depth and duration of training.

A theme running through this project, which is also evident in the analysis of submissions to the Review of the Standards for the Regulation of VET, is that efforts to improve quality should not be to the detriment of flexibility and innovation. It is for this reason that this project has included among its guiding principles a clear intent that any measures put forward should be flexibly applied, incorporated within existing instruments and used in a selective and targeted manner. This is why, in most instances, it is more appropriate for the measures to be implemented through the Training Package Standards, rather than through the Standards for RTOs. The Training Package Standards can be applied at a unit and/or qualification level, while the Standards for RTOs apply to all RTOs, without reference to the qualification or unit delivered.

Proposed quality measures

It is clear from the NSSC Review of the Standards for the Regulation of VET, input provided directly to this project and public reporting of the practices of some RTOs that there are widespread concerns about quality in some areas of the VET system. There is an acknowledgement of the need to:
· set clear standards for delivery and the quality of assessment;
· strengthen the current outcome based model of quality assurance by offering delivery measures; and
· make these quality improvements in a way that does not create barriers to flexibility and innovation or impose excessive costs and complexity.

The measures described in this report are designed to assist in this reform process. An extensive process of evidence gathering has led to the measures described, noting that:
· there is a strong rationale for strengthening the existing approach to VET quality, as has been emphasised by ISCs on behalf of industry throughout this project; and
· feedback from ISCs, through discussions held during the project and via the questionnaire, confirms that most measures are relevant but their actual use will vary depending on industry requirements.
The delivery measures developed by the project and recommended for implementation are:
· specific trainer requirements;
· language of delivery;
· learning resources;
· prospective learner information;
· range of training conditions;
· learner characteristics;
· mode; and
· volume of learning.
The definition of each delivery measure is provided in Table ES 1.1.
Table ES 1.1 DELIVERY MEASURES AND DEFINITIONS

Specific trainer requirements: Specific trainer requirements are additional requirements to the national VET Standards for RTOs. This measure is to be applied to the delivery of training in high risk and/or high consequence areas.

Language of delivery: The language (oral and/or written) to be used in the delivery of training.

Learning resources: Learning resources are texts, videos, software, and any other materials used to support learning.

Prospective learner information: Information related to the unit of competency, qualification and/or Skill Set that must be included in marketing material, information guides or a prospectus.

Range of training conditions: Specifies different environments and conditions that may affect training. Essential operating conditions that may be present (depending on the situation, needs of the learner, accessibility of the item, and local industry and regional contexts) are included. Range is restricted to essential operating conditions and any other variables essential to the learning environment.

Learner characteristics: Learner characteristics are those that contribute to successful completion of the qualification.

Mode: Delivery mode refers to the medium used to deliver the training/facilitate the learning and may be face-to-face, via technologies, distance resource based, or blended.

Volume of learning: Volume of learning in the VET sector is the range of hours for learners to achieve the learning outcomes of a unit of competency or a qualification, having regard to the characteristics of the learner (see learner characteristics) and the mode of delivery. It includes all learning and assessment activities required for the achievement of the learning outcomes, such as:
· direct contact (as in training delivered in classrooms and the workplace);
· practical and structured work;
· practice and consolidation of knowledge and skills;
· independent study and reflection;
· formative and summative assessment; and
· compliance with relativity measures applied to units such as points weightings (which account for complexity and other factors but do not refer to nominal hours).
The assessment measures developed by the project and recommended for implementation are:
· assessment system;
· specific assessor requirements;
· language of assessment;
· assessment methods;
· assessment tools;
· reasonable adjustments;
· validation model; and
· validation specifications.
The definition of each assessment measure is provided in Table ES 1.2.
Table ES 1.2 ASSESSMENT MEASURES AND DEFINITIONS

Assessment system: An assessment system is a coordinated set of policies and procedures designed and implemented to increase the likelihood that assessments of numerous candidates, using many different assessors, in varying situations, are consistent, fair, valid and reliable. An assessment system may include grievances and appeals processes, validation systems and processes, moderation, reporting/recording arrangements, acquisition of physical and human resources, administrative procedures, roles and responsibilities, partnership arrangements (where relevant), quality assurance mechanisms, risk management strategies and documented assessment processes.

Specific assessor requirements: Specific assessor requirements are additional requirements to the national VET Standards for RTOs. This measure is to be applied to the delivery of assessment in high risk and/or high consequence areas.

Language of assessment: The language (oral and/or written) to be used in assessment.

Assessment methods: Assessment methods are the particular techniques used to gather evidence.

Assessment tools: Assessment tools include the following components: context and conditions of assessment, tasks to be administered to the student, an outline of the evidence to be gathered from the candidate, and the evidence criteria used to judge the quality of performance (i.e. the assessment decision-making rules). This term also takes in the administration, recording and reporting requirements, and may address a cluster of competencies as applicable for holistic assessment.

Reasonable adjustments: Reasonable adjustments are those made to the way in which evidence of candidate performance is gathered, while ensuring that the criteria for making competent/not yet competent decisions (and/or awarding grades) are not altered in any way.

Validation model: Validation is a quality review process. It involves checking that the assessment tool produced valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the relevant aspects of the Industry Training Package or accredited course had been met. It includes reviewing and making recommendations for future improvements to the assessment tool, process and outcomes.

Validation specifications: Validation specifications are specific requirements for implementing the model, such as timing, sampling framework, validator qualification/experience, and focus (e.g. tool and/or judgements).
Use of the measures within current VET standards

As the project's guiding principles state, the recommended quality measures are to be 'capable of integration into national VET standards (Standards for Training Packages or Standards for RTOs)'. It is not proposed that the measures represent a new regulatory or quasi-regulatory instrument. It is recommended that each of the measures be incorporated in the Training Package Standards, except the assessment system measure, which should reside in the Standards for RTOs.

In taking the measures forward, the Standards for RTOs and the Training Package Standards could be more clearly constituted as the 'bookends' of VET regulation. The delivery standards in the Standards for the Regulation of VET would contain both general quality provisions in the form of a delivery system and a schedule of specifically defined measures, such as those set out in Table ES 1.1, which could be cross referenced to specific requirements set out in the Training Package Standards, for example in the qualifications section. Similarly, general assessment requirements could be set in the form of an assessment system, together with the defined measures set out in Table ES 1.2, which could be cross referenced to the Training Package Standards under the assessment heading, building on current arrangements.

Industry advice regarding quality measures

ISCs provided extensive input to the project through a detailed questionnaire and participation in a series of workshops. The overall findings from the questionnaire process were that:
· ISCs had high levels of agreement with the proposed definitions (the definitions described in this report reflect refinements based on ISC suggestions);
· ISCs indicated they will apply measures selectively, based on the needs of their industry sector(s) and the quality requirements of individual qualifications;
· ISCs emphasised the inter-related nature of the measures; and
· the impact of each measure will need to be considered, but the compliance burden will be minimised through the flexible approach.

In relation to implementation, there is a risk that over time measures could be applied to increasing numbers of qualifications. It would therefore be essential for the NSSC to closely monitor the proposed application of the measures. This could include monitoring why measures have been included in a given circumstance, and possibly why a measure has not been included where a qualification would seemingly benefit from it being specified. Additionally, there should be clear guidelines in the Training Package Standards regarding the appropriate use of each measure.
Chapter 1 Introduction
1.1 Project rationale

This project was commissioned by ISCs with support from the NSSC and DIISRTE. The project was established in response to a concern that, while ISCs are the bodies responsible for competency standard development, they currently have minimal ability to ensure that RTOs consistently interpret, deliver and assess the requirements of the standards for all units and qualifications. Current arrangements have led to a range of issues which have been debated but left unresolved since the Training Package model was developed. These include variation in the quality of actual delivery by different RTOs, and resulting concerns in industry about the quality and consistency of the VET system.

This project is focussed on quality measures [2] that ISCs, on behalf of industry, have identified as being important and relevant. The application of specific measures, if adopted, will therefore vary by Training Package and qualification (and possibly by unit). This reflects that a range of quality measures has been identified, relating to different industries and qualification types. As the guiding principles set out in Section 1.2 state, an overarching consideration is that the measures can be flexibly applied, and the advice on giving effect to the measures, set out in Chapter 4, reflects this principle. A further critical principle is that all of the measures under consideration should form part of the national VET regulatory framework and should not be seen as separate regulatory or quality instruments.

The project was designed to elicit the views of industry and draw on the detailed understanding and experience of ISCs in the development and implementation of national competency standards and national qualifications, in relation to measures that would improve the quality of VET delivery and assessment and the integrity of qualification issuing. The project process and outcomes will therefore be considered as key inputs to the development of a position paper by the NSSC in relation to its Review of the Standards for the Regulation of VET. Several of the assessment measures are consistent with strengthened assessment requirements agreed by the Standing Council on Tertiary Education, Skills and Employment (SCOTESE) in the revised Training Package Standards last year. Consequently, the measures can only be implemented if agreed by SCOTESE on the advice of the NSSC.
[2] The use of the term 'measures' in this context means: 'An action or procedure intended as a means to an end.' (Macquarie Dictionary, 3rd edition)
Project governance

The project was guided by a Project Steering Committee comprising representatives from each of the three commissioning bodies.
1.2 Principles

The following principles have guided the development of the measures and the advice on their implementation. The measures will:
· focus on the quality of training delivery and the integrity of assessment, particularly as required by the specified units of competency and qualifications;
· be able to be flexibly applied to different units of competency and qualifications within and between Industry Training Packages;
· lead to improved industry confidence in the quality and consistency of outcomes from the national training system;
· contribute to improved professional practice in VET delivery and assessment;
· be capable of implementation without adding to overall compliance costs and compliance processes;
· be capable of being accommodated within the Regulatory Impact Statement process for the national standards for VET regulation;
· be capable of integration into national VET standards (Standards for Training Packages or VET regulation);
· be clear, concise and auditable, by better aligning requirements for training delivery and assessment with the requirements of units of competency and qualifications; and
· be able to be consistently interpreted by both practitioners and regulators.

1.3 Delivery and assessment measures

The project has developed 'delivery' and 'assessment' measures (these measures were originally termed input and output measures). Delivery measures focus on the process of training while assessment measures focus on the integrity of assessment, while recognising that these processes are inter-related. The inclusion of both delivery and assessment measures reflects a view that existing arrangements, which have a focus on assessment determining quality outcomes, have been 'silent on the efficacy of the program design and its relationship to quality training and assessment outcomes' (Bateman, Vickers & Dunn 2010, p.3). The rationale for the scope of measures is detailed further throughout the report.
1.4 Related developments

The measures should be seen in the context of recent major and relevant developments in VET regulation and quality assurance.
Revision of standards for regulation of VET

SCOTESE commissioned the NSSC to undertake a broad ranging review of the Standards for the Regulation of VET focussing on issues of quality. An issues paper was developed to inform and guide the development of the NSSC Position Paper. The Position Paper will put forward what the NSSC considers to be the changes required to the Standards for the Regulation of VET, and will be the key reference for the subsequent drafting of new standards (see NSSC website).

National and state VET regulation

The Australian Skills Quality Authority (ASQA) is the national VET regulator. The transfer of the regulatory function has occurred in several stages. On 1 July 2011, ASQA became the regulatory body for the VET sector for the Australian Capital Territory, the Northern Territory and New South Wales; for Tasmania in February 2012; for South Australia in March 2012; and for Queensland in July 2012. [3]

Training Package standards

In November 2012 Commonwealth and state/territory ministers endorsed the Standards for Training Packages developed by the NSSC. The Standards apply to the design and development of Industry Training Packages by ISCs, which are then endorsed by the NSSC. The recommended application of the measures within the revised Training Package Standards is described in Chapter 4.

National reforms

In April 2012, COAG agreed to a set of reforms to the national training system, agreeing to a revised National Agreement for Skills and Workforce Development and a new National Partnership Agreement on Skills Reform (NP). Key reforms related to VET provision included:

... assure the quality of training delivery and outcomes, with an emphasis on measures that give industry more confidence in the standards of training delivery and assessment. ... improving the confidence of employers and students in the quality of training courses, by developing and piloting independent validation of training provider assessments.
National Partnership Agreement on Skills Reform 2012

The National Partnership initiative arose from the Skills Australia (2011) report, Skills for prosperity: a roadmap for vocational education and training, released in May 2011. The report recommended external validation as a mechanism to lift the quality, rigour, validity and consistency of assessment outcomes.
[3] Victoria and Western Australia have not referred their powers in the regulation of VET to the Commonwealth. However, ASQA does regulate those providers in Victoria and WA that have international students or that also operate in a referring state.
Under the NP between the states and territories and the Commonwealth, states and territories have agreed to develop and pilot models of independent validation of RTO assessment practices with a view to informing the development of a national model. These projects are in progress and the outcomes from this project can be used to inform those pilots.

1.5 Methodology

The project used a combination of evidence based analysis of literature and existing practice, together with detailed consultations via an online questionnaire and face to face group meetings. This ensured a collaborative process, involving extensive opportunities for detailed input from the PSC, all ISCs and the NSSC Secretariat. The project involved the following stages:
· project initiation and scoping;
· circulation of a detailed issues paper and development of draft measures;
· ISC questionnaire, consultation and testing of draft measures; and
· reporting.

Project initiation and scoping

This stage involved initial high-level consultation with PSC members and ISC Chairs. The NSSC Secretariat was also consulted from the outset of the project.

Issues paper and draft measures

The issues paper, which was endorsed by the PSC, served two purposes:
· it ensured the project was grounded in literature relating to the potential content of the measures, as well as existing and emerging practice; and
· it provided the framework for ISCs and the PSC to review, validate and test the measures.

ISC questionnaire, consultation and testing of the measures

Accompanying the issues paper, each ISC was sent a link to a web-hosted questionnaire. ISCs contributed to the development of the questionnaire. The questionnaire was in three parts:
· Part 1: obtain ISC feedback on the definition of each measure in the quality frameworks;
· Part 2: 'apply' the measures to selected ISC qualifications; and
· Part 3: rate the measures described in the two frameworks based on likely frequency of use, importance and ease or difficulty of implementation.

Questionnaire results were analysed and presented to ISCs for discussion. This assisted in guiding subsequent advice to the PSC.
Reporting

This report summarises the outcomes of the project, incorporating the extensive consultation undertaken as outlined in the methodology.

1.6 This report

This report comprises the following chapters:
· Chapter 2 -- describes the drivers for the development of delivery and assessment measures;
· Chapter 3 -- describes the proposed quality measures;
· Chapter 4 -- describes the recommended approach to the use of the measures in the current VET standards;
· Chapter 5 -- summarises industry advice provided to the project, primarily from the ISC questionnaire; and
· Appendices provide further detail of the rationale for each measure and ISC input.
Chapter 2 Drivers for the development of delivery and assessment measures
This chapter provides further detail regarding the key drivers for the development of delivery and assessment measures undertaken by this project. It provides relevant background regarding the Training Package model and the role of ISCs. It then summarises quality issues in VET and the role of the proposed measures in dealing with them, drawing on the NSSC Review of the Standards and advice provided directly to the project from ISCs and the PSC. (Chapter 5 provides further detail of ISC input.)

2.1 The Training Package model and ISCs

Training Package model

The measures recommended by this project must be considered in terms of the original objectives of Training Packages, the Standards for the Regulation of VET and the move away from nationally prescribed courses and curriculum. The Training Package model was originally developed to more directly align qualifications with industry developed national competency standards, to reduce regulation by removing requirements for state based course accreditation, and to create greater flexibility in delivery strategies. The current definition of Industry Training Packages is provided in Box 2.1.
Box 2.1 INDUSTRY TRAINING PACKAGES

Industry Training Packages specify the skills and knowledge required to perform effectively in the workplace. They do not prescribe how an individual should be trained. The development and endorsement process for Industry Training Packages ensures the units of competency, qualifications, skill sets and assessment requirements are developed to an agreed quality standard and are responsive to industry's existing and future demand for skills. Each Industry Training Package:
· provides a consistent and reliable set of endorsed components -- endorsed by the NSSC;
· enables nationally recognised VET qualifications to be awarded through direct assessment of workplace competencies;
· encourages the development and delivery of flexible training which suits individual and industry requirements; and
· encourages learning and assessment in a work-related environment which leads to verifiable workplace outcomes.
Industry Training Packages are made up of these nationally endorsed components: units of competency; assessment requirements; qualifications; and credit arrangements. Each Industry Training Package is accompanied by one or more quality assured Companion Volumes.
Source: The Compact within the ISC Funding Agreement held with DIISRTE
Prior to the introduction of Industry Training Packages, national curriculum was developed and programs were accredited to complement national competency standards. This process built on the practice of state based curriculum development and accreditation, providing an essentially input based approach to quality assurance. However, while national curriculum provided a comprehensive base for delivery, the curriculum development and accreditation process tended to be cumbersome and unwieldy.

Under the Training Package model, the emphasis of quality assurance shifted to RTO compliance against the standards for initial and continuing registration. To avoid widespread variation in the quality of learning resources, training delivery and assessment, it was agreed that learning resources produced by Industry Training Package developers would be noted, although not endorsed, by the predecessor to the NSSC. As such, guidance within learning resources, and their use, cannot have and was not intended to have any regulatory effect.

In response to concerns about quality and consistency in training delivery there have been progressive revisions to the standards for initial and continuing registration of RTOs relating to strategies for quality training and assessment. These standards, as they relate to assessment, cross-reference the requirements of the relevant Industry Training Packages.

Some ISCs and industry bodies strongly adhere to the concept of Industry Training Packages primarily specifying the skills and knowledge required to perform effectively in the workplace. The issue is the extent to which important measures related to the delivery of training should be specified to address quality concerns, without the level of prescription about training delivery that existed previously.

Industry Training Package reforms

The VET Products for the 21st Century Report (NQC 2009) recommended reform of important aspects of Industry Training Packages, including restructuring and streamlining of Industry Training Package content. The new Standards for Training Packages implement several recommendations of the report. Box 2.2 summarises the role of the new Standards for Training Packages.
Box 2.2 NEW STANDARDS FOR TRAINING PACKAGES

The new Standards for Training Packages were endorsed by SCOTESE on 16 November 2012. The purpose of the Standards for Training Packages is to ensure Industry Training Packages are of high quality and meet the workforce development needs of industry, enterprises and individuals. The Standards apply to the design and development of Industry Training Packages by ISCs, which are then endorsed by the NSSC.
The Standards for Training Packages replace the Training Package Development Handbook (which includes the previous Training Package Development and Endorsement Process). The new Standards for Training Packages implement the agreed recommendations from the joint COAG/NQC VET Products for the 21st Century Report, endorsed by the Ministerial Council for Tertiary Education and Employment. Under the new Standards there will be a strengthened quality assurance process that will include all components of a Training Package put forward to the NSSC for endorsement.
Source: NSSC, accessed 8 March 2013, http://www.nssc.natese.gov.au/__data/assets/pdf_file/0009/72765/NSSC-SB-03__Standards_for_Training_Packages.pdf
Role of ISCs

The formal role of ISCs is described in Box 2.3, although as independent bodies most ISCs also undertake other roles relevant to their industry sectors.

Box 2.3 ROLE OF INDUSTRY SKILLS COUNCILS

The formal roles of Industry Skills Councils involve:
· actively supporting the development and continuous improvement of high quality training material (in accordance with the Compact with Industry);
· engaging in workforce development activities and services; and
· providing integrated industry advice to government, the Australian Workforce and Productivity Agency, industry and enterprises on workforce development and skills needs, including the production of an annual Environmental Scan.
Source: ISC Funding Agreement held with DIISRTE
ISCs have had a close involvement with this project. This has occurred through ISC representation on the PSC, an ISC questionnaire, and three major briefings with ISCs at key points in the project, including an initial meeting with ISC Chairs.

Through the input provided by ISCs directly to the project, combined with the formal consultation process undertaken by the NSSC in reviewing the Standards for the Regulation of VET, the project has formed a clear understanding of the current views of industry (through ISCs) and other key stakeholders regarding VET quality, in particular as those views apply to particular qualifications and units of competency. This project in large part reflects these views and the insights of ISCs in relation to qualifications and units of competency within and across Industry Training Packages.
Specifically, concerns about VET quality have been raised by industry directly and through ISCs. These concerns must be addressed to ensure that industry confidence in the VET system is maintained. Concurrently, it is important that efforts to improve quality do not detract from the ability of RTOs to be flexible and innovative.

2.2 Extent of change required

The analysis of submissions provided to the NSSC review (2012) found that, of those respondents that provided a direct answer on the scope of change required (about three quarters of respondents), 42 per cent (33 out of 79) said 'significant reform was required'. A further 32 per cent (25 out of 79) stated there was a need for significant reform 'within key areas'. The remainder said 'minimal' reform was required. While the meaning of 'significant reform' is subjective, this response does indicate a strong general view that the Standards for the Regulation of VET are in need of fundamental revision, reflecting concerns about aspects of VET quality, as other findings from the submissions indicate.

2.3 Concerns about VET quality

The overall finding of the analysis of submissions to the NSSC Review was that:

The main area...identified as needing significant reform was the quality of learning and assessment: both the inputs and the outcomes.
NSSC Review of the Standards for the Regulation of VET 2012, p.8

It is significant that submissions in response to the NSSC issues paper suggested that reform was needed to both input (delivery) and output (assessment) standards. More broadly, the issue of quality attracted 'the strongest response in the submissions' (NSSC 2012). There was a range of specific quality concerns, including:
· the quality of VET teaching and the need for stronger requirements regarding teacher skills in the delivery and design of training and, related to this, the quality and adequacy of the Certificate IV TAE;
· the quality of educational design (including mode of delivery, workplace learning and the 'depth and duration of training' -- including inappropriate offering of RPL) given its importance in shaping the learning experience; and
· inconsistent understanding of 'quality', and of what is expected for training to be of adequate quality, particularly with regard to the depth and duration of training.

In summarising the views expressed by RTOs, the report found that:

RTOs expressed particular concerns about the ability of the current standards to adequately safeguard the quality of training delivery and assessment, and ultimately the quality of the qualification outcomes.
NSSC Review of the Standards for the Regulation of VET 2012, p.8
The above concerns are consistent with the observation that the quality of training is not a sufficient parameter in the Standards for the Regulation of VET. The standards have been characterised as 'silent on the efficacy of the program design and its relationship to quality training and assessment outcomes' and as relying on the validity of assessment to determine quality outcomes (Bateman, Vickers & Dunn 2010). While the above extracts relate to the adequacy of the Standards for the Regulation of VET, they reflect a broader concern about the apparent inadequacy of current safeguards of VET quality. This is reflected in widespread support for reform of the VET standards across each of the major VET stakeholder groups (including RTOs, industry and government organisations).

2.4 Consistency and confidence in VET outcomes

In analysing submissions to the NSSC Review of the Standards for the Regulation of VET (Ithaca Group 2012) on questions relating to nationally consistent outcomes, the report found that:

The consistency of outcomes was generally seen as an area undermining the reputation of the VET sector and requiring significant attention in any regulatory framework.
NSSC Review of the Standards for the Regulation of VET 2012, p.15

Bateman and Gillis (2012) noted that in recent years a number of key stakeholders have raised concerns with the quality and consistency of competency based assessments (e.g. Skills Australia, 2011; Hoeckel, Moonhee, Simon, & Troy, 2008; Precision Consultancy, 2008; Service Skills SA, 2012), observing that:

... ensuring the comparability of standards has become particularly pertinent in the VET sector, as assessments can now be made across a range of contexts (e.g. vocational education, educational and industrial contexts) by a diverse range of assessors using highly contextualised performance based tasks
Independent Validation: Workshop Background Paper

The NSSC analysis of submissions also noted that some respondents considered that the problems with the VET standards 'have arisen due to inconsistencies of the interpretation and implementation of the standards' (NSSC 2012, p.8). While this can be framed as a national consistency issue, fundamentally it is a further manifestation of concerns about quality, and about the ability of industry to have confidence in VET qualifications.

Links with Industry Training Packages

As Training Package requirements are themselves statements of required outcomes and the Standards for RTOs are very broad, there can be significant differences between RTOs in the nature and quality of both learning and assessment resources, and in the actual training and assessment process.
These differences can create difficulties in the registration and audit process of RTOs in terms of consistent interpretation by regulators. They may also affect industry confidence in the quality and integrity of the training process, and in assessment and qualification issuing. For example, there have been frequent concerns about programs for the same qualification being of significantly different duration, about access by learners to industry relevant facilities, and about the quality of teaching and assessment resources.

In addressing this issue, the report of submissions to the NSSC summarised that respondents had advocated a stronger link with Training Package requirements. This is entirely consistent with the approach recommended by this project. The report observed that:

Many respondents suggested that more emphasis needed to be placed on ensuring training delivery and assessment meets Training Package requirements in order to achieve national consistency of qualification outcomes.
NSSC Review of the Standards for the Regulation of VET 2012, p.15

The report highlighted that ASQA had urged that Industry Training Packages set clear standards for delivery and the quality of assessment:

The link between the national standards for the regulation of VET and the Training Package requirements needs to be strengthened to provide greater consistency in how industry and training providers interpret and implement vocational training to meet industry standards.
ASQA submission to the NSSC Review of the Standards for the Regulation of VET 2012, p.16

Further to these submission extracts and the current definition of Industry Training Packages provided in Box 2.1, this project is putting forward measures which are optional for ISCs, only deal with critical aspects of the delivery process where required, and do not prescribe the full delivery process.

Role of inputs (delivery measures)

The concerns outlined in submissions to the NSSC review, and expressed by many ISCs and the PSC throughout this project, have been evident since the initial High Level Review of National Training Packages in 2003. A summary of the major themes raised by stakeholders indicated that:

Some stakeholders seek more specific information in endorsed components of Training Packages, including the sort of information currently provided in implementation guides and user guides, such as nominal hours, entry requirements for qualifications and more comprehensive guidance on delivery and assessment. Others suggest that Training Packages should define the processes for training delivery.
Australian National Training Authority 2003

Essentially, these sentiments suggest that input, or delivery, requirements may have a greater role to play in ensuring quality and consistency in what is fundamentally an outcome based system. With this in mind, respondents to the Review of the Standards for the Regulation of VET were asked if 'outcomes focussed' should remain a key feature of the standards.
The report analysing submissions observed that a sizeable portion of respondents did not want an exclusively outcomes based approach. This was reflected in growing support for input measures in some areas. Responses to the NSSC issues paper also suggested that the Tertiary Education Quality and Standards Agency (TEQSA) Teaching and Learning Standards be used as a model for strengthening the Standards for the Regulation of VET. The TEQSA standards include processes and inputs as well as outcomes. The government agencies responding to the NSSC consultation paper also suggested that some quality indicators or measures, as used by funding bodies, should be introduced.

This project has recommended that delivery measures be available for ISCs to include in the requirements for specified qualifications, primarily through the Training Package Standards. In any quality assurance system or regulatory framework it is reasonable to establish delivery measures accompanying an outcomes based instrument (such as an Industry Training Package). This adds confidence to the legitimacy of the outcome.

2.5 Ensuring quality while enabling flexibility and innovation

A theme running through this project, which is also evident in the analysis of submissions to the Review of the Standards for the Regulation of VET, is that efforts to improve quality should not be to the detriment of flexibility and innovation. It is for this reason that this project has included among its guiding principles a clear intent that any measures put forward should be flexibly applied, incorporated within existing instruments and used in a selective and targeted manner. This is why, in most instances, it is more appropriate for the measures to be implemented through the Training Package Standards, rather than through the Standards for the Regulation of VET. The Training Package Standards can be applied at a unit and/or qualification level, while the Standards for the Regulation of VET apply to all RTOs.
Chapter 3 Proposed quality measures
This report sets out a range of quality measures, which have been validated and tested by ISCs with reference to specific qualifications.

3.1 Rationale for proposed measures

The measures were derived from: an analysis of background research and reports (which were summarised in the issues paper); and the detailed past involvement of members of the project team in the development and review of Training Packages and the Australian Quality Training Framework (AQTF) standards over a number of years, as well as in the development of both the National Quality Council (NQC) validation and moderation booklets and the AQTF user guides. Further detail regarding the evidence base for the measures is provided in Chapter 2 and Chapter 5.

In summary, the project is responding to concerns about VET quality, which have implications for industry confidence in the VET system. It is clear from the NSSC Review of the Standards for the Regulation of VET, input provided directly to this project (through ISCs on behalf of industry and PSC members) and public reporting of the practices of some RTOs that there are widespread concerns about quality in some areas of the VET system. The NSSC review has reported a widespread acknowledgement of the need for significant reform in order to respond to these concerns and safeguard VET quality. There is an acknowledgement of the need to:
· set clear standards for delivery and the quality of assessment;
· strengthen the current outcome based model of quality assurance by offering delivery measures that are necessary to ensure assessment outcomes can be met; and
· make these quality improvements in a way that does not create barriers to flexibility and innovation or impose excessive costs and complexity.

The measures described in this chapter are designed to assist in this reform process. Chapter 4 details how the measures should be given effect, noting that in all instances it is recommended that measures reside either in the Standards for RTOs or, in most cases, the Training Package Standards. The measures, including a definition, options describing the various forms they can take, and a rationale for their inclusion, are provided at Appendix A.

An extensive process of evidence gathering has led to the measures described below, noting that:
· there is a strong rationale for strengthening the existing approach to VET quality, as has been emphasised by ISCs on behalf of industry throughout this project (see Chapter 2); and
· feedback from ISCs, through discussions held during the project and via the questionnaire, confirms that most measures are relevant but their actual use will vary depending on industry requirements (see Chapter 5).

3.2 Delivery measures

The delivery measures developed by the project and recommended for implementation are:
· specific trainer requirements;
· language of delivery;
· learning resources;
· prospective learner information;
· range of training conditions;
· learner characteristics;
· mode; and
· volume of learning.
Table 3.1 DELIVERY MEASURES AND DEFINITIONS

Specific trainer requirements: Specific trainer requirements are additional requirements to the national VET Standards for RTOs. This measure is to be applied to the delivery of training in high risk and/or high consequence areas.

Language of delivery: The language (oral and/or written) to be used in the delivery of training.

Learning resources: Learning resources are texts, videos, software, and any other materials used to support learning.

Prospective learner information: Information related to the unit of competency, qualification and/or Skill Set that must be included in marketing material, information guides or a prospectus.

Range of training conditions: Specifies different environments and conditions that may affect training. Essential operating conditions that may be present (depending on the situation, needs of the learner, accessibility of the item, and local industry and regional contexts) are included. Range is restricted to essential operating conditions and any other variables essential to the learning environment.

Learner characteristics: Learner characteristics are those that contribute to successful completion of the qualification.

Mode: Delivery mode refers to the medium used to deliver the training/facilitate the learning and may be face-to-face, via technologies, distance resource based, or blended.

Volume of learning: Volume of learning in the VET sector is the range of hours for learners to achieve the learning outcomes of a unit of competency or a qualification, having regard to the characteristics of the learner (see learner characteristics) and the mode of delivery. It includes all learning and assessment activities required for the achievement of the learning outcomes, such as:
· direct contact (as in training delivered in classrooms and the workplace);
· practical and structured work;
· practice and consolidation of knowledge and skills;
· independent study and reflection;
· formative and summative assessment; and
· compliance with relativity measures applied to units such as points weightings (which account for complexity and other factors but do not refer to nominal hours).
3.3 Assessment measures

The assessment measures developed by the project and recommended for implementation are:
· assessment system;
· specific assessor requirements;
· language of assessment;
· assessment methods;
· assessment tools;
· reasonable adjustments;
· validation model; and
· validation specifications.
Table 3.2 ASSESSMENT MEASURES AND DEFINITIONS

Assessment system: An assessment system is a coordinated set of policies and procedures designed and implemented to increase the likelihood that assessments of numerous candidates, using many different assessors, in varying situations, are consistent, fair, valid and reliable. An assessment system may include grievances and appeals processes, validation systems and processes, moderation, reporting/recording arrangements, acquisition of physical and human resources, administrative procedures, roles and responsibilities, partnership arrangements (where relevant), quality assurance mechanisms, risk management strategies and documented assessment processes.

Specific assessor requirements: Specific assessor requirements are additional requirements to the national VET Standards for RTOs. This measure is to be applied to the delivery of assessment in high risk and/or high consequence areas.

Language of assessment: The language (oral and/or written) to be used in assessment.

Assessment methods: Assessment methods are the particular techniques used to gather evidence.

Assessment tools: Assessment tools include the following components: context and conditions of assessment, tasks to be administered to the student, an outline of the evidence to be gathered from the candidate, and the evidence criteria used to judge the quality of performance (i.e. the assessment decision-making rules). This term also takes in the administration, recording and reporting requirements, and may address a cluster of competencies as applicable for holistic assessment.

Reasonable adjustments: Reasonable adjustments are those made to the way in which evidence of candidate performance is gathered, while ensuring that the criteria for making competent/not yet competent decisions (and/or awarding grades) are not altered in any way.

Validation model: Validation is a quality review process. It involves checking that the assessment tool produced valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the relevant aspects of the Training Package or accredited course had been met. It includes reviewing and making recommendations for future improvements to the assessment tool, process and outcomes.

Validation specifications: Validation specifications are specific requirements for implementing the model, such as timing, sampling framework, validator qualification/experience, and focus (e.g. tool and/or judgements).
3.4 Relationship between measures

ISCs emphasised the inter-related nature of the measures. For example, they observed that volume of learning could sometimes be addressed in conjunction with a clear and comprehensive description of learner characteristics and information to prospective learners, to differentiate between the time it would take for an experienced practitioner to achieve competence relative to that for a 'novice learner' such as a school leaver or a new workforce entrant.

The delivery and assessment measures are not necessarily independent of each other. For some measures there may be interdependence and the measures may be highly related. The following tables provide an indication of the measures which are most closely related. Table 3.3 shows the major links between the delivery measures.
Table 3.3 RELATIONSHIP BETWEEN DELIVERY MEASURES
Measure Specific trainer requirements Language of delivery Learning resources Prospective learner information Range of training conditions Learner characteristics Mode Volume of learning
Linkages - Mode - Range of training conditions Mode, language of delivery
Table 3.4 shows the major links between assessment measures.
Table 3.4 RELATIONSHIP BETWEEN MEASURES

Specific assessor requirements: Assessment system
RPL/assessment only: Assessment system
Language of assessment: Assessment system
Assessment system: All (i.e. through a combination of direct and indirect paths)
Assessment methods: Assessment tools
Assessment tools: Assessment methods; Assessment conditions; Validation model; Validation specifications; Assessment system
Assessment conditions: Assessment tools; Assessment system
Reasonable adjustments: Assessment tools; Assessment system
Validation model: Validation specifications; Assessment system
Validation specifications: Validation model
As the above tables indicate, the assessment measures are highly related, whereas there are fewer connections among the delivery measures. It can also be seen that the measures only relate within (as opposed to across) each type of measure (i.e. within delivery or within assessment). For illustrative purposes only, a conceptual framework for the connections associated with the assessment tool measure is shown in Figure 3.1. For example, only when the option for assessment tools is 'centrally developed' (irrespective of whether they are centrally or locally administered) does the measure of reasonable adjustments come into play. It can also be seen that the options within the assessment system measure have a direct impact on the assessment tools, assessor requirements, validation model and validation specifications measures, as well as an indirect impact on reasonable adjustments (via the assessment tool measure).
Figure 3.1 DECISION TREE: ASSESSMENT TOOLS
3.5 Other measures considered

In developing the draft quality frameworks, all potential measures were included for testing during the project. Two of these, RPL/assessment only and assessment conditions, are considered either unsuitable or unnecessary for inclusion in the final recommended measures. The delivery measure range of training conditions was originally labelled range of conditions, but this was changed to avoid confusion with the existing range of conditions field in the Unit of Competency Template of the Standards for Training Packages.

RPL/assessment only

RPL/assessment only was a measure put forward for comment in the consultation process with ISCs. However, in the course of the project the PSC agreed that if ISCs had discretion in selecting when the quality measures recommended in this report were required and applied, particularly in relation to assessment, it would not be reasonable to also place what would be arbitrary limits on the extent of RPL-based assessment that can be granted. It was therefore not necessary to take this measure forward through this project. However, in the short to medium term a strong focus by ASQA and state regulators will be required to guard against inappropriate use of RPL. The other measures proposed will assist in ensuring that the risk of inappropriate use of RPL is greatly reduced.
Assessment conditions

Throughout the project, ISCs and the PSC emphasised the importance of assessment conditions as a quality measure, and this was included in the consultation process. However, as assessment conditions is now a mandatory field in the Assessment Requirements of the Standards for Training Packages, and as there was endorsement of the definition of assessment conditions contained in the standards, it was not necessary to take this measure forward through this project. The implementation of assessment conditions is now a matter for ISCs through Industry Training Packages under current arrangements, but it would have to be considered in the context of the potential future application of additional measures such as assessment validation.

3.6 Use and importance of the measures

This report proposes that ISCs have discretion in specifying when measure(s) are to be applied to qualifications (or skill sets or units). This is expected to ensure that expected competency requirements are met. Of the other measures, ISCs indicated that some would be used infrequently (as the summary of ISC views provided in Chapter 5 suggests). However, in some instances ISCs indicated that particular measures would be especially important. The expected frequency of use was therefore not, of itself, a sufficient basis for removing measures. Several factors were considered, including the individual requirements of each ISC, based on the profile of their industry and industry need. Chapter 5 examines the intended ISC use of the measures in more detail. By way of summary, the two figures below rank the intended use and importance of each measure.

Delivery measures

Figure 3.2 shows the level of use and importance of each delivery measure, as indicated by the ISCs.
Figure 3.2 DELIVERY MEASURES
Source: ISC questionnaire

Assessment measures

Figure 3.3 shows the level of use and importance of each assessment measure, as indicated by the ISCs.

Figure 3.3 ASSESSMENT MEASURES
Source: ISC questionnaire
ISCs indicated throughout the project that they would apply measures selectively, based on the needs of their industry sector(s), and the quality requirements of individual qualifications. Allowing ISCs to choose whether to include these measures in an Industry Training Package is consistent with the principles described in Section 1.2. This reflects that ISCs are best placed to recognise where specific requirements will improve outcomes consistent with the Standards for Training Packages. It is also preferable to an alternative arrangement, in which expected measures are specified for all units and/or qualifications.
Chapter 4 Use of the measures within current VET standards
4.1 Recommended approach

As the project's guiding principles described in Chapter 1 state, the recommended quality measures are to be 'capable of integration into national VET standards (Standards for Training Packages or Standards for RTOs)'. It is not proposed that the measures represent a new regulatory or quasi-regulatory instrument. The project has recommended a placement for each measure within the current Standards for Training Packages. However, as the overall architecture of the Standards for RTOs evolves, further consideration could be given to the overall framework for standards and the location of the measures within them. Table 4.1 summarises the recommended placement of each delivery measure and a brief rationale. It also notes whether a similar measure already exists in the Training Package Standards, the Standards for RTOs or another instrument.
Table 4.1 POSITIONING OF THE DELIVERY MEASURES

All of the delivery measures below would be placed in the Standards for Training Packages (Unit of Competency component and Qualifications component).

Specific trainer requirements
Rationale: These are requirements for trainers to have higher AQF level qualifications for specific qualifications, skill sets or units. It is most likely to apply in high risk or high consequence areas. The requirements would attach to the qualification or unit.
Similar to an existing measure? No

Language of delivery
Rationale: This is the requirement for the language of delivery for a qualification. The requirement would attach to the qualification or unit.
Similar to an existing measure? No

Learning resources
Rationale: This is the requirement for learning materials used to support learning for a qualification. The requirement would attach to the qualification or unit.
Similar to an existing measure? No

Prospective learner information
Rationale: This is the requirement for prospective learner information related to the qualification that is to be provided. The requirement would attach to the qualification or unit.
Similar to an existing measure? No

Range of training conditions
Rationale: This is the requirement specifying different environments and conditions that may affect training. It is recommended that this be made a mandatory field at the unit or qualification level.
Similar to an existing measure? No (the current range of conditions statement refers to performance, not training)

Learner characteristics
Rationale: This is the requirement for a learner to possess desired characteristics, particularly when qualifications can only be effectively acquired by people working in specific job roles. The requirement would attach to the qualification or unit.
Similar to an existing measure? Entry Requirements is an optional field in the Qualifications Template; however, learner characteristics is different, as Entry Requirements refers to prerequisite units or qualifications.

Mode
Rationale: This measure allows ISCs to specify if a particular mode or modes of delivery are required or inappropriate for a qualification. The requirement would attach to the qualification or unit.
Similar to an existing measure? No

Volume of learning
Rationale: This measure is the range of hours for learners to achieve the learning outcomes of a unit of competency or a qualification, having regard to the characteristics of the learner. The requirement would attach to the qualification or unit.
Similar to an existing measure? No
Further explanation of the rationale for each measure is provided in Appendix A. By way of brief further explanation of the volume of learning measure, the revised AQF guidelines include reference to the volume of learning (expressed as a duration measure) for each broad qualification type. The project acknowledges there has been a general unwillingness in VET to have Industry Training Packages specify this requirement. For the reasons outlined in Chapter 2 and reflected in feedback from ISCs, there is now a desire for clearer guidance to be given regarding volume of learning expectations in selected circumstances. The above measure is designed to enable ISCs to give effect to volume of learning in a VET context where appropriate.
Table 4.2 summarises the recommended placement of each assessment measure and a brief rationale.
Table 4.2 POSITIONING OF THE ASSESSMENT MEASURES

Assessment system
Standard: Standards for RTOs
Rationale: This measure is the assessment system established by an RTO and applied to all qualifications. Because the requirement applies to RTOs, it would form an additional requirement in the Standards for RTOs. Different overall assessment systems should not be specified for different qualifications; however, specific measures within the system may be specified.
Similar to an existing measure? No

Language of assessment
Standard: Standards for Training Packages (Assessment Requirements component)
Rationale: This is the requirement for the language of assessment for a qualification. The requirement would be specified through an additional field in the Assessment Requirements template.
Similar to an existing measure? No

Assessment methods
Standard: Standards for Training Packages (Assessment Requirements component)
Rationale: This measure specifies the methods used to gather evidence for assessment. The requirement would be specified through an additional field in the Assessment Requirements template.
Similar to an existing measure? No

Assessment tools
Standard: Standards for Training Packages (Assessment Requirements component)
Rationale: This measure specifies the tools to be used in the assessment. The requirement would be specified through an additional field in the Assessment Requirements template.
Similar to an existing measure? No

Reasonable adjustments
Standard: Standards for Training Packages (Assessment Requirements component)
Rationale: This measure provides for adjustment in the way that assessment evidence is gathered. The measure would be specified through an additional field in the Assessment Requirements template.
Similar to an existing measure? No

Validation model
Standard: Standards for Training Packages (Assessment Requirements component)
Rationale: This measure is a quality review process to ensure the assessment tool has performed as it should. The measure would be specified through an additional field in the Assessment Requirements template.
Similar to an existing measure? No

Validation specifications
Standard: Standards for Training Packages (Assessment Requirements component)
Rationale: These are specific requirements for implementing the validation model. The measure would be specified through an additional field in the Assessment Requirements template.
Similar to an existing measure? No
4.2 Options considered

This project has considered options through which the frameworks could be formally implemented: for example, in relation to assessment, through the recently agreed national Training Package Standards (noting that these will be enforced through the revised Standards for RTOs and are not enforceable in and of themselves) and the unit of competency template, and through the revised Standards for the Regulation of VET by cross referencing industry requirements where these have been endorsed by the NSSC. Ultimately, the strong view of the PSC was that the measures should be included in the endorsed components of the Training Package Standards in most cases. This was preferred for two key reasons:
· this enables ISCs to attach measures at the qualification level, which is appropriate in most cases and supports the principle of flexibility and discretionary application; and
· when a measure is specified, it should be mandatory and subject to audit, hence the need for inclusion in the endorsed components.

4.3 Steps required for the measures to take effect

As outlined in the introduction to this report, the project outcomes are to be considered by the NSSC in relation to its Review of the Standards for the Regulation of VET. The overall architecture of the standards, as well as the standards themselves, is likely to be substantially revised, and measures currently not adequately addressed in the standards are likely to be included, particularly measures related to training delivery. At present, standards relating to training delivery are expressed generally and are subject to differing interpretation. They are therefore difficult to apply through regulation and cannot be contextualised to meet the requirements of different units of competency or qualifications. Standards relating to assessment can, however, be cross referenced to assessment requirements specified in Industry Training Packages.

The Standards for RTOs and the Training Package Standards could be more clearly constituted as the 'bookends' of VET regulation. The Standards for RTOs would contain both general quality provisions in the form of a delivery system and a schedule of specifically defined measures such as those set out in Table 3.1. Clearly defining each of the measures in such a way would avoid uncertainty about what each measure refers to. These measures and corresponding definitions could be cross referenced to specific requirements set out in the Training Package Standards, for example in the qualifications section. Similarly, general assessment requirements could be set in the form of an assessment system together with the defined measures set out in Table 3.2, which could be cross referenced to the Training Package Standards under the assessment heading.
An approach such as this would build on current arrangements. It would also ensure that measures placed in the optional fields of an Industry Training Package align directly with the measures as defined. Further, it would result in RTOs being required to have robust delivery and assessment systems which could be adapted to meet the requirements of specific qualifications or units of competency.

It is recognised, including by ISCs themselves, that there is a risk that over time measures could be applied to increasing numbers of qualifications. It would therefore be essential for the NSSC to closely monitor the proposed application of the measures. This could include monitoring why measures have been included in a given circumstance, and possibly why a measure has not been included where a qualification would seemingly benefit from it being specified. Additionally, there should be clear guidelines in the Training Package Standards regarding the appropriate use of each measure.
Chapter 5 Industry Skills Council advice regarding quality measures

5.1 Measure definitions

ISCs were asked if they agreed with the definition of each measure. The wording of definitions submitted to ISCs was ultimately similar to the final wording described in Chapter 3. While all ISCs responded to this section of the questionnaire, one ISC provided an invalid response that has not been included in the analysis.4

Figure 5.1 summarises the level of agreement with the delivery measures. Of the delivery measures, notional learning time attracted the most comment and the strongest disagreement. Generally, there was a high level of agreement with the delivery measure definitions proposed.

Figure 5.1 DELIVERY MEASURES: AGREEMENT WITH DEFINITIONS
Source: ISC questionnaire

Figure 5.2 summarises the level of agreement with the assessment measures. (As explained in Chapter 3, RPL/assessment only was subsequently removed.) Of the assessment measures, RPL/assessment only attracted the most comment and the strongest disagreement. Although it was removed, this was due to reasons other than the comment it attracted, as explained in Chapter 3. Generally, there was a high level of agreement with the assessment measures proposed.
4 This ISC answered 'no' indicating disagreement, but made a corresponding comment indicating it did agree with many of the definitions but had misgivings that this would imply support for the adoption of the measures. The ISC was given the opportunity to resubmit the survey and used the same approach on the second occasion.
Figure 5.2 ASSESSMENT MEASURES: AGREEMENT WITH DEFINITIONS
Source: ISC questionnaire

The high level of agreement with the definitions reflects that:
· ISCs were consulted on the draft definitions prior to the questionnaire being circulated; and
· where a definition for a measure had already been agreed through a formal instrument (such as the Training Package Standards), consistent wording was used.

5.2 Use of measures: profile of qualifications chosen

The frameworks were 'applied' to 30 qualifications, two units and one skill set. This represents 513 to 563 total units; of these, ISCs advised that 294 were core and the remainder were elective. In addition, ISCs advised that the 30 qualifications chosen were representative of a further 70 qualifications, bringing the total qualifications, skill sets and units that the measures were applied to through the questionnaire to 103.

Criteria selected for qualifications

The questionnaire asked ISCs to choose qualifications (or skill sets or units) based on one of four criteria. Table 5.1 shows the number of times each criterion was nominated with regard to the 30 qualifications, two units and one skill set. (The total is higher than 33 because more than one criterion could be selected for each qualification.)
Table 5.1 CRITERIA SELECTED FOR QUALIFICATIONS CHOSEN

A. The highest volume qualification derived from your ISC's Training Package(s) by enrolments: selected 17 times
B. A qualification which forms part of a designated occupational standard or practice requirement stipulated by a third party (e.g. professional body or statutory regulator): selected 16 times
C. A qualification which has been associated with quality concerns and is therefore considered to be at risk of not being delivered to the level of quality desired by the ISC: selected 22 times
D. A qualification in a substantially new area of provision, either because it reflects an emerging area and/or it has arisen following a recent Training Package revision: selected 13 times

Source: ISC questionnaire
Typical learner cohort

The questionnaire asked ISCs to nominate the typical learner cohort(s) of the 30 qualifications, two units and one skill set selected. Table 5.2 summarises the responses provided. (The total is higher than 33 because more than one cohort could be selected for each qualification.)
Table 5.2 TYPICAL LEARNER COHORT

Existing worker: selected 32 times
School leaver: selected 19 times
VET in schools: selected 7 times
Other (those nominated included apprentices, new entrants with experience in an allied occupation, job seekers and new migrants): selected 16 times

Source: ISC questionnaire
AQF levels

ISCs were asked to nominate the Australian Qualifications Framework (AQF) level of each qualification selected. They were also asked if the qualifications selected were representative of other qualifications and, if so, to specify the AQF level of those qualifications. Figure 5.3 shows the AQF level of the 30 qualifications, plus the qualifications these were representative of.
Figure 5.3 AQF LEVEL
Source: ISC questionnaire

5.3 Use of measures: intended application of measures to range of units

ISCs were asked to 'advise how you would ideally like to see each measure applied to that qualification', for each of the qualifications nominated. Figure 5.4 summarises the responses provided for the delivery measures in relation to each qualification (or skill set or unit). Of the measures, prospective learner information was most likely to be applied to all units, while volume of learning was most likely to be applied to no units.

Figure 5.4 DELIVERY MEASURES
Source: ISC questionnaire
Figure 5.5 summarises the responses provided for the assessment measures. Of the measures, assessment conditions and validation specifications were most likely to be applied to all units, while language of assessment was most likely to be applied to no units.

Figure 5.5 ASSESSMENT MEASURES
Source: ISC questionnaire

5.4 Use of measures: intended method of application

ISCs were provided with a range of options for each measure and asked to 'select the option which best describes how you would ideally like to see this measure applied to the qualification (or skill set or unit)'. The options, which matched those set out in the issues paper, are reproduced in Appendix A. In all instances, ISCs were able to select more than one option, reflecting that those listed could co-exist within a qualification.

Delivery measures

Figure 5.6 shows the ISC response for specific trainer requirements, language of delivery, learning resources and prospective learner information. The responses indicate that:
· for the qualifications selected, each of the specific trainer requirement options would be widely used;
· there was an intention to specify English as the language of delivery in 63 out of 103 cases; and
· for learning resources, the 'provider selected or developed' option was chosen more often than 'recommended' or 'mandated'.
Figure 5.6 SPECIFIC TRAINER REQUIREMENTS, LANGUAGE OF DELIVERY, LEARNING RESOURCES AND PROSPECTIVE LEARNER INFORMATION
Source: ISC questionnaire
Comments and suggestions made when 'Other' was selected for each of the measures included the following.
· For specific trainer requirements: 'current and relevant experience and knowledge in the field being delivered', 'currency of technical vocational competency', and 'currency of industry experience'.
· For language of delivery: some respondents observed that in some instances English is mandated by occupational regulations (including health and safety regulation) or Australian standards.
· For learning resources: comments included a view that mandating was unnecessary because there is a 'vast array of resources' and that 'enterprise, location and technology requirements are determinant factors'. There was also an observation that in some regulated industries resources are currently mandated. A caution was also raised that mandating resources could be costly and stifle innovation.

Figure 5.7 shows the ISC response for range of conditions, learner characteristics, prospective learner information and mode. The responses indicate that:
· for prospective learner information, an intention was flagged to mandate this in 65 cases and to recommend it in 51 cases;
· each of the options provided for range of conditions was selected relatively frequently, with 'critical attributes of the workplace/simulated environment' chosen in 98 cases;
· of the learner characteristic options, 'access to workplace and appropriate work tasks' was selected in 75 cases, followed by language, literacy and numeracy levels, which was chosen in 63 cases; and
· for mode, 'on the job and off the job' was the most commonly selected option by some margin, being chosen in 74 cases.

Figure 5.7 PROSPECTIVE LEARNER INFORMATION, RANGE OF CONDITIONS, LEARNER CHARACTERISTICS AND MODE
Source: ISC questionnaire

Comments and suggestions made when 'Other' was selected for each of the measures included the following.
· For prospective learner information, a comment that Industry Training Packages provide a range of important learner information.
· For range of conditions, a suggestion that another option is for the conditions to be 'enterprise/technology related', and examples indicating that the range of options is influenced by regulatory requirements (e.g. real estate agents are required to hold a licence before operating in the workplace, while on the job training is strongly embedded in the apprenticeship model).
· For learner characteristics, additional options provided related to regulatory requirements, including that learners must be engaged in a contract of training. Another suggestion was that 'communication and numeracy skills (are) sufficient to enrol in the qualification'.
· For mode, additional comments included that a 'combination of on/off job, blended and online' is desirable. A distinction was also made between simulation in the workplace and a 'simulated workplace'. Respondents also observed that regulatory requirements may limit mode options.

Figure 5.8 shows the ISC response for assessment system. The responses indicate that:
· for assessment system, 'RTO internally designed, managed and implemented but externally reviewed' was the most selected option, being chosen in 58 cases;
· in response to the question about the type of organisation that should operate the assessment system, of the options presented, a 'national, industry specific organisation/body/association (e.g. ISC)' was chosen for 54 cases, while the option of 'a national, cross industry organisation (e.g. government regulators)' was selected for 47 cases; and
· in response to the question asking if this could be done by an entity other than an RTO if it were to be done internally, there was agreement for 48 cases, while 37 stated 'no'.

Figure 5.8 ASSESSMENT SYSTEM
Source: ISC questionnaire

'Other' was selected on few occasions regarding the assessment system. However, some other options were suggested regarding the external assessment body. These included that 'industry and regulators could engage with and support the government regulator', and an observation that 'international agreements on assessment specifications, skilled migration requirements and licence recognition' are also a source of assessment authority for some qualifications.
Assessment measures

Figure 5.9 shows the ISC response for specific assessor requirements and language of assessment. The responses indicate that:
· for specific assessor requirements, each of the options was selected relatively frequently, with 'qualifications' being chosen in 66 cases and 'continuous professional development' in 59 cases; and
· for language of assessment, English was selected in 72 cases.

Figure 5.9 SPECIFIC ASSESSOR REQUIREMENTS AND LANGUAGE OF ASSESSMENT
Source: ISC questionnaire

'Other' was selected quite frequently in response to 'specific assessor requirements'. The suggestions and comments included:
· 'current and relevant experience and knowledge in the field being assessed' and 'currency of industry experience';
· 'regulator specifications'; and
· 'technical vocational competency'.

Figure 5.10 shows the ISC response for assessment methods. The responses indicate that:
· for assessment methods, each of the options was chosen relatively frequently, with 'observation' selected in 89 cases; and
· 'yes' was selected more than 'no' for each of the three questions about the circumstances in which assessment requirements might vary, although the strongest agreement was in relation to variation by 'qualification type'.
Figure 5.10 ASSESSMENT METHODS
Source: ISC questionnaire

Suggestions and comments made when 'other' was selected for assessment methods included a statement that methods should be flexible and not prescribed, while another option put forward was 'regulator mandated methods'.

Figure 5.11 shows the ISC response for assessment tools. The responses indicate that assessment tools would be developed 'centrally' in 44 cases compared to 30 instances where the 'local/provider level' was chosen. However, on the question of how assessment tools should be implemented, the local level was chosen 56 times compared to 14 instances of 'centrally'.

Figure 5.11 ASSESSMENT TOOLS
Source: ISC questionnaire
'Other' was selected a number of times in response to the two questions about assessment tools. Suggestions and comments made when 'other' was selected included that:
· 'tools should be drawn from both central and provider level sources to allow for ... jurisdictional regulatory differences'; and
· 'regulators' could also have a role in assessment tool development in some instances.

Figure 5.12 shows the ISC response for assessment conditions and reasonable adjustments. The responses indicate that four to five of the assessment condition options were chosen with a high frequency, including:
· 'amount of evidence to be collected', which was selected 87 times;
· 'currency of evidence', which was selected 86 times; and
· 'critical attributes of the workplace/simulated environment', which was selected 84 times.
ISCs were more likely to indicate that a qualification would not specify any reasonable adjustments, with 'no' selected in 74 cases.

Figure 5.12 ASSESSMENT CONDITIONS AND REASONABLE ADJUSTMENTS
Source: ISC questionnaire
Comments and suggestions made when 'other' was selected from the assessment conditions options included:
· an observation that this is addressed in the Assessment Conditions of the Assessment Requirements;
· another option could be a 'realistic combination of attributes'; and
· 'regulatory requirements' might form part of the assessment conditions.

Figure 5.13 shows the ISC response for validation model and validation specifications. The responses indicate that:
· compared to the response to other measures, the options under validation model were selected relatively few times, indicating they would not be applied as often to these qualifications;
· of the validation model options, 'consensus: validators are independent parties' (34 times) and 'statistical: validators are independent body with authoritative power' (36 times) were chosen most frequently; and
· for validation specifications, 'sampling' was selected in 86 cases, while 'validator qualifications/experience' was chosen 69 times.

Figure 5.13 VALIDATION MODEL AND VALIDATION SPECIFICATIONS
Source: ISC questionnaire

Comments and suggestions made when 'other' was selected for validation model and validation specifications included:
· a further validation model option being 'ASQA as the lead authoritative power, with some ISC or industry body input, and regulators where appropriate';
· another option being 'nationally by RTOs, Industry, Regulators and Auditors'; and
· comments reiterating the importance of industry involvement in validation.
5.5 Use of measures

ISCs were asked: 'For how many of your qualifications do you estimate that this measure should be included?' This question referred to all of the qualifications under the responsibility of each ISC, not just those nominated for the questionnaire.

Figure 5.14 summarises the ISC response for the delivery measures. The responses indicate that ISCs envisaged using range of conditions most often and volume of learning least often. Even though a measure may be rarely used, it can still have an important role in selected instances. For example, one ISC indicated that volume of learning is directly related to quality concerns about some of its qualifications.

Figure 5.14 DELIVERY MEASURES
Source: ISC questionnaire

Figure 5.15 summarises the ISC response for the assessment measures. The responses indicate that ISCs envisage using assessment conditions the most often and reasonable adjustments the least often. However, for both the assessment measures shown below and the delivery measures above, it is apparent that each of the measures would be used in some cases and that some would be used frequently.
Figure 5.15 ASSESSMENT MEASURES
Source: ISC questionnaire

5.6 Importance of measures

ISCs were asked to rate the importance of each measure in response to the question: For the qualifications that should include this measure, how important is it?

Figure 5.16 summarises the ISC response for the delivery measures. The responses indicate that ISCs considered range of conditions to be the most important delivery measure. Volume of learning was considered the least important overall, although three ISCs nominated it as extremely important.

Figure 5.16 DELIVERY MEASURES
Source: ISC questionnaire
Figure 5.17 summarises the ISC response for the assessment measures. The responses indicate that specific assessor requirements and validation model were considered the most important overall, while reasonable adjustments was considered the least important. However, even reasonable adjustments was rated as moderately or very important by seven out of eight respondents.

Figure 5.17 ASSESSMENT MEASURES
Source: ISC questionnaire

5.7 Implementation

ISCs were asked how easy or difficult it would be to implement each measure, both for ISCs and for RTOs.

Figure 5.18 summarises the views of ISCs regarding the perceived ease or difficulty of implementation of the delivery measures for ISCs and RTOs. ISCs were more likely to state that range of conditions and mode would be difficult for RTOs to implement. While five ISCs stated that specific trainer requirements and volume of learning would be difficult or very difficult for ISCs to implement, a number of respondents believed these would be easy to implement. Overall, ISCs believed learning resources would be hardest for them to implement, with seven respondents selecting 'difficult' or 'very difficult' for this measure.
Figure 5.18 DELIVERY MEASURES
Source: ISC questionnaire
Figure 5.19 summarises the views of ISCs regarding the perceived ease or difficulty of implementation of the assessment measures for ISCs and RTOs. ISCs clearly felt that validation model and validation specifications would be the most difficult for ISCs to implement, with almost all respondents selecting 'difficult' or 'very difficult' for these measures. ISCs also stated that these would be relatively difficult for RTOs to implement. Language of assessment was generally considered to be one of the easier measures to implement, for both ISCs and RTOs.
Figure 5.19 ASSESSMENT MEASURES
Source: ISC questionnaire
Appendix B details the qualitative feedback provided by ISCs in responding to the questionnaire.
Appendix A Detailed measures

Table A.1 and Table A.2 provide further detail of the recommended measures, including the options for each measure and the rationale for their inclusion.
Table A.1 DELIVERY MEASURES

Specific trainer requirements
Definition: Specific trainer requirements are additional requirements to the national VET Standards for RTOs. This measure is to be applied to the delivery of training in high risk and/or high consequence areas.
Options: Years of vocational experience; Qualifications; Licences; Continuing professional development
Rationale: This measure provides ISCs the opportunity to include additional requirements related to trainers in specific areas, e.g. areas of high risk and consequence. Included in some Industry Training Packages and above that required by the VET standards, e.g. SIT10. Consideration needs to be given to the TEQSA Provider Course Accreditation Standards (to be introduced in 2014), which specify that those responsible for teaching and assessing qualifications in higher education must hold a qualification at least one level above the AQF level of the units within a qualification being assessed.5

Language of delivery
Definition: The language (oral and/or written) to be used in the delivery of training.
Options: English; Other
Rationale: This measure allows ISCs to confirm which units or qualifications must be delivered in English for AQF purposes, which has been a question raised by RTOs in some instances. Programs delivered in languages other than English will have an impact on regulator monitoring. A specification for individual qualifications: WSQ Singapore. Language of delivery is often queried in relation to offshore delivery. Refer to the AQF Qualifications Issuance Policy, which notes that 'this does not preclude the use of languages other than English for the delivery and/or assessment of a program of learning leading to a qualification provided that the level of English language proficiency is appropriate for the intended use of the qualification'.

Learning resources
Definition: Learning resources are texts, videos, software, and any other materials used to support learning.
Options: Mandated; Recommended; Provider selected or developed
Rationale: This measure allows ISCs to mandate any relevant training or learning resources where required for specific industry needs. Industry Training Packages were developed to allow RTOs to develop learning resources; however, Industry Training Packages themselves are sometimes seen as a learning resource rather than a specification of outcome requirements.

Prospective learner information
Definition: Prospective learner information related to the unit of competency, qualification and/or skill set that must be included in marketing material, information guides or a prospectus.
Options: Mandated; Recommended
Rationale: This measure allows ISCs to develop key information that needs to be provided to prospective learners so that learners are aware of industry requirements. Prospective learner information could include: pathways; entry requirements such as needing to have access to a practice environment/workplace; IT skills (i.e. not pre-requisites); LL&N; notional learning time etc.

Range of conditions
Definition: Specifies different environments and conditions that may affect training. Essential operating conditions that may be present (depending on the situation, needs of the learner, accessibility of the item, and local industry and regional contexts) are included. Range is restricted to essential operating conditions and any other variables essential to the learning environment.
Options: Training conditions could include: workplace supervision requirements; supervisor requirements; scope of responsibilities in the workplace/simulated environment; range of contexts (scope); specific equipment; critical attributes of the workplace/simulated environment
Rationale: This measure allows ISCs to provide critical and specific information in relation to the range and conditions of training, as at present there may be ambiguities or a lack of clarity in this area. Provided for in the new Unit of Competency Template.

Learner characteristics
Definition: Learner characteristics are those that contribute to successful completion of the qualification.
Options: Potential learner characteristics can include: access to workplace and appropriate work tasks; past learning and work experiences; qualifications; language, literacy and numeracy (LLN) levels; English language competence (e.g. IELTS, TOEFL); specific needs
Rationale: This measure allows ISCs to outline desired characteristics of entrants for particular programs, particularly where qualifications can only be effectively acquired by people working in specific job roles. Entry specifications for individual qualifications: WSQ Singapore. LLN levels could be determined by using the Australian Core Skills Framework (ACSF). Learner characteristics are not the same as unit or qualification pre-requisites.

Mode
Definition: Delivery mode refers to the medium used to deliver the training/facilitate the learning and may be face-to-face, via technologies, distance resource based or blended.
Options: On the job and off the job; Fully on the job; Fully off the job; Blended; Online learning/distance learning
Rationale: This measure allows ISCs to manage mode of delivery to exclude or include options, particularly where specific modes are inappropriate or skills can only be acquired through particular delivery modes, e.g. work-based training. Different options could be weighted within a unit/qualification or, alternatively, linked directly to the measure of notional learning time (practice and consolidation of knowledge and skills, tutorials etc.).

Volume of learning
Definition: Volume of learning in the VET sector is the range of hours for learners to achieve the learning outcomes of a unit of competency or a qualification, having regard to the characteristics of the learner (see learner characteristics) and the mode of delivery. It includes all learning and assessment activities required for the achievement of the learning outcomes, such as: direct contact (as in training delivered in classrooms and the workplace); practical and structured work; practice and consolidation of knowledge and skills; independent study and reflection; formative and summative assessment; and compliance with relativity measures applied to units such as points weightings (which account for complexity and other factors but do not refer to nominal hours).
Options: Volume of learning is to be a range statement that can sit within the AQF volume guidance.
Rationale: This measure addresses the impact of shortened programs on the quality of delivery, which has been an issue identified at audit and noted in the NSSC VET standards review analysis (2012). RTOs may develop programs that are outside the estimated range; however, they would need to justify the program's notional learning time by the target group/cohort learner characteristics and mode of delivery. Estimates of notional learning at the individual qualification level should be determined with reference to the volume of learning range set out for each qualification type in the AQF. This measure differs from nominal hours and the relationship to teacher directed learning, teaching and assessment. States and Territories currently assign nominal hours for funding purposes but these are thought to relate poorly to required learning time because of their advisory status. Furthermore, there are significant variations between jurisdictions for similar qualifications. A specific issue for consideration might be underpinning literacy and numeracy levels and the extent to which these levels impact notional learning time. 'Practice and consolidation of knowledge and skills' was added to the definition to address the issue of some RTOs developing very short programs for average learners. Volume of learning is used in national qualifications frameworks in various countries to define qualification types (Singapore WSQ, NZQF, Malaysian Qualifications Framework, Qualifications and Credit Framework (QCF) (England, Wales and Northern Ireland), the Scottish Qualifications Framework and the Hong Kong Qualifications Framework). A volume of learning measure (i.e. duration) is used in the Australian Qualifications Framework (AQF). For example, 1 point = 10 hours learning time, or 1 point = 40 hours.

5 TEQSA (2012): Provider Information Request: Consultation Paper May 2012, see http://www.teqsa.gov.au/sites/default/files/2012%20Provider%20Information%20Request%20Consultation%20Paper_Final%20Draft.pdf
Table A.2 ASSESSMENT MEASURES

Specific assessor requirements
Definition: Specific assessor requirements are additional requirements to the national VET Standards for RTOs. This measure is to be applied to the delivery of assessment in high risk and/or high consequence areas.
Options: Years of vocational experience; Qualifications; Licences; Continuing professional development; Team based or panel approach
Rationale: Ensuring appropriate assessor vocational skills and knowledge is critical to the perceived validity of assessment. Included in some Industry Training Packages and above that required by the minimum VET standards, e.g. SIT10. To enable this to occur, sometimes a team based approach to assessment will be appropriate, in which an RTO assessor works with an industry expert, for example. Consideration needs to be given to the TEQSA Provider Course Accreditation Standards (to be introduced in 2014): those responsible for teaching and assessing qualifications in higher education must hold a qualification at least one level above the AQF level of the units within a qualification being assessed.6

Language of assessment
Definition: The language (oral and/or written) to be used in assessment.
Options: English; Other
Rationale: This measure allows ISCs to confirm which units or qualifications must be assessed in English. Specification for individual qualifications: WSQ Singapore. Language of assessment is often queried in relation to offshore delivery. Refer to the AQF Qualifications Issuance Policy, which notes that 'this does not preclude the use of languages other than English for the delivery and/or assessment of a program of learning leading to a qualification provided that the level of English language proficiency is appropriate for the intended use of the qualification'.

Assessment system
Definition: An assessment system is a coordinated set of policies and procedures designed and implemented to increase the likelihood that assessments of numerous candidates, using many different assessors, in varying situations, are consistent, fair, valid and reliable. An assessment system may include grievances and appeals processes, validation systems and processes, moderation, reporting/recording arrangements, acquisition of physical and human resources, administrative procedures, roles and responsibilities, partnership arrangements (where relevant), quality assurance mechanisms, risk management strategies and documented assessment processes.
Options: An assessment system could be internally and/or externally designed, managed, implemented and/or reviewed; this could differ across Training Packages and/or qualifications. Within a Training Package and/or qualification, options include: externally designed, managed, implemented and reviewed; externally designed and reviewed but locally managed and implemented; locally designed, managed and implemented but externally reviewed; or a different combination of the above. 'Externally' could be: a national, cross industry organisation (e.g. government regulators); a national, industry specific organisation/body/association (e.g. ISC); a regional/state based but industry specific organisation/body/association; industry specialists and RTO assessors working together; or a cluster/network of local RTOs. 'Locally' could be: an individual RTO.
Rationale: Currently the assessment system is locally managed at RTO level with little or no industry input. Given the issue with the quality of assessment in the sector, there may need to be intervention by an external agency/body in the assessment of high risk units/qualifications. ISCs could manage an assessment system for critical units or qualifications (e.g. EE-Oz), particularly where industry expertise and involvement is required. In all assessment systems, all aspects of competence including underpinning knowledge must be assessed. Assessment system is categorised as an input measure because an assessment system is a precondition for high quality assessment.

Assessment methods
Definition: Assessment methods are the particular techniques used to gather evidence. (TAE10)
Options: Specific requirements in regard to the type of assessment methods, e.g. observation; interview; written (portfolio etc.); product; third party. Requirements may vary according to assessment purpose (e.g. RPL), qualification type and qualification level.
Rationale: Assessment methods may be inappropriately applied to particular units of qualifications; for example, a frequent criticism is that underpinning knowledge may not be appropriately assessed where observation is predominantly used to assess competence. Some units of competency specify assessment methods to be used. Licensing bodies may have specific requirements related to summative assessment methods.

Assessment tools
Definition: Assessment tools include the following components: context and conditions of assessment, tasks to be administered to the student, an outline of the evidence to be gathered from the candidate, and evidence criteria used to judge the quality of performance (i.e. the assessment decision-making rules). The term also takes in the administration, recording and reporting requirements, and may address a cluster of competencies as applicable for holistic assessment.
Options: Development options: central versus local/provider level. Implementation options: centrally versus locally administered. Combination (e.g. knowledge and understanding could be developed and implemented nationally using a secure common assessment test, but the underpinning skills and application of the skills could be developed and implemented at the provider level).
Rationale: The use of appropriate assessment tools has been identified in audit as critical to compliance with the assessment standard. If common assessment tools designed centrally are mandated, ISCs or others may have to consider security issues as this will impact on ensuring authenticity of the evidence. Not mentioned in the VET standards, although defined in the Guide and included in the ASQA General Direction on Student Records. Defined in A Code of Professional Practice for Validation and Moderation, NQC 2009-a.

Assessment conditions
Definition: Stipulates any mandatory conditions for assessment. Specifies the conditions under which evidence for assessment must be gathered, including any details of equipment and materials, contingencies, specifications, physical conditions, relationships with team members and supervisor, relationship with client/customer, and timeframe. Specifies assessor requirements, including any details related to qualifications, experience and industry currency.
Options: Assessment conditions that could be stipulated or recommended could include: location (e.g. on versus off the job); time restrictions; number of attempts; currency of evidence; amount of supervision required to perform the task; amount of assistance/prompting permitted by the assessor; specific equipment and/or materials; critical attributes of the workplace/simulated environment; amount of evidence to be collected, for example, the number of times specific activities need to be observed.
Rationale: Greater specification of assessment range and conditions could strengthen assessment outcomes. Contained in the new Assessment Requirements Template. Some Industry Training Packages (for example TAE04, SIT07) include this requirement.

Reasonable adjustments
Definition: Reasonable adjustments are those made to the way in which evidence of candidate performance is gathered, while ensuring that the criteria for making competent/not yet competent decisions (and/or awarding grades) are not altered in any way.
Options: If assessment methods are mandated within a Training Package, ISCs are to determine reasonable adjustments to the methods for large sub-cohorts of the population in which it is anticipated that their background characteristics could potentially impact on their opportunity to demonstrate competence against the unit. Background characteristics are different to learner characteristics, as the former refer to factors thought to hinder the opportunity to demonstrate competence, whilst learner characteristics are those thought to be positively related to success.
Rationale: Strengthening possible reasonable adjustments could strengthen assessment processes and practices across widely different learner cohorts. Identified in NQC 2009-b as part of assessment tool ideal characteristics.

Validation model
Definition: Validation is a quality review process. It involves checking that the assessment tool produced valid, reliable, sufficient, current and authentic evidence to enable reasonable judgements to be made as to whether the requirements of the relevant aspects of the Training Package or accredited course had been met. It includes reviewing and making recommendations for future improvements to the assessment tool, process and outcomes.
Options: Options for designing and implementing different approaches to validation: assessor partnership; consensus: internal parties; consensus: internal with external parties. Options for designing and implementing different approaches to independent validation: consensus: validators are independent parties (e.g. RTOs, enterprises, professional associations); external: validators are independent RTO(s); external: validators are independent enterprise(s); external: validators are an independent body with authoritative power (e.g. Industry Skills Council, licensing body, ASQA); statistical: validators are an independent body with authoritative power (e.g. ISCA, employer associations, licensing bodies, ASQA, educational test development and measurement organisations); statistical: validators are an independent body with authoritative power (e.g. ISC, employer associations, licensing bodies, ASQA).7
Rationale: Validation is seen as critical to the quality of assessment. In 2012, COAG signed up to a set of reforms to the national training system, agreeing to a revised National Agreement for Skills and Workforce Development and a new National Partnership Agreement on Skills Reform. One key item in the agreement is for validation models to be piloted by state/territory funding bodies. Outcomes of these pilots should inform the development of validation models. 'Independent' was the term used for this pilot, as external validation, as noted in NQC documents, is one form of independent validation. External validation was proposed by Skills Australia and in the Review of Post-Secondary Education and Training in Queensland undertaken by ACG. ISCs could manage the type of validation model to be utilised in terms of industry involvement.

Validation specifications
Definition: Validation specifications are specific requirements for implementing the model such as timing, sampling framework, validator qualification/experience, and focus (e.g. tool and/or judgements).
Options: Specific requirements for validation, e.g. timing; sampling (e.g. of qualifications, units, assessors, learner evidence); validator qualifications/experience; review of tool and/or judgements.
Rationale: Identified in NQC 2009-a as part of validation implementation. Providing further advice could strengthen quality assurance arrangements.

6 TEQSA (2012): Provider Information Request: Consultation Paper May 2012, see http://www.teqsa.gov.au/sites/default/files/2012%20Provider%20Information%20Request%20Consultation%20Paper_Final%20Draft.pdf
7 Bateman & Gillis 2012
Appendix B Detailed ISC feedback
The purpose of this appendix is to provide a sample of the qualitative feedback provided by ISCs in response to those parts of the ISC questionnaire regarding application and implementation of the measures. Table B.1 is a sample of comments made regarding the possible use of input quality measures for specific qualifications.
Table B.1 DELIVERY MEASURES: ADVICE ON APPLICATION OF MEASURES

Manufacturing Skills Australia: MEM20105 Certificate II in Engineering
This qualification was designed for existing workers in an engineering production environment. It has been widely but incorrectly used as a labour market program, pre-apprenticeship program and for VET in schools. Input measures would mainly address the skills of trainers as well as the context of learning.

SkillsDMC: RII30709 Certificate III Mine Emergency Response and Rescue
Assessment must usually be undertaken in the workplace; this is not applicable in some of the units of this qualification. Where suitable application of a simulation is used as per the Industry Training Package requirements, the assessment guidelines will determine the limitation on the evidence that can be used to assess competence.

E-Oz Energy Skills Australia: UEE30811 Certificate III in Electrotechnology Electrician
This qualification is the base qualification for the carrying out of licensed electrical work and the issue of an unrestricted electrical licence in all Australian jurisdictions. Related qualifications all require an electrical licence. At the para-professional level, two Advanced Diploma qualifications have been approved (subject to appropriate inputs) by Engineers Australia as meeting the requirements of the Dublin Accord for international recognition as an engineering associate: UEE62211 Advanced Diploma of Electrical - Engineering and UEE62311 Advanced Diploma of Electrical Engineering - Coal Mining. Industry has established an RTO input Resource Assessment schema and tools and is trialling nationally agreed assessment tools and an integrated competency based apprentice progression model which has specific input requirements.

Transport & Logistics Industry Skills Council Ltd: TLI31210 Certificate III in Driving Operations
TLISC has used alignment of units of competency to regulatory requirements to promote compliance in the industry.

Service Skills Australia: SIT30807 Certificate III in Hospitality (Commercial Cookery)
There is regular feedback provided regarding concerns about students' 'work readiness', dependent on the RTO with which they have trained. Feedback indicates that assessment of units is not integrated and that there need to be more guidelines in place to remove ambiguity about what the Training Package requires.

Government Skills Australia: PSP52412 Diploma of Interpreting
These qualifications are in an area new to VET and language proficiency is a significant issue for the industry. The qualifications do not formally assess language proficiency but rather the application of translating and interpreting skills, but that application must be underpinned by language proficiency in both English and the language being used for translation/interpretation.

Source: ISC questionnaire
Table B.2 is a sample of comments made regarding the possible use of assessment quality measures for specific qualifications.
Table B.2 ASSESSMENT MEASURES: ADVICE ON IMPLEMENTATION OF MEASURES
ISC: Manufacturing Skills Australia
Qualification/Unit/Skill Set: MSS40312 Certificate IV in Competitive Systems and Practices
Comment: Assessment here would be expected to be carried out using actual workplace projects and tasks as well as some more 'institutional' types of testing, such as pen and paper as well as online.

ISC: SkillsDMC
Qualification/Unit/Skill Set: Certificate III in Civil Construction Plant Operations
Comment: Assessment must be undertaken in the workplace. Where suitable application of a simulation is used as per the Industry Training Package requirements, the assessment guidelines will determine the limitations on the evidence that can be used to assess competence.

ISC: E-Oz Energy Skills Australia
Qualification/Unit/Skill Set: UEE30811 Certificate III in Electrotechnology Electrician
Comment: Established assessment practices include mandated assessment of prerequisites, use of mandated assessment schemas including workplace evidence gathering, and compliance with and approval by regulators of assessment practices. Current trialling of centrally developed assessments and competency-based progression will impact on ongoing assessment practices.

ISC: Transport & Logistics Industry Skills Council Ltd
Qualification/Unit/Skill Set: TLI21610 Certificate II in Warehousing Operations
Comment: The breadth of this qualification requires a flexible approach to defining measures in the context of industry sectors, what is required in workplaces, and group characteristics. TLISC does not support a mandatory 'all or nothing' approach. Potentially all measures could be relevant, but not everywhere, every time.

ISC: Service Skills Australia
Qualification/Unit/Skill Set: SIB70110 Vocational Graduate Certificate in Intense Pulsed Light and Laser Hair Reduction
Comment: It is important to consider the length of 'practice' time required to meet competency.
Source: ISC questionnaire
Table B.3 is a sample of ISC feedback regarding the cost of implementing delivery measures.
Table B.3 FEEDBACK REGARDING IMPLEMENTATION AND COST: DELIVERY MEASURES
Specific trainer requirements
· The cost of implementation of this measure for an ISC would be high. (CSHISC)
· The costs associated with meeting the needs of individual state regulator requirements could make this significantly difficult and expensive. (SkillsDMC)
· Whilst not yet required to be explicit in units, RTOs are required to be cognisant of these to meet jurisdictional requirements. (EE-Oz)
· The level of cost to RTOs would depend upon the extent of the change/requirements. (SSA)
· Gaining consensus on the specific "higher teacher requirements" would need a significant amount of consultation. (AgriFood)
· It is possible that this could represent an implementation issue for RTOs. (CPSISC)

Language of delivery
· The requirement for delivery in English relates to meeting regulatory requirements and the ability to apply and maintain relevant Australian Standards and WHS requirements. These are established requirements, which are explicit in units via LL&N requirements and are currently implemented. (EE-Oz)
· Mandated licensing requirements specify English, and with the increased VET focus on literacy in the workplace it could be contradictory to have any other language as an established 'norm'. Circumstances where alternative languages may be justified could be outlined, where applicable. (TLISC)

Learning resources
· Very difficult unless ISCs were funded to do this. (CSHISC)
· All compliant RTOs already have resources, so there should be no additional costs. (CPSISC)
· Under the AQTF/NVR, RTOs are required to demonstrate adequate resources, including learning resources, to ensure effective delivery. Making this requirement explicit will assist in clarifying any confusion and assist auditors. (EE-Oz)
· TLISC sees learning resources as part of the core business of an RTO. ISCs should have a limited role in this regard, otherwise the responsibilities of RTOs become blurred and this could be counterproductive in trying to improve the VET quality framework. (TLISC)
· The cost of the development, production and maintenance of learning resources would be significant. (AgriFood)

Prospective learner information
· There is little direct cost but it may curb enrolments. (MSA)
· Information in the Training Package, including LL&N specifications, prerequisites and workplace evidence requirements, should allow RTOs to interpret and apply learner information. (EE-Oz)
· Prospective learner information exists now and is an important part of planning, self-screening and clarifying expectations. (TLISC)
· The only costs are likely to be for RTOs to initially set up new information on existing websites/brochures/prospectuses. (AgriFood)

Range of training conditions
· Access to different work environments by RTOs will increase the costs passed on to enterprises. (SkillsDMC)
· The information will be available relatively easily, but getting stakeholder agreement on some of the parameters (e.g. volume) could prove time consuming and costly for the ISC. For the RTOs, the unit could represent a more demanding product to deliver and assess. (CPSISC)
· Range statements exist now and, although they need review in the context of the streamlined template, they will continue to serve a purpose. (TLISC)

Learner characteristics
· To address learner characteristics, units will need to be reviewed to identify unit-specific demands and determine the specific learner characteristics that would best position the candidate to successfully complete the qualification. (CPSISC)
· Information in the Training Package, including LL&N specifications, prerequisites and workplace evidence requirements, should allow RTOs to interpret and apply learner information. Costs for this should be known and accounted for. (EE-Oz)
· This specification will immediately challenge some RTOs where they have either ignored or were ignorant of the industry expectations of learners and graduates. It would rapidly weed out deliberate bad practice and start to reinstate industry/enterprise confidence in RTO outcomes. (MSA)

Mode
· Evidence required to be captured as per the Industry Training Package assessment guidelines could be costly due to varied enterprise processes and requirements. (SkillsDMC)
· Established practices/accepted norms exist now, so implementation should be easy. (TLISC)
Volume of learning
· This is a low-cost item yet the impact could be extensive in terms of learner flow. That is, whereas a learner might now be ticked off as 'competent' after, say, a 30 hour off-job learning period, we would expect that this would not occur until some period of consolidation and then evidence production has occurred. (MSA)
· EE-Oz has implemented a weighting points system to account for learner effort. This system is in place and RTO costs should be understood and accounted for. (EE-Oz)
· For an ISC, if this was required, it could be quite costly and lengthen the development time as you navigate between the stakeholder consultations (including RTOs) and the STAs, who may have differing views regarding what was proposed. An alternative would be having a panel formed by the NSSC, similar to the QA process, where you engage a member to determine the hours; this would then be agreed to and accepted by all parties concerned (as they would be considered an expert in this area). (IBSA)
Table B.4 is a sample of ISC feedback regarding the cost of implementing assessment measures.
Table B.4 FEEDBACK REGARDING IMPLEMENTATION AND COST: ASSESSMENT MEASURES
Specific assessor requirements
· Some RTOs would not be able to easily apply the measures we seek. The majority, though, would be able to implement them fairly easily. (MSA)
· Specific assessor requirements are set out by the regulators and are not defined in units of competency. (TLISC)

Assessment system
· Resourcing for ISCs and RTOs to monitor and manage assessment systems could be costly. Constant access to industry is not easy, and alternatives need to be available with plenty of lead time. (SkillsDMC)
· With the new template for assessment requirements being adopted, it will be easier for the RTO to implement and produce consistent outcomes. However, more explicit assessment requirements may result in higher costs for compliance. (EE-Oz)
· Moderation is something ISCs can do, but the assessment system is outside the scope of the ISC. Moderation involves a range of other bodies. (SSA)
· Depends on the nature of the assessment system and to what extent ISCs are involved: if they are involved in development and validation then another layer of work will be required, which could be resource intensive. (IBSA)

Assessment methods
· Some RTOs are still relying only on institutional performance and are not embracing evidence gathered from the workplace. (MSA)
· The difficulty for RTOs would be if they had to use a broader or different range of assessment methods than they currently do. (CPSISC)
· This reflects AQTF/NVR requirements which RTOs must implement. Costs should be known. Industry has identified preferred methods and developed resources to support these. (EE-Oz)
· Mandated assessment instruments exist for high risk licensing units. The high risk licensing process is a summative assessment method. In general, the evidence guides in units of competency include assessment methods. (TLISC)
· Specification of the assessment methods would depend upon the nature of the unit/qualification and level of risk, e.g. in entertainment staging units, OHS units and TAE. It is difficult to determine implementation cost for the RTOs because it would be on a unit by unit basis. (IBSA)
· The measure is difficult, but desirable, to implement. (ForestWorks)

Assessment tools
· As with the assessment system, good assessment tools are important. However, they are the responsibility of the RTO. (MSA)
· The ease for RTOs would depend on their current capacity to meet audit requirements. (CPSISC)
· Mandated for high risk licensing units. Resource guides have been established for other units. (TLISC)

Reasonable adjustments
· This would be supported by the trainer and assessor requirements part. That is, a trainer or assessor who is both qualified and technically current will be easily able to apply reasonable adjustment measures where required and for the appropriate units. (MSA)
· Regulatory frameworks give limited scope to accommodate a wide range of adjustments. High cost to RTOs to implement. (EE-Oz)
· The difficulty (time, cost) would be found in gaining consensus on the allowable modifications across the range of units, particularly given the diversity of stakeholders and jurisdictions. (ForestWorks)

Validation model
· MSA proposes that there be two levels of validation. One would be RTO local and the other a more formal intervention model. The first should be standard practice. The second will have significant resourcing implications for the RTO as well as the body/bodies undertaking the validation work. (MSA)
· Any form of validation process will have resource implications. (IBSA)
· The introduction of a new system for validation will be costly (but worthwhile) in most cases. (AgriFood)

Validation specifications
· Any validation model must be supported by a sound policy framework, and the specification component is the key component. (MSA)
Source: ISC questionnaire
References
Australian National Training Authority (2003). High level review of training packages: current realities of training packages: summary of key themes emerging from phase two, http://www.voced.edu.au/content/ngv18589

Bateman, A. and Gillis, S. (September 2012). Independent Validation: Workshop Background Paper, DIIRSTE, Canberra.

Bateman, A., Dunn, F. and Vickers, A. (2010). Use of Volume of Learning in Qualification Design, VRQA, Victoria.

Bateman, A. and Dyson, C. (2011). Eligibility criteria: For RTOs to provide publicly funded training, Skills Australia, Canberra.

Baethge, Achtenhagen, Arends, Babic, Volker and Weber (2006). PISA-VET: A Feasibility Study, Education Science: Franz Steiner Verlag.

Gallagher, M. (2010). The Accountability for Quality Agenda in Higher Education, Group of 8, Canberra.

Gillis, S., Bateman, A. and Clayton, B. (2009d). Research Report: Validation and Moderation in VET. Report submitted to the National Quality Council.

Gillis, S. and Griffin, P. (2008). 'Competency Assessment', in J. Athanasou (ed.), Adult Education and Training, David Barlow Publishing, Sydney, pp. 233-256.

Group of 8 (2011). Group of 8 Update, July 2011.

Hoeckel, K., Kim, M., Field, S. and Justesen, T. R. (2008). OECD Reviews of Vocational Education and Training: A Learning for Jobs Review of Australia, OECD.

Innovation & Business Skills Australia Ltd (2008). TAA04 Training and Assessment Training Package, Hawthorn.

Macquarie Dictionary (1998). Third Edition, Sydney.

NQC (2009a). Code of Professional Practice: Validation and Moderation in VET, National Quality Council, Melbourne.

NQC (2009b). Implementation Guide: Validation and Moderation in VET, National Quality Council, Melbourne.

NQC (2009c). Guide to Develop Assessment Tools, National Quality Council, Melbourne.

NQC (2009d). VET Products for the 21st Century. Final Report of the Joint Steering Committee of the NQC and the COAG Skills and Workforce Development Subgroup, June 2009, Melbourne.

NQC (2010a). Research Report: Validation and Moderation in Diverse Settings, National Quality Council, Melbourne.

NQC (2010b). Design Model for Streamlined Training Package Material, National Quality Council, Melbourne.

NQC (2010c). Assessor Guide: Validation and Moderation, National Quality Council, Melbourne.

NQC (2011). Final Report: VETiS - strengthening delivery and assessment, National Quality Council, Melbourne.

National Skills Standards Council (August 2012). Review of the standards for the regulation of vocational education and training: Analysis of submissions, Ithaca Group, NSSC, Melbourne.

National Skills Standards Council website: http://www.nssc.natese.gov.au/standards_review, accessed 15 February 2013.

National Skills Standards Council, http://www.nssc.natese.gov.au/__data/assets/pdf_file/0009/72765/NSSC-SB-03__Standards_for_Training_Packages.pdf, accessed 8 March 2013.

NTB (1992). National Competency Standards: Policy and Guidelines, 2nd Edition, ACT, Canberra.

NZQA (2010). New Zealand Diploma in Business National External Moderation: Information and advice for Tertiary Education Organisations (TEOs), http://www.nzqa.govt.nz/assets/Providers-and-partners/Assessment-and-moderation/NZDipBus/nzdipbus-mod-info.pdf

Precision Consultancy (2008). Investigation into industry expectations of Vocational Education and Training Assessment: Final Report, National Quality Council, Melbourne.

Service Skills SA (2010). Services Industries VET in Schools Project, Service Skills Australia.

Skills Australia (2010). Creating a future direction for Australian vocational education and training, Canberra.

Skills Australia (2011). Skills for prosperity - a roadmap for vocational education and training, Canberra.