Workshop 2

Specific and Generic Performance Indicators for Measuring Learning Outcomes

Date: 18:00-19:30, 8 December, JST

Organizer

Wajid Hussain, Director, Office of Quality and Accreditation, Faculty of Engineering, Islamic University

Abstract

Most engineering programs develop learning outcomes, performance indicators, and rubrics primarily to satisfy accreditation requirements. As a result, only a relatively small set of learning outcomes with generic, broadly applicable performance indicators and rubrics is usually developed at the program or course level. Yet engineering specializations typically teach students several hundred specific activities over the course of curriculum delivery. Generic performance indicators do not adequately support holistic curriculum delivery targeting coverage of all three of Bloom’s learning domains and their learning levels. Their generic rubrics likewise struggle to characterize these specific student learning activities and their skill levels: they cannot be applied accurately to the assessment and scoring of specific student learning activities, and they create problems for inter- and intra-rater reliability. In this workshop, participants will learn the essential principles of an authentic Outcome-Based Education (OBE) model for developing learning outcomes, generic and specific performance indicators, and their rubrics for measuring specific skills across Bloom’s three learning domains and their learning levels. They will also be exposed to key perspectives on digitizing the assessment process. In an interactive format, participants will learn core outcome-based frameworks, apply them to develop sample outcome assessments, and gain hands-on experience using performance indicators and hybrid rubrics to measure course learning outcomes. This will help them develop valid and reliable assessments that are aligned with teaching and learning activities.

Keywords: learning outcomes, performance indicators, ABET, hybrid rubrics, assessment, course evaluations, FCAR

Aims and Target Audience of the Workshop

Quality in teaching and learning rests on a proper understanding of OBE frameworks. Holistic learning happens when learning models incorporate all three domains of learning (affective, cognitive, and psychomotor) and their learning levels. A balanced learning distribution is achieved when curriculum delivery incorporates holistic learning models [1].
Accurate alignment of learning outcomes to teaching/learning strategies, evaluation and feedback is crucial for valid and reliable assessment. Hundreds of specific engineering activities in any undergraduate or graduate specialization cannot be accurately represented by generic outcomes and Performance Indicators (PIs) [1].
The aim of this preconference workshop is to highlight the benefits of authentic OBE frameworks in developing holistic educational models and their process of delivery. Both engineering educators and evaluators can take away essential knowledge and practical experience from the lecture/video presentations, discussions, and hands-on sample course work.

Novelty and Timeliness of the Workshop

Remote and virtual accreditation audits have been announced by ABET and other quality assurance agencies around the world for upcoming accreditation cycles. By incorporating appropriate learning models into the education process from the outset, programs can avoid additional work when preparing for digital accreditation. The workshop presents a cutting-edge methodology that can be integrated with state-of-the-art digital technology to help both engineering educators and accreditors navigate the current digital age in global engineering education.

Workshop Description

Precision in assessment is crucial for accurate evaluation, feedback, and improvement of teaching and learning. However, engineering programs across the globe face issues with faculty compliance in quality processes related to assessment and evaluation, since inaccurate frameworks and assessment methods obscure genuine faculty teaching efforts. Simple mistakes in the language of outcome statements or the descriptors of rubrics can render the entire assessment and evaluation activity meaningless [2].
Therefore, the objective of this workshop is to give participants first-hand knowledge of authentic OBE frameworks and assessment methodology, along with the pitfalls and common mistakes of programs and instructors. This is followed by samples of course work and practical sessions that expose educators to the rigor and standards required for developing outcomes, PIs, and their hybrid rubrics.
The participants of this workshop will be able to achieve the following outcomes:

  1. Select appropriate learning models to establish educational systems based on authentic OBE frameworks.
  2. Align curriculum and course delivery with Washington Accord and ABET engineering graduate attributes and profiles.
  3. Identify core issues with prevalent assessment methods in order to make informed decisions for improving them.
  4. Incorporate authentic OBE theory for developing effective learning outcome statements and their PIs.
  5. Develop sample hybrid rubrics and apply them to assessment.
  6. Achieve Scientific Constructive Alignment of learning outcomes, teaching, assessment, evaluation and feedback.

Workshop Agenda

A. Part I – OBE Frameworks, Learning Model and Graduate Attributes
1) OBE Frameworks and Learning Model – 15 minutes
The presenter will review the paradigm, principles, and purpose of authentic OBE theory and Bloom’s concepts for Mastery Learning, and use the Learning Domains Wheel [2] for the selection of learning domain categories and their learning levels for the classification of learning outcomes. The Ideal Learning Distribution for courses and a 3-Levels Skills Grouping Methodology [2] will also be presented. (Presentation with Interactive Discussions)
2) Incorporating Graduate Attributes – 10 minutes
Washington Accord engineering graduate attributes, knowledge, problem solving and competency profiles will be reviewed to systematically incorporate relevant knowledge and skills into various phases of the curriculum and achieve holistic course delivery and Mastery Learning by targeting specific student learning activities. (Presentation with Interactive Discussions)
B. Part II – Course Outcomes, Specific/Generic PIs and Hybrid Rubrics
1) Guidelines for Developing Course Outcomes and Specific/Generic PIs – 15 minutes
Combining Spady’s (1994, 2020) [3,4] fundamental guidelines on the language of outcomes, key concepts from Adelman’s (2015) [5] work on verbs and nominal content, and essential details on the hierarchical structure of outcomes from Mager’s (1962) [6] work yields a consistent standard for learning outcome statements that are accurately aligned to course delivery through a structured format for Course Outcomes (COs) and their specific/generic PIs. This session will review these standards using samples from engineering courses whose COs and PIs exhibit an Ideal Learning Distribution. (Presentation with Interactive Discussions and Review of Practical Examples)
2) Hybrid Rubrics and Assessments – 10 minutes
The presenter will explain a novel hybrid rubric and elaborate on its benefits by application to assessments using samples showing actual scoring of student course work. (Presentation with Interactive Discussions and Review of Practical Examples)
C. Part III – COs, PIs and Hybrid Rubrics Development Exercise
1) Developing Course Outcomes – 15 minutes
The presenter will use a practice workbook to guide participants in developing COs for a sample course, targeting graduate attributes, course topics, and specific learning activities corresponding to Bloom’s three domains and their learning levels. (Guided Hands-on Activity)
2) Developing Specific/Generic PIs – 10 minutes
The presenter will use a practice workbook to guide participants in developing specific/generic PIs for a selected CO, providing adequate detail of the theory or concepts applied, the methods/techniques used, and the final product/outcome of the target learning activity, corresponding to a specific learning domain and level. (Guided Hands-on Activity)
3) Developing Hybrid Rubrics – 10 minutes
The presenter will use a practice workbook to guide participants in developing hybrid rubrics for a selected PI, defining the scales, descriptors, and scores for each step of the learning activity to be assessed. (Guided Hands-on Activity)
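To make the structure concrete, a hybrid rubric of the kind described above (per-step scales, descriptors, and scores for one PI) could be modeled in code roughly as follows. This is an illustrative sketch only; the step names, descriptors, and point values are hypothetical assumptions, not the workshop's actual rubric format.

```python
# Illustrative sketch (not the workshop's materials): a hybrid rubric
# assigns each step of a learning activity its own scale of
# performance descriptors mapped to scores.
from dataclasses import dataclass


@dataclass
class RubricStep:
    name: str                    # step of the learning activity
    descriptors: dict            # performance descriptor -> score


# Hypothetical rubric for one specific PI; steps mirror the
# theory -> method/technique -> final product structure.
rubric = [
    RubricStep("State governing theory", {"complete": 3, "partial": 2, "missing": 0}),
    RubricStep("Apply method/technique", {"correct": 4, "minor errors": 2, "incorrect": 0}),
    RubricStep("Final product/outcome", {"accurate": 3, "inaccurate": 1}),
]


def score(ratings):
    """Return (earned, maximum) points for one student's rated work."""
    earned = sum(step.descriptors[ratings[step.name]] for step in rubric)
    maximum = sum(max(step.descriptors.values()) for step in rubric)
    return earned, maximum


earned, maximum = score({
    "State governing theory": "complete",
    "Apply method/technique": "minor errors",
    "Final product/outcome": "accurate",
})
print(f"{earned}/{maximum}")  # 8/10
```

Because each step carries its own descriptors and weights, two raters scoring the same work are pushed toward the same step-by-step judgments, which is the inter-rater reliability benefit the workshop attributes to specific PIs and their rubrics.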
D. Part IV – Video Presentation and Group Discussion Tables
1) Video Presentation of a Digital Quality System – 5 minutes
Video presentation of an Integrated Quality Management System implemented at the Faculty of Engineering, IU using specific/generic PIs and Faculty Course Assessment Report (FCAR) Methodology. (Video Presentation)
2) Group Discussion – 10 minutes
Discuss the main aims of the workshop and compare how participants rate their understanding of the significance of authentic OBE frameworks, development of learning outcomes, and application to teaching, assessment, evaluation and feedback, before and after the workshop. The participants will also discuss how this workshop changes their approach to teaching in the future. (Group Discussion)
E. Part V – Anticipated Participant Interaction
All sessions of this workshop involve interactive discussions. Participants should bring their own computers for the hands-on COs, PIs, and hybrid rubrics development exercise. It would also be helpful for participants to bring any course syllabi, lecture outlines, or assessments so that the practical experience gained aligns with their actual course work.

References

[1] Hussain, W., & Spady, W. (2017). ‘Specific, Generic Performance Indicators and Their Rubrics for the Comprehensive Measurement of ABET Student Outcomes,’ ASEE 124th Annual Conference and Exposition, June 25–28, Columbus, OH.
[2] Hussain, W., Mak, F., & Addas, M. F. (2016). ‘Engineering Program Evaluations Based on Automated Measurement of Performance Indicators Data Classified into Cognitive, Affective, and Psychomotor Learning Domains of the Revised Bloom’s Taxonomy,’ ASEE 123rd Annual Conference and Exposition, June 26–29, New Orleans, LA. https://peer.asee.org/engineering-program-evaluations-based-on-automated-measurement-of-performance-indicators-data-classified-into-cognitive-affective-and-psychomotor-learning-domains-of-the-revised-bloom-s-taxonomy
[3] Spady, W. (2020). Outcome-Based Education’s Empowering Essence. Mason Works Press, Boulder, Colorado.
http://williamspady.com/index.php/products/
[4] Spady, W. (1994). Outcome-Based Education: Critical Issues and Answers. Arlington, VA: American Association of School Administrators.
[5] Adelman, C. (2015). To imagine a verb: The language and syntax of learning outcomes statements. National Institute of Learning Outcomes Assessment (NILOA).
http://learningoutcomesassessment.org/documents/Occasional_Paper_24.pdf
[6] Mager, R. F. (1984). Preparing Instructional Objectives: A Critical Tool in the Development of Effective Instruction (2nd ed.). Belmont, CA: Lake Publishing.
[7] Hussain, W. (2017) Specific performance indicators. https://www.youtube.com/watch?v=T9aKfJcJkNk

Speaker Biography

Wajid Hussain is a renowned world expert on authentic OBE, QA processes, outcomes assessment, and program evaluation for accreditation using digital technology and software. Wajid has extensive experience supporting and managing outcomes assessment and CQI processes to fulfill regional and ABET accreditation requirements. He joined the academic field from an intensive engineering background in Silicon Valley, with more than 20 years of mass-production expertise in the billion-dollar microprocessor manufacturing industry. Over the last two decades, Wajid has managed scores of projects related to streamlining operations using state-of-the-art technology and digital systems, with significant experience working with ISO standard quality systems. He received the LSI Corporation Worldwide Operations Review 1999 award for his distinguished contributions to quality improvement systems. He was the lead product engineer supporting the PortalPlayer processor for Apple’s iPod, among many other world-famous products at LSI Corporation. He led the first ‘tuning’ efforts in the Middle East by developing a complex database of thousands of outcomes and hundreds of rubrics for the engineering disciplines at the Islamic University. He developed and implemented state-of-the-art Digital Integrated Quality Management Systems for the engineering programs. His research interests include CQI using digital technology, quality and accreditation, outcomes assessment, education and research methods, and VLSI manufacture. Wajid is currently a reviewer for several international conferences and has been an invited keynote speaker or presenter at more than 40 international OBE and education conferences.
Some notable events where he presented are the ICA 2015, MTN 2016, OBE ICON 2016, FIE 2016, ASEE 2016, ASEE 2017, ICTIEE 2017, ABET Symposium 2017, IICEDubai 2018, QS ASIA 2018, ASEE 2018, EDUTECH 2018, APAC STEM 2018, ICEE 2018, QS ASIA 2019, EDUTECH 2019, QS ASIA 2020, EDUTECH 2020. Wajid is also part of the organizing committee for the OBE ICON 2021. Wajid is a senior member of the IEEE, IEEE Education Society, board member of the IN4OBE, member of the AALHE and ASEE.