Student Learning Outcome Assessment: A Program’s Perspective
Ling Hwey Jeng, Director, School of Library and Information Studies
Ljeng@twu.edu
June 27, 2011

Operational Assumptions
• Every program is unique, but it is still possible to have a common process for quality assessment.
• Data used to inform decisions must be open, consistent, and continuous.
• Evidence of student academic attainment must be explicit, clear, and understandable.
• No assessment can be good without direct measures of student learning outcomes.

Accreditation
• Paradigm shift – from “what faculty teach” to “what students learn”
• Focus shift – from structure (“input, process, output”) to outcomes

The Anatomy of Accreditation
• Input (e.g., enrollment, faculty recruitment, facilities)
• Process (e.g., curriculum, services, advising)
• Output (e.g., grades, graduation, placement)
• Outcome – goal-oriented planning, support, and teaching, as shown in evidence of learning

Example: Structure and Outcomes
• Students – Input: backgrounds, enrollment; Process: student services and programs; Outputs: grades, graduation, placement; Outcomes: skills gained, attitude changes
• Faculty – Input: backgrounds, recruitment; Process: teaching assignments, class size; Outputs: work units, publications, conference presentations; Outcomes: citations, impacts
• Program – Input: history, budget, resources allocated; Process: policies, procedures, governance; Outputs: participation rates, resource utilization; Outcomes: program objectives achieved

Example: Output vs. Outcome
• Course grades → Skills learned
• Faculty publications → Citation impacts
• Enrollment growth → Objectives achieved

Two Levels of Assessment
• Program-level assessment
• Course-level assessment

Implicit in the COA Standards is the expectation that assessment is:
• Explicit and in writing
• Integrated in the program’s planning process
• Done at both program level and course level
• Accessible to those affected by the assessment
• Implemented and used as feedback by the program

The Guiding Principles of Assessment
• Backward planning – start with where we want to end
• Triangulation – multiple measures, both direct and indirect
• Gap analysis – inventory of what has been done

Steps for Assessment
• Identify standards, sources of evidence, and constituent inputs
• Define student learning objectives
• Develop outcome measures – direct and indirect measures
• Collect and analyze data – methods, frequency, patterns
• Review and use data as feedback – impacts on decisions made
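One way to keep the triangulation step honest is a small check that every stated objective carries both a direct and an indirect measure. This is only a sketch: the objective names, measure labels, and the `triangulation_gaps` helper are hypothetical, not part of any standard.

```python
# Hypothetical program objectives mapped to their outcome measures.
measures = {
    "organize information resources": {
        "direct": ["cataloging project rubric"],
        "indirect": ["employer satisfaction survey"],
    },
    "conduct a reference interview": {
        "direct": ["role-play performance assessment"],
        "indirect": [],  # gap: no indirect measure yet
    },
}

def triangulation_gaps(measures):
    """Return objectives missing either a direct or an indirect measure."""
    return [
        objective
        for objective, kinds in measures.items()
        if not kinds["direct"] or not kinds["indirect"]
    ]

print(triangulation_gaps(measures))  # ['conduct a reference interview']
```

Running a check like this each assessment cycle turns the gap-analysis principle into a routine report rather than a one-time inventory.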

Faculty Expectations as the Basis
The really important things faculty think students should know, believe, or be able to do when they receive their degrees.
Approaches to establishing faculty expectations:
• Top down – use external standards to define faculty expectations
• Bottom up – identify recurring faculty expectations among courses, and use the list to develop overarching program-level expectations

Example: A Bottom-Up Approach
• Take all course syllabi
• Examine what expectations (i.e., course objectives) are included in individual courses
• Make a list of recurring ones as the basis for program-level expectations
• Ask what else needs to be added at the program level
• State the expectations as program objectives
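The bottom-up tally above can be sketched as a short script. The course numbers, objective phrasings, and the recurrence threshold are hypothetical; in practice the objectives would come from real syllabi and would need normalization before counting.

```python
from collections import Counter

# Hypothetical course objectives pulled from individual syllabi.
syllabi = {
    "LS 5013": ["evaluate information sources", "apply cataloging standards"],
    "LS 5043": ["evaluate information sources", "design reference services"],
    "LS 5633": ["evaluate information sources", "manage digital collections"],
}

def recurring_objectives(syllabi, min_courses=2):
    """List objectives that recur in at least min_courses courses."""
    counts = Counter(obj for objectives in syllabi.values() for obj in objectives)
    return [obj for obj, n in counts.items() if n >= min_courses]

# Recurring objectives become candidates for program-level expectations.
print(recurring_objectives(syllabi))  # ['evaluate information sources']
```

The recurring items are only candidates: the “what else needs to be added” step still requires faculty judgment, since some program-level expectations never appear in any single course.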

Activities for Direct Measures (examples)
• written exams
• oral exams
• performance assessments
• standardized testing
• licensure exams
• oral presentations
• projects
• demonstrations
• case studies
• simulations
• portfolios
• juried activities with an outside panel

Activities for Indirect Measures (examples)
• questionnaire surveys
• interviews
• focus groups
• employer satisfaction studies
• advisory boards
• job/placement data

Demonstration of Assessment
• Program objectives aligned with mission and goals
• Multiple inputs in developing program objectives (both constituents and disciplinary standards)
• Program objectives stated in terms of student learning outcomes

Demonstration of Assessment (cont.)
• Student learning outcome assessment addressed at both course level and program level
• Triangulation with both direct and indirect measures
• A formal, systematic process to integrate assessment results into continuous planning

Words for Thought
• Assessment of input and process (i.e., structure) only determines capacity; it does not determine what students learn.
• Don’t confuse “better teaching” with “better learning.” One is the means and the other is the outcome.
• Everything we do in the classroom is about something outside the classroom.
• It’s what the learners do that determines what and how much is learned.
• If I taught something and no one learned it, does it count?

A Program Director’s Perspective
• Map the objectives to professional standards
• Make visible the invisible expectations
• Make sure what we measure is what we value
• Begin with what can be agreed upon
• Include both program measures and course-embedded measures
• Make use of assessment in grading
• Harness the accreditation process to make it happen