PROGRESS MONITORING with the WRAT4-PMV
Gale H. Roid, Ph.D., and Mark F. Ledbetter, Psy.D.

Outline of Workshop • Why progress monitoring? • Review of newest IDEA and RTI criteria • CBM/DIBELS versus improved models • WRAT4-PMV: Design, administration, scoring, research, uses • Case studies • Recommended applications

Why Progress Monitoring? • Early failure in reading ripples through upper grades and other curriculum areas • New Individuals with Disabilities Education Act (IDEA) and No Child Left Behind Act (NCLB) guidelines suggest progress monitoring within the response to intervention (RTI) model • National Assessment of Educational Progress (NAEP) shows 37% of fourth graders are below the basic level in reading skills

Benefits of Intervention with Progress Monitoring • Two types of problem readers:¹ 1. Good oral language; poor phonic skills 2. Lower socioeconomic status (SES) with broad weaknesses • Two third graders from the northwest given intensive tutoring with frequent brief tests: 1. Daron—Primary to Grade 3 oral reading in 14 months 2. Mia—Grade 1 to Grade 3 in 13 months
¹ Torgesen, J. K. (2004, Fall). Preventing early reading failure—and its devastating downward spiral. American Educator, 28.

Progress Monitoring in NCLB, RTI, and IDEA • Adequate yearly progress (AYP) in special education • Monitoring changes in classroom instruction (Tier 2 of RTI) • Intensive assessment in Tier 3 for possible special education

History of the RTI Model According to Heller, Holtzman, and Messick (1982),² there are three criteria for judging the validity of special education placements:³ 1. Is the general education classroom OK? 2. Is special education more effective? 3. Is the assessment method accurate?
² Heller, K. A., Holtzman, W. H., & Messick, S. (Eds.) (1982). Placing children in special education: A strategy for equity. Washington, DC: National Academy Press.
³ Fuchs, L. S., & Vaughn, S. R. (2006, March). Response to intervention as a framework for the identification of learning disabilities. NASP Communiqué, 34, 1-6.

History of the RTI Model (cont.) Three-phase adaptation of Heller et al.'s plan:⁴ 1. Student's rate of growth in general education 2. Low-performing student's response to better instruction 3. Intensive assessment and further response to evidence-based instruction
⁴ Fuchs, L. S., & Fuchs, D. (1998). Treatment validity: A unifying concept for reconceptualizing the identification of learning disabilities. Learning Disabilities Research and Practice, 13, 204-219.

History of the RTI Model (cont.) Three-tiered prevention model⁵,⁶,⁷ 1. Tier 1: Screening in general education 2. Tier 2: Fixed-duration remediation with progress monitoring 3. Tier 3: Assessment for special education using progress monitoring
⁵ Individuals with Disabilities Education Improvement Act of 2004 (IDEA) (2004). Public Law No. 108-446, § 632, 118 Stat. 2744.
⁶ Vaughn, S., Linan-Thompson, S., & Hickman, P. (2003). Response to instruction as a means of identifying students with reading/learning disabilities. Exceptional Children, 69, 391-409.
⁷ Gresham, F. M. (2002). Responsiveness to intervention: An alternative approach to the identification of learning disabilities. In R. Bradley, L. Danielson, & D. P. Hallahan (Eds.), Identification of learning disabilities: Research to practice (pp. 467-519). Mahwah, NJ: Erlbaum.

CBM and DIBELS • 1975: Stanley Deno (University of Minnesota) develops easy-to-use basic skills assessments for teachers • 1976 to 2005: Deno's grad students Lynn Fuchs (Vanderbilt), Gerald Tindal (University of Oregon), Mark Shinn, and others continue development of curriculum-based measurement (CBM); major federal grant support • 1998: Roland Good's Dynamic Indicators of Basic Early Literacy Skills (DIBELS) • 2004: IDEA reauthorization recommends CBM (see http://IDEA.ed.gov)

Attributes of the "Best CBM"⁴ • Easy-to-use individual or small-group tests that teachers understand • Measures improvement over time • Brief tests given frequently • Assesses program effectiveness • Lack of progress prompts changes in instruction

Attributes of the "Best CBM" (cont.)⁸,⁹ • Word reading performance is highly related to other CBM measures (e.g., fluency, comprehension), especially in Grades 1-3 • Feedback to teachers and students is not enough; guidance and follow-up on methods of reading instruction are necessary
⁸ Hosp, M. K., & Fuchs, L. S. (2005). Using CBM as an indicator of decoding, word reading, and comprehension: Do the relations change with grade? School Psychology Review, 34, 9-26.
⁹ Graney, S. B., & Shinn, M. R. (2005). Effects of reading curriculum-based measurement (R-CBM) teacher feedback in general education classrooms. School Psychology Review, 34, 184-201.

Limitations of Some CBM Applications • Criterion-referenced CBM may not have grade-based expectations (norms) • CBM test forms are not always statistically "equivalent" (variation in difficulty) • Scores are not always suitable for judging program effectiveness or for across-grade comparisons • CBM tests are not available in the upper grades

WRAT4-PMV Features and Benefits • Simple and easy to use • Long tradition in special education • Four subtests: Word Reading, Sentence Comprehension, Spelling, and Math Computation • Allows dual comparisons: 1. Rate of growth of the student 2. National norms for grade-level expectations

WRAT4-PMV Features and Benefits (cont.) • Four equivalent test forms containing 15 items at each level (six levels) • Covers Grades K-12 and college • Across-grade Level Equivalent (LE) scores are available • Computer scoring program is available

Design of WRAT4-PMV • Four forms for each level • Four subtests: Word Reading, Sentence Comprehension, Spelling, and Math Computation • Six levels:
- Level 1: Grades K-1
- Level 2: Grades 2-3
- Level 3: Grades 4-5
- Level 4: Grades 6-8
- Level 5: Grades 9-12
- Level 6: Grades 13-16 (i.e., college)

Test Administration: Word Reading • Start at the grade level, then adjust (out-of-level testing is OK) • Present card with letters and words • Say, "Look… read across." • If the response is not clear, say, "Please say the word again."

Sample Test Form: Word Reading Level 3 (Grades 4-5)

Test Administration: Sentence Comprehension • "Find the missing word." • Present the sample card and see if the student finds the missing word • Read the other sample sentences • Student silently reads the remaining sentences in the subtest

Test Administration: Sentence Comprehension (cont.) Mark and score responses

Test Administration: Spelling • Spell the word "in context" • Write (or print) letters or words • You read the word by itself, then read the word in a sentence • Student uses the Response Booklet to write responses

Sample Response Booklet: Spelling Level 2 (Grades 2-3)

Test Administration: Math Computation • Oral math for Grades K-5 (Levels 1-3): "Show me 3 fingers." • Math calculation problems:
- Level 1: 7 or 8 items
- Level 2: 10 or 11 items
- Level 3: 13 items
- Levels 4-6: 15 items
• Student uses the Response Booklet • No calculators

Sample Oral Math Card: Levels 1-3 (Grades K-5)

Sample Examiner Instructions: Math Computation Card, Level 2 (Grades 2-3)

Scoring: Plot Raw Scores on the Profile to Monitor Progress

Score Difference Tables

Technical Aspects: Reliability • High level of reliability in Grades K-12 • Test-retest (30-day) practice effect = less than .5 point • Median alpha by subtest:
- Word Reading: .81
- Sentence Comprehension: .83
- Spelling: .79
- Math Computation: .74
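The median alpha values above are internal-consistency coefficients. As a reminder of how such a coefficient is defined (a minimal sketch with made-up item data, not the publisher's computation or the WRAT4-PMV values):

```python
from statistics import variance

def cronbach_alpha(items):
    """Internal-consistency reliability from per-item score lists.

    items: one list per test item, each holding the scores that the
    same respondents earned on that item (illustrative data only).
    """
    k = len(items)                      # number of items
    n_resp = len(items[0])             # number of respondents
    # Total test score for each respondent
    totals = [sum(item[i] for item in items) for i in range(n_resp)]
    item_var_sum = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))
```

With perfectly consistent items (every item ranks respondents identically), alpha reaches 1.0; more typical mixed items yield values like the .74-.83 reported above.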

Technical Aspects: Test Form Equivalence • Nearly perfect equivalence among the four test forms at all levels (Gulliksen method¹⁰ with Wilks' lambda¹¹):
- Item percent correct equal: within .02
- Average means equal across forms: yes
- Equal standard deviations: yes
- Equal intercorrelations: yes
¹⁰ Gulliksen, H. (1950). Theory of mental tests. New York: Wiley.
¹¹ Wilks, S. S. (1932). Certain generalizations in the analysis of variance. Biometrika, 24, 471-494.

Technical Aspects: Validity • Correlations of other published tests with WRAT4-PMV subtests:
- WIAT-II Word Reading w/ WR: .69
- WIAT-II Word Reading w/ SP: .54
- WIAT-II Number Operations w/ MC: .48
- KTEA-II Reading w/ WR: .68
- KTEA-II Writing w/ SP: .65
- KTEA-II Math w/ MC: .48

Technical Aspects: Word Reading and LD • Study of 30 students with a reading learning disability (LD) • SD difference in scores of LD versus controls = .5-1.00 (usually 2 raw score points) • Effect sizes (SD units) by level:
- Level 1: .54 to .98
- Level 2: .47 to .77
- Level 3: .53 to .85
- Level 4: .42 to .82
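The effect sizes in the table are standardized mean differences (group mean difference divided by a pooled standard deviation). A minimal sketch of that statistic, using illustrative numbers rather than the study's actual data:

```python
from statistics import mean, variance

def cohens_d(group_a, group_b):
    """Standardized mean difference (Cohen's d) with a pooled SD."""
    na, nb = len(group_a), len(group_b)
    # Pool the two sample variances, weighted by degrees of freedom
    pooled_var = ((na - 1) * variance(group_a)
                  + (nb - 1) * variance(group_b)) / (na + nb - 2)
    return (mean(group_a) - mean(group_b)) / pooled_var ** 0.5
```

For example, control scores of [10, 12, 14] against LD scores of [8, 10, 12] give d = 1.0, i.e., a one-SD advantage, comparable to the upper end of the ranges above.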

Developmental Trends in Level Equivalent Scores

Case Example #1: Ananta, Grade 2—Catching Up

Dual Criteria for LDs Look for two trends:⁴ 1. Shows no improvement—a "flat profile" based on the "slope" of the graph line 2. Performs below grade level despite classroom interventions—the graph line stays below the grade norms
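Both trends can be checked numerically from a series of raw scores on repeated equivalent forms. A minimal sketch, where the 0.5-points-per-occasion slope cutoff and the norm values are hypothetical illustrations, not WRAT4-PMV criteria:

```python
def dual_discrepancy(scores, grade_norms, min_slope=0.5):
    """Flag a student who is both flat (low growth slope) and below norms.

    scores: raw scores from repeated equivalent forms, in testing order.
    grade_norms: expected raw score at each testing occasion.
    min_slope: assumed growth cutoff in points per occasion (hypothetical).
    """
    n = len(scores)
    mx = (n - 1) / 2                    # mean of occasions 0..n-1
    my = sum(scores) / n
    # Least-squares slope of scores over occasions: the "graph line"
    slope = (sum((x - mx) * (y - my) for x, y in enumerate(scores))
             / sum((x - mx) ** 2 for x in range(n)))
    is_flat = slope < min_slope
    is_below = all(s < g for s, g in zip(scores, grade_norms))
    return is_flat and is_below
```

A student scoring [10, 10, 11, 10] against norms of [15, 16, 17, 18] meets both criteria; one scoring [10, 12, 14, 16] is still below the norms but is growing, so the dual discrepancy is not met.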

Case Example #2: Grade 3—Flat Profile Dual Discrepancy

Case Example #3: Julio, Grade 4—Progress Across Grades

Applications of the WRAT4-PMV • Monitoring students identified by NCLB • Measuring RTI in Tier 2 (fixed-duration remediation) • Verification of qualification for special education (Tier 3) • Long-term progress monitoring in special education (AYP)

Applications of the WRAT4-PMV (cont.) • See reference list handout for examples of empirically based instructional interventions • Five methods of reading intervention:¹²
- Repeated reading: Student reads the passage twice
- Listening passage preview: You read the passage; the student follows along with a finger
- Phrase drill: You read error words; the student repeats each three times
- Syllable segmentation: Read each syllable
- Reward contingency: Reward given if the score improves
¹² Daly, E. J., Persampieri, M., McCurdy, M., & Gortmaker, V. (2005). Generating reading interventions through experimental analysis of academic skills: Demonstration and empirical evaluation. School Psychology Review, 34, 395-414.

Sample Report From the WRAT4-PMV Scoring Program
Sample Report From the WRAT4-PMV Scoring Program (cont.)

For More Information… See sample materials after the workshop. Visit www.parinc.com and click on Assessment Consultants to contact a sales representative or to arrange a workshop in your school district.