Response to Intervention RTI: Schoolwide Academic Screening & Progress-Monitoring
Jim Wright, www.interventioncentral.org

RTI Literacy: Assessment & Progress-Monitoring
To measure student 'response to instruction/intervention' effectively, the RTI model measures students' academic performance and progress on schedules matched to each student's risk profile and intervention Tier membership.
• Benchmarking/Universal Screening. All children in a grade level are assessed at least 3 times per year on a common collection of academic assessments.
• Strategic Monitoring. Students placed in Tier 2 (supplemental) reading groups are assessed 1-2 times per month to gauge their progress with this intervention.
• Intensive Monitoring. Students who participate in an intensive, individualized Tier 3 intervention are assessed at least once per week.
Source: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.

Educational Decisions and Corresponding Types of Assessment
• SCREENING/BENCHMARKING DECISIONS: Tier 1: Brief screenings to quickly indicate whether students in the general-education population are academically proficient or at risk.
• PROGRESS-MONITORING DECISIONS: At Tiers 2 and 3, ongoing 'formative' assessments to judge whether students on intervention are making adequate progress.
• INSTRUCTIONAL/DIAGNOSTIC DECISIONS: At any Tier, detailed assessment to map out specific academic deficits and discover the root cause(s) of a student's academic problem.
• OUTCOME DECISIONS: Summative assessment (e.g., state tests) to evaluate the effectiveness of a program.
Source: Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM: A practical guide to curriculum-based measurement. New York: Guilford Press.

Measuring General vs. Specific Academic Outcomes
General Outcome Measures…
• Track the student's increasing proficiency on general curriculum goals such as reading fluency. Example: CBM-Oral Reading Fluency (Hintze et al., 2006).
• Are most useful for longer-term measurement (e.g., to set and track IEP goals over the timespan of a school year).
Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
Hintze, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

Measuring General vs. Specific Academic Outcomes
Specific Sub-Skill Mastery Measures…
• Track short-term student academic progress with clear criteria for mastery (Burns & Gibbons, 2008). Example: Letter Identification.
• Are helpful in assessing whether the student has acquired short-term skills whose acquisition may require weeks rather than months.
Sources: Burns, M. K., & Gibbons, K. A. (2008). Implementing response-to-intervention in elementary and secondary schools: Procedures to assure scientific-based practices. New York: Routledge.
Hintze, J. M., Christ, T. J., & Methe, S. A. (2006). Curriculum-based assessment. Psychology in the Schools, 43, 45-56.

Curriculum-Based Measurement: An Introduction

Curriculum-Based Measurement: Advantages as a Set of Tools to Monitor RTI/Academic Cases
• Aligns with curriculum goals and materials
• Is reliable and valid (has 'technical adequacy')
• Is criterion-referenced: sets specific performance levels for specific tasks
• Uses standard procedures to prepare materials, administer, and score
• Samples student performance to give objective, observable 'low-inference' information about student performance
• Has decision rules to help educators interpret student data and make appropriate instructional decisions
• Is efficient to implement in schools (e.g., training can be done quickly; the measures are brief and feasible for classrooms, etc.)
• Provides data that can be converted into visual displays
Source: Hosp, M. K., Hosp, J. L., & Howell, K. W. (2007). The ABCs of CBM: A practical guide to curriculum-based measurement. New York: Guilford Press.

Among other areas, CBM techniques have been developed to assess:
• Reading fluency
• Reading comprehension
• Math computation
• Writing
• Spelling
• Phonemic awareness skills
• Early math skills

CBM: Developing a Process to Screen and Collect Local Norms

Building-Wide Screening: Assessing All Students (Stewart & Silberglitt, 2008)
Screening data in basic academic skills are collected at least 3 times per year (fall, winter, spring).
• Schools should consider using 'curriculum-linked' measures such as Curriculum-Based Measurement that will show generalized student growth in response to learning.
• If possible, schools should consider avoiding 'curriculum-locked' measures that are tied to a single commercial instructional program.
Source: Stewart, L. H., & Silberglitt, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

Building-Wide Screening: Using a Wide Variety of Data (Stewart & Silberglitt, 2008)
Screenings can be compiled using:
• Fluency measures such as Curriculum-Based Measurement.
• Existing data, such as office disciplinary referrals.
• Computer-delivered assessments, e.g., Measures of Academic Progress (MAP) from www.nwea.org.
Source: Stewart, L. H., & Silberglitt, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

Measures of Academic Progress (MAP), www.nwea.org

Applications of Screening Data (Stewart & Silberglitt, 2008)
Screening data can be used to:
• Evaluate and improve the current core instructional program.
• Allocate resources to classrooms, grades, and buildings where student academic needs are greatest.
• Guide the creation of targeted Tier 2 (supplemental intervention) groups.
• Set academic goals for improvement for students on Tier 2 and Tier 3 interventions.
• Move students across levels of intervention, based on performance relative to that of peers (local norms).
Source: Stewart, L. H., & Silberglitt, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.
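The last application above, moving students based on performance relative to peers, is often operationalized as a percentile rank within the grade-level screening sample. The sketch below is illustrative only: the function name, the scores, and the idea of a Tier 2 cut-point are hypothetical assumptions, not details from the source.

```python
def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Percent of peer scores at or below the given score (one common definition)."""
    if not peer_scores:
        raise ValueError("Need at least one peer score to compute a local norm.")
    at_or_below = sum(1 for s in peer_scores if s <= score)
    return 100.0 * at_or_below / len(peer_scores)

# Hypothetical fall oral-reading-fluency screening scores for one grade
# (words read correctly per minute).
grade_scores = [22, 35, 41, 48, 52, 55, 60, 63, 71, 88]

print(percentile_rank(41, grade_scores))  # 30.0
print(percentile_rank(88, grade_scores))  # 100.0
```

A school might, for instance, treat students below a locally chosen cut-point (such as the 25th percentile) as candidates for a Tier 2 group; the cut-point itself is a local policy decision, not part of this computation.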

Screening Data: Supplement With Additional Academic Testing as Needed (Stewart & Silberglitt, 2008)
"At the individual student level, local norm data are just the first step toward determining why a student may be experiencing academic difficulty. Because local norms are collected on brief indicators of core academic skills, other sources of information and additional testing using the local norm measures or other tests are needed to validate the problem and determine why the student is having difficulty. … Percentage correct and rate information provide clues regarding automaticity and accuracy of skills. Error types, error patterns, and qualitative data provide clues about how a student approached the task. Patterns of strengths and weaknesses on subtests of an assessment can provide information about the concepts in which a student or group of students may need greater instructional support, provided these subtests are equated and reliable for these purposes." (p. 237)
Source: Stewart, L. H., & Silberglitt, B. (2008). Best practices in developing academic local norms. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 225-242). Bethesda, MD: National Association of School Psychologists.

Steps in Creating a Process for Local Norming/Screening Using CBM Measures
1. Identify personnel to assist in collecting data. A range of staff and school stakeholders can assist in the school norming, including:
• Administrators
• Support staff (e.g., school psychologist, school social worker, specials teachers, paraprofessionals)
• Parents and adult volunteers
• Field placement students from graduate programs
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Steps in Creating a Process for Local Norming/Screening Using CBM Measures
2. Determine the method for screening data collection. The school can have teachers collect data in the classroom or designate a team to conduct the screening:
• In-Class: Teaching staff in the classroom collect the data over a calendar week.
• Schoolwide/Single Day: A trained team of 6-10 sets up a testing area, cycles students through, and collects all data in one school day.
• Schoolwide/Multiple Days: A trained team of 4-8 either goes to classrooms or creates a central testing location, completing the assessment over multiple days.
• Within-Grade: Data collectors at a grade level norm the entire grade, with students kept busy with another activity (e.g., video) when not being screened.
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Steps in Creating a Process for Local Norming/Screening Using CBM Measures
3. Select dates for screening data collection. Data collection should occur at minimum three times per year, in fall, winter, and spring. Consider:
• Avoiding screening dates within two weeks of a major student break (e.g., summer or winter break).
• Coordinating the screenings to avoid state testing periods and other major scheduling conflicts.
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Steps in Creating a Process for Local Norming/Screening Using CBM Measures
4. Create a preparation checklist. Important preparation steps are carried out, including:
• Selecting the location of the screening
• Recruiting screening personnel
• Ensuring that training occurs for all data collectors
• Lining up data-entry personnel (e.g., for rapid computer data entry)
Source: Harn, B. (2000). Approaches and considerations of collecting schoolwide early literacy and reading performance data. University of Oregon. Retrieved from https://dibels.uoregon.edu/logistics/data_collection.pdf

Methods of Classroom Data Collection

Activity: Classroom Methods of Data Collection
In your teams:
• Review the potential sources of classroom data that can be used to monitor Tier 1 interventions.
• What questions do you have about any of these data sources?
• How can your school make full use of these data sources to ensure that every Tier 1 intervention is…
Classroom Data Sources:
• Global skills checklist
• Rating scales
• Behavioral frequency count
• Behavior log
• Student work samples
• Work performance logs
• Timed tasks (e.g., curriculum-based measurement)

RTI 'Pyramid of Interventions'
• Tier 3: Intensive interventions. Students who are 'non-responders' to Tiers 1 & 2 are referred to the RTI Team for more intensive interventions.
• Tier 2: Individualized interventions. A subset of students receive interventions targeting specific needs.
• Tier 1: Universal interventions. Available to all students in a classroom or school. Can consist of whole-group or individual…

Global Skills Checklists
• Description: The teacher selects a global skill. The teacher then breaks that global skill down into specific, observable 'subskills'. Each subskill can be verified as 'done' or 'not done'.

Global Skills Checklists: Example
• The teacher selects the global skill 'organizational skills'.
• That global skill is defined as having the following components, each of which can be observed:
  – arriving to class on time;
  – bringing work materials to class;
  – following teacher directions in a timely manner;
  – knowing how to request teacher assistance when needed;
  – having an uncluttered desk with only…

Behavioral Frequency Count
• Description: The teacher observes a student behavior and keeps a cumulative tally of the number of times that the behavior is observed during a given period.
• Behaviors that are best measured using frequency counts have clearly observable beginning and end points, and are of relatively short duration. Examples include:
  – Student call-outs.
  – Requests for teacher help during independent seatwork.
  – Raising one's hand to make a contribution to large-group discussion.

Behavioral Frequency Count: How to Record
Teachers can collect data on the frequency of student behaviors in several ways:
• Keeping a mental tally of the frequency of target behaviors occurring during a class period.
• Recording behaviors on paper (e.g., simple tally marks) as they occur.
• Using a golf counter, stitch counter, or other mechanical counter device to…

Behavioral Frequency Count: How to Compute
• If student behaviors are being tallied during a class period, frequency-count data can be reported as 'X number of behaviors per class period'.
• If frequency-count data are collected in different spans of time on different days, however, schools can use the following method to standardize frequency-count data:
  – Record the total number of behaviors observed.
  – Record the number of minutes in the observation period.
  – Divide the total number of behaviors observed by the total minutes in the observation period.
• Example: 5 call-outs observed during a 10-minute observation period equals a rate of 0.5 call-outs per minute.
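The standardization steps above amount to converting a raw tally into a behaviors-per-minute rate. A minimal sketch, assuming a simple rate calculation (the function name and figures are illustrative, not from the source):

```python
def behaviors_per_minute(total_behaviors: int, observation_minutes: float) -> float:
    """Standardize a frequency count: total behaviors divided by minutes observed."""
    if observation_minutes <= 0:
        raise ValueError("Observation period must be longer than zero minutes.")
    return total_behaviors / observation_minutes

# The slide's example: 5 call-outs during a 10-minute observation.
print(behaviors_per_minute(5, 10))   # 0.5 call-outs per minute

# Because the result is a rate, tallies from observation periods of
# different lengths on different days become directly comparable.
print(behaviors_per_minute(12, 30))  # 0.4 call-outs per minute
```

Expressing counts as rates is what lets a teacher compare, say, a 10-minute observation on Monday with a 30-minute observation on Thursday.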

Behavior Log
• Description: The teacher makes a log entry each time that a behavior is observed. An advantage of behavior logs is that they can provide information about the context within which a behavior occurs. (Disciplinary office referrals are a specialized example of a behavior log.)
• Behavior logs are useful for tracking 'low-incidence' problem behaviors.

Behavior Log: Sample Form

Rating Scales
• Description: A scale is developed that a rater can use to complete a global rating of a behavior. Often the rating scale is completed at the conclusion of a fixed observation period (e.g., after each class period).
• Daily/Direct Behavior Report Cards are one example of rating scales.

Daily Behavior Report Card: Daily Version (sample form: Jim Blalock; Mrs. Williams; May 5; Rm 108)

Student Work Samples
• Description: Work samples are collected for information about the student's basic academic skills, mastery of course content, etc.
• Recommendation: When collecting work samples:
  – Record the date that the sample was collected.
  – If the work sample was produced in class, note the amount of time needed to complete the sample (students can calculate and record this information).

Work Performance Logs
• Description: Information about student academic performance is collected to provide insight into growth in student skills or use of skills in appropriate situations.
• Example: A teacher implementing a vocabulary-building intervention keeps a cumulative log noting date and vocabulary words mastered.
• Example: A student keeps a journal with…

Timed Tasks (e.g., Curriculum-Based Measurement)
• Description: The teacher administers structured, timed tasks to assess student accuracy and fluency.
• Example: The student completes a 2-minute CBM single-skill math computation probe.
• Example: The student completes a 3-minute CBM writing probe that is scored for total words written.
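Scoring a writing probe for 'total words written' is a simple count of word groups. The convention sketched below, counting whitespace-delimited groups regardless of spelling, is one common CBM scoring rule but is an assumption here, not a rule stated in the source:

```python
def total_words_written(probe_text: str) -> int:
    """Count whitespace-delimited word groups in a timed writing sample.

    Misspelled words still count as words written under this
    (assumed) total-words-written scoring convention.
    """
    return len(probe_text.split())

# Hypothetical response to a 3-minute writing probe.
sample = "The dog ran fast down the hil and jumpt over a log"
print(total_words_written(sample))  # 12
```

Dividing the total by the probe length in minutes would give a fluency rate, parallel to the behaviors-per-minute computation used for frequency counts.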

Existing Records
• Description: The teacher uses information already being collected in the classroom that is relevant to the identified student problem.
• Examples of existing records that can be used to track student problems include:
  – Grades
  – Absences and incidents of tardiness
  – Homework turned in

Combining Classroom Monitoring Methods
• Often, methods of classroom data collection and progress-monitoring can be combined to track a single student problem.
• Example: A teacher can use a rubric (checklist) to rate the quality of student work samples.
• Example: A teacher may keep a running tally (behavioral frequency count) of student call-outs. At the same time, the…

Activity: Classroom Methods of Data Collection
In your teams:
• Review the potential sources of classroom data that can be used to monitor Tier 1 interventions.
• What questions do you have about any of these data sources?
• How can your school make full use of these data sources to ensure that every Tier 1 intervention is…
Classroom Data Sources:
• Global skills checklist
• Rating scales
• Behavioral frequency count
• Behavior log
• Student work samples
• Work performance logs
• Timed tasks (e.g., curriculum-based measurement)

RIOT/ICEL Framework: Organizing Information to Better Identify Student Behavioral & Academic Problems

Assessment Data: Reaching the 'Saturation Point'
"…During the process of assessment, a point of saturation is always reached; that is, the point when enough information has been collected to make a good decision, but adding additional information will not improve the decision making. It sounds simple enough, but the tricky part is determining when that point has been reached. Unfortunately, information cannot be measured in pounds, decibels, degrees, or feet, so there is no absolute amount of information or…"
Source: Hosp, J. L. (2008). Best practices in aligning academic assessment with instruction. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 363-376). Bethesda, MD: National Association of School Psychologists.

RIOT/ICEL Framework
Sources of Information:
• Review (of records)
• Interview
• Observation
• Test
Focus of Assessment:
• Instruction
• Curriculum
• Environment
• Learner

RIOT/ICEL: Definition
• The RIOT/ICEL matrix is an assessment guide to help schools efficiently decide what relevant information to collect on student academic performance and behavior, and also how to organize that information to identify probable reasons why the student is not experiencing academic or behavioral success.
• The RIOT/ICEL matrix is not itself a data collection instrument. Instead, it is an organizing framework, or heuristic, that increases schools' confidence both in the…

RIOT: Sources of Information
• Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test). The top horizontal row of the RIOT/ICEL table includes four potential sources of student information: Review, Interview, Observation, and Test (RIOT). Schools should attempt to collect information from a range of sources to control for potential bias from any one source.

Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
• Review. This category consists of past or present records collected on the student. Obvious examples include report cards, office disciplinary referral data, state test results, and attendance records. Less obvious examples include student work samples, physical products of teacher interventions (e.g., a sticker chart used to reward positive student behaviors), and…

Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
• Interview. Interviews can be conducted face-to-face, via telephone, or even through email correspondence. Interviews can also be structured (that is, using a pre-determined series of questions) or follow an open-ended format, with questions guided by information supplied by the respondent. Interview targets can include those teachers, paraprofessionals, administrators, and support staff in the school setting who have worked with or had interactions with the student in the present or past. Prospective interview candidates can also consist of parents and other…

Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
• Observation. Direct observation of the student's academic skills, study and organizational strategies, degree of attentional focus, and general conduct can be a useful channel of information. Observations can be more structured (e.g., tallying the frequency of call-outs or calculating the percentage of on-task…

Select Multiple Sources of Information: RIOT (Review, Interview, Observation, Test)
• Test. Testing can be thought of as a structured and standardized observation of the student that is intended to test certain hypotheses about why the student might be struggling and what school supports would logically benefit the student (Christ, 2008). An example of testing may be a student being administered a math computation…

Formal Tests: Only One Source of Student Assessment Information
"Tests are often overused and misunderstood in and out of the field of school psychology. When necessary, analog [i.e., test] observations can be used to test relevant hypotheses within controlled conditions. Testing is a highly standardized form of observation. … The only reason to administer a test is to answer well-specified questions and examine well-specified hypotheses. It is best practice to identify and make explicit the most relevant questions before assessment begins. … The process of assessment should follow these questions."
Source: Christ, T. (2008). Best practices in problem analysis. In A. Thomas & J. Grimes (Eds.), Best practices in school psychology V (pp. 159-176). Bethesda, MD: National Association of School Psychologists.

ICEL: Factors Impacting Student Learning
• Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner). The leftmost vertical column of the RIOT/ICEL table includes four key domains of learning to be assessed: Instruction, Curriculum, Environment, and Learner (ICEL). A common mistake that schools often make is to assume that student learning problems exist primarily in the learner and to underestimate the degree to which teacher instructional strategies, curriculum demands, and environmental influences impact the learner's academic performance. The ICEL elements…

Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
• Instruction. The purpose of investigating the 'instruction' domain is to uncover any instructional practices that either help the student to learn more effectively or interfere with that student's learning. More obvious instructional questions to investigate would be whether specific teaching strategies for activating prior knowledge better prepare the student to master new information, or whether a student benefits optimally from the large-group lecture format that is often used in a classroom. A less obvious example of an instructional question would be whether a…

Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
• Curriculum. 'Curriculum' represents the full set of academic skills that a student is expected to have mastered in a specific academic area at a given point in time. To adequately evaluate a student's acquisition of academic skills, of course, the educator must (1) know the school's curriculum (and related state academic performance standards), (2) be able to inventory the specific academic skills that the student currently possesses, and then (3) identify gaps between curriculum expectations and actual student skills.

Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)
• Environment. The 'environment' includes any factors in the student's school, community, or home surroundings that can directly enable their academic success or hinder that success. Obvious questions about environmental factors that impact learning include whether a student's educational performance is better or worse in the presence of certain peers, and whether having additional adult supervision during a study hall results in higher student work productivity. Less obvious questions about the learning environment include whether a student has a setting at home that is conducive to completing homework or…

Investigate Multiple Factors Affecting Student Learning: ICEL (Instruction, Curriculum, Environment, Learner)

• Learner. While the student is at the center of any questions of instruction, curriculum, and [learning] environment, the ‘learner’ domain includes those qualities of the student that represent their unique capacities and traits. More obvious examples of questions that relate to the learner include investigating whether a student shows stable, high rates of inattention across different classrooms, or evaluating the efficiency of a student’s study habits and test-taking skills. A less obvious example of a question that relates to the …

• The teacher collects several student math computation worksheet samples to document work completion and accuracy.
• Data Source: Review
• Focus Areas: Curriculum

• The student’s parent tells the teacher that her son’s reading grades and attitude toward reading dropped suddenly in grade 4.
• Data Source: Interview
• Focus Areas: Curriculum, Learner

• An observer monitors the student’s attention during an independent writing assignment, and later analyzes the work’s quality and completeness.
• Data Sources: Observation, Review
• Focus Areas: Curriculum, Environment, …

• A student is given a timed math worksheet to complete. She is then given another timed worksheet and offered a reward if she improves.
• Data Sources: Review, Test
• Focus Areas: Curriculum, Learner

• Comments from several past report cards describe the student as preferring to socialize rather than work during small-group activities.
• Data Source: Review
• Focus Areas: Environment

• The teacher tallies the number of redirects for an off-task student during discussion. She then designs a high-interest lesson and continues to track off-task behavior.
• Data Sources: Observation, Test
• Focus Areas: Instruction
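The six examples above each pair RIOT data sources with ICEL focus areas. As a purely illustrative sketch (the variable names, data structure, and `coverage` helper below are this example's own invention, not part of the published RIOT/ICEL model), the matrix can be represented as data so a team can check whether the evidence gathered for a student spans multiple sources and domains:

```python
# Illustrative sketch: encode each piece of evidence with its RIOT
# data sources and ICEL focus areas, then report coverage gaps.

RIOT_SOURCES = {"Review", "Interview", "Observation", "Test"}
ICEL_DOMAINS = {"Instruction", "Curriculum", "Environment", "Learner"}

# Each entry: (description, RIOT sources used, ICEL domains addressed)
evidence = [
    ("Math worksheet samples", {"Review"}, {"Curriculum"}),
    ("Parent report of reading decline", {"Interview"}, {"Curriculum", "Learner"}),
    ("On-task observation plus work analysis",
     {"Observation", "Review"}, {"Curriculum", "Environment"}),
    ("Timed worksheets with incentive", {"Review", "Test"}, {"Curriculum", "Learner"}),
]

def coverage(items):
    """Return the sets of RIOT sources and ICEL domains represented."""
    sources = set().union(*(s for _, s, _ in items))
    domains = set().union(*(d for _, _, d in items))
    return sources, domains

sources, domains = coverage(evidence)
print("Sources not yet used:", RIOT_SOURCES - sources)
print("Domains not yet addressed:", ICEL_DOMAINS - domains)
```

With the sample entries above, all four RIOT sources are represented but the Instruction domain is not, which would prompt the team to gather instruction-focused evidence next.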

Uses of RIOT/ICEL

The RIOT/ICEL framework is adaptable and can be used flexibly, e.g.:
• The teacher can be given the framework to encourage fuller use of available classroom data and examination of environmental and curriculum variables affecting learning.
• The RTI Team case manager can use the framework when pre-meeting with the teacher to better define the student problem and to select data to bring to the initial RTI Team meeting.
• Any RTI consultant working at any Tier can internalize the framework as a mental guide to prompt fuller consideration of available data, efficiency in collecting data, and stronger formulation of student problems.

Activity: Use the RIOT/ICEL Framework
• Review the RIOT/ICEL matrix.
• Discuss how you might use the framework to ensure that the information you collect on a student is broad-based, comes from multiple sources, and answers the right questions about the identified student problem(s).

Building Team Activity: Plan for Building Teacher Understanding and Support for RTI

In your elbow groups:
• Review page 15 of the Planning Packet (Packet 5): RTI Plan: Element 5… Select Measures for Universal Screening and Progress Monitoring to Evaluate Student Response to Intervention.
• Develop an academic screening plan for your …