Getting Better All the Time: Assuring the Quality of COSF Data
Andy Gomm: New Mexico Part C
Jane Atuk: Alaska Part C
Lisa Backer: Minnesota Part C & 619
New Mexico
Andy Gomm, Part C Coordinator
ECO Implementation in NM
- Training provided to 34 provider agencies at their sites
- ECO manual developed and distributed
- Technical assistance made available through FIT staff and the University of NM Early Childhood Network
- Rolled out region by region (5 regions)
ECO Quality Assurance in NM
- ECO Quality Assurance form developed
- ECO lead staff with the Family Infant Toddler (FIT) Program initially reviewed all ECO forms
- Review expanded to 4 FIT staff
- Total ECO forms reviewed to date: approximately 1,300
ECO Quality Assurance in NM (cont.)
- Each provider agency received specific feedback regarding rating selection and supporting documentation.
- Once it was determined that an agency was completing the ECO forms to a high standard, the agency could be "graduated."
- Once graduated, FIT staff request the agency's ECO forms only on an as-needed basis.
Additional ECO Quality Assurance
- Providers receive a summary of the ECO quality assurance conducted
- Data entered in the new online data system provides additional opportunities to review accuracy
- Database reports make it possible to verify whether ECO scores have been entered
ECO Quality Assurance Form
The NM ECO review form asks:
- Are all areas of the ECO form completed?
- Were a minimum of three sources of information (approved assessment tool, clinical observation, and parent input) used to generate the rating?
- Does the supporting evidence really support the ECO rating?
- Is the ECO rating consistent with the child's eligibility category?
Lessons Learned
- After initial training, all sites needed an additional, almost identical, training once they began implementation.
- TA needs to be available promptly.
- Pre-printing the sources of information on the supporting evidence section ensured that documentation was present from all three required sources.
Lessons Learned (cont.)
Regarding feedback on the ECO form:
- Feedback needs to be prompt.
- Feedback needs to go directly to the Service Coordinators completing the form, not just their EC Coordinator (manager).
- Positive feedback works! If a particular SC at an agency was doing a great job with the ECO form, we recommended that the SC mentor others at that agency and used his/her ECO form as an example of what we want.
Next Steps
- Develop online training, available 24/7
- Promote QA conducted by provider managers
- Review online ECO reports, e.g. review data reports for patterns in scores
- Include the ECO process (incl. the ECO Manual) in Service Coordination training
Minnesota
Lisa Backer, ECSE Specialist
Basic Realities
- Education Lead / Birth Mandate state
- "Local control" is valued
- Teams must use multiple sources of information, including at least one criterion-referenced or curriculum-based measure cross-walked by ECO
- Parent input must be documented on the COSF
Basic Realities (cont.)
- Single target group of stakeholders & professionals for training on child outcomes reporting across Part C and Part B
- The rating at exit from Part C becomes the entrance rating for Part B
- Minnesota Automated Reporting Student System (MARSS) created in the late 1980s. No "real time" data: data are collected by LEAs throughout the year and reported to MDE each fall and each end-of-year
Quality Assurance Efforts
- Stakeholder Responsibility Table
- Training & TA
- Data Awareness
- Self Study
Stakeholder Roles/Responsibilities
Key areas:
- Knowledge of typical child development
- Ongoing assessment
- Knowledge and use of the COSF & process
- Annual reporting of data
- Ensuring validity
- Family outcomes
Training & TA: "Get Started"
- 55 face-to-face trainings during Year 1
- Data Retreat for Early Childhood Program Administrators (ECSE, Head Start, Pre-K) to promote professional investment in data
- One-time additional appropriation of funds for tool purchase and training
Training & TA: "Get Better"
- 7 regional trainings in Year 2
- Surveyed LEA programs; provided training on the most popular assessment tools (HELP, AEPS, BDI-2, Brigance, Creative Curriculum)
- Web-Ex training under development for implementation during Fall 2008
- Validation self-study
Data Quality & Awareness
- Simple logic check
- Mean, median, and standard deviation calculated on entry and exit data sets for each LEA for each outcome
- Progress data calculated and made available for each LEA on a password-protected site
- Does district data tell the right story?
COSF Entry Data - District A (N = 44)

            Median   Mean   Std. Dev.
Outcome 1      6     5.16     1.88
Outcome 2      4     3.75     1.80
Outcome 3      5     4.48     1.60
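The logic check and descriptive statistics above can be sketched in a few lines. This is a minimal illustration with invented ratings, not the state's actual pipeline; real values would come from the MARSS export.

```python
# Hypothetical sketch: a simple logic check plus the median/mean/standard
# deviation summary shown above, for one district's COSF entry ratings.
# The rating lists below are invented for illustration.
import statistics

entry_ratings = {
    "Outcome 1": [6, 7, 5, 6, 4, 7, 3, 6],
    "Outcome 2": [4, 3, 5, 4, 2, 6, 3, 4],
    "Outcome 3": [5, 4, 6, 5, 3, 5, 4, 6],
}

for outcome, ratings in entry_ratings.items():
    # Simple logic check: COSF ratings must fall in the 1-7 range.
    assert all(1 <= r <= 7 for r in ratings), f"Invalid rating in {outcome}"
    print(
        f"{outcome}: N={len(ratings)} "
        f"median={statistics.median(ratings)} "
        f"mean={statistics.mean(ratings):.2f} "
        f"stdev={statistics.stdev(ratings):.2f}"
    )
```

The same loop, pointed at each LEA's entry and exit data sets, would produce the per-district tables posted to the password-protected site.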
Correlation: Outcome 1 x Outcome 2
Cross-tabulation of Outcome 1 entry ratings (rows, 1-7) by Outcome 2 entry ratings (columns, 1-7), N = 693. Marginal totals recoverable from the slide:
- Row totals (Outcome 1 ratings 1-7): 121, 127, 97, 74, 116, 93, 65
- Column totals (Outcome 2 ratings 1-7): 53, 125, 93, 75, 122, 133, 92
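A cross-tabulation like the one above can be built directly from (Outcome 1, Outcome 2) rating pairs. The sketch below uses a handful of invented pairs and only the standard library; it is an illustration of the technique, not the state's actual report code.

```python
# Hypothetical sketch: cross-tabulating Outcome 1 ratings against
# Outcome 2 ratings for the same children. The pairs are invented.
from collections import Counter

# (outcome1_rating, outcome2_rating) for each child.
pairs = [(6, 4), (5, 4), (6, 5), (3, 2), (7, 6), (5, 3), (6, 4), (4, 4)]

table = Counter(pairs)  # maps (row, col) -> cell count

# Print a 7x7 grid with row totals, mirroring the slide's layout.
print("      " + "".join(f"{c:>4}" for c in range(1, 8)) + "  Total")
for row in range(1, 8):
    counts = [table[(row, col)] for col in range(1, 8)]
    print(f"{row:>4}  " + "".join(f"{c:>4}" for c in counts) + f"{sum(counts):>7}")
```

Scanning such a table is a quick validity check: heavy mass far off the diagonal can flag districts whose two outcome ratings disagree more than expected.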
Self Study
- Self-study tool under development:
  - Procedural requirements
  - Sources of information
  - Assignment of ratings
- Statewide training on use of the tool 10/2/08
Lessons Learned & Next Steps
Lessons:
1. Getting started was easy. Getting better takes more work.
Next steps:
1. Vigilant monitoring of all data submissions
2. Evaluate local use of the self-study tool
Alaska
Jane Atuk, Early Intervention Specialist
Early Intervention/Infant Learning Program
COSF Implementation in Alaska
- COSF pilot at 7 regional sites, Feb-Dec 2006
- Training provided to all providers at a statewide workshop, Feb 2007
- Statewide implementation of the COSF began March 1, 2007
- DVD training modules provided to each regional program, Nov 2007, and now accessible online for ongoing local training
Quality Assurance in Alaska
- Technical assistance provided by state staff by phone and at regional sites
- COSF database reports reviewed at least quarterly, with feedback to local providers
- Provider survey conducted July 2008
Survey Notes
- 92 ILP providers received the survey link by email (Survey Monkey)
- 67 responded, for a 73% overall response rate
- The number of responses on items varies because:
  - Subsets of respondents received some questions based on answers to other questions (skip logic)
  - Respondents could choose not to answer some questions
COSF Training & Information
- 90% of respondents answered an item about how they received COSF training/information. Of these (n = 60):
  - 70% attended an in-person statewide event
  - 42% used the COSF training notebook
  - 37% consulted with trained ILP providers
  - 30% consulted with state-level staff
  - 18% used DVD training modules*
  - 7% used the Internet to access information
*DVD training modules were only available after the statewide training events occurred
Overall Proficiency with COSF (n = 66)
- 28: "I know how to do it, but I need some more practice and assistance."
- 24: "I am confident I know how to do it, and I do it well."
- 12: "I understand to a point, but I need more training."
- 2: "I do not know how to do this yet."
78% felt they could do the COSF process with varying confidence, but without further training
Sources of Information
The most typical resources used to inform COSF rating decisions (n = 64):
- 61: ILP provider observations
- 54: Parents/foster parents/legal guardians
- 54: Assessment results/test scores
- 44: Specialists (OT/PT, speech/language, etc.)
- 11: Other family members/relatives
- 9: Childcare providers
Note: Respondents were asked to "check any that apply"
Gathering Information
The most typical methods used to gather information for COSF ratings (n = 64):
- 63: Meeting with people in person
- 18: Meeting with people over phone or teleconference
- 8: Communicating back and forth with people by email
- 6: Videotaping interviews, assessments, observations
Note: Respondents were asked to "check all that apply"
Decision-Making Tools
Were crosswalks helpful? (n = 66)
- very much: 4; yes: 8; somewhat: 9; no: 3; don't know if using: 21; not using: 21
Was the decision tree helpful? (n = 62)
- very much: 24; yes: 24; somewhat: 6; no: 2; don't know: 3; not using: 3
Were instructions for completing the COSF helpful? (n = 64)
- yes: 36; no: 8; don't know: 6; not using: 14
Determining COSF Ratings
Most commonly:
- 33% consulted with another provider
- 24% consulted with families
- 21% determined ratings on their own
- 18% used a team process
Note: 3 respondents (4%) did not answer this question.
It would seem that providers most often did not use an "ideal" team approach
Determining COSF Ratings (cont.)
However:
- 63% (42) had used a team approach at times
Of these 42 providers:
- 64% felt the team approach enhanced the decision-making process
- 62% felt it contributed information that would otherwise not be available
- 95% felt it was relatively easy to reach consensus
Level of Parental Involvement
Typical parental involvement in the COSF process on teams (n = 42):
- 69% contributed information, but were not usually present during team meetings
- 26% usually were present and participated
- 5% usually were not involved at all
Anchor Assessment Tools (n = 63)
- 25: Battelle Developmental Inventory (BDI)
- 19: Early Learning Accomplishments Profile (ELAP, 2002)
- 17: Sewell Early Education Developmental Profile (SEED)
- 16: Early Learning Intervention Dev. Profile ("the Michigan")
- 16: Hawaii Early Learning Profile (HELP, 2004)
- 3: Assessment, Evaluation, & Programming System (AEPS)
- 3: Bayley-III Scales of Infant & Toddler Development, 3rd ed.
- 3: Carolina Curriculum for Infants & Toddlers (CCITSN-3)
Note: Respondents were asked to "check any that apply"
Anchor Assessment Tools (cont.)
45 providers indicated training specific to assessment tools from:
- 91% local EI/ILP agency
- 27% assessment authors/publishers
- 20% university course
- 16% professional conference
- 13% state or regional workshop
- 11% private consultant or contracted trainer
- 7% another organization
Anchor Assessment Tools (cont.)
Recentness of training (n = 45):
- 24% within the last year
- 31% within the last two years
- 18% within the last five years
- 27% more than five years ago
(73% trained within the last five years)
43 of 61 respondents (70%) indicated someone else in their program has training/education specific to the anchor tools used
Added Comments
- 20 providers (30%) added a comment to the survey:
  - 5 were clarifications of answers given
  - 6 expressed objections to using the COSF
  - 3 expressed difficulty with the COSF process
  - 2 indicated confusion with the COSF process
  - 3 were suggestions
  - 1 was about the survey itself
16% of respondents made what could be considered negative comments
Lessons Learned & Next Steps
- Train early and often
- Regular feedback is essential
- Providers appreciate being asked to give feedback on the process
- Survey results will help to focus future training and technical assistance
- Continue to elicit feedback from providers