Fast forward to this spring, when it was announced that nearly 12,000 Illinois teacher evaluators must go through a series of online training modules in order to evaluate any teachers in 2012/2013. That's every single teacher evaluator in the state. If you don't pass this training, you cannot evaluate teachers. Period. Teacher evaluators like myself have until September 1, 2012 to finish this training, which covers the main points of the new legislation (aka "SB7") and Charlotte Danielson's Framework for Teaching.
So, we have 12,000 people logging into a high stakes online testing system. What could go wrong? Those had better be some high-quality materials supported by an ironclad infrastructure, right? Certainly they've accounted for the differing types of hardware, software, and high speed internet access available throughout a state as diverse as Illinois, right?
There are 5 modules that each evaluator must pass. There is an assessment at the end of each module, and you have 2 chances to pass each assessment. If you fail twice, you are required to go through "remediation", the details of which are still sketchy as I write this in mid-July. We seem to think that it could be a half day of training, after which you have another chance to pass the assessment. Update! The "remediation" has been revealed: another webinar! And we now have two more chances to pass the assessments after the remediation webinar. That'll do it.
7/13/2012: Another update! We now have unlimited chances to pass modules 1, 4, and 5. Evidently Module 3 is a self assessment. (As of 7/13/2012, Module 3 is a webinar that has sound, but the slides are blacked out. Outstanding).
By the way, I could go on about the missed deadlines, and the changes to the adjustments to the revisions of the plan. Every single date that we have been given, every single deadline they said they would meet, has been wrong. For example, the materials were supposed to be ready in early May. As I write this in July, we still have many educators who have not received their login credentials.
Module 2 is the most frustrating. It covers the in-class domains of Danielson's model: Classroom Environment and Instruction. Evaluators must watch video clips and rate them using Danielson's model. The ratings must closely match those made by "expert" raters. If you fail, you get no feedback. You are simply instructed to review the materials and try again. The video quality gives an impression of what's going on in the class, but the picture and audio quality make it difficult to pick out many of the details we are being held responsible for. There have been numerous technical problems, and if a video hangs up on you during an assessment (many reports of this), you could be charged with an "attempt." Thanks for playing. Try again. You have one more chance. Oh wait. Now we have three more chances.
A brief side note: I like Danielson's model, I really do. I am excited about using it in our district. However, I am concerned that the bad feelings the implementation is generating will cause some to lose faith in the model. That's really the unfortunate part in all of this.
I could keep going, but it's probably more valuable to hear from someone who is currently going through the training.
What follows is the text of a message sent from one evaluator to a high-ranking official at the Illinois State Board of Education. I have removed all personal information and references to this educator's school district. This pretty much expresses the fury many of us are feeling at this point.
I have been participating in the evaluation training online through Teachscape and Growth Through Learning. Before I say another word, I went into this program wanting to succeed and hoping that this process would help me as an evaluator. Having passed Module 1, I have spent many hours reviewing Module 2 and completed all the training, video observation, notes, etc. Yesterday, I took the assessment for Phase 1. To my chagrin and frustration, I did not pass. I have been a (department) chair for eleven years and would invite any "expert" to observe with me - I consider evaluation one of my professional strengths - I take a lot of pride in this area of my responsibilities. However, having failed the first attempt, I am now wondering what I would have to do better to succeed on my final try; I did read the all too brief "feedback" about where I came up short. It reminded me of my first few years teaching (38 years ago) when I would supply "feedback" for my students' writing - all too brief and general - and feel I was doing a good job. I was not then, and this system is not now.
Because I receive little substantive feedback on which answers did not "measure up" and why, is it not odd that this system hangs teachers out to dry by the very means with which it desires students to succeed - with quality feedback? If I thought one of the ratings was a "low 3" and the "experts" think the same question is a "high 2" and I do not receive any rationale as to why, how am I/are we to pass this test? I realize that rationale is supplied on the practice videos, but this is not enough. I did every step in terms of training, watched all the videos, took notes, made printouts, yes - actually studied some. I felt I answered the questions well enough. Now I am down to one final attempt, one I am less than enthused about even trying.
Although I found the test to be aligned with the training, how many components could be argued to supply evidence that would naturally fall under multiple frameworks? For example, might not a piece of evidence such as "How then does Romeo change?" support either "questioning" or "assessment"?
Moreover and even more importantly, how is this test scored? Do we receive any info (other than a 143 is "passing") about that? Do we know which answers were not passable? What was my final score...a 142? Who knows - except the computer?! Would you want teachers to supply assessments like this to our students? I think not. Do you desire our teachers to simply hand back tests and not review content/questions for areas students need to improve? I think not.
I do not know where these thoughts will go, but I had to voice them. Although I do realize and appreciate the intention of this program and do acknowledge that some of the features are very good and will help clarify evaluation (one of which being the insistence on cold, unbiased evidence as the basis for any conclusions about a teacher's ability), I strongly question the application of the program and the lack of communication regarding results of assessments. These need review. You have many very qualified staff in Illinois at this moment very frustrated, annoyed, and confused. The majority of staff who have tried this test have failed it, at least to my knowledge. Is that what you want?
I would think this state would be asking how many districts like (ours) have been successful in their evaluation processes over the years and have been training very fine teachers, instead of just jumping in and mandating.
Thanks for listening. I hope you find this feedback helpful.
P.S. By the way, did you know that after I watch a practice video and type comments online, and then hit "delete" or "edit" to change a particular comment, that I can no longer score that video?? The cursor changes from the "hand" to an "arrow" and will not score. This happened three times on my home computer (and maybe it was just my computer - but it seemed to be working fine); thus, I had to completely start each video over. One I had to watch three separate times before I caught on. Talk about frustrating!