I started my teaching career in a public school located in a middle to upper-middle class neighborhood in Santa Monica, California. Most of the students were from the school’s attendance area. I taught there for my first 16 years.
I really enjoyed my teaching experience at that school. By all measures, it was a successful school. The parents were involved; the PTA could raise $130,000 in a year. There was a solid program; we were even ‘adopted’ by a local bank, which sponsored a different art experience at each grade level, K-5, every year. And I was a “Highly-Qualified” and “Effective Teacher”, per my NCLB designation and years of evaluations by administrators. My students enjoyed coming to school, made good yearly progress, and performed well, as evidenced by their classwork and authentic assessments. Most of the students did their homework, and most scored in the 85th percentile and above on the annual state standardized tests.
Through professional development in SMMUSD (under the direction of then-superintendent Dr. Neil Schmidt and Dr. Paul Heckman), I had been participating in an Inquiry group with some colleagues. In our Inquiry practice, we asked: “What are we teaching? Why are we teaching it? What are the students learning? How do we know they are learning it?” and finally, “Are the students actually learning what we thought we were teaching?” It was an enlightening, exciting, collaborative, and reflective process, and it was the best professional development I have experienced in my career (and there has been a lot of that). Without question, Inquiry made me a better teacher. But it would have been very hard to quantify exactly how much better: the 85th-percentile-and-above test scores did not change! Some stakeholders at our school were stuck in the “If It Ain’t Broke, Don’t Fix It” mentality, and by the measures of school success in place at the time, our school “Weren’t Broke”. But some of us stakeholders began to question whether what we were doing was really the best we could do for every student/family, and whether we could do things differently to become even more effective/inclusive/awesome.
We spent a lot of time in our Inquiry group determining how we knew what the students had learned. Of course, we could point to the norm-referenced, standardized tests that the state required every spring and that our students did so well on. After all, those are the scores that got us our high-performing, ‘distinguished’, successful school designation! It was around this time that I learned that the best predictors of student success on a norm-referenced, standardized test were: 1) education level of the parents, and 2) socio-economic status. Correlation is not causation, but I found it very disturbing that you could take information about a student’s family’s socio-economic status, and information about the education level of that student’s parents, and knowing nothing else at all about the child, predict the scores that that child would receive on a norm-referenced standardized test. It disturbed me to the core of my professional self, because, well, what if I wasn’t the well-qualified, effective, awesome teacher that I thought I was? What if it was just that I was lucky enough to be teaching a certain population of students? And just how do I know what my students are learning?
And then an opportunity arose. Even though I had been working in the same district for 16 years, I could not afford to buy a home in the city where I worked. But I could afford a home in Long Beach, about 35 miles down the road.
I bought a home in the spring and the following school year I gave the commute a try. By November, I knew it was not sustainable. I didn’t see my new house in the daylight for the first year, and made up my mind to work closer to home. This meant a mid-career change in school districts. It seemed like a good idea at the time.
I applied to Long Beach Unified School District (LBUSD) and was hired (along with 600 other teachers) in 2000; I would spend the second half of my career here. At the interview, I was invited to sign a contract right away, and I would get a specific site later. This was February, so I had time to finish up my school year in SMMUSD. I knew I had a job for the following year all lined up, and I had the opportunity to interview with principals and find a site that was a good match. (In LBUSD, there are about 65 elementary schools. It’s the fourth largest district in California.)
The school I chose was in downtown Long Beach. It was a 100% free lunch school. Most of the students (at least 80% of them) spoke English as a second language. The population was mostly Hispanic (60%), followed by Southeast Asian, specifically Cambodian (30%), and the rest African American. There was no PTA; just a PTO made up mostly of the office staff. The school was not ‘adopted’ by any company. I chose to continue my teaching career at a site that was pretty much the polar opposite of the school where I spent my first 16 years.
So I was excited to see how effective my North-of-Montana, 85th-percentile-and-above teaching would be in this inner-city, 40th-percentile-and-below school.
I became my own ACTION RESEARCH PROJECT.
When I started teaching fifth grade at my new LBUSD school site, I taught pretty much the same way I had taught for the previous 16 years. I figured I couldn’t teach any other way, so I might as well. And I did. I even kept my records the same way I had in SMMUSD; in today’s education climate, we would call that my “data”.
Now, when you do a research study, you need a question. Mine was, “Will my teaching continue to be as effective in my new class/school/district?” My plan was to keep my teaching style and the content the same, and change just one variable: the class/school/district. Yeah, right. Well, my action research project probably wouldn’t hold up under master’s-thesis-style scrutiny, but it was an Action Research Project nonetheless. A project that would continue for another ten years.
I used ten years of teacher-collected data: Math chapter pre- and post-tests (the same math text was used in both districts for the first few years!), Reading growth tracked throughout the year using Guided Reading levels, and writing samples from a specific book report form. I compared that to similar data I had collected while working in SMMUSD, and this is what I found.
Math: The LB students scored better and made faster progress in Math.
Reading: Based on tracking Reading growth made over the course of the school year, the LB students showed comparable growth to the SM students.
Writing: My LB students made comparable progress in writing as compared to my SM students, despite having an overall lower English Language proficiency.
The norm-referenced, standardized test: While my Santa Monica colleagues’ students continued to score in the 85th percentile and above, my students in Long Beach continued to score in the 40th percentile and below.
One contributing factor that could explain why the LBUSD students progressed more quickly in math might be that they knew their times tables by heart when they came to me. That was a real time saver; time that could be used for math concepts.
Many of the SMMUSD students read at a higher level than the LBUSD students did at the end of the year (and many started the year at a higher level), which may be attributed to vocabulary and language acquisition. But the LBUSD students showed comparable growth over the school year.
I found the writing results to be the most enlightening. I used a very specific book report form. My experience in Santa Monica had always been that the class bombed the book report the first time they used the form on their own. But there were always some students who did well on certain parts of the report. With the students’ permission, I would share those responses, and then have the students re-read their own responses and reflect on how they could improve them. The second round of book reports showed improvement (because now they really knew what the expectations were!), and the improvement tended to continue, until by the fifth and final book report of the year, the students were all scoring 80-100/100 points on their book reports. I found the exact same pattern of writing growth and results in Long Beach that I had experienced in Santa Monica. I did not expect those results with so many second language learners in my classes, but I got them!
But was I just as effective in LBUSD as I had been in SMMUSD? Did I really suddenly become an Ineffective Teacher as I drove the 35 miles down the 405 freeway from Santa Monica to Long Beach that fateful day that I changed school districts? Michelle Rhee would have you believe that I did, based on my students’ norm-referenced, standardized test scores. Or maybe the test isn’t all that useful for anything other than ranking students/schools/districts.
If the LBUSD students made roughly comparable growth and progress in Math, Reading, and Writing, then why didn’t the scores on the norm-referenced, standardized test reflect that? That is a question I continue to explore.
I believe that the norm-referenced, standardized test, which is, in fact, designed to rank (and it does that well), did not accurately assess the growth and progress made by my Long Beach students, nor is the test able to accurately show growth over time. If the test doesn’t accurately assess the growth and progress of the Long Beach students, then does it accurately assess the growth and progress of the Santa Monica students? I think not.
I became my own Action Research Project to show it.