Thursday, July 7, 2011

Resistant to Data

Many bytes have been spilled (can we say that?) about how Americans in general, and even teachers, have trouble interpreting data and reasoning quantitatively.  But there's another syndrome that I think infects education, and that's data-resistance.  Data-resistance is when people--educators, parents, students, whatever--are so gripped by their preconceptions that they overlook or ignore data that suggest a different general picture from the one they are used to.

Here's an example.  Several years ago, Chicago Public Schools created a new, free, voluntary summer program for entering ninth graders, called Freshman Connection.  The program lasted four weeks, and -- as implemented at my school -- was well thought-out, addressing academic, social, and emotional needs, with support in particular content areas (math and English) as well as general discussion and practice of academic strategies.  I helped work on it, and I was really excited about the program.  Everyone agreed it was a terrific success.

The following academic year, I followed up by gathering data.  For each student, a counselor and I collected first-semester GPA and number of courses failed, and we looked for differences between the 100-or-so students who attended the program and the 100-or-so students who didn't.  We couldn't find any.  We went back to the database and sorted students by free/reduced lunch status and by score on the qualifying entrance exam, and in no subgroup did students who attended the summer program do better -- either higher GPA or fewer failures -- than students who skipped it.  Statistically, there was no difference between students who attended the program and students who didn't.  In fact, we couldn't find a single academic measure on which there was a difference.
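
For the record, the comparison itself is easy to script.  Here's a minimal sketch, in Python, of the kind of two-group check we did; the file name and column names (freshmen.csv, attended_program, gpa_sem1, courses_failed, frl_status) are hypothetical placeholders rather than our actual database fields, and we did the original work in a spreadsheet rather than in code.

    # Minimal sketch: did program attendees do better than non-attendees?
    # All file and column names below are hypothetical placeholders.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("freshmen.csv")           # one row per student
    went = df[df["attended_program"] == 1]     # attended the summer program
    skipped = df[df["attended_program"] == 0]  # did not attend

    # Welch's t-test (doesn't assume equal variances) on each measure
    for col in ["gpa_sem1", "courses_failed"]:
        t, p = stats.ttest_ind(went[col], skipped[col], equal_var=False)
        print(f"{col}: t = {t:.2f}, p = {p:.3f}")

    # Repeat within subgroups, e.g. free/reduced lunch status
    for status, grp in df.groupby("frl_status"):
        a = grp[grp["attended_program"] == 1]["gpa_sem1"]
        b = grp[grp["attended_program"] == 0]["gpa_sem1"]
        t, p = stats.ttest_ind(a, b, equal_var=False)
        print(f"FRL={status}: GPA t = {t:.2f}, p = {p:.3f}")

If nothing in that output reaches significance, you've reproduced, in a dozen lines, the null result we kept getting by hand.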

As we started discussing whether and how to implement a similar program the following summer, I took my data back to the planning team and said "Look, we all thought this program was terrific, but in fact it doesn't seem to have any impact."  The response was uniform:  it's a great program; we should do it again; we shouldn't make major changes.

That's what I call resistance to data.

Now it's possible that there are other ways in which the program was helpful.  Maybe students who attended the program found the first weeks of school easier, or more enjoyable; maybe they got more involved in extracurriculars.  But nobody suggested that there were data supporting any of these claims, and -- one could argue -- if these outcomes don't show any ultimate academic impact, there might be easier and less expensive ways to attain them.  Here's another hypothesis: maybe having half the class attend inculcated a culture that "infected" the whole class and boosted everyone's performance, sort of like how "herd immunity" can protect those who don't get vaccinated.  But nobody suggested that, either; in fact, the consensus at the meeting was that it was important to get more kids to attend.  And we have.

To be fair, kids love the program, and for one group of our students, we've made it extremely helpful: kids who haven't finished a full year of Algebra I can do so over the summer and start ninth grade in Geometry--an option we created before Freshman Connection, but incorporated into the program when it began.  Nobody would say that Freshman Connection hurts anyone, and everyone who attends is glad they did.  But I'm still skeptical that it actually improves academic or even socio-emotional outcomes for our students.

I'd say that too often in education we make decisions the way we made this one:  without gathering data, or in the face of data that contradict our intuitions and preconceptions.  The worst is when we teach based on what worked for us as individuals.  Teachers lecture for 45 minutes with a little guided practice, then assign 1-47 odds for homework, despite mountains of research about what kinds of tasks, in what quantities, make for effective independent work.  In many cases, teachers have heard of this research or been told about it, but at some level they don't believe it:  lecture + practice + 1-47 odds worked for them when they were kids.  Doing "what worked for me" is particularly harmful in math, where we tend to forget that those of us who are comfortable with math now are really "math survivors".  Imagine if we taught kids to swim by dropping 100 kids at a time into a shark-infested pool; the two or three who made it out would later replicate that method, saying "it worked for me".

Even when the practices are not demonstrably harmful, using "what worked for me" or "what makes sense" as a metric ignores the fact that today's kids are growing up in environments that are visibly and palpably different from the ones we grew up in.  Recently, I posted an article (on Facebook) about how schools in Indiana will be giving up cursive instruction (yay!) but teaching keyboarding instead; I asked why bother teaching keyboarding either.  An astonishing number of my friends commented that typing class had been incredibly useful for them, and asked how kids would learn to type well otherwise.  My response:  yes, but that was when typing was a specialized activity that you only did when preparing final drafts of papers; kids today type all the time and get immediate feedback about the quality of their typing; my students have learned to text blind, quickly and accurately, without any formal instruction.  And ... my friends repeated their arguments: it's important to learn to type quickly and accurately, and they only learned to type quickly and accurately in typing class.

I'm not saying that we should totally jettison personal experience and common sense when we teach.  I'm just saying that we forget that both of those are anecdotal, and what memory and common sense tell us about our own experiences is not necessarily relevant to the population-at-large 30 years later.  It took me a little over a year of teaching to realize that, although math homework was mostly irrelevant to my own learning of high school mathematics, it could really help my students--so I shouldn't make it optional.  And it's that kind of skepticism, and openness to data, that we all could adopt more often.

2 comments:

  1. Don't forget the effect that is even worse; rather than just ignoring the data, contradictory evidence can even make you more certain of your beliefs: http://youarenotsosmart.com/2011/06/10/the-backfire-effect/
