Thirteen Ways of Looking at Research
A friend of mine is a research scientist. She does experiments with yeast. You’d think that it would be fairly easy to control the environment in a modern lab, and that experiments with these simple organisms would yield consistent and repeatable data.
But it isn’t like that. Over the years, I’ve heard disaster after disaster detailed – malfunctioning equipment. Numbers that don’t add up. Funky cultures.
My friend does research on one-celled organisms because she likes to nail things down to the nth degree of certainty. Although her work can be frustrating at times, with care and persistence, she feels that she can control her research to meet her standards.
But she says she just doesn’t have the temperament to do studies with more complicated organisms – there’s too much room for error in the interpretation of results. Here’s what she wrote when I asked her why:
“Causation is difficult to prove in human studies because it's difficult to completely control and alter just one aspect of the person's life while keeping everything else constant (whose life is identical from one day to the next, really?). So, any changes that are found to occur due to X are simply correlated changes – “we changed X and now we see people have Y result.” However, the different outcome (Y) could just as easily be due to something else that changed in the person's life, something that's difficult to control.
“For example, you could be looking to see if sitting still and listening to music for half an hour every morning reduces someone's stress level. If you find that the people who are sitting still and listening to music do indeed have lower stress levels, you won't know if it's because they were sitting for half an hour (time that would otherwise be filled with something else -- so the change could be due instead to the absence of the “something else”), if it's the specific songs that you chose for them to hear, if it's just the very act of sitting still for half an hour in the morning (since most people's mornings are super-rushed), or if it's something else entirely.
“There are so many things that can give confounding results -- there's always error both in measurement and in administration of the treatment. I've definitely read a lot of behavioral studies where I've thought "hmm, but this could also be due to. . .", but it's generally not clear how the scientists could have set things up differently to eliminate the confounding factors.”
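Her point about confounders can be made concrete with a tiny simulation. This is just an illustrative sketch – the scenario, numbers, and variable names are all invented – in which a hidden factor (a relaxed morning routine) drives both the habit being studied and the outcome, so the two correlate even though one does not cause the other:

```python
import random

random.seed(1)

def simulate_person():
    """One hypothetical study participant (all numbers are made up)."""
    relaxed_morning = random.random() < 0.5  # hidden confounder
    # People with relaxed mornings are more likely to adopt the music habit...
    listens_to_music = random.random() < (0.8 if relaxed_morning else 0.2)
    # ...and it is the relaxed morning, not the music, that lowers stress.
    stress = 40 if relaxed_morning else 70
    stress += random.gauss(0, 5)  # measurement noise
    return listens_to_music, stress

people = [simulate_person() for _ in range(10_000)]
music = [s for m, s in people if m]
no_music = [s for m, s in people if not m]

avg = lambda xs: sum(xs) / len(xs)
print(f"avg stress, music group:    {avg(music):.1f}")
print(f"avg stress, no-music group: {avg(no_music):.1f}")
# The music group shows markedly lower average stress, yet by construction
# the music has zero causal effect -- the gap comes entirely from the
# hidden confounder.
```

A study that only compared the two groups would conclude that music "works," which is exactly the trap she describes.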
These are wise words to consider the next time you encounter phrases like “studies demonstrate . . .” If we are going to factor research results into our decisions as educators, it behooves us to do a little research ourselves. What was the sample population? How was the study designed? How was it administered? Does the researcher have a bone to pick or a theory to prove? How does the research jibe with our own experiences as teachers?
And what occurs to us when we ask, “hmmm, could these results also be due to. . . ?”
Comments
Thanks, Susan! Your comment reminds me of the “aha!” I had in grad school many years ago. As a piano teacher, with a lot of information to cram into a half-hour lesson, I had believed that if I wasn’t talking, showing, demonstrating, GIVING, I was not doing my job. I still remember a first class in education, when we were instructed to simply WATCH . . . and I realized that, while I was so busy teaching, I had no idea of what the students were actually taking in, taking home with them.

The next week, I spent a lot of time in my lessons observing, holding that question – what are my students actually understanding? – and it blew my mind. Most of my brilliant “teaching” had been a one-woman show. My students were learning more about me than they were about playing the piano!

Now, in my work, I spend a good proportion of my time doing “research” – observing with open attention – and my “teaching” mostly flows out of the cues my “students” give me.
Susan McGuire Sep 11, 2011
Thank you for this valuable article! As someone with an MLS in Library Science, I find it all too infrequent that these questions and studies are examined as you have suggested.
The longer I have taught, the more comfortable I am in relying on my own observations over the years. For instance, when I first began teaching I would never have said that there were any gender differences in learning to play the piano, but now I see quite a few that apply at least 60% of the time, though rarely more than 90%.
I will always be grateful for the observational training given in the programs begun by Dr. Heyge with Audrey Sillick.