What Does Research Say About the Effectiveness of K-6 Supplementary Computer Reading Programs?

How do you make purchasing decisions about programs and materials? Vicki Vinton has a great post titled “What’s the Difference Between a Teacher and a Packaged Program?” in case you have not yet read it. (I am not proposing that teachers should be replaced by a packaged program.)

Do you ask questions when you are reviewing supplementary computer reading materials before making a purchasing decision? Who do you ask? Friends? Colleagues? Your Twitter PLN? What do you ask?

I consider myself fortunate as I was trained with a protocol to evaluate the quality of research according to the definition of “scientifically-based reading research” that was used under No Child Left Behind.  I believe it helps “weed out silly stuff” pretty quickly.

Do you use a protocol or checklist to guide your review process? Below are some guiding questions to think about as you read the research report summary (or dig into the actual research, which can be found in its entirety through the International Reading Association). What additional information will you want to collect and review?

Considerations:

  • What was the age/grade level of students in the study?
  • Is there a match between the students in the study and students in your classroom?
  • Is there a match between the “purposes/goals” of the computer reading programs and the “purposes/goals” of reading in your district and in the Common Core?
  • Does the research provide high-quality evidence?
  • Does the research list an effect size?
  • Was there a control group in the study? Were students randomly assigned to groups?
  • Is there evidence that the results were sustained over time? (Two years or more later)
  • What resources (time, staff, technology) are needed to implement the program? Are the resources cost-prohibitive?
  • How much professional development is needed to initiate and sustain the program?
  • Is fidelity of implementation described well enough in the research for it to be replicated?
  • Is the effect size .40 or greater? (the threshold John Hattie suggests an effect should reach to merit attention; see the sketch after this list)
  • When considering resources and “struggling readers,” what effect size is needed for the reader to “close the gap” and reach grade-level goals?
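
Because several of these questions hinge on effect size, here is a minimal sketch of how a standardized effect size (Cohen’s d) is typically computed. The means, standard deviations, and group sizes below are hypothetical illustrations, not numbers from any study.

```python
# Minimal sketch: Cohen's d as the standardized mean difference between a
# treatment group and a control group, using the pooled standard deviation.
# All numbers below are hypothetical.
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """(treatment mean - control mean) / pooled standard deviation."""
    pooled_var = ((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2) / (n_t + n_c - 2)
    return (mean_t - mean_c) / math.sqrt(pooled_var)

# Hypothetical posttest: treatment mean 52 (SD 10, n = 60) vs. control mean 48 (SD 10, n = 60)
print(round(cohens_d(52, 10, 60, 48, 10, 60), 2))  # 0.4 -- right at Hattie's .40 benchmark
```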

As you continue reading, please think about your reading process.  What do you, as a reader, do to enhance your understanding?  (Blog followers:  “Are you close reading?”)

Marshall Memo 495 (7-23-13)

4. How Effective Are K-6 Supplementary Computer Reading Programs?

“Despite substantial investments in reading instruction over the past two decades, far too many U.S. students remain poor readers, which has profound implications for these children and for the nation,” say Alan Cheung (The Chinese University of Hong Kong) and Robert Slavin (Johns Hopkins University) in this Reading Research Quarterly article. “Learning to read is a complex task in which many things must go right for a student to become successful… Different students may be failing to learn to read adequately for different reasons. One student may recognize every letter and sound but be slow and uncertain in blending them into words. Another may be proficient in reading words but does not comprehend them or the sentences in which they appear. Yet another may lack vocabulary needed to comprehend texts.”

One-on-one tutoring is the most effective intervention for struggling readers, say Cheung and Slavin, but it’s expensive. What about software packages? “In theory, computers can adapt to the individual needs of struggling readers,” they say, “building on what they can do and filling in gaps” – plus, they’re motivating to students. This article reports on the efficacy of technology products on which there is solid research. The bottom line: effect sizes for almost all products are small (averaging .14), and almost all of them aren’t any better than non-computer approaches. (A rough sketch of what these decimals mean in percentile terms follows the list below.) Here are the specifics, from the most to the least effective programs:

–    Lexia (Phonics-Based Reading and Strategies for Older Students): the mean effect size for Title I students is .67

–    Captain’s Log (BrainTrain): median effect size .40

–    RWT and LIPS for first graders at risk for dyslexia: overall effect size .32

–    Fundamental Punctuation Practice, MicroRead, Spelling Program, and Word Attack program for fourth graders: effect size .30

–    READ 180 for middle schools: weighted mean effect size of .24

–    READ 180 for grade 4-6 students: overall effect size .21

–    Jostens (an earlier version of Compass Learning): across three studies, the weighted mean effect size was .19.

–    Alpine Skier, Tank Tactics, and Big Door Deal for fifth and sixth graders: median effect size .18

–    Across 12 studies of supplemental Computer Assisted Instruction, the weighted mean effect size was .18.

–    Thinking Reader: median effect size of .14 in vocabulary and .13 in comprehension

–    Destination Reading: median effect size .12

–    Computer Network Specialist for grades 2-5: effect sizes averaged .10

–    Fast ForWord for grades 3-6: weighted mean effect size .06

–    Failure Free Reading for third and fifth graders: combined effect size .05

–    ReadAbout for fifth graders: weighted average effect size .04

–    READ 180 for grade 4-6 students (in another district): overall effect size .03

–    Destination Reading, Waterford, Headsprout, PLATO Focus, and Academy of Reading in first-grade classrooms: mean effect size .02 (the study didn’t break down individual programs)

–    Leapfrog, READ 180, Academy of Reading, KnowledgeBox for fourth graders: effect size –.01 (no breakdown for individual programs)
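
To put those decimals in perspective, here is a rough sketch (not the authors’ method) that converts an effect size into the percentile of the control group that the average treated student would be expected to reach, assuming roughly normal score distributions.

```python
# Rough illustration: translate an effect size (standardized mean difference)
# into the percentile of the control distribution reached by the average
# treated student. Assumes roughly normal scores; not Cheung and Slavin's method.
from statistics import NormalDist

def percentile_from_effect_size(d):
    return NormalDist().cdf(d) * 100

for d in (0.14, 0.40, 0.67):
    print(f"d = {d:.2f} -> about the {percentile_from_effect_size(d):.0f}th percentile")
# d = 0.14 -> about the 56th percentile (the review's average)
# d = 0.40 -> about the 66th percentile (Hattie's benchmark)
# d = 0.67 -> about the 75th percentile (the largest effect listed above)
```

In other words, even the review’s average effect of .14 moves the typical struggling reader only a handful of percentile points above the 50th.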

“The most important practical implication of the review presented here is that there is a limited evidence base for the use of technology applications to enhance the reading performance of struggling readers in the elementary grades,” conclude Cheung and Slavin. “Within the existing literature, however, the largest effect sizes were found for small-group interventions that supplement first-grade instruction with phonetic activities integrating computer and non-computer activities and occupying substantial time each week.”

“Effects of Educational Technology Applications on Reading Outcomes for Struggling Readers: A Best-Evidence Synthesis” by Alan Cheung and Robert Slavin in Reading Research Quarterly, July/August/September 2013 (Vol. 48, #3, pp. 277-299).

Cheung can be reached at alancheung@cuhk.edu.hk.

What surprised you about the research?  How will this information impact your work?

Did you do any close reading?  How did you know?

Thanks for your response!



4 responses

  1. I confess, Fran, that I couldn’t stay within the four corners of the text as my background knowledge got triggered when I saw the numbers for Read180, which is the foundation for the Scholastic Reading program being used this year in NYC middle schools.

    Bringing in actual research is so very important. These numbers say it all.

    1. Vicki,
      Thank you for your comment. Your post this morning increased the relevance of “research” and “programs” so I knew that I needed to move this post up into the published category. When resources are limited, it is absolutely imperative that we all be “critical consumers!”

  2. Fran,
    Those effectiveness scores are scary. Do people who purchase these programs read closely? I find I read closely when I have a big interest in the text. (Note to self about what I put in front of my students.)

    Thanks for the opportunity to experience close reading and the data that went with it!

    1. Julieanne,
      I was shocked the first time that I saw this data. Of course it does NOT come with the sales pitch! I don’t think there is a “Consumer Reports” for learning materials/ textbooks/programs!

      I am beginning to believe that “close reading” really is involved whenever the reader has to reread, think, process, and synthesize information! Thanks for your comments!
