A board covered in canvas protecting maps. Black and white maps. Maps of farmland. Maps that showed crops, waterways, and entrances and exits for fields. My first remembrances of maps were maps that my dad used in his part-time work at the Agricultural Stabilization and Conservation Service (ASCS) office.
These maps were a part of family life. We would wait in the car after Sunday Mass while Dad met with a farmer. Kids in a car. Sometimes reading. Sometimes writing. Sometimes playing games like “I Spy.”
When completed, these black-and-white maps would have markings on them in colored pencil noting changes. Each map was a section of land. One square mile. 640 acres of land. Math, Social Studies, and a lot of talk. An interdependence of content decades and decades ago.
History is filled with maps that share information about exploration, settlement, and expansion of populations. How do the visuals add to our understanding?
Are maps important today? Why? What maps do students need to learn about? What maps do students need to create?
Let’s face it. Maps are readily accessible through Google, our smartphones, and GPS devices. It is easy to get spoken directions or pull up a map to a business location online.
But what about maps like Georgia Heard’s Heart Maps? What about capturing and creating connections between ideas that I want to remember? Heart Maps add another dimension to writing and then reading. Not familiar with Heart Maps? Check out this link for additional details.
What skills do you use to understand maps?
What maps do you use on a regular basis?
Are you a map consumer or a map creator?
Thank you, Two Writing Teachers, for this weekly forum. Check out the writers and readers here.
NEA and Maps (link)
US Interactive History Map (link)
Heart Maps (link)
Thursday Line Up: George Couros, Matt Glover, Colleen Cruz, Kelly Burns, and Stan Yan were presenters on my schedule for Thursday at CCIRA20. However, I would be remiss if I didn’t also acknowledge the fact that I learned from other participants – during turn and talks, standing in line for restrooms, and over dinner!
Here are some key takeaways from my Thursday sessions!
George Couros: The Core of Innovative Teaching and Learning
Two questions to ask at the end of a professional learning day:
- What did you learn today?
- How will your students know and benefit?
What questions do you ask at the end of a professional learning day?
What questions might you begin asking?
Matt Glover: Increasing Engagement in Writing Through Choice
Matt had us thinking about three different types of units: genre, craft, and process! Often the genre studies are more specifically tied to writing standards. The key is that you will need some of all three types, managed in a planful way, so that a student does not have the exact same units year after year after year. There are too many choices to have that repetition. And yet, there needs to be some repetition in order for students to have enough practice to increase engagement, competence, and confidence! Matt provided lists and tips as he reinforced the need to give students choice of topic for the majority of their writerly lives. More about the three types here.
What do we think about to make writing units easier?
- Have a stack of text (know what it looks like)
- How do you organize text in a process or craft study? Is your text set too narrow? Matt showed personal narratives and told the group they could write in any genre. Showing is more important than “telling.” Was MORE than one genre represented? The text does not have to be the genre students are writing. Teachers need to teach into the goals of the unit, NOT the genre. It’s a perfect time to choose units from earlier or later in the year.
- Caution: Do not confer into the genre. Have transferable skills in your conferring.
How many units of writing do you teach per year?
How many of each of the three types: genre, craft or process?
Colleen Cruz – From Clever Writers to Critical Readers: By Teaching Powerful Writing Skills First, We Can Equip Students with Robust Tools for Today’s Reading Landscape
Colleen Cruz’s presentation was different than her keynote. Yes, it was about writing but it was not about mistakes. It was about how “Writers Make Better Readers.” When we need a craft piece for our writing, we can go find it in our reading in order to strengthen our writing.
One area that is neglected and that needs to be taught is media criticism. Instruction needs to include these elements:
- Master narrative – the reluctant hero, struggling in life (Did they work hard enough?)
- Counter narrative – The Paper Bag Princess; Frozen (writers in Colleen’s Brooklyn neighborhood); sister love
- Weight – more space on the page or more minutes of film; what we see most (repeat, repeat, repeat)
- Source and perspective – Who wrote this and what do they want?
- Manipulation – (dogs) always actively present
- Power – who has it / who has not
- Voices heard (or not) – Frozen 2 (the Indigenous Scandinavian song that came back in the sequel)
What parts of media criticism do you teach?
What parts of media criticism do you need to add into your instruction?
Kelly Burns – Wild Wonder: Reconnecting to Lead
Our connections matter. I immediately thought of Jody Carrington and her work with “attention-seeking” or “connection-seeking” children. Kelly presented a framework of connection-building based on four Ws.
- Sitting down with our fear
- Reclaiming our hijacked consciousness
- Present moment
- Stillness (an invitation to slow down)
- When we are selfing (ego)
- Nervous system and our somatic response
- Tendencies and propensities
- Cognitive distortions
- Socio-political identity
- Radical self acceptance
- Re-evaluation. (Trash or meaningful)
- Respect (systematically)
Which of the four would you begin with?
What small steps could provide more balance in your life?
Stan Yan – Cartooning for Writers (stanyan.me)
This session was totally out of my comfort zone. Drawing?
I had an awful year in first grade where I was repeatedly told what to draw and what not to draw with many pages ripped up in front of me when I used colors differently than peers or the teacher. In this session we drew and drew and drew! Five minutes on a quick sketch with a marker seemed possible. Drawing as a way to connect to writing and to better understand graphic novels and cartoons seems a very natural expectation.
- Multi-aspect learning tool
- Character development
- Story structure and writing
What we saw:
- Interactive monster drawing demo
- Exquisite corpses
- Improv Comic Strips
I have to admit that the Daredoodles were fun even though the idea that I would draw around a shape, number, series of numbers, letter, or word given to me by my neighbor was a leap of faith at that moment. It was a fun challenge!
But my actual learning about cartoons came from the structure explanation. Structure? Yes. Better than the secret location of a pot of gold!
What do you know about cartooning?
How do you help readers better understand cartoons and/or graphic novels?
If you were to pick just one idea from above, what would it be?
How, when, and where will your plan be implemented?
How do you celebrate a day of learning?
Innovation: Imagining the Future of Literacy is the theme of the CCIRA 2020 Conference on Literacy currently underway in snowy Denver, Colorado. The excitement of learning is not hampered by the weather outside! This conference is a favorite of mine because of its preregistration process that allows me to choose my sessions in advance.
Innovation began with a keynote for all on Wednesday evening, “On Writing and Reading” with Colleen Cruz and Donalyn Miller, stellar literacy leaders who rocketed us to the stars.
Reflections on Colleen Cruz’s keynote
Colleen talked about her current project with mistakes and then had us writing within the first ten minutes. Participants writing during her 45-minute portion of the keynote . . . what does that tell you about what she values? I appreciated hearing the difference between a mistake and intentional wrongdoing, but I was very interested in the responses to mistakes that she shared.
Some common ways we deal with mistakes:
We can make the best of the situation (shake it off)
We can try to fix the mistake (apologize)
We don’t fix the mistake (leave it)
We don’t fix it AND we maintain it (protect and build it up)
Stop and think of a recent mistake that you made.
How did you respond? Is that your typical response? What “patterns” do you see in your response to mistakes?
A Thomasson is an architectural element that is useless, serving no remaining purpose, yet carefully preserved and maintained.
And then the relevance? Thomassons appear to be mistakes. What practices in your classroom are Thomassons?
- Is it a graphic organizer?
- The ubiquitous hamburger organizer?
- “Said is dead” bulletin board?
What Thomassons could you remove from your practices that would improve writing for your students?
Reflections on Donalyn Miller’s keynote – Access and Choice: Supporting Young Readers
This question struck a chord with me: If school is the primary access point, what do our students do when school is closed?
Access is critical and NOW is the perfect time to be planning to prevent summer slide. Donalyn shared research and statistics about summer slide and then reminded us that access to public libraries is not FREE for all students. Working parents may not be able to ferry children to the library. The family may consider the requirements for a library card to be too invasive into family circumstances. Confusion about the role of government bodies and their influence may exist. And dreaded fines for lost or missing library books may prohibit students and families from accessing library books.
What did your summer program look like last year? What were your results? What should you change for this summer? What is your plan?
As expected, the learning trajectories were high!
What is your definition of literacy?
What does access mean to you?
What innovations are you planning for?
What innovations are pushing your beliefs about literacy?
SAP (Student Achievement Partners) announced the results of a review that many in the press and social media have hailed as the gospel.
Immediately questions arose:
But according to whom?
What were the criteria for selection of the “review panel”?
What conflicts of interest did the “reviewers” reveal before, during and / or after the review?
What were the criteria that were being “reviewed / evaluated”?
Did the “reviewers” conduct a thorough study of the resources?
Where was the line between opinion and fact?
What would any other panel of seven qualified literacy reviewers say?
Where is the evidence of the scientific study of the research (and subsequent results) the “reviewers” were quoting as the magic elixir for all children to read at high levels?
Here’s the response from #TCRWP: Link
Note the FIVE concerns with Methodology:
- Not independent
- Not peer reviewed
Read and reflect on the response from #TCRWP: Link
A reviewer who did not read . . .
Warm ups lasted about eight minutes.
Many other words were sandwiched in.
Words of caution.
“Go as far as you can.”
Some matched the model.
Some did not.
No one quit.
Everyone did something.
But everyone did not do the SAME something!
Everyone looked just a little different.
It was my first time observing the exercise group. Does everyone have the EXACT same end goal? NO! Does everyone have a goal? YES! And that’s the point. Can everyone in a group hit the exact same point on an arbitrary measuring stick on the exact same day? Should they?
When does this work in real life?
I think we need a bit more -ish in our lives.
Expanding our learning opportunities. Not stamping end dates. Not considering expiration dates.
Note (Checking in on a craft move):
Exactly how did you read those first few lines?
What did the lack of spacing in the series of words do to your pacing?
The learning curve was high. Learning about the criteria of “scientifically based reading research” (SBRR). Learning about the criteria for Reading First. Supporting teams as they wrote their Reading First grants. Implementing the grants. Studying our results and reapplying for a second round of grants. I was part of a team supporting multiple districts in Iowa at that time – intermediaries between local districts and the state department of education.
2001 and onward
Reading First promotes instructional practices that have been validated by scientific research (No Child Left Behind Act, 2001). The legislation explicitly defines scientifically based reading research and outlines the specific activities state, district, and school grantees are to carry out based upon such research (No Child Left Behind Act, 2001). The Guidance for the Reading First Program provides further detail to states about the application of research-based approaches in reading (U.S. Department of Education, 2002). (Source Link)
I vividly remember that the advertisements began rolling in over Christmas with “Meets Reading First Requirements” stamped on each cover. And I was shocked and dismayed. The guidelines had not been written YET and publishers were claiming to KNOW and MEET the non-existent guidelines.
Perhaps there was a “secret” list. Perhaps there already was a document released by the U.S. Department of Education . . . I will return to their role later.
Districts and schools with the greatest demonstrated need, in terms of student reading proficiency and poverty status, were intended to have the highest funding priority. (Link)
- Reading curricula and materials that focus on the five essential components of reading instruction as defined in the Reading First legislation: 1) phonemic awareness, 2) phonics, 3) vocabulary, 4) fluency, and 5) comprehension;
- Professional development and coaching for teachers on how to use scientifically based reading practices and how to work with struggling readers;
- Diagnosis and prevention of early reading difficulties through student screening, interventions for struggling readers, and monitoring of student progress. (Link)
The first bullet supported the National Reading Panel Report and we worked with national experts to increase our knowledge around those five components. Professional development was high before, during and even after the grant process. The last bullet was, of course, about students. Paying more attention to individual students in order to be able to determine success and accelerate learning.
WHAT STATE FLEXIBILITY WAS BUILT IN?
. . . one, states (and districts) could allocate resources to various categories within target ranges rather than on a strictly formulaic basis, and two, states could make local decisions about the specific choices within given categories (e.g., which materials, reading programs, assessments, professional development providers, etc.). The activities, programs, and resources that were likely to be implemented across states and districts would therefore reflect both national priorities and local interpretations. (Link)
We had professional development resources from the Iowa Department of Education covering SBRR, the five components of reading, the 90-minute uninterrupted reading block, explicit instruction, data, assessments, leadership, and professional development.
Districts were NOT required to choose a core reading series as NONE that were published at that time met the SBRR criteria for all five components of reading. NONE.
The findings presented in this report are generally consistent with findings presented in the study’s Interim Report, which found statistically significant impacts on instructional time spent on the five essential components of reading instruction promoted by the program (phonemic awareness, phonics, vocabulary, fluency, and comprehension) in grades one and two, and which found no statistically significant impact on reading comprehension as measured by the SAT 10. (Link)
The first two results reported in the summary were these:
- Statistically increased instructional time on the 5 components in grades one and two
- No statistically significant impact on reading comprehension
So more time was spent on all five components and yet reading comprehension did NOT improve.
The summary concludes with this paragraph:
The study finds, on average, that after several years of funding the Reading First program, it has a consistent positive effect on reading instruction yet no statistically significant impact on student reading comprehension. Findings based on exploratory analyses do not provide consistent or systematic insight into the pattern of observed impacts. (Link)
WHAT WERE THE RESULTS?
- Reading First had a statistically significant impact on the total time that teachers spent on the five essential components of reading instruction promoted by the program in grades one and two.
- Reading First had a statistically significant impact on the use of highly explicit instruction in grades one and two and on the amount of high quality student practice in grade two. Its estimated impact on high quality student practice for grade one was not statistically significant.
- Reading First had no statistically significant impacts on student engagement with print.
- Reading First had a statistically significant impact on the amount of professional development in reading teachers reported receiving; teachers in RF schools reported receiving 25.8 hours of professional development compared to what would have been expected without Reading First (13.7 hours). The program also had a statistically significant impact on teachers’ self-reported receipt of professional development in the five essential components of reading instruction; teachers in RF schools reported receiving professional development on an average of 4.3 of 5 components, compared to what would have been expected without Reading First (3.7 components).
- A statistically significantly greater proportion (20 percent) of teachers in RF schools reported receiving coaching from a reading coach than would be expected without Reading First. The program also had a statistically significant impact on the amount of time reading coaches reported spending in their role as the school’s reading coach; coaches in RF schools reported spending 91.1 percent of their time in this role, 33.5 percentage points more than would be expected without Reading First (57.6 percent).
- Reading First had a statistically significant impact on the amount of time teachers reported spending on reading instruction per day. Teachers in RF schools reported an average of 105.7 minutes per day, 18.5 minutes more than the 87.2 minutes that would be expected without Reading First.
- Reading First had a statistically significant impact on teachers’ provision of extra classroom practice in the essential components of reading instruction in the past month; the impact was 0.2 components.
- There were no statistically significant impacts of Reading First on the availability of differentiated instructional materials for struggling readers or on teachers’ reported use of assessments to inform classroom practice for grouping, diagnostic, and progress monitoring purposes.
- Reading First had no statistically significant impact on students’ reading comprehension scaled scores or the percentages of students whose reading comprehension scores were at or above grade level in grades one, two or three. The average first, second, and third grade student in Reading First schools was reading at the 44th, 39th, and 39th percentile respectively on the end-of-the-year assessment (on average over the three years of data collection).
- Reading First had a positive and statistically significant impact on average scores on the TOSWRF, a measure of decoding skill, equivalent to 2.5 standard score points, or an effect size of 0.17 standard deviations (See Exhibit ES.5). Because the test of students’ decoding skills was only administered in a single grade and a single year, it is not possible to provide an estimate of Reading First’s overall impact on decoding skills across multiple grades and across all three years of data collection, as was done for reading comprehension. (Link)
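The figures quoted in the bullets above hang together arithmetically. As a quick back-of-envelope check (a sketch only; the single assumption not in the report excerpt is that the TOSWRF uses a typical standard-score scale with a standard deviation of 15):

```python
# Sanity checks on a few Reading First impact-study figures quoted above.

# Daily reading instruction: RF schools (105.7 min) vs. expected without RF (87.2 min)
print(round(105.7 - 87.2, 1))  # 18.5 extra minutes per day, as reported

# Coaches' time in role: RF schools (91.1%) vs. expected without RF (57.6%)
print(round(91.1 - 57.6, 1))   # 33.5 percentage points more, as reported

# TOSWRF decoding gain: 2.5 standard-score points on an assumed SD-15 scale
print(round(2.5 / 15, 2))      # effect size of about 0.17 standard deviations
```

The last line is why a 2.5-point gain and a 0.17 effect size are the same result stated two ways: an effect size is just the raw difference divided by the scale’s standard deviation.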
On average in the United States:
More instructional time. More explicit instruction. More professional development. More coaching. More classroom practice. No difference in grouping, diagnostic, and progress monitoring. No difference in comprehension. Increased decoding for students in grade 1.
I worked with small and medium-sized school districts. Their grants varied according to their needs. Some elements were the same. Frameworks varied based on existing local materials and curricula. The professional development and coaching encouraged change. The pressure of NCLB’s belief system – 100% of students proficient – was great. Student learning did improve. But were those celebrations of growth sustained over time? We were fortunate that Iowa’s plan did NOT require a core basal. It did not require DIBELS for benchmark assessments. From the winter of kindergarten on, our benchmark assessments (three times a year) measured accuracy, rate, and comprehension. One measure. Sensible. Focused.
Well, that depends.
- Bullet 1 under funding had five components to measure. Are they all equal?
- Bullet 2 had PD and coaching on the SBR practices and how to work with struggling readers so it also had multiple parts.
- Bullet 3 had diagnosis and prevention with screening, interventions and monitoring so it also had multiple parts.
What criteria were identified and distributed for those “parts” before the grant process? Now the criterion for every program seems to be growth on NAEP scores – whether right or wrong.
What about the Audit Report for the Reading First Application Process?
This report, ED-OIG/I13-F0017, September 2006, from the Office of the Inspector General suggests that the bigger issues with the national implementation of Reading First were the result of the Department of Education failing to follow the rules. (Link)
- FINDING 1A – The Department Did Not Select the Expert Review Panel in Compliance With the Requirements of NCLB
- FINDING 1B – While Not Required to Screen for Conflicts of Interest, the Screening Process the Department Created Was Not Effective
- FINDING 2A – The Department Did Not Follow Its Own Guidance for the Peer Review Process
- FINDING 2B – The Department Awarded Grants to States Without Documentation That the Subpanels Approved All Criteria
- FINDING 3 – The Department Included Requirements in the Criteria Used by the Expert Review Panels That Were Not Specifically Addressed in NCLB
- FINDING 4 – In Implementing the Reading First Program, Department Officials Obscured the Statutory Requirements of the ESEA; Acted in Contravention of the GAO Standards for Internal Control in the Federal Government; and Took Actions That Call Into Question Whether They Violated the Prohibitions Included in the DEOA
You can read the specifics of those findings yourself. Secret lists. Yes. The Department of Education controlled the applications, selection and criteria of materials above and beyond the guidelines in the law.
WHY DOES IT MATTER TODAY?
Please, check your sources. We expect students to quote, list sources, and explain their uses. Experts should as well. This post pulls from federal reports and my personal reflections. Both sources are needed for context. Facts DO matter.
When a student interaction occurs on the playground between two students, there are multiple perspectives. The two students involved saw, heard and responded to something. Other students “observing” may have seen or heard a part of the interaction. Adults observing may have seen or heard a part as well. Many perspectives mean that there is no “ONE” story. There are many layers. And all layers do matter.
IES. (2008). Reading First: Impact Study Final Report Executive Summary. (Link)
OIG. (2006). The Reading First Program’s Grant Application Process: Final Inspection Report. (Link)
I check my iPad mini and my Kindle app. It’s only on the mini. It’s a control issue. A control-my-time issue. So that means it is not on my phone or my computer. Seriously, it’s only on one device. Otherwise it would be toooooo tempting to read just another minute, five minutes, or more!
Kindle has a Reading Streaks Activity Tracker.
I’ve read for 77 weeks in a row.
173 days in a row.
With a few touches, I discover that there were only 18 days that I did not read on my Kindle during 2019. What makes me pause is the fact that the majority of my professional reading is done with real, paper-in-hand books. Sometimes I have a book on my computer, but more often than not it’s the hard copy that I covet and therefore purchase. Implication: I may have read every day in 2019, but my data is inaccurate because:
- It only includes Kindle reading
- I did not have wireless access
- I don’t know what counts as “reading.” If I open the app, is that “good enough”?
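Just for fun, the tracker’s numbers can be sanity-checked with a little arithmetic (a sketch of my own, not anything the Kindle app reports):

```python
# 2019 had 365 days; the Kindle tracker showed 18 days with no recorded reading.
days_in_2019 = 365
days_not_read = 18

days_read = days_in_2019 - days_not_read
print(days_read)                               # 347 days of recorded Kindle reading
print(round(days_read / days_in_2019 * 100))   # about 95 percent of the year
```

Which is exactly the point: 347 days looks impressive, but it only counts Kindle days, so the “data” understates the reading and overstates its own precision.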
Does it matter?
I am also trying to make sense of my Goodreads data and now I fully understand that I need to “calendar” time each month for recording. Recording needs to be routinized if it is going to be accurate and therefore data that has utility. Here’s what I know.
I have a collection of data points, so I’m just sharing some others that either interested or intrigued me. This view is my books read by my ratings, and 6/77 have no rating, so that’s an “oops” on my part. Typically, if the book is not a 4 or 5, I don’t enter it into Goodreads. I just keep reading.
This sort of my books read by publication date is one of my favorites, even though I am less concerned about the actual month of the year that a book is published, as I have already read several 2020 books. What questions do you think are answered by this data?
And then a view of when I read, including a pop out list when I click on an individual bar.
And how does this graph differ from the one above? What’s the same?
Automatic data collection is nice and deceptively addictive. I could sort by my shelves and my content. As previously mentioned in 2017 here, 2018 here, June, 2019 here and winter break reading here, my reading goals this year were about balance and exploring a wider variety of genres. Is that data already available?
Accuracy is an issue because this is what my totals looked like in June. And I have read for 173 days straight since July. I also have only one Goodreads account now so that data is most suspect.
Before I record any books in Goodreads for 2020, I need to decide on the labels for “my shelves.” I like the idea of 5 categories for fiction and 5 for nonfiction. One big LUMP for Professional does not yield actionable data.
I need to start recording 2020 books. I want a manageable system that is easy and meets my needs. By the time I have reached that solution, I also believe that my #OLW will have resolved itself.
Quantity? Is it the numbers?
Quality? Additional meaningful information?
Ease of collection? Automatic, actionable, and accessible?
What stories do you find in your reading habits?
What stories do your students find in their reading habits?
School breaks can be times of anxiety when routines change, excitement for those who have planned travel or activities, and relaxation and rest after the frantic build-up to the break. This post is deliberately planned to give you a few days to consider how you want to frame goals for the time between school attendance in 2019 and 2020.
Last week during a Zoom meeting with some teachers, I asked what folks were planning to encourage literacy over the winter break. This had come up as a question in some buildings recently. So here are some of the ideas that we brainstormed!
Thanks, Elementary Book Love friends!
What this is not:
- An encouragement for packets
- Assigned reading tasks
- A calendar of activities to do over break every day
- A requirement for a grade or for extra credit
Instead, consider the literacy habits of your students:
- What are their current reading habits?
- What are their current writing habits?
- What encouragement would give them some practice time?
- Where might this practice occur?
- What might get in the way of this practice?
- What are their current literacy goals?
What might be your focus?
- Access: How many books or texts do your students need on average? How can you put that number in their hands? How can you increase access to meet the needs of the families represented in your classroom?
- Choice: How can students choose some books from the classroom library to read over break? How can you provide some ideas about writing over the break? Is there a book or an activity you would do with siblings? Cousins? Neighbors?
- Agency: What if you give the students a calendar to look at and then ask if there are days that might more naturally fit in some reading or writing time? (Travel time? Days when family members are working? Days with relatives?) Students can identify some days and/or times that might work.
- Student Goals: Based on the time they see available, knowing that weather, etc. can change plans, what might be a reasonable goal?
- Fun: What about choosing a book or genre that is fresh/new? A joke book? Poetry? Or even rereading an old favorite? Or rewriting a piece from a different point of view?
Approximately 10–15 minutes each day during the next week might just provide the time your students need to make sure they have access, choice, agency, their own goals, and some fun planned for their winter break.
This may need a bit of planning on your part if you need to organize a quick book tasting or a gallery walk to show “possibilities” for writing. Chat with a friend and generate ideas that meet the needs of your students! Please remember to share your own plan for the break!
However, if this is totally different from past practices you may want to send a quick note home so families have a heads up before the break begins!
What if . . .?
How will your students benefit?
What do you have to lose?
What a joyful sharing of literacy activities when school resumes in 2020!
During the last five daily blog posts, I have worked my way through the five rules from P. David Pearson and the #ILA19 panel session at 7 a.m. Saturday titled: “What Research Says About Teaching Reading and Why that Still Matters.”
Understanding the research in today’s world takes some work, some thinking, and a good hard look at the evidence, the word that appears in both rule 3 and rule 5.
A week ago, this was how I started my first draft for the series. I quickly discovered as I wrote that this look at the Big Picture was the ending of the series instead of the beginning. The REAL beginning was the panel presentation that recentered some beliefs in processes and brought back a review process used by our Statewide Literacy Team in the past.
So let’s get started. “It was a dark and stormy night.” (I love how Snoopy works that into every story!)
Compare these headlines:
- ‘No Progress’ Seen in Reading or Math on Nation’s Report Card
- Screen Time Up as Reading Scores Drop. Is There a Link?
- The One And Only Lesson To Be Learned From NAEP Scores
- Mississippi: Miracle or Mirage – 2019 NAEP Reading Scores Prompt Questions Not Answers
1. #Headlines: Match the quotes with the titles above.
_____ NAEP is extraordinarily clear that folks should not try to suggest a causal relationship between scores and anything else. Everyone ignores that advice, but NAEP clearly acknowledges that there are too many factors at play here to focus on any single one.
_____ In reading, Mississippi was the only state to improve in 2019 in 4th grade and Washington, D.C. (considered as a state) was the only one to improve in 8th grade. (The District of Columbia, in fact, showed the fastest gains this year of any state or large school district.)
_____ Todd Collins has raised another important caveat to the 4th-grade reading gains in Mississippi because the state has the highest 3rd-grade retention percentages in the country. . .
_____ Mississippi was the only state in the country to improve reading scores, and was number one in the country for gains in fourth-grade reading and math, according to newly released test results.
_____ Students have actually lost ground since 2017 on both of the NAEP’s main reading content areas: literary experience, such as fiction analysis, and reading for information, such as finding evidence to support an argument. Both grades declined significantly in both areas from 2017 to 2019, but the drop was larger for literary skills.
Which ones seemed pretty obvious?
Which ones took a bit more thought?
And then which two came from the same publisher?
. . .
. . .
#1 Headlines and text that supports or matches the headline.
3, 1, 5, 4, 2.
Same Publisher: #1 and #2 were both EdWeek.
Of the five articles, where would you expect to see research?
Tip: Article #2 showed that article #4 reported data, not research.
What is the best evidence?
When I return to “Results are in: Mississippi Students #1 in the Country for Reading Gains,” I actually have more questions after more reading. Especially after reading this article: “Here’s What All the NAEP coverage missed.”
What if the reading gains are the result of higher beginning points every year?
2. #Research Applied Evenly
What would be worthy of studying?
- Is the gain the result of instruction delivered to the students?
- Is the gain the result of the professional development provided for the teachers since 2013?
- Is the gain the result of the addition of coaches in the lowest buildings (in the fall of 2018)?
- Is the gain the result of the retention policy?
And that takes me back to Paul Thomas’s blog (#5 above). And this updated section:
- UPDATE: Todd Collins has raised another important caveat to the 4th-grade reading gains in Mississippi because the state has the highest 3rd-grade retention percentages in the country:
But Mississippi has taken the concept further than others, with a retention rate higher than any other state. In 2018–19, according to state department of education reports, 8 percent of all Mississippi K–3 students were held back (up from 6.6 percent the prior year). This implies that over the four grades, as many as 32 percent of all Mississippi students are held back; a more reasonable estimate is closer to 20 to 25 percent, allowing for some to be held back twice. (Mississippi’s Department of Education does not report how many students are retained more than once.)
This last concern means that significant numbers of students in states with 3rd-grade retention based on reading achievement and test scores are biologically 5th-graders being held to 4th-grade proficiency levels. Grade retention is not only correlated with many negative outcomes (dropping out, for example), but also likely associated with “false positives” on testing; as well, most states seeing bumps in 4th-grade test scores also show that those gains disappear by middle and high school.
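The cumulative-retention arithmetic in the quote above can be sanity-checked. The sketch below simply re-derives Collins’s “as many as 32 percent” upper bound from the reported 8 percent annual rate, plus an alternative estimate under an assumed (illustrative, not from the article) independence model:

```python
# Back-of-envelope check of the cumulative K-3 retention estimate.
# Assumes the reported 8% annual retention rate applies to each of the
# four grades (K, 1, 2, 3); these are illustrative figures, not new data.

annual_rate = 0.08
grades = 4

# Upper bound quoted in the article: simply summing 8% across four grades.
upper_bound = annual_rate * grades  # 0.32, i.e. "as many as 32 percent"

# If an 8% chance applied independently each year, a student's chance of
# being retained at least once is 1 minus the chance of never being retained.
at_least_once = 1 - (1 - annual_rate) ** grades  # about 0.284

print(f"upper bound:   {upper_bound:.0%}")
print(f"at least once: {at_least_once:.1%}")
```

Collins’s “more reasonable” 20 to 25 percent lands below both figures because students held back twice are counted only once and the annual rate varies from year to year.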
After several questions about “retention” and/or “intervention” and/or “multiple attempts on the state assessment,” maybe this is a focus for research. What data do we have? What data do we need to collect? What other questions bubble up?
- Did students who did not meet the proficiency level have higher absenteeism than proficient students?
- Did any specific classrooms have higher growth than others?
- What do we know about the implementation of the teacher training?
This “study” may require some additional data collection, but it could be undertaken relatively quickly to generate some general ideas before the end of this year.
Because I want to reduce the need for intervention, I might also explore this chapter from Regie Routman’s Read, Write, Lead. (Link)

3. #Best Evidence
4. #Full portfolio of methodology
What I wouldn’t do is:
Credit the 4-point gain to ANY of the above areas without study.
Blame teachers for not implementing “enough” or “correctly” without study.
Say that Mississippi has a program that should be replicated in every state, because we don’t know the amount of resources it took to get results that are not sustained through 8th grade . . . without study.

5. #Evidence, not a straw person
The purpose of this post was to pull together a topic currently in the literacy field, generate some questions, look at the data, and apply the 5 rules from the Research presentation. In less than an hour my questions were generated and this post was written. A beginning application. A beginning look at the Big Picture.
You can do this.
You must do this.
You need to verify the accuracy of what you are reading. Find a partner and get started!
Today’s post considers Rule 5 from P. David Pearson’s presentation as a part of the #ILA19 panel titled: “What Research Says About Teaching Reading and Why that Still Matters.” (The links for Rules 1-4 are at the bottom!)
Read Rule #5 again.
Does it sound familiar?
The hue and cry that no one is teaching phonics . . . is almost hysterical in light of this report from EdWeek that Reading First (2001-2008) failed to make gains in reading comprehension due to toooooooooooooo (o’s added for emphasis) much focus on skills like phonics. I personally know of school buildings that were spending an hour each day in the primary grades on phonics in the Reading First era. And a few spent more than an hour because of some slick salespersons.
What didn’t work?
Reading First required all 5 pillars from the National Reading Panel
- phonemic awareness
- phonics
- fluency
- vocabulary
- comprehension

with the measure of success being an assessment of comprehension, the pillar that received the least time in reading instruction. So of course the data for reading comprehension didn’t improve when phonics and fluency were the most popular and most often tested pillars!
Reading First ended. The Common Core State Standards became the next “great golden goose” and whiplash hit teachers when they were told that “close reading” meant no introduction to the story/book and annotations were now the activity of the day. Cold reads. Master individual standards.
. . .
Phonemic Awareness instruction did not disappear.
Phonics instruction did not disappear.
Fluency instruction did not disappear.
Vocabulary instruction did not disappear.
Comprehension instruction did not disappear.
All five areas were still a part of the CCSS standards. And yes, when writing was finally back in vogue, we did celebrate. Under Reading First, writing was pushed out of the 90-minute uninterrupted reading block. In some instances, writing totally disappeared or appeared briefly as Monday weekend journaling: one day out of five, with 15-20 minutes eked out for some assigned writing prompt.
When I hear:
No one is teaching phonics.
Teachers don’t know how to teach phonics.
Teachers weren’t trained to teach reading.
I have to take a deep breath. And sometimes a second breath. And even a third breath.
It isn’t ALL teachers, as is often reported.
Teachers in Iowa under Reading First were required to have 40 hours of professional development each year in those five pillars. And in our region, we offered similar training for districts that did not qualify for Reading First grants because they also needed the knowledge. It wasn’t withheld from anyone. It was research-based, systematic, and explicit.
These reports, which feel like accusations, come from high school teachers (yep, they didn’t have that training unless they trained in reading); teachers who came through alternative licensing (and didn’t attend a traditional college licensing program); curriculum/marketing personnel; and journalists. No pro/con reporting of more than one side of an issue. Instead, reporting on the state of the nation on the basis of a few states. Sometimes even stating their biases, although they have never, ever taught a student to read.
Raising your hackles?
Just last week, I had one of those parent calls. A parent whose child had an IEP meeting. The child is currently a junior in high school. They (the IEP team) wanted to write a goal for phonemic awareness. (I had to literally cover my mouth on the phone to make sure that I was listening and not expressing my opinion.) Phonemic awareness – sound manipulation – no print included – for a 17-year-old student with three semesters left in the public school system.
Phonemic awareness – which by National Reading Panel research was to take 20 hours of instruction and be done in kindergarten.
Phonemic awareness – because of a data point that wasn’t “mastered.”
“What do you want and need for your child?”
Driver’s Ed. so she can get to and from a job.
Ability to get a job.
Ability to keep a job.
To keep her kind, helpful “I will try anything” attitude.
To continue to grow and learn to be a successful adult in the community.
Math so she can figure out a budget, pay rent, expenses, and be as independent as she wants.
Approximately 270 days of school left for this child. A program of study to complete. A student who has had phonics as a part of her reading goal for 10 years. LETRS trained teachers. Phonics program after program.
- first grade
- second grade
- third grade
- fourth grade
- fifth grade
- sixth grade
- seventh grade
- eighth grade
- ninth grade
- first half of junior year
and now some folks who have never worked with the child and could not pick her out of a classroom believe she should have a goal in phonemic awareness because of a data point.
After 11.5 years of phonics instruction, maybe it’s not the child.
Maybe it’s the crappy timed test and she just doesn’t do well under pressure.
Maybe the nonsense words really offend her sense of meaning.
And it makes me incredibly frustrated.
Don’t tell me the students haven’t had phonics!
Where is your evidence?
(Disclaimer: I understand the frustration of not having student needs met. As a special ed teacher I have taught dyslexic students. So many students needed different approaches and methodology. I have used an array of tools and programs. One example: After parental requests and with the permission of my administrator, I tried “Hooked on Phonics.”)
The current Reading War is based on a bunch of untruths, misrepresentations, and straw man arguments.
Spend the time to check your facts!
If you have not been following along, here are the posts to date: