Tuesday is the day to share a “Slice of Life” with Two Writing Teachers. Check out the writers, readers and teachers here.
April Showers have been devastating in many regions of the United States this week. Equally devastating is the April Data Dump that is happening in many schools across the United States. Are you drowning in data?
How many of these pieces of data have you accumulated for each of your students?
- National Assessments
- Local Assessments
- Benchmark Assessments
- STAR Reading
- STAR Math
- Accelerated Reader Tests
- Unit Tests
- Formative Assessments
- Book Logs
- Running Records
Do you have others that are not on the list? Does each piece of data match up and tell the same story, or is there dissonance among conflicting data points, including the student’s work in the classroom during reading or writing workshop? What is the role of data in instruction?
Which assessments REALLY inform your instruction?
What do you change, today, in your instruction as a result of your assessment data?
How do you make a mid-class period correction to ensure every student is learning?
When you have collected data, it needs to be organized and then it needs to be USED to inform instruction. This sounds simple, but additional ideas about data are shared by Brianna Friedman on her blog, “Adventures in Staff Development,” and more specifically in her February 18th post, “What Does the Data Say?” In today’s slice, Jana tells a story from the point of view of teachers reviewing the data in “Data Review – Evaluation or Judgement?”
The number of days left in this school year is finite. If you are counting those days, my hope is that you have set your end goal targets and are counting the days in order to allocate precious, finite resources that will help all students reach the targets. Every minute, hour, and day is an opportunity for student learning!
How are you utilizing data to inform instruction and maximize student learning in order to meet your end of the year goals?
This Tweet from #tcrwp (Teachers College Reading and Writing Project) on August 15th caught my eye. A quick glance at the twitter stream confirmed that it came from Stephanie Harvey’s keynote (sigh of envy across the miles).
Hmmm. . . Harry Potter, The Old Man and the Sea, and Alexander and the Terrible, Horrible, No Good, Very Bad Day are three distinctly different texts that have similar Lexile levels!
Was I interested? Yes!
Did I independently check? Yes!
Those three books are typically read by readers at these levels:
- Alexander and the Terrible, Horrible, No Good, Very Bad Day – primary grades
- Harry Potter – upper elementary grades
- The Old Man and the Sea – high school
Yet all three have similar Lexile levels! Would those texts still be read at those grade levels? Or has that expectation changed with the adoption of the Common Core?
The initial connection to Stephanie Harvey was further confirmed in Twitterverse later:
So what is a Lexile? And just how is a Lexile determined?
The Lexile Framework® for Reading claims to measure a student’s reading ability based on actual assessment, rather than a generalized age or grade level. It uses a common, developmental scale to match a reader with books, articles and other resources at the right level of difficulty. The Lexile Framework was developed by MetaMetrics®, an educational measurement and research organization that purports to use scientific measures of student achievement to link assessment with targeted instruction to improve learning. To date, more than 115,000 books and 80 million articles have Lexile measures, and the number of resources with Lexile measures continues to grow.
HOWEVER, CCSS.R.10 does not use Lexiles alone as a single measure of Text Complexity. ALL CCSS documents include a three-pronged approach to complexity as evidenced by this graphic and explanation:
The Common Core specifically says that there are “three equally important parts.” A Lexile measure does not equal text complexity. There are many ways to determine which texts are appropriate for specific grade levels or bands. Quantitative factors (#2 above) seem to be the easiest to measure. An addendum to Appendix A suggests that two quantitative measures be used for comparison. That would mean that a Lexile AND a grade level equivalent could both be considered for a more general “quantitative measure.” Then qualitative facets, such as theme, structure, and knowledge demands, would be explored. Finally, the reader and task considerations would be reviewed.
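The three-pronged idea above can be sketched in code. This is purely an illustrative sketch: the class, field names, and band numbers are hypothetical, not drawn from the Common Core documents. It models the point that two quantitative measures can agree (per the Appendix A addendum) and yet supply only one of the three equally important parts.

```python
# Hypothetical sketch of the three-pronged text complexity model.
# Names, fields, and band values are illustrative assumptions, not CCSS definitions.

from dataclasses import dataclass

@dataclass
class TextComplexity:
    lexile: int                 # quantitative measure #1
    grade_equivalent: float     # quantitative measure #2 (Appendix A addendum suggests using two)
    qualitative_notes: str      # theme, structure, knowledge demands
    reader_task_notes: str      # who is reading this text, and for what task

    def quantitative_agrees(self, band_lexile, band_grades):
        """Both quantitative measures must fall inside the target band."""
        lo, hi = band_lexile
        g_lo, g_hi = band_grades
        return lo <= self.lexile <= hi and g_lo <= self.grade_equivalent <= g_hi

# Example: a fantasy novel checked against a hypothetical grades 4-8 band.
book = TextComplexity(880, 5.5, "fantasy; long, layered plot", "independent reading, grade 6")
print(book.quantitative_agrees((740, 1010), (4, 8)))  # True -- but still only one of three prongs
```

Even when `quantitative_agrees` returns `True`, the qualitative facets and the reader-and-task considerations still have to be weighed by a teacher; no function call can do that part.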
Additional information about text complexity is easy to locate. Sarah Brown Wessling, a “Teacher of the Year,” shares her viewpoint on text complexity at the Teaching Channel.
Which elements of text complexity are you considering when selecting text?
What examples of “Out of Whack Lexiles” have you found?
* * * * * * * * * * * * * * * * * * * * * * * * * *
Addition/Update (08.17.13):
- The Sun Also Rises by Ernest Hemingway – 610L
- Twilight – 720L
- A Farewell to Arms by Ernest Hemingway – 730L
- Ramona Quimby, Age 8 by Beverly Cleary – 860L
- Diary of a Wimpy Kid – 1000L
- Moby Dick by Herman Melville – 1200L
- The Wee Little Woman, a board book by Byron Barton – 1300L
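The list above can be made concrete with a small sketch. The Lexile numbers come from the list; the “typically read in” labels are my own rough assumptions for illustration. Sorting by Lexile plainly does not reproduce the order in which these books are typically read, which is the whole “out of whack” point.

```python
# Illustrative only: Lexile values are from the list above; the "typical audience"
# labels are rough assumptions, not official designations.

books = [
    ("The Sun Also Rises", 610, "high school"),
    ("Twilight", 720, "middle/high school"),
    ("A Farewell to Arms", 730, "high school"),
    ("Ramona Quimby, Age 8", 860, "primary grades"),
    ("Diary of a Wimpy Kid", 1000, "upper elementary"),
    ("Moby Dick", 1200, "high school"),
    ("The Wee Little Woman", 1300, "board book"),
]

# Sort by Lexile, lowest to highest, and show the mismatch with typical audience.
for title, lexile, typical in sorted(books, key=lambda b: b[1]):
    print(f"{lexile:>5}L  {title:25}  typically read in: {typical}")
```

A board book lands at the top of the Lexile scale while Hemingway sits near the bottom, which is exactly why a single quantitative measure cannot stand in for text complexity.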
**According to Titlewave:
- The Diary of a Wimpy Kid – 950
- Fahrenheit 451 – 890
- Gossip Girl – 820
- The Great Gilly Hopkins – 800
From @AliBuzzell, a new resource on 08.21.13: tweentribune.com/readrank. Thanks, Ali!
From @doctordea, a brief white paper, “The Lexile Framework”: https://connect.ebsco.com/s/article/The-Lexile-Framework-A-MetaMetrics-White-Paper?language=en_US
Donalyn Miller, the Book Whisperer http://blogs.edweek.org/teachers/book_whisperer/2012/07/guess_my_lexile.html?cmp=SOC-SHR-TW