Posted on February 15, 2013, by All Things PLC Team
By Ainsley B. Rose, PLC at Work™ associate
One of the three big ideas of a professional learning community calls for us to be data driven. A recent blog raised an interesting perspective when it was suggested that we should be more concerned about being data informed. Is this merely a matter of semantics, or should we pay attention to the apparent distinction?
On his informative blog, Larry Ferlazzo (2009) cites Ted Appel, who contends, "If schools are data driven, they might make decisions like keeping students who are borderline between algebra and a higher level of math in algebra so that they do well on the algebra state test. Or, in English, teachers might focus a lot of energy on teaching a 'strand' that is heavy on the tests, even though it might not help the student become a lifelong reader. In other words, the school can tend to focus on its institutional self-interest instead of what's best for the students. In schools that are data informed, test results are just one more piece of information that can be helpful in determining future directions."
Indeed, the data that school-based teams focus on should be derived from common formative assessments. Because assessment is intended to inform teaching, there is some justification for thinking of the data as informing. So, if, as Ted Appel argues, "test results are just one more piece of information," what should teams look for to enhance their conversations so they lead to greater levels of student achievement? Data generated from common formative assessments come from a single point in time. We also need to pay attention to trends and patterns, and those do not always emerge from common assessments, state test scores, benchmark assessments, or standardized assessments alone.
Professor John Hattie writes that teachers should see learning through the eyes of their students, and students should become their own teachers. In both instances, the basis for their respective actions is derived from information about student learning, attitudes, and dispositions, among many other attributes. Numbers or data generated from common assessments and state tests are largely necessary but entirely insufficient for helping teachers and students make decisions about where they are going, how they are going, and where to next in the student's learning (Hattie, 2008). Answering these three questions will surely require more information than can be obtained from assessment data alone. It has to involve conversations with students, teachers, and the other adults who intervene in the learning equation.
The extent to which schools ask questions of their students will lead to a greater depth of understanding of the impact each adult is making. After all, whether we are giving good feedback to students about their learning is best judged by the students' reaction to that feedback. How often do we really get the student's opinion about what works best for him or her? So, as Hattie suggests in his book Visible Learning for Teachers, we need to gather data about student perceptions of what works best for their learning. Data about their levels of engagement, their desire to learn, their need for appropriate feedback, and their perspective on the quality of teaching should all be added to the scores students get on common formative assessments. This additional information will inform teachers not only about test results, but also about students' dispositions to learn.
What is interesting to me is that we could align our data gathering from students with the very questions PLC teams are encouraged to ask themselves:
What is it we expect students to learn?
How will we know when they have learned it?
How will we respond when they do not learn?
How will we respond when they already know it?
Again, the questions Hattie (2012) encourages us to keep in mind for students are: Where am I going? How am I going? Where to next?
In every case, we can gather data not only on student behavior, but also on the adult actions that have caused the student responses. This, I suggest, will give PLC teams more appropriate and complete data from which to proceed in the classroom and in the school.
So, for me, data informed versus data driven is not merely a matter of semantics. I say it is clearly more complicated. We certainly have to be purposeful, attentive, and professional in examining all aspects that contribute to the success, or lack thereof, in student learning and instructional practice. As Hattie contends, teachers and school leaders must constantly examine their impact on student learning and make informed decisions on the basis of a range of evidence. We would be wise to take heed of his advice.
Ferlazzo, L. (2009, August 26). Data-driven versus data-informed. Retrieved from http://larryferlazzo.edublogs.org/2009/08/26/data-driven-versus-data-informed
Hattie, J. (2008). Visible learning: A synthesis of over 800 meta-analyses relating to achievement. New York: Routledge.
Hattie, J. (2012). Visible learning for teachers: Maximizing impact on learning. New York: Routledge.
Meier, D. (2009, March 5). Data informed, not data driven. Retrieved from http://blogs.edweek.org/edweek/Bridging-Differences/2009/03/dear_diane_sometime_i_imagine_1.html