Knowing the Score: Actionable Assessment from Student Writing
Knowing the Score: Actionable Assessment from Student Writing first appeared on Pearson’s Research and Innovation Network on December 19, 2013.
By: Rod Granger
Peter Foltz, a Vice President for Research working in Pearson’s NextGen Learning & Assessment Center, is one of the original developers of automated scoring technologies, and holds a patent on methods for automatically scoring writing based on its content. His research has focused on learning, language comprehension, and uses of artificial intelligence in assessments. The methods he has pioneered improve student achievement, expand student access, and make learning materials more affordable.
Q: Why is this research significant?
A: An enormous amount of data exists about student performance, and I’m interested in turning that information into something actionable, so we can tell when we should give a student feedback and what kind of feedback would be most effective. Up to now, the model has focused on extracting information about students from multiple-choice questions or very short responses, but the way teachers actually find out what students know is by talking to them or having them do much larger exercises, like writing.
The challenge is finding ways to obtain meaningful data from these kinds of performances, and much of my focus has been on automatically extracting information from students’ writing. My approach is to use computational techniques to analyze that writing and determine how students are performing, communicating, and thinking critically.
Q: How specifically does that work?
A: As a psychology professor at New Mexico State University, I developed technology, along with my colleague Tom Landauer, to automate the scoring of writing so that it would be as accurate as a teacher. We needed a way to provide quick feedback, so students would be able to see whether or not they had learned the information. I then started receiving requests for the software, known as IEA, or Intelligent Essay Assessor, and we formed a company, Knowledge Analysis Technologies, which was acquired by Pearson in 2004.
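The interview doesn’t spell out the scoring method, but the general idea behind content-based scoring of the kind Foltz describes is to compare a student essay to previously scored reference essays in a reduced “latent semantic” space. The sketch below is a minimal, illustrative example of that idea using scikit-learn; the essays, scores, and modeling choices are hypothetical assumptions for the example and are not Pearson’s actual IEA implementation.

```python
# Illustrative sketch only -- NOT Pearson's IEA. A generic example of
# content-based essay scoring: project essays into a low-dimensional
# semantic space and compare a new essay to pre-scored reference essays.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical pre-scored reference essays (scores on a 1-4 scale).
reference_essays = [
    "Photosynthesis converts light energy into chemical energy stored in glucose.",
    "Plants use sunlight, water, and carbon dioxide to make sugar and oxygen.",
    "Plants are green and grow in soil.",
]
reference_scores = np.array([4, 4, 2])

student_essay = "Using sunlight, plants turn carbon dioxide and water into glucose."

# Build a term-document matrix and reduce it to a small latent space
# (the same general idea as latent semantic analysis).
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(reference_essays + [student_essay])
vectors = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Compare the student essay to each reference essay, then predict a score
# as a similarity-weighted average of the reference scores.
sims = cosine_similarity(vectors[-1:], vectors[:-1]).ravel()
weights = np.clip(sims, 0, None)
predicted = float(np.dot(weights, reference_scores) / max(weights.sum(), 1e-9))
print(f"similarities: {sims.round(2)}, predicted score: {predicted:.1f}")
```

In a real system the reference set would be large and human-scored, and the model would weigh many more features than content similarity alone; this sketch only shows why quick, automatic feedback on content is feasible at all.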
Since then, I’ve focused on continually improving the technology and adapting it so more students can be placed in formative assessment situations. One of those adaptations looks at students working collaboratively: the same type of technology used to score essays can also be applied to students’ writing in online collaborative chat rooms.
Q: How will students benefit from your research?
A: Across a series of studies, we’ve shown that when students get that instant feedback, then revise their essays and resubmit them, we see gains not only in the quality of their writing but also in their content learning. They’re learning what they don’t know, and they’re going back to the text to learn more. So it really ties the formative assessment activity to the curriculum, right back to their reading.
Q: What would you like educators to take away from your research?
A: We want to make sure they understand that the technology is not intended to replace the teacher, but to serve as a tool for the teacher. That’s really important. This is a way to enable teachers to monitor their students more effectively, and at the same time give students an effective learning experience.
Q: What are the most important questions that still need to be answered in your field?
A: On the technical side, there’s still a lot to do to make the technology better at processing and understanding the language in student work, so we’re constantly working on ways of improving that. We’re also looking at ways to adapt the technology so more students can be placed in formative assessment situations.
Q: What are the most important changes in this field of research over the past five years?
A: Five years ago, the problem was that there wasn’t technology in the classroom to let students write online or talk in online chat rooms. Now tablets and computers are much more readily available to students, which makes it possible to collect data and provide personalized feedback.
Q: What changes in education do you hope to see over the next five years?
A: I hope, and expect, to see much more integrated personalization. Rather than have all students in a class take the same assessment and read the same textbook, we can track students’ interests and abilities and determine which text a student should be reading to maximize his or her vocabulary and knowledge growth in a particular topic.
Q: Why are you passionate about your area of research?
A: Research has been an integral part of my entire professional life, and my greatest joy comes from seeing that students are learning. What I want to continue to do is create more situations that make it easier for students to learn, and for teachers to be able to see that their students are learning on a day-to-day basis.
Because we’re able to assess students automatically, we’re also able to deliver them the right kinds of curriculum at the right times. We know what level students have reached at any point, and I use what I like to call the Goldilocks principle: you want to give them something that’s not too hard, and not too easy, but instead is just right.
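As a toy illustration of that Goldilocks idea, the sketch below picks the reading whose estimated difficulty sits just above a student’s current ability estimate. The difficulty and ability scale, the “stretch” parameter, and the data are all hypothetical assumptions for the example, not an actual Pearson model.

```python
# Illustrative sketch only: a hypothetical "Goldilocks" selector that picks
# the reading closest to a student's ability plus a small stretch, so the
# material is neither too easy nor too hard. The scale is assumed, not real.

def pick_goldilocks_reading(student_ability, readings, stretch=0.5):
    """Return the reading whose estimated difficulty is nearest to the
    student's current ability plus a small stretch factor."""
    target = student_ability + stretch
    return min(readings, key=lambda r: abs(r["difficulty"] - target))

readings = [
    {"title": "Intro to cells", "difficulty": 2.0},
    {"title": "Cell respiration", "difficulty": 3.2},
    {"title": "Mitochondrial genetics", "difficulty": 5.0},
]

print(pick_goldilocks_reading(student_ability=2.8, readings=readings))
# -> {'title': 'Cell respiration', 'difficulty': 3.2}
```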
Pearson is a Getting Smart Advocacy Partner.