Thursday, December 9, 2010

Dissecting the NETP - Part Two, Assessment: Measure What Matters


This is part two of a blog series taking a closer look at the newly released National Education Technology Plan. I outlined my plan for the series in this post.

Part Two: Assessment - Measure What Matters

Section Goal: Our education system at all levels will leverage the power of technology to measure what matters and use assessment data for continuous improvement.

Measure what matters. Such a simple truth, but seemingly so elusive in education these days. Why so elusive? Well... what matters? Ask 100 different teachers and you might get 100 different answers. Therein lies the main problem with addressing assessment from a large-scale, pushed-down-from-the-top model such as the one we're currently a part of here in the U.S.

Nonetheless, the NETP tackles the issue of assessment and how it can be enhanced and customized through the use of technology. There are some good, substantive uses outlined and also a few items that left me scratching my head. Here are my impressions:

What I liked
  • I really like the Obama quote that kicked this section off, taken from an address to the Hispanic Chamber of Commerce in 2009: "I'm calling on our nation's governors and state education chiefs to develop standards and assessments that don't simply measure whether students can fill in a bubble on a test, but whether they possess 21st century skills like problem-solving and critical thinking and entrepreneurship and creativity." This is a noble and powerful statement.
  • Assessment through simulations. On page 28, the plan outlines ideas for how online or computer-based simulations can be used simultaneously as learning and assessment tools. It describes students working through simulations with code embedded in the programming that records the process a student goes through while performing the simulated task. For example, a student is given the task of building a bridge across a ravine. Within the simulation, the student tries different materials, records why they did or did not work, tests the bridge, re-applies materials, and so on. I like this approach, as it is problem-based and allows the student to simulate something with technology that they otherwise would not have an opportunity to work on. (I sketch what that kind of embedded process logging might look like just after this list.)
  • There is a good strategy mentioned on page 29 that uses classroom quiz systems (where students have clickers and respond to questions). Typically, I am against any technology that simply promotes a more efficient way of asking rote, multiple-choice questions. However, it all comes down to how the tool and the questions are used. The example on page 29 describes a teacher who uses the quiz system to pose a question, then asks his students to find others who answered differently and reason through why their answers differ. If there is a "correct" answer, this process allows those who answered incorrectly to collaborate with peers and learn from their mistakes. If there is no correct answer, the strategy promotes discussion, reflection, and persuasion.
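The plan doesn't spell out how that embedded process recording would actually be built, but here's a minimal sketch of the idea in Python, using the bridge-building example. Everything in it (the ProcessLog class, the student ID, the event labels) is hypothetical and made up for illustration; the point is simply that the simulation records the sequence of attempts, not just the final answer.

```python
# Hypothetical sketch of "embedded assessment" in a simulation: the simulation
# quietly logs each step a student takes so the process itself can be reviewed.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List


@dataclass
class SimulationEvent:
    action: str          # e.g. "select_material", "run_test", "student_note"
    detail: str          # e.g. "steel truss", "bridge collapsed under load"
    timestamp: datetime = field(default_factory=datetime.now)


@dataclass
class ProcessLog:
    student_id: str
    events: List[SimulationEvent] = field(default_factory=list)

    def record(self, action: str, detail: str) -> None:
        self.events.append(SimulationEvent(action, detail))

    def summary(self) -> str:
        # What a teacher might review: the sequence of attempts, not just the outcome.
        return "\n".join(
            f"{e.timestamp:%H:%M:%S}  {e.action}: {e.detail}" for e in self.events
        )


# Example: a student iterating on the bridge-building task.
log = ProcessLog(student_id="student_042")
log.record("select_material", "wooden planks")
log.record("run_test", "bridge collapsed under load")
log.record("student_note", "wood too weak for the span; trying steel")
log.record("select_material", "steel truss")
log.record("run_test", "bridge held the required load")
print(log.summary())
```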
What I didn't like
  • On the summary page, I take issue with the underlying assumptions this section makes about teaching and teachers. It states: "Most of the assessment done in schools today is after the fact and designed to indicate only whether students have learned. Little is done to assess students' thinking during learning so we can help them learn better." This statement is CERTAINLY true of the federal/state testing that is pushed down to schools, but it is absolutely false about classroom assessment in general. It's almost as if the inherent assumption here, and in the rest of this section of the document, is that if the state or federal government doesn't force teachers to do formative assessments, they'll never get done. The fact of the matter is that teachers use formative assessments all the time. Just because those assessments aren't logged in some unwieldy data system doesn't mean the work isn't being done, and done well, I might add.
  • There is more talk about using technology to assess: administering formative assessments frequently to take stock of where students are in the learning process. This sounds great in theory, but in practice I fear it means we're going to be forced to spend all of our students' computer time on assessment instead of using technology to collaborate, create, and publish. There is a limited amount of time and a limited number of computers in schools. We cannot increase the use of technology for assessment without decreasing its use for actually working on the skills students need. It is a trade-off, and I can envision this increased use of tech to assess turning out badly in practice.
  • This quote on page 35 struck me: "An important direction for development and implementation of technology-based assessment systems is the design of technology-based tools that can help educators manage the assessment process, analyze data, and take appropriate action." Isn't this just a call for increasingly efficient ways to maintain a broken system? If students continue to be looked at as data sets and all we're doing is making it easier to perpetuate that, are we really transforming anything?
Conclusions
I'm passionate about getting assessment right. And right, to me, is all about student reflection, problem solving, interaction, and growth. It's not about numbers for me. How the federal government approaches assessment has a huge impact on where schools focus their energy, where teachers focus their energy, and where students focus their energy. Narrow the assessment focus to reading and math and guess which two subjects get pushed to the front of instruction (to the detriment of the arts and humanities)? Use assessments built around "correct" answers to simplistic items or ideas and instruction begins to mirror this. With all the pressure of labeling schools (and soon, it seems, teachers) as failing or successful based on test scores, it's no wonder that instruction is forced to shift toward delivering content in the same manner in which it is tested: boring, rote, and straightforward.

But I'm a hopeful kinda guy. I hear things like "assessment 2.0," assessing critical thinking, and persistent digital portfolios of student work, and I have hope. Maybe this plan can help shape assessment so that it truly does "measure what matters"; a lot of the words being used aim in that direction. I'm just looking for more ways to do my part to make those words become classroom realities, and I hope you'll join me!

Next week, we'll look at the next section, Teaching: Prepare and Connect.
