Tag Archives: assessments

Classroom Techniques: Formative Assessment Idea Number Five


We’re always looking for good formative assessment ideas! NWEA (owners of MAP) has a blog with some useful strategies (see link above)… they also have a “SPARK” community blog that helps teachers and administrators who use MAP tests make better use of the data!

No need to reinvent the wheel… and let’s not forget another masterful blog with a plethora of assessment resources, Andrew Churches’ Educational Origami blog.

Feedback… AGAIN! Quick post on formative assessments

I’ve been a bit behind on my ‘ten minutes a day’ favorite blog readings, and in my efforts to catch up I came across a comment by Rick DuFour on the AllThingsPLC blog regarding formative assessments.

Math Teachers!!! This one’s for you:

… Benjamin Bloom’s research in the teaching of math found that teachers get better results when they begin the course with a brief pre-assessment of the skills students must have in order to be successful in the unit they are about to teach. They discover areas where students are lacking those skills, and then, instead of beginning new content, they begin with several days of instruction aimed at the prerequisite skills. They repeat this process for every unit, asking, “Which skills must students have in order to be successful in this unit, and how do I know if they have them?” The process works best when it is done by a collaborative team of teachers and the schedule is designed to have some of them teaching in the same period. They give the pre-assessment, look at the results, and then divide the students between them. One might take the group that needs support in learning the new skills, another works with some students to practice those skills, and another presents practical problems to students who are called upon to apply the skills. After several days of this, the students return to their homeroom teacher and the new unit begins.

One might think that this process would have an adverse impact on student achievement because teachers couldn’t cover as much content. In fact, Bloom found just the opposite. The fact that students had acquired the necessary skills enabled teachers to move through the content more quickly, and the results were dramatically higher. You can read about this in an article Bloom wrote years ago for Phi Delta Kappan magazine called, “The two-sigma effect.”
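A quick aside on the name, since the quote doesn’t unpack it: “two sigma” refers to an effect size, expressed in standard deviations. In Bloom’s comparison, the average student taught one-to-one (with mastery-learning feedback and correctives) scored about two standard deviations above the average of a conventionally taught class:

effect size = (mean of tutored group − mean of conventional group) / standard deviation ≈ 2

Since two standard deviations sits at roughly the 98th percentile of a normal distribution, the average tutored student outperformed about 98% of the comparison class.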

There’s ABSOLUTELY NO DOUBT that formative assessments are powerful teaching/learning tools!

Here’s a related article by Benjamin Bloom

How much TEA for you?

In my previous post I touched on the concept, introduced to our faculty by Jennifer Sparrow during an EARCOS Weekend Workshop, of TEA… Targets, Evidence, Action(s). It’s a great way to organize thinking on what we do as a school. In many respects, it’s a simplified version of creating a results-oriented school advocated by Mike Schmoker, Rick & Becky DuFour, and others. In fact, it simplifies the SMART Goals process originally established by Jan O’Neill and Ann Conzemius.

From Jennifer Sparrow EARCOS Weekend Workshop, Sept. 3-4, 2011

Our job is to focus on student learning and how to improve it. It’s just common sense that we first figure out what we want our students to learn (learning objectives/Performance Standards); gather evidence that indicates how well they are ‘getting it’; and develop a plan of action to ensure they are supported and able to reach those intended learning goals. A simple, direct approach to helping all learners meet their goals. Thanks again, Jennifer Sparrow! Cuppa TEA?

Feedback and Learning

From: http://www.flickr.com/photos/karlhorton/1903050006/

This isn’t a unique insight, yet it strikes me that it must be more than stated over and over; it must be acted upon: learning occurs from feedback. The more timely, the better. Think about any learning you’ve done. It’s a result of feedback. What’s the implication for teaching?

If one ‘teaches’ do others automatically learn? I guess it depends on whether our definition of teaching has a distinct place for feedback.

I just finished facilitating a small faculty focus group with the goal of developing suggestions for principals on teacher observations. What was the number one suggestion? You guessed it… FEEDBACK. Teachers want to learn, too.

That’s encouraging news for me. It means that a school really can be a learning community (you may be excused for assuming that’s what schools automatically are). It’s not automatic; it takes deliberate action. What do teachers want to learn? How to have the maximum positive impact on student learning. This relates to my previous post on acronyms. PLC is an often misused and misunderstood acronym. It’s the understanding behind what a learning community really is that counts most. I’m excited because my school community is on the verge of becoming a very powerful living example of what a learning community can be. If we can stay the course of collaborating in committed teams focused on improving student learning, we will embody the true meaning of a PLC.

Cheating and the Role of Assessments


Used with permission of the artist. See his other cartoons at http://www.mikeshapirocartoons.com

I spoke to my teaching colleagues yesterday morning about the prevalence of cheating and the role assessments play. It seems like an odd pairing, doesn’t it? And yet, what is it that students most often cheat on? Assessments. One of the issues was the question of whether students should have their marked tests and projects handed back to them.

As educators, we recognize the importance of prompt and meaningful feedback. We also often ask students to use old assessments to study for upcoming ‘culminating’ assessments. It’s common sense that once those assessments are handed back, students who haven’t yet taken them can easily acquire them. Does it matter if students do get to see assessments ahead of time?

In some cases it matters… in some it doesn’t. For example, IB exams: it doesn’t matter! We actually ask our students to study past exams. Are our students cheating? Not at all. In fact, IB exams are great examples of assessments that are almost ‘cheat-proof’ (barring the use of extensive hidden notes on the subject). What about less comprehensive assessments designed to measure factual knowledge? How many siblings and friends willingly pass on their old notes and tests to younger siblings and friends? If one of the goals of the course is a certain set of vocabulary, who cares if an (extra-conscientious?) student studies from previous years’ tests? However, is the student meeting the learning goals if he or she is simply memorizing the questions and answers on old tests? Let’s use common sense when we design our assessments!

Stay focused on the learning goals and rework old tests so they’re not carbon copies of previous tests. Like it or not, if students are going to be allowed to take home their marked assessments, we need to make some adjustments year to year. Just as important is making crystal clear what constitutes academic honesty and what counts as cheating. Communicate to students which specific behaviors are unacceptable.

Use good test-proctoring strategies: walk around constantly, be aware of student behavior, and prohibit student-to-student communication. Don’t be afraid to watch students carefully; they will get the message that you take academic honesty seriously. Use plagiarism-detection software like Turnitin.com for word-processed assessment elements.

A well-designed assessment that requires a student to demonstrate his or her competency in synthesizing information and to provide evidence clearly linked to the learning goal can also help minimize opportunities for cheating.