Assessment and Evaluation: sacrificing complexity for granularity

I teach Math in Ontario. We have an “Achievement Chart” (see pages 28-29) which lists four categories of knowledge and skills. When we assess and evaluate student work, we separate student performance into the “TACK” categories: Thinking, Application, Communication, and Knowledge. The Chart includes criteria for each category and descriptors for different Levels of performance.

The curriculum itself is divided into Strands for each course, and these strands describe Overall Expectations and Specific Expectations (essentially the details of the Overalls).

So when evaluating student work, we evaluate Overall Expectations in the context of the four Categories of Knowledge and Skills, and we should have a “balance” between the categories (not equality, necessarily).

The truth is that I’m having some trouble with it. I posted a little while ago that I was struggling with the Thinking category, and that’s still true. But there is another issue that’s more pervasive and possibly more problematic.

Isolating skills

When trying to separate out the different components of student performance, we often ask questions that "highlight" a particular area. Essentially, we write questions that isolate a student's understanding of that area.

That's a fairly mathematical, scientific-sounding thing to do, after all. Control for the other variables, and the effect you see is a result of the variable you're hoping to measure.

For example, we wouldn't ask a student to solve a bunch of systems of equations which only had "nasty" numbers like fractions in them (or other unfairly-maligned number types), because we fear that a student who is terrible with fractions will stumble over them and be unable to demonstrate their ability to solve the system of equations. So we remove the "barrier" of numerical nastiness in order to let the skill we're interested in, solving the system, be the only skill being tested.
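To make the contrast concrete, here's a minimal sketch of the two versions of "the same" question. The coefficients are invented for illustration; the first system is chosen so everything stays in integers, while the second buries identical elimination work under fractions:

```python
import numpy as np

# "Clean" version: integer coefficients chosen so the solution is also integer.
#   x + y = 5
#   x - y = 1
A_clean = np.array([[1.0, 1.0],
                    [1.0, -1.0]])
b_clean = np.array([5.0, 1.0])
sol_clean = np.linalg.solve(A_clean, b_clean)   # x = 3, y = 2

# "Nasty" version: same skill (solving a 2x2 linear system), but now
# fraction arithmetic sits in front of every elimination step.
#   (1/2)x + (1/3)y = 1
#   (1/4)x - (1/5)y = 1
A_nasty = np.array([[1/2, 1/3],
                    [1/4, -1/5]])
b_nasty = np.array([1.0, 1.0])
sol_nasty = np.linalg.solve(A_nasty, b_nasty)   # x = 32/11, y = -15/11

print(sol_clean, sol_nasty)
```

Both questions exercise exactly the same solving procedure; only the second also tests fraction fluency, which is the confound the "isolation" approach is trying to remove.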

This isn’t a great idea

But we do that over and over again, isolating skill after skill in an effort to pinpoint student learning in each area, make a plan for improvement, and report the results. And in the end, students seem to be learning tiny little skills, procedures, and algorithms, which will help them to be successful on our tests without developing the connections between concepts or long-term understanding.

We want to have “authentic, real-world problems” in our teaching so that students can make connections to the real world and (fingers crossed) want to be engaged in the learning. But authentic problems are complex problems, and by julienning our concepts into matchstick-size steps we are sacrificing meaningful learning opportunities.

What if we didn’t have to evaluate?

We’re slicing these concepts so finely because we’re aiming for that granularity. We want to be fair to our students and not penalize their system-solving because of their fraction-failings.

But if there were no marks to attach, would we do the same thing? Would we work so hard at isolating skills, or would we take a broader approach?

My MDM4U class

I'm teaching Data Management right now, and the strand dealing with statistical analysis has a lot of procedural skills listed, followed by a bunch of analysis skills. If I evaluate the students' abilities in summarizing data with a scatter plot and line of best fit, do I then ask them to analyze and interpret the data based on their own plot and line? What if they mess up the plot? Don't I then have to accept an analysis based on their initial errors? Oh wait, I could make them summarize the data, then give them a summary for a different data set and ask them to draw conclusions from that summary! Then they'll all have the same starting point for analysis, and they can't accidentally make the question too easy or too hard!

But I’ve just messed up one of my goals, then: I’ve removed the authenticity and retained the ownership of the task. I haven’t empowered my students if I do it this way, and I’ve possibly sacrificed meaningful complexity. Worse, I’m only doing this because I need to evaluate them. I’d much rather require them to gather, summarize, and analyze data that interest them and then discuss it with them, helping them to learn and grow in that richer context.
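For what it's worth, the "procedure" half of that strand is mechanical enough to sketch in a few lines. Here's a least-squares line of best fit using NumPy's `polyfit`; the hours/score data is invented purely for illustration, not from any actual class:

```python
import numpy as np

# Invented data: hours studied vs. test score (illustration only).
hours = np.array([1, 2, 3, 4, 5, 6], dtype=float)
score = np.array([52, 58, 61, 68, 74, 79], dtype=float)

# Least-squares line of best fit: score ≈ m * hours + b
m, b = np.polyfit(hours, score, deg=1)
print(f"score ≈ {m:.2f} * hours + {b:.2f}")
```

The interesting (and hard-to-evaluate) part is everything this snippet *doesn't* do: deciding whether a linear model is appropriate, and interpreting what the slope and intercept actually mean for the situation.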

As always…

…I don’t have answers. Sorry. I’m trying hard to make the work meaningful and the learning deep while still exposing as much detail about student thinking as I can. I’m sure in the end it’ll be a trade-off.


Improving report card comments with a checklist

It’s report card season in Ontario, and I don’t know too many people who are happy about it.

I don’t love evaluating student performance in general, and the persistent and poisonous focus on MARKS by most stakeholders in student learning is infuriating. Marks are a huge loss of information about student performance, in my rarely-humble opinion. Along with those percentage marks we have a much-less-valued-but-more-valuable evaluation of Learning Skills. My students mostly ignored those, I think.

In truth, the hero of the report card is The Mighty Comment. It has the superpowers of Explanation and Recommendation. It’s here that I can talk about what’s going on, why, and how to improve.

After all, assessment is for improving learning. Reporting a mark of 68% doesn’t do that.

So The Mighty Comment is our hope for the future, the only power that can save our students and their parents from receiving an all-but-useless document.

Let’s do it right.

I’m teaching in a high school, and we have both a provided comment bank and the latitude to write our own comments. The only rules are that we need to follow the guidelines in Growing Success and we have to keep it under 458 characters.

I read an interesting article at rs.io called The Unreasonable Effectiveness of Checklists.

Fireworks blazed across my brain. I need a checklist to make sure I’m doing what I want to do with every comment.

So I made one

The Report Card Comment Checklist (catchy name, eh?) is now live. I also included The Verbose Report Card Comment Checklist immediately after it to help explain what I mean. Please leave comments here on the blog if you can help me to improve it.

I sat with each of my students this term to review their marks, learning skills, and comments before I submitted them to my school admin team. I wanted them to know that I tried to write what I thought and that I cared about their improvement. I articulated their strengths and what I need them to do next. I asked them each to reflect on their comment (most of them needed to be prompted) and to tell me whether they thought it was fair, accurate, etc. One student found a typo (yay!) and two asked me to clarify what I meant. About five students said their comments sounded exactly like them, which makes me proud.

I have to admit that I made the checklist this evening; I may have to edit my comments a bit next week before they’re published.

You should just click the link for the complete version, but here it is anyway:

The Report Card Comment Checklist

Check each student’s report card comment and ask yourself these questions:

Strengths

  • does it include at least one strength?
  • are the strengths related to the course?
  • are the strengths worded positively?
  • do the strengths stand alone?

Next Steps

  • does it include at least one next step?
  • are the next steps related to improvement in the course?
  • if a student reads the next steps, will they know what to do to improve?
  • are the next steps worded positively?
  • do the next steps stand alone?

Language and Tone

  • did I check for spelling, grammar, etc.?
  • did I read it out loud?
  • did I listen for sarcasm and negative feeling in my voice?

The Point

  • will the student feel that I care about their success?
  • will the student “see themselves” in the comment?
  • will the student want to continue to improve?
  • will the parent understand how to help their child improve?


Educational skyscrapers

A Skyscraper in Hong Kong

Photo by Andreas via flickr (CC-BY-SA 2.0)

Think of a large office building that was constructed in the 1950s. It was built with the best technical and artistic understanding of the time, both to please the eye (possibly) and to be functional (primarily).

It's taking up highly valuable real estate in a location where everyone wants to be and build. It's kind of an eyesore now, unless you're exceptionally nostalgic. It's been maintained fairly well, including some paint, new windows, a fresh roof, and so on.

But all around it in that costly area we see that newer, better buildings have sprung up. They are feats of engineering, testaments to materials science, and visions of artistic grandeur. They are massive edifices that reach high upwards, dwarfing the quaint neighbour from a time that is dimly, though fondly, remembered.

We have two choices.

We can continue to give that tiny building a face-lift, replacing the carpet or changing the fixtures. Or we can rebuild, keeping the parts that make it a functional building while applying our new understanding and skills to an improved design.

That new building will look and feel dramatically different. There will be more space, more options, more possibilities for its use. It will still be a building, and it will still have the same basic purpose, but it won’t be a barrier to progress the way the aging structure is.

Most important, perhaps, are the windows: the new building will be much more transparent.

This is education.

We currently have that old building in Ontario. Let’s stop replacing broken tiles and repainting the paneling, and let’s talk about how to rebuild.

Different kinds of Thinking: Ontario Math Achievement Chart

I’m evaluating some student work today and I’m struggling with the Achievement Chart for Mathematics (see page 28). In particular, this part of the Thinking category is bothering me:

An excerpt from the math achievement chart for Ontario

Take a look at the first point in “Use of planning skills”, called “understanding the problem”, which includes “formulating and interpreting the problem” as an example of that skill.

Now look at “Use of processing skills” point “carrying out a plan”, which includes “modelling” as an example of that skill.

Are these different? In my mind (up until now, at least), “formulating and interpreting the problem” has meant representing a situation mathematically so that we can apply our other math skills to solving it. Isn’t “modelling” in the context of “carrying out the plan” sort of the same thing? Representing components of the problem mathematically? Is the difference just when it happens (i.e. formulating/interpreting is initial planning, and modelling is during the act of solving)?

I’m not trying to be pedantic here; I’m having trouble distinguishing between the different components of Thinking when I’m trying to assess and evaluate my students’ work. I could use some external thinking on this issue (and math evaluation in general, I suppose).

Please comment; I’d love to talk to you if you have ideas about this stuff.

When are different devices most useful in K-12 education? (Survey to complete!)

An image of a young person cuddling with a pile of electronic devices.

Photo by Jeremy Keith, CC-BY-2.0 via flickr

I’m having lots of conversations right now with teachers across Ontario about what kinds of devices should be in schools. More and more we’re agreeing that there should be a mixture of devices, for a variety of reasons. We’re working on a document to articulate some of those thoughts.

However, when a board or school is trying to purchase technology, it is often trying to meet the bulk of student needs, not necessarily to provide a device for every possible use case. So, if a school can have iPads or laptops, which should they choose? The answers aren't simple or clear, and they always involve the phrase "it depends on…".

So I’ve made a short survey that I’d love for you to complete if you’re a teacher in K-12. It doesn’t take long. Just let me know what your thoughts and experiences are with different types of devices. If you need to take the survey multiple times, that’s fine (let’s say because you have experience in elementary and secondary, or your math and music classes are very different).

Here’s the link. Please share widely. And thanks!

http://bit.ly/K12-devices

“Being an independent practitioner is inconsistent with professional practice.”


Don’t go it alone. Image from frenchbyte via MorgueFile

The title quote is from Catherine Montreuil, Director of Education for Bruce-Grey Catholic District School Board. She said this during her keynote presentation at On The Rise K-12: Enhancing Digital Learning on April 2, 2014.

This has really stayed with me. I’ve thought before about the moral imperative I believe teachers have to use technology in their teaching, and to be a reflective practitioner. I’ve always thought it a basic requirement to keep up-to-date with our best thinking around instructional strategies and assessment approaches.

But I'm not sure I've ever really thought about it quite the way she put it: that it's actually unprofessional to be disconnected.

I believe you can connect in any way you like. Connecting with others in your school is a good first step, but the insular nature of schools can prevent you from seeing the possibilities for learning. Technology has made it tremendously easy to connect, build relationships, and learn from others who think a little differently because they don’t have the challenges/restrictions/history/blindspots that any group has. My preferred platforms are Twitter and WordPress, but there are many ways to share and to question. Create your Professional Learning Network (PLN).

The opportunity is there for all of us. You can choose how deep you want to go, but I don’t think you can in good conscience choose to ignore it completely. Learn from and with others, because no one has all the answers.

Upcoming #OTRK12 Session Highlights – Part 3

Another chance to hear about the great opportunities at On The Rise K-12: Enhancing Digital Learning on April 1 and 2, 2014 in Mississauga, Ontario. You can read all the details at http://otrk12.ca. If you want to attend, you can register there (the cost is $100 per person per day).

The awesomeness continues

I’m sure you’ve already read through Part 1 and Part 2 of this series. Here are a few more beautiful learning opportunities.

Note: If you have already registered and wish to change your session choices, just send me an email with the new session code(s).

Tuesday

Session Block 1

D1S08: Leaping into Literacy Test Preparation with D2L

This session will look at the TVDSB OSSLT preparation strategy using D2L. We will demonstrate what the course actually looks like, the data that can be collected when utilized with students (individual/group, skill/competency) and how it is being put to use in a variety of different ways in our schools to suit their unique needs.
Intended Audience: Secondary Teachers, eLCs/DeLCs, Administration
Experience Level: Intermediate-Advanced
Presenter(s): Shereen Miller and Carrie Huffman

Session Block 3

D1S22: Blended Learning Meets Science & Technology (Elementary Focus)

Join us for a look at Ministry provided Blended Learning resources specifically for K-8 Science & Technology, including the provincial Virtual Learning Environment (vLE). Did you know that Ontario teachers have access to blended Science and Technology packages, carousels of OERB objects, as well as tools such as ePortfolio, News, and Calendar? Did you know that Blended Learning provides flexible and engaging ways to help students demonstrate their learning and focus on activities that highlight communication, collaboration and differentiation? Come and find out more about how you can make Blended Learning part of your students’ learning experience.
Intended Audience: Elementary Teachers, eLCs/DeLCs, Administration
Experience Level: Any Level
Presenter(s): Sharon Korpan

Wednesday

Session Block 1

D2S08: Building a diverse, digital learning ecosystem

Just as a carpenter wouldn't get very far with just a hammer in his toolbox, so too the e-teacher wouldn't get very far using only a single digital tool. This presentation will look into various online tools for replacing classroom techniques, as well as unique digital tools that offer opportunities that can't be found in a f2f learning environment.
Intended Audience: All teachers
Experience Level: Beginner-Intermediate
Presenter(s): Tim King

Session Block 2

D2S13: Inspiring Technology Training

This session will cover lessons from voluntary and incentivized tech training. Examples include lunch-and-learns, after-school professional learning "tech" groups, and "earn a laptop" style sessions. The session format will be a short overview of what other boards have done, followed by discussion and questions from attendees on how to encourage this type of learning in their own boards. Attendees are encouraged to share their own stories and questions about expanding PD opportunities in their own boards.
Intended Audience: eLCs/DeLCs, Administration
Experience Level: Any Level
Presenter(s): Gino Russo, Corrine Pritoula