# Which kind of calculator promotes good algebraic thinking?

I teach high school math. Students bring scientific calculators to class, or they sometimes have to borrow one from me. I have two types available: immediate execution calculators and formula calculators. I’ve been wondering lately whether one type of calculator is better for learning algebra than the other.

Here’s how they work (see Wikipedia for a longer explanation: https://en.wikipedia.org/wiki/Calculator_input_methods).

# Immediate Execution

The TI-36X Solar 2004 version immediate execution calculator.

These calculators work by performing calculations along the way as you type in values and operations. For example, you can evaluate the expression

$\sin(3 \times 45)$

by typing 3, multiply, 45, equals, then the sine key. As you press operands and operations, the calculator evaluates what it can according to the rules of order of operations, or BEDMAS. For binary operations (those taking two values to produce a result, like multiplication), you enter the operands in order with the operator between them. For unary operations (those taking just one value, like squaring or taking a sine), the value must be present on the calculator screen when you press the operator key; the operation is applied immediately to the displayed value. These calculators usually have a bracketing feature that lets the user work through complex expressions without resorting to memory storage.
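A minimal sketch of how an immediate-execution calculator might process keystrokes (the class and method names here are my own invention, not any real calculator's firmware): a binary operator key triggers evaluation of any pending operations of equal or higher precedence, while a unary key acts on the displayed value right away.

```python
# Precedence: higher binds tighter, following BEDMAS
PREC = {"+": 1, "-": 1, "*": 2, "/": 2, "^": 3}

class ImmediateCalc:
    """A simplified model of an immediate-execution calculator."""

    def __init__(self):
        self.values = []    # pending operands
        self.ops = []       # pending binary operators
        self.display = 0.0  # what the screen currently shows

    def number(self, x):
        """Enter a number (treated as one keypress in this model)."""
        self.display = float(x)

    def _reduce(self, min_prec):
        # Evaluate pending operators of precedence >= min_prec,
        # left to right, updating the display as a real unit would.
        while self.ops and PREC[self.ops[-1]] >= min_prec:
            op = self.ops.pop()
            a, b = self.values.pop(), self.display
            self.display = {"+": a + b, "-": a - b, "*": a * b,
                            "/": a / b, "^": a ** b}[op]

    def binary(self, op):
        """Press a binary operator key."""
        self._reduce(PREC[op])
        self.values.append(self.display)
        self.ops.append(op)

    def unary(self, fn):
        """Press a unary key (square, sin, ...): it applies
        immediately to whatever is on the display."""
        self.display = fn(self.display)

    def equals(self):
        """Press =: evaluate everything still pending."""
        self._reduce(1)
        return self.display
```

Keying 5, add, 6, square, equals in this model gives 41, because the square acts on the 6 the moment it is pressed, exactly the behaviour described later in this post.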

# Formula

The Sharp EL-510R formula calculator.

These calculators work by waiting until the user has typed in a complete expression to evaluate, then evaluating the entire expression. The order of button-pushing is pretty much as the symbols are written in the expression, making them easier to use for a lot of folks. Once a value is calculated, it’s stored in an “answer” variable in case it’s needed for the next evaluation.
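The formula style can be sketched the same way (again, my own hypothetical class, not the Sharp's actual firmware). The whole expression is collected first, then evaluated in one step when "=" is pressed; Python's own expression evaluator already follows BEDMAS, so `eval` stands in for the calculator's parser, and an `Ans` variable carries the previous result forward.

```python
import math

class FormulaCalc:
    """A simplified model of a formula-entry calculator."""

    def __init__(self):
        self.ans = 0.0  # the "answer" variable

    def evaluate(self, expression):
        """Evaluate a complete expression, as if '=' were pressed."""
        # sin works in degrees here, matching typical calculator defaults
        env = {"sin": lambda d: math.sin(math.radians(d)),
               "Ans": self.ans,
               "__builtins__": {}}
        self.ans = eval(expression, env)
        return self.ans
```

With this model, `calc.evaluate("sin(3 * 45)")` returns sin(135°) in a single step, and a follow-up like `calc.evaluate("Ans * 2")` reuses the stored answer, with none of the intermediate evaluations ever shown.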

# Algebraic Expressions and BEDMAS

When we write out algebraic expressions, we have a number of conventions to follow. The most important convention is order of operations, which people usually learn to remember with the mnemonic BEDMAS or PEMDAS:

• Brackets (Parentheses)
• Exponents
• Division and Multiplication
• Addition and Subtraction

When evaluating (simplifying) an expression, you first simplify the smaller expressions inside brackets. Then you evaluate exponents, then division and multiplication in the order they appear, and finally addition and subtraction in the order they appear. It’s useful to think of brackets as isolating sub-expressions, which then follow the same rules. It’s also useful to think of this order as the “strength” of the operation: multiplication is a stronger operation than addition, so it holds its operands more tightly together, and it gets evaluated first.
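If you have Python handy, it follows the same convention and makes a quick way to check an order-of-operations claim (`**` is Python's exponent operator):

```python
# Exponents before multiplication before addition:
assert 2 + 3 * 4 ** 2 == 50      # 4**2 = 16, then 3*16 = 48, then 2+48 = 50

# Brackets isolate a sub-expression and override the default order:
assert (2 + 3) * 4 ** 2 == 80    # (2+3) = 5, then 5*16 = 80

print("BEDMAS checks pass")
```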

When a student is learning order of operations, it often feels like a set of arcane rules. There is no reason, from the student perspective, that it has to be this way. In fact, it didn’t really need to be this way, but the convention was established and now it’s important to abide by it (if you want to be understood, that is).

# How a calculator helps (and hinders) learning arithmetic

People often lament that today’s youth can’t perform basic arithmetic in their head. It’s unfortunately true; I often see students reach for their calculator to evaluate $35 \div 5$ or even $4 \times 6$. These are facts which prior generations had drilled relentlessly and now have available as “instant” knowledge. Younger people typically haven’t spent enough time practising these computations to develop facility with them. This is partly because the calculator is so readily available.

(Aside for parents: If you have kids, please do make them practise their age-appropriate facts. It’ll help them in the same way that practising reading makes reading easier.)

This will draw a lot of heat, I’m sure, but I think calculators do have a strong place in even K-6 learning. They let students explore quickly without the burden of computation getting in the way of non-computational learning. It’s the same effect that web-based, dynamic geometry software can have on learning relationships between figures, lines, etc. (if you’re looking for awesome dynamic geometry software, try GeoGebra – free and wonderful).

But calculators are a hindrance when students are learning to compute fluently. They allow a student to bypass some of the thinking part of the exercise. Don’t let students (or your kids) use a calculator when they don’t have to. Only use them when students need the speed for the task they’re completing.

# How a calculator helps (and hinders) learning algebra (?)

I think immediate execution calculators require students to understand the algebraic expressions we write, whereas formula calculators bypass the thinking part of evaluating expressions.

As with arithmetic, if practising evaluating expressions is not part of the learning, and might be getting in the way of the goals for learning, then either type of calculator is fine.

But as students are developing their understanding of algebra and the order of operations, the immediate execution calculator displays the results of operations as they are evaluated, while the formula calculator obscures the evaluations in favour of a single result.

When a student types 5, add, 6, square, equals into an immediate execution calculator, they see the value 36 as soon as they press the square button. There is a reminder that the square operation is immediate. Similarly when a student wants to evaluate $\sin(30+45)$ they must type 30, add, 45, equals*, then sine, emphasizing that the bracketed portion has to be evaluated first (i.e. before the sine function is applied).

*A student can use brackets, which is equivalent to pressing equals before sine. Also, I hope anyone using the sine function knows that 30+45 is 75 and doesn’t need a calculator’s help for the addition.

# Is there research?

I perused the InterTubes to find research into this question, but either it’s not out there or I’m not skilled enough to find it.

I want to know whether one calculator is better than the other for a student who is learning to evaluate expressions.

Has no one looked into this? Help?

# How To Become An EdTech Leader

by Noël Zia Lee at flickr.com, CC-BY 2.0 licence.

I hosted a session at On The Rise this year. I’ve posted my slides as a PDF, but I knew from the start that a 60-minute session would be too short for the topic. Here is the previously-mentioned, obscenely long, supplementary blog post.

# Introduction

Being a leader in educational technology does not mean becoming technically skilled. It doesn’t mean you can write code, recover crashed hard drives, or configure a router. You don’t even have to know what those phrases mean.

Being an EdTech leader means that you have relationships with others, and that you share with them using technology.

# Goals

In order to lead effectively, you need some goals. Here are a few generic examples of goal types:

• Develop skills and knowledge to improve yourself and your work
• Develop skills and knowledge in your team
• Foster collaboration in your team
• Give to the larger community (beyond your team)
• Develop personal and professional relationships
• Share resources

For example, you might have a goal of learning how to use audio recordings for assessment as learning in a language class. You might have a goal of connecting your teachers to teachers in other school boards. You might want to develop closer professional relationships within your department. You might want to collect and curate resources to support newer teachers.

All of these are great goals, but make sure you follow one simple rule:

## Have goals for yourself and goals for others

One or the other type isn’t enough if you’re trying to consciously lead. If you only have goals for yourself, you have no reason to share and support others. If you only have goals for others, you’re just trying to “improve” them without being honest about your own needs. Have both.

# Kinds of Communities

There are a lot of ways to categorize communities, but the “publicness” of a community is fundamental.

## Public vs. Private

This is really a continuum.

At one extreme end of the continuum we have completely public communities, which anyone can observe and in which anyone can participate. For example, Twitter is generally* public and globally available. What you say there is readable by anyone, even those who don’t have user accounts with the service.

(*I’m not going to put asterisks all through this post, but be aware most of these statements can be modified by user settings. For example, on Twitter you can protect your tweets so that only approved users can read them.)

At the other extreme we have completely private communities, which are visible only to the “invited” few. For example, you may have a Facebook group that only approved participants can join. The rest of the world isn’t allowed inside.

What’s best depends on who wants to participate, what their level of comfort is, and what everyone’s talking about.

You might have a private community when you need to talk about something sensitive or confidential, or when the participants are worried about making very public mistakes (particularly if this sort of community is new to them). You might protect the conversation when you need to prevent self-censorship in order to have honest dialogue.

You might have a public community when your local community (e.g. the people you work with) is a small one, and you want outside voices. It’s good to be public with universal issues, like assessment or writing.

You might partially protect the conversation by making it “read-only” to the uninvited. For example, perhaps you share the work you’re doing with your department members on a departmental blog/wiki/etc. The rest of the world can view your resources, but only your department members can update the work or comment on it.

## Constructed vs. Organic

Some communities are organized and constructed. For example, you might set up a discussion group about instructional strategies, or you might create a sharing folder for rubrics. The purposes of those communities or activities are clear, and so they’re constructed.

Instead you might just set up a space for conversation to happen. Twitter is my favourite place for this. The topic isn’t defined in advance, so we can talk about anything we want to. The connectedness of the participants is what matters, not the quality of the prompt. Organic communities tend to be participant-directed and very welcoming of tangential thinking.

Halfway between these is the ConstrOrganic community (yes, I just made that up. I’m sure it’ll catch on). This is a community of people which doesn’t have a tight restriction on the conversation, but does sometimes provide prompts. For example, you might ask an open-ended or reflective question on Twitter: “How does your experience with technology in your personal life affect your use of technology in the classroom?” The question itself is posed in an otherwise organic community, but you can try to mould the conversation for a while. In my experience we don’t stay “on-topic” for very long, but that’s fine: the talk goes where it needs to, not where I aim it.

## Required vs. Optional

This is one of the hardest to deal with, and it very much depends on (a) who you are, (b) what your role is in your organization, (c) who you are leading or hoping to lead, and (d) what your goals are for the people you lead.

If you’re a principal of a school and you want all the teachers you work with to reflect on their assessment practices in an online space, you might be considering requiring a writing activity in a private, online space. When you imagine how that will play out, you might be concerned that some folks might not participate, or that the participation might not be as deep and reflective as you want.

Rule of thumb: don’t require participation (at least at the beginning) if it’s not anonymous (and therefore safe). People need to trust you in order to follow you. If you don’t already have the level of trust that makes an optional task work well, then you don’t have the level of trust that makes a required task work well with names attached. By removing the names, you’ll remove a good portion of the (legitimate) fear associated with putting thoughts out there.

For example, you can create a shared online document (like a Google Doc) and make it editable to anyone who has the link. Participants can modify the document without identifying themselves, which makes it a lot more likely to be honest and complete.

## Instant? Persistent?

Your interactions within your community can be synchronous (instant), like a tweet, or asynchronous, like a blog post and a comment. This is often a tradeoff between speed (synchronous) and depth (asynchronous).

Online conversations are usually persistent (they stay there forever), but they may not be easy to return to or make quick sense of later. Sometimes conversations are temporary, like a back-and-forth on TodaysMeet.com or a Google Hangout.

I wonder if having persistent, asynchronous conversations creates a thoughtful-but-cautious environment, possibly erecting a barrier of self-censorship. Is it true that instant, casual, organic conversations are more honest and allow for experimental thinking?

## Email is not an effective community

It could be, I suppose, but don’t rely on it. Mass email isn’t personal, interactive, or persistent (for most people), all qualities a community benefits from. It’s typically one-way communication, and recipients can’t really opt in or opt out. Other types of services will work better for you.

# Possible Roles

In your participation in any community in which work is being done, you usually take on one of four roles. You’ll move between them freely and frequently once you’re a solid member of a functional community.

## Quester

I have a question or problem, and I’m looking for an answer or solution.

For example, I post to Twitter, “Anyone have a good summative task for the quadratics section in MCF3M?” That’s a specific quest, and I’m the Quester. Anyone else in the community can answer, and anyone else can benefit from the answer.

I am trying something new, and I’m going to share my journey.

For example, I decided to try some physical demonstrations in my classroom for quadratic motion. I wrote a blog post explaining what I had come up with, shared some video from class, and reflected on how effective it was. I wasn’t an expert, but I shared what I found out (even if it turned out to be wrong).

## Neophyte

I am learning something new from you, and I may ask questions.

This is great when there is a source of wisdom you can tap into. For example, I can read all about how to use Screencast-o-matic to improve an e-Learning course by watching someone’s videos or reading their tutorials. If there is something I don’t understand, I can ask questions. The answers help me, and both the answers and the questions help others (including that expert).

I have some special skills or knowledge, and I’ll demonstrate or share.

For example, I post instructional videos about how to program a computer using the Java programming language. I’m sharing some niche knowledge that I have, and I invite conversation and questions about it. I’m not looking for anything specific, but that knowledge does very little good bottled up in my own skull.

# Choosing a Platform

There are three major considerations:

1. Does the platform have the level of privacy that I want or need? This is a dealbreaker if it doesn’t. Also consider the granularity of privacy settings, because you might want to “reduce” them later (e.g. become somewhat more public).
2. Does the platform have the functions I want or need? Think about formats, ease of use, technical support, exportability (if I leave, can I bring my stuff with me?), and cost.
3. Will/does the community use the platform? If a platform is popular, the community might already exist or be easier to create. No one wants another password to remember.

## Some possibilities

Lots of platforms serve multiple purposes. YouTube is for video, but it includes commenting. WordPress is for blogging, but it can serve as a fully functional website. Facebook is a social network, but it has private community pages. Here are a few loose categories and some popular services:

• Blogging (WordPress, Blogger, Medium)
• Curating (Pinterest, Scoop.it)

# Some Challenges and Cautions

Here are some other considerations when you’re adventuring online.

## Be careful what you say

Think about maintaining loyalty to your employer, respecting copyright and other licences, and protecting student identity and information. There are some things that you simply can’t say in public.

## Who’s listening?

You might draw unwanted attention, even if what you say is “allowed” and isn’t “wrong”. For example, what will you do if a parent has a concern about the conversation between two teachers revealing a lack of professional understanding? Also consider students, other schools, and community members.

## What is privacy, really?

When you post something in a private or protected space, you’re trusting the other people in that space to maintain the privacy of your thoughts. Before you post something, consider what might happen if it were “leaked”.

## Will this be personal, professional, both?

Each has advantages and disadvantages, and your decision will depend on your goals.

## No takers?

What if your team doesn’t follow you? What if they want to do something else? What if your team is already doing something different? What if your team is afraid?

You’ll have to work through the reasons for your particular situation, and talk with your team. There might be nothing you can do, except for continuing to share and model good practice.

## What if my preferred platform is filtered by my organization?

Is there a good reason for the filtering? Are the people who make those decisions aware of what you’re trying to accomplish? Are they supportive? Have you talked with them about it (really talked, not just made a request by email)?

Sometimes the decision makers have parameters that you’re unaware of, and sometimes you have insight they are unaware of. Talk to each other. In the best situations, neither party thinks they have all the answers.

# Final thoughts

At On The Rise: K-12 in 2014 Catherine Montreuil (then of Bruce-Grey Catholic DSB) said, “Private practice is inconsistent with professional practice.”

Being connected makes someone a leader, and being open and transparent are the best ways to get connected. You don’t need to be an expert, articulate, or tech-savvy.

You just need to be willing to share.

# Inconsistency in Evaluation Practices

I’ve been having some great conversations with teachers in my school about final evaluations in high school courses (i.e. exams and final culminating tasks). I see a desperate need for the discussion, so I’m hoping this might be a place for some of it. To that end, I’m sharing some of the points people have been making.

# First, some context

When two or more teachers in a school have sections of the same course, they’re encouraged to collaborate throughout the courses and are required to have consistency in the way their final 30% is evaluated. For example, if one teacher has a large culminating task for the entire 30%, another teacher of the same course shouldn’t have a 30% formal “test” exam.

This is true in lots of schools all over Ontario. It’s not a provincial policy, but it’s a very common board/school/departmental policy.

# Thoughts I’ve had and heard

These are some of the points I’ve heard about this approach in no particular order. I’ll use the term “exam” to refer to any tool that is used for the final 30% component of a student’s grade, whether it’s a test, assignment, presentation, research paper, performance, etc.

• If you have a formal exam and I have a task, students won’t get consistent marks, which matters for post-secondary entrance/scholarships.
• How is it different from one teacher being a “hard marker” and the other teacher being an “easy marker”? Isn’t that a bigger problem?
• If two siblings are evaluated differently, parents and siblings will all be upset that it’s not equal.
• Two teachers in different schools/boards don’t have to align their exams; why is it required within a school?
• You’re more likely to have a consistent mark distribution if you use the same exams.
• Teachers should have autonomy and be permitted professional judgement as long as they’re following curriculum, Growing Success, and other policies.
• Students need to write formal exams to prepare for university, so there shouldn’t be other forms of exams in grade 12, especially for U courses.
• The exam is only worth 30%. The 70% term work is more valuable, but the policy doesn’t apply to it.
• If you say my exam is easier than another teacher’s exam, you’re implying that one of us is inaccurately evaluating student understanding and performance.
• There is no standard for the “amount of work” a student has to do for an exam.
• We should have provincially standardized exams for senior courses for consistency and equity.
• An open-book exam is easier than a closed-book exam.
• An open-book exam is harder than a closed-book exam.
• Some students need accommodations because of learning disabilities. Is it okay to give a different form of the exam for those students? Can’t other students access the same accommodations, since they aren’t modifications?
• If school administration would approve of both exams on their own, then two teachers should be able to have different exams at the same time.
• Not all forms of evidence of student learning are equally valid or accurate.
• If I come to a school for semester 2, why am I restricted by what a semester 1 teacher chose to do in their class?

# What do you think?

Post some comments. Let’s work on this together.

# Poor research in education bothers me

When I look at educational research about instructional strategies, I’m concerned with how often the researchers ignore important controls. They confound their data and then draw invalid conclusions.

I just read some research in which students were taught the same math content but using two different approaches:

• Group A was taught “traditionally”, which included teacher-led, direct instruction;
• Group B was taught with a student-directed approach and a specific context.

After reading the descriptions of the two groups, what conclusions could you draw if one group outperformed the other group?

# Unfortunately…

…the researchers concluded that the context they used was important for student learning. They admitted that the specific context required a very different instructional approach, but they attributed the achievement differences to the “theme” of the task.

# Confound it

That’s really not good enough. You can’t have two major differences between groups and then point to either difference as the cause. In fact, you can’t conclude much of anything from data like this.

Fail.

# Do it right

Only change one thing at a time. If you need to change more, add a third group (here, a group with a student-directed approach but without the contextual restriction). Then you’ll be able to tell whether the difference is due to the approach alone or to the approach combined with the context.

# We have to stop pretending… #MakeSchoolDifferent

I’m responding to Sue Dunlop’s challenge (which is the result of a series of challenges stretching back to Scott McLeod). I’ve only read a few of the other posts that this challenge has generated, so I apologize to anyone who already expressed these same thoughts.

1. We have to stop pretending that it’s okay to complain about someone else instead of offering them support.
2. We have to stop pretending that telling people to learn how to cope is an effective strategy for dealing with mental health challenges.
3. We have to stop pretending that evaluation can be both objective and accurate when implemented by a single human.
4. We have to stop pretending it’s acceptable and reasonable for reporting periods to dictate the pace of learning in our classrooms.
5. We have to stop pretending that there is a single, correct solution to any one of these complex problems.
6. We have to stop pretending that we can do this on our own.

Oops, that’s 6. Ah well.

The tagged? David Jaremy, Peter Anello, Tim Robinson, Eva Thompson, and Doug Peterson. Additional apologies if you’ve already been tagged.

# How I Use Twitter Professionally – Version 4

Two and a half years ago I wrote How I Use Twitter Professionally, then revised it with How I Use Twitter Professionally – Updated! and How I Use Twitter Professionally – Updated Again!

I guess we’ll make it an annual thing:

## My tweets are public.

I’m trying to encourage conversation and collaboration, so my tweets are globally accessible. This also means I don’t make statements I wouldn’t be comfortable with anyone reading – my family, my students, my employer….

## I don’t follow a lot of people.

I currently follow 370 people, of whom about 250 are actively tweeting (let’s say at least weekly). Some of these aren’t related to education; for example, I follow The LEGO Group (@LEGO_Group) and authors John Scalzi (@scalzi) and Marko Kloos (@markokloos – he has a new book out today!).

I can’t read all of the stuff they tweet. I’m relying on my tweeps to retweet the really good stuff so I have a better chance of seeing it, or to mention me if it’s something they think I ought to notice.

## I accept anyone as a follower, pretty much.

Except for a few obvious accounts, I let anyone follow me. Since my tweets are public, anyone can read them (even without a Twitter account), so letting people follow me doesn’t reveal anything extra. Plus, it’s easier when you don’t have to approve people.

## I don’t follow back as a courtesy.

Before I decide to follow someone, I take a look at their tweet history. Is their stream of tweets going to enhance my experience? Will I learn from them? Or will I only learn what they had for breakfast?

I’m a fan of some personal stuff on Twitter, but if you post 300 times a day just to talk without conversing, I don’t need to see it. It’s not about you, it’s just that your use of Twitter doesn’t fit with mine. I think your lifestyle on Twitter should be like the Law of Two Feet: if it’s not working for you, move on.

## I don’t accept Direct Messages (DMs) from people I don’t follow.

This cuts down on the spam. Now it’s just mentions, and there aren’t too many of those. This is a good idea for anyone, so I thought I’d list it here.

I also don’t follow people who I don’t want to DM me. That especially includes students. I have my school email for that kind of communication.

## I follow hashtags for a while.

I follow #OTRK12 (our annual conference in Mississauga – this week!) and #elADSB (for my Board’s e-Learning teachers). I don’t follow the very busy tags, although I sometimes apply them to my posts (#D2L, #onted, #blendedlearning, #edtech).

I try to follow the people in Northern Ontario. We face many of the same issues, and perhaps we have solutions to help each other. I like that idea.

## I don’t cross post to Facebook anymore.

I tweet too much. No one on Facebook wants to read all of that stuff. The handful of FB friends who do are also Twitter users and teachers, so they just go to Twitter to find me. When I write blog posts WordPress will publicize them on Facebook, Twitter, and Google Plus, and I’m certain that’s plenty for the FB crowd.

## I use Tweetdeck; it rocks.

Chrome has TweetDeck as an app; I like that I can have columns for a variety of things I want to look at. Currently I have my Twitter timeline, my Twitter Interactions, my Twitter Messages (DMs), and columns for a bunch of hashtags and lists I follow.

## I say things for myself, and I say things for others.

I tweet things that I want to remember or revisit (great for “note-taking” at a session/workshop/conference). I also tweet things to inform others or start conversations. My tweets (of links and such) aren’t endorsements, but since people sometimes view them that way I try not to share stuff that I’m not at least familiar with.

## I talk a lot, but not too much

I try to ask questions and help out when others ask questions. I’m proud to say I am included as an honorary member of the SGDSB educators list because I help out the teachers up there, so I think my contributions are valued.

More importantly, I’m developing relationships with these distant folks, and the growth of my PLN has helped me out in my work as well. It was very exciting last year at OTRK12 to meet people whom I knew only through Twitter, and it was surprising how natural the face-to-face interactions felt. We were already friends. So thanks, tweeps.

## If you want to follow me…

I’m @bgrasley. No pressure, of course. Use Twitter however it works best for you, and don’t be upset if other people use it differently!

# Assessment and Evaluation: sacrificing complexity for granularity

I teach Math in Ontario. We have an “Achievement Chart” (see pages 28-29) which lists four categories of knowledge and skills. When we assess and evaluate student work, we separate student performance into the “TACK” categories: Thinking, Application, Communication, and Knowledge. The Chart includes criteria for each category and descriptors for different Levels of performance.

The curriculum itself is divided into Strands for each course, and these strands describe Overall Expectations and Specific Expectations (essentially the details of the Overalls).

So when evaluating student work, we evaluate Overall Expectations in the context of the four Categories of Knowledge and Skills, and we should have a “balance” between the categories (not equality, necessarily).

The truth is that I’m having some trouble with it. I posted a little while ago that I was struggling with the Thinking category, and that’s still true. But there is another issue that’s more pervasive and possibly more problematic.

# Isolating skills

When trying to separate out the different components of student performance, we often ask questions that “highlight” a particular area. Essentially, we write questions that isolate a student’s understanding of that area.

That’s a fairly mathematical, scientific-sounding thing to do, after all. Control for the other variables, and then the effect you see is a result of the variable you’re hoping to measure.

For example, we wouldn’t ask a student to solve a bunch of systems of equations which only had “nasty” numbers like fractions in them (or other unfairly maligned number types), because we fear that a student who is terrible with fractions will stumble over them and be unable to demonstrate their ability to solve the system of equations. So we remove the “barrier” of numerical nastiness in order to let the skill we’re interested in, solving the system, be the problematic skill.

# This isn’t a great idea

But we do that over and over again, isolating skill after skill in an effort to pinpoint student learning in each area, make a plan for improvement, and report the results. And in the end, students seem to be learning tiny little skills, procedures, and algorithms, which will help them to be successful on our tests without developing the connections between concepts or long-term understanding.

We want to have “authentic, real-world problems” in our teaching so that students can make connections to the real world and (fingers crossed) want to be engaged in the learning. But authentic problems are complex problems, and by julienning our concepts into matchstick-size steps we are sacrificing meaningful learning opportunities.

# What if we didn’t have to evaluate?

We’re slicing these concepts so finely because we’re aiming for that granularity. We want to be fair to our students and not penalize their system-solving because of their fraction-failings.

But if there were no marks to attach, would we do the same thing? Would we work so hard at isolating skills, or would we take a broader approach?

# My MDM4U class

I’m teaching Data Management right now, and the strand dealing with statistical analysis has a lot of procedure skills listed followed by a bunch of analysis skills. If I evaluate the students’ abilities in summarizing data with a scatter plot and line-of-best-fit, do I then ask them to analyze and interpret the data based on their own plot and line? What if they mess up the plot; don’t I then have to accept their analysis based on their initial errors? Oh wait, I could make them summarize the data, then I can give them a summary for a different data set and ask them to draw conclusions from that summary! Then they’ll have the same starting point for analysis, and they can’t accidentally make the question too easy or hard!

But I’ve just messed up one of my goals, then: I’ve removed the authenticity and retained the ownership of the task. I haven’t empowered my students if I do it this way, and I’ve possibly sacrificed meaningful complexity. Worse, I’m only doing this because I need to evaluate them. I’d much rather require them to gather, summarize, and analyze data that interest them and then discuss it with them, helping them to learn and grow in that richer context.

# As always…

…I don’t have answers. Sorry. I’m trying hard to make the work meaningful and the learning deep while still exposing as much detail about student thinking as I can. I’m sure in the end it’ll be a trade-off.