I’ve been thinking about an interaction I had in class last week. I’ve transcribed it roughly below. For a bit of context, the language point was ‘going to’ for future plans, and the language had been presented through a listening. This was a controlled practice stage.
On my module in materials development we’ve just looked at reading and listening tasks.
We spoke about what makes good/bad comprehension questions. ‘Plain sense’ questions are seen as pretty ineffective, as they just test familiarity with sentence structure rather than actual understanding. Here’s an example of plain sense questions that I came across on my Dip (at TLI):
Most zins are bosticulous. Many rimp upon pilfides.
Q1. What are most zins?
Q2. What do many rimp on?
Although the text is full of nonsense words, you can answer the questions without really understanding it.
I can see why these questions aren’t considered that useful. So why do coursebook writers fall back on them?
I know, it’s really easy to be critical. I’m guilty of writing rubbish questions like this too. Actually, I don’t do it that often. I prefer to write really ambiguous questions which will prompt conversation/debate – these are equally annoying, I think!
But these ‘plain sense’ questions… I mean… I came across this activity the other day:
These are from Beyond A2+ (copyright Macmillan), which is actually quite a good coursebook.
Here’s the text, with relevant parts highlighted for each answer:
I’m not saying the whole activity is pointless. It’s just that some questions are plain sense or focus on simple grammatical relationships.
Q4: To have a healthy heart, how often do we need to exercise?
A: (To have a healthy heart, we should exercise for) 30 minutes at least three times a week
Q5: What happens if we do puzzles?
A: (if we regularly use our brains to do puzzles), we actually become more intelligent
I could give the author the benefit of the doubt, I guess. For example, you can still teach some reading strategies related to the questions. In Q4, students could predict the answer based on the question stem (e.g. ‘How often’ = a frequency), then scan the text quickly for the relevant info – if they didn’t already have a massive clue by being given the start of the sentence. Maybe Q5 draws attention to the word ‘intelligent’ as new vocabulary, but you don’t need to know its meaning to answer the comprehension question.
I’m not suggesting how these questions could be improved. That’s not because I’m lazy. It’s because these questions must have been devised by a far more experienced teacher and then accepted by a skilled editor, both of whom must have had a clear pedagogical rationale for choosing them.