In his anti-lecture lecture here, Donald Clark makes some well-known arguments about the weaknesses of the lecture format: lectures encourage passivity; they’re too long; they’re frequently delivered by introverts who have more expertise in research than in spoken communication. It’s an interesting talk, and I’d encourage you to have a listen. Here, though, I’m going to focus on another thing he says: if you’re going to lecture, then you should record it so that students can listen back. Clark outlines some of the supporting research for this view in a blog post here. The main benefit is that it allows students to hear the lecture multiple times: “Repeated, spaced practice is essential for real learning, deep processing, elaboration and therefore recall.” Additionally, students who didn’t understand part of the lecture can refer back to the recording and go over it again. This could be especially useful for international students, who may be struggling with some of the language in their lectures.
I talked about recording (audio or video) lectures with a couple of people at my college. They were behind the idea in principle, but had a couple of hesitations which sounded quite reasonable to me.
(1) Constant recording is akin to constant observation
Formative observations are good. The feedback afterwards is personalised, and you usually learn something about your classroom that you wouldn’t notice otherwise. Sometimes teachers don’t like observations because they implicitly involve judgment of your professional ability, but most of the time we just suck it up and get the observation over with already.
The trouble is, teachers might be wary of observations (and therefore judgment) occurring without their knowledge. This is a real possibility if lectures are being recorded and stuck on the college website / VLE. This could be framed as a privacy issue, analogous to the discomfort some people feel with the intrusions of CCTV. Lecture theatres aren’t private spaces, but the prospect of being watched by unknowns is not something every lecturer is ready to get excited about.
(2) Recording lectures might affect attendance
It’s common practice at our college to put the PowerPoint slides of lectures up on the VLE. This makes sense, as it allows students to prepare for lectures and review some of the content afterwards. PowerPoints don’t make much sense on their own, and reading one is a poor substitute for actually attending the lecture. But what if the lecture was recorded and posted online? Then you could skip class and watch the lecture in bed. This is the principle of distance education, and it works well for some. There are, however, a number of reasons why this would be a bad thing at our college.
Let’s get the dull one out of the way first: international students are obligated by the terms of their visas to attend a minimum of 80% of their lessons. If, by providing recorded lectures, we inadvertently encourage poor attendance, we might end up putting the students’ visas in jeopardy. Second, well-attended lectures have a different buzz to them compared to a lecture with ten people and a digirecorder. Any musician will attest that the presence and size of an audience has a profound effect on a performance, and lectures are the same. Third, lectures are easily misunderstood as one-way content delivery vehicles. Students and educators can both suffer this illusion, neglecting the fact that most lecturers respond to their audience, answering and asking questions, and gauging the comprehension and interest levels within the room. This interaction is difficult to replicate when students aren’t in the room. Here’s Tara Brabazon in Digital Hemlock making a related point in more colourful language:
To over-emphasise web-based learning removes the sensual, textured, weathered surfaces of popular culture. We lose the taste, smell and texture of texts, and the roughened, grainy skin of life [...] The enthusiasm and excitement of the classroom disrupts lives, promoting ruthless debates between partners, friends and parents. It should be exciting, distracting and highly inopportune. I have never – in my life – seen a fervent, agitated, angry, volatile involvement with PowerPoint slides. Play Barry Manilow in a lecture theatre: watch the difference.
While I don’t think we need to do anything as drastic as playing Barry Manilow to show the difference between recorded lectures and the real thing, the danger is that our bed-bound student may well downplay this difference.
On the whole, I’m on the side of Donald Clark on this one. The two problems I identify here are real, but not insurmountable. The key issue is whether the lecturer is happy about being recorded. In every instance, they should have the right to veto, and I would be against lectures being recorded as a matter of policy. One compromise most lecturers would probably be happy with is the idea that the students record the lectures themselves; on their phones for example. This gets around the feeling that your managers and peers might listen in, and neatly sidesteps the attendance problem.
On the wall of our staff room, next to the kettle, there used to be a list of how people took their tea and coffee. Brian: Tea = milk, one sugar. Coffee = black, one sugar. On down for a number of staff. At some point, someone got rid of the list, and you know what: I don’t miss it a bit.
The list was a practical solution to a problem: when you’re making a hot drink for someone, you can’t remember how they take it. But fear not, traveller; you don’t have to ask them. Simply consult the list.
This tale is, of course, a case of complete trivia worthy of The Office. It’s trivia whose importance on life’s grand stage disappears into an absolute vanishing point. But I found it weirdly interesting, because it’s an example of how a practical solution inadvertently eliminates an opportunity to talk to our fellow human beings. This upsets me, because I love practical solutions, but I also love talking to humans.
“Do you want a cup of tea?”
“Yes please, Brian.”
“How do you take it?”
“Milk and two sugars please, Brian.”
In her book Watching the English, the anthropologist Kate Fox examines the place that our seemingly empty weather-talk occupies in English communication. (‘Nice day, isn’t it’ / ‘Yes, isn’t it’). She observes:
English weather-speak rituals often sound rather like a kind of catechism, or the exchanges between priest and congregation in a church: ‘Lord, have mercy on us’, ‘Christ, have mercy upon us.’
Exchanges about how people take their hot drinks occupy a similar role in office-talk, though the dialogue involves slightly more content than your average conversation about the weather. I wonder if that was why the list in our staff room was taken down by some mystery worker, binned and never replaced.
Meanwhile, on the internet
A few years ago, some wag got tired of people asking him questions which could be answered by Google, and set up LMGTFY.com, or Let Me Google That For You (“For all those people who find it more convenient to bother you with their question rather than google it for themselves.”) The site is a teaspoonful of humiliation for timewasters who have failed to take advantage of the independence offered by the world’s favourite search engine. It became fairly popular for a while (since you didn’t ask, here’s how popular, plotted on a graph over time, courtesy of Google). It’s a fair joke, playing on the idea that there are some things we don’t need to ask each other anymore. What’s the name of that actor who played Potsie in Happy Days? It’s a totally pointless question to ask anyone. Anyone who really wanted to know would be better off going straight to IMDb; a few clicks, and you’d have it. It almost seems bizarre that twenty years ago, before we started drowning in data, conversations with friends were a common way to establish trivia.
I’m going too far, I suspect. Our conversations are still peppered with trivia questions. The internet serves as an adjudicator in case of a dispute, or if no one knows the answer. But here’s something interesting: when we find out the answer, there’s usually a feeling of satisfaction tinged with disappointment. I call it “Satispointment”. The satisfaction is because we know; the disappointment, because we feel like Lucy in Peanuts: “Now that I know that, what do I do?”
Maybe exchanges about trivia are a disappearing habit. The internet is getting better and better at answering our trivia questions. Faster and simpler ways of accessing this information are cropping up every year. Will we one day reach a point where trivia questions disappear from our everyday conversation? In short, is the future a place where we talk to each other less?
Proponents of a technological dystopia would say yes. Just look at the kids with their headphones glued to their ears, cut off from any interaction with their fellow brothers and sisters. Just look at the phone zombies who walk our streets up and down the land, eyes covered in a thin milky glaze. Optimists may point to the present: the fact that we still talk crap to each other despite the fibre-optic cables serving up limitless amounts of the stuff. They may even point to the past: that steps forward in technology frequently mean more communication, not less. This second point is more powerful for me. Were people ever really nostalgic for an age before the telephone? Were things better before a reliable postal service allowed us to communicate with far away family members? In the same way, nostalgia for a time before the internet took over our lives seems misplaced. Besides, it has been frequently pointed out that new technology doesn’t always replace the old. Radio was followed by TV and then personal computers. None of them have become obsolete; now we just use all of them at the same time. So it is with discussions of trivia – pub quizzes, even – in the age of smartphones and Google. Somehow, in the face of adversity, they manage to co-exist.
The main lesson I took from the disappearance of the “How do you take yours?” milk/sugar list was that efficiency is not always popular when it interferes with our rituals of communication. Maybe it’s because the rituals are older than the technology. Someone observed on Reddit once that in reading Reddit first thing every morning, they are just like their father who started every day with a newspaper at the breakfast table. That’s a ritual, if ever I saw one, reshaping itself to fit new modes of communication.
Since we live in a world where efficiency is like a god to us, it might be worth asking why that is. Was efficiency always worshipped like it is now? Or is it the result of the great capitalist enterprise of making more and spending less? I would imagine efficiency was never bad. The interesting question is what trumps it.
A common part of life in a secondary school is a 15 minute session every morning called “form time”. Attendance is taken and messages are given out.
“There is a cake sale in room 162 at break time.”
“Mr Thompson wants to remind everyone to pick up their litter after break.”
“Mrs Butterworth is fashioning a small elephant out of Edam cheese this afternoon – please stop by and offer your support.”
After the messages are given out, pastoral business is attended to (“Where were you yesterday Jordan?”), and then if there is some time to fill, the class watches Newsround. Sometimes I would put some kind of word puzzle on the board in a vain attempt to fill the remaining seven minutes.
Despite being contaminated by a feeling of pointlessness, form time serves a number of useful purposes. One of them is that it provides a set time every day where messages can be given out.
Where I work now
The college I work at now doesn’t have form time, for the simple reason that the students are older and their days don’t start at the same time. It wouldn’t work, isn’t suited to their age (17–20), and, to be honest, I don’t miss it. However, its absence leaves the college with an administrative problem: how to give out messages in such a way that students actually get them. It’s been interesting to see how that problem has been approached, because it tells us a bit about the various methods we have at our disposal for transmitting information. And unfashionable as it may be, this is a central part of teaching people how to do things.
The obvious solution is a weekly email newsletter. In one second, the college messages are distributed to every student. Not everyone will read the email, but that’s their fault, right? Well, to a point. The trouble with messages coming in electronic form is that they become a part of the deluge of electronic information which characterises modern life. Our default state is to ignore electronic information. We have to, or we would go mad. When we get electronic messages, they have to pass through a couple of filters before we bother reading them:
Is it interesting?
Is it important?
If the answer to these questions is No, No, we click past the message, and it’s gone. If the messages are neither interesting nor important to the reader, then they probably didn’t need to read them anyway. The trouble comes when there is a misjudgment, and something interesting or important is filtered out and missed. Student ignores weekly email newsletter from college; misses an exam.
The typical way to get around this problem is to give messages more than once. If they come from a variety of angles, there is less chance of important information being missed. My college has tried one method for this which didn’t work, and is about to try another (which has potential if they improve it).
The ineffective method was an hour in the timetable set aside for a Cohort Meeting (I suppose a secondary school would call it an assembly). The idea of this was that every student would come into a lecture hall and messages would be given out then. Attendance at these meetings was poor, because if the timetabled slot was not adjacent to a lesson, students would not see the point of hanging around for a couple of hours just to hear some messages. This was frustrating for staff, because some of these messages were crucially important (what to do if you don’t get into your chosen university, for example).
The new method is to put the weekly messages on a PowerPoint presentation which is emailed out to teachers of one of the compulsory modules. These teachers then present the PowerPoint at the start of their lesson and the job is done.
Even if teachers remember to do this, there are still a few problems with the idea. First, it kills the start of a lesson. Students should feel at the very least interested at the start of a lesson, and reading out administrative messages is not the way to achieve this. Secondly, there is a slightly petty issue of control. Teachers are likely to resent the start of their lesson being taken out of their hands, insignificant as it may be. The one time I read out messages this way, the PowerPoint felt a bit like an uninvited guest.
A good way of communicating information which my college uses (in addition to email) is posters. Big, A3, colour posters. Why would a poster be so much more effective than a PowerPoint presentation? To answer that, it’s helpful to turn to Douglas Rushkoff’s notion (borrowed from Marshall McLuhan) that different media have different biases:
Guns don’t kill people, after all, people kill people. But guns are much more biased toward killing people than, say, pillows — even though many a pillow has been utilized to smother an aging relative or adulterous spouse.
Posters have a bias towards two central elements of good communication:
(1) Grabbing people’s attention. We associate posters with advertising, which is founded on getting attention. When we create a poster, this purpose is usually in the back of our minds.
(2) Efficient language. Large fonts and a limited working space require the author to reduce their message to its essence. A bit like a tweet, I suppose.
PowerPoint can fulfil these two elements, but that is not its bias. Its bias is to ramble on for slide after slide, filling them with bullet points. Once you’ve given the message that there’s an exam tomorrow, it doesn’t cost anything to add a message reminding people to pick up litter, and then one more reminding people that they shouldn’t smoke by the entrance to the building. The result is a bored audience suffering from information overload, taking us right back to the problem of email newsletters.
PowerPoint can be used to distribute administrative messages, but it works best when the presentation is created within limits. Maybe there could be a rule that everything must fit onto one slide, and that the font size mustn’t be less than 36. Something like that would reduce information overload, and PowerPoint would become a useful medium again.
While this is a fairly trivial topic to discuss, it’s useful to pay attention to how we communicate information, because communicating information is one of the most important things that teachers do.
In three of the modules I teach, the students are assessed on their ability to write an argument essay. Done well, this is a beautiful thing. The writer states an opinion, argues for it with logic and evidence, and concludes by stating the opinion again. By the end, the reader is either convinced or provoked into explaining why the writer is wrong. The student gets feedback on how clearly they express themselves and how well they support their arguments — both of which are useful, transferable skills.
Sometimes, the students want to give a balanced account of the issue and hold off on giving an opinion; conclude with something like, “As we can see, there are two sides to this story.” While this is a valid form of writing, it’s not an argument essay, so I can’t let them do that. Sometimes I use the example of a lawyer making a case to explain what an argument is. That usually gets the message across, but I sometimes feel uneasy when I hear myself say this.
Maybe it’s because by using this example, I risk giving the following tacit message:
(1) You are learning to argue persuasively, like a lawyer.
(2) Law is a prestigious field, leading to high-paying jobs.
Therefore, you are learning a prestigious skill that leads to high-paying jobs.
I can almost see old Uncle Plato shaking his head in disappointment. Plato believed that intellectual discussion should be in the pursuit of truth, not persuasion. Talk which merely aims to convince the other party is dismissed by Plato as rhetoric and sophistry. Skilled use of rhetoric allows lawyers to get criminals off the hook and politicians to spin lies into truth. Our noble philosophers, like judges and the electorate, want to see through this fog and establish what is true.
So is a lawyer a good model for a student preparing to enter the academy? Well, not the evil caricature of a lawyer I have presented so far. But any defence lawyer knows that their job is easier when the client is innocent. In a similar way, the writer of an argument essay has an easier job when they are arguing for a true proposition.
Paul Graham wrote an excellent piece on this topic called The Age of the Essay. Graham questions the argument essay as a useful way to establish truth, and if you have any interest in this topic, I would recommend you give him a read. One line stands out for me:
Good writing should be convincing, certainly, but it should be convincing because you got the right answers, not because you did a good job of arguing.
Bingo. This is where the beauty in an argument essay lies: when the writer gives you a window to the truth.
Here are two things that people think:
(a) I am better than that person over there.
(b) That person over there is better than me.
The first is an instance of superiority, and the second, inferiority. Neither is a particularly desirable emotion. Superiority connotes feelings of smugness and arrogance, whereas inferiority connotes low self-esteem.
While we might subscribe to views that emphasise the equality of humankind, something always drags us back to these two feelings. Sometimes, groups of people decide that they are superior to other groups of people. Specifically, I’m interested in the formation of elites, when the superior group is small in relation to the inferior group.
Elites are not new, and they are not going away soon. Is it too simplistic to say that they derive their superior status from power? Money and knowledge help; but these are just good ways to increase power. Hence we see elites forming among the powerful, the rich, and the knowledgeable. From within the elite, there is often a feeling that they know best.
So let’s look at one of the new elites: the technology elite. The tech world runs on specialised knowledge, brings in money and increasingly exercises power. Our dependence on technology is a joke, but it is a serious joke. It is this dependence which lends technology companies their power. With the power comes the formation of an elite, and with that, the feeling grows within the elite that other people should be more like them.
Here’s writer Jaron Lanier, interviewed in the Spectator last week:
There are a lot of very positive things about the tech world. It’s remarkably unprejudiced and I’ve never encountered racism in it. There are a lot of good qualities, so I don’t want to criticize it too much. I remain in it, and I enjoy it. However, there is a smugness, or a kind of religious aspect to it. There is a sensibility that says: we have skills that other people don’t, therefore we are supermen and we deserve more. You run into this attitude, that if ordinary people cannot set their Facebook privacy settings, then they deserve what is coming to them. There is a hacker superiority complex to this.
When elites form, their members may be inclined towards excluding new members. This is logical, because just like a luxury brand keeping their prices high, it preserves the status of those within the elite. Unfortunately, it puts the good of the individual above the common good, and therefore strikes me as selfish and morally reprehensible. A more positive line to take would seem to be an inclusive attitude. I made it into the elite, so you can too. As the Tarvuists say, “It’s so easy to join.”
When I read Lanier’s comments about a “hacker superiority complex”, I immediately thought of the (fairly recent) drive to teach everyone to code. On the face of it, the desire to make coding available to all has benign, inclusive motivations. Coding is a modern form of literacy that enables people to participate in the technology they use. (See Douglas Rushkoff arguing that this computer literacy is an alternative to being enslaved by tech). Coding is an employable skill (fair enough). It teaches you to think (um… really?). We can see some of these motivations for the coding movement listed on Code.org, a website with a staggering number of testimonials from a bizarre range of sources.
It’s hard to disagree with the mission statement at the top of the site, “Everyone should have the opportunity to code”. The weird thing is, one look at the rest of that page makes me want to turn against it. Maybe it’s because the testimonials remind me of junk mail.
So for the sake of balance, I tracked down some dissenting voices. Most of them don’t challenge the opportunity claim, instead dealing with the stronger claim that everyone should learn to code. Here’s Jeff Atwood, in a piece entitled “Please don’t learn to code”:
Look, I love programming. I also believe programming is important … in the right context, for some people. But so are a lot of skills. I would no more urge everyone to learn programming than I would urge everyone to learn plumbing. That’d be ridiculous, right?
And here’s Evgeny Morozov, author of To Save Everything, Click Here:
I think the craziest idea I have heard in the last few years is that everyone should learn to code. That is the most bizarre and regressive idea. There are good reasons why we don’t want everyone to learn nuclear physics, medicine or how financial markets work. Our entire modern project has been about delegating power over us to skilled people who want to do the work and be rewarded accordingly. I’m all for making us aware of how various technological infrastructures work. But the idea everyone should learn how to code is as plausible as saying that everyone should learn how to plumb. To me it just makes no sense.
Interesting that they both chose plumbing to reduce a coding movement to absurdity. But let’s not get distracted by that.
The point they are making is that coding is a specialised skill rather than a foundational one. English and Maths are core subjects precisely because they provide the foundations for other disciplines and a wide range of “real world” scenarios. More specialised subjects such as Music become more optional as you go through school. This makes sense, because Music does not provide a foundation for other school subjects. The real question is whether coding is a specialised subject or a foundational one which belongs in the core. Morozov and Atwood are arguing against having it as a core subject. Code.org sidesteps that challenge by merely saying that everyone should have the option.
Or do they? Read the testimonials carefully and you can see a number of quotes which endorse the stronger position:
“I think everybody in this country should learn how to program a computer because it teaches you how to think.” – Steve Jobs
“Let’s get the whole world coding!” – Eric Schmidt
“To help prepare our children for a successful future, no matter what career they pursue, they need to learn basic computer programming skills.” – John Hickenlooper
“It’s important for these kids, right now, starting at 8 years old, to read and write code.” – will.i.am
Oh dear. There is definitely some equivocation between the two positions I have identified. Is this an undercover move of doublethink and deception? Or is it merely a case of diversity within a united group? I don’t know. I do think that the stronger claim is incorrect, and I think the temptation to make this claim comes from the elite status now achieved by the tech industry.
Coding is a modern form of magic. You can teach yourself how to do it, provided you have a computer, the internet, the mental ability, and you are prepared / able to put in the hours. Achieving success through these means is therefore meritocratic. But coding is not for everyone, and most jobs would not be improved by employees being able to code.
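For what it’s worth, the barrier to that first taste of the magic really is low. A complete, working first program – in Python here, my choice of language purely for illustration – fits in a few lines:

```python
# A beginner's first self-taught program: print a seven times table.
# Everything used here can be picked up from free material in an afternoon.
for n in range(1, 13):
    print(f"7 x {n} = {7 * n}")
```

Whether that first taste should be an entitlement or a requirement is, of course, exactly the question at issue.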
For my part, I learned a bit of BASIC as a child and later picked up some HTML. I never studied programming at school and would have probably enjoyed it if I had. The new movement towards teaching programming is young, but has already achieved results: the new English Baccalaureate replaces the ICT GCSE with one in Computer Science. The latter contains far more programming – take a look at the exam specs for Computer Science here, and compare them with the specs for ICT here. I am fully behind providing this GCSE… as an option.
So I’m going to conclude with a fairly unexciting view. Yes, students should have the opportunity to code. No, this doesn’t mean everyone should learn to code. </rant>
“Make no mistake: this is the most important thing you do as a teacher. All the other stuff is of no use whatsoever if you don’t mark your books properly. You can be endlessly enthusiastic, have great subject knowledge, be fully cognisant of every rule and regulation, manage behaviour wonderfully, teach fascinating lessons at a cracking pace, which feature bucketloads of flannel-free praise and it will be all to nought if you don’t mark their books. They won’t progress. Antithetically, you can turn up hungover every morning, wearing the same creased pair of Farahs as last week, with hair that looks like a bird has slept in it, then spend most of the lesson talking at kids about how wonderful you are; but mark their books with dedication and rigour and your class will fly.”
Beadle’s book made quite a few changes to my teaching, but the chapter about marking easily had the biggest influence. I remember the first time I picked up a red pen and started to write on someone else’s work. I was in a staff room in a language school in Seville, and the work was by someone older than me. I knew I had to write something, but I didn’t know what. In those days, I probably focussed on spelling, because that was something I knew about.
A few years later, I was in Brighton, sitting in a university classroom at the start of a PGCE. This was where I first heard the phrase “Two stars and a wish”. The idea was that when you mark work – as in all areas of the classroom – you should praise more than you criticise, lest you create a negative climate of communication. Negativity discourages, and praise encourages. So next to each of the stars, you say what the student did well, and for the wish, you say what they need to do better next time.
This all seemed perfectly sensible to me. The trouble came when I received work that was crap.
In this case, I would struggle to think of two things which the student had done well. I would also be gagged from identifying all the things that the student had not done. What a disaster.
How to Teach advocates proof marking. That is, correct every mistake. Beadle presents a perfectly good argument for this: if you, their teacher, are not going to correct their mistakes, who else do you expect to do it? Being given carte blanche to correct at will is one reason teaching is a pedant’s paradise (you can have that one for free, Coolio). Outside of the classroom, people will be offended if you correct their grammar, punctuation and spelling. If you’re a teacher marking a piece of writing, it’s your duty.
The flimsy counter to this – that little Johnny’s confidence will suffer if his mistakes are pointed out – is dealt with swiftly by Mr Beadle:
“Yes, it is emotionally difficult for him, but not half as emotionally difficult as finding that he is an adult and cannot feed his family because a load of pansy liberal teachers who were worried about upsetting him left him illiterate, as they would not mark his work properly.”
Phew. Take that, liberal pansies.
So I have ended up with a way of marking that corrects (or at least points out) mistakes, then at the bottom of the work I give a bit of praise and development points. I don’t call this bit at the bottom “Two stars and a wish” because my students are 18 years old.
I have come to feel that feedback is one of the most perfect forms of teaching. A guy called Baba Brinkman did a rap about evolution which reduces the Darwinian process to three stages: Performance, Feedback, Revision. Nature uses this process to make organisms better adapted to their surroundings. It is also a foundational concept in design. Without a “click” of feedback you don’t know that your seatbelt is fastened, and you will keep trying until you get that feedback. This is more or less the same as a student trying something until they get it right. Without the feedback, they are left groping in the dark.
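The Performance–Feedback–Revision loop is easy to sketch in code. The sketch below is my own illustration, not Brinkman’s: the target phrase and scoring function are invented for the example. A random “attempt” converges on the target only because each revision is scored – remove the feedback step (accept every revision) and the loop gropes in the dark indefinitely.

```python
import random

TARGET = "TOPIC SENTENCE"              # what the learner is aiming for
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def feedback(attempt: str) -> int:
    """Feedback: how many characters are already right."""
    return sum(a == t for a, t in zip(attempt, TARGET))

def revise(attempt: str) -> str:
    """Revision: change one character at random."""
    i = random.randrange(len(attempt))
    return attempt[:i] + random.choice(ALPHABET) + attempt[i + 1:]

# Performance -> Feedback -> Revision, repeated until the attempt is right.
attempt = "".join(random.choice(ALPHABET) for _ in TARGET)
while attempt != TARGET:
    candidate = revise(attempt)                   # performance
    if feedback(candidate) >= feedback(attempt):  # feedback
        attempt = candidate                       # keep only what helps
print(attempt)
```

The loop always reaches the target, because a revision that damages a correct character scores lower and is rejected – which is more or less what a marked essay does for a student.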
Marking is the simplest antidote to the educational catastrophe which involves telling people stuff they know already. When I stand in front of twenty students and talk about topic sentences, maybe five of them know what topic sentences are already. The other fifteen are getting a valuable lesson, but those five are bored and wasting their time. With marking, everyone, without exception, is having gaps in their ability pointed out.
My current job is unusual in the teaching world, because my employers give me enough time to mark properly. In my secondary days, I had loads of classes and not much time left over after teaching and planning. Marking came third on the list and piled up behind me like some endless guilty secret. The talk on everyone’s lips was how to reduce the burden. Stickers. Rubber stamps. Marking codes. These were all attempts to mechanise the feedback, but I never liked any of them because they made marking generic and depersonalised. The only way to keep up was to work Sundays, something which many teachers take for granted. I don’t want this to turn into a rant about workload, but having a two day weekend is one of the reasons why I’m happier working in a university, even if I do have shorter holidays.
I’ve been arguing here that writing all over a student’s essay is time well spent. I heard someone today beg to differ – they said “But they won’t read it all, will they.” Au contraire, Blackadder. I take my comfort from this story, taken from David Ogilvy’s Confessions of an Advertising Man:
Max Hart (of Hart, Schaffner & Marx) and his advertising manager, George L. Dyer, were arguing about long copy. Dyer said, “I’ll bet you ten dollars I can write a newspaper page of solid type and you’d read every word of it.”
Hart scoffed at the idea. “I don’t have to write a line of it to prove my point,” Dyer replied. “I’ll only tell you the headline: This Page is All About Max Hart.”
Written comments on written work are an essential part of teaching subjects such as English. Generic and mechanised feedback may be efficient but it is inferior. Personalised feedback is more useful and more interesting.
There is a very famous saying among Tibetan Buddhists: “If the student is not better than the teacher, then the teacher is a failure.”
- Allen Ginsberg, in No Direction Home
In a previous post, I criticised discovery teaching when it is used as cover for teachers who don’t know their subject. Here, I ask whether there is a place for a teacher who knows less than (or no more than) their students.
At first, the idea sounds preposterous. Yet this is the case in a number of situations:
(1) Talking about the unknowable. In Religious Studies, a typical lesson might involve consideration of what happens after death. Of course, the teacher will probably know more about what certain groups of people say happens after death – but this is different. An activity exploring this question would be fully in line with the 2004 Non-statutory National Framework for RE (p.28), which advised that KS3 pupils be taught to “express their own beliefs and ideas, using a variety of forms of expression.” This framework has been replaced now, but was in place when I trained.
(2) Teaching ICT. It is entirely possible that some students will teach themselves new programming languages which are beyond the knowledge of an otherwise knowledgeable teacher. Similarly, when teaching foreign languages, teachers who speak a language non-natively will occasionally get native speaker students.
(3) CPD often involves sessions in which teachers pool their responses to certain problems. While these can be chunky pen thought-shower nightmares, I have actually learned things in these dismal twilight sessions. The poor bastard leading them can hardly claim expertise in a room full of experienced teachers. INSET days which involve a lecture from some stranger claiming expertise are notably unpopular with teachers – unless the speaker has the necessary credentials.
Situation one: RE
I look back on my RE lessons on unknowable things with uneasy memories. The aims were noble: to make students step back and look at the big picture of life; to stand face to face with the unknowable and quiver in its presence. But I felt gagged the whole time. I could never say “That’s right – well done.”
The topic was, however, productive to discuss. Views on what happens after death profoundly affect how people live and are key to understanding the caste system, suicide bombers and people who say YOLO. So why not turn that key on yourself? What do you think happens when you die? How does that influence the way you live? A teacher can pose these questions, and taken with the right attitude, the student leaves with an extra iota of self-knowledge. Yet no teacher could claim knowledge of life after death.
Wittgenstein famously wrote, “What we cannot speak about we must pass over in silence.” He may not have had much time for my RE lessons.
Situation two: ICT and languages
In one school I worked at, I had a year 10 student who built an internal video sharing system for the whole school. It worked really well. Luckily, I was teaching him RE and not ICT. If these situations are handled badly, you end up with a student who sits bored and unchallenged by their ICT lessons. Differentiation helps, of course. The student could be set challenging tasks suitable for their ability. The problem comes in feedback. Put simply, a teacher cannot give constructive feedback on an area where the student knows more than them – it would be a farce. And without feedback, there is no teaching. One solution might be to give feedback on areas where the student does know less. For example, a student might be able to make a better mobile app than their teacher, but the teacher could still suggest ways to improve it from an aesthetic perspective.
Alternatively, the lessons can be used to develop skills other than their subject knowledge. A native Spanish student in a Spanish lesson can be enlisted as a teaching assistant, developing their confidence. But then in what sense is this a Spanish lesson for the student? It is not. On the other hand, pushing them to explain to their classmates why their language is as it is would require them to know about Spanish grammar, and this is an area where a teacher could develop the student.
Assigning a mark for a child genius should be easy: A* all the way. Weirdly, this doesn’t always happen. This is a tragic story of a student who had to pretend to know less than they did, just so they could get a good grade. Mark schemes that lead to this kind of situation belong in a large fire.
Situation three: CPD sessions
There is noticeable anxiety about what to call these sessions. Workshop is an attempt to capture the idea that no one should assume the mantle of expert. In a similar way, development is more guide-by-the-sidey than teaching or training. I still believe these sessions are best used when I leave knowing something I didn’t know before. I first heard about Hattie’s effect sizes in an after-school training session – in that case someone stood at the front and talked about it. Having said that, small group discussion with more experienced teachers taught me tips and tricks on what to do with latecomers and the like. But what did the experienced teachers learn from those discussions? Not much, I would guess.
Teacher training is a tough one. Teachers figure out what works best mainly through practice and reflection. My PGCE was three weeks of university followed by six months of placement, and then a bit more university tacked on at the end. It worked pretty well for a noob. But CPD sessions involve developing experienced teachers as well as the rookies. They are based on the principle that every teacher can improve (true) and that training sessions are the best way to achieve this (questionable).
Ben Goldacre’s latest project is interesting, because it aims at improving the quality of educational research. When there is good research, a training session is a sensible way to share it. All you need then is an expert to digest the research so that they can lead the session.
A teacher who knows less than (or no more than) their students is an undesirable situation. But even if the entry requirements for teacher training courses were raised, it would still arise in at least three secondary subjects. The teacher in this position can still motivate, probe and develop their expert students, but their ability to give feedback is limited.
CPD sessions, meanwhile, are typically based on a classroom model with one person leading a group. Where this occurs, the leader should ideally be an expert in something. Where this is not possible, a facilitator must accept that experienced teachers may well become no more than teaching assistants, helping the novices in the room. While this has value, the experienced teachers may feel justified in claiming that they are not receiving development themselves. This is a shame, because everybody can develop at any point in their career.