(Re)introducing closed book exams at law school – by Cathy Sherry
Over the past three decades, law schools, along with other disciplines, have consistently moved away from closed book exams. Closed book exams have been associated with cramming, rote learning and superficial understanding. In contrast, open book exams, take-home exams and research assignments have been shown to promote deeper learning and genuine understanding.
From its inception in the 1970s, UNSW Law only ever used open book exams, and so the decision to use a closed book exam in the compulsory course, Land Law in 2016, was not a return to past practices, but rather the introduction of a hitherto untested form of assessment.
The decision to change to a closed book exam was motivated by increasing concern among staff, in a number of subjects, about the way in which students are now doing open book exams. For example, I have been teaching law for more than twenty years, and for most of that period would never have considered using anything other than an open book exam. However, in the past five to ten years I have been increasingly frustrated by students who copy out their notes into their exam booklet, rather than answering the question they have been asked. One of our colleagues calls this the ‘tip-truck method’ of answering exams. You dump everything you think could possibly be relevant into the exam paper in the hope that it will earn you marks.
The ‘tip truck method’ works. While it will never earn a student a high distinction, every passing mark below that level is, by definition, awarded without the answer being entirely correct. As a result, if a student dumps down a lot of information, some of which is irrelevant (but correct, having been copied from notes), and some of which is on point (even if by chance), it is not possible to fail them. I spent the best part of a decade desperately trying to craft exam questions that minimised ‘information dump’, accompanied by the strict instruction, “Do NOT copy out your notes; answer the question asked”, but to no avail. A sizeable proportion of students would still tell me everything they possibly could about the law of easements, even though 80% of it was irrelevant. If they consistently did this in practice, charging clients for irrelevant advice, it would arguably be professional misconduct.
Excessive use of notes raises particular concerns in the internet era. Pre-internet, it could be safely assumed that most students would be using their own notes in an open book exam (within the same cohort, sharing notes required time-consuming and expensive photocopying, as well as a measure of organisation most of us lacked). Post-internet, notes can be shared between large numbers of students with the click of a button. Students share notes through Google Docs, Dropbox, Facebook, Messenger, WeChat and no doubt other platforms I am too old to know about.
In New South Wales, and possibly in other jurisdictions, the nature of final school exams has also encouraged excessive ‘pre-preparation’, as opposed to simple preparation for exams. By ‘pre-preparation’ I mean the academic equivalent of ‘here’s a cake I made earlier’, as opposed to ‘here are the ingredients I have assembled in preparation for making a cake now’. In the compulsory Higher School Certificate English courses, detailed modules and rubric-based marking mean that it is possible to pre-write and memorise all English essays, including creative writing pieces. Many schools require students to practise this ‘skill’ throughout high school, telling them the essay question well in advance of the exam. Good students will have written their own essays, while others have cobbled together essays from multiple unacknowledged sources and, at the extreme, some have bought or entirely replicated other people’s work. The result is that for many of our students, pre-preparing answers or chunks of answers is par for the course, and they see no problem in copying out sections of prepared material from their notes into their answer booklets during an exam. This fundamentally changes the nature of an exam answer: a marker can no longer be confident that what is written is solely the product of the student’s own understanding at the time the answer was written.
As a result of these concerns, in 2016 my colleague Leon Terrill and I decided to use a closed book exam for Land Law. There was some consternation from sections of the student body and from some staff. A particular focus of student concern was the stress a closed book exam would cause, and the unreasonableness of expecting students to do a ‘new’ form of assessment in the fourth year of their combined law degrees. Despite this resistance, we pressed on, with the promise to conduct some research on our experience and share this with colleagues.
The exam was conducted without incident. When we received our papers, most of us (other than sessional staff who had marked closed book law exams before at other universities) were surprised by how little difference there was in length and detail between the closed book exam scripts and typical open book scripts. We had considered doing some research closely comparing open and closed book exam scripts but quickly realised that it was unlikely to be fruitful.
We had applied for a Faculty Learning and Teaching grant which we used to conduct a three-pronged study. First, our colleague and co-researcher, Julian Laurens, conducted a literature review. While this revealed a number of studies that attest to the benefits of open book exams, there was no study specific to our circumstance, that is, a law school changing from open book to closed book exams in the post-internet period.
Second, we surveyed the six teachers who had taught and marked Land Law that semester. There were four sessional staff and two permanent staff, whose teaching experience varied from a few semesters to over twenty years. Staff answered a number of open-ended questions. There was a range of responses, including, as noted above, that the scripts in closed book exams were almost as long and as detailed as those in open book exams; that students were more likely to get to the point and not take the ‘kitchen sink’ approach; that it was easier to award high distinctions because it was clearer when a student had attained that level of understanding; and that it was easier to identify students who had not mastered even the basics of the subject.
Finally, we surveyed the students. We administered an anonymous, online/hard copy survey in class in the compulsory course that followed Land Law for most students. We received 174 responses, mainly from our undergraduate rather than JD cohort. There were 39 questions on general study motivations and techniques, the experience of open book exams and the experience of closed book exams.
These are some of the key findings:
- 75% of students said they had never purchased notes or study guides
- 70% of students said they had worked in groups to produce shared notes or answers for assessment or exams
- Almost 70% of students had used another student’s notes in an open book exam
- 60% of students had copied directly from their notes, articles or books into an exam booklet and 50% had included material they did not entirely understand
- 90% of students said they always try to understand the material
- 90% of students said they feel pressure to do well in their law studies and 75% had experienced ‘significant’ anxiety during their studies
- Almost 80% of students thought that open book exams were a good form of assessment
- 50% of students said they went into the closed book exam knowing more than they would know for an open book exam
- 30% of students said they included some material they did not entirely understand
- 35% of students said that not having notes encouraged them to think more in the exam
- Just over 40% of students said they thought it was worthwhile having a mix of open and closed book exams at law school.
Written comments included many strongly opposed to the closed book exam and many strongly in favour of it. Many students, however, thought the closed book exam was a storm in a teacup and that the form of the exam was not particularly significant.
My conclusion on the change to a closed book format is that, unsurprisingly, no form of assessment is perfect; they all have strengths and weaknesses. While I believe that open book exams are generally preferable to closed book, our research confirmed my concerns – many students are not using their own notes in exams and many are copying directly from notes, articles or books into exam booklets. This fundamentally undermines the assumption that a formal exam script is all a student’s own work.
Perhaps the most significant finding for me was that 25% of our students were vehemently opposed to closed book exams, many of them because they believed it was unfair to expect them to do a different form of assessment four-fifths of the way through their degree. As we are training lawyers who will go into an ever-changing profession (statutes and case law change, entire areas of clients’ businesses disappear, technology may eradicate a sizeable proportion of the profession), this is extremely concerning. My hunch is that some of our students thrived in a system of rigid, predictable assessment at high school, and may not have the skills or desire to work in less predictable environments. This suggests that the best thing we could do for our student body is mix things up as much as we can, altering assessment from semester to semester and course to course, so that they develop the skills they need to rise to whatever challenges the workplace inevitably throws at them.
Associate Professor Cathy Sherry, Scientia Education Academy Fellow