NewLaw, New Legal Education

By Justine Rogers

I was lucky to be part of an invigorating panel discussion hosted by The Australian recently, as part of their Legal Week initiative. I joined two colleagues, our Dean, Professor George Williams, and Associate Professor Michael Legg, as well as Gilbert+Tobin lawyer, Sam Nickless.

It was a wide-ranging, free-flowing discussion of the current and future changes to the profession – what Williams called ‘a once-in-a-century process of disruption in the legal market’ – and what these changes mean for practice and for legal education. I thought I’d share here some of the main changes discussed and their significance for legal education and the law school.

The Changes:

1. Automation of legal services and its integration with services delivered by people.
2. The possibility for technology to increase access to justice.
3. Unbundling of legal services and flexible, ad hoc and assorted teams brought together for certain projects.
4. Professional ethics centred less and less in the profession and professional association and more in organisations and even the mode of legal service delivery itself.
5. The ethics of introducing new technology to clients or helping them with their artificial intelligence.
6. The globalisation of law – where clients, for instance, are from other jurisdictions.
7. The changes to law firms and their business arrangements, specialisations and recruitment practices, including firms coming from overseas to recruit UNSW Law (or Australian) students.
8. Cross-border disputes and the extent to which Australian courts can remain forums for litigation.

What They Mean for Legal Education and the Law School:

1. Urgent need to answer questions about the value that a person adds as a lawyer.
2. Law schools may need to focus on the ‘professional’ things computers can’t do (or can’t do as well): certain forms of problem solving and analysis, integrity, ethics, professional relationships, creativity and imagination.
3. Law combined with computer science, maths or engineering will add to classic combinations.
4. All law students must understand technology (coding and programming, for instance) regardless of their other degree.
5. Law students must develop capacities in team work and project management.
6. Law students need to be able to identify and address ethics and accountability issues in a range of contexts, when working with external lawyers, non-lawyer professionals and, crucially, with technology.
7. Law students need to understand not just other, non-Australian legal systems but also the cultures in which law operates.
8. Law schools need to help their students appreciate, when making their career decisions, the range of different firms in Australia and the region.

Without wanting to sound too home-team-y, we’re already doing some pretty fabulous stuff at UNSW Law to support each of these and, through a mini curriculum review, we’re about to do a whole lot more.

(The full video and an edited transcript of the discussion are available here.)

Student evaluations and innovative teaching

STUDENT EVALUATIONS OF TEACHER PERFORMANCE MAY HINDER INNOVATIVE ‘GOOD TEACHING’ PRACTICE: SOME OBSERVATIONS FROM RECENT RESEARCH

Julian Laurens

Recent research on student evaluations of teacher performance suggests, I argue, that assessing teacher performance via narrowly constructed student evaluation surveys designed to produce a quantifiable indicator of ‘good teaching’ may in fact have the indirect consequence of hindering innovation in ‘good’ teaching approaches. The findings of a May 2017 study [1] are consistent with a growing body of research [2] showing that university students are simply unable to recognise ‘good teaching’ or ‘what is best for their own learning’. The research identifies that students do not reward ‘good’ or ‘innovative’ teaching in the sense of an improved student evaluation mark for that teacher. Indeed, the evidence suggests the biggest factor informing a positive evaluation from a student is the grade the student is given.

On the other hand, as the studies show, student ignorance of ‘good teaching’ exists alongside evidence demonstrating that quality teaching has a positive impact on a student’s grades and learning outcomes. Moreover, when students are exposed to quality teaching, the improved learning outcomes carry over into subsequent subjects. This effect is consistent with findings from educational psychology: the work of Albert Bandura and other social-cognitive theorists on the development of student self-efficacy is particularly relevant. Worryingly, the research suggests that students who rated their teachers based on the marks they received actually did worse in subsequent courses.

The research thus far raises implications for legal education. A specific and immediate issue raised by the findings is that, given student evaluations of teaching are used by University and Faculty management when considering an academic’s career progression, there is a real risk that teachers may choose to ‘play it safe’. What incentive is there for a teacher to actually try something new in their teaching, given that they will potentially not receive improved evaluation scores and may in fact be penalised by students for being ‘innovative’?

A limitation is that much of the research so far is from non-law disciplines. There has yet to be a systematic look at how this particular problem with student evaluations applies (if at all) to an Australian law school. This should not preclude us from taking note, though. Issues surrounding student evaluations generally have long been recognised in law schools. As Roth said (back in 1984), “[e]veryone agrees that evaluations ought to be done, but few are satisfied that it is now being done properly, or meaningfully” [3]. This remains the wider challenge. Elsewhere on this blog, colleagues Justine Rogers and Carolyn Penfold have also begun examining issues surrounding student evaluations.

In conclusion, for present purposes, the research may support the argument that over-reliance on current narrow neo-liberal/managerialist-inspired approaches to student evaluations of teachers at Australian law schools may actually hinder innovative ‘good’ teaching practice. Current iterations of such practices can indeed appear as mechanisms of academic control, rather than tools that promote mutually collaborative learning environments. More broadly, the research calls into question universities’ claims to be dedicated to ‘good teaching’ and ‘innovation in learning’. There is a need to explore this issue further in the context of Australian legal education, situating it alongside continuing conversations about what good teaching in law actually looks like, for example. I would like to make three brief practical observations at this stage, derived from analysis of the research, that may assist us to address some of the challenges posed by the findings:

  • Firstly, we should always strive to improve our teaching, and we should be uncompromising in that. We should communicate this commitment to our students;
  • Secondly, we need to do a better job of explaining to students our teaching approach and rationales, and how each element relates to their learning experience;
  • Thirdly, we need to give students more opportunities to practise and reflect on what they have achieved along the way. Students need to see how they are progressing. They need to be regularly reminded of the value of learning. Royce Sadler’s work on the importance of feedback is worth reflecting on here.

[1] Brian A Jacob et al, ‘Measuring Up: Assessing instructor effectiveness in higher education’ (2017) 17(3) Education Next 68.

[2] See e.g. Michela Braga et al, ‘Evaluating students’ evaluations of professors’ (2014) 41 Economics of Education Review 71; Arthur Poropat, ‘Students don’t know what’s best for their own learning’, The Conversation (19 November 2014), online: http://theconversation.com/students-dont-know-whats-best-for-their-own-learning-33835

[3] William Roth, ‘Student Evaluation of Law Teaching’ (1984) 17(4) Akron Law Review 609, 610.

Big Data Analytics on student surveys

It’s the new “thing” – analytics applied to student responses to courses. And it is really quite scary.

To give an example, I will share my own results from a recently taught course of 22 students, of whom 10 filled out the survey. This is “small data”. It takes about 5-10 minutes (generously) to read and reflect upon the student feedback. Since I am sharing: they generally liked the course, including the guest lectures and excursions, but felt that one topic didn’t need as much time and that my Moodle page wasn’t well organised. All very helpful for the next time I run the course (note to self: start my Moodle page earlier and tweak the class schedule).

The problem is no longer the feedback, it is the “analytics” which now accompany it. The worst are the “word clouds”. I look at the word cloud for my course and see big words (these generally reflect the feedback, subject to an exception discussed below) and then smaller words and phrases. Now, the smaller ones in a word cloud are obviously meant to be “less” important, but these are really quite concerning, so much so that I initially panicked. They include “disrespectful/rude”, “unapproachable”, “not worthwhile”, “superficial” and “unpleasant”. Bear in mind the word cloud precedes the actual comments in my report. None of these terms (nor their synonyms) were used by ANY of the students (unless a poorly organised Moodle page could count as “unapproachable”). And they are really horrible things to say about someone, especially when there is no basis for these kinds of assertions in the actual feedback received.

The problem here is applying a “big data” tool to (very) small data. It doesn’t work, and it can be actively misleading. One of the word clouds (there are different ones for different “topics”) had the word “organised”. That came up because students were telling me my Moodle page was NOT well organised, but it would be easy to think at a quick glance that this was praise.
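To make the mechanism concrete, here is a minimal sketch (in Python) of the kind of naive word-frequency counting that typically drives a word cloud. The comments are made up to echo the feedback described above, and the real survey tool’s internals are unknown to me; the point is only that once common “stopwords” are stripped out, the negation in “not well organised” disappears and “organised” surfaces as if it were praise.

```python
# Minimal sketch of the naive word-frequency counting behind a word cloud.
# The comments below are illustrative only, not actual survey responses.
from collections import Counter
import re

comments = [
    "The Moodle page was not well organised",
    "The guest lectures and excursions were great",
    "Too much time was spent on one topic",
]

# Typical stopword lists discard short function words, including "not".
stopwords = {"the", "was", "were", "and", "on", "one", "not", "well", "too", "much"}

words = [
    word
    for comment in comments
    for word in re.findall(r"[a-z]+", comment.lower())
    if word not in stopwords
]

# "organised" ends up weighted exactly like "great": the negation that
# gave it its meaning has been thrown away with the stopwords.
print(Counter(words).most_common())
```

With only ten responses, nearly every word appears just once or twice, so the “cloud” is really the comments with their grammar stripped out – which is exactly why the context-carrying words matter so much.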

So what is the point of this exercise? One imagines it might be useful if you have a course with hundreds of students (so that reading the comments would take an hour, say). But as the “organised” example above demonstrates, the word cloud can be actively misleading, so you still need to read the comments to understand the context. Further, students often make subtle observations in comments (like the fact that too much time was spent on a particular topic) that are difficult to interpret in a word cloud where the phrases are aggregated and sprinkled around the place. So it doesn’t really save time. The comments still need to be read and reflected on.

Big Data tools always sound very exciting. So much buzz! Imagine if we could predict flu epidemics from Google searches (that no longer works, by the way) or predict crime before it happens (lots of jurisdictions are trying this, particularly in the US). But the truth is more like the word cloud on student feedback – inappropriately applied, error prone, poorly understood by those deploying the tool, and thus often unhelpful. Data analytics CAN be a good tool – but in the hands of those who don’t understand its function and limitations, it is a bit like a hammer: everything looks like a nail.

Lyria Bennett Moses

Innovation for the next generation of legal education: student-led video production

How can legal education be enhanced through student-led video production? How effective is it for class learning? And what are the benefits and challenges that this form of blended learning poses for environmental law and legal education more generally?

These questions were explored by Cameron Holley and Amelia Thorpe in a recent UNSW Law Learning & Teaching seminar where they presented the findings from their Learning and Teaching Innovation grant entitled: ‘Updating legal education with blended classrooms: lessons from student-led resource development’.

The premise:

  • Videos are one of the most popular forms of online teaching media (particularly in MOOCs)
  • Facilitate thinking and problem solving

– creative challenge of using moving images and sound to communicate a topic

– filmmaking skills, but also research, collaborative working, problem solving, technology, and organisational skills

  • Inspire, engage and foster deep learning

– videos as part of student-centred learning activities benefit motivation, opportunities for deeper learning, learner autonomy and communication skills

  • Authentic learning opportunities

– a method for students to construct concepts and learn about real-life issues relevant to them

  • Assist with mastery learning

– providing learning resources for future cohorts

What did they do?

– students were asked to identify a recent development in environmental law not already covered in the prescribed textbook

– required to produce a short video, no longer than 10 minutes, that portrays the subject matter of a recent environmental law development and reflects thoughtfully on its implications for achieving ecologically sustainable development

– low risk – worth 5% for the trial (would be more in future)

– both outcomes and process assessed

– small teams of 4-6 students

  • to assist, three iPads were made available, along with guide sheets on a suggested timeline, working in small groups, and media production.
  • videos shown to the class as a set late in semester.

– roughly 40% of the class already had experience with the technology

The Results?

Cameron and Amelia showed examples of videos that demonstrated highly engaged, deep learning among the student groups, with a strikingly high level of production value!

The presentation drew on empirical data collected from student interviews and surveys, as well as teacher and peer reflections. It rounded off by critically examining the strengths and weaknesses of student-produced videos as a tool for blended learning, before a lot of us in attendance decided we all wanted to try it out in our courses!

For those who wish to experiment with similar innovations, view the student data, or track the sources for the above, their slides are available here: Holley_Thorpe_UNSWLaw_video.

What lawyers actually do in practice (at least in the US)

SSRN has recently posted a great ethnographic study of young US lawyers, looking at what they actually do in the office.

Sinsheimer, Ann and Herring, David J., Lawyers at Work: A Study of the Reading, Writing, and Communication Practices of Legal Professionals (March 14, 2016). Legal Writing Journal, Vol. 21, Forthcoming; U. of Pittsburgh Legal Studies Research Paper No. 2016-11.

It includes great evidence of lawyers dealing with the following (pinpoint page references are to the SSRN version):

  • Using close reading and skimming strategies (pp13ff)
  • Strategic reading (p23, 30ff)
  • Reading from computer screens (p26) but using printed materials by preference (p24)
  • Huge use of email for written communication (p45ff)
  • Use of precedents (p48)
  • Reviewing and revising constantly (p49ff), being meticulous (p50)
  • Research/writing nexus (p51ff)
  • Interpersonal skills and stress in the office (p58ff)
  • Time management (p60ff)
  • Cross-cultural communication (p64)
  • Developing professional identities (p66ff)
  • Suggestions for curriculum change (p24, 71)

It’s a wonderful collection of vignettes and data that helps to flesh out what we are often trying to impress on students: the real skills they need in preparing for legal practice environments.

Cheating at University


Justine Rogers

Last week I was asking students in my ethics class to discuss legal values and which ones they’d picked up from law school. They raised a range of things, from compassion to competition. But one student said, “No plagiarising, no cheating, being honest in your work!”. “It’s drummed into us from Day 1,” another added.

I was rather chuffed to hear this, but I am not sure I can or should be too pleased. The research shows that these are problems affecting all Australian universities, though unevenly across them and within the disciplines. Sydney University has just released part 1 of a report, ‘An Approach to Minimising Academic Misconduct and Plagiarism at the University of Sydney’. Its focus is detection and prevention. The report shows that most instances of misconduct fell into categories of negligence (a lack of understanding of, or carelessness about, how to cite and reference). The rest, the active fraud, is where it gets disturbing.

There’s the less straightforward (as far as severity of categorisation goes) collusion and recycling, but most of it is outright dishonest plagiarism and ghost writing, that is, getting someone else to do the work and submitting it as your own. Ghost writing, in particular, is becoming more prevalent and difficult to regulate. Students are taking advantage of sophisticated and therefore hard-to-detect online services marketed to them, ones like MyMaster. These fraudster strategies most directly affect take-home assessments, but now students are adapting the technology available to cheat in exams. They are using their phones and watches to bring in material, using loo breaks to quickly check the internet, taking photos of confidential papers, and one I hadn’t thought of in my old cheating (paper-based) days: paying impersonators to come and take the exam on their behalf. Other categories found in the report were fraudulent medical certificates and other bad-faith uses of special consideration.

A summary of the Report’s recommendations (produced by the Academic Misconduct and Plagiarism Taskforce, Sydney University, 2015: 2):

Powerpoint?

An interesting and critical story about the use of Powerpoint.

Perhaps I am just a bit too “old school”, but in my teaching I have resisted using Powerpoint (or any of the other versions whose names I do not even know). Though, I admit to having given in at conferences, where I now routinely use such slides. In my defence, I started using Powerpoint at conferences after I found myself consistently addressing audiences that were completely or significantly EFL (English as a foreign language) – a consequence of my extensive activity in Asia. For that sort of audience, Powerpoint seems to be of some assistance (though even there I would be happy to be persuaded otherwise).

While I have always had a gut reaction against learning and teaching that relies extensively on technology, and on Powerpoint in particular, it is heartening to have my gut feeling somewhat vindicated by this story (and its own internal links).

Colin Picker

The teaching year has just started here and, for me personally, so has the teaching of a brand new course.

Beginning something offers the chance to see close up what we are doing as teachers of law, because for at least some period of time it is not yet natural; it needs repeating and getting (re)used to.

Traditionally, legal education has been about training students to ‘think like a lawyer’; to develop supreme skills of analysis, meticulousness, reasoning and persuasion. Writers have identified the values that guide these skills, some of which, they argue, are harmful to the well-being of lawyers, the clients they serve, and their communities.

Indeed, for just and effective legal practice, what’s needed in legal education is a greater emphasis on broader cognitive, social, practical and ethical skills – in short, an increased emphasis on competence and skills generally. Students also need opportunities to make these skills meaningful, in connection to others.

These aren’t ‘just’ the findings of a bunch of academics. The legal profession is beginning to support this thinking. The NSW Law Society is now restructuring its CPD program to reflect the contemporary reality that, as it states, these are not ‘soft’ skills but rather ‘fundamental’ ones that best serve the client. To do so, they’ve drawn on the analysis of Canadian lawyer and ‘legal futurist’ Jordan Furlong.

They ask, “So what exactly are the six new skills Furlong thinks need to be added to the [traditional] mix if we’re to create the complete modern lawyer?” They are:

1. Ability to Collaborate

2. Emotional Intelligence

3. Financial Literacy (adding a nice dimension to Colin’s recent post)

4. Project Management

5. Technology Affinity (um, not sure about this term and can hear the cries of gross commercialism from among more senior lawyers – as with 3. – but it means being competent at using your computer, the internet and other mobile technology.)

6. Time Management

I would add ethics reflection and decision-making to the list. Also, I am fairly sure there are other aspects of justice and the law that, while more about substantive knowledge and attitudes, could be presented as related skills.

In any case, given its connections to mastery, social relatedness and the emotions of the individual, this set of skills has great potential to also support the well-being of lawyers and law students in ways that the traditional skills do not, or do not as fully. What do you think of the list? Whether we’re at the beginning of the year, or in the slightly wilder stage, does it seem like a useful and worthy guide for our teaching?

Justine Rogers

Preparing law students for IT in the workplace

According to this article, competition among law firms is heating up and those that are using Information Technology (IT) as a collaborative communication tool may have an advantage. The article looks at some technologies that law firms are harnessing to enhance lawyer-client interactions. These allow greater flexibility in access, leading to greater client satisfaction. The article raises the question of whether universities are preparing students to use technology as collaborative communication tools. Are universities a strong link in the chain between students coming into university with strong IT skills and the changing nature of the workforce, which is also utilising the tools that IT has to offer?

By Thomas Molloy 
