
I’ve just finished Shaping the Bar by Prof. Joan Haworth. I think it’s a must-read for law librarians in any work context. As I moved recently from a public law library to an academic one, I’ve been trying to get up to speed on changes I’ve missed. This was a text I’d seen mentioned a few times in legal media but, given my workplace, wasn’t one whose title felt relevant to my own library’s direction. If the legal profession has any shape when viewed from a public law library, it may be pear-shaped.

My read of Shaping the Bar was both transformative and reassuring, especially given my own experiences working with lawyers, including at a Canadian lawyer regulator. Its discussion of practice competency—and legal research, analysis, and technology within that—offers a number of opportunities for the law libraries preparing lawyers but also those receiving them as new colleagues or helping them navigate library services as part of the administration of justice.

Long-time readers of this blog will know that I’ve struggled with this issue surrounding legal research for a while. What is the disconnect between the perception of new lawyers’ legal research competence and the reality? When is legal research good enough? And what is good enough, if it doesn’t trigger discipline, even when a law library or librarian isn’t involved? What if our expectations are those of the expert and are out of line with what is required for competence?

The Base Line Is Competence

One of the fascinating things while working at Ontario’s lawyer regulator, as the library director but also providing support by speaking regularly on technology and competence, was how the topics changed so little from year to year, and from cohort to cohort. Over more than a decade, the knowledge gaps remained largely the same. The guidance changed, bit by bit, as the world and technology changed, but the lawyers coming to the sessions continued to have the same knowledge gaps: how to practice competently, how to access the information and use the tools they needed to be successful.

A topic of regular conversation at the Law Society was what the regulator’s role was, as bar administrator and disciplinary body. Unlike American regulators, Canadian regulators often act like affinity bar associations. They’re overseen by lawyers and judges, as part of the self-regulated profession concept, but that governance can become heavily tainted by “what’s best for the bar” rather than “in the best interests of the public.”

This guild support creates tension with the regulatory side (see California). But one thing I always appreciated was the focus on assessing the minimal competencies a lawyer needs (Ontario being Ontario, there are different competencies for barristers (litigators) and solicitors (non-litigators)). A lawyer takes both exams (there is no practice distinction between the two hats, as there remains in the UK), but the emphasis is on minimal competency.

The Law Society, because of the current CEO, is also very focused on evidence-based regulatory work. In particular, they review the practices of lawyers who are (a) new to the bar and (b) returning to practice after being away, among others. Why? Because the data shows those are the people who need to be reviewed. The regulator’s data, drawn from years of practice reviews and discipline outcomes, helped them define the types of practices likely to create the greatest risk for the public.

How To Shape The Bar

Shaping the Bar, then, was a happy discovery for me: it confirmed some things I had seen, supported some things I had conjectured, and offered some ideas for a way forward. There is a lot of substance in this text for people, like law librarians, whose primary role relates to the practice-specific success of lawyers. And where better to begin than with minimal competence?

This is a thread that permeates the book. In particular, is the bar exam delivered in U.S. states an assessment of competence or excellence (or something else)? All the data points Prof. Haworth uses suggest something other than competence. I had not realized the wide variety in cut scores on the Multistate Bar Exam, despite the use of a standardized test (p. 79). What possible, justifiable reason could two states have for different pass rates on a standardized measure of competence?

Bar exams test aspects of how good a law student you were, not how competent [a lawyer] you would be.

Shaping the Bar, Joan Haworth, p. 91 (2023)

I can hear in my head people arguing that one is “easier” or “not as good as” the other, but it’s competence. Assuming the regulator’s regulating, incompetence will be caught by discipline. Raise the cut score until discipline goes down. But don’t do it because you’re the Show-Me State v. the It’s-Good-Being-First State.

Law schools may use varying scores on a standardized test (the LSAT) to goose the number of students they accept, but it’s not a competence barrier (even if it remains a barrier of access). Prof. Haworth steps through the variety of barriers that the legal profession has raised behind itself: the bar exam, the character and fitness requirement, and so on.

I really appreciated that she looked north of the border, where the Canadian provincial regulators (as in the US, regulation of lawyers is not federal in Canada) have started to consider alternate approaches. The Prairie provinces have created a shared practice-readiness program (p. 94). Like Ontario, they are trying to squeeze around the bottleneck that apprenticeship (articling) creates. In Ontario, more law students are graduating than there are articling roles, and even the newly created alternate practice path cannot absorb all of the excess candidates.

Post-graduation residencies also create serious inequities because access to supervisors is based on preexisting connections and relationships, problems that have infected supervised practice and articling systems around the world. The pattern of people of color and others who are not well connected not being able to obtain articling positions has caused some jurisdictions … to create alternative training paths.

Shaping the Bar, Joan Haworth, p. 86 (2023)

This is the challenge, though. As Prof. Haworth points out (and has written about elsewhere), a minimally competent lawyer will have been supervised by the time they are called to the bar. But pushing that burden out onto the profession results in a variety of disparities. When you consider that most of the practicing bar consists of solo and small-firm lawyers, who may not be able to afford to take on a second lawyer, let alone the additional time required to supervise them properly, it is not hard to see why this is a failure waiting to happen.

Content warning: suicide
Imagine getting to the end of your law school career and finding the way forward blocked. This was not my experience, as I never took the bar exam and went straight into libraries. But one of my closest friends successfully finished law school in London (UK). He was then faced with an unpaid pupillage to complete his entry into the profession (this has since changed in some parts of the UK). He was unable to get funding for the 12 months it would take him, unpaid, to become qualified. He committed suicide after being unable to overcome this obstacle. When I consider barriers to the profession, it’s always colored by his life and his lost possibilities. Money should never be the barrier to accessing the profession.

A note on that data, since I noticed this within Prof. Haworth’s book too: I always struggle to find recent data about the legal profession. The Federation of Law Societies in Canada has been pretty consistent. Their 2022 data shows 75,726 practicing, fee-paying lawyers, 15,208 of whom identify as solos. Another 19,739 identify as professional corporations, which can include solos. There are 89 firms with more than 51 lawyers and 4,815 with 2-10. But there’s no breakdown of how many lawyers are in each firm.

The ABA doesn’t appear to capture practice size any longer, and its pointer to a report by the ABA Foundation just dead-ends at the ABF’s website. The ABA used to point to a single ABF report from 2000 that was updated for purchase in 2004. That is the last ABA data I could find covering practice size, and it predates the mid-size firms being absorbed into the mega firms in the 2010s. It put the figure at 69% for firms with 10 or fewer lawyers. This was my aggregation of the data to hand in 2015.

At the beginning of the century, newly established institutional players began taking the profession’s battles over prestige and exclusion to the national level. Corporate law firms took shape and built enduring alliances with the more elite law schools. This convergence of interests between the corporate law firms and most exclusive law schools helped them both take leadership of the profession, which would continue for many decades to be, in fact, largely a “cottage industry of single practitioners.”

Shaping the Bar, Joan Haworth, p. 23 (2023)

Same as it ever was.

Of course, it’s not just the bottleneck of post-graduate supervision that relies on the bar. It’s the cost, too. This site says tuition now averages over $50,000 a year, with total costs (tuition plus) over $230,000 for the entire degree. As an aside, if we’re worried about the law librarian pipeline, imagine adding on the median library school degree at $44,000 (out of state) a year.

One of the areas of Prof. Haworth’s text I was most excited about was that some agencies are actually trying to gather data. In the legal profession!! In particular, the work done by the State Bar of California (notably after California divested the affinity-bar portions that bedevil regulators; the Bar now includes six trustees from outside the legal profession) and its California Attorney Practice Analysis is something I plan to dig into.

Law students know what minimal competence looks like. I remember getting together with my study group in law school. The study room we used regularly had a white board in it. Someone had written “C- = J.D.” on it and truer words were never spoken. When you are in a system that uses grading curves to enforce failure without regard to competence, you make up your own markers for success. Minimal competence isn’t an A grade. And it should be achievable by everyone who gets a J.D.

In particular, the CAPA process used something called experience sampling. It meant that the lawyer was checking in throughout the day to show what they were doing. It wasn’t just a “give me the feels” response to questions. It was asking them to capture, in the moment, what it was they were doing. There were five categories of increasing cognitive complexity and the average level ended up being around 4 out of 5 in complexity (p. 55).

This reminded me of nothing so much as the Quantum/Dialog value chart that I’ve referred back to over the years, although never with this sort of precision data to go with it. Unlike for lawyers, how far we can reach in complexity has more to do with the barriers the legal profession has set on us (like unauthorized practice of law) than with our own expertise. It would be interesting to run this sort of research with law librarians.

A chart that has a left axis called professional skill level and a bottom axis called information product. The chart is intended to show that there is a continuum from the bottom left to the top right where more expertise will result in a more detailed and valuable information product. At the bottom left, the two options are "find data" and "raw data" and at the furthest points, they are "synthesize and recommend" and "report"
A chart that I have overlaid with my perception of where legal researchers are “allowed” to contribute value on this continuum. The original graphic is the information product v. skill level continuum.

As I say, lots to chew on. But still not quite to the heart of things.

Legal Research and the Practice of Law

One of the first things I did when I got to my new role was go through a faculty orientation. One of the sessions was on teaching and it was good to get a refresher. In particular, it gave me the opportunity to ask a question that I think may be the most important one to answer when I start teaching. It was this: how do you teach lawyers ambiguity?

As we all know, legal research isn’t a definitive outcome experience. You may or may not find law or commentary that will support the facts of your situation. This is a frustration in public law libraries because it is hard to explain this ambiguity to people who have no legal background but see that most legal issues can be resolved in 30 minutes on TV.

So I had a moment of utmost satisfaction when I got to this point in Shaping the Bar:

Lawyering is about handling ambiguity—knowing, above all, what is not known and needs to be pursued….

Shaping the Bar, Joan Haworth, p. 77 (2023)

Convergence.

So where is the place for law librarians as we watch the NextGen bar exam unfold, with potential shifts that might make practical skills, like legal research, more visible in attaining minimal competence?

I’ll pass it back to Prof. Haworth.

The foundational abilities of every competent lawyer include legal reading, legal research, legal analysis, and legal writing.

Shaping the Bar, Joan Haworth, p. 97 (2023)

and

In rethinking educational requirements, jurisdictions should reverse engineer from the problems that generate malpractice awards and attorney discipline. Lawyers’ failures often related to law office management, substance abuse, health crises, and the like. Memorizing the rules of professional responsibility…is not the same as competently managing the responsibilities of practice.

Shaping the Bar, Joan Haworth, p. 90 (2023)

I would add my own perception that when you talk about law office management, you are also talking about the overwhelming need to understand how technology is used to practice competently. We wouldn’t have Comment 8 to Rule 1.1 if technology didn’t now bear on competence (Canada has followed suit, as shown in Ontario). So law schools that have been sleeping on legal technology exposure (to manage intake, conflicts of interest, deadlines, money and internal financial controls, records retention and long-term client file storage, etc.) may want to focus on that too.

But the thread that runs through this text is that we are not currently preparing lawyers to be minimally competent for practice AND the obstacles to practice (bar exam, etc.) are not measuring minimal competence either. I was surprised by the shift in how legal research is taught since I went to law school: it is now subsumed in broader skills classes when it is something that needs focus and repetition to gain expertise.

That, for me, is the big takeaway, but it’s also something I’ve been thinking about for a couple of years. Proficiency in legal research is gained through practice, with expertise acquired over years. I think the curse of knowledge is one reason that new lawyers, and even law students generally, may be perceived not to have sufficient legal research expertise. It may have more to do with the people making the assessment (law librarians with expertise, supervising lawyers with greater experience) than the reality of their skills. It’s not clear to me that we know whether they do or don’t have minimal competence (pre- and post-assessments to measure attainment would tell us, for example). I’m also not confident that minimal competency can be handled by course options like advanced legal research unless they scale to the entire graduating class or student body.

At this point, if the bar exam is the target, proximity may be an important element. This finding in a law journal article has stuck with me. Legal research instruction may need to be positioned late in the degree if bar success (separate from competency) is the priority.

The most effective pre-graduation interventions we studied resulted from a paradigm shift: UC Law SF improved bar outcomes after it moved from an academic skills development model focused on the most at-risk students based on entering metrics or law school GPA (LGPA) to a model of pervasive, integrated, and iterative skills instruction aimed at all students. Examples include: (1) requiring and encouraging students to take upper-division bar subject classes, with each additional bar subject class taken associated with a 3% increase in the probability of first-time bar passage in the post-2016 period; and (2) offering for-credit bar skills classes in the 3L year focused on improving MBE performance (Critical Studies 2) and on overall bar test taking (Critical Studies 3).

Determinants of Success on the Bar Exam: One Law School’s Experience 2010-2023, Morris A. Ratner, Stephen N. Goggin, Stefano Moscato, Margaret Greer, and Elizabeth McGriff (forthcoming), p. 2, March 18, 2024 draft

But so many of these practice-oriented skills permeate law school, and so, as we do with legal research and legal writing, they should start on day one of the first year. I worked up this graphic for a previous post, but I think it captures the need I imagine. If we are thinking about practice competency and not just the bar exam, which I am now convinced by Prof. Haworth are two distinct things, then legal research needs to be more visible and more broadly incorporated than it seems to be.

A chart with 5 colored bands. The bands are labeled at the top 1st year, 2d year, 3d year, apprentice, and practice. A line arcs across all 5 to represent the legal research skills development arc. Smaller arcs and lines are laid across the colored bars to show how legal research education is currently applied in spot or localized way.
A chart showing the arc of legal research training and how it should spread not only through law school but beyond it, and how spot applications of training may not be enough to support that arc

I already know that legal research is taught in widely varying ways across law schools, often, it seems, by people outside the law library. I’m not sure the who matters as much as the what. I’m going to be looking, now, at how people measure whether the teaching is effective. How, other than bar passage rates, is legal research competence measured?

So I’ll finish as I started. If you’re a law librarian, in any environment, I think Shaping the Bar is worth a read. If you engage in any form of information literacy for lawyers that has to do with legal technology or legal research, I think there is something for you in the text.

For me, it transformed my thinking about the bar exam, which I had viewed, to the extent I thought about it at all, as an immovable obstacle. It’s clearly not, as the NextGen bar shows and as external pressure from the access to justice movement may yet show. A minimally competent lawyer should have no problem passing the bar, if the bar is actually about practicing competently. If the NextGen bar’s legal research question is competence-oriented, then a 100% pass rate on the foundational legal research question is not only possible, it would mean legal research instructors have been successful, and any shortfall would show us where we need to focus. (If the question is not about competence and gets more complicated, that will no longer be true.)

Then what? If we can figure out what it takes to ensure 100% of law students get the legal research question 100% right, can we figure out where that competence threshold is? Can we find out where, on that continuum, people who are not in law school might fall? And how would we help public patrons get further on that continuum so they can represent themselves adequately in the legal system? And, can we also set a goal beyond minimum competency that is also measurable for lawyers to ensure that, in that critical first couple of years, they are gaining the additional competence they need?

So many questions. And I don’t have any answers but Shaping the Bar gives me a lot to think about.