Above are the slides from my July 20 presentation on crowdsourcing to the American Association of Law Libraries annual meeting. When I first suggested the title, I was sure the presentation would be a positive one, demonstrating the ways in which crowdsourcing and collaboration “are changing” legal research. I have long been a believer that crowdsourcing can help democratize legal research and enable free research sites to become more viable alternatives to paid sites.

But as I dug deeper into my research for the presentation, my long-held fears about crowdsourcing were increasingly confirmed. It has just never worked well within the legal profession. Over the years, site after site has attempted to make a go of crowdsourcing, and almost all of them have failed. Why is that?

I have a quote in one of my slides that may pretty well sum up the answer. It is from Apoorva Mehta, who is now a huge Silicon Valley success story as the founder of the grocery-delivery service Instacart, but who, earlier in his career, attempted to start a legal networking and crowdsourcing site called Lawford (later called LegalReach). When later asked why Lawford failed, he said:

I didn’t know anything about lawyers when we started. Turns out, they don’t like technology, and they don’t like to share things.

Anyone who is considering starting a crowdsourced law site should take Mehta’s quote, frame it and hang it above their desk.

That said, and perhaps I am ever the optimist, I do believe there is hope. Three sites, in particular, stand out to me as potential success stories in the small world of crowdsourced legal research. I’ll get to those later in this post, but first let me recap some of the history as I presented it at AALL.

The Idea of Crowdsourcing

Thanks to Wikipedia, we all get the basic idea of crowdsourcing. A dictionary defines it this way:

The practice of obtaining needed services, ideas, or content by soliciting contributions from a large group of people and especially from the online community rather than from traditional employees or suppliers.

You could argue that a company such as West Publishing has been using crowdsourcing for decades. It relies on armies of lawyers to write the headnotes, assign the key numbers and create the editorial enhancements that add value to the raw cases it publishes.

Of course, West pays its editors, so that doesn’t count as crowdsourcing. But what if you could replicate those editorial enhancements through crowdsourcing? What if a free legal research site could post cases and other primary legal materials and ask its users to help enhance, explain and explicate those materials? That, my friends, is the aspirational ideal of crowdsourced legal research.

In the legal field, one of the earliest attempts at crowdsourcing was by Harvard (then Stanford) law professor Lawrence Lessig. His book Code: And Other Laws of Cyberspace came out in 1999, and five years later it needed an update. So Lessig posted it to a wiki and invited the Internet community at large to edit it by adding ideas, questions or new material. In January 2006, Lessig took the product of that wiki edit and added his own edits to produce Code Version 2.0. Revision by crowdsourcing seemed to be a success.

But where Lessig appeared to succeed, many others have failed. The legal Web is haunted by the spirits of the many crowdsourced sites that have come and gone. Among them:

  • Spindle Law. Launched in 2010 (see my write-up), it was to be “a new kind of legal research and writing system” that would make legal research “faster and smarter.” Part of the way it planned to do this was by using crowdsourcing to build on the knowledge of its users. All registered users could add or edit authorities, edit its tree-like hierarchy, comment on authorities, and vouch for or reject authorities.
  • Jurify. This site’s motto said it all: “No lawyer is smarter than all lawyers.” Launched in 2012 as the “first mass collaboration platform for lawyers and clients,” Jurify’s focus was on using crowdsourcing to enhance access to legal research. “Think of it as a Wikipedia for the law,” VentureBeat reported at the time. “By crowdsourcing the curation and information-gathering process, the startup plans to slash subscription fees for legal research.” Not long after it launched, it disappeared, only to re-emerge in January 2014 in a version that scaled way back from the original but stuck with the crowdsourcing idea. It did not survive the year.
  • Standardforms.org. This was founded in 2011 (see my write-up) by an attorney who was a VP and legal counsel at Brown Brothers Harriman. His idea was to use crowdsourcing to come up with better legal forms. Anyone could post a form and anyone else could revise or comment on it. His hope was that crowdsourcing would yield a consensus on what should and should not be in legal agreements. It never took off.
  • Lawford. This was the site launched by the aforementioned Mehta in 2011. On my blog at the time, I said this about it: “Lawford’s developers have the ambitious goal of building the largest legal networking platform in the world. In fact, they say that they hope someday to have every lawyer in the world become a contributing part of the site.” The plan was to partly populate the site with court opinions and legal articles and then have users build discussions around them.

Besides these research sites, there are the corpses of the many legal networking sites that died primarily because of the lack of one essential ingredient: participation by lawyers.

Suspended Animation

Besides the sites that died, there are also a number of crowdsourced legal sites that remain online, but in what can only be described as states of suspended animation. There is, for example, the LawLibWik, a wiki for law librarians that was created in 2006 and last updated in 2007. Another is the Civil Law Dictionary, where the last update I could find was six years ago and where all the updates were made not by the crowd, but by a single person, the site’s founder.

A site I am particularly disappointed to see becoming dormant is Mootus. When it launched in 2013, it struck me as having the potential to gain some traction. I wrote about it both here and in the ABA Journal. It took a different approach to crowdsourcing, describing itself not as a legal research site, but as a platform for “open online legal argument” designed for both law students and practicing lawyers.

The idea was that a user would post a legal issue to be “argued.” Other users would respond by posting cases they believed were relevant to the issue, together with their arguments for why each case applied. Still other users could then comment on a posted case and vote on whether it was “On Point” or “Off Base.”
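For readers who like to see a workflow spelled out concretely, here is a minimal sketch of that issue-citation-vote loop. The class names, fields and the example citation are my own inventions for illustration; they are assumptions, not Mootus’s actual data model.

    # A hypothetical sketch of the Mootus-style workflow described above.
    # None of these names come from Mootus itself; they are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class CitedCase:
        citation: str            # the case a user believes is relevant
        argument: str            # the user's argument for why the case applies
        on_point_votes: int = 0
        off_base_votes: int = 0

        def vote(self, on_point: bool) -> None:
            # Other users vote the posted case "On Point" or "Off Base."
            if on_point:
                self.on_point_votes += 1
            else:
                self.off_base_votes += 1

    @dataclass
    class Issue:
        question: str            # the legal issue posted to be "argued"
        cases: list[CitedCase] = field(default_factory=list)

        def add_case(self, citation: str, argument: str) -> CitedCase:
            case = CitedCase(citation, argument)
            self.cases.append(case)
            return case

        def ranked_cases(self) -> list[CitedCase]:
            # Rank cited cases by net community support.
            return sorted(self.cases,
                          key=lambda c: c.on_point_votes - c.off_base_votes,
                          reverse=True)

    # Example: one issue, one cited case (a made-up citation), two votes.
    issue = Issue("Does the statute of frauds bar enforcement of this oral agreement?")
    cited = issue.add_case("Example v. Example, 123 X.2d 456 (hypothetical)",
                           "Several writings read together can satisfy the statute.")
    cited.vote(on_point=True)
    cited.vote(on_point=True)
    print(issue.ranked_cases()[0].citation)

The mechanism, in other words, is ordinary question-and-answer plus community voting; the hard part, as the next exchange shows, was getting anyone to supply the answers.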

But with activity on the site seeming to have dwindled, I asked its co-founder Adam Ziegler about what had happened. Here was his response:

Our early experiments asked:

  1. Would lawyers post research questions to the crowd?
  2. Would the crowd post answers in the form of legal citations?
  3. Would other lawyers find the public Q&A thread useful/helpful?

We found yes to the first, no to the second, yes to the third.

On the second, we found “no” even for unemployed new lawyers sitting on the couch – pretty clear refutation of our hypothesis that we would find early contributors among the community of underemployed lawyers.

My takeaway from those results was that explicit, unpaid crowdsourcing, Q&A-style, isn’t a viable model right now. I’d love to see someone prove that wrong.

Note his conclusion: Not a viable model right now.

Another dormant site that seemed promising is Law Genius, whose motto is “Legal analysis for the crowd, by the crowd.” It is part of the larger Genius network of crowdsourced community sites, all of which grew out of the original site, Rap Genius, which was started in 2009 for the purpose of listing and annotating rap lyrics.

Rap Genius was so successful that soon, users started using the site to annotate all sorts of other stuff, from the collected works of Shakespeare to the roster of the 1986 New York Mets to the warnings on the back of a Tylenol bottle. A year ago, the site officially relaunched as Genius, becoming a hub for a range of communities devoted to topics such as rock, literature, history, sports, screen and tech. All are united by the site’s overarching goal, “to annotate the world.”

All of these Genius sites seemed to thrive. And then came Law Genius last November. It was an effort to crowdsource statutes, case law and other legal news. When it launched, I interviewed its editor, a Yale Law School graduate. She said:

There’s so much information lawyers have (particularly in our own little fields of expertise) and we have so much to say about what’s happening, though we usually keep those thoughts to ourselves, either writing emails to listservs or blogging in our small interconnected blogospheres.

I thought, wouldn’t it be great if those conversations happened publicly, around the text of actual opinions and statutes themselves? And before you know it, I came here to kickstart Law Genius.

Fast forward to today, and there has been virtually no activity on Law Genius. The latest update shown on the home page is the same update that was there when I wrote about the site in November. As best I can tell, the editor I interviewed left not long after I heard from her.

Is There Hope?

So is there hope for crowdsourcing in legal research?

There are a handful of legal sites that achieved some success in using crowdsourcing to build content. Starting in 2005, the Electronic Frontier Foundation used crowdsourcing to create its Internet Law Treatise. But it has not been updated by anyone for more than two years and not substantially updated for more than five years. JurisPedia has used crowdsourcing to create an encyclopedia of world law with more than 1,600 articles. But it has not been updated by anyone since 2012. I don’t know about you, but I like my legal research materials to be a little more timely than that.

One wiki-like site that always impressed me for being fairly thorough and up-to-date was Judgepedia, an encyclopedia of courts and judges. It no longer exists as a freestanding site, having been absorbed earlier this year into Ballotpedia, an encyclopedia about American politics and elections. But its content is still there and still timely.

Curious about its secret for crowdsourcing success, I contacted its editor-in-chief, Geoff Pallay, and asked him about it:

How do we keep it up-to-date? That is the billion-dollar question, isn’t it. Nearly all of our contributions are from staff writers — close to 99%.

So Ballotpedia’s “secret” is that it is not really crowdsourced. Like West, discussed earlier in this post, it pays its writers. And that’s not crowdsourcing.

Three That Could Be Contenders

Abandon all hope ye who read this post? Not so fast. There are three sites that appear to be achieving some success using crowdsourcing and that could provide models for others.

The first is Casetext, a site that I have written about frequently in recent years. It launched in 2013 as a free legal research site that would use crowdsourcing to annotate cases. Users would be able to add tags to cases to help organize them, add links to secondary sources that discuss the case, add annotations to the case, and reply to and comment on other users’ annotations.

That original concept was slow to get going, so Casetext looked for ways to get legal professionals more engaged in the site and in discussions around cases. Its first big step in this direction came last October, when it introduced new community pages. These are pages organized around practice areas and interests where lawyers can contribute analysis, meet others in their fields, and engage in discussions about current legal developments. The pages have become fairly popular, with some communities now having more than 10,000 followers.

Then, in June, Casetext launched Legalpad, its publishing platform custom designed for people who write about the law. Users draft articles in Legalpad, and the articles are published to Casetext’s communities and also become part of the Casetext database of legal commentary. If an article discusses a case, the article links directly to the case and the case links back to the article.

It’s all about incentives, Casetext founder Jake Heller told me:

When we started out, we focused on annotations. The truth is, attorneys don’t write that way. The right people weren’t incentivized to write.

We created communities to give people real incentives to write on Casetext.

Now what we’re trying to do is make it easier. You don’t need a blog or WordPress. You don’t need to worry about SEO – we have 350,000 users every month. From day one, you can speak to a built in audience.

The big-picture goal is to match the incentives and interests of people who are excited to write about the law … with what we think will be a really powerful research experience. … We want to make these discussions into data, to overlay the social layer with the primary source documents.

And then there is the oldie-but-goodie Wex, the crowdsourced legal dictionary started in 2005 by the Legal Information Institute. Although it has used a loosely crowdsourced model since its start, earlier this year it began a more ambitious effort “to change fundamentally the nature of Wex into something that was more crowdsourced,” LII Associate Director Craig Newton told me.

As part of that effort, the LII — in a move similar to Casetext’s — built its own authorship platform within its content management system. The idea was to make it easy for its contributors to contribute and for its editors to edit. The LII recruited around 100 contributors to kick the tires on the new system and also developed a “how-to” guide.

“While the goal is ultimately true Wikipedia-style crowdsourcing,” Newton said, “we’re several steps away from that.”

Also similar to Casetext, the LII is focusing on how to incentivize contributors. One way they plan to do that is by highlighting contributors through author profiles. Another is to enable contributors to earn “karma points” and badges.

It remains my belief (founded more on instinct than data) that author attribution is a key piece. Not only is attribution a big incentive for folks to write for Wex, it is a good tool for the reader in order to evaluate the “trustworthiness” of what’s in the article.

Last but not least of my crowdsourcing success stories is CanLII Connects. A project of the Canadian Legal Information Institute, it was launched in April 2014 as a way to marry the case law CanLII houses with commentary from the legal community. To do this, it encourages lawyers, scholars and others who are competent in legal analysis to contribute commentary on cases or to post summaries of cases. (See my 2014 write-up.)

Only registered members are allowed to post, and only after they have been approved by CanLII Connects staff. Once membership is granted, any member can add content, comment on the content of other members, and up-vote content. The site also allows entities to register as a “publisher” and post content. A publisher can be any law firm, organization, group, business or school that is also a member of the legal community. Publishers can post content to CanLII Connects directly and can also authorize affiliated individuals (such as members of a firm) to post under the publisher’s name.

The site also draws content from blogs and other publications. It does not scrape content directly from other sites, but it encourages authors to republish their content on CanLII Connects. In this way, the author can link his or her content directly to the ruling it discusses and make it discoverable by someone who is researching that ruling.
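To make that gatekeeping and publishing model concrete, here is a small, hypothetical sketch of the rules just described: only staff-approved members may post, a publisher may authorize affiliates to post under its name, and each piece of commentary is linked to the ruling it discusses. The names and structure are my own assumptions, not CanLII Connects’ actual implementation.

    # A hypothetical sketch of the membership and publisher rules described above.
    # These names are illustrative assumptions, not CanLII Connects' real code.
    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class Member:
        name: str
        approved: bool = False       # posting requires approval by site staff

    @dataclass
    class Publisher:
        name: str                    # e.g., a law firm, school or organization
        affiliates: set[str] = field(default_factory=set)

        def authorize(self, member: Member) -> None:
            # A publisher can let affiliated individuals post under its name.
            self.affiliates.add(member.name)

    @dataclass
    class Commentary:
        byline: str
        ruling_url: str              # links the commentary to the ruling it discusses
        text: str
        upvotes: int = 0

    def post_commentary(member: Member, ruling_url: str, text: str,
                        publisher: Optional[Publisher] = None) -> Commentary:
        if not member.approved:
            raise PermissionError("Only members approved by staff may post.")
        if publisher is not None and member.name in publisher.affiliates:
            byline = publisher.name  # post under the publisher's name
        else:
            byline = member.name
        return Commentary(byline=byline, ruling_url=ruling_url, text=text)

    # Example: an approved member republishes a blog post under her firm's name.
    firm = Publisher("Example LLP")
    lawyer = Member("Jane Doe", approved=True)
    firm.authorize(lawyer)
    note = post_commentary(lawyer, "https://example.org/ruling",
                           "A short case summary.", publisher=firm)
    print(note.byline)               # Example LLP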

Take one look at the site and you can see for yourself that the formula is working. Not only has it developed a good flow of content, but the content is thoughtful and timely. Colin Lachance, who in April left his position as CEO of CanLII, told me earlier this year:

We are at the beginning of a virtuous circle of growth: increased integration with our primary law site means greater awareness of the commentary site, including greater awareness of who is contributing; when lawyers and law profs see work from their peers on the platform, they are motivated to join and contribute their own work; expanding content from an expanding roster of respected professionals drives greater usage which make the platform more complete and more attractive to existing and future contributors to keep the flow of content going; continual growth will prompt us to pursue deeper integration of commentary and primary law which, hopefully, keeps the circle moving.

In other words, momentum is needed. As a site begins to build involvement, that involvement yields further involvement. There needs to be a critical mass of participation before any crowdsourced site can take off.

What Is The Secret Sauce?

So is there a secret sauce for crowdsourcing success in the legal field? Judging by these last three sites, I would sum it up as follows:

  1. Make it easy to contribute.
  2. Make it rewarding to contribute.
  3. Make the content useful to others.
  4. Success will breed success.

I’d love to hear your thoughts on crowdsourcing. I continue to believe that it could someday change the face of legal research. The big legal publishers do not have a monopoly on smart lawyers with good ideas and insights. If we could tap into the knowledge of the legal community at large, we could build a library of free legal research materials that could rival any paid ones.


Bob Ambrogi

Bob is a lawyer, veteran legal journalist, and award-winning blogger and podcaster. In 2011, he was named to the inaugural Fastcase 50, honoring “the law’s smartest, most courageous innovators, techies, visionaries and leaders.” Earlier in his career, he was editor-in-chief of several legal publications, including The National Law Journal, and editorial director of ALM’s Litigation Services Division.