Yes, it seems that a Terminator scenario is finally upon legal professionals. The machines are taking over! And you are about to be out of a job. Maybe. Sort of, according to The Atlantic:
“That’s one of the issues the Wall Street Journal raised yesterday in an article on the ways computer algorithms are slowly replacing human eyes when it comes to handling certain pieces of large, high-stakes litigation. It focuses on a topic that is near and dear to the legal industry (and pretty much nobody else) known as discovery, the process in which attorneys sort through troves of documents to find pieces of evidence that might be related to a lawsuit. While it might seem like a niche topic, what’s going on in the field has big implications for people who earn their living dealing with information.
The discovery process is all about cognition, the ability of people to look at endless bales of info and separate the wheat from the chaff. For many years, it was also extremely profitable for law firms, which billed hundreds of dollars an hour for associates to glance at thousands upon thousands (if not millions) of documents and note whether they might have some passing relevance to the case at hand. Those days are pretty much dead and gone, thanks to cost-conscious clients and legal temp agencies that rent out attorneys for as little as $25 an hour to do the grunt work. Some firms are still struggling to replace the profits they’ve lost as a result.
And now comes the rise of the machines — or, more precisely, the search engines. For a while now, attorneys have employed manual keyword searches to sort through the gigabytes of information involved in these cases. But as the Journal reports, more firms are beginning to use a technology known as “predictive coding,” which essentially automates the process at one-tenth the cost. Recently, a magistrate judge in a major Virginia employment discrimination suit ruled that the defense could use predictive coding to sort through their own data, despite objections by the plaintiffs, who worried it might not pick up all the relevant documents. (Probably left unspoken here: plaintiffs in lawsuits also like to drive up the costs for defendants, in the hope that it will encourage them to settle.) Unspoken, but painfully obvious.
“In truth, researchers have found predictive coding to be as accurate as, if not more accurate than, the attorneys it’s replacing. As the WSJ noted:
Several studies have shown that predictive coding outperforms human reviewers, though by how much is unclear. A widely cited 2011 article in the Richmond Journal of Law and Technology analyzed research on document review and found that humans unearthed an average of about 60% of relevant documents, while predictive coding identified an average of 77%.
The research also showed that predictive coding was more precise, flagging fewer irrelevant documents than humans did.
“Human readers get tired and make mistakes. They get fatigued,” says David Breau, an associate at law firm Sidley Austin LLP who has written about predictive coding.
Shorter version: There is now software that’s smarter and more efficient at these tasks than a human with a JD. Not only that, but it’s finally being given sanction by the courts, which would have the power to stop such a new technology in its tracks if they chose.
For the legal industry, this is a mixed blessing. In the same way that robotics has created factories that need fewer workers, these programs will create firms that need fewer lawyers (even if that just means they’re renting fewer temps). Firms that have already figured out how to prosper without the enormous margins they achieved by charging egregious fees for associates to perform menial labor will benefit. The partners will keep on doing the most valuable work, even as the bottom rungs of their firms shrink.”
Two things to consider here. First, I think there will be some pullback after a certain growth period. Why? Because computers still can’t do everything a human can. This is not just wishful thinking from those who do doc review for a living. Having worked on numerous reviews and CMS development projects, I can tell you that the big problem with computers is that they have to be programmed to know what’s relevant and what’s not, yet on each and every project, something completely outside the scope of the original plan comes up, and the next thing you know there’s a whole new element involved. These issues are almost always caught by someone on the review team, not a partner or managing consultant. So how can you hope or trust that a computer will catch everything it should when it has to be programmed for the unknowable? Deep. I know. That said, it’s obvious that clients and firms want to slash costs, and reducing some of the grunt work done by associates is one way to do it.
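For readers curious what “programming” relevance actually looks like: predictive coding is, at its core, supervised text classification. Attorneys label a small seed set of documents as relevant or not, a model learns from those judgments, and it then scores the rest of the corpus. Here is a minimal, hypothetical sketch of that idea using scikit-learn; the documents and labels are toy examples invented for illustration, not real case data or any specific vendor’s method:

```python
# Sketch of the predictive-coding idea: fit a classifier on a small
# attorney-labeled "seed set", then score unreviewed documents.
# All documents and labels below are toy examples for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

seed_docs = [
    "Email discussing the termination of the plaintiff",
    "Memo on hiring practices and performance reviews",
    "Lunch menu for the quarterly office party",
    "IT notice about scheduled server maintenance",
]
seed_labels = [1, 1, 0, 0]  # 1 = relevant, 0 = not relevant (attorney judgments)

# TF-IDF turns each document into a weighted word-count vector;
# logistic regression learns which words signal relevance.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(seed_docs, seed_labels)

# Score the rest of the corpus; documents above some cutoff
# would be routed to human reviewers.
unreviewed = [
    "Follow-up email about the plaintiff's performance review",
    "Reminder: server maintenance window this weekend",
]
scores = model.predict_proba(unreviewed)[:, 1]  # probability of relevance
for doc, score in zip(unreviewed, scores):
    print(f"{score:.2f}  {doc}")
```

The sketch also illustrates the limitation above: the model can only weigh words it saw in the seed set, so a “whole new element” that never appeared in the training documents contributes nothing to the score.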
Second, the jobs won’t all be gone even if this takes off. Things will not be as dire for low-level lawyers who are ready to transition into the very industry that is sapping their jobs: e-discovery. Trust me, I know plenty of recent grads who are bypassing practice altogether and becoming project managers or “document forensics technicians” in this growing field.
Thoughts? Agree, or are we all doomed?