How are legal teams leveraging AI to increase overall productivity and improve contract compliance?
We are not there yet, honestly. We use AI because we believe we must stay ahead of the curve and have weighed the competitive risk of not using it. We have a corporate ChatGPT instance that we can safely monitor, so we are promoting the use of AI, but not yet in the realm of compliance, contract management, or other legal work, although we are considering some of the available options.
Most of the AI-enabled legal tools on the market involve CLM (contract lifecycle management). The AI functionality is typically some version of searching templates or language/terms to support building workflows. There are also tools that redline and, to some extent, can replace manual contract review. The underlying technology is the same as ChatGPT's, but using these tools is not the same as using ChatGPT to create a contract from scratch. Many companies are using AI-powered CLM tools, although they may not be using all of the functionality.
I can only speak to my own department, and we are not leveraging it at all at the moment, but I’ve been very open about experimenting with it, so our team is probably a little ahead in that regard. HR wants to use it for recruiting purposes, but we're unsure how they will do that. I don't envision AI replacing attorneys. Our interest right now lies in understanding it and figuring out where it can be useful.
It has value for me in getting a jump start on a project. For instance, I used it to prepare a draft script for our upcoming shareholder meeting. I went to ChatGPT and requested opening comments for our general counsel to deliver at the meeting. I hit enter, and in about 45 to 60 seconds, it had written a well-written, two-page introduction with perfect grammar, perfect punctuation, and good paragraph separation. It's obviously not the final product, and accuracy must be confirmed, but it's a great way to start something if you're pressed for time. In all fairness, it's not perfect in terms of quality and can be wildly generic. As attorneys, we need to be wary of copyright issues, intellectual property issues, privacy concerns, and trade secrets. You don't really know the source of the information that comes back, so you have to do a quality check. I’ve heard of scenarios where ChatGPT provides completely wrong answers, but it will get smarter and faster with time.
Other than that, I’m unsure how we, as lawyers, will leverage it right now. We’re trying some AI products specific to the legal community, but I’m not terribly impressed so far. I can see it having a role in the contracts world and in anything simple, repeatable, and handled at the administrative level. Different industries, different businesses, and different-sized companies will use it in different ways. AI could be a good tool for small law departments challenged by bandwidth constraints: it can help them scale up where they would otherwise need to go to outside counsel or even hire an in-house attorney, and I think AI can mitigate that need to a certain degree. I'd be very interested in how other in-house departments, typically smaller ones, use different AI technologies and ChatGPT. How are they using it, and are they drafting policies around it? Should we have a policy about AI, and what exactly should the policy say?
As a small legal department (four attorneys), we are very interested in figuring out how to best use the technology and tools. We are not currently leveraging generative AI, but I did attend a presentation by CaseText yesterday (now owned by Thomson Reuters), and I think they may have "cracked the code" with their upcoming generative AI products (CoCounsel & Precision). The 45-minute demo/pitch is worth the time.