Exploring AI at a Mile High

Brown's Law: Colorado's entrepreneurs often use AI to draft contracts. But should they?

Chris Brown

Boulder, Colorado

Last updated on Feb 6, 2025

Posted on Feb 6, 2025

Let’s say you need a contract for a deal with a new business partner. In fairly recent but pre-AI years, you could either draft the contract yourself, look online for a template, or maybe even hire a lawyer. But over the last year or two, I’ve noticed that Colorado entrepreneurs are increasingly turning to AI to draft contracts. With tools such as ChatGPT, a few prompts are all it takes to generate a contract. But should you?

My opinion is that using an attorney remains the best way to mitigate your risks whenever you have to sign a contract. With that said, I don’t believe the legal profession should (or even can) stop entrepreneurs from using AI to meet their legal needs. As I’ve said before, we should ask whether the entrepreneur’s use of AI is a better choice than what they’ve already been doing – searching Google, reading comments on Reddit, or asking a friend for advice. In my opinion, AI already outperforms these options.

The Case for Using AI in Contract Drafting

The unfortunate reality is that many people cannot afford to hire a lawyer. Entrepreneurs, freelancers, and small business owners often find themselves navigating legal issues on a tight budget. In light of this, AI tools offer a potential solution by providing a way to create customized contracts quickly and at little to no cost. Kevin from Boulder tells me that “cost and speed” are the two main reasons he uses AI for some contractual needs. He already subscribes to some AI tools, so when he has legal questions the tools effectively provide free advice.

What’s more, AI does a decent job when compared to existing free or low-cost alternatives. In my experience, the quality of AI-generated contracts is on par with what a law student might produce: better than the average person’s efforts, but far from the level of an experienced attorney. These tools can handle basic legal principles, making them a valuable resource for those who would otherwise lack access to legal assistance.

In one case, Amy from Boulder sent me a lease agreement drafted by ChatGPT, along with this comment: “It’s a small, low-risk deal. I don’t want to invest a lot into it. Is this good enough?” After explaining that it covered the essentials, but did not go into as much detail as I’d recommend, she chose to take the risk and use the AI-generated contract. Another client, Bob from Denver, took the opposite approach. He sent me an AI-generated contract and acknowledged it was “just ok.” After discussing the risks involved in the transaction, he opted to have me create an entirely new agreement in order to better protect his company.

The Risks: From Mistakes to Litigation

While AI tools have potential, they are far from perfect. Current AI models don’t truly understand the law – they don’t grasp context, nuance, or the balancing of risks that are critical in drafting contracts. Additionally, when AI tools make mistakes, the user is unlikely to catch the mistake. And if they do happen to spot a problem, they’re unlikely to know how to correct it. What’s worse, some errors could render a contract invalid or lead to costly disputes.

As covered in my earlier article, even licensed attorneys make mistakes when using AI without properly verifying the tool’s output. If even lawyers fall into such errors, non-lawyers are likely to face similar – or worse – outcomes.

In a nutshell, while AI tools can help identify basic legal issues, they cannot replace the judgment and expertise of an experienced attorney. 

That said, I stand by the idea that if someone understands the risks and limitations of AI tools and still chooses to use them, why not let them? For example, Amy understood the risks of using the AI-generated contract and concluded that her business was small enough that she didn’t need to spend additional money on an attorney. Similarly, Kevin mentioned that he thinks AI is best at “low-stakes situations,” such as “brainstorming clauses or tweaking existing templates.” Those are judgment calls that I believe the entrepreneur can make on their own, or, when needed, in collaboration with their attorney.

Balancing Empowerment and Education

How do we reconcile the promise of AI with its pitfalls? I believe the answer lies in education and transparency. AI tools should come with clear disclaimers about their limitations and the potential risks of using them without professional oversight. When a user asks a generic AI tool such as ChatGPT about a legal issue, the tool should always begin its answer by stating that it cannot provide licensed legal advice. In fact, I often receive such disclaimers from ChatGPT when exploring legal issues or testing its output, but I’m not sure that all AI tools are so conscientious.

Moreover, users must be transparent about their use of AI tools. If a non-lawyer uses AI to draft a contract, then they need to make sure the other side knows about that use so that they can make their own judgment call about whether to engage an attorney.

In the end, it comes down to education and knowledge. If users understand the limitations of AI for legal tasks, they should be free to use it to further their business and gain some – if not complete – protection.
