nadis 12 minutes ago

> "The template uses CommonPaper's Software Licensing Agreement and AI Addendum as a foundation, adapted for the unique characteristics of AI agents. Nick and the GitLaw team built this based on patterns from reviewing hundreds of agent contracts. We contributed our research from working with dozens of agent companies on monetization challenges."

Unless I'm misunderstanding and GitLaw and CommonPaper are related or collaborating, I feel like this callout deserves to be mentioned earlier on and the changes / distinctions ought to be called out more explicitly. Otherwise, why not just use CommonPaper's version?

Neywiny 4 hours ago

I'm not sure I understand why this is about agents. This feels more like contracting than SaaS. If I contract a company to build a house and it's upside down, I don't care if it was a robot that made the call, it's that company's fault not mine. I often write electronic hardware test automation code and my goodness if my code sets the power supply to 5000V instead of 5.000V (made up example), that's my fault. It's not the code's fault or the power supply's fault.

So, why would you use a SaaS contract for an agent in the first place? It should be like a subcontractor. I pay you to send 10k emails a day to all my clients. If you use an agent and it messes up, that's on you. If you use an agent and it saves you time, you get the reward.

  • nemomarx 4 hours ago

    To have that you need a human to take responsibility somewhere, right?

    I think people want to assign responsibility to the "agent" to wash their hands in various ways. I can't see it working though

    • cdblades 3 hours ago

      Exactly. I have said several times that the largest and most lucrative market for AI and agents in general is liability-laundering.

      It's just that you can't advertise that, or you ruin the service.

      And it already does work. See the sweet, sweet deal Anthropic got recently (and if you think $1.5B isn't a good deal, look at the range of compensation they could have been subject to had they gone to court and lost).

      Remember the story about Replit's LLM deleting a production database? All the stories were AI goes rogue, AI deletes database, etc.

      If Amazon RDS had just wiped a production DB out of nowhere, with no reason, the story wouldn't be "Rogue hosted database service deletes DB", it would be "AWS randomly deletes production DB" (and AWS would take a serious reputational hit because of it).

    • arnon 4 hours ago

      If I am a company that builds agents and I sell one to someone, and that someone then loses money because the agent did something it wasn't supposed to: who's responsible?

      Me, as the person who sold it? OpenAI, whose models I use underneath? Anthropic, who performs some of the work too? Is my customer responsible themselves?

      These are questions that classic contracts don't usually cover because things tend to be more deterministic with static code.

      • Xylakant 4 hours ago

        > These are questions that classic contracts don't usually cover because things tend to be more deterministic with static code.

        Why? You have a delivery and you entered into some guarantees as part of the contract. Whether you use an agent or roll dice, you are responsible for upholding the guarantees you entered into as part of the contract. If you want to offload that guarantee, then you need to state it in the contract. Basically, what the MIT License does: "no guarantees, not even fitness for purpose". Whether someone is willing to pay for something where you accept no liability for anything is an open question.

        • mlinhares 3 hours ago

          Technically, that's what you do when you google or ask ChatGPT something, right? They make no explicit guarantee that anything provided back is true, correct, or even reasonable. You are responsible for it.

      • mort96 an hour ago

        If I am a company that builds technical solutions and I sell one to someone, and that someone then loses money because the solution did something it wasn't supposed to: who's responsible?

        Me, as the person who sold it? The vendor of a core library I use? AWS, who hosts it? Is my customer responsible themselves?

        These are questions that classic contracts typically cover and the legal system is used to dealing with, because technical solutions have always had bugs and do unexpected things from time to time.

        If your technical solution is inherently unreliable due to the nature of the problem it's solving (because it's an antivirus or firewall which tries its best to detect and stop malicious behavior but can't stop everything, because it's a DDoS protection service which can stop DDoS attacks up to a certain magnitude, because it's providing satellite Internet connectivity and your satellite network doesn't have perfect coverage, or because it uses a language model which by its nature can behave in unintended ways), then there will be language in the contract which clearly defines what you guarantee and what you do not guarantee.

      • Neywiny 3 hours ago

        Agreeing with the others. It's you. Like my initial house example, if I make a contract with *you* to build the house, you provide me a house. If you don't, I sue you. If it's not your fault, you sue them. But that's not my problem. I'm not going to sue the person who planted the tree, harvested the tree, sawed the tree, etc etc if the house falls down. That's on you for choosing bad suppliers.

        If you chose OpenAI to be the one running your model, that's your choice not mine. If your contract with them has a clause that they pay you if they mess up, great for you. Otherwise, that's the risk you took choosing them

        • hluska 2 hours ago

          In your first paragraph, you talk about general contractors and construction. In the construction industry, general contractors have access to commercial general liability insurance; CGL is required for most bids.

          There’s nothing quite like CGL in software.

          • Neywiny 2 hours ago

            Maybe I'm not privy to the minutiae, but there are websites talking about insurance for software developers. Could be something. Never seen anyone talk about it though.

      • cdblades 3 hours ago

        Did you, the company who built and sold this SaaS product, offer and agree to provide the service your customers paid you for?

        Did your product fail to render those services? Or do damage to the customer by operating outside of the boundaries of your agreement?

        There is no difference between "Company A did not fulfill the services they agreed to fulfill" and "Company A's product did not fulfill the services they agreed to fulfill" - and that doesn't change when Company A's product happens to be an AI agent.

        • jacobr1 3 hours ago

          Well, that depends on what we are selling. Are you selling the service, black-box, to accomplish the outcome? Or are you selling a tool. If you sell a hammer you aren't liable as the manufacturer if the purchaser murders someone with it. You might be liable if when swinging back it falls apart and maims someone - due to the unexpected defect - but also only for a reasonable timeframe and under reasonable usage conditions.

          • Neywiny 3 hours ago

            I don't see how your analogy is relevant, even though I agree with it. If you sell hammers or rent them out as a hammer-providing service, there's no difference except likely the duration of liability.

      • seanhunter 3 hours ago

        These are absolutely questions that classic contracts cover.

      • nocoiner 2 hours ago

        Classic contracts cover liability and allocation of risk in, like, literally every contract ever written?

      • idiotsecant 4 hours ago

        It's you. You contracted with someone to make them a product. Maybe you can go sue your subcontractors for providing bad components if you think you've got a case, but unless your contract specifies otherwise it's your fault if you use faulty components and deliver a faulty product.

        If I make roller skates and I use a bearing that results in the wheels falling off at speed and someone gets hurt, they don't sue the ball bearing manufacturer. They sue me.

      • hobs 4 hours ago

        Yes they do. Adding "plus AI" changes nothing about contract law; OpenAI is not giving you indemnification for crap, and you can't assign liability like that anyway.

    • Animats an hour ago

      These people want to assign it to the customer. See above.

  • jimbo808 2 hours ago

    This is lawyers buying the hype that LLMs are actually intelligent and capable of autonomous decision making.

    • mort96 an hour ago

      Wellll...

      LLMs are not actually intelligent, and absolutely should not be used for autonomous decision making. But they are capable of it... as in, if you set up a system where an LLM is asked about its "opinion" on what should be done, it will give a response, and you can make the system execute the LLM's "decision". Not a good idea, but it's possible, which means someone's gonna do it.
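
The setup described here (a model's free-text "opinion" wired straight into execution) can be sketched in a few lines. `ask_model`, the action table, and the whole scenario below are hypothetical stand-ins, not any real API:

```python
# Minimal sketch of the pattern described above: an LLM's free-text
# "decision" is mapped straight onto a real action with no human in
# the loop. `ask_model` is a hypothetical stand-in for a hosted-model
# call, stubbed here so the control flow is visible.

def ask_model(prompt: str) -> str:
    # A real system would call a provider API; this stub always
    # "decides" to refund, which is the point: nobody approved it.
    return "refund"

# Whitelist mapping model output to side effects.
ACTIONS = {
    "refund": lambda order: f"refunded {order}",
    "escalate": lambda order: f"escalated {order}",
}

def handle(order_id: str) -> str:
    decision = ask_model(f"Customer complaint on {order_id}: refund or escalate?")
    action = ACTIONS.get(decision.strip().lower())
    if action is None:
        # Unrecognized output is the easy case; recognized-but-wrong
        # output sails straight through.
        raise ValueError(f"unrecognized decision: {decision!r}")
    return action(order_id)

print(handle("order-42"))  # the model's "decision" just executed
```

Possible, as the comment says - and trivially easy to build, which is why someone's gonna do it.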

    • BoorishBears an hour ago

      This is the birth of a new anthropomorphic mind virus around how LLMs operate, funded by a team looking desperately for distribution.

      11/10 content marketing but it will be a shame if this gets any attention outside this comment section.

Animats an hour ago

Legal contracts built for sellers of AI agents.

The contract establishes that your agent functions as a sophisticated tool, not an autonomous employee. When a customer's agent books 500 meetings with the wrong prospect list, the answer to "who approved that?" cannot be "the AI decided."

It has to be "the customer deployed the agent with these parameters and maintained oversight responsibility."

The MSA includes explicit language in Section 1.2 that protects you from liability for autonomous decisions while clarifying customer responsibility.

The alternative is that the service has financial responsibility for its mistakes. This is the norm in the gambling industry. Back when GTech was publicly held, their financial statements listed how much they paid out for their errors. It was about 3%-5% of revenue.

Since this kind of product is sold via large scale B2B deals, buyers can negotiate. Perhaps service responsibility for errors backed up by reinsurance above some limit.

ataha322 7 hours ago

The question isn't just who's liable - it's whether traditional contract structures can even keep up with systems that learn and change behavior over time. Wonder if this becomes a bigger moat than the AI.

  • tuesdaynight 6 hours ago

    Probably a dumb question, but what do you mean by changing behavior over time? A contract with changing clauses? From my limited knowledge of the matter, the idea of a contract is agreeing on rules that will not change without agreement from both parties.

    • candiddevmike 6 hours ago

      I encounter this all the time with GenAI projects. The idea of stability and "frozen" just doesn't exist with hosted models IMO. You can't bet that the model you're using will have the exact behavior a year from now, hell maybe not even 3 months. The model providers seem to be constantly tweaking things behind the scenes, or sunsetting old models very rapidly. It's a constant struggle of re-evaluating the results and tweaking prompts to stay on the treadmill.

      Good for consultants, maybe; horrible for businesses that want to mark things as "done" and move them to limited maintenance/care-and-feeding teams. You're going to be dedicating senior folks to the project indefinitely.
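
The treadmill described here is why some teams keep a pinned "golden set" of prompts and re-run it whenever the provider changes something. A minimal sketch, where `call_model`, the model name, and the golden set are all illustrative assumptions, not a real provider API:

```python
# Sketch of a drift check: re-run a fixed "golden set" of prompts
# against the hosted model and flag answers that changed. A real
# version would call your provider with the pinned model identifier;
# here `call_model` is stubbed so the example is self-contained.

GOLDEN_SET = [
    ("What is 2 + 2?", "4"),
    ("Capital of France?", "Paris"),
]

def call_model(prompt: str, model: str) -> str:
    # Stubbed canned answers standing in for a provider API call.
    canned = {"What is 2 + 2?": "4", "Capital of France?": "Paris"}
    return canned[prompt]

def drift_report(model: str) -> list[str]:
    failures = []
    for prompt, expected in GOLDEN_SET:
        got = call_model(prompt, model)
        if got.strip() != expected:
            failures.append(f"{prompt!r}: expected {expected!r}, got {got!r}")
    return failures

# Run on a schedule; a non-empty report means behavior drifted.
print(drift_report("some-model-2024-06"))
```

Exact-match checks are the crudest version; in practice the comparisons get fuzzier, which is exactly the "dedicating senior folks indefinitely" problem.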

      • hodgesrm 5 hours ago

        This is a big motivation for running your own models locally. OpenAI's move to deprecate older models was an eye-opener to some, but also typical of the SaaS "we don't have any versions" style of deployment. [0] It will need to change for AI apps to go mainstream in many enterprises.

        [0] https://simonwillison.net/2025/Aug/8/surprise-deprecation-of...

      • htrp 5 hours ago

        You're gonna have to own the model weights, and there will be an entire series of providers dedicated to maintaining old models.

      • idiotsecant 4 hours ago

        This isn't a new problem. It's like if you built a business based on providing an interface to a google product 10 years ago and google deleted the product. The answer is you don't sell permanent access to something you don't own. Period.

    • avs733 6 hours ago

      I interpreted the comment as worrying about drift across many contracts not one contract changing.

      Imagine I create a new agreement with a customer once a week. I’m no lawyer, so I might not notice the impact of small wording changes on the meaning or interpretation of each sequential contract.

      Can I try to prompt-engineer this out? Yeah, sure. Do I, as a non-lawyer, know I have fixed it? Not to a high level of confidence.

  • lazide 6 hours ago

    You literally don’t want contracts that ‘learn and change behavior over time’?

    What is the stated use case here?

    • hodgesrm 5 hours ago

      No, at least not in all cases. Customers incur review costs and potentially new risks if you change contract terms unexpectedly. In my business many large customers will only adopt our ToS if we commit to it as a contract that does not change except by mutual agreement. This is pretty standard behavior.

      • lazide an hour ago

        I can’t think of any case where someone who cares about the contract (aka the actual terms) would be okay with it just changing. Arguably, it violates the concept of a contract, which in most legal systems requires a meeting of the minds.

        Do you have any examples where it would be okay?

  • bryanrasmussen 6 hours ago

    humans.

    Also it might be that with systems that learn and change behavior over time, some sort of contract structure is needed. Not sure if traditional is the answer though.

jrm4 3 hours ago

Sigh -- another not-even-thinly-veiled ducking of "A computer can never be held accountable, therefore a computer must never make a management decision."

This is not the way we want to be going.

n8m8 5 hours ago

Can't scroll; the cookies disclaimer doesn't work in Firefox with uBlock Origin :(

  • Neywiny 4 hours ago

    That's why I always go incognito. Sure, I accept your cookies. They're gone in a few hours anyway.