A first look at the proposed Principles of the Law of Software Contracts

The American Law Institute (ALI) is writing a new Principles of the Law of Software Contracts (PLSC) to replace the failed Uniform Computer Information Transactions Act (UCITA). I recently attended ALI’s annual meeting, at which we reviewed the Principles, and I am giving a first status report at the Conference of the Association for Software Testing (CAST, July 9-10).
The Principles raise important issues for our community. For example:

  • They apply a traditional commercial rule to software contracting: a vendor who knows of a hidden material defect in its product, but sells the product without disclosing the defect, is liable to the customer for damage (including expenses) caused by the defect.
    • A defect is hidden if a reasonably observant person in the position of the buyer would not have found it in an inspection that one would expect a reasonably diligent customer to perform under the circumstances.
    • A defect is material if it is serious enough to be considered a significant breach of the contract.

    I think this is an excellent idea. It reflects the fact that no one can know all of the bugs in their product, and it lets vendors shield themselves from liability for defects they didn’t know about. But it demands that vendors reveal the serious bugs that they do know about, so that customers can (a) make better-informed purchase decisions and (b) avoid actions that would trigger the serious failures.

I think we could help clarify it:

    • When should we hold a software company responsible for “knowing” about a defect? Is an irreproducible bug “known”? What about a bug that wasn’t found in the lab but was reported by a customer? By one customer? By 5000 customers? How long should we allow the company to investigate a report before saying that it has had enough time and can be held responsible for the bug?
    • What counts as disclosure to the customer? The bugs in Firefox and OpenOffice are disclosed in their open-to-the-public bug databases. Is this good enough? Maybe it is for these products, but when I tried to make sense of the bugs published in Apache’s database, for many reports I had no clue what the writers were describing. Serious problems were reported in ways that tied closely to the implementation and not at all to anything I would know to do or avoid. For their purpose (help the developers troubleshoot the bug), these reports might have been marvelous. But as disclosure to the customer? What should our standards be?
    • What is a material defect anyway? Do the criteria differ depending on the nature of the product? Is a rarely occurring crash more material in a heart monitor than in a word processor? And if a bug corrupts data in a word processor, do we have different expectations of Word, OpenOffice Writer, Wordpad, and my 12-year-old niece’s StudentProjectEditor?
    • What about the idea of security by obscurity: that some security holes won’t be exploited if no one knows about them, and so we should give the vendor a chance to fix such bugs before disclosing them? This is a controversial idea, but there is evidence that at least some problems are exploited far more often after they are publicized than before.
  • Another issue is reverse engineering. Historically, reverse engineering of products (all products: hardware, software, chemical, mechanical, whatever) has been fair game. American know-how has a lot of “building a better mousetrap” in it, and to build a better mousetrap, you start by reverse engineering the current one. There have been some very recent, very expansive rulings (such as Baystate v. Bowers) that have enforced EULA-based restrictions against software reverse engineering so broad that they would exclude black box testing (for example, for a magazine review).
    • The Principles lay out several criteria under which unreasonable restrictions could be ruled unenforceable by a judge. To me, these seem like reasonable criteria: a contract clause is unenforceable if it conflicts with federal or state law or with public policy (for example, as expressed in the constitution and statutes), is outrageously unfair, or would be considered an abuse of copyright or patent under (mainly) antitrust doctrines.
    • Does it make sense for us to identify solid examples of contexts in which reverse engineering should be permissible (for example, because it poses no threat to the vendor’s legitimate interests) and others in which the vendor might have more of a basis for seeking protection? We can identify these shades of gray with much more experience-based wisdom than lawyers who don’t know what a loop is, let alone how to code their way out of one.

There are plenty of other issues. CAST will provide an open forum for discussion. Remember, CAST isn’t just a place for presentations: we make time for post-presentation discussion, for hours if the conference participants want to go that long.

3 Responses to “A first look at the proposed Principles of the Law of Software Contracts”

  1. lb says:

    >knows of a hidden material defect in its product but sells the product without disclosing the defect

    As you write a piece of code, you may be aware of limitations of that piece of code (e.g. this function will fail if the string is > 50 characters). You won’t necessarily have time to handle that limitation, in which case it becomes a defect.

    If you don’t have time to handle the limitation, then you don’t have time to document it. Lacking time to handle or document the limitation, you don’t have time to assess its possible impact.

    (i.e. this function can fail, but which functions depend on this function? And ultimately what pages, forms, or business processes would then be impacted? And how valuable are they? Ultimately this is what tells us whether it’s serious enough to be a material breach.)

    So being momentarily aware of a limitation (which becomes a defect) isn’t the same as being aware of the possible implications of that defect.

    A project that triages all limitations as they arise and makes informed decisions about the possible implications of those limitations… will never, ever ship.
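    The kind of undocumented limitation lb describes might look like the following sketch. Everything here is hypothetical and illustrative (the function name, the fixed-width-label purpose, and the 50-character cutoff are invented for the example); the point is that the author knows the limitation at the moment of writing but ships without handling, documenting, or impact-assessing it.

    ```python
    # Hypothetical example of a "known limitation" at authoring time:
    # the developer knows the function breaks past 50 characters but
    # lacks time to handle it, so it ships as a latent defect.

    def format_label(text: str) -> str:
        """Pad a label to a fixed-width 50-character field.

        Known limitation: input longer than 50 characters raises
        ValueError; callers are silently assumed never to pass one.
        """
        if len(text) > 50:
            raise ValueError("label longer than 50 characters")
        return text.ljust(50)

    # Whether this limitation is a *material* defect depends on what
    # depends on it: a debug-log formatter dropping long lines differs
    # from an invoice printer failing on a long customer name.
    print(len(format_label("Invoice")))  # 50
    ```

    Tracing which callers, forms, and business processes reach this function is exactly the impact analysis lb says there is rarely time to do.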

  2. Michelle Taylor says:

    I just left this comment on a much earlier post in this blog where you first discussed these ideas, but it’s relevant to this post also.

    It occurs to me that if software companies are obliged to provide lists of known defects to customers, the obvious result of this is that the companies reduce their testing efforts and quietly discourage formal reporting of bugs so that they can claim to not have known about their software’s defects. Companies who test their software well will have intimidatingly huge lists of ‘known defects’ which will hurt their market position, because the everyday consumer won’t realise that a long list of known defects is in fact a sign of careful testing of the product (and is likely to mean that the more serious defects that lurk unannounced and unfound in competitor products have been found and fixed in this one).

    So this recommendation has the potential to introduce barriers to good software testing.

    [[This is a common misunderstanding of the proposal, usually spread by vendors who don’t like it.

    If you don’t test your product, then at the time of release, you indeed won’t know about its bugs. So far, so good, right?

    Of course, you also miss serious bugs that you really would have wanted to fix: bugs that will hurt your reputation in the market, kill your sales, and drive up your tech support costs. People search for bugs in their code and fix them for a reason.

    But even if you dodge finding your bugs, not long after you release your product, people call you, send you letters, and write reports in magazines. Guess what: you just heard about your bugs. Now you have to disclose them, just as if you had found them yourself.

    So what do you achieve with this strategy?

    • Bad customer relations
    • Crappy bug-filled product
    • High tech support costs
    • Bad magazine reviews
    • And you still have to tell people about your bugs.

    In most cases, it would be a lot cheaper to find and fix (or document) serious bugs before release.

    – Cem Kaner ]]

  3. Eric Mathiesen says:


    In your paragraph beginning with “What counts as disclosure to the customer?”, in the sentence “For their purpose (help the developers troubleshoot the bug), these reports mgiht have been marvellous.” you have a typo with mgiht vs might and a mis-spelling of marvellous vs marvelous.

    I love your work and use it daily. Although there are those who do not listen…


    [[Thanks for your bug report. Fixed … Cem]]