Legal complexity: crystals, codes, and chaos

Posted on 02/14/2011

Today I want to tackle (alright, just scratch the surface of) the theme of complexity and the law.  It is an understatement to say that the complex nature of legal systems is a constant preoccupation of the legal academy; the discussion, however, is generally heavy on opinions and light on data.

There is hope, however.  I found a couple of authors who are part of what looks like a small movement for interdisciplinary legal studies, one that employs computer scientists, statisticians, and, yes, physicists.  On the theoretical front, J.M. Balkin presents a fairly novel idea: that legal doctrines follow a ‘crystalline structure.’  That is not to say they are self-replicating in a fractal sort of way, but rather that legal arguments themselves repeat the same underlying structure.  The real problem, he suggests, is that traditional approaches attempt to conceive of legal thought as a ‘coherent system of moral directives,’ which is confusing and not very useful.  Instead, legal arguments should be seen ‘dialectically, as a continuing series of struggles between various sets of opposed ideas;’ in this way their structure becomes relatively simple (by the way, this is a trope I really enjoy: taking the complex and making it simple by giving it structure).  Here is an example:

For Balkin,

It is important to understand…that what makes the argumentary structures crystalline is not, as Justice Tobriner [in Dillon v. Legg, 68 Cal. 2d 728, 441 P.2d 912, 69 Cal. Rptr. 72 (1968)] suggests, that the doctrine has been steadily headed in one direction…The point is rather that as each new debate arises about whether or not to extend the doctrine, the same types of arguments arise anew.

This is pretty interesting from a theoretical perspective, but what about some real data?  Luckily, Daniel Katz at the University of Michigan and Michael Bommarito have some interesting work on visualizing the United States Code, a notoriously complex and interrelated document.  Together they have formalized the USC as a mathematical object with ‘a hierarchical structure, a citation network, and an associated text function that projects language onto specific vertices.’  In the following image, the authors visualize the full USC, with each ring a layer of a hierarchical tree halting at the section level (i.e., 26 U.S.C. § 501(c)(3) stops at Sec. 501).  Figure (b) adds a citation network within the code to this hierarchical structure.

As a graph, it is just massive: roughly 37,500 sections across 50 Titles, distributed non-uniformly through the hierarchy.  Once the citation network is layered on, there are far more edges than vertices (as the authors point out, about 37,500 vertices to 197,000 edges).
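To make the formalization concrete, here is a minimal Python sketch of the code-as-graph idea: a hierarchy tree plus citation edges over its sections.  The section labels below are invented for illustration, not real USC data, and this is only a toy version of the structure the authors describe:

```python
# Toy model of the USC formalization: a hierarchy tree (Code -> Titles ->
# Sections) plus a directed citation network among the sections.

hierarchy = {            # parent -> children (tree edges)
    "USC": ["Title 26", "Title 42"],
    "Title 26": ["26 Sec. 501", "26 Sec. 170"],
    "Title 42": ["42 Sec. 1983"],
}
citations = [            # directed citation edges between sections (made up)
    ("26 Sec. 170", "26 Sec. 501"),
    ("42 Sec. 1983", "26 Sec. 501"),
]

# Vertices: every parent plus every child in the hierarchy.
vertices = set(hierarchy) | {c for kids in hierarchy.values() for c in kids}
# Edges: tree edges from the hierarchy, plus the citation edges.
tree_edges = [(p, c) for p, kids in hierarchy.items() for c in kids]
edges = tree_edges + citations

print(len(vertices), "vertices,", len(edges), "edges")
```

In a tree alone, edges are always one fewer than vertices; it is the citation layer that pushes the real USC to roughly five edges per vertex.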

(An earlier representation of the USC, from the same authors)

As they note, however, the large size of the graph does not necessarily imply complexity in the mathematical sense:

Of course, growth in the size [of the] United States Code alone is not necessarily analogous to an increase in complexity.  Indeed, while we believe in general the size of the code tends to contribute to “complexity,” some additional measures are needed…[so] we apply the well known Shannon Entropy measure (borrowed from Information Theory) to evaluate the “complexity” of the message passing / language contained therein.

(As a side note, as someone pointed out, Shannon entropy is for many people, including me, a kind of esoteric subject.  Roughly, entropy is a measure of unpredictability: how many bits of information, on average, each symbol of a message carries.  The connection to communication technology is that data is routinely compressed—from sending a voice signal over a phone line to receiving a packet from the internet—and lossless compression preserves all of the original message’s information (its entropy) while using fewer bits.  Each bit of the compressed message therefore carries more entropy, which is why well-compressed data looks nearly random.  Shannon’s source coding theorem sets the limit: since no bit can carry more than one bit of entropy, a lossless compressor cannot, on average, shrink a message below its entropy.)
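Since I can’t run the authors’ measure on the full Code here, the following small, standard-library-only Python sketch illustrates the idea instead: empirical Shannon entropy per byte of a (made-up) repetitive text, before and after lossless compression with zlib.  The sample string is an assumption for illustration; only the entropy formula and the compression call are standard:

```python
import math
import zlib
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Average bits of information per byte, from empirical byte frequencies."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = b"the law " * 400            # highly repetitive, so low entropy per byte
compressed = zlib.compress(text)    # lossless: same information, fewer bytes

print(f"original:   {len(text)} bytes, {shannon_entropy(text):.2f} bits/byte")
print(f"compressed: {len(compressed)} bytes, {shannon_entropy(compressed):.2f} bits/byte")
```

The compressed output is far shorter but has a noticeably higher entropy per byte—the same information packed more densely—which is exactly the trade-off the aside above describes.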

There is much more to this subject, but I just wanted to present a couple of interesting finds.  I think I will return to it, especially regarding a very often-cited article, “Reproduction of Hierarchy? A Social Network Analysis of the American Law Professoriate.”  Check back later for more.

Also, if you are interested in this subject in general, check out computationallegalstudies.com, or if possible take a look at “Legal Knowledge and Information Systems,” eds. A. R. Lodder and L. Mommers (a collection of conference papers from JURIX 2007).

Balkin, J.M. 1986. The Crystalline Structure of Legal Thought. Rutgers Law Review 39(1). http://www.yale.edu/lawweb/jbalkin/articles/crystal.pdf

Bommarito II, Michael J. and Daniel M. Katz. 2010. A mathematical approach to the study of the United States Code. Physica A 389: 4195–4200. http://computationallegalstudies.com/2010/10/08/measuring-the-complexity-of-the-law-the-united-states-code-repost/
