The debate between natural law and legal positivism has received much attention. Ronald Dworkin exposes the limitations of legal positivism through his argument from hard cases. This argument is further strengthened when we apply the interpretation of Martin Luther King Jr and the voluntarist natural law tradition, together with Lon Fuller’s ‘procedural view’ and its application of the ‘principles of legality’.
The discipline known as “juristische Methodenlehre” (legal methodology) is no longer foreign to Chinese jurists today, but it originated in the German-speaking world. Related expressions such as “juristische Methodologie”, “juristische Methodik” and “Methodenlehre der Rechtswissenschaft” also appear in the literature. Since the beginning of the 21st century, its reception in China has been marked by two translations, namely “faxue fangfalun” (methodology of legal science) and “falü fangfalun” (legal methodology). Alongside the traditional Methodenlehre, a more recent theory of legal argumentation has also developed, one that has attracted worldwide attention even beyond Germany and whose presence in China deserves a section of its own. As background, a look is first taken at the development of Chinese legal theory and legal philosophy after 1978 and, consequently, at the emergence of the question of method in Chinese legal scholarship.
Referring to foreign legal systems for the sake of producing a convincing judicial argument has been a custom in judicial decision-making for more than a century. However, a generally accepted theoretical framework for this kind of reasoning is yet to be established. The article suggests that such a framework must answer at least the following three fundamental questions: first, what is the normative relationship, as a matter of principle, between domestic and foreign law? Second, what are the primary motive and functioning of comparative legal reasoning? And third, what methodological approach enables such reasoning to work in practice? Drawing in particular on linguistic philosophy, as well as recent work on the theory of argumentation, the article outlines a theoretical framework that addresses these questions in order to understand, evaluate and rationalise the use of comparative arguments in legal practice.
This paper discusses a much-neglected aspect of Neil MacCormick's theory of legal reasoning, namely what he calls ‘consequential reasoning’. For MacCormick, consequential reasoning is both an omnipresent feature of legal reasoning in England and Scotland and a valuable one. MacCormick articulates the value of consequential reasoning by seeing it as contributing to the forward-looking requirement of formal justice, ie, of deciding the instant case on grounds that one is willing to adopt when deciding future similar cases. This paper situates consequential reasoning in the overall picture of legal reasoning MacCormick develops in Legal Reasoning and Legal Theory, going on to show the evolution of his view on consequential reasoning in later work, which culminates in Rhetoric and the Rule of Law. It is argued that MacCormick's later view of consequential reasoning, ie, as a process of testing possible rulings by evaluating their acceptability or unacceptability…
Compares formalism and instrumentalism and evaluates their general claims. “Part of what is meant by formalism is this: The law provides sufficient basis for deciding any case that arises. There are no “gaps” within the law, and there is but one sound legal decision for each case.” The formalist also holds that law is traceable to an authoritative source. “…sound legal decisions can be justified as the conclusions of valid deductive syllogisms. Because law is believed to be complete and univocal, all cases that arise can in principle be decided in this way. This is the formalistic model for legal justifications.” Moderate instrumentalism also accepts the notion that the law has sources and that past political events can determine what the law is for the future. However, moderate instrumentalists also maintain that the law has gaps. “Moderate instrumentalists and positivists alike embrace a source-based conception of the law as well as a formalistic model for legal justification; partly because of this combination, they reject the formalistic notion that law is complete and univocal. Instrumentalists, like positivists, emphasize that because the interpretation of authoritative legal texts and their application to cases are often controversial, reasonable arguments are often possible on both sides of a legal issue. Since there are no hard and fast rules for adjudicating such disputes, positivists conclude that law in such cases is indeterminate – not yet fully formed, needing judicial legislation. Instrumentalists most likely have a similar view of the law. They assume that law is determinate on an issue at a given time only if its identification and application are, roughly speaking, mechanical.” [Must keep in mind the distinction between determinate and conclusive.] “The instrumentalists make the decisive assumption that law is not determinate if it is controversial, for law is thought to be gappy and indeterminate only when reasonable legal arguments are possible on both sides of a legal question…This reasoning exposes a more fundamental assumption of formalism, instrumentalism, and positivism: Nondeductive reasoning is incapable of adequately establishing any conclusion.” This, though, is a radical thesis, given that scientific hypotheses are never established deductively. [But it might be a claim restricted to legal reasoning. Moreover, it seems legitimate to wonder whether legal reasoning is really comparable to offering scientific hypotheses.] We should, however, reject the mechanistic model of legal and judicial reasoning that assumes a given major premise captured in authoritative legal texts that deductively produces conclusions in a situation to which a law applies. Take a law that requires some activity to be “economically efficient” but fails to specify economic efficiency. The text assumes that there is such a thing as economic efficiency, that some activities are so efficient and others not, that some judgments about economic efficiency are true and others false, and that the criteria of efficiency are determinable in principle. “[I]f there really are reasons for preferring one conception of economic efficiency to others in a given context, the assumption of the law in question would be true; if the court correctly identifies and applies that conception, it is simply carrying out its legal mandate.
It would be faithful to the text, but it would not be limited to the four corners of the text and its literal implications.” Thus, a judge may employ non-mechanical reasoning in arriving at the meaning of legal requirements such as economic efficiency. Moral reasons are constitutive of the law. A judge must, in deciding on the appropriate conception of economic efficiency, offer reasons for preferring that conception in the given context.
This book tackles the basics of legal reasoning in twelve chapters, including the principles of classical logic, deductive and inductive reasoning, the application of the Socratic method to legal reasoning, and formal and material fallacies.
The authors describe a logic programming approach to the representation of legislative texts. They consider the potential uses of simple systems which incorporate a single, fixed interpretation of a text. These include assisting in the routine administration of complex areas of the law. The authors also consider the possibility of constructing more complex systems which incorporate several, possibly conflicting interpretations. Such systems are needed for dealing with ambiguity and vagueness in the law. Moreover, they are more suitable than single interpretation systems for helping to test proposed legislation and for helping to give citizens advice.
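A minimal sketch may help fix ideas here: the snippet below (in Python rather than a logic programming language, with all rule and predicate names invented for illustration) encodes two competing interpretations of a vague provision as separate rulebases and shows a naive backward chainer diverging on the same facts.

```python
# A minimal sketch, not the authors' system: statutory provisions as
# Horn-clause-style rules, with two competing interpretations of a
# vague term. All names are illustrative assumptions.

RULES = {
    # head <- body; the "strict" reading requires both conditions
    "strict":  [("acquires_citizenship", ["born_in_uk", "parent_is_citizen"])],
    # the "lenient" reading treats either condition as sufficient
    "lenient": [("acquires_citizenship", ["born_in_uk"]),
                ("acquires_citizenship", ["parent_is_citizen"])],
}

def holds(goal, facts, rules):
    """Naive backward chaining over propositional rules."""
    if goal in facts:
        return True
    return any(all(holds(b, facts, rules) for b in body)
               for head, body in rules if head == goal)

facts = {"born_in_uk"}  # the case at hand
for name, rules in RULES.items():
    print(name, holds("acquires_citizenship", facts, rules))
# strict False, lenient True: the interpretations conflict on this case.
```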
This paper discusses the functions of deductive justification in ideal reconstructions of judicial reasoning. It takes as its starting point the purpose of judicial reasoning: explaining and justifying the judicial decision. It argues that deductive validity is not enough for good judicial argument. On the other hand, deductive justification is necessary, not only for easy cases but for hard cases as well. It draws some consequences for the concept of ‘jumps’ in legal reasoning and for the traditional distinction between internal and external justification.
This article describes the Vaccine/Injury Project Corpus, a collection of legal decisions awarding or denying compensation for health injuries allegedly due to vaccinations, together with models of the logical structure of the reasoning of the factfinders in those cases. This unique corpus provides useful data for formal and informal logic theory, for natural-language research in linguistics, and for artificial intelligence research. More importantly, the article discusses lessons learned from developing protocols for manually extracting the logical structure and generating the logic models. It identifies sub-tasks in the extraction process, discusses challenges to automation, and provides insights into possible solutions for automation. In particular, the framework and strategies developed here, together with the corpus data, should allow “top–down” and contextual approaches to automation, which can supplement “bottom-up” linguistic approaches. Illustrations throughout the article use examples drawn from the Corpus.
This paper aims at providing an account of juridical acts that forms a suitable starting point for the creation of computational systems that deal with juridical acts. The paper is divided into two parts. Because juridical acts will be analyzed as intentional changes in the world of law, the ‘furniture’ of this world, which consists, broadly speaking, of entities, facts and rules, plays a central role in the analysis. This first part of the paper deals with this furniture and its philosophical underpinnings, and at the same time introduces most of the logical apparatus that will be used to deal with it. The focus in the first part is on static and dynamic legal rules and their interplay in constituting the world of law.
Carneades is a recently proposed formalism for structured argumentation with varying proof standards, inspired by legal reasoning, but more generally applicable. Its distinctive feature is that each statement can be given its own proof standard, which is claimed to allow a more natural account of reasoning under burden of proof than existing formalisms for structured argumentation, in which proof standards are defined globally. In this article, the two formalisms are formally related by translating Carneades into the ASPIC+ framework for structured argumentation. Since ASPIC+ is defined to generate Dung-style abstract argumentation frameworks, this in effect translates Carneades graphs into abstract argumentation frameworks. For this translation, we prove a formal correspondence and show that certain rationality postulates hold. It is furthermore proved that Carneades always induces a unique Dung extension, which is the same in all of Dung's semantics, allowing us to generalise Carneades to cycle-containing structures.
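To illustrate what a per-statement proof standard amounts to, here is a heavily simplified toy evaluation, not the actual Carneades or ASPIC+ formalism; the standard names and thresholds are assumptions.

```python
# A toy sketch of per-statement proof standards: whether a statement is
# acceptable depends on its own standard and on the weights of the
# applicable pro/con arguments. This simplifies Carneades heavily.

def acceptable(standard, pro, con):
    """pro/con: lists of weights in [0, 1] for applicable arguments."""
    strongest_pro = max(pro, default=0.0)
    strongest_con = max(con, default=0.0)
    if standard == "scintilla":                # any applicable pro argument
        return strongest_pro > 0
    if standard == "preponderance":            # strongest pro beats strongest con
        return strongest_pro > strongest_con
    if standard == "beyond_reasonable_doubt":  # ...and the con side is negligible
        return strongest_pro > strongest_con and strongest_con < 0.2
    raise ValueError(f"unknown standard: {standard}")

# The same arguments settle one statement but not another, because each
# statement is evaluated under its own standard.
print(acceptable("preponderance", pro=[0.6], con=[0.4]))            # True
print(acceptable("beyond_reasonable_doubt", pro=[0.6], con=[0.4]))  # False
```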
This game was designed to investigate protocols and strategies for resource-bounded disputation. The rules presented here correspond very closely to the problem of controlling search in an actual program. The computer program on which the game is based is LMNOP. It is a LISP system designed to produce arguments and counterarguments from a set of statutory rules and a corpus of precedents, and applied to legal and quasi-legal reasoning. LMNOP was co-designed by a researcher in AI knowledge representation and by a trained computer scientist who was an editor of Washington University Law Review at the time. LMNOP is based on the idea of a non-demonstrative or defeasible rule: i.e., a rule that admits exceptions. It adopts a representational convention that supposes there is an implicit preference of more specific rules over less specific rules. In fact, it automatically adjudicates between competing arguments when one argument meets the broader criterion of being more specific than another. The convention is based on an idea originally presented by David Poole, and is embedded in a system for determining which arguments are ultimately warranted, an approach which originally appeared in the literatures of epistemology and ethics in the work of Pollock. This system evolves from work by the first author since 1987, where the full statement of the theory appears; Prakken's work is one example of the idea's application to the legal domain. LMNOP also draws heavily on the model of legal reasoning and analogical reasoning put forward by Edwina Rissland and Kevin Ashley [89, 90]. Similarities to their legal case-based reasoning program, HYPO, are no accident; LMNOP seeks to improve on HYPO. A description of LMNOP is forthcoming.
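The specificity convention is easy to state in code. The sketch below is an illustration of the idea only, not LMNOP's LISP representation: two defeasible rules conflict, and the one whose antecedent strictly contains the other's prevails.

```python
# A sketch of the specificity preference, not LMNOP itself: when two
# applicable defeasible rules yield conflicting conclusions, the rule
# with the strictly more specific antecedent wins.

def more_specific(rule_a, rule_b):
    """A rule is (antecedent_set, conclusion); rule_a is more specific
    than rule_b if its antecedent strictly contains rule_b's."""
    return rule_b[0] < rule_a[0]  # proper-subset test on frozensets

general  = (frozenset({"bird"}), "flies")                 # birds fly
specific = (frozenset({"bird", "penguin"}), "not_flies")  # ...but penguins don't

case = {"bird", "penguin"}
applicable = [r for r in (general, specific) if r[0] <= case]
winner = max(applicable, key=lambda r: len(r[0]))  # crude specificity ordering
assert more_specific(specific, general)
print(winner[1])  # not_flies: the exception defeats the general rule
```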
Much legal research focuses on understanding how judicial decision-makers exercise their discretion. In this paper we examine the notion of legal or judicial discretion, and weaker and stronger forms of discretion. At all times our goal is to build cognitive models of the exercise of discretion, with a view to building computer software to model and, primarily, to support decision-making. We observe that discretionary decision-making can best be modeled using three independent axes: bounded and unbounded, defined and undefined, and binary and continuous. Examples of legal tasks are given from each of the eight ensuing octants, and we conclude by saying what this model shows about current legal trends. We should stress that our taxonomy has been based on our observations of how discretionary legal decisions are made. No claim is made that our model is either complete (providing advice in every domain) or exact, but it does help knowledge engineers construct legal decision support systems in discretionary domains.
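The three axes yield the eight octants mechanically; a tiny encoding (a gloss on the taxonomy, not the authors' software) makes the combinatorics explicit.

```python
# An illustrative encoding of the three independent axes of discretion;
# each of the 2 x 2 x 2 combinations is one octant of the taxonomy.

from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class DiscretionType:
    bounded: bool  # bounded vs unbounded
    defined: bool  # defined vs undefined
    binary: bool   # binary vs continuous outcome

octants = [DiscretionType(*combo) for combo in product([True, False], repeat=3)]
print(len(octants))  # 8, one octant per combination of the three axes
```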
Although the Hart/Dworkin debate has as much to do with Dworkin's affirmative theory of judicial discretion as with Hart's more comprehensive theory of law, the starting point was of course Dworkin's attempt to demolish the “model of rules,” Hart's alleged analysis of legal systems as collections of conclusive reasons for specified legal consequences. The continuing relevance of this attack for the prospects for any theory of law is the subject of the present essay.
The author summarizes the essential elements of a general theory he is developing which he calls “The Formal Character of Law.” He explains that law's formal character is a potentially major branch of legal theory that is still relatively unexplored. In his view, it is possible to identify formal attributes in legal rules; in other basic legal constructs such as interpretive method, the principles of stare decisis, legal reasons, and legislative and adjudicative processes; and in a legal system viewed as a whole. For example, a legal rule has, in varying degrees, such formal attributes as generality, definiteness, and simplicity. Such attributes are formal in the sense that they apply to or accommodate highly variable content and do not prescribe or proscribe content. Of course, legal phenomena have other characteristics besides their formality. The author's main technique for developing his theory is to address a common set of questions to the varied formal attributes identified above. Among other things, the answers to these questions further explicate how law is formal; demonstrate that law is not merely a means of serving problem-specific policy but also serves formal values; treat the relations between form and content, especially how good form begets good content and bad form bad content; explore the design and implementation of appropriate formality, its “anatomy and physiology”; analyse the “pathology” of legal form, including not only the “formalistic” but also the “substantivistic”; and show how the overall theory is important both jurisprudentially and in practical ways.
A heuristic search procedure for inventing legal arguments is built on two tools already widely in use in argumentation. Argumentation schemes are forms of argument representing premise-conclusion and inference structures of common types of arguments. Schemes especially useful in law represent defeasible arguments, like argument from expert opinion. Argument diagramming is a visualization tool used to display a chain of connected arguments linked together. One such tool, Araucaria, available free of charge, helps a user display an argument on the computer screen as an inverted tree structure with an ultimate conclusion as the root of the tree. These argumentation tools are applicable to analyzing a mass of evidence in a case at trial, in a manner already known in law using heuristic methods and Wigmore diagrams. In this paper it is shown how they can be automated and applied to the task of inventing legal arguments. One important application is to proof construction in trial preparation.
Work on a computer program called SMILE + IBP (SMart Index Learner Plus Issue-Based Prediction) bridges case-based reasoning and extracting information from texts. The program addresses a technologically challenging task that is also very relevant from a legal viewpoint: to extract information from textual descriptions of the facts of decided cases and apply that information to predict the outcomes of new cases. The program attempts to automatically classify textual descriptions of the facts of legal problems in terms of Factors, a set of classification concepts that capture stereotypical fact patterns that affect the strength of a legal claim, here trade secret misappropriation. Using these classifications, the program can evaluate and explain predictions about a problem’s outcome given a database of previously classified cases. This paper provides an extended example illustrating both functions, prediction by IBP and text classification by SMILE, and reports empirical evaluations of each. While IBP’s results are quite strong, and SMILE’s much weaker, SMILE + IBP still has some success predicting and explaining the outcomes of case scenarios input as texts. It marks the first time to our knowledge that a program can reason automatically about legal case texts.
A variety of legal documents are increasingly being made available in electronic format. Automatic information search and retrieval algorithms play a key role in enabling efficient access to such digitized documents. Although keyword-based search is the traditional method used for text retrieval, it performs poorly when literal term matching is done for query processing, due to the synonymy and ambivalence of words. To overcome these drawbacks, this paper proposes an ontological framework to enhance the user’s query for retrieval of truly relevant legal judgments. Ontologies ensure efficient retrieval by enabling inferences based on domain knowledge, which is gathered during the construction of the knowledge base. Empirical results demonstrate that ontology-based searches generate significantly better results than traditional search methods.
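A stripped-down sketch of the query-enhancement step, with a hand-made stand-in for the ontology (the knowledge base and terms below are invented, not the paper's):

```python
# Ontology-based query expansion, heavily simplified: enrich the user's
# query with synonyms and narrower concepts before matching, so that
# relevant judgments are found even when they avoid the literal term.
# The mini-ontology below is invented for illustration.

ONTOLOGY = {
    "theft":   {"synonyms": ["larceny"], "narrower": ["burglary", "robbery"]},
    "vehicle": {"synonyms": ["conveyance"], "narrower": ["car", "lorry"]},
}

def expand(query_terms):
    expanded = set(query_terms)
    for term in query_terms:
        entry = ONTOLOGY.get(term, {})
        expanded.update(entry.get("synonyms", []))
        expanded.update(entry.get("narrower", []))
    return expanded

print(sorted(expand({"theft"})))
# ['burglary', 'larceny', 'robbery', 'theft'] -- now matches judgments
# that never contain the query term itself.
```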
Professor Lee argues that traditional quality criteria for judging law libraries are now inadequate because they no longer capture the vital multiple missions of today's libraries. She suggests ways that law librarians can begin to develop indicia of quality that can adequately evaluate the contemporary law school library and preserve its core missions.
This paper argues that formal models of coherence are useful for constructing a legal epistemology. Two main formal approaches to coherence are examined: coherence-based models of belief revision and the theory of coherence as constraint satisfaction. It is shown that these approaches shed light on central aspects of a coherentist legal epistemology, such as the concept of coherence, the dynamics of coherentist justification in law, and the mechanisms whereby coherence may be built in the course of legal decision-making.
Governments and other groups interested in the views of citizens require the means to present justifications of proposed actions, and the means to solicit public opinion concerning these justifications. Although Internet technologies provide the means for such dialogues, system designers usually face a choice between allowing unstructured dialogues, through, for example, bulletin boards, or requiring citizens to acquire a knowledge of some argumentation schema or theory, as in, for example, ZENO. Both of these options present usability problems. In this paper, we describe an implemented system called PARMENIDES which allows structured argument over a proposed course of action, without requiring knowledge of the underlying argumentation theory.
Legal codes, such as the Uniform Commercial Code (UCC) examined in this article, are good points of entry for AI and ontology work because of their more straightforward adaptability to relationship linking and rules-based encoding. However, approaches that rely solely on encoding the formal code structure are incomplete, missing the rich experience that identifies key relationships and decision criteria, often supplied by experienced practitioners and process experts from various disciplines (e.g., sociology, political economics, logistics, operations research). This research focuses on the UCC because it transcends the limitations of a formal code, functioning essentially as a composite. AI work can benefit from real-world codes like the UCC, which are essentially formal codes enlightened by a more realistic experience base drawn from centuries of development in international commercial transaction settings. This paper then describes our initial work in converting an expert system on the U.S. law governing the sale of goods from Article II of the Uniform Commercial Code into a knowledge-based system using the Web Ontology Language OWL.
Reasoning about mental states and processes is important in various subareas of the legal domain. A trial lawyer might need to reason about the beliefs, reasoning and other mental states and processes of members of a jury; a police officer might need to reason about the conjectured beliefs and reasoning of perpetrators; a judge may need to consider a defendant's mental states and processes for the purposes of sentencing; and so on. Further, the mental states in question may themselves be about the mental states and processes of other people. Therefore, if AI systems are to assist with reasoning tasks in law, they may need to be able to reason about mental states and processes. Such reasoning is riddled with uncertainty, and this is true in particular in the legal domain. The article discusses how various different types of uncertainty arise, and shows how they greatly complicate the task of reasoning about mental states and processes. The article concentrates on the special case of states of belief and processes of reasoning, and sketches an implemented, prototype computer program (ATT-Meta) that copes with the various types of uncertainty in reasoning about beliefs and reasoning. In particular, the article outlines the system's facilities for handling conflict between different lines of argument, especially when these lie within the reasoning of different people. The system's approach is illustrated by application to a real-life mugging example. [NB: The archived preprint available via this page bears an incorrect date, 2011; the correct date is 2001.]
Negotiation Support Systems have traditionally modelled the process of negotiation. They often rely on mathematical optimisation techniques and ignore heuristics and other methods derived from practice. Our goal is to develop systems capable of providing decision support to help resolve a given dispute. A system we have constructed, Family_Winner, uses empirical evidence to dynamically modify initial preferences throughout the negotiation process. It sequentially allocates issues using the trade-offs and compensation opportunities inherent in the dispute.
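The allocation-with-compensation loop might be sketched as follows; this is a loose reconstruction of the general idea only, and the trade-off rule and boost factor are assumptions rather than Family_Winner's actual formulas.

```python
# A toy version of issue allocation with compensation: each party rates
# the issues, the most contested issue goes to whoever values it more,
# and the losing party's remaining ratings are scaled up in compensation.

def allocate(ratings_a, ratings_b, boost=1.2):
    a, b = dict(ratings_a), dict(ratings_b)
    result = {}
    while a:
        issue = max(a, key=lambda i: abs(a[i] - b[i]))  # widest preference gap
        winner = "A" if a[issue] >= b[issue] else "B"
        result[issue] = winner
        del a[issue], b[issue]
        losing = b if winner == "A" else a
        for i in losing:          # compensation: boost the loser's
            losing[i] *= boost    # preferences on the remaining issues
    return result

print(allocate({"house": 60, "car": 25, "custody": 15},
               {"house": 30, "car": 20, "custody": 50}))
# {'custody': 'B', 'house': 'A', 'car': 'A'}
```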
In recent years a number of writers have maintained that law can usefully be illuminated by game theory. Some believe that game theory can provide guidance in formulating rules for dealing with specific problems. Others advance the philosophically ambitious contention that we can gain a better understanding and/or appreciation of law by seeing it in terms of game-theoretic ideas. My purpose in this article is to examine some claims of the latter sort, and in particular to ask how distant law can be from the assumptions of game theory and still be informed by it. Models are not expected to fit precisely what they model, but at some point the deviation is too great and there is a failure to illuminate.
This paper discusses some engineering considerations that should be taken into account when building a knowledge based system, and recommends isomorphism, the well defined correspondence of the knowledge base to the source texts, as a basic principle of system construction in the legal domain. Isomorphism, as it has been used in the field of legal knowledge based systems, is characterised and the benefits which stem from its use are described. Some objections to and limitations of the approach are discussed. The paper concludes with a case study giving a detailed example of the use of the isomorphic approach in a particular application.
In this paper I recapitulate the ideas of Berman and Hafner (1993) regarding the role of teleology in legal argument. I show how these ideas can be used to address some issues arising from more recent work on legal argument, and how this relates to ideas associated with the New Rhetoric of Perelman. I illustrate the points with a discussion of the classic problem of which vehicles should be allowed in parks.
Although Berman and Hafner [Berman 1989, pp. 928–938] presented the possibility of adapting the model of reasoning developed for an expert system for medical diagnosis, the reasoning of a judge when he/she sentences criminals does not resemble the reasoning found in the decisions of physicians, mathematicians or statisticians. When a lawyer reasons, he/she not only looks for the solution of a case; he/she simultaneously looks for the bases on which his/her reasoning can rest [Galindo 1992, pp. 363–367]. That is to say, he/she not only needs to find the solution but moreover has to find the references (laws, jurisprudence and bibliography) that allow him/her to argue for the solution.
Legal contracts and litigation documents common to the American legal system were encoded in the eXtensible Markup Language (XML). XML also represents rules about the contracts and litigation procedure. In addition to an expert system engine that allows one to make inferences with those rules, a Graphical User Interface (GUI) generates the XML representing the rules. A rulebase is developed by marking up examples of the XML to be interpreted and the XML to be generated, analogously to Query By Example. This article provides a broader context for the synergy between XML and artificial intelligence by including discussions of: (1) the role of Artificial Intelligence in handling routine litigation; (2) how the use of XML enables legal expert systems to get their 'input' without the user having to enter the same information again for the expert system; (3) the advantages of XML markup over other forms of markup for documents; (4) the relationship between XML and ontologies; and (5) other projects using XML with rules or legal affairs.
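As a bare-bones illustration of XML-encoded rules driving an inference step (the element names below are invented; the article's actual markup is richer):

```python
# A minimal sketch: conditions and conclusions of a rule marked up in
# XML, and one forward-inference step over a set of established facts.
# The schema is an assumption, not the article's.

import xml.etree.ElementTree as ET

RULEBASE = """
<rulebase>
  <rule id="r1">
    <if>breach_of_contract</if>
    <if>damages_proved</if>
    <then>award_damages</then>
  </rule>
</rulebase>
"""

facts = {"breach_of_contract", "damages_proved"}
for rule in ET.fromstring(RULEBASE).findall("rule"):
    conditions = {c.text for c in rule.findall("if")}
    if conditions <= facts:  # all conditions satisfied by the facts
        print(rule.get("id"), "fires:", rule.find("then").text)
# r1 fires: award_damages
```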
In this paper, the author describes a dialogical approach to legal argumentation from the perspective of argumentation theory. In a pragma-dialectical approach to legal argumentation, the argumentation is considered to be part of a critical discussion aimed at the rational resolution of the dispute. The author describes how a pragma-dialectical analysis and evaluation of legal argumentation can be carried out.
In this contribution the author develops an argumentation model for the reconstruction of weighing and balancing on the basis of teleological-evaluative considerations. The model is intended as a heuristic and critical tool for the rational reconstruction of the justification of judicial decisions. From the perspective of a rational discussion, it makes explicit the choices underlying the weighing and balancing on the basis of goals and values, so that these choices can be submitted to rational critique.
There is more to legal knowledge representation than knowledge-bases. It is valuable to look at legal knowledge representation and its implementation across the entire domain of computerisation of law, rather than focussing on sub-domains such as legal expert systems. The DataLex WorkStation software and applications developed using it are used to provide examples. Effective integration of inferencing, hypertext and text retrieval can overcome some of the limitations of these current paradigms of legal computerisation which are apparent when they are used on a stand-alone basis. Effective integration of inferencing systems is facilitated by use of a (quasi) natural language knowledge representation, and the benefits of isomorphism are enhanced. These advantages of integration apply to all forms of inferencing, including document generation and case-based inferencing. Some principles for the development of integrated legal decision support systems are proposed.
A formal language is introduced that contains expressions for the dependency of a legal relation on the claims that the concerned individuals make and on the permissions that they grant. It is used for a classification of legal relations into six major categories: categorical obligation, categorical permission, claimable obligation, grantable permission, claim-dependent obligation and grant-dependent permission. Legal rights may belong to any of these six categories, but the characteristics of a right-holder are shown to be different in each of the six types.
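A rough way to hold the six categories in mind is as a two-by-three grid: obligation versus permission, crossed with the kind of dependence involved. The glosses in this sketch are informal paraphrases of the abstract, not the paper's formal definitions.

```python
# The six categories of legal relations, arranged by deontic type and by
# what the relation depends on. Glosses are informal paraphrases only.

from enum import Enum

class LegalRelation(Enum):
    CATEGORICAL_OBLIGATION     = ("obligation", "holds unconditionally")
    CATEGORICAL_PERMISSION     = ("permission", "holds unconditionally")
    CLAIMABLE_OBLIGATION       = ("obligation", "can be brought into force by a claim")
    GRANTABLE_PERMISSION       = ("permission", "can be brought into force by a grant")
    CLAIM_DEPENDENT_OBLIGATION = ("obligation", "in force depending on a claim made")
    GRANT_DEPENDENT_PERMISSION = ("permission", "in force depending on a grant given")

for rel in LegalRelation:
    deontic_type, dependence = rel.value
    print(f"{rel.name}: {deontic_type}, {dependence}")
```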
In this paper, we present an approach to commonsense causal explanation of stories that can be used for automatically determining the liable party in legal case descriptions. The approach is based on a core ontology for law that takes a commonsense perspective. Aside from our thesis that in the legal domain many terms still have a strong commonsense flavour, the descriptions of events in legal cases, as e.g. presented at judicial trials, are cast in commonsense terms as well. We present design principles for representing commonsense causation, and describe a process-based approach to automatic identification of causal relations in stories, which are described in terms of the core ontology. The resulting causal explanation forms a necessary condition for determining the liability and responsibility of agents that play a role in the case. We describe the basic architecture and working of the demonstrator we are constructing to test the validity of our process-oriented view of commonsense causation. This view holds that causal relations are in fact abstractions constructed on the basis of our commonsense understanding of physical and mental processes.
In legal decisions, standpoints can be supported by formal as well as substantive interpretative arguments. Formal arguments consist of reasons whose weight or force depends essentially on their authoritativeness: in this connection one may think of linguistic and systemic arguments. Substantive arguments, on the other hand, are not backed up by authority, but consist of a direct invocation of moral, political, economic, or other social considerations. Formal arguments can be analyzed as exclusionary reasons: their authoritative character excludes, in principle, substantive counterarguments. Formal arguments are sometimes used to conceal value judgements based on substantive arguments. This paper deals with the problems of reconstructing this strategic use of formal arguments in legal decisions, with a focus on linguistic argumentation.
A method to identify ontology components is presented in this article. The method relies on Natural Language Processing (NLP) techniques to extract concepts and relations among these concepts. This method is applied in the legal field to build an ontology dedicated to information retrieval. Legal texts on which the method is performed are carefully chosen as describing and conceptualizing the legal domain. We suggest that this method can help legal ontology designers and may be used for building ontologies dedicated to tasks other than information retrieval.
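A deliberately crude, pattern-based stand-in for the extraction step may fix intuitions; the authors' NLP techniques are far richer, and the sentence and pattern below are invented.

```python
# A toy extraction step: a subject-verb-object pattern over a legal
# sentence yields a candidate (concept, relation, concept) triple for
# the ontology. Real NLP pipelines use parsing, not a single regex.

import re

SENTENCE = "The lessor delivers the premises to the lessee."
PATTERN = re.compile(r"[Tt]he (\w+) (\w+)s the (\w+)")

match = PATTERN.search(SENTENCE)
if match:
    subject, relation, obj = match.groups()
    print((subject, relation, obj))  # ('lessor', 'deliver', 'premises')
```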
Document assembly and other substantive legal practice applications are the most knowledge-intensive forms of software now widely available in the legal technology marketplace. This article provides an illustrative look at two contemporary practice system engines, CAPS and Scrivener, and examines their relevance for AI-and-law researchers.
Contemporary law offices use many different technologies for storing and retrieving documents produced in the course of legal work. This article examines two approaches in detail: document management, as exemplified by SoftSolutions, and electronic publishing, as exemplified by Folio VIEWS. Some other approaches are reviewed, and the pragmatics, politics, economics, and legalities of legal work product retrieval are discussed.
This paper addresses the problems that lawyers experience retrieving information from legal-text databases. Traditional access mechanisms of text databases require users to know how information is stored. We propose a method for index organisation which shields lawyers from the internal storage structures and which allows them to address the legal databases in their own legal terms. The proposed index is based on a model of legal tasks as opposed to traditional database indexes which represent the contents of the database. We will lay out the architecture of an information system in which this task model is used to determine the information need, to retrieve relevant documents and to give methodical guidance for the legal task itself. To account for the design of a task-based legal information retrieval system, a substantial part of this paper is devoted to analysis and representation of legal tasks.
We present a logic-based formalism for modeling of dialogues between intelligent and autonomous software agents, building on a theory of abstract dialogue games which we present. The formalism enables representation of complex dialogues as sequences of moves in a combination of dialogue games, and allows dialogues to be embedded inside one another. The formalism is computational and its modular nature enables different types of dialogues to be represented.
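A skeletal sketch of the dialogue-game idea, with a protocol invented for illustration (the paper's formalism is logic-based and considerably more expressive): a dialogue is a sequence of moves, each legal only if the protocol permits it after the preceding move.

```python
# A toy dialogue-game protocol: which move types may follow which.
# The move names and transitions are illustrative assumptions.

PROTOCOL = {
    "claim":   {"why", "concede"},
    "why":     {"argue", "retract"},
    "argue":   {"why", "concede"},
    "retract": set(),
    "concede": set(),
}

def legal(dialogue):
    """A dialogue is legal if every move is permitted after its predecessor."""
    return all(nxt in PROTOCOL[prev]
               for prev, nxt in zip(dialogue, dialogue[1:]))

print(legal(["claim", "why", "argue", "concede"]))  # True
print(legal(["claim", "retract"]))                  # False: retract needs a why
```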
This article is an exercise in computational jurisprudence. It seems clear that the field of AI and Law should draw upon the insights of legal philosophers, whenever possible. But can the computational perspective offer anything in return? I will explore this question by focusing on the concept of OWNERSHIP, which has been debated in the jurisprudential literature for centuries. Although the intellectual currents here flow mostly in one direction – from legal philosophy to AI – I will show that there are also some insights to be gained from a computational analysis of the OWNERSHIP relation. In particular, the article suggests a computational explanation for the emergence of abstract property rights, divorced from concrete material objects.
Legal text retrieval traditionally relies upon external knowledge sources such as thesauri and classification schemes, and an accurate indexing of the documents is often manually done. As a result not all legal documents can be effectively retrieved. However a number of current artificial intelligence techniques are promising for legal text retrieval. They sustain the acquisition of knowledge and the knowledge-rich processing of the content of document texts and information need, and of their matching. Currently, techniques for learning information needs, learning concept attributes of texts, information extraction, text classification and clustering, and text summarization need to be studied in legal text retrieval because of their potential for improving retrieval and decreasing the cost of manual indexing. The resulting query and text representations are semantically much richer than a set of key terms. Their use allows for more refined retrieval models in which some reasoning can be applied. This paper gives an overview of the state of the art of these innovative techniques and their potential for legal text retrieval.
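One step beyond unweighted key-term matching, ranking in a TF-IDF vector space, can be illustrated with a modern library; scikit-learn postdates the paper and is assumed here purely for demonstration.

```python
# Ranking documents against a query in a TF-IDF vector space rather than
# by raw term overlap; a minimal sketch using scikit-learn.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "The tenant failed to pay rent and the lease was terminated.",
    "The employee was dismissed without notice.",
]
query = ["termination of a lease for non-payment of rent"]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)
scores = cosine_similarity(vectorizer.transform(query), doc_vectors)[0]
print(scores.argmax())  # 0: the lease case ranks first
```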
Considerable attention has been given to the accessibility of legal documents, such as legislation and case law, in legal information retrieval (query formulation, search algorithms), in legal information dissemination practice (numerous examples of on-line access to formal sources of law), and in legal knowledge-based systems (by translating the contents of those documents into ready-to-use rule- and case-based systems). However, within AI & law, little attempt has been made to make the contents of sources of law, and the relations among them, more accessible to those without a legal education. This article presents a theory about translating sources of law into information accessible to persons without a legal education. It illustrates the theory by providing two elaborated examples of such translation ventures. In the first example, formal sources of law in the domain of exchanging police information are translated into rules of thumb useful for policemen. In the second example, the goal of providing non-legal professionals with insight into legislative procedures is translated into a framework for making available sources of law through an integrated legislative calendar. Although the theory itself does not support automating the several stages described, in this article some hints are given as to what such automation would have to look like.