The Leibniz Center for Law
Tom van Engers and Radboud Winkels*
Cite as: T van Engers and R Winkels, "The Leibniz Center for Law",
(2010) 7:2 SCRIPTed 402,
http://www.law.ed.ac.uk/ahrc/script-ed/vol7-2/leibniz.asp
© Tom van Engers and Radboud Winkels 2010. This work is licensed under a Creative Commons Licence.
1. Introduction
The Leibniz Center for Law has its roots in the former department of Computer Science & Law (started in 1988) of the Law Faculty of the University of Amsterdam, and currently houses about fifteen researchers. The Leibniz Center conducts research and provides education in the field of Artificial Intelligence (AI) and law. In our research we focus on the development and application of techniques from AI to the field of law, with the aim of supporting legal practice and bringing new insights to legal theory. By building computational models of legal reasoning we work in the tradition of Leibniz, developing and using a formal “lingua universalis” and mechanical reasoning procedures that provide reliable, trustworthy results.
The Leibniz Center for Law has longstanding experience with legal ontologies, automatic legal reasoning, legal knowledge-based systems, (standard) languages for representing legal knowledge and information, user-friendly disclosure of legal data, and the application of ICT in education and legal practice. It plays an important role in the development of eGovernance at both national and international levels. The centre provides advice on change-management issues in knowledge-intensive legal processes and on the improvement of knowledge productivity in legal organisations.
The Leibniz Center for Law has participated in many national and international projects for applied research, in which companies, governments and universities cooperate (cf. CLIME, E-POWER, eCOURT, Legal Services Counter). It was the initiator of the CEN MetaLex initiative,1 an XML interchange format and standard for legal documents. The Center was recently the coordinating partner for two EU-financed projects: TRIAS2 and ESTRELLA.3 In TRIAS we developed modular electronic teaching material on e-government for civil servants, using among other things a semantic wiki. ESTRELLA was aimed at developing a formal legal knowledge interchange format (LKIF) for exchanging legal knowledge using semantic web technology. Currently we are running a national science foundation project called AGILE, targeted at the development of a design method, a distributed service architecture, and support tools that enable organisations to better govern their legislation- and regulation-based information services within a networked environment. Furthermore, we are a partner in the FP7 project IMPACT on computational models of argumentation about policy issues. In this project we aim to apply natural language processing (NLP) techniques to multi-threaded dialogues about policies. We aim at (semi-)automatic argument reconstruction, using both syntactic and semantic features of the participants’ natural language expressions.
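To give a flavour of what an XML interchange format for legal documents involves, the following minimal sketch builds a small document fragment with Python’s standard library. The element and attribute names are invented for this illustration only and do not reproduce the actual CEN MetaLex schema.

```python
import xml.etree.ElementTree as ET

# Hypothetical element and attribute names, invented for illustration;
# they do not reproduce the actual CEN MetaLex schema.
act = ET.Element("act", attrib={"id": "example-act-2010"})
article = ET.SubElement(act, "article", attrib={"num": "1"})
paragraph = ET.SubElement(article, "paragraph", attrib={"num": "1"})
paragraph.text = ("A person who processes personal data shall notify "
                  "the supervisory authority, as referred to in ")

# Cross-references become machine-readable elements with resolvable targets.
ref = ET.SubElement(paragraph, "ref",
                    attrib={"target": "example-act-2010/article/2"})
ref.text = "Article 2"
ref.tail = "."

print(ET.tostring(act, encoding="unicode"))
```

The point of such a standard is that the structure, identifiers and cross-references of a legal source become machine-readable, so that different tools and administrations can exchange and link legal documents.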
2. Computational Legal Theory and Knowledge Management
At the core of the research conducted within the Leibniz Center for Law is research into legal knowledge and legal reasoning. Law is the most important and ubiquitous social coordination mechanism in society. The complexity of modern societies is reflected in our legal system. Legislation regulates the interaction between the members of society, but managing the legal system becomes a problem in itself because of its sheer size, interrelations and rapid changes. The complexity is further increased by internationalisation (or “globalisation”). Computers and ICT are complicating factors in themselves, but they can also function as facilitators. Computers and the Internet play an increasingly important role in coping with this growing complexity, making the law easier to access, maintain and enforce. In the Leibniz Center we also develop methods and tools for a rational analysis of the effects regulations have on society.
Typical of the research approach at the Leibniz Center for Law is that we are trying to develop a computational theory of legal reasoning. Such a theory would allow objective testing, using the computer as our laboratory. The theory we are trying to develop covers various aspects of legal reasoning. The first aspect is the ability to abstract from concrete situations and to develop a set of norms regulating the world in which those situations may occur. This type of reasoning is typical for both legislative drafters and judges. Another aspect is the legal qualification problem, i.e. how to interpret a concrete situation in legal terms and how to attribute meaning to it from a legal perspective. Furthermore, an explicit computational theory will support various legal reasoning tasks, including legal assessment, legal planning and argumentation.
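To make the notion of legal assessment concrete, the following minimal Python sketch illustrates the general idea, under our own simplified assumptions rather than the Center’s actual formalism: a concrete case, described as a set of facts, is checked against explicitly represented norms, each pairing an applicability condition with a deontic conclusion.

```python
# Minimal sketch, not the Center's actual formalism: legal assessment as
# checking the facts of a concrete case against explicitly represented norms.

# Each norm pairs an applicability condition over the facts with a conclusion.
norms = [
    {"id": "n1",
     "condition": lambda f: f.get("age", 0) < 18
                            and f.get("activity") == "purchase_alcohol",
     "conclusion": "prohibited"},
    {"id": "n2",
     "condition": lambda f: f.get("taxable_income", 0) > 20000,
     "conclusion": "obliged to file a tax return"},
]

def assess(facts):
    """Return the conclusion of every norm whose condition the facts satisfy."""
    return [(norm["id"], norm["conclusion"])
            for norm in norms if norm["condition"](facts)]

case = {"age": 16, "activity": "purchase_alcohol", "taxable_income": 0}
print(assess(case))  # -> [('n1', 'prohibited')]
```

In practice, of course, the hard problems lie in qualifying the raw facts in legal terms in the first place and in resolving conflicts between applicable norms, which is precisely what a computational theory of legal reasoning has to account for.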
We are developing this legal theory from two different angles: a descriptive perspective, covering different aspects of current legal practice, and a prescriptive perspective, aimed at avoiding potential human errors in legal reasoning tasks. Using our research and the resulting explicit legal theory, we can contribute to the legal field by offering a method and tools that can be used for legal analysis. This enables the development of a more coherent legal system and supports reasoning about complex legal issues.
Since we are striving for an explicit computational theory of legal reasoning, formalisation of the legal domain is central to our research. Consequently, our research includes the formal representation of (legal) concepts, expressed in so-called ontologies, the representation of norms, and legal problem-solving methods (including legal argumentation).
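As a deliberately simplified illustration of representing legal concepts in an ontology, the sketch below uses the open-source rdflib library with an invented example namespace; the class names are ours and are not the actual LKIF vocabulary.

```python
# Simplified sketch using the open-source rdflib library (pip install rdflib).
# The namespace and class names are invented for this example; they are not
# the actual LKIF vocabulary.
from rdflib import Graph, Namespace, RDF, RDFS, Literal

EX = Namespace("http://example.org/legal#")
g = Graph()
g.bind("ex", EX)

# A tiny hierarchy of legal concepts.
g.add((EX.Norm, RDF.type, RDFS.Class))
g.add((EX.Obligation, RDFS.subClassOf, EX.Norm))
g.add((EX.Prohibition, RDFS.subClassOf, EX.Norm))

# One individual norm with a human-readable label.
g.add((EX.TaxFilingObligation, RDF.type, EX.Obligation))
g.add((EX.TaxFilingObligation, RDFS.label,
       Literal("Residents with taxable income above the threshold "
               "must file a return.")))

print(g.serialize(format="turtle"))
```

Once concepts and norms are represented in this explicit, machine-readable form, they can be shared between systems, checked for consistency and used as input for automated legal reasoning.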
Building legal ontologies and formal models of law is a time-consuming and effort-intensive task. One way to facilitate this process, and at the same time to better understand how humans process legal sources, is to use natural language processing (NLP) techniques to extract this information automatically from the original sources. The Leibniz Center has developed parsers to extract concepts and relations from legal texts, as well as references within documents and to other legal sources. We are currently experimenting with extracting norms from sources of law and with understanding the issues and arguments in policy-making processes aimed at creating new regulations or adapting existing ones, and we wish to continue this line of research in the coming years. In this line of research we particularly focus on the discovery of frames and recurrent patterns, as we aim to understand legal reasoning and the comprehension of legal problems and norms. We contrast our results with those obtained by statistics-based NLP approaches, as these are the most frequently used approaches for, among other things, supporting (legal) document retrieval.
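As a toy illustration of the kind of extraction involved (not the Center’s actual parsers), the sketch below pulls cross-references to articles out of a fragment of legal text with a simple regular expression.

```python
import re

# Toy sketch, not the Center's actual parsers: extracting cross-references to
# articles from a fragment of legal text with a regular expression.
text = (
    "The notification referred to in Article 5(1) shall be made in accordance "
    "with Article 7, without prejudice to Articles 12 and 13."
)

# Matches e.g. 'Article 5(1)', 'Article 7' and 'Articles 12' (not the bare '13').
pattern = re.compile(r"Articles?\s+\d+(?:\(\d+\))?")
print(pattern.findall(text))  # -> ['Article 5(1)', 'Article 7', 'Articles 12']
```

Real reference parsers must of course also resolve enumerations such as “and 13”, abbreviations, and references to other sources of law, which is part of what makes this a research topic rather than a pattern-matching exercise.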
3. The Future
Almost 300 years after Leibniz, research on his ideas of a lingua universalis and automated legal reasoning is still very much alive. The researchers at the Leibniz Center for Law, together with their international partners, are pushing this research forward, achieving results that are both scientifically and practically meaningful. Cooperation with the legal field is also important to our research, and we usually include practitioners in our projects. Their input is essential for testing academic ideas as well as for transferring academic knowledge to practitioners.
Successful cooperation between legal practice and academia, as we have realised in our cooperation with the Tax and Customs Administration, the Immigration Service, the Ministry of Spatial Planning, the Ministry of Justice, the Council for the Judiciary, the State Council, the Council for Legal Aid and others, does not guarantee academic reward or sustained financial support for AI and Law research. One problem is that the research usually conducted in faculties of law comes from a different tradition than AI research, which has its main roots in science faculties. Another complicating factor is that the way national science foundations organise their review processes is not exactly helpful to typically multi-disciplinary AI and Law research. As a result, only a few research groups addressing AI and Law themes have survived, and some are so small that we can barely speak of them as research groups. Legal practitioners, and especially public administrations, are however increasingly realising that the results of AI and Law research can help them solve practical problems, especially how to support an increasing case load – both in terms of the number of cases and their relative complexity – in an effective and efficient way. Compliance with the law and agility of the organisation are two elements that play an important role in creating solutions for the institutional tasks of these organisations, as well as in providing useful feedback to policy-making processes and supporting the creation of effective legislation. There is an obvious need for adequate theories of legal problem solving and legal reasoning as a basis for the design rationales of systems that support law enforcement organisations and courts. This is where institutes like the Leibniz Center for Law can contribute.
The role of ICT systems in our modern society is increasingly important, and today ICT plays an essential role in almost any legal process. Besides our fundamental research, we therefore also conduct applied research together with our partners, and we will continue to invest in transferring knowledge so that practitioners can benefit from our research. We are confident that our research will contribute to a better understanding of legal problem solving and, consequently, to a better society.
More information about our current and past research projects can be found at www.LeibnizCenter.org.
* Professor and Associate Professor, Leibniz Center for Law, respectively.
1 See www.metalex.eu (accessed 2 Aug 2010).
2 See www.triastelematica.org (accessed 2 Aug 2010).
3 See www.estrellaproject.org (accessed 2 Aug 2010).