Audit Trails in Evidence: Analysis of A Queensland Case Study
Caroline Allinson Manager Information Security, Information Management Division, Queensland Police Service and Information Security Research Centre (ISRC), Queensland University of Technology, Brisbane allinson.carolinel@police.qld.gov.au
The author would like to acknowledge the assistance of Professor William Caelli, Head of the School of Data Communications, Queensland University of Technology, who helped the author prepare this article for publication.
Abstract
This is the second paper in a two paper series resulting from the review and analysis of information systems audit trails presented in legal proceedings as evidence in the State of Queensland, Australia.
The preliminary analysis performed on eleven court cases in Australia involving Queensland Police Service information systems audit trails presented as evidence was reported in the first paper.
This second paper relates the findings in the first paper to normal requirements for trusted maintenance of audit trail information in sensitive environments and discusses the ability and/or willingness of courts to fully challenge, assess or value audit evidence presented.
Keywords: Law enforcement, audit trails, evidence, information security, court digital evidence, computer.
This is a refereed article published on 15 December 2003.
Citation: Allinson, 'Audit Trails in Evidence: Analysis of a Queensland Case Study', 2003 (2) The Journal of Information, Law and Technology (JILT). <http://elj.warwick.ac.uk/jilt/03-2/allinson.html> New citation (as at 1/1/04): <http://www2.warwick.ac.uk/fac/soc/law/elj/jilt/2003_2/allinson/>
1. Introduction
The first paper in this two paper series presented preliminary work by discussing the processes and procedures for the retention and use of audit trails by the Queensland Police Service (QPS), State of Queensland, Australia. It reviewed 11 cases to provide an initial assessment against a preliminary set of 13 major themes in relation to the secure maintenance of audit trail facilities and the presentation of related evidentiary materials (Allinson, 2002).
Regardless of the importance of the audit trail information to each case reviewed, from a technical perspective none of the evidence presented was rejected or seriously challenged; it was in fact readily accepted by both the prosecution and defence. It is recognised in this paper that it is commonplace for both the prosecution and defence to agree upon the evidence, thus making the technical issues irrelevant; however, this did not appear to be the situation in the majority of the eleven cases reviewed.
There was no attempt to challenge how Information Technology (IT) systems can be tampered with nor how audit trails can be manipulated, nor to what extent this tampering and manipulation can be monitored or detected. There was no attempt to challenge the reliability of information systems and what impact or effect this may have on the audit material presented.
The 13 themes in the first paper are further revised and assessed in section 2 of this paper and are placed in current context in relation to legislation and emerging and accepted information systems security management practices. These 13 themes arise from accepted theoretical work in the development of an appropriate methodological and classificatory framework for assessment in information security management as it concerns information audit trail data in evidence.
In the first paper the analysis was based on actual case transcripts. In this paper, the concentration is on the technical aspects and reliability of IT systems and the audit trails associated with those systems. The classification scheme and principles used to identify the pertinent matters in relation to acceptance of audit trails as evidence have been finalised and are described. The analysis is related to normal requirements for trusted maintenance of audit trail information in sensitive environments. A major concern that has emerged from the case histories is then discussed: the inability and/or unwillingness of courts to fully challenge, assess or value the audit evidence presented.
2. Re-Assessment of Themes
Results given in the first paper were based on initial assessment against a preliminary set of 13 major themes. These themes were not fully defined before the review commenced but were in fact a work in progress. They were developed and refined mainly on the basis of what challenges were determined from the cases themselves. In particular, a significant issue was the importance to the case of the audit trail evidence and any associated attempt to explain or challenge that evidence by the prosecution or defence counsel. The themes were inferred from this review and treated as criteria for significance of evidence.
Reluctance to challenge the evidence presented was possibly due to the lack of knowledge and guidance available. An attempt at this guidance is now given through further review and placement of the 13 themes into current context in relationship to emerging and accepted information systems security management practices. For the purpose of further analysis it is deemed more appropriate to reduce the number of themes in accordance with legislative requirements for proof of computer records (Queensland Evidence Act, 1977).
Several of the themes can be merged. Themes 1, 2, 5, 6, 9, 10, 11, 13 all fall within best practice for computer systems operating properly. Theme 3 relates to staff or persons responsible for information systems. Theme 12 relates to information systems access control mechanisms. Themes 4 and 7 relate to legislation applicable to evidence. Therefore, for further review and analysis, the criteria are revised into four categories i.e. computer systems operating properly, responsibility and requirements for information technology personnel, information systems access control and legislation applicable to evidence.
3. Legislative Requirement
The studies presented in this paper are based on case histories in Queensland; hence, the legislation reported is that relevant to Australia. Whilst the focus is Australian, it is reasonable to assume that the observations made and conclusions arrived at may be equally applicable in other jurisdictions under similar legal regimes, using whatever particular admissibility tests those regimes dictate. It is not unrealistic to expect that lawyers in adversarial systems generally may lack the inclination or expertise to challenge computer-generated evidence.
Australia has a Commonwealth Evidence Act and an Evidence Act for each state thus providing rules on a jurisdictional basis. The State Acts set out the rules of evidence which apply to proceedings before the respective state courts, whilst the Commonwealth Act applies to proceedings before the federal courts and courts in the Australian Capital Territory (ACT). All Australian Evidence Acts have rules that deal with the proof of records maintained on computers and also proof of business records. The Acts with their relevant sections are referenced in Table 1.
Australia operates under common law principles. The primary source of law is made by an act of parliament; this is known as statute law and is developed by the legislature. In the Australian judicial system the role of the judge is not only to apply the law but also to help develop it; the other source is therefore common law, better known as case law, developed by the judiciary. Under common law the court determines the admissibility of evidence under those rules applicable to that particular court, supplemented by any rules established under common law. Where a judge has set a precedent under common law, this precedent may override or have significant impact on the admissibility of the evidence in question.
Table 1. Australian Evidence Acts
The Commonwealth Evidence Act 1995 is the result of 16 years' work through a combined effort of the Australian Law Reform Commission and several of Australia's leading lawyers (Crowley-Smith, 1996; Kerr, 1995). The introduction of this Act is a step toward a national legal system in Australia and an attempt to introduce uniform rules in respect of provisions relating to the admissibility of computer records. It was envisaged that all Australian States would follow and introduce State Evidence Acts drafted in identical terms to the Commonwealth Act. To date, New South Wales is the only Australian State to have adopted and implemented this.
In 1998 the Queensland Law Reform Commission released an issues paper on The Receipt of Evidence by Queensland Courts in relation to Electronic Records. This paper was produced for consultation purposes in response to a request from the Attorney General for the Commission to 'review the capacity of the judicial system, both in its criminal and civil aspects to receive into evidence information stored and conveyed in electronic, magnetic or similar form' (1998). Unfortunately this review has not been acted upon and to date the Queensland Evidence Act 1977 has not been modified or replaced.
As can be seen in Table 1, all Australian States have sections within their respective Acts with provisions relating to admissibility of computer records. Whilst they differ in wording there is a similarity, and hence several issues require further discussion. In most Acts it is required to show that:
- the document containing the statement was produced by the computer during a period over which the computer was used regularly to store or process that information from a business perspective;
- the computer system was operating properly;
- the information was produced as part of normal business activities; and
- the person involved with the control and operation of the system occupied a responsible position.
As shown in the first paper, the Queensland Evidence Act 1977, Section 95, is relied upon for 'Admissibility of statements produced by computers'. Section 95(2)(c) refers to 'operating properly', and it is important to determine what constitutes 'operating properly' if this were to be used as a defence. Courts need a guide that provides certainty as to what their expectation of 'operating properly' for an information processing environment should be. Section 95(2)(b) and (d) refer to 'ordinary course'. There needs to be clarification of the term 'ordinary course' in relation to the computer staff performing computer related activities and the actual user base performing business activities.
Section 95(3) is very broad in stating that 'All computers are treated as one'. In the current information technology environment this is not only inappropriate but incorrect. Network capabilities certainly give the 'one computer' appearance but data and information can be shared and this could span several organisations each under different managerial and legal control incorporating computer systems with varying and possibly unknown security and control status. In this situation it may be impossible or at best difficult to determine who is then considered to be defined as 'the person occupying a responsible position' in relation to the computer operation. Even if the overall system is not controlled by several organisations, the networked information technology facilities are such that in large organisations it is not possible for one person only to be knowledgeable in all areas. This would indicate that more than one person must sign the certificate required under Section 95 (Allinson, 2001).
There are two mandatory requirements for any person deemed to be the person occupying a responsible position in relation to the operation of a relevant information technology device or the management of relevant information systems and activities. Firstly there is a need for that person to be aware of all standard configurations as well as the approved and implemented standard operating procedures. In addition that person is expected to have a reasonable knowledge of the known limitations, security 'holes' and problems experienced to date for each system and the interconnectivity as a whole.
There are errors in all IT computer systems. Hence, perfection can never be stated as a requirement. The requirement is merely that process and procedure show proof of the system operating properly, not perfectly, and those responsible for the systems must be able to demonstrate this. Errors in IT systems were recognised by Crowley-Smith in her review of the Australian Evidence Act 1995 (Cth), in which she stated: 'The statutory provisions in the Evidence Act which recognise computer generated documents as evidence are overdue and welcome. However, they do not take into account the accuracy of source data itself and errors propagated through systems'. This can also be verified by the numerous alerts and system 'patches' or 'fixes' documented and made available on internet web sites and alert bulletins associated with information systems manufacturers and vendors. Fault tolerance testing by research laboratories associated with vendor organisations, academic institutions and organisations using the systems is a continual process. Reliability of computer systems has been of concern since the inception of computing, and this has been compounded by the downgrading of security concerns in the design of current commodity based systems. International Standard IS 15408, moreover, sets out details for the evaluation of security and control aspects of computer systems. Accuracy of data entry, performance and security of operating system software, reliability of data transmission, security enforcement in application systems software, and reliability of hardware devices are all open to question. Errors can be the result of numerous incidents, including:
- incorrect or inadequate configuration of system hardware and software;
- hardware failure, where a system or system component has physically failed;
- incorrect system design and coding, causing the system to perform outside the intended requirement;
- incorrect or inadequate change control procedures, causing unpredicted errors;
- human intervention, where administrators, operators, system programmers etc are responsible for unintentional or intentional system malfunction.
Reliability of systems can also be gauged by how well security has been designed, incorporated and monitored in the information systems infrastructure and processing procedures.
To meet the legislative requirement of 'operating properly' there is a need to show that any computer information or document produced is accurate in content and was not affected by a malfunction. Thus, there is a need to determine what constitutes a computer system 'operating properly'. There is also a need to determine how any problem encountered must be dealt with, and what records need to be kept in relation to the history of system problems and incidents.
Where poor practice is encountered and there is no ability to verify the operations within a computer systems environment, it may, and arguably should, be assumed that the system's operation cannot be relied upon, and doubt as to overall system reliability can therefore be held.
Where there is an ongoing requirement for information systems to be accepted in legal proceedings, in relation to reliability of operation and production of electronic documents, it is imperative that all aspects of acquisition, design, implementation, processing, modification and monitoring of each of these stages, together with the ongoing operations, be accounted for to a level of accountability and auditability that allows complete reconstruction of what has taken place.
4. Information Technology Best Practice
Accurate and reliable information, entered, stored, accessed and handled securely, is needed by all organisations employing information systems for critical and sensitive information, and in particular by law enforcement organisations. The processes by which this is achieved must demonstrate to a high level of confidence that a computer system is 'operating properly'. This dictates the necessity for a structure and process that provides, to a high level of confidence, delivery, support and management of information technology aligned to business activity. There must also be demonstrable best practice that recognises, and shows proof of, professional competence in all associated activities. It is essential that this best practice also demonstrates conformance to international and national standards and guidelines, employing security and security monitoring that will withstand legal scrutiny. Organisations must have accepted practices and processes for the collection, storage, handling, processing and provision of data. These practices and processes must be established in such a way that certain conditions are satisfied, and it is these conditions that need to be well established, documented and monitored to provide evidence that the business processes within the organisation are executed in fulfilment of the organisation's objectives. IT systems must be acquired, implemented and function according to these objectives.
A lack of financial and human resources is, in many situations, the reason given for organisations not complying with good practice. This has the impact of downgrading reliability. Organisations that continue to implement new technologies that cannot be sustained by the organisation, particularly when the organisation is responsible for very sensitive information, provide an environment that cannot be adequately relied upon. Executives must be informed to a level that ensures decisions taken in relation to information technology are current, and that IT practice has reached an adequate level of reliability before new technology is introduced.
It is not unrealistic to expect that information technology and systems implemented in support of organisations processing critical and sensitive information, in particular law enforcement, should adhere strictly to standards and guidelines. Therefore, courts being asked to accept into evidence computer records that have been produced from a law enforcement organisation's information system/s must question standards compliance. Law enforcement organisations should also expect that any challenge to the reliability of their systems will be measured against what has been developed and accepted as best practice in the international arena. The majority of law enforcement organisations have significant information technology infrastructure and application systems that support very large stores of sensitive data. Much of this technology is a mixture of mainframe computing from two to three decades past and the basically insecure technology of the commodity based system era of the past decade. This, coupled with the need to connect and communicate externally, has created significant management challenges.
There is also a belief that the legal fraternity is not yet at a stage to confidently challenge assertions made in relation to information technology. This was shown in the first paper, but it is also recognised that this is changing, and this change will impact the 'comfort zone' currently experienced by law enforcement agencies (Allinson, 2001). Legal practitioners from both prosecution and defence teams need only understand, or receive advice on, what is best practice for the operation of a computer centre. This does not necessarily require an 'expert'. An initial basic check list or audit report would provide enough information either to prove correct adherence to standards or to create a sufficient element of doubt about the reliability of the system to reach the level of 'reasonable doubt'.
There are several best practice guidelines for IT service delivery that have been accepted for use internationally. The Information Systems Audit and Control Association (ISACA) has released a comprehensive framework known as 'Cobit' (ISACA, 2001). The United Kingdom's (UK) Office of Government Commerce (OGC) has made available a best practice methodology known as the IT Infrastructure Library (ITIL) (itSMF, 2001). ITIL has been recognised and implemented to a certain level within some law enforcement agencies.
There are a number of other standards that relate to information security. International Standard ISO 17799 entitled Information Technology - Code of Practice for Information Security Management, for example, is a common basis from which other security policy development is derived.
4.1 IT Infrastructure Library (ITIL)
ITIL is a framework that consists of detailed information on the processes considered to be best practice in the management and exploitation of IT and IT infrastructure. It is based on practical experience and its use has been shown to increase the quality of IT services. The ITIL IT infrastructure includes software, applications, hardware, documentation, procedures and systems. It is concerned with managing an existing working environment and managing changes within that environment. Its implementation must be tailored to fit each particular IT service need.
The ITIL processes are combined into 9 'sets'. The 2 sets comprising service support and service delivery (refer Figures 1 and 2), together with the security management process, are the areas of concentration in this paper. Within the ITIL IT environment it is contended that service support generally concentrates on the day to day operation and support of IT services, whilst service delivery looks at the long term planning and improvement of IT service provision (itSMF, 2001). Information security management exists to serve the interests of all business activity and processes. Therefore it is essential that security be appropriate to the importance, sensitivity and value of the information being processed and the risks associated with it.
In organisations processing critical and sensitive information, in particular a law enforcement environment, it is imperative that decisions are made at the highest levels to ensure a commitment to information security thus providing more reliability that the business processes are being met.
ITIL suggests 4 main areas of consideration for information security, viz. organisational, physical, technical and procedural. 'Organisational' includes clear responsibilities and tasks, guidelines, reporting procedures and measures properly matched to the needs of the business and the IT systems. 'Physical' means the security measures associated with the separation of computer infrastructure, both for and within the centre housing IT facilities. 'Technical' security measures provide security in a computer system or network, i.e. operating systems for segregation of users, and secure gateways and firewall security for internal and external separation and monitoring of connections. 'Procedural' measures describe how personnel/staff are required to act in particular cases and the establishment of 'standard operating procedures' under which they are required to work. There is a relationship between security and most processes within ITIL.
Service support is an ITIL function that consists of a number of processes that interface with deliverables (refer Figure 1). To determine the veracity of the term 'operating properly' there must exist a support service that can demonstrate all activity in relation to the processes of incident management, problem management, change management, release management and configuration management for the information technology implemented within an organisation. This needs to apply regardless of the size of the IT environment.
Figure 1. ITIL - Service Support Processes (itSMF, 2001)
Service delivery is a function that consists of 5 processes with expected deliverables (refer figure 2). To determine whether a system is 'operating properly' or not, there must exist a service delivery activity that can demonstrate service level management, availability management, capacity management, financial management for IT services, and IT service continuity management.
Figure 2. ITIL - Service Delivery Processes (itSMF, 2001)
4.2 Control Objectives for Information and Related Technology (COBIT)
CobiT was developed by the Information Systems Audit and Control Association (ISACA). ISACA is the professional standards body for Information Systems Audit world-wide. CobiT is a generally applicable and accepted standard for good practice for Information Technology (IT) control. It is based on ISACA control objectives and international standards for activities in relation to IT (ISACA, 2001). These standards include but are not limited to:
- technical standards issued by the International Standards Organisation (ISO), and the Electronic Data Interchange for Administration, Commerce and Trade (EDIFACT), which is a United Nations document;
- professional codes of conduct issued by the Council of Europe (EC), the Organisation for Economic Cooperation and Development (OECD), and ISACA itself;
- qualification criteria for IT systems and processes produced by the Common Criteria (CC) as ISO 15408, the Information Technology Security Evaluation Criteria (ITSEC), the ISO 9000 Quality Management and Quality Assurance Standard, and the earlier Trusted Computer System Evaluation Criteria (TCSEC);
- professional standards for internal control and auditing: the Committee of Sponsoring Organizations of the Treadway Commission (COSO) report, the International Federation of Accountants (IFAC), the Institute of Internal Auditors (IIA), ISACA, and the Certified Practicing Accountant (CPA) standards;
- industry-specific requirements from banking and IT manufacturing;
- industry practices and requirements from the European Security Forum (ESF), and government-sponsored platforms such as the Infosec Business Advisory Group (IBAC), the National Institute of Standards and Technology (NIST), and the Department of Trade and Industry (DTI).
It was developed in response to a need for a system of internal control for IT in organisations and is considered a comprehensive checklist for business process owners.
The CobiT framework is built on the premise that 'IT Resources need to be managed by a set of naturally grouped IT processes to provide the information that the enterprise needs to achieve its objectives' (ISACA, 2001). It promotes 'process focus' and 'process ownership' and divides IT into 34 'processes' belonging to 4 'domains'. The 4 domains are 'planning and organisation', 'acquiring and implementing', 'delivery and support', and 'monitoring' (refer Table 2).
Table 2. Control Objectives Summary Table (ISACA, 2001)
It then looks at fiduciary, quality and security needs and provides 7 information criteria that an organisation can use to define requirements for IT. The 7 criteria are effectiveness, efficiency, availability, integrity, confidentiality, reliability, and compliance. There are 300 detailed control objectives to support this framework.
4.3 An Information Security Framework
The most recognised and accepted best practice for information security management is contained within International Standard ISO 17799, entitled 'Information Technology - Code of Practice for Information Security Management' (ISO/IEC, 2001). This standard came into operation in December 2000 and is considered 'essential to business operations', as its usage is becoming a contractual obligation due to its inclusion in 'Service Level Agreements' (ISACA, 2002).
Part one of the standard is the 'Code of Practice', which consists of 10 guiding principles covering strategic, operational and human issues. Part two is BS7799-2 Information Security Management, which consists of 10 main areas covering 127 controls. Both CobiT and ITIL refer to and rely on the information security best practice of ISO 17799 and BS7799-2, the British standard in this area (ISO/IEC, 2001; BS, 1999).
It is therefore considered that to fulfil legislative requirements for computer records in evidence where systems need to be evaluated as 'operating properly' it is imperative that organisations demonstrate their ability to successfully benchmark and audit themselves against internationally recognised security standards. There is need for proof that the organisation has adopted an IT control framework, has successfully implemented it and continually monitors it.
This control framework, its implementation, and the process by which it is monitored, must be complemented by independent audit and certification. In most Australian states the Auditor General has responsibility for sign-off for government organisations, including law enforcement.
5. Requirements for Trusted Maintenance of Audit Trails
There is also a need to analyse the context and content of the audit trails themselves. This is imperative given the international standards, guidelines and frameworks under which it is desirable, if not mandatory, for information systems to operate.
Audit trails must have a security level such that a challenge will not jeopardise the reliability of the information, and each audit trail must contain enough information to provide proof of activity. This is provided for in ISO/IEC 17799, Section 9.7, entitled 'Monitoring Systems Access and Use'. Protecting the confidentiality and integrity of audit trail records, and the computerised systems that administer, store and control them, implies a mandatory requirement for security and security monitoring. In particular, where evidentiary requirements must be met, the security implemented must be sufficient to demonstrate that protection is given to information and data, ensuring that modification or destruction, whether intentional or accidental, does not occur.
5.1 Audit Trail Context
Audit trails must be generated, retained and presented within a security infrastructure in which a security framework aligned with recognised national and international standards and guidelines has been implemented. This security infrastructure must provide for segregation of audit trail information from the databases and systems for which it is generated. There must be demonstrable best practice, procedures and processes, such as ITIL and CobiT, that are recognised and accepted as methodologies providing a level of assurance for what may be categorised as a system 'operating properly'.
Within this host environment there must exist security for audit trails that addresses what and how the audit information is generated and captured, how the audit information is accessed and displayed, how the information is centrally stored and managed including error handling and recovery, how the retrieval of audit information is managed, and what backup facilities are in place.
Reliability of security within a computer operating system and the security level to which the system has been certified must be considered. A trusted operating system secured to a level of 'mandatory' or equivalent access control regime is the minimum requirement for reliable use of audit trails.
One of the important components within the security framework is audit trail integrity. This is essential for proof, to a high level of certainty, that the records have not been tampered with. Integrity checking of audit trail information, to ensure no malicious or accidental changes have occurred, can be achieved through the use of cryptographic based checksumming. Two separate areas need to be considered. Firstly, the audit trail record must be correctly and provably associated with the end user of the system; this must include biometric checking. Secondly, the audit trail records and files themselves must be correctly and reliably maintained.
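As a simple illustration of cryptographic checksumming for the second of these areas, the sketch below chains a keyed checksum (HMAC) over each audit record and the checksum of the record before it, so that alteration or deletion of any entry invalidates every later checksum. It is a minimal sketch only; the key name, record format and key-handling arrangement are illustrative assumptions and not the method used by the Queensland Police Service.

```python
import hashlib
import hmac
import json

# Illustrative key only; in practice the signing key would be held and
# managed separately from the systems that generate the audit records.
SIGNING_KEY = b"example-audit-signing-key"

def seal_record(record: dict, previous_mac: str) -> str:
    """Compute an HMAC over the record plus the previous record's MAC,
    chaining entries so that altering or removing any earlier record
    invalidates every later checksum."""
    payload = json.dumps(record, sort_keys=True).encode() + previous_mac.encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def verify_trail(records: list, macs: list) -> bool:
    """Recompute the chain and compare it against the stored MACs."""
    previous = ""
    for record, stored in zip(records, macs):
        if not hmac.compare_digest(seal_record(record, previous), stored):
            return False
        previous = stored
    return True
```

A sealed copy of the trail, as discussed below, could then be compared against the working copy by recomputing the same checksums.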
Consideration of the number of copies of an audit trail has become an area of importance for computer forensic examination. It is required that one copy be sealed and one used for examination. This can be extended to information systems audit trails in general, although the sheer volume may be prohibitive. A challenge to the audit trail, if integrity checking does not exist or exists in a less secure form, may be the requirement for proof of integrity by checking the sealed copy of the audit trail through comparison.
Formally documented responsibilities for audit trails need to be in place, covering everyone from the person responsible for development and support to the person responsible for examination. These responsibilities must have associated documentation of the processes and procedures carried out by the respective persons.
5.2 Audit Trail Content
The format of an audit trail differs from system to system. The format used for this analysis is based on the format implemented by the Queensland Police Service for activity logs associated with operational police application systems [1]. This format was presented in evidence as discussed in the first paper.
Audit trail content requires analysis both of what is recorded within each event/activity record and of whether or not all event/activity records are recorded in the audit trail. An information systems event/activity is associated with the adding, deleting, modifying or viewing of data or files stored within a system. Attempted or failed access to information must also be recorded, with logon and logoff activity being essential for completeness. Without full recording of all event/activity it is not possible to state or prove with a degree of certainty that particular actions were or were not performed. There is therefore a requirement for a high level of assurance that all activity is recorded.
Due to the large volume of information it is desirable to divide audit information into two parts consisting of an event/activity log and a full screen audit detail file. Each record within an information system's event/activity log should contain at least six separate fields or pieces of information related to an event/activity. These 6 pieces are depicted in Figure 3.
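As an illustration only, a record along these lines might be represented as follows; the field names are assumptions drawn from the discussion of Figure 3 and the fields examined below, not the actual QPS record layout.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class EventLogRecord:
    """One event/activity entry; the six fields assumed here follow the
    discussion in the text, not the actual QPS activity log layout."""
    event_date: str      # date of the activity, e.g. "2003-12-15"
    event_time: str      # time of the activity, e.g. "14:32:07"
    user_id: str         # identifier assigned to the system user
    terminal_id: str     # location / terminal identifier
    transaction_id: str  # transaction or screen identifier
    activity: str        # brief description of the action performed
```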
The validity, relevance and reliability of each piece of data is the second area of scrutiny. The date and time fields can be recorded from several sources and form a vital evidentiary datum; this importance was shown in the first paper. An audit trail can be used to support or refute a claim made by the user of a system. It can also support or refute a claim made by a person who was the subject of an activity, such as a query by a user of the system. This may relate to activity performed by police that placed persons or vehicles at the scene of a crime. A challenge would take place when police are claiming to have checked a person at a given place and time. This raises the concept of an 'alibi', and important questions in relation to this field may be: is the system clock synchronised against standard time sources and, if so, from what source; how often is this process checked and who is responsible; and is there proof of what activity has taken place in relation to clock synchronisation?
As a standard practice in an IT environment all computer servers and workstations should be synchronised against 'time-servers'. Time-servers can be internal or external to the organisation's computer systems network. An arbitrated protocol involving the services of a trusted notary is useful to ensure the integrity of the time messages in a network. Time synchronisation can be implemented using a notary to mediate the transmission between the network host and the server or workstation (Pfleeger, 1989).
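By way of illustration, the sketch below queries a time-server using the basic NTP request format and reports the approximate offset between the server's clock and the local clock; such a check could support evidence of clock synchronisation. It is a simplified sketch under assumed conditions: the server name is an example public pool, round-trip delay and fractional seconds are ignored, and it is not the mechanism, notary-based or otherwise, actually used by the Queensland Police Service.

```python
import socket
import struct
import time

NTP_EPOCH_OFFSET = 2208988800  # seconds between 1900-01-01 and 1970-01-01

def clock_offset_seconds(server: str = "pool.ntp.org", timeout: float = 5.0) -> float:
    """Return the approximate difference between a time-server's clock and
    the local clock (positive means the local clock is behind)."""
    request = b"\x1b" + 47 * b"\0"  # LI=0, version 3, client mode
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(request, (server, 123))
        data, _ = sock.recvfrom(512)
    transmit_seconds = struct.unpack("!12I", data[:48])[10]  # transmit timestamp
    return (transmit_seconds - NTP_EPOCH_OFFSET) - time.time()
```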
Figure 3. Event Log Content (Based on Queensland Police Service Activity Log)
The user identifier field contains the code assigned to a user of a system. Most audit trails hold a 'user-id', and contrary to common belief this does not positively identify a person as a system user. An authentication 'token' is also required to verify a claimed identity. Inability to positively and unequivocally identify a person from 'user-id' and 'password' combination leaves the area open to challenge. This field is possibly the most open to challenge, as was shown in the first paper.
The most commonly used method for the provision of access to information systems is the allocation of a 'user-id' and 'password' combination. The process involves the assignment of an identifier to a particular person or a role in the organisation. When this identifier is recorded in an audit trail it is not possible to say that the person assigned the user-id actually used it. There is no proof of real identity through the use of a user-id and password combination, only that this usage actually occurred. All that can be stated in evidence is that, for example, this person was assigned user-id 'xyz123' and it was user-id 'xyz123' that was used to perform the system activities in question. In high confidentiality and integrity environments there must exist a method for checking who the person is, not just what is known or possessed.
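The limitation can be seen in a minimal password-verification sketch of the kind commonly used for such access control; the function names and storage arrangement are illustrative assumptions, not the QPS implementation. A successful check proves only that the supplied secret matched the one stored against the user-id, and it is only the user-id that reaches the audit trail.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = b"") -> tuple:
    """Derive a salted hash of the password for storage."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def authenticate(user_id: str, password: str, credential_store: dict) -> bool:
    """Check the supplied password against the stored hash. Success shows
    only that someone presented user_id's secret; the audit trail records
    the user-id, not who was at the keyboard."""
    salt, stored_digest = credential_store[user_id]
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, stored_digest)
```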
More sophisticated systems using proven biometric [2] techniques may in the future provide a more reliable means of knowing that a particular person either performed the activity or had knowledge of the activity by being present for the biometric check. Audit trails and the authentication of users through an access control process are intricately linked. Positive identification of a user, with the ability to tightly couple the user to the audit trail for proof of use, is therefore essential.
The location field is important as proof of the place from which activity was performed. In a computer network there can exist a concept of 'static terminal identifiers', where each computer terminal is assigned a unique identifier that remains constant for the duration of its existence in the system. This creates an administrative overhead for the system itself and for the support of the system. A second, and more common and popular, implementation is to have 'pooled' identifiers, where each computer terminal does not have a static identifier but is assigned a different one each time it connects to the system. The pooled identifier method does not provide for proof of location unless there are a number of pools within subnets based on location. The static identifier method combined with subnets based on location is more reliable for evidentiary purposes. As technology advances it may be possible in future systems to determine location without a static one to one relationship; for example, the main processor identity number hard wired into new computer units could be used. Regardless of the method used, it is essential that location be accurately identified and recorded. As shown in the first paper, it can also be used in conjunction with the time and date fields in situations requiring an 'alibi'.
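A location-based subnet arrangement of the kind described can be illustrated as below; the subnets and site names are hypothetical and serve only to show how a terminal's network address might be tied back to a recorded physical location.

```python
import ipaddress

# Hypothetical allocation: each site is given its own subnet so that even
# a pooled terminal identifier can be resolved to a physical location.
SUBNET_LOCATIONS = {
    ipaddress.ip_network("10.1.0.0/24"): "Headquarters, level 3",
    ipaddress.ip_network("10.2.0.0/24"): "Regional station A",
}

def resolve_location(terminal_address: str) -> str:
    """Return the recorded location for a terminal's network address,
    or flag it as unknown for further investigation."""
    address = ipaddress.ip_address(terminal_address)
    for subnet, location in SUBNET_LOCATIONS.items():
        if address in subnet:
            return location
    return "unknown location - requires investigation"
```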
Information systems are structured to consist of many transactions. A transaction identifies the processing of a particular activity, for example 'enter a crime report' or 'execute a warrant'. Each transaction is assigned a unique identifier and in most cases each screen within a transaction is also assigned a unique identifier. The field in an audit trail that identifies the transaction within an information system provides not only evidence of where the user was in the system at the time of the activity but also plays a vital role in the examination of the audit material. Proof of a business process requires that actions are performed in the correct sequence for that process. Recording the transaction identifier in the audit trail provides a trace of process steps and can quite often show a sequence of activity that proves a particular situation.
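Such a sequence check might be sketched as follows; the transaction codes are invented for illustration and do not correspond to any actual QPS transaction identifiers.

```python
# Hypothetical transaction identifiers for one business process.
EXPECTED_SEQUENCE = ["LOGON", "QRY-PERSON", "CRIME-RPT-ENTER", "CRIME-RPT-CONFIRM"]

def follows_business_process(transaction_ids: list) -> bool:
    """Check that the expected process steps appear in order within an
    audit trail extract; unrelated transactions may be interleaved."""
    position = 0
    for transaction in transaction_ids:
        if position < len(EXPECTED_SEQUENCE) and transaction == EXPECTED_SEQUENCE[position]:
            position += 1
    return position == len(EXPECTED_SEQUENCE)
```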
Showing what was accessed and what actions were performed is essential for evidence of activity. This can be provided in brief form as depicted in Figure 3, or through a full screen audit where a screen dump is taken and recorded in the audit trail. This must show the before and after image of action taken. As shown in the first paper the Queensland Police Service has both the activity log and full audit trail facilities implemented for application systems.
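One way of capturing such before and after images alongside the activity record is sketched below; the record structure and helper names are assumptions for illustration rather than the QPS full screen audit mechanism.

```python
import copy
import json
from datetime import datetime, timezone

def audit_change(record: dict, apply_change, user_id: str, audit_log: list) -> dict:
    """Apply a change to a record and append before and after images to
    the audit log so the full effect of the action can be reconstructed."""
    before = copy.deepcopy(record)
    after = apply_change(copy.deepcopy(record))
    audit_log.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user_id": user_id,
        "before": json.dumps(before, sort_keys=True),
        "after": json.dumps(after, sort_keys=True),
    })
    return after
```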
6. Willingness of Courts to Fully Challenge, Assess or Value Audit Material in Evidence
As shown in the first paper there was reluctance to challenge technical aspects of computer systems. Also there was confusion in the directing of questions to witnesses. The ability to challenge the value and reliability of the evidence would have been greatly enhanced through the use of a simplified framework from which understanding of organisational information management could be derived. This framework would enable assessment of the evidence, its origin, any changes, and persons responsible for areas under which the evidence had been handled.
The Information Technology industry is now vast and can be divided into areas of specialty. A typical information management structure, in large organisations employing good segregation of duties, is divided into at least six sections. These sections could be technical infrastructure, application systems, operations, database systems, planning and security. The section responsible for information technology security may or may not be under the control of the IT manager. In accordance with standard ISO 17799 the user base has ownership and custodianship of data entered and stored in application systems (ISO/IEC, 2001).
Figure 4. Example of an Information Technology Organisational Structure Chart
By using the structure in Figure 4, it can now be shown that the questioning and challenge in R v Grimley were incorrectly addressed (Allinson, 2002). In this case, evidence in relation to the use of the Crime Reporting application system, data stored as a result of the functioning of this system, and the audit trail information relating to this use were challenged. Questions in relation to the functioning of the application system should have been directed to the applications manager. Questions on the use and changing of data in the database should have been directed to the database manager, and questions on the support of both application and database systems, from a platform perspective, to the technical infrastructure manager. The security officers were responsible only for questions relating to the extraction and, in some cases, analysis of the audit trail information. The investigating police were responsible for the majority of the analysis performed on audit trail information.
7. Determination of Responsible Person/s
In defining a responsible person, as required in Section 95 of the Queensland Evidence Act 1977, it is important to know what role the person plays in the organisation, including level of responsibility; what knowledge the person has in relation to the system or systems; what skill is possessed; what experience has been gained, in relation to both period of time and extent of knowledge; what training has been undertaken; and what level of educational status has been achieved.
All of this must be based and judged within the organisational structure and the position description associated with the job under scrutiny. As depicted in Figure 4, the manager of each section should arguably be considered the responsible person/s. Each manager has the overall responsibility to ensure each area is operating properly, while the persons working under their control are responsible for the daily work tasks. It is the managers themselves who have the responsibility and control, and it is therefore suggested that they must sign off on what has taken place in the operation of their respective areas. A manager who is not aware of what is happening within his or her section could be considered inappropriate for that position and would not be a credible witness in a court of law, nor for the organisation.
There appears to be a perception that all persons involved in IT are experts as defined in law. This is untrue, and most persons who fit the 'responsible person' requirement may not be experts to that level. It is more probable than not that the responsible person attesting to a system 'operating properly' will provide factual evidence, not opinion evidence, and will be at 'practitioner level' only. 'Practitioner level' in this context means that they have the knowledge, training, experience and education to fulfil the management position to which they are appointed. A mandatory requirement is that they can show, through proper procedures and processes, the factual operation of that part of the system under their control.
Access to, responsibility for, and control of audit trails cover the personnel role and function as well as the larger departmental responsibility for audit trails. It is important to know whether this responsibility is included in job descriptions: for example, are audit trails generated to monitor technical support activity? The ability to show who has access to audit trails, what level of security is implemented and what segregation of duties is in place must therefore be addressed.
8. Conclusion
Understanding and mounting a challenge in court to information systems, and the processes associated with them, in relation to electronic evidence is not simple. It has been shown in this paper that a high level structured approach to understanding standards and best practice methodologies, combined with knowledge of information technology organisational structure and personnel responsibilities, will assist and should be used as a base from which to work.
It is suggested that, had the legal teams from both the prosecution and defence taken time to understand these processes, more appropriate actions and challenges could have been taken in relation to the 11 cases presented and analysed in the first paper.
Had there been a requirement for a detailed declaration of what 'operating properly' actually meant, and of who 'the responsible person' actually was and where that person was employed within the information technology framework, more appropriate questions to challenge the veracity of the audit trail evidence could have been formulated. This more appropriate questioning may have achieved a different outcome in those cases, in particular in R v Hogan, R v Luther, R v Grimley, R v Swift and R v Spidalieri, where there was at least some challenge to the evidence presented.
It has also been shown that a process of detailed documentation and tracking of all aspects of the acquisition and/or development, implementation, operation and monitoring of all information technology and systems is imperative for all organisations. The ability to know who was the responsible person at any given time is essential. The ability to show correct process and procedure, and to track and provide a detailed audit of any problem or incident that may have occurred, is mandatory.
A challenge in relation to the control of and responsibility for audit trails should take into consideration the following areas:
1) Audit capture - what activity is captured and how: a) trustworthiness of the capture process or processes; b) writing efficiency; c) process efficiency; d) file/database management issues and problems; e) inter-process communications; f) error handling; g) process failure; h) configuration.
2) Audit information retrieval - how captured activity is accessed and displayed interactively: a) authentication and authorisation; b) input modes; c) identify, locate, retrieve, display; d) printing; e) error handling; f) timing.
3) Central storage for audit trails - the security and management of storage: a) detail of how audit trails are stored; b) recovery; c) error handling; d) security; e) platform; f) configuration.
4) Backup - how backup is managed, including security: a) access to and protection of backup copies of audit information; b) access permissions; c) process and recovery; d) archiving; e) porting and standards; f) number of copies, with standards and processes for sealing and storing.
5) Analysis - what analysis tools are available: a) charting; b) graphing; c) statistical analysis; d) simple and advanced query facilities.
6) Security: a) cryptographic checksum ability; b) levels and accreditation/certification.
Essentially, given heightened security awareness in society in general, the time may now be opportune for increased security education, training and investment. In particular, the development of expertise in information technology and its usage by the legal profession and the judiciary will mean that management will need to be more diligent in its consideration of audit and control procedures. This will be even more pressing where such audit information takes on an even more important evidentiary nature.
Notes and References
1. Operational Police Application Systems are those computer based systems used in the support of policing, in particular for the electronic recording and processing of activities associated with the criminal justice process.
2. Biometric techniques are those techniques associated with a class of authentication mechanisms based on scanning or recording a physical part of the human body, such as fingerprints, handprints or the iris, for the purpose of positive recognition and identification of an individual.
Allinson C (2001) 'Information Systems Audit Trails in Legal Proceedings as Evidence', Computers & Security, Vol 20, No 5 (England: Elsevier Advanced Technology).
Allinson C (2002) 'Audit Trails in Evidence: A Case Study', The Journal of Information, Law and Technology (JILT). <http://elj.warick.ac.uk/jilt/02-1/allinson.html>
BS7799-2 (1999) Information Security Management Part 2: Specification for Information Security Management Systems.
Casey E (2000) Digital Evidence and Computer Crime (Cambridge; Academic Press).
CJC (2000) Protecting Confidential Information, (Brisbane: Criminal Justice Commission).
Crowley-Smith L (1996) 'The Evidence Act 1995 (Cth): Should Computer Data Be Presumed Accurate?' Monash University Law Review, Vol 22, No 1, 1996. (Australia).
Pfleeger C P (1989) Security in Computing (Prentice-Hall Inc.).
ISO/IEC 17799 (2001) Information Technology - Code of Practice for Information Security Management.
ISO/IEC 15408-1 (1999) Information Technology - Security Techniques - Evaluation criteria for IT security - Part 1: Introduction and general model (USA).
Kerr D (1995) 'National Evidence Law Is On The Way', Australian Lawyer, May 1995.
NCSC (1987) A Guide to Understanding Audit in Trusted Systems (USA; National Computer Security Center).
ISACA, (2001). CobiT Control Objectives for Information and Related Technology. Information Systems Audit and Control Foundation. Third Edition. USA.
ISACA, (2002). Information Systems Audit and Control Association 30th Annual International Conference. New York.
itSMF, (2001). IT Service Management - The IT Infrastructure Library. itSMF The IT Service Management Forum. Version 2. UK.
Queensland Evidence Act 1977, s.95.
Queensland Law Reform Commission (1998) The Receipt of Evidence by Queensland Courts: Electronic Records, Issues Paper WP No 52 (Queensland, Australia: Queensland Law Reform Commission).