Network Packet Management Optimisation for Business Forensic Readiness
The acceptability of evidence in court depends upon stringent criteria for its admissibility. One of the leading guidelines is the Daubert standard, which asserts five requirements for admissibility. These criteria set out scientific principles that must be complied with. The scientific testing of digital forensics tools is often glossed over: commonly used proprietary tools are accepted without scrutiny of their performance. In particular, Daubert states that the known error rates of a scientific procedure must be published, and that the scientific procedure must be independently tested. In this thesis, I am concerned with the satisfaction of Daubert's key points when collecting digital forensic information that is evidentially sound:

• The scientific procedure must be independently tested.
• The scientific procedure should be published and subjected to peer review.
• Are there standards and protocols for the execution of the methodology of the scientific procedure?
• Is the scientific procedure generally accepted by the relevant scientific communities?
• Is there a known error rate, or the potential to know the error rate, associated with the use of the scientific procedure? (Daubert v. Merrell Dow Pharmaceuticals, 1993)

Without known error rates and a falsifiability criterion, digital evidence should not be admitted in courtrooms. In this thesis the relevant literature and the conceptual scope of the problem are explored theoretically, and network tools are then tested empirically. The thesis is an initial investigation into whether error rates above zero occur when a network tool is used for evidence collection. It contributes both to the digital forensic investigation community and to law practitioners and the judicial profession by demonstrating a potential problem with network forensic data. The performance of two tools was tested by subjecting them to increasing packet loads and measuring their performance.
The results were then measured against a baseline of expected outputs and the differences noted. These differences were used to generate an error rate, expressed as a percentage. The methodology's working assertion was that the tools would perform without error. The research question is therefore:

Research Question: Can the Network Management System and the Network Packet Capture tool achieve zero errors for digital forensic purposes?

Thus, the research goal of this thesis is to determine whether a computer system can capture relevant information systematically and comprehensively for post-incident forensic presentation that complies with legal requirements of completeness. The findings demonstrated that the popular network tools selected had significant error rates that are not published. The error margins indicate that a large number of potentially evidential packets are lost as the work rate increases, which in turn affects the validity of data presented as evidence. I recommend that the concepts developed within this thesis be expanded and codified through future research: a professionally accepted assertion test bench should be developed, and test case methodology procedures formulated. Transference can then be assured, as tools tested under many scenarios through many assertion tests can provide an acceptable level of assurance. The desired outcome is that error rates can be determined by the forensic examiner as a standard part of digital network forensic readiness and provided along with the evidential data.
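The error-rate calculation described above (compare a tool's captured output against the known baseline of packets sent, and express the shortfall as a percentage) can be sketched as follows. This is a minimal illustration of the idea, not the thesis's actual test harness; the function name and the figures in the usage example are hypothetical.

```python
def capture_error_rate(packets_sent: int, packets_captured: int) -> float:
    """Percentage of baseline packets the capture tool failed to record.

    packets_sent     -- known baseline: packets actually placed on the wire
    packets_captured -- packets the tool under test reported
    """
    if packets_sent <= 0:
        raise ValueError("baseline must contain at least one packet")
    lost = packets_sent - packets_captured
    return 100.0 * lost / packets_sent

# A tool that truly achieves the zero-error assertion returns 0.0:
print(capture_error_rate(100_000, 100_000))    # 0.0
# Under a heavier (illustrative) load the same tool may drop packets:
print(capture_error_rate(1_000_000, 940_000))  # 6.0
```

In practice the baseline itself must be established by an independent, trusted counter (for example, hardware counters on the traffic generator), since using the tool under test to define its own baseline would mask exactly the losses being measured.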