Regardless of whether a Computer Forensics unit is a stand-alone entity within a law enforcement agency, a section within a forensic laboratory, or housed within a private corporation or business, Quality Assurance Practices are essential to its overall success. Quality Assurance Practices are an overall means of assessing the quality of analytical processes and must be in place prior to beginning forensic analysis. They often include systematic and planned activities by management to ensure that the analytical processes are sound and capable of providing quality results. A primary factor controlling quality in any setting is the incorporation and utilization of good scientific practices.
The results of the analysis of digital data routinely lead to either civil or criminal litigation. Prior to litigation, the unit’s management and legal counsel have to be assured that the results are accurate, reliable, verifiable, and repeatable. The successful completion of forensic imaging/analysis training classes by examiners does not guarantee those assurances. Rather, training classes can often give the examiners a false sense of security, which leads to the belief that they are prepared to provide quality results. This is a fallacy. There are many other complex, interrelated issues that must be addressed if the results are to be considered a quality product. All are critical and have to be clearly articulated and well documented before proceeding with any forensic analysis:
• What was the probable cause that initiated the request for analysis?
• Where was the digital data stored on the computer or computer network?
• How many individuals had access to the computer and the digital data?
• How was the evidence collected?
• What training did the examiner receive prior to analyzing cases?
• Is there a documented training program?
• Did the examiner demonstrate competency prior to performing the analysis requested?
• Has the examiner been proficiency-tested on a regular basis?
• How reliable were the tools (both software and hardware) used in the analysis?
• Are the procedures for analysis documented and have they been verified/validated?
• Were scientific practices and principles followed during the analysis of the data?
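Several of the questions above turn on whether results are verifiable and repeatable. In practice, one common technical underpinning for this (not described in the original column, and sketched here only as an illustration) is computing a cryptographic hash of the forensic image at acquisition and re-computing it after analysis to demonstrate that the data were not altered. The file paths and function names below are hypothetical:

```python
import hashlib

def sha256_of_image(path, chunk_size=1024 * 1024):
    """Compute the SHA-256 digest of a forensic image, reading in
    fixed-size chunks so arbitrarily large images fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify_image(path, acquisition_hash):
    """Re-hash the image and compare against the digest recorded
    at the time of acquisition; a mismatch indicates alteration."""
    return sha256_of_image(path) == acquisition_hash
```

Recording the acquisition digest in the case file, and re-verifying it before and after each examination, is one way a unit can document that its analytical processes did not change the evidentiary data.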
One of the implications of working in a forensic environment is that every analytical report generated could result in litigation. In Computer Forensics, the analytical processes are technically complicated and can change from week to week. Digital data can be stored anywhere on a hard drive and be difficult to find (for example, deleted, hidden, encrypted, or partially overwritten files or directories). Thus, it is not an easy task to explain to the court what was found, where it was found, how it got there, and who may have put it there. Of considerable importance is the expert testimony of the examiner. The court has to be assured that the results obtained were both accurate and reliable. Invariably, most lay jurors and justices do not have the technical knowledge to accurately assess the testimony provided. Although an examiner’s testimony will often sound very credible and believable, many questions can and should arise from that testimony:
• What exactly is the evidence: The computer itself? Its hard drive? Probative data found on the hard drive? Exported digital data? 1
• Was the digital data ‘tainted’ or ‘compromised’ during its collection and analysis?
• Can the ‘chain-of-custody’ be fully documented for the collection, submission, and analysis of the evidence?
• Does an examiner’s ‘on-the-job experience’ automatically qualify him/her as an ‘expert’?
• Does the examiner have the necessary training and competency to perform the analysis?
• Were new, novel techniques and procedures used during the analysis?
• Were the analytical procedures used and the results obtained by the examiner technically peer-reviewed?
• What were the results of the examiner’s proficiency testing?
• What other analyses were performed on the forensic computers and when?
• Who maintains the forensic computers? How often is the software/hardware updated?
• Were licensed copies of the forensic software tools used during the analysis?
• Does the examiner have documentation demonstrating that the forensic software and hardware tools were verified/validated in his/her laboratory prior to their use?
• What standards/controls were used during the analysis?
• Did any of the software tools used contain documented (or undocumented) ‘bugs’?
• Did any of the analytical processes have the potential to alter or change the evidentiary data?
Before convicting a subject of a crime, the court would need answers to these and other important questions. Additionally, the court may require a Frye 2 or Daubert 3 hearing to assess the admissibility of the examiner’s scientific expert testimony. Thus, the question becomes: “What Quality Assurance Practices are in place in the Computer Forensics unit to provide the necessary answers?” Quality Assurance Practices are not a new concept; rather, they have to be applied to Computer Forensics in the same manner they have been applied to all other forensic disciplines. Addressing the following topics in a Quality Assurance Manual can demonstrate that Quality Assurance Practices are in place:
• The unit’s organizational structure and responsibilities
• Dealing with management related issues
• Communications within the unit/agency and with external agencies
• Standard Operational Procedures related to the unit’s daily operations
• Specific Quality Control and Quality Assurance Practices
• Physical Plant Security
• Health and Safety
• Documenting policy changes

Forthcoming columns will discuss in more detail the Quality Assurance Manual and specific Quality Assurance Practices.
1. See Forensic Magazine, “Digital Evidence Accreditation,” Winter 2004 issue, for more detailed information concerning what could be considered evidence.
2. Frye v. United States, 293 F. 1013 (D.C. Cir. 1923).
3. Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993).
John J. Barbara is a Crime Laboratory Analyst Supervisor with the Florida Department of Law Enforcement (FDLE) in Tampa, FL. An ASCLD/LAB inspector since 1993, John has conducted inspections in several forensic disciplines including Digital Evidence. John is the General Editor for the “Handbook of Digital & Multimedia Evidence” to be published by Humana Press in 2007.