We carry our lives on digital devices. For most of us, the information they contain is perfectly innocent. But digital forensics as it's practiced today can make innocent information look incriminating. That means we may be putting innocent people in jail and letting criminals off.
While other forensic science disciplines have come under harsh scrutiny lately, the problems with digital forensics have not received enough attention.
A 2009 study by the National Academy of Sciences sounded the alarm on faulty forensics. The report said most methods of analysis have not been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source. The report challenged the reliability of ballistics (toolmark and firearms identification), bite mark comparisons, blood spatter analysis, handwriting analysis and even fingerprint examination. The report said little about digital forensics, however, because it is still an emerging discipline.
It's time for a critical look.
There is solid science behind much of digital forensics. We know, for example, that computer hard drives must be copied without altering the contents of the disk. Best practices in digital forensics are also solid. But digital forensic analysts don't always follow them.
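That requirement is checkable, not a matter of trust. Here is a minimal sketch of the standard verification, using hypothetical file names for the seized drive's image and the examiner's working copy: hash both and confirm the digests match, which shows the copy is bit-for-bit identical.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash a file (such as a raw disk image) in chunks so large images fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(original_path: str, copy_path: str) -> bool:
    """True only if the working copy is bit-for-bit identical to the original evidence."""
    return sha256_of(original_path) == sha256_of(copy_path)

# Hypothetical paths for the seized drive's image and the examiner's working copy:
# verify_copy("evidence_original.dd", "evidence_working_copy.dd")
```

When the digests match, every later finding can be traced back to an unaltered copy of the evidence.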
Consider some examples we have witnessed in Connecticut and nearby jurisdictions.
A police officer expert found images in unallocated space, the part of a hard drive the computer isn't using, which may contain deleted files. The officer asserted in an examination report that images retrieved from unallocated space were downloaded by the defendant and deleted.
But such an assertion is not supported by fact. Data can get into unallocated space on a hard drive in a number of ways; a web browser, for example, automatically caches images from the pages a user visits, and remnants of those files can land in unallocated space without any deliberate act by the user. In this case, the only appearance of the data was in unallocated space. There was no basis for the examiner to assert that the images had ever been files that were subsequently deleted.
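It helps to look at how such images are typically recovered. Carving tools scan raw bytes for file signatures, and what they return is picture data stripped of everything else: no filename, no path, no timestamps, no record of a download or a deletion. The following is a simplified sketch of the idea, with made-up stand-in data, not any particular tool's method.

```python
# Simplified file carving: scan raw bytes for JPEG start and end markers.
# Real carving tools are far more careful; this only illustrates what carving
# can and cannot tell us. A carved image arrives with no filename, no path,
# no timestamps and no owner, so it says nothing about whether anyone ever
# downloaded it or deleted it.

JPEG_START = b"\xff\xd8\xff"
JPEG_END = b"\xff\xd9"

def carve_jpegs(raw: bytes):
    """Yield byte runs in `raw` that look like JPEG data."""
    pos = 0
    while True:
        start = raw.find(JPEG_START, pos)
        if start == -1:
            return
        end = raw.find(JPEG_END, start)
        if end == -1:
            return
        yield raw[start:end + len(JPEG_END)]   # the candidate image, nothing more
        pos = end + len(JPEG_END)

# Stand-in for a dump of the drive's unallocated clusters (hypothetical data).
unallocated = b"....." + JPEG_START + b"image bytes" + JPEG_END + b"....."
carved = list(carve_jpegs(unallocated))
print(f"Carved {len(carved)} candidate image(s), with no metadata about their origin.")
```

A real examiner's tools are far more sophisticated, but they face the same limit: the carved bytes do not say how they got there.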
Here's another example: A computer's operating system periodically saves snapshots, called restore points, that can contain copy after copy of the same files. A police officer expert recently recovered restore points on a defendant's hard drive that held the same two child-porn pictures. By counting every copy separately, the officer recommended charging the defendant with possession of more than 600 images, nearly all of them duplicates of the same two pictures.
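A basic check would have exposed the inflated count. Hashing every recovered file and counting the distinct digests, as in this sketch with stand-in data, collapses hundreds of restore-point copies down to the few unique pictures they actually are.

```python
import hashlib
from collections import Counter

def fingerprint(data: bytes) -> str:
    """Cryptographic fingerprint of a file's contents."""
    return hashlib.sha256(data).hexdigest()

# Stand-in for the contents of every image recovered across all restore points.
# In the real case there were 600-plus copies; here, three copies of two pictures.
recovered = [b"picture-one-bytes", b"picture-two-bytes", b"picture-one-bytes"]

unique = Counter(fingerprint(data) for data in recovered)
print(f"{len(recovered)} recovered copies, {len(unique)} unique images")
```

Identical hashes mean identical files, so the unique count, not the raw count, reflects what the defendant could actually have possessed.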
Another police officer expert violated a court order when he searched for privileged attorney-client documents on a defendant's computer, and then handed them over to the prosecutor.
Examination reports often include examiners' conclusions that items were intentionally downloaded by the defendant. But it is impossible to arrive at such a conclusion without being present when the defendant actually downloaded the material, or without a videotape of the event.
Poor training is a big part of the problem. Thousands of police officers have been trained to perform digital forensics under federal grant programs. But these police officer examiners are not required to possess any special training or education beyond a minimum level. The 40 hours or so of training they receive in the forensic software they use is typically the extent of their computer science background prior to their first case assignment.
Despite the minimal training of many digital forensics examiners, their findings are often unquestioningly accepted as fact.
Digital evidence can be compelling, and it is often unambiguous. In too many cases, however, digital forensics experts make assertions about a defendant's actions that are not supported by fact. Such errors create the risk of false conviction of the innocent and a free pass for the guilty.
We need higher standards and more professionalism in digital forensics. And we need to give digital forensics the sort of close scrutiny that all the other forensic science disciplines have been getting in recent years.