The Campbell Coincidence

Forensicator takes issue with “tampering” claims made by Campbell, who mistakenly concludes that Guccifer 2 deliberately manipulated metadata in a 7zip archive file called ngp-van.7z.  In a rush to judgment, Campbell thinks he sees a pattern in the data and uses this as the basis for his faulty conclusion.  Campbell missed important details and failed to review all the data.  If Campbell had been more careful, he would have concluded (as Forensicator did) that there are no signs of tampering.  Thus, Forensicator’s conclusions stand – Campbell’s alternative theory has no factual basis.

Zebras

Source: Wikipedia

Another technical approach to Occam’s razor is ontological parsimony. Parsimony means spareness and is also referred to as the Rule of Simplicity. This is considered a strong version of Occam’s razor. A variation used in medicine is called the “Zebra”: a doctor should reject an exotic medical diagnosis when a more commonplace explanation is more likely, derived from Theodore Woodward’s dictum “When you hear hoofbeats, think of horses not zebras”.

Executive Summary

The following paragraphs summarize our key conclusions.  We state them here and repeat them at the end of this article (for emphasis).

There is no “merger of data”. With no merger of data, all claims of “tampering” and metadata manipulation fall away.

Campbell ignores Forensicator’s explanation that the directory timeline shows normal file collection activity ahead of the London conference. He doesn’t mention Forensicator’s explanation and seems to be completely unaware of it. We encourage Campbell to re-read that section of Forensicator’s report. After doing so, perhaps he will further consider the findings in this article, which demonstrate that his data interleaving theory is invalid.

We hope logic will prevail and that Campbell will abandon his metadata manipulation and “tampering” speculation and accept Forensicator’s analysis.

Preface

The purpose of this article is to correct the record.  We write in response to a Computer Weekly article [archive] penned by Duncan Campbell, published in July 2018.  Here, we augment our previous critique of Campbell’s fiction, titled The Campbell Conspiracy.

Below, we rebut various claims and fabrications made by Campbell.  A lot of the content may be technically dense; this technical emphasis is necessary to properly correct errors and misunderstandings that form the basis of Campbell’s misstatements.

To make this article more readable, we have placed our narrative point-by-point rebuttal first.   Subsequent sections will provide the technical basis for our conclusions.

Background

Forensicator is an anonymous online blogger who has written several reports which analyze various document dumps made by Guccifer 2.  Forensicator’s first report, Guccifer 2.0 NGP/VAN Metadata Analysis, was published July 9, 2017; Elizabeth Vos of Disobedient Media covered that report in an article titled, New Research Shows Guccifer 2.0 Files Were Copied Locally, Not Hacked.

The Forensicator’s first report went viral and has been covered widely by the legacy media, alternative media, and in various social media venues.  Of particular note, a well-respected group of former US security professionals (the VIPS) published an article a few weeks later (July 24, 2017) in Consortium News, titled Intel Vets Challenge ‘Russia Hack’ Evidence.  The VIPS report was subsequently mentioned in an article published in The Nation, authored by Patrick Lawrence; that article (dated August 9, 2017) was titled A New Report Raises Big Questions About Last Year’s DNC Hack.  Lawrence’s article generated a lot of controversy which The Nation addressed in a follow up article (September 1, 2017) titled, A Leak or a Hack? A Forum on the VIPS Memo.  Patrick Lawrence published a one year retrospective, titled, ‘Too Big to Fail’: Russia-gate One Year After VIPS Showed a Leak, Not a Hack [Consortium News].

A journalist/blogger who goes by the pen name Adam Carter (@with_integrity on Twitter) runs a web site, g-2.space, which follows research related to the anonymous persona Guccifer 2.  Guccifer 2 has been linked to Russia’s GRU spy agency by US intelligence agencies and was highlighted more recently (July 13, 2018) in a DOJ indictment.

It seems that Carter may have unknowingly locked horns with Duncan Campbell in November 2017, when Carter published an article critical of Campbell’s reporting.  Campbell co-authored an article with James Risen that was published in the Intercept under the title CIA Director Met Advocate of Disputed DNC Hack Theory — at Trump’s Request.  A month or so later, Campbell would begin a nine-month quest to strip Carter’s anonymity.

Although the majority of Campbell’s article dwells on Carter’s background, there is some discussion of Forensicator’s research.  It seems that Campbell wanted to weave Carter and Forensicator into an elaborate pro-Kremlin plot to spread disinformation.  Campbell speculates that Guccifer 2 “manipulated” and “tampered” with the data to achieve this desired effect.

Campbell’s Claims

In this section, we quote excerpts from Campbell’s article and provide our response.

[T]he time stamps examined were present only in the special London group of documents, and not in tens of thousands of other DNC files published by WikiLeaks or Guccifer 2.0. [A special method was used to pack the files.] The special method used two different file compression systems, 7zip and WinRAR. This dual compression method was unique to the London documents.

It is immaterial that the batch of documents in ngp-van.7z has different metadata characteristics from other document dumps published by Guccifer 2 and WikiLeaks.  Each set of files has its own unique characteristics.  That is why each group of files is analyzed separately.  Perhaps, Campbell is trying to make the point that the unique characteristics of the metadata imply pre-meditation and intent by Guccifer 2, who (as Campbell tells the story) arranged the data to facilitate Forensicator’s conclusions.  We disagree – Campbell provides no factual basis for such a claim.  Further, Forensicator relied on no other person(s) when compiling his report.

The files were manipulated using an unusual method of file packing […], the packing operations appeared to have created “evidence” that the stolen files had been copied in the US Eastern Time zone, which includes Washington.

Campbell tells us that Guccifer 2 deliberately arranged the metadata to telegraph the fact that the files were written somewhere in the US Eastern timezone.  Guccifer 2 also thought ahead to arrange the last modified times of the files so they’d indicate local copying speeds.   Further, Guccifer 2 remembered to copy the files to FAT-formatted media to show that they originated on a thumb drive.  Finally, per Campbell, Guccifer 2 played with the directory last modification times, perhaps just to taunt us.

Yet, after taking care to put all these clues into place, one and a half (1.5) years passed; no media outlet or security researcher found this message in a bottle. Per Campbell, at this late date, Guccifer 2’s team activated Plan B.  They fed a “tip off file” to a sleeper agent (Forensicator), who lacked the skills to perform the analysis on his own.  Campbell suggests that Forensicator may even be a fake persona created and operated by another secret agent, Adam Carter.

Campbell leaves open the possibility that behind it all, Russian intelligence agents are pulling the strings.  The article’s title labels Carter as a “pro-Kremlin” disinfo agent, simply because Carter highlights research that challenges the prevailing Russian hacking narrative.

All of this makes for good spy fiction, but the whole story line is a castle built on sand.  Campbell offers no shred of proof, or evidence to support his wild theories.  Yet, it is difficult for Forensicator to prove Campbell wrong without abandoning anonymity.  Advantage, Campbell.  Forensicator’s only viable response is to demonstrate that Campbell’s interpretation of the facts is incorrect – that is what Forensicator does in this article and its predecessor (The Campbell Conspiracy).

The files released in London, we found, had first been processed in this way to show timestamps for 5 July 2016.

When Campbell says “the files were processed”, he seems to suggest that Guccifer 2 acted with pre-meditation and intent to plant those particular file dates and to use a certain file archiving format (.rar) to achieve his purpose of disclosing that the ngp-van.7z archive was created on the US East Coast.

Further, Campbell suggests that the directory dates were manipulated so that they fell into empty spaces in the file timeline.  Per Campbell, Guccifer 2 was deliberately “tampering” with the dates and this (apparent) interleaving of the directory timeline with the file timeline is his proof.  We will prove him wrong on that point.

What we see in the directory timeline is that September 1 is when Guccifer 2 copied files from another, larger collection, one directory at a time – ahead of building the final ngp-van.7z archive.  We describe this process in detail; we see a normal progression where directories were copied and then compressed into separate .rar archives (one for each directory).  We see no sinister intent or purpose.

Some 13 groups had then been compressed using WinRAR 4.2. Nine additional files were compressed using 7zip.

The “thirteen groups” sound mysterious here, but in fact they are simply 13 directories.  Guccifer 2 zips up each directory into its own .rar archive.  This sort of thing is done all the time.  When he says “9 additional files were compressed using 7zip”, he means simply that in addition to the 13 .rar files, Guccifer 2 added 9 additional regular files to the final archive.  As Forensicator demonstrates, the dates and times of those files align with those in the unpacked .rar files if the system timezone is set to US Eastern Time.

The special method used two different file compression systems, 7zip and WinRAR. The special method used […] a four-year-old, superseded version of WinRAR to obtain the required result.

Campbell obsesses over the use of an older version of WinRAR. That older version records last modified times in a way that is different from 7zip.  Noting this difference, Forensicator deduced that the 7zip archive was built on a system that had US Eastern timezone settings in force.

Forensicator thinks that Guccifer 2’s use of an older version of WinRAR was done simply to promote the image that Guccifer 2 is a solo (Russian) hacker.  Several observers noted that (1) WinRAR is developed by a Russian, (2) cracked versions of WinRAR are commonly used by hackers, especially those distributing cracked software (“warez”), and (3) many hackers distributing “warez” are/were often Russians (and Eastern Europeans).  Guccifer 2 has a habit of “accidentally” leaving clues behind that suggest he is a Russian hacker; his choice of WinRAR fits with his MO.

To illustrate our point, we refer to this excerpt from a Twitter thread.

Campbell thinks Guccifer 2 used this old version of WinRAR to deliberately plant the East Coast timezone clue.  This clue was a necessary part of an operation that he named “[the] Forensicator Fraud”: a conspiracy theory that Campbell cuts from whole cloth.

The tampering may have been done on 1 September, a week before the London conference.

At this point, Campbell assumes that the reader has accepted his “tampering” theory.  That is, per Campbell, Guccifer 2 constructed the dates in ngp-van.7z to support Forensicator’s conclusions.  We strongly disagree.

As we described in our report, the .rar dates and the directory dates are well-ordered and easily explained.  Put simply, on September 1, Guccifer 2 pulled together the contents of the final ngp-van.7z archive file.  To do this, Guccifer 2 copied specific files and folders selected from a larger collection.  No mystery, no tampering.

Binney agreed: “It’s clear Guccifer 2 is messing with the data. Everything Guccifer 2 says is suspect and needs to be proven by other sources/means. I agree there is no evidence to prove where the download/copy was done.”

We respect Binney and the VIPS for all they have done to demonstrate inconsistencies in the Russian hacking narrative and we support their call to have all unanswered questions thoroughly investigated.  Here, however, we think that Binney accepted Campbell’s “tampering” theory too quickly.  In this report, we show that Campbell’s theory is unsupported by the facts.

Binney and Campbell think they see a pattern in how the directory dates merge into the file dates (they’re wrong).  They think that this pattern implies that Guccifer 2 is “messing with the data” (it doesn’t show this, because there is no such pattern).  They reason: the data has been manipulated or “tampered” with; therefore, none of it can be trusted.

[Binney] added: “The merger of data from 5 July and 1 September … makes all the Guccifer 2 crap a fabrication … we should only say what we can prove with evidence.”

We show (in a following section) that the pattern that Campbell and Binney think they see is a result of cherry-picking the data; they missed important details that invalidate their conclusion.  They think that they see a pattern, where the directory dates fit like a glove into the file timeline (when the hours part of the timestamp is excluded from consideration).  We show that, even in their narrow time window, there are examples where the directory timeline clashes with the file timeline.  Further, they ignore a significant number of directory times that fall outside their narrow time window.

There is no “merger of data”.  With no merger of data, all claims of “tampering” and manipulation fall away.

Forensicator’s Findings

When Forensicator reviewed the metadata in the ngp-van.7z archive file, he reached the following conclusions.

  • The archive was written on a system that had its timezone set to US Eastern Time. Forensicator arrived at this conclusion by noticing a subtle difference between the way that 7zip stores time values and the way that WinRAR stores them when writing .rar files in the older version 4 format.
  • Times on the top-level files and the .rar files are all even multiples of 2 seconds. This is an indication that a FAT filesystem was likely used. FAT filesystems are commonly found on removable devices like thumb drives. Thus, it appears that the top-level files and the .rar files were first copied to a thumb drive before the final 7zip file was built.
  • The files found in the archived top-level directories have times that increase in a regular order proportional to the size of each file. This is the pattern created by a “Unix style” copy operation. A “Windows style” copy operation (“drag and drop”) preserves the last modified times.
  • Making use of the observed “Unix style” copy pattern, we were able to estimate a copying rate of 23 Megabytes/sec on average (with 40+ MB/s peaks). This copy rate is typical of USB3 thumb drives; it is too slow for a hard drive to hard drive copy and too fast for a transfer over the Internet. This is the copy speed of the data acquisition operation that occurred before the files were archived (see the sketch after this list).
  • Based on the metadata, the files were acquired on July 5, 2016 and the contents of the final ngp-van.7z archive file were collected on September 1, 2016.
  • As we have pointed out subsequent to the publication of our first report, the files could have been copied multiple times before the July 5 date. Further, the final 7zip file may have been written sometime after September 1 and on or before September 13, 2016.
  • Our analysis of activity on September 1 indicates that directories and files were selected from a larger collection of electronic documents. The 13 top-level directories were zipped into .rar archives and those 13 archives were transferred to a thumb drive along with 9 regular files. The .rar files retain the date they were written (September 1) and top-level files retain their July 5, 2016 last modification dates. Thus, the top-level files and the .rar files were copied to a thumb drive using a Windows style copy operation. We will focus on the September 1 activities.
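
To make the copy-rate estimate concrete, here is a minimal Python sketch of the calculation referenced above.  The timestamps and sizes are hypothetical stand-ins, not values taken from ngp-van.7z, and this is a simplified per-file version of the estimate: each file’s size is divided by the time elapsed since the previous file finished writing.

    from datetime import datetime

    # Hypothetical (name, last-modified time, size-in-bytes) triples, ordered by
    # last-modified time, as left behind by a "Unix style" copy: each file's
    # mtime marks the moment its copy finished.
    files = [
        ("a.xls", datetime(2016, 7, 5, 18, 46, 10), 12_000_000),
        ("b.xls", datetime(2016, 7, 5, 18, 46, 11), 35_000_000),
        ("c.pdf", datetime(2016, 7, 5, 18, 46, 13), 80_000_000),
    ]

    rates = []
    for (_, prev_t, _), (name, t, size) in zip(files, files[1:]):
        elapsed = (t - prev_t).total_seconds()
        if elapsed > 0:
            rate = size / elapsed / 1_000_000  # MB/s
            rates.append(rate)
            print(f"{name}: ~{rate:.1f} MB/s over {elapsed:.0f} s")

    if rates:
        print(f"average: ~{sum(rates) / len(rates):.1f} MB/s")

Because FAT and DOS-format timestamps are recorded with 2-second granularity, any single per-file estimate carries a couple of seconds of uncertainty; averaging over many files reduces that error.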

A Closer Look at the ngp-van.7z Archive

When we open ngp-van.7z in the 7zip Windows application, we see the following.  In the left panel, we see the contents of ngp-van.7z; on the right, we see the contents of one of the .rar files, named “May FEC.rar”.  When we set the system’s timezone to US Eastern time, the file times match.  From this, we conclude that ngp-van.7z was likely written somewhere on the US East Coast.
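
The timezone deduction rests on a known difference between the two formats: a 7z archive stores last-modified times as UTC FILETIME values, while the older RAR 4.x format by default stores them as MS-DOS date/time values in the local time of the machine that created the archive (with 2-second granularity).  Below is a minimal sketch of the arithmetic, using hypothetical timestamps for a single corresponding entry; the real analysis compares many entries, not just one.

    from datetime import datetime

    # Hypothetical timestamps for one corresponding entry.  7z stores UTC
    # (FILETIME); RAR 4.x stores local DOS time with 2-second granularity.
    utc_time_from_7z = datetime(2016, 7, 5, 22, 38, 42)
    local_time_from_rar = datetime(2016, 7, 5, 18, 38, 42)

    offset_hours = (local_time_from_rar - utc_time_from_7z).total_seconds() / 3600
    print(f"apparent UTC offset of the archiving machine: {offset_hours:+.0f} hours")
    # UTC-4 corresponds to US Eastern Daylight Time in the summer and fall of 2016.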

Let’s turn to the .rar files and their corresponding directories, dated September 1, 2016.

We note here that the last modified times on the top-level directories include a group of 10 directories that have last modification times between 12:49:51 and 12:54:31.  There is one other directory (DNC) dated 12:47:55 and 2 others later in the day at 13:03:40 and 13:03:57.  If we look only at the minutes and seconds part, we have three ranges: 47:55, 49:51 through 54:31, and 03:40 through 03:57.  This may not seem important, but it will be a key component of Campbell’s “discovery”.

Here are the top-level files stored in ngp-van.7z.

Campbell Notices an Apparent Overlap between File Times and Directory Times

Notice that if we drop the hours part of the time, the minutes/seconds range from 46:06 through 53:04 for the top-level files.  Recall that there was a group of top level directories that ranged from 49:51 through 54:31.

Campbell saw this apparent overlap and then leapt to the conclusion that Guccifer 2 must have manipulated the file metadata for some unspecified nefarious purpose.  No rationale was given that might tell us why Guccifer 2 would do this – other than possibly to spread disinformation.  Yet, once Campbell decided that the dates had been manipulated he declared that all of Forensicator’s conclusions must be voided.  There goes the baby with the bath water.

This is the view from 30,000 feet.  We need to descend to ground level and take a close look at the metadata to demonstrate that Campbell miscued and reached a hasty and faulty conclusion.

Retracing Campbell’s Steps: The MMSS Timeline

It is difficult to respond to Campbell’s “tampering” claims, which are derived from his “data merge” claims.  Campbell never tells us exactly what he means.  Instead, Campbell lets Binney do his talking for him.  This is an effective technique – Campbell gets his point across and achieves a reputation upgrade as a bonus.

In Campbell’s article, Binney says only that there is a “merger of data from 5 July and 1 September.”  In other venues Binney has provided a little more detail.   In this YouTube video [SGT Report, May 25, 2019] Binney says (approximate transcription):

Plus, we found out more about Guccifer 2. He put out two data sets one day on the 5th of July and one day on the first of September. We took a look at that and with some help from Duncan Campbell and some of his friends over there. We found that if you looked only at minute, seconds, and milliseconds of the two files then the sets of files from the September 1st and the 5th of July merged into one continuous stream which means very simply put, the random probability of that of happening is just too small to even consider.

The point is that that means that Guccifer 2 was playing with the native data. He was manipulating the data creating two separate piles, manually separating. He did one download and separated it into two piles. They claimed two different downloads and he did a range change on the date and the hour in the September material.

To re-construct Campbell’s observations (and by proxy, Binney’s observations) we went back to our metadata spreadsheet and added a new column, “MMSS”.  In that column, we isolated the minutes/seconds part of the last modified times.  We prefixed those values with “#” to make sure that Excel didn’t chop off any leading zeroes – and then sorted by this key.
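
For readers who want to retrace this step outside of Excel, here is a small Python sketch of the same MMSS sort.  The input file name and column names are hypothetical; substitute whatever metadata listing you have on hand.

    import csv

    # Read the metadata listing (hypothetical CSV with "Path", "Name", and
    # "LastModified" columns, times formatted as "YYYY-MM-DD HH:MM:SS").
    with open("ngp-van-metadata.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    # Isolate the minutes/seconds ("MMSS") part of each last-modified time,
    # prefixed with "#" so spreadsheet tools keep the leading zeros.
    for row in rows:
        hhmmss = row["LastModified"].split(" ")[1]          # "HH:MM:SS"
        _, minutes, seconds = hhmmss.split(":")
        row["MMSS"] = f"#{minutes}:{seconds}"

    # Sort by the MMSS key to reproduce the interleaved view.
    rows.sort(key=lambda r: r["MMSS"])
    for row in rows[:20]:
        print(row["MMSS"], row["Name"])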

“Data Merge” Sighting

Below, we excerpt a few places in the timeline that appear to fit the “data merge” pattern.  The directories are highlighted in blue, the surrounding files in white.  We see at 47:29 two directories (which span less than 1 second) that fit into a 10 second window.  At 47:55, there are 4 directories that span 134 seconds; they fit into a 153 second window.

Note: an update to a directory’s last modification timestamp takes almost no time; however, files take up a significant amount of time for their transfer.  The transfer time depends on a file’s size and the transfer rate.  Thus, a 400 MB file transferred at 40 MB/s will occupy 10 seconds on the file timeline.  Also, note that files with a Path value of “.” are at the top level; they have FAT granularity (their times are rounded up to the next 2-second boundary).  This introduces some uncertainty into the analysis, but has no impact here.

In the following display we see an impressive looking fit, where 16 directories that span 58 seconds fit into a 73 second time gap.

In this batch, we see 13 directories spanning 32 seconds fit into a 46 second window.

The level of coincidental fit (so far) is impressive, but we have covered only 70% of the directories and the last 30% will prove to be the exception to the rule.  They will invalidate the “data merge” hypothesis.

Outliers: Files and Directories that Don’t Fit in

So far, we have been looking at a group of files and directories that span only a seven (7) minute period out of a 60 minute hour.  This is the time span where the “merged data” shows up.  However, there are 221 files (10% of the total by count, 50% of the total by size) and 13 directories (25% of the total) that fall outside the narrow time span that Campbell focused on.

When Campbell looked in the narrow window shown below, he thought he saw a pattern, where the directory last modified times seem to fit into the file timeline (looking only at minutes/seconds).  By ignoring 25% of all of the directories, he cherry-picked his data (perhaps unwittingly) in order to make his point.

Directories that Collide with the File Timeline

There are two additional places where it looks like the directories fit into the file timeline.  We show them here below.  Our analysis will follow.

Above, it looks like the “bc” directory and the “marketing” directory fit into the file timeline.  In both exhibits the timing is tight: 8.212 seconds for “bc” and 2.188 seconds for “marketing”.   When we take a closer look, we see that those gaps are completely consumed by file transfer time.

We calculated the transfer speed for the files that fill the gap: 45.8 MB/s and 41.7 MB/s respectively.  These speeds are at the top end of all the transfer speeds seen in this dataset (49.9 MB/s per Binney) and leave no room for the directories.
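
The arithmetic behind those figures is straightforward: divide the total size of the files written during the gap by the gap’s length; if the implied rate is already near the top of the observed range, the gap is fully consumed by file transfer.  A minimal sketch follows; the byte count is an illustrative stand-in chosen for the example, not the measured value.

    # Hypothetical values illustrating the "gap fully consumed" test for the
    # window around the "bc" directory (the byte count is illustrative).
    gap_seconds = 8.212          # time between the bracketing file timestamps
    bytes_in_gap = 376_000_000   # total size of the files written inside the gap

    implied_rate = bytes_in_gap / gap_seconds / 1_000_000
    print(f"implied transfer rate: {implied_rate:.1f} MB/s")

    # If the implied rate is already near the maximum seen elsewhere in the
    # archive (~40-50 MB/s), the gap is fully occupied by file transfer and
    # leaves no idle interval in which a directory timestamp could "fit".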

In this analysis, we identified two directories that collide with the file timeline.  They invalidate the claim that the directory timeline merges with the file timeline.

A Well-Ordered Merge

Although Campbell made much of an apparent merging of the 9/1 directory timeline with the 7/5 file timeline in ngp-van.7z, he never tells us why Guccifer 2 created such an obscure metadata pattern, or what exactly the pattern is (apart from some random directory last mod times appearing to interleave with the file times).  In this section, we will show what real merging of the timelines looks like and contrast it to what is seen in ngp-van.7z.

Let’s return to the place that we see a directory (“bc”) clashing with the ngp-van.7z file timeline.

Notice here, “bc” is a sub-directory of “eday”, but the surrounding files are from the “DNC” parent directory.  This pattern holds for all alleged timeline merges.  That is, the interleaved directories have no connection (by name) to the files around them.

Below, we show how the directory/file timeline looks when we ran a test.  We copied the ngp-van.7z files using a “Unix style” copy operation.

Note: The “bc” directory’s last modification time is updated when “WI10_STATEWIDE.xls” is created; that file is then copied and its last modification time reflects the last write to the file.  That is why this file comes after its parent directory (“bc”), making that directory’s time next-to-last in the integrated file and directory timeline for this sub-directory.  This pattern is repeated for all similar situations in the full timeline.
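
The behavior described above is standard filesystem semantics: creating a new entry in a directory updates the directory’s last-modified time immediately, while the new file’s own last-modified time is updated by each write, so a large file that takes time to copy ends up with a timestamp later than its parent directory’s.  Here is a minimal Python demonstration of that ordering; it writes a throwaway scratch directory, and the file name merely echoes the one mentioned above for illustration.

    import os
    import time
    import tempfile

    # Create a scratch directory, then write a file into it slowly,
    # simulating a large file whose copy takes a few seconds.
    scratch = tempfile.mkdtemp(prefix="mtime-demo-")
    path = os.path.join(scratch, "WI10_STATEWIDE.xls")  # name used only for illustration

    with open(path, "wb") as f:
        f.write(b"x")          # directory mtime is set when the entry is created
        time.sleep(3)          # simulate a slow copy of a large file
        f.write(b"y" * 1024)   # file mtime reflects the last write

    dir_mtime = os.path.getmtime(scratch)
    file_mtime = os.path.getmtime(path)
    print(f"directory mtime precedes file mtime: {dir_mtime < file_mtime}")
    print(f"difference: {file_mtime - dir_mtime:.1f} seconds")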

In contrast to the ngp-van.7z file timeline, which is 80% empty, this timeline is completely filled.  If we removed the directories from this file timeline and tried to fit them back in, this would be the only arrangement that allows it.  If we time-shifted all the directory times by two months, we would see the timeline merge that Campbell is striving for.  That is, all the directory dates would be much later than the file dates, but when we look only at the minutes and seconds values of their times, the entire timeline would re-assemble.  The directory times would fit like a glove.

Apart from “tampering” and “manipulation”, Campbell offers no theory on how Guccifer 2 manipulated the directory times to show an apparent merge, nor why Guccifer 2 might have done things this way.  In our view, if Guccifer 2 had done a time shift of the data (as we have shown above), we might see Campbell’s point.  What we have instead is Campbell’s flawed attempt to make the data fit his theory.

Mind the Gap: Large Gaps in the File Timeline Leave Room for Directories

It is reasonable to ask: “How could this apparent merge of the directory timeline and file timeline have happened just by chance?”  We think that these factors contributed to this anomaly.

  • Campbell ignored the data that did not fit his theory.
  • The data that fits his theory spans only seven (7) minutes out of the hour.
  • For the time window studied, both timelines start at approximately the same time of day. This is as much a result of choosing this specific time window as anything else.
  • Most (80%) of the file timeline has gaps, leaving ample opportunity for a small number of directory last mod times to fit in.

We look at those file timeline gaps below.  To do this, we first marked off the contiguous parts of the file timeline. The rest were relegated to gaps.

Below, we isolate the time period that Campbell chose to make his “data merge” claim.

We explored this idea that the large gaps in the file timeline make it fairly likely that we would see the interleaving that Campbell relied on to prove his “tampering” claims.  We developed a Monte Carlo simulation.  The insert below shows our results.
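
For readers who want to experiment with the idea themselves, here is a simplified sketch of that kind of simulation.  The busy intervals below are hypothetical stand-ins for the measured file-transfer intervals (chosen so the timeline is roughly 80% gaps); the directory count, window length, and trial count are likewise illustrative, not the actual ngp-van.7z values.

    import random

    # Hypothetical busy intervals (start, end) in seconds within a 420-second
    # (7-minute) window; everything outside these intervals is a gap.  The
    # real study used the measured file-transfer intervals from ngp-van.7z.
    busy = [(0, 10), (60, 75), (120, 150), (240, 252), (330, 345)]
    window = 420.0
    n_dirs = 20          # number of directory timestamps dropped at random
    trials = 100_000

    def conflicts(times):
        """Count directory timestamps that land inside a busy (file-transfer) interval."""
        return sum(any(s <= t <= e for s, e in busy) for t in times)

    counts = []
    for _ in range(trials):
        times = [random.uniform(0, window) for _ in range(n_dirs)]
        counts.append(conflicts(times))

    few_or_no_conflicts = sum(c <= 2 for c in counts) / trials
    print(f"fraction of trials with 2 or fewer conflicts: {few_or_no_conflicts:.2%}")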

We can see why the apparent interleaving between the 9/1 and 7/5 timelines in ngp-van.7z caught the attention of Campbell and Binney.  We’ve noted already that they overlooked the two (2) timeline conflicts.  At first glance it may seem remarkable that we see so few conflicts, but given the large gaps in the file timeline this outcome isn’t totally unexpected.

In a video presentation (LaRouchePac, May 17, 2019), Binney says that the “random probability of that happening is exceedingly low.”  As we’ve shown in this study, the odds are actually pretty decent that we would see this degree of interleaving (with only two conflicts) in ngp-van.7z.  We think that Binney probably overlooked the large time gaps in the July 5, 2016 timeline.

The key point of this study is that the file timeline is 80% gaps, even when we look only at the limited sample (40% of the file timeline) that Campbell chose.  It is not surprising that 20 or so directory times, chosen at random, would find places in the file timeline.  Campbell’s superficial analysis led to his mistaken impression that those times had been deliberately manipulated.

Forensicator’s Analysis of the September 1, 2016 File Collection Operation

We have shown that Campbell’s claim that the September 1 timeline and July 5 timeline “merge” has no merit: (1) he cherry-picks the data and only looks at 40% of the full timeline in an effort to make his case and (2) there are two places where the September 1 and July 5 timelines collide.  We think that this should be enough proof to convince objective observers that Campbell’s “tampering” claim doesn’t hold water.

When Forensicator studied the timeline for the directories and .rar files created on September 1, he saw normal file collection activity ahead of producing the final ngp-van.7z file that was announced at the London security conference on September 13, 2016.  Forensicator’s analysis included all the metadata and did not try to force fit the data into a particular theory.  Let’s look again at the metadata for the top-level directories and the .rar files that were built from those directories.  The .rar files ultimately were packed into the ngp-van.7z archive.

A few things are worth noticing here.

  • When the timezone is set to Eastern, as above, the directories fall into place, generally a little before the .rar files that contain them. If the timezone is set to Central Time, for example, the .rar files would appear in the timeline before the directories that were copied into the corresponding .rar file. That sort of anomaly would certainly raise suspicions regarding data integrity. This simple check adds further support for our conclusion that the file collection activity on September 1 was done on a computer that had its time zone set to US Eastern Time.
  • All of the .rar files follow the directories that they were built from, when we view the timeline ordered by last modification time as above. The chances that this ordering might have happened at random are roughly 1 in 10,000 (see the sketch after this list). The odds are strongly in favor of our interpretation: working directories were first collected and then packed into .rar files.
  • The last two directories (“Reports for Kaine” and “Security”) look here as if they were an afterthought because they were copied later in the day. That sort of thing commonly happens and shouldn’t raise any concerns. However, if we went with Campbell’s approach of lopping off the hours part of their last modification times, then those two directories wrap around such that they are widely separated from the rest of the directories. They invalidate Campbell’s “data merge” hypothesis.
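
As a sanity check on that “happened at random” probability, consider one simple model (an assumption on our part, not necessarily the model behind the figure cited above): if each of the 13 directory/.rar pairs were equally likely to appear in either order, the chance that every .rar file follows its corresponding directory is (1/2)^13, about 1 in 8,200 – the same order of magnitude.

    import random

    # Simple model (our assumption): each of the 13 directory/.rar pairs is
    # independently equally likely to appear in either order in the timeline.
    n_pairs = 13
    analytic = 0.5 ** n_pairs
    print(f"analytic probability: {analytic:.6f}  (about 1 in {round(1 / analytic):,})")

    # Quick simulation as a cross-check of the same model.
    trials = 1_000_000
    hits = sum(
        all(random.random() < 0.5 for _ in range(n_pairs))
        for _ in range(trials)
    )
    print(f"simulated probability: {hits / trials:.6f}")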

Conclusion

There is no “merger of data”.  With no merger of data, all claims of “tampering” and metadata manipulation fall away.

Campbell ignores Forensicator’s explanation that the directory timeline shows normal file collection activity ahead of the London conference.  He doesn’t mention Forensicator’s explanation and seems to be completely unaware of it.  We encourage Campbell to re-read that section of Forensicator’s report.  After doing so, perhaps he will further consider the findings in this article, which demonstrate that his data interleaving theory is invalid.

We hope logic will prevail and that Campbell will abandon his metadata manipulation and “tampering” speculation and accept Forensicator’s analysis.

Closing Thoughts
