2011-02-14 How HBGary & Other Firms Could Have Falsified WikiLeaks Documents

The release of tens of thousands of emails from executives at HBGary, a firm providing classified cybersecurity services that was found last week to have proposed plans to target WikiLeaks, shows exactly what members of the firm meant when they discussed using fake documents to sabotage or target WikiLeaks.

(If you are unaware of the story that has been unfolding, here is previous coverage, which has appeared on WL Central.)

A search through the database for emails that discuss “WikiLeaks” reveals one email on “stopping WikiLeaks.” It links to a WordPress blog called “Gödel’s Lost Letter and P=NP,” which covers stories related to technology and the theory of computing.

The post that was sent around clearly indicates how one might “stop” WikiLeaks. The author, rjlipton, writes: “leaks are like gravity, it is impossible to turn them off. No matter how terrific your security is, there will continue to be leaks of all kinds. What I do think is there is a mitigation strategy that can make leaks less damaging.”

The mitigation strategy is explained as follows:

Suppose that Alice runs an agency that handles very sensitive information. The thousands of people in her agency have access to millions of documents that would be potentially interesting to WL. Alice does nothing special until a leak occurs—although see a later section for a more “on-line” approach.

Suppose that WL gets documents D = D_1, …, D_m from some source inside Alice’s agency. They publish them on their web site, and then Alice is faced with a major problem.

Today she can do nothing to stop the leak. She can try to find the insider who made the leak, and use the legal system to deal with them. But that does nothing to mitigate the damage that is already done. There is an American idiom that says:

close the barn door after the horse has bolted.

This means: “Trying to take action when it is too late.” Today this is where Alice is—the horse is gone—the documents have been leaked. Closing the source of the leak does not help get the horse back.

However, she can do something to mitigate the leak. Here is the mitigation strategy: She runs a special program over the documents D and creates new ones F = F_1, …, F_n. These new documents are similar to the ones leaked, but they are different in many ways. Alice then “leaks” her fake documents F.

What is the point of this? Now the media and the public are confused. Is D right or F? If F is cleverly constructed, it should contain some “bad” information, but will differ from D in important ways. For example, if D has a passage that says:

Let’s pay X ten million dollars to do Y.

The documents F could have a passage:

Let’s not pay X ten million dollars to do Y.

If Alice is smart she may even make some of the passages in F worse than those in D. Thus she could have a passage:

Let’s pay X fifty million dollars to do Y.

The critical point is that D and F will look alike, but will differ in many places.

The existence of F will increase everyone’s uncertainty. What are the correct facts, what is true, and what is not? This increase in uncertainty will muffle the effectiveness of the released documents D. Consider the dilemma facing a media outlet: would they feel comfortable in stating something if there is great uncertainty? Not clear.

Alice can do more to increase the uncertainty. She could, and probably should, release multiple versions of F. These versions would flood the media system. It could take a long time for them to unravel, if ever, which are “real” and which are not. She can even release information that is more damaging than the real documents. She can denounce all of them as fake, or claim some of them as fake.
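The blog’s examples — flipping “pay” to “not pay” and inflating “ten million” to “fifty million” — amount to mechanical perturbation of a document. A minimal Python sketch of that idea (the `make_fake` function and its heuristics are illustrative inventions, not from the post):

```python
import re

def make_fake(doc: str, variant: int) -> str:
    """Deterministically perturb a document: even-numbered variants
    negate the first 'pay'; dollar figures are rescaled by a factor
    that cycles through 1, 5, 10 (toy heuristics only)."""
    if variant % 2 == 0:
        doc = doc.replace("pay", "not pay", 1)
    factor = (1, 5, 10)[variant % 3]
    return re.sub(r"(\d+) million",
                  lambda m: f"{int(m.group(1)) * factor} million", doc)

real = "Let's pay X 10 million dollars to do Y."
fakes = [make_fake(real, v) for v in range(3)]
# fakes[0]: "Let's not pay X 10 million dollars to do Y."
# fakes[1]: "Let's pay X 50 million dollars to do Y."
```

Each fake reads plausibly on its own; only someone holding the original D can tell which figures and negations were altered.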

The “mathematical theory” for “mitigating” the leaks was seriously considered. In a slide presentation prepared for the law firm Hunton & Williams, HBGary (along with Berico Technologies and Palantir Technologies) proposed to “feed the fuel between the feuding groups,” use “disinformation,” “create messages around actions to sabotage or discredit the opposing organization,” and “submit fake documents and then call out the error.”

The post is likely what led Aaron Barr to include the tactic of creating “fake documents” in the slide presentation.

Now, how might this have been done?

The blogger’s "colleagues" suggest using “automatic language translators":

…take a document and translate it to another language and back. Since translators are not perfect, this will change the document. I used this method previously here. There are theory ideas based on methods to protect database information that could perhaps be used—especially for numerical data.
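Round-trip translation works as a perturbation precisely because machine translation is lossy. As a self-contained sketch, two tiny word tables stand in for real translators here (the tables and the `translate` helper are invented for illustration):

```python
# Toy round-trip "translation": small, imperfect dictionaries stand in
# for real machine translators. Words missing from a table pass through
# unchanged, so the round trip subtly rewrites the text.
EN_TO_X = {"pay": "payer", "ten": "dix", "to": "pour", "do": "faire"}
X_TO_EN = {"payer": "compensate", "dix": "ten", "pour": "to",
           "faire": "perform"}

def translate(text: str, table: dict) -> str:
    """Word-by-word lookup; unknown words are left as-is."""
    return " ".join(table.get(w, w) for w in text.split())

original = "pay X ten million dollars to do Y"
round_trip = translate(translate(original, EN_TO_X), X_TO_EN)
# round_trip: "compensate X ten million dollars to perform Y"
```

The result is recognizably the same sentence, yet no longer a verbatim match — enough drift, at scale, to cast doubt on which leaked text is authentic.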

An “on-line” system approach is also suggested:

…a more “on-line” system approach might be better. The advantage of this is that the alternative documents could be created even by the authors of the originals. Or they could be created automatically, but would be available for immediate release when needed. Patrick even suggested, in some situations, there could be a stream of constant “leaks” that would be more proactive in protecting Alice’s agency.
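The “on-line” variant moves decoy generation to authoring time, so fakes already exist when a leak is detected. A hypothetical sketch (the `DecoyStore` class and its API are assumptions, not anything described in the emails):

```python
class DecoyStore:
    """Hypothetical 'on-line' mitigation: generate decoy variants the
    moment a document is authored, and hold them for instant release."""

    def __init__(self, make_variant):
        self.make_variant = make_variant  # callable: (text, n) -> fake text
        self.decoys = {}                  # doc_id -> list of pre-built fakes

    def author(self, doc_id, text, n_variants=3):
        # Build the fakes now, while the original is being written.
        self.decoys[doc_id] = [self.make_variant(text, v)
                               for v in range(n_variants)]

    def release(self, doc_id):
        # On leak detection, the decoys are already waiting.
        return self.decoys.get(doc_id, [])

# Example perturbation: rewrite the dollar figure differently per variant.
store = DecoyStore(lambda text, v: text.replace("10", str(10 * (v + 2))))
store.author("memo-1", "Pay X 10 million dollars.")
# store.release("memo-1") returns three fakes with 20, 30, 40 million.
```

This matches the commenter’s suggestion that variants “would be available for immediate release when needed,” and even supports a proactive stream of constant “leaks.”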

Bad Logic...

Folks:

I could think of three ways to counter this technique before I was done with the article.

Isn't it easier to be an ethical company in the first place?

tinker
