posted by simon_says +350 / -0

Ladies and Gentlemen,

If I might ask your indulgence to permit me to briefly clarify and elaborate upon my previous post here: https://greatawakening.win/p/15IXWdUvR5/true-the-vote--heres-whats-next/

I continue to believe that Gregg Phillips’ data came from the NSA. The post linked above explains my reasoning in detail. Boiled down to its essentials: if Gregg had purchased the data, it would contain informational gaps across devices and timeframes; therefore Gregg either did what he did despite the gaps, or he used a dataset with essentially no gaps (i.e., the sort of dataset described in my previous post). Gregg plainly has an intelligence background and connections all over the intelligence community. So he was positioned to gain access to such a dataset, and the only plausible path to completing his project was to use it. Therefore, it came from the NSA.

I continue to believe that Gregg used one or more platforms provided to him by the NSA. The platforms he needed already existed. If the NSA was giving him the data, why would it not also furnish him with the platforms required to manipulate the data? Moreover, platforms of this sort could not be built quickly enough for TTV to have (i) built them from scratch; (ii) used them to analyze the data and prove out its hypothesis; and (iii) produced a movie about the whole affair – all in less than two years. Not remotely plausible. The NSA gave him the platforms.

Here's the most likely situation. The NSA – as a formal matter – did NOT give the data to Gregg Phillips. If Nakasone is asked whether his agency gave Phillips the data or the platforms, his answer is “Absolutely not.” But elements within the NSA absolutely gave the data and the platforms to Phillips. And the NSA can’t admit that. So the NSA is going to say that Gregg stole the data (and probably the platforms). If you listen closely to what Gregg has been saying, he has been talking around the edges of that situation for some time now:

“…We participated in the op. We were part of the op. We were in communication with the target at the behest of the government. We, we were involved deeply in this, in this CI op. I think what’s going to frighten people when they understand the gravity of what it is we’re telling them, and then when we tell them that, in the end, we were betrayed by the United States government, and they turned it on us, and went and told the enemy that it was us that did it. That it was our fault. AND THAT WE HAD PENETRATED THEIR SYSTEMS.” That’s a transcription of an excerpt from a recent interview with Gregg, with emphasis added by me.

The NSA is in a terrible bind right now. I believe the NSA is privately accusing him of stealing the data and the platforms, and threatening him with arrest and prosecution, in an effort to silence him. But if they arrest him for “stealing” the data and the platforms, they confer credibility on the dataset and the methods – because they will have acknowledged that both came straight from the NSA itself – and therefore confer credibility on the conclusions. So they’re in a bind: the NSA needs to shut him down, but can’t afford to arrest him. I think this puts Gregg in a very dangerous position, as the NSA’s options are limited, if you see what I mean.

-simon

Comments (132)
ghost_of_aswartz 2 points ago +2 / -0

I think this is a very good assessment, simon / OP. I have been thinking about this a lot since I saw it on Rumble yesterday.

The only thing I disagree with (and this is a seriously off-topic side note) is the claim that he couldn't have done it in two years without the NSA's platform. Bringing a new 'platform' to market (i.e., making sense of the data) is not as difficult as you think. If you've done this kind of work before and handled this type of data, you come to the problem with libraries in hand. There are astounding libraries in every coding language to help you; my guess is they used Python, specifically pandas and GIS libraries, because those are the math and mapping tools needed for this kind of thing. Having ready-made libraries gets you much more than halfway there. For the interface, there are Python GUI frameworks available (wxPython, PyQt, Qt, GTK, etc.) to wrap a graphical front end around the work, all with mapping and network-graphing widgets that can be dropped into the GUI.

Keep in mind it CAN have bugs; it's not a deliverable to an end user. There is no "legal" issue, since it's in-house software. It doesn't need to be shipped or tested thoroughly, only 'certified' by verifying that the analysis the tool produces is correct: testing THAT rather than the code itself. It just needs to run and give them in-house results. That's very different from producing paid software or a video game that cannot crash or ship with liability defects.
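To give a sense of how far pandas alone gets you: here is a minimal sketch of the kind of analysis being described – filtering device pings to a geofenced bounding box and flagging repeat visitors. The data, column names, and coordinates are entirely hypothetical, invented for illustration; this is not TTV's actual data or method.

```python
import pandas as pd

# Hypothetical ping records: device_id, timestamp, lat, lon.
pings = pd.DataFrame({
    "device_id": ["a", "a", "b", "b", "c"],
    "timestamp": pd.to_datetime([
        "2020-10-01 08:00", "2020-10-01 21:00",
        "2020-10-01 09:30", "2020-10-02 02:15",
        "2020-10-01 12:00",
    ]),
    "lat": [33.749, 33.750, 33.749, 33.748, 34.900],
    "lon": [-84.388, -84.389, -84.387, -84.388, -85.000],
})

# A crude "geofence": a bounding box around a point of interest.
lat_min, lat_max, lon_min, lon_max = 33.745, 33.755, -84.395, -84.380
inside = pings[
    pings.lat.between(lat_min, lat_max) & pings.lon.between(lon_min, lon_max)
]

# Devices that pinged inside the fence more than once.
visits = inside.groupby("device_id").size()
repeat_visitors = visits[visits > 1].index.tolist()
print(repeat_visitors)  # → ['a', 'b']
```

A real analysis would use proper geospatial joins (e.g., geopandas polygons rather than a bounding box), but the point stands: the heavy lifting is a few library calls, not a from-scratch platform.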

Choctaw 0 points ago +1 / -1

On the platform issue, I agree. One could easily use ELK to parse the data; it might take a few weeks to get your filters dialed in precisely. Logstash totally eats mass quantities of data. Though I haven't done it myself, I'm quite confident Logstash can deal with pcaps as well. Anyone who has been in networking five minutes knows that Wireshark can filter and parse pcaps.
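For a sense of what "getting your filters dialed in" looks like, here is a minimal, illustrative Logstash pipeline that ingests CSV ping records into Elasticsearch. The file paths, field names, and index name are hypothetical, not anyone's actual setup:

```
# Illustrative Logstash pipeline config (hypothetical fields/paths).
input {
  file {
    path => "/data/pings/*.csv"
    mode => "read"
  }
}
filter {
  csv {
    columns => ["device_id", "timestamp", "lat", "lon"]
  }
  date {
    match => ["timestamp", "ISO8601"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "pings"
  }
}
```

Note that Logstash has no native pcap input; in practice you would first export pcaps to a text format (e.g., with tshark) and feed that in, which is consistent with the hedge above.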