Are you a techpede who wishes you could do more to help the movement?
The internet holds an estimated 44 billion terabytes of data. Trying to locate connections between oligarchs, their families, and their companies has been very difficult simply because of the massive volume of information to sift through.
Often it takes 4, 5, or more connections to see the larger pattern. I may be aware of one, you're aware of another, someone else knows a third, and we all post here or on the chans like ships passing in the night. You can only see the bigger picture when you're aware of all the connections.
Combine this with the fact that the DS pretty much controls the internet and can suppress or remove anything they want. How many times have you found an interesting connection between two apparently unrelated people, only to go back days or weeks later and find that info has been memory-holed?
However, if we had the ability to easily share our breadcrumbs with each other instantly (and permanently), it would be far easier for all of us to discover heretofore unknown links between all the DS players, companies, politicians, media, etc.
I've been refining this idea since November 4th. I have the skill set to build the entire system, but I'm just a regular web developer with plenty of backlogged projects, and not enough time to build it myself and still have it ready for the 2022 elections.
So, I'm looking for assistance. I'll foot the bill for the servers and any other expenses, but unless someone with deep pockets steps in and provides funding, this is a "sweat equity" project. You'll receive a % ownership, but I'm not expecting to ever make any money with this - it's a research tool to multiply our efforts now and into the future.
If you have the skills to assist, and you want to help the community, and you understand that this isn't really a commercial project, then PM me, and we'll talk.
Ever heard of an RDF semantic graph database? You can define an ontology for the domain you're trying to map. Create a REST service running on a server that talks to a graph database (RDF4J). You can build crawlers as cron jobs that automatically scan and track the web and pull data into the database, and an interactive web client that lets users view and input data. Lots of possibilities.
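To make that concrete, here's a minimal sketch against the Eclipse RDF4J Java API: it stands up an in-memory triple store, loads a couple of example statements linking a person to a company, and runs a SPARQL property-path query to surface multi-hop connections. The ex: namespace, the connectedTo predicate, and the names are invented purely for illustration; a real deployment would use a persistent store (or RDF4J Server) behind the REST service.

```java
import org.eclipse.rdf4j.model.IRI;
import org.eclipse.rdf4j.model.util.Values;
import org.eclipse.rdf4j.query.BindingSet;
import org.eclipse.rdf4j.query.TupleQueryResult;
import org.eclipse.rdf4j.repository.Repository;
import org.eclipse.rdf4j.repository.RepositoryConnection;
import org.eclipse.rdf4j.repository.sail.SailRepository;
import org.eclipse.rdf4j.sail.memory.MemoryStore;

public class ConnectionGraphSketch {
    // Hypothetical namespace for the project's own ontology terms.
    static final String EX = "http://example.org/ontology#";

    public static void main(String[] args) {
        // In-memory store for the sketch; swap in a persistent backend for real use.
        Repository repo = new SailRepository(new MemoryStore());

        IRI connectedTo = Values.iri(EX, "connectedTo"); // generic "is linked to" predicate (made up)
        IRI personA = Values.iri(EX, "PersonA");
        IRI personB = Values.iri(EX, "PersonB");
        IRI companyX = Values.iri(EX, "CompanyX");

        try (RepositoryConnection conn = repo.getConnection()) {
            // Two separate breadcrumbs, as if contributed by different researchers.
            conn.add(personA, connectedTo, companyX);
            conn.add(companyX, connectedTo, personB);

            // SPARQL property path: everything reachable from PersonA
            // through one or more connectedTo hops.
            String sparql =
                "PREFIX ex: <" + EX + "> " +
                "SELECT ?linked WHERE { ex:PersonA ex:connectedTo+ ?linked }";

            try (TupleQueryResult result = conn.prepareTupleQuery(sparql).evaluate()) {
                while (result.hasNext()) {
                    BindingSet row = result.next();
                    System.out.println("PersonA is linked to: " + row.getValue("linked"));
                }
            }
        } finally {
            repo.shutDown();
        }
    }
}
```

The crawler cron jobs would make the same kind of add() calls from scheduled scans, and the interactive web client could issue SPARQL queries like the one above through the REST layer.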