I’ve been telling my friends and family for a while now that I think the day will come when white hats will produce movies and series for the purpose of red-pilling.
Can y’all name some other instances of movies/series you’ve seen recently that are dropping red pills?
I agree with both your statements. Fuck Netflix, but Netflix did just drop its entire "woke" department and seems to be shifting gears to accommodate the true mainstream in the USA, or else they are going to disappear into nothingness.