Funnily enough, Meta could actually be in a position to make a genuine Twitter killer if they simply built a Twitter clone that wasn't buggy as shit. The hard part of making a successful social media site is getting people to join, since most people won't sign up until their circle is already there. Meta already has a user base they could migrate over, and they could integrate the new platform into their existing ones, so this part would be easy.
The thing is, they would need to give people a reason to use it instead of Twitter. Twitter has a lot of shortcomings, even to a normie. Put the censorship discussion aside and just look at what normies care about. All Meta would have to do is release something simple without rampant bugs.
But the silver lining is that they are utterly incapable of doing that. There are major bugs in Facebook that have been around for years. Obvious, easily reproduced things. And they don't care to fix them. Things like clicking on a notification for a comment and not being taken to that comment, because "Most Relevant" is selected by default and apparently the comment you were notified about wasn't relevant. So you turn "Most Relevant" off, but then there's no auto-redirect, and you have to manually scroll through hundreds of comments to find the one the notification was for. It's been like that for years and they aren't fixing it. A major bug that directly affects user engagement, and thus ad revenue, and they can't or won't fix it.
I know why. It's the problem all big companies, especially tech companies, suffer from. And that's shitty employees resulting from liberal hiring practices.
The "Most Relevant" thing is a feature, not a bug.
Facebook knows damn well the majority of people aren't going to expand a thread unless they're really interested in the discussion.
They boost the narrative they want and hide the rest.
Simple.
I thought there was something nefarious to it at first, too. My thought was that it was an attempt to subtly "shadow ban" me by forcing me to give up on a discussion, while plausibly hiding behind a "bug." But it happens in discussions that aren't remotely political in nature. And I've talked to leftists who describe the exact same phenomenon.
Mind you, I'm not saying that the "Most Relevant" feature is a bug in and of itself. I'm also fairly confident it is used to lightly censor content the algorithm deems unfit. The bug I'm referring to is that a comment that is literally a reply to mine, that mentions me, that I got a notification for, and that I came to the thread by clicking the notification for, is considered not relevant. The rest of the conversation will still be visible, including the parent comment. But the most recent comment, the one the notification was for, will be missing. Their comment, not mine.
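The failure mode I'm describing can be sketched in a few lines. This is purely illustrative Python under my own assumptions about how it could happen, with every name and data structure invented, not anything from Facebook's actual code: if the notification deep-link resolves its target against the already-filtered "Most Relevant" list, a low-scoring reply simply vanishes.

```python
# Hypothetical reconstruction of the bug. All names, scores, and
# structures are invented for illustration; none of this is real
# Facebook code.

def visible_comments(comments, relevance_threshold=0.5):
    """The "Most Relevant" filter: hide comments scored below a cutoff."""
    return [c for c in comments if c["score"] >= relevance_threshold]

def open_notification(comments, target_id):
    """Jump to the comment a notification points at (buggy version)."""
    # Bug: the lookup only searches the filtered list, so the target
    # can be missing even though the notification just referenced it.
    shown = visible_comments(comments)
    for c in shown:
        if c["id"] == target_id:
            return c
    return None  # the reply you were notified about is just... gone

def open_notification_fixed(comments, target_id):
    """Plausible fix: force-include the notification's target comment."""
    shown = visible_comments(comments)
    if not any(c["id"] == target_id for c in shown):
        shown += [c for c in comments if c["id"] == target_id]
    return next((c for c in shown if c["id"] == target_id), None)

thread = [
    {"id": 1, "score": 0.9, "text": "parent comment"},
    {"id": 2, "score": 0.1, "text": "the reply you were notified about"},
]

# The deep-link fails even though the comment exists in the thread:
assert open_notification(thread, target_id=2) is None
# With the target force-included, the notification works as expected:
assert open_notification_fixed(thread, target_id=2)["id"] == 2
```

The point of the sketch is how small the fix would be if this really is the cause: one extra check that the notification's target survives filtering.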
I can't think of a nefarious reason why Facebook would do it that way. And there is plenty of good reason to have the feature working. I would understand hiding my argument with someone else from the rest of the people scrolling through the thread. But hiding just the latest comment (made by the other person, who is usually arguing the liberal side) from me? Any nefarious reason I can come up with for that just doesn't add up.
If they simply didn't want me to respond, as though to make it seem like I didn't have a response and lost, it would make way more sense to simply not send the notification. A "missing notification" bug would be completely acceptable to consumers: most people would never notice, and wouldn't think anything of it if they did. But giving someone a notification and then not showing them the thing once they clicked on it? That's just sheer incompetence.
There are people who have to get their two cents in, especially if they think somebody else is wrong on the Internet.
So you might give up on replying, but others won't. They end up digging through hundreds of comments to find the one to reply to, and now Facebook can use that to boost the "engagement" numbers it shares with advertisers.
Again, this is not about the user experience. This is about harvesting as much information as possible and subjecting the masses to operant conditioning in their digital Skinner boxes in order to influence and modify behavior.
You reply nicely to that liberal and agree with their comments and see what happens!
Atlantic article from 2013 that's surprisingly good -
Skinner Marketing: We're the Rats, and Facebook Likes Are the Reward
Our Internet handlers are using operant conditioning to modify our behavior.
https://www.theatlantic.com/technology/archive/2013/06/skinner-marketing-were-the-rats-and-facebook-likes-are-the-reward/276613/
https://archive.ph/MSl7h
I hadn't thought of the angle of boosting "engagement" by forcing people to dig through comments to find the one they were looking for. However, based on my own anecdotal experience, I think that has to be outweighed by the number of times a person gets frustrated with the entire site and closes out of it, rather than resuming their endless scrolling.
I truly believe this one can be chalked up to incompetence. As shitty as these people are, I do not believe any engineer gets a direct memo from a higher-up to "boost engagement by hiding the reply after a user clicks a notification for that reply." My more realistic view of most conspiracies is that they are a lot more casual than that. For example, I believe planned obsolescence isn't spoken about openly. Rather, when an engineer asks a higher-up whether to go route A or route B, where route A is more expensive but route B will cause the device to fail within a year or so, the higher-up makes the decision with planned obsolescence in his head and covers it with "we need to save costs."