I've been talking about this for a while now, but I used to run a marketing service that streamed all reddit content in real time and did text analysis and bot detection. It's definitely a rough estimate, but roughly 65% of text content was determined to be bot-generated. I am entirely convinced that there are large entities (political campaigns, nations, etc.) using bot networks on social media sites like reddit to simulate "consensus" in online discussions and thus gently sway public opinion.
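For the curious: a detection pipeline like this typically scores accounts on behavioral signals rather than content alone. The original service's method isn't described, so here is a minimal hypothetical sketch of two common signals, text duplication and metronomic posting cadence (the `bot_score` function and its weighting are illustrative assumptions, not the commenter's actual system):

```python
from collections import Counter

def bot_score(comments, timestamps):
    """Toy heuristic (hypothetical): combine text duplication with
    posting-cadence regularity. Real systems use many more signals."""
    # Duplication: fraction of comments that are exact repeats.
    counts = Counter(comments)
    dup = 1 - len(counts) / len(comments)
    # Cadence: near-zero variance in inter-comment gaps suggests automation.
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    if len(gaps) > 1:
        mean = sum(gaps) / len(gaps)
        var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
        cadence = 1 / (1 + var / (mean ** 2 + 1e-9))  # -> 1 when perfectly regular
    else:
        cadence = 0.0
    # Equal weighting is an arbitrary choice for this sketch.
    return 0.5 * dup + 0.5 * cadence

# An account posting the same text every 60 seconds scores near 1.
print(bot_score(["great point!"] * 5, [0, 60, 120, 180, 240]))  # -> 0.9
```

Note that simple heuristics like these are exactly what sophisticated operators evade, which is one reason any single-number estimate deserves skepticism.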
> It's definitely a rough estimate, but roughly 65% of text content was determined to be bot-generated.
A scary number. I wonder about the per-subreddit distribution, though. I imagine the primary subreddits have a slightly worse human-to-bot ratio, niche subreddits somewhat better, with non-political, non-easily-monetizable subreddits faring best.
Did your analysis also attempt to identify troll farms? Would content produced by protein bots (human-operated accounts posting to order) be grouped into the ~65% of bot content, or the remaining 35%?
It's wild how low-quality so many reddit comments are, to the point that it makes me wonder, "Why did this person post something so empty and non-contributing to a thread that already had 3000 comments?"
I don't know whether to believe people are so wasteful of their own time, or whether this is just low-effort bot posting to build consensus. Combined with how harshly and instantly main subreddits like /r/politics and /r/news shadow-ban accounts, it's basically impossible to dissent.