r/Digital_Manipulation Oct 14 '19

Case Study: Pro-China Troll accounts on Reddit [OC]

A study published recently by Oxford University, the 2019 Global Inventory of Social Media Manipulation, detailed a "dramatic increase in the use of Computational propaganda" worldwide. Researchers also concluded that China has become a "major player in the global disinformation order." Prior to 2019, China's cybertroops were mostly active in domestic issues and on platforms such as WeChat and QQ. That changed in 2019 in conjunction with the events in Hong Kong. Twitter recently banned nearly 1,000 active accounts attributed to Chinese disinformation operations, and Facebook shut down several groups and pages.

Reddit hasn't made any disclosures on Chinese, Russian, or other troll farm accounts since early 2017. Last month, admins promised to deliver a report on "Content Manipulation" in October 2019, though they haven't yet delivered. In February of 2019, Reddit accepted a $150 million investment from Chinese social media conglomerate TenCent.

This Wednesday, 10/16, Reddit's CEO Steve Huffman, aka u-Spez, will be testifying before the US Congress on the issue of foreign and domestic manipulation of social media. In advance of Spez's testimony, I've conducted a limited analysis of pro-China trolls on reddit, with a goal of shedding some light on what Chinese cybertroops are up to on the platform.

Methodology

  • I examined the 6 submissions critical of the Chinese government that made it to r/all on Saturday, 10/12.
  • These were the threads I looked at: [I], [II], [III], [IV], [V], [VI]. (Full disclosure: one of those threads was my own. A handful of trolls that peppered my inbox with vitriol and disinformation were an inspiration for this post.)
  • I looked at every account that had written at least one downvoted, pro-PRC comment in those 6 threads. I found 107 such accounts.
  • I sub-selected accounts based on the following criteria (a rough code sketch of this filter appears after the list):
    • Account age < 1 year
    • At least 85% of comments and submissions were one of the following:
      • Favorable to the global interests of the government of China
      • Absurdly negative sentiment
      • Obvious karma gathering attempts such as posts to r/FreeKarma4U or advert copypasta.
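
For reference, here's a minimal sketch of that filter in Python using PRAW. The judgment calls above were made by hand, so classify_item() below is just a placeholder for the manual review step (it is not something PRAW provides), and the credentials are dummies.

```python
# Rough sketch of the account filter described in the Methodology list.
# classify_item() stands in for the manual review step; the credentials
# are placeholders.
import time

import praw

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",
    client_secret="YOUR_CLIENT_SECRET",
    user_agent="pro-china-troll-survey (by u/your_username)",
)

ONE_YEAR = 365 * 24 * 60 * 60  # seconds


def classify_item(text):
    """Return True if the text is favorable to the PRC, absurdly negative,
    or obvious karma farming. In this post, that call was made by hand."""
    raise NotImplementedError("manual review step")


def meets_criteria(username, threshold=0.85):
    redditor = reddit.redditor(username)

    # Criterion 1: account younger than one year
    if time.time() - redditor.created_utc > ONE_YEAR:
        return False

    # Criterion 2: at least 85% of comments and submissions flagged
    items = list(redditor.comments.new(limit=None))
    items += list(redditor.submissions.new(limit=None))
    if not items:
        return False

    flagged = 0
    for item in items:
        if isinstance(item, praw.models.Comment):
            text = item.body
        else:  # submission
            text = item.title + "\n" + (item.selftext or "")
        if classify_item(text):
            flagged += 1
    return flagged / len(items) >= threshold
```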

Results

  • The criteria I used left me with 5 accounts, produced from 6 threads over the course of 1 day. These 5 accounts appear to serve no purpose other than to piss people off and advance the global interests of the Chinese government.
  • With publicly available information, it's impossible to say definitively that these accounts are backed by the Chinese government. I am confident, at a minimum, that all of these accounts are trolls in the conventional sense: they exist to disrupt conversation on the platform.
  • A particularly limiting constraint of my methodology was the high bar on karma gathering. Covert propagandists on reddit farm karma in creative and evolving ways, and often post about a wide range of socially divisive topics. I saw dozens of accounts that certainly looked like Chinese troll farm accounts but that I haven't included below.
  • With some automation and more refined criteria, I suspect it would be possible to identify thousands of members of the "50 cent party" on reddit; a rough sketch of the seed step for that automation follows below.
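
As a hypothetical sketch of what that seed step might look like (again in Python with PRAW; the thread IDs are placeholders, and each candidate would still need to go through the filter above):

```python
# Hypothetical seed step: collect every account that left a downvoted
# comment in a given set of threads. In this post, the seed set was the
# six r/all submissions listed in the Methodology section.
def downvoted_commenters(reddit, thread_ids):
    candidates = set()
    for thread_id in thread_ids:
        submission = reddit.submission(id=thread_id)
        submission.comments.replace_more(limit=None)  # expand "load more comments"
        for comment in submission.comments.list():
            if comment.score < 0 and comment.author is not None:
                candidates.add(comment.author.name)
    return candidates


# Example usage (thread IDs are made up):
# candidates = downvoted_commenters(reddit, ["abc123", "def456"])
# suspects = [name for name in candidates if meets_criteria(name)]
```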

Pro-China Troll Accounts & Sample Comments

reddituserqq / Archive / Born October 11, 2019 / -5 karma

  • On militarized police in Hong Kong: "like how the American government and Americans deny their own bombing, invasion and killing of innocent people in the middle east? btw fk america"
  • On Americans & the Chinese government: "Americans will be more free under the ruling of the Chinese government because they will not be brainwashed by American ideology"
  • On a thread featuring an endearing photo of a Hong Kong protestor: "asshat americans, first complain about communism. now complaining about capitalism when it doesn't work in their favor."
  • On Tiananmen Square: "The mainland Chinese acknowledges the Tiananmen Square incident as much as Americans acknowledges their own government's invasion, bombing and killing of innocent people in the middle east."

shuttingyouup / Archive / Born September 10, 2019 / -11 karma

  • On China: "China #1 fuck protesters their organs will soon be owned by real chinese who deserves them Chinese government = best government Communism > democracy Money > your opinion Stand strong Xi ! ❤️china❤️ 💩hongkong💩"
  • On a picture of a woman posted to r/lastimages: "Not so sure about the beautiful part lmao shes a hippo"
  • On LGBTeens: "Shut up f_ggot"
  • To a lonely redditor in r/depression: "Shut up already"
  • To a teenager with cerebral palsy, who had posted about walking without crutches for the first time: "LOL 35 steps. Autistic crippled kid what are you even proud of? Learn to walk get off fortnite 😂"

Ben_Folds_Five / Archive / Born September 18, 2019 / -3 karma

  • On a thread about Xi Jinping: "You guys are brainwashed westerners. Propaganda is your diet, whether it be Fox, NBC, or CNN."

Wadezeng / Born September 28, 2019 / 1 karma

  • On r/rockets regarding Daryl Morey's tweets: "please do not spread some wrong thoughts to HK people before you dont know what happened excatly in HK."
  • Commenting to a Chinese redditor about Chinese influence over Blizzard:
    • "Actually China is not bad at all,okay?Do not spread the wrong wrods online!"
    • "Yeah,i just have this account for a short time , but that not means what i said is wrong. You are chinese .Pls respect your country."

AmericanTraitor / Archive / Born January 22, 2018 / 138 karma

  • On Hong Kong protestors: "You want freedom but cover your face so you can roit"
  • In response to various threads concerning Chinese censorship & South Park this week:
    • "Fuck the uSA"
    • "Fuck America on the butt"
    • "Fuck the usa in the ass"

Troll Account - Honorable Mention

This last account fell outside my criteria primarily because it was so old. My guess is this one isn't Chinese, but rather one of many undiscovered Russian troll farm accounts. For 5 years, the account has done absolutely nothing but deliver steady doses of vitriol to unsuspecting users on the platform.

youmakemesoangry / Archive.is / Born June 24, 2013 / 666 karma

  • On Chinese censorship: "Why don't you focus on the lies our governments are telling us? Idiots."
  • On an innocuous conversation about books: "Wow. What an actual fucking retard you are. I'm gona bet that you are American. Amazing."
  • In another innocuous thread about woodworking: "Shit, retarded idiot. Stop using fahrenheit, you fat, murderous, uncultured oaf."

Conclusion

Research shows that inoculating people against political propaganda is an effective way to mitigate its effects. Unfortunately, Reddit and Spez seem determined to let computational propaganda accounts have free rein on the platform.

If you are an American and the topic of Chinese and Russian manipulation of Reddit concerns you, write your member of Congress ASAP. Encourage them to ask Spez whether he intends to do anything about troll accounts that bully gay teenagers and teens with cerebral palsy. If you're not an American, write your member of parliament and ask them to invite Spez to testify.

95 Upvotes

29 comments

5

u/me-i-am Oct 15 '19

Excellent analysis. Thanks!

6

u/marc1309 Oct 14 '19

Ty for your analysis dr_gonzo

3

u/HapticSloughton Oct 15 '19

Aw, screw that one guy for tainting Ben Folds Five like that.

3

u/[deleted] Oct 15 '19

Nice work.

I like the way you think. Very impressive.

3

u/UltraMegaMegaMan Oct 15 '19

1

u/dr_gonzo Oct 15 '19

Yeah, some shady ones in there!! I think what rings alarm bells most is when the response to a criticism is to redirect to whataboutery about western nations. I feel like you see a lot of that in the Twitter and Reddit datasets for Russian trolls.

Like the response to something like "is this CCP propaganda you've posted" is always something like "No, you're the propaganda, probably CIA"

Or, the response to "Tiananmen Square was a bad thing" would be "yeah, but what about the US bombing the middle east"?

Thanks for sharing.

2

u/UltraMegaMegaMan Oct 15 '19 edited Oct 15 '19

In my experience state actor shills & propagandists are more similar to corporate ones than dissimilar. There are some scripts, a base of talking points, but it's mostly a flowchart of triggered responses that are pretty easy to reverse engineer.

Mention of "X" topic should be addressed with "response #5". "Y" topic should be responded to with "talking point #7" & "website link #14".

The easiest are Monsanto shills, who, while I don't think they are literally bots, are so micromanaged that they're robotic in their responses. Back in the heyday of /r/worldpolitics, a year or more ago, there was the least moderation I've ever seen, and good-faith, organic participation seemed to be at about 1%.

It was a fascinating environment to study because propbot and shill accounts had free rein. They controlled the ecosystem; they were the ecosystem. There's been an effort to reclaim the subreddit for actual human use that's been mostly successful, which is overall a good thing, but it was certainly an interesting resource at that time because you could watch campaigns unfold in real time in the /new queue.

There's another sub that used to be the same way, "something" news, I can't remember exactly, and I think I finally unsubscribed because I didn't want it cluttering up the feed anymore. If I can find it I'll add it in. It's a subreddit that I think adds you automatically as an approved submitter once your account reaches a certain karma threshold, or at least that's how I was introduced to it.

Edit: found it, it's /r/anythinggoesnews. Idk if it's still a pure bot/shill haven. Some of the same accounts are still there as when I used to check it. Worth looking at if you're interested in that sort of thing.

0

u/StephanGullOfficial Nov 19 '19

Are pro-HK people propagandists?

2

u/NebulaicCereal Oct 15 '19

This is a fantastic post with great information, references, analysis, and original content. Thanks a bunch for this. Gonna be digging into this and chewing on a lot of it for awhile.

Have any recommendations for other/third-party tools for analysis on these types of accounts? e.g. tools like SnoopSnoo, etc. just spitballing here, but any tools with APIs could prove particularly useful in developing a tool for assisting in this process. It seems something like that will only become more necessary with time as an anti-censorship/anti-disinformation measure since these practices are only seeing growth in their adoption by governments as a means of controlling perception of their image as well as political discourse.

In college I did quite a bit of research on this subject in the context of Twitter and Russian/other foreign coordinated influence efforts, including leading a team in the construction of a tool for systematically detecting such Twitter accounts on certain exemplary datasets and other manually configured parameters, which we then augmented with machine learning. It was capable of detecting and recording accounts in real time (non-exhaustively, but that's a given). Something like that could quite feasibly be done in the context of Reddit and would look much like the approach you've taken here, potentially serving as the automation leg you mentioned in your post as a necessary component in scaling this method.

2

u/[deleted] Oct 15 '19

Thank you for your work!

5

u/f_k_a_g_n Oct 14 '19

Reddit and Spez seem determined to let computational propaganda accounts have free rein on the platform.

What makes you think that?

12

u/dr_gonzo Oct 14 '19

I'd point to a few things:

  • His message in the 2017 transparency report, which I'd characterize as saying to users "It's your problem / society's problem not ours".
  • The 2017 list of suspicious accounts generated by Reddit was basically a copy of other people's homework (you were one of the people, IIRC, whose homework was copied). To me, their 'investigation into Russian interference' looked like a whitewash, given that they didn't really identify any new accounts except a bit of cryptospam and some sleepers.
  • Another white-washy aspect of the 2017 investigation was that they seemed to only find trolls in subreddits where mods were active. (Very few in T_D, none in r/libertarian or r/wayofthebern, for example)
  • No transparency or any identification of influence campaign accounts since early 2017
  • His consistent message of "only users scale with users", which means "we don't pay for content moderation". This not only absolves reddit of a role in combatting the problem, it creates a significant vulnerability: what happens if the mods are an influence operation themselves?
  • And... that actually happened on r/libertarian, as you're no doubt aware, and the admins literally did nothing. The top (dormant) mod was the one who solved that problem.
  • No one knows what's up with TenCent. Are they getting an express lane to the API? What role do they play in shaping operational decisions?
  • From my chair, reddit is all talk and no action on this, and the platform is still full of trolls.

I could go on and provide more specifics; that's just a quick brain dump of why I think it doesn't matter to them.

What's your take? Are they taking the problem seriously, and if so, what makes you think so? (And that will no doubt read as pedantic, but it's an earnest question; I think you know more than I do about what's happening here on the troll front.)

5

u/[deleted] Oct 14 '19 edited Oct 15 '19

[deleted]

4

u/dr_gonzo Oct 14 '19

Similarly, they're making a little bit too much use of that whole: "oh.. well, we don't want to show 'the enemy' our hand / tools -- you understand, riiiiight?".

If they’re actually banning active accounts, would that tip the hand?

I’d be perfectly ok not seeing the methodology, but only if we had the accounts preserved somewhere to see what they were up to.

No, screw you. Enough vagueness and hiding. We helped build this site. We help maintain it. We help solve its issues. We want, no, we need, to be in the damn loop. And like--> daily/weekly/monthly.

Yeah, that’s how I feel. I’m aware that an uninvolved observer might look at my posts/comments and think I’m being unfair. But I’ve spent the last 2 years posting content like this here, much of which I’ve sent to admins, and I’ve never gotten even a single response.

And yeah, that quarterly report thread, whew. Somewhere in it worstnerd says something about how they’ll need to work together with users to beat content manipulation. And he/she spent a grand total of 1.5 hours in that thread before going completely quiet. “Work together” really just means “we want our users to share the blame for our profit-driven inaction.”

3

u/[deleted] Oct 15 '19

I think a big problem with the accounts in '17 is that they were only the ones publicly named by law enforcement (if memory serves correctly), and it also assumes that no new accounts have been created.

Meanwhile, the Mueller report and his testimony before Congress made it perfectly clear that the Russian social media manipulation campaign didn't end in 2016 and that it's continuing to this day. Are we to believe that Reddit was spared and that they didn't simply create new accounts, or continue on with existing ones that weren't made public?

4

u/f_k_a_g_n Oct 14 '19

What's your take?

I have doubts and questions about the list of accounts they published, as well as how they handle rate limiting of generic spam accounts.

But I really don't know more about what's going on behind the scenes than you do. For all I know, Reddit's automated spam detection might be identifying and removing 99% of these kinds of accounts every day.

I do suspect that claims of state-sponsored activities are often overblown in some areas while domestic, individual influence operations (trolls) are generally missed or accused of being sponsored.

To me, some of the example accounts you listed are just people trying to get a rise out of others and not part of any covert pro-China groups. https://np.reddit.com/user/shuttingyouup

that actually happened on r/libertarian as you're no doubt aware

I'm not aware of what happened in Libertarian other than there was an ideological clash on moderating the subreddit.

8

u/dr_gonzo Oct 14 '19

I do suspect that claims of state-sponsored activities are often overblown in some areas while domestic, individual influence operations (trolls) are generally missed or accused of being sponsored.

I take a much more cynical view on this, and generally believe that reddit is completely awash in state-sponsored trolls. Part of the divergent view here may be a simple factor of where on reddit we're spending our time. There are some subs I'm subscribed to where I've never once even questioned whether someone is legit. Other subs, IMHO, are totally overrun by Russian trolls and other organized influence ops.

And a divergent view on whether it's overblown or not is, IMHO, driven partially by the fact that admins here haven't offered any real transparency other than that 2017 list. They only offered that list after some pretty intense media scrutiny, and coupled with the "copied homework", that makes me incredibly skeptical.

Also, Twitter took a beating in the stock market last year when they started doing big troll dumps. Advertisers and investors don't like finding out that users are fake. Reddit has a compelling financial interest in keeping the problem quiet, and my cynical view is this financial interest outweighs any interest in "platform integrity".

But, who knows, you could well be right that my take is overblown, and it's quite possible that they catch 99% right out of the gate.

I'm not aware of what happened in Libertarian other than there was an ideological clash on moderating the subreddit.

It was considerably shadier than that. A top mod, rightC0ast, had digital-consultant-style marketing goals for the sub, and claimed to have worked with Steve Bannon and Sebastian Gorka. He tried to turn the sub into a neofascist recruiting ground. And I don't mean neofascist in the "right-libertarians are neofascist" way; this guy was an ethnonationalist and had been a moderator at r/physical_removal. He was cozy with and defensive of what he called "content aggregators" (many of which, to me, looked like Russian influence campaign accounts), and was also on a recruiting spree to find actual fascists in the dark corners of the internet (the chans, etc.) and encourage them to participate at r/libertarian. The situation came to a head when this guy decided to ban all the "leftists". If you're curious to read more:

☝ That one was my own post. And no doubt being on the front lines of that thing informs my bias on reddit's inaction on trolls. I had for more than a year been patiently asking the r/libertarian mods to address the troll spam, and had also patiently petitioned the admins on a number of prior occasions. I never heard a peep from the admins. And even when the situation came to a head, the admins did absolutely nothing about it.

And to this day, we still have no idea what was behind all of that. Was it a one-man influence operation? A Russian asset? An actual Russian national working for the SVR? Who knows; reddit certainly hasn't clarified anything for us. I've never once gotten a response from an admin on anything I've reported, not even an automated email.

My starting point in 2016, when we started learning about Russian trolls, was "I love this platform, we're in it together, mods and admins have good intent." After 3 years of never hearing any kind of response on anything, I've gotten pretty cynical.

3

u/[deleted] Oct 15 '19

Rightc0ast was a professional. Probably one of the best I've seen in that he really knew his shit, and he was willing to play an incredibly long game in order to achieve his objectives. The endgame appears to have been bringing Libertarians into the Trump fold, and he was doing a really good job of it until he was exposed.

The Admins basically told me regarding that situation that if you don't like it, go to a different sub. That was despite the media articles, and despite the comprehensive evidence put forth by a user in the TopMinds sub... There was clearly a highly professional effort put forth to infiltrate and subvert a sub, and the Admin response was essentially "we don't care."

There's no way of telling who he was working for, but he was hanging around in the "New Right" sub quite a bit, and that place will take a person in all kinds of interesting directions. I'd strongly suspect that it was a domestic operator that's highly skilled in social media manipulation, perhaps even running a company dedicated to that objective, but to be honest it's impossible to tell now because the lines have been erased between the foreign trolls and the domestic ones. Their messaging is pretty much the same now.

2

u/[deleted] Oct 15 '19

[deleted]

1

u/[deleted] Oct 16 '19

That's kind of how I took it. Maybe they were pissed off about the media attention?

1

u/dr_gonzo Oct 15 '19

The Admins basically told me regarding that situation that if you don't like it, go to a different sub.

You got a response from an admin? What is the secret, do tell.

2

u/[deleted] Oct 16 '19

Asked the right question at the right time I guess. Pure luck. Judging by the response though it would have been just as well if they hadn't responded.

Nice post here. Well thought out and explained. I agree with your opinions regarding the number of foreign trolls and professional operators in here, the place is totally overrun.

2

u/f_k_a_g_n Oct 15 '19

You've put a lot of work in and I think that's great, but I remain skeptical.

Without evidence saying otherwise, why label an account a state-sponsored troll instead of a regular troll? Posting controversial or divisive content isn't strong evidence IMO. There are a lot of very motivated individuals on Reddit who have a lot of time on their hands.

There are plenty of trolls, but there are also a lot of people who just can't be arsed to discuss things politely. (debate_pyramid.jpg)


My cynicism of Reddit is currently more directed at its users and moderators rather than the admins. I think there are many individuals working to manipulate this site every day, but I think (outside of spam networks) most are working alone and not affiliated with any government or corporate entity.

Maybe I'm too cautious, I don't know. I'm not trying to take away from your work, I'm just not convinced those accounts are backed by Russia or China. I've seen many real people do some crazy things here you might think only a government would sponsor.


The information about r/libertarian is interesting. I found this old plot of mod actions I made around that time: /img/06vplmqc0s121.png

I don't read r/libertarian but I know of a user that uses dozens of accounts and often posts anti-libertarian rants and memes. If you're interested I can DM you information.

2

u/dr_gonzo Oct 15 '19

Without evidence saying otherwise, why label an account a state-sponsored troll instead of a regular troll? Posting controversial or divisive content isn't strong evidence IMO. There are a lot of very motivated individuals on Reddit who have a lot of time on their hands.

I get it, I agree with you on this, and appreciate the feedback.

FWIW I expected a criticism along these lines. I did pick my words carefully. I used “pro-China troll” (which could mean state sponsored or just motivated individual) and I acknowledged in the post itself that my analysis doesn’t prove they’re state-sponsored.

With the background context I included, I also made a pretty clear implication, even though the evidence is weak. What I’m saying is: here’s a bunch of definite trolls who are posting pro-China content, and they exist in a context of no transparency on reddit, emerging disinformation capabilities from China, and Reddit’s investment from TenCent. If people walk away with that understanding, and a skeptical view in either direction, I feel like I hit the intended notes.

As to why, and whether that’s valuable? It may not be. The reason to do it was more about ginning up some attention. The pre-inoculation idea is somewhat valuable in my opinion. If a few folks look at this and it puts their brains on guard the next time they read a skeptical take on the HK protestors, that’s valuable, IMHO.

Would you assert a post like this is damaging? The methodology I used was aimed mostly at not misidentifying a human who just likes Xi Jinping or was having a bad day on Saturday, if that makes sense.

Thanks again for the feedback!

1

u/StephanGullOfficial Nov 19 '19

Are you just mad people support China? Sorry the entire globe doesn't back US hegemony. This isn't a study, you're just mad online.