Kiwifarms Gossip: Lol look at this faggot - OF edition

General gossip about kiwifarms

cjöcker

Remarkable Onion
fecal Jew take
threadbanned
edit: nvm
 

Azusa

Remarkable Onion
Hell, I've thought of getting a bot up and running that does nothing except mass-negrate people's posts. I do believe there is an XF function that can retract all reactions given by a specific user, which would of course invalidate it, but, done carefully, the idea would be very effective and wouldn't get them deleted.
Sounds like something you could set up with some Selenium scripting pretty easily. And I'm sure Null has no idea about any of the XF functions tbh
 

cjöcker

Remarkable Onion
Sounds like something you could set up with some Selenium scripting pretty easily. And I'm sure Null has no idea about any of the XF functions tbh
It's a simple POST request. All you need is to send the auth, CSRF, and session cookies to the reaction URL. I think XF has a rate limit of a couple of seconds, but I can't be bothered to figure it out.

<forum url>/posts/<post id>/react?reaction_id=<reaction id>
But let's say someone theoretically did this; here are the IDs of all the negrates:
Name                 ID
Dislike              14
Deviant              27
Islamic Content      30
TMI                  29
Dumb                 17
Late                 11
Mad at the Internet  16
So someone could theoretically make a script which theoretically parses the page to negrate certain users' posts (each post is contained in an article tag, which carries a data-content attribute holding the post ID formatted like post-<post ID> and a data-author attribute holding the author's username) and then theoretically sends an HTTP request to the URL I mentioned above with their auth, session, and CSRF cookies, thereby adding a negative rating to that user's posts.

I am not at all encouraging anyone to do this.
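
For illustration, here is a minimal sketch of that flow over plain HTTP, no browser involved. Everything specific in it is an assumption: the cookie names (xf_user, xf_session, xf_csrf) and the _xfToken form field follow stock XenForo 2 defaults, and the URLs, username, and credential values are placeholders, so check what the site's own reaction form actually submits before trusting any of it.

Ruby:
# Hypothetical sketch only -- cookie names, the _xfToken field, and all
# URLs/credentials below are placeholders, not verified against a live site.
require "nokogiri"
require "net/http"
require "uri"

forum_url   = "https://forum.example.com"  # <forum url>
target      = "SomeUser"                   # hypothetical target username
reaction_id = 14                           # Dislike, per the table above
cookies     = "xf_user=AUTH; xf_session=SESSION; xf_csrf=CSRF"

# Fetch one thread page while logged in (cookies attached).
page_uri = URI("#{forum_url}/threads/example.1/page-1")
request  = Net::HTTP::Get.new(page_uri)
request["Cookie"] = cookies
body = Net::HTTP.start(page_uri.host, page_uri.port, use_ssl: true) do |http|
  http.request(request).body
end

# Each post is an <article> carrying data-content="post-<post ID>"
# and data-author="<username>", as described above.
Nokogiri::HTML(body).css("article[data-author='#{target}']").each do |post|
  post_id   = post["data-content"].sub("post-", "")
  react_uri = URI("#{forum_url}/posts/#{post_id}/react?reaction_id=#{reaction_id}")

  react = Net::HTTP::Post.new(react_uri)
  react["Cookie"] = cookies
  react.set_form_data("_xfToken" => "CSRF_TOKEN")  # assumption: XF2 wants the token in the body
  Net::HTTP.start(react_uri.host, react_uri.port, use_ssl: true) do |http|
    puts "#{post_id}: #{http.request(react).code}"
  end

  sleep 2  # stay under the rate limit mentioned above
end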
 

SIGSEGV

Segmentation fault (core dumped)
An Onion Among Onions
It's a simple POST request. All you need is to send the auth, CSRF, and session cookies to the reaction URL. I think XF has a rate limit of a couple of seconds, but I can't be bothered to figure it out.

<forum url>/posts/<post id>/react?reaction_id=<reaction id>
But let's say someone theoretically did this; here are the IDs of all the negrates:
Name                 ID
Dislike              14
Deviant              27
Islamic Content      30
TMI                  29
Dumb                 17
Late                 11
Mad at the Internet  16
So someone could theoretically make a script which theoretically parses the page to negrate certain users' posts (each post is contained in an article tag, which carries a data-content attribute holding the post ID formatted like post-<post ID> and a data-author attribute holding the author's username) and then theoretically sends an HTTP request to the URL I mentioned above with their auth, session, and CSRF cookies, thereby adding a negative rating to that user's posts.

I am not at all encouraging anyone to do this.
Negrate every single post on the site.
 

Azusa

Remarkable Onion
Donezo. (As a proof of concept, anyway).
[screenshot of the rated posts]


Ruby:
# Proof of concept
require "nokogiri"
require "open-uri"
require "selenium-webdriver"

# Start up a zombie Chrome instance
driver = Selenium::WebDriver.for(:chrome)
browser_wait = Selenium::WebDriver::Wait.new(timeout: 10)

# At this point log in on the bot account in the opened Chrome session
gets  # script pauses here; press Enter once logged in

# Testing on the Spiderman thread page 40 specifically
base_url = "https://www.onionfarms.com"
url = base_url + "/threads/lol-look-at-this-faggot-of-edition.445/page-40"
driver.navigate.to(url)
html = Nokogiri::HTML(driver.page_source)

# Rate all posts by SIGSEGV (y'know, for example)
rating = 21  # Semper
html.css(".message--post").each do |message|
  if message.css(".message-name").text == "SIGSEGV"
    reaction_link = message.at_css(".reaction a")["href"]
    reaction_link.sub!(/reaction_id=\d+/, "reaction_id=#{rating}")

    driver.navigate.to(base_url + reaction_link)
    browser_wait.until { driver.find_element(css: "dd div button") }.click
  end
end
 

Absolute Brainlet

Star of the City
Baby Onion
Donezo. (As a proof of concept, anyway).
[screenshot of the rated posts]

Ruby:
# Proof of concept
require "nokogiri"
require "open-uri"
require "selenium-webdriver"

# Start up a zombie Chrome instance
driver = Selenium::WebDriver.for(:chrome)
browser_wait = Selenium::WebDriver::Wait.new(timeout: 10)

# At this point log in on the bot account in the opened Chrome session
gets  # script pauses here; press Enter once logged in

# Testing on the Spiderman thread page 40 specifically
base_url = "https://www.onionfarms.com"
url = base_url + "/threads/lol-look-at-this-faggot-of-edition.445/page-40"
driver.navigate.to(url)
html = Nokogiri::HTML(driver.page_source)

# Rate all posts by SIGSEGV (y'know, for example)
rating = 21  # Semper
html.css(".message--post").each do |message|
  if message.css(".message-name").text == "SIGSEGV"
    reaction_link = message.at_css(".reaction a")["href"]
    reaction_link.sub!(/reaction_id=\d+/, "reaction_id=#{rating}")

    driver.navigate.to(base_url + reaction_link)
    browser_wait.until { driver.find_element(css: "dd div button") }.click
  end
end
No one man should be allowed to have such power.
 

Bubbly Sink

Registered
From "Biting the hand that feeds":
[screenshots of the posts]
He also made a post in the thread to gravedance, but unfortunately @Bubbly Sink is a cowardly faggot and deleted the post by the time I went back to screenshot it. This is especially hilarious because he tried to act all buddy-buddy on my profile multiple times. Can you guess what he did almost immediately after making an account on Onion Farms?
[screenshots]
Imagine my fucking shock.
you're a big fat nigger
 

Azusa

Remarkable Onion
No one man should be allowed to have such power.
So what I coded there works specifically for page 40 of this thread. The next step (i.e. when I'm not at work or being lazy) would be to generalize it so that it grabs all the URLs for threads on Onion Farms (which we can get from the sitemap!) and gets all of the pages for each thread (some simple HTML scraping, I think; in the worst case we can just use the zombie Chrome with Selenium and read the page count here), e.g.
[screenshot: the last page number in the thread's pagination]

and then just manually generate the page list. Once we've got a list of URLs, we just loop over all of them, using the code I wrote earlier to check each page for the target user's messages and rate them however we want.

Kiwi Farms works exactly the same way. Y'know, hypothetically (I disavow, etc).
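
For what it's worth, here's a rough sketch of that plan using the same libraries as the snippet above. The /sitemap.xml location and the .pageNav-page selector are assumptions about stock XenForo markup, so verify both against the live HTML:

Ruby:
# Sketch of the generalization described above. The sitemap location and
# the pagination selector are assumptions; check them against the site.
require "nokogiri"
require "open-uri"

base_url = "https://www.onionfarms.com"

# 1. Pull thread URLs out of the sitemap (it may be an index that points
#    at sub-sitemaps, in which case recurse one level).
sitemap = Nokogiri::XML(URI.open("#{base_url}/sitemap.xml"))
sitemap.remove_namespaces!
thread_urls = sitemap.css("loc").map(&:text).select { |u| u.include?("/threads/") }

thread_urls.each do |thread_url|
  # 2. Read the highest page number from the pagination strip (the number
  #    in the screenshot); single-page threads have no strip, hence the || 1.
  html = Nokogiri::HTML(URI.open(thread_url))
  last_page = html.css(".pageNav-page").map { |li| li.text.to_i }.max || 1

  # 3. Manually generate the page list and hand each page to the
  #    per-page rating loop from the earlier snippet.
  (1..last_page).each do |page|
    page_url = "#{thread_url}page-#{page}"
    # ... run the earlier "rate posts on this page" loop against page_url ...
  end
end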
 

SIGSEGV

Segmentation fault (core dumped)
An Onion Among Onions
So what I coded there works specifically for page 40 of this thread. The next step (i.e. when I'm not at work or being lazy) would be to generalize it so that it grabs all the URLs for threads on Onion Farms (which we can get from the sitemap!) and gets all of the pages for each thread (some simple HTML scraping, I think; in the worst case we can just use the zombie Chrome with Selenium and read the page count here), e.g.
[screenshot: the last page number in the thread's pagination]
and then just manually generate the page list. Once we've got a list of URLs, we just loop over all of them, using the code I wrote earlier to check each page for the target user's messages and rate them however we want.

Kiwi Farms works exactly the same way. Y'know, hypothetically (I disavow, etc).
What if I just want to rate every single accessible post on the site "autistic"?
 

Azusa

Remarkable Onion
What if I just want to rate every single accessible post on the site "autistic"?
Just remove lines 22 and 28 (the if author check and its matching end) in that case.
[screenshot: the code with lines 22 and 28 removed]
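
Roughly what the screenshot showed, for anyone who doesn't want to squint: the same loop, reusing the html, rating, base_url, driver, and browser_wait variables from the earlier script, with the author filter deleted so every post on the page gets hit.

Ruby:
# The loop from before, minus lines 22 and 28 (the if/end author check),
# so it rates every post on the page regardless of who wrote it.
html.css(".message--post").each do |message|
  reaction_link = message.at_css(".reaction a")["href"]
  reaction_link.sub!(/reaction_id=\d+/, "reaction_id=#{rating}")

  driver.navigate.to(base_url + reaction_link)
  browser_wait.until { driver.find_element(css: "dd div button") }.click
end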


EDIT: To add, the hiccup with doing something like this is that the entire thing stalls if you run into one of these (as I did before):
[screenshot: an "I'm not a robot" captcha challenge]

The 'benefit' of using Selenium rather than just passing raw HTTP requests around is that you always have mouse/keyboard access to the zombie Chrome instance, so whenever you hit a captcha you can jump in and manually fill it out, and the script then resumes right where it left off.
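
One way to make that jump-in explicit, as a hypothetical helper for the script above (the .g-recaptcha selector is a guess; match it to whatever challenge page actually gets served):

Ruby:
# Hypothetical guard: call this before scraping each page. The
# .g-recaptcha selector is an assumption about the challenge markup.
def wait_out_captcha(driver)
  return if driver.find_elements(css: ".g-recaptcha").empty?
  puts "Captcha hit -- solve it in the Chrome window, then press Enter"
  gets
end

Call it right after each driver.navigate.to and the loop will sit there until you've cleared the challenge by hand.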
 

Noghoul

An Onion Among Onions
This thread has convinced me that we need to make the Butlerian Jihad a reality.
Sticker lives matter.
 